Dataset columns:

| column | type |
|---|---|
| names | string (lengths 1-98) |
| readmes | string (lengths 8-608k) |
| topics | string (lengths 0-442) |
| labels | string (6 classes) |
## palisade

![Build/Push to IPFS](https://github.com/compound-finance/palisade/actions/workflows/build-workflow.yml/badge.svg)

**Compound Web3 Front-End.** Palisade is the web3 front-end experience to interact with the Compound Ethereum protocol.

### Contributing

We welcome contributions from the community to help keep the Compound web3 front-end working great. You can read more about how to contribute in [CONTRIBUTING.md](CONTRIBUTING.md).

### Configuration

The web3 front-end requires several items to be configured before it can be started properly. The configuration takes the form of several JSON files that specify config options, split between local development (`development.json`) and the version that is intended to be deployed (`production.json`). The local and deployment scripts automatically look for those files in the path `config/env`:

```
config/env/development.json
config/env/production.json
```

The following is an example configuration file:

```json
{
  "API_BASE_URL_MAP": {
    "v3_api": "https://v3-api.compound.finance"
  },
  "DATA_PROVIDERS": {
    "development": "http://localhost:8545",
    "goerli": "https://goerli.infura.io/v3/<your-project-id>",
    "rinkeby": "https://rinkeby.infura.io/v3/<your-project-id>",
    "kovan": "https://kovan.infura.io/v3/<your-project-id>",
    "ropsten": "https://ropsten.infura.io/v3/<your-project-id>",
    "mainnet": "https://mainnet.infura.io/v3/<your-project-id>"
  },
  "NETWORK_MAP": {
    "mainnet": 1,
    "ropsten": 3,
    "rinkeby": 4,
    "goerli": 5,
    "kovan": 42,
    "development": 999
  },
  "DEFAULT_NETWORK": "mainnet",
  "BLOCKNATIVE_API_KEY": "<your-blocknative-key>",
  "WALLET_CONNECT_PROJECT_ID": "<your-wallet-connect-project-id>"
}
```

Each of the top-level keys has the following function:

- `API_BASE_URL_MAP`: object mapping of ETH network name (key) to the desired Compound API host (value). This can be left as-is.
- `DATA_PROVIDERS`: object mapping of ETH network name (key) to the URL of a corresponding JSON-RPC host (value). This example shows Infura as a sample JSON-RPC provider; you can find more information [here](https://infura.io/docs/ethereum). Note: this can be specified by setting the env var `DATA_PROVIDERS` as JSON, e.g. `export DATA_PROVIDERS='{"rinkeby": "https://infura.io/..."}'`.
- `NETWORK_MAP`: object mapping of ETH network name (key) to the corresponding network id (value). This can be left as-is.
- `BLOCKNATIVE_API_KEY`: Blocknative API key, required to track transaction notifications. You can find more information [here](https://docs.blocknative.com/notify). Note: this can be specified by setting the env var `BLOCKNATIVE_API_KEY`. This key is not strictly required, but provides a better user experience.
- `WALLET_CONNECT_PROJECT_ID`: WalletConnect project id, required to use WalletConnect as a wallet type in the app. You can find more information [here](https://docs.walletconnect.com/2.0). Note: this can be specified by setting the env var `WALLET_CONNECT_PROJECT_ID`. This id is not required unless you want to enable WalletConnect usage.

### Getting started

The Compound web3 front-end is written in [Elm](http://elm-lang.org) and was bootstrapped with [create-elm-app](https://github.com/halfzebra/create-elm-app). We strongly recommend getting familiar with the Elm framework before jumping into the Compound source code. To get started, first clone this repo:

```bash
git clone https://github.com/compound-finance/palisade.git
cd palisade
```

Next, install Yarn dependencies. Note: you should not use npm instead of yarn during install, because npm does not respect yarn.lock; you should, however, be able to use npm for the other commands.

```bash
yarn install --lock-file
```

Next, build and watch for string translation changes:

```bash
yarn watch-i18n
```

Note: for more information on string translations, see [i18n.md](i18n.md).

Next, build and watch for Sass changes:

```bash
yarn watch-css
```

Or, if you prefer just to build your CSS once, run `yarn build-css`. And separately, start your development server for the front-end:

```bash
yarn start
```

Note: Elm may take a while to pull in dependencies when you first run the app. At this point, you should be able to navigate to http://localhost:3000 to view your application.

### Deployment

This application is set up for easy deployment as a static web site.

#### Generic deployment

To deploy this application, first build your static assets:

```bash
yarn run build-css
yarn run build
```

Now the `build` directory should contain all of the files necessary to serve your application from whatever hosting provider you choose. This repo includes support for two options as possible deployment targets: IPFS and Google Cloud Storage.

#### IPFS

To deploy the web3 front-end on IPFS, you first should be familiar with [hosting a single-page website on IPFS](https://docs.ipfs.io/how-to/websites-on-ipfs/single-page-website/). Follow the instructions and you should be able to add all the files in the `build` directory and obtain an IPFS hash, which you can then open on any gateway provider to view the hosted web3 front-end. Alternatively, you may wish to deploy to an IPFS hosting service like [Infura IPFS](https://infura.io/docs/ipfs#section/getting-started). This repo includes a script to deploy the `build` directory to an IPFS host specified by several environment variables. To deploy a build to Infura IPFS:

```bash
IPFS_AUTH=<project_id>:<project_secret> IPFS_HOST=ipfs.infura.io IPFS_PORT=5001 yarn deploy-ipfs
```

Each of the environment variables has the following function:

- `IPFS_AUTH`: basic authentication for the header for using the Infura IPFS `add` endpoint. You can find more information [here](https://infura.io/docs/ipfs#section/authentication).
- `IPFS_HOST`: IPFS pinning service host.
- `IPFS_PORT`: IPFS pinning service host port.

Note: the `deploy-ipfs` script has been tested and used with Infura IPFS. You may need a few changes to support alternative pinning services.

#### Google Cloud Storage

To deploy the Compound web3 front-end to Google Cloud Storage, you should be familiar with [hosting a static site](https://cloud.google.com/storage/docs/hosting-static-website) on Google Cloud. Follow the instructions on creating a CNAME record with your DNS provider and creating a fully public bucket. Also make sure to have the [Cloud SDK](https://cloud.google.com/sdk) tools installed, that you're logged in via `gcloud auth`, and that you have your [correct project set](https://cloud.google.com/sdk/gcloud/reference/config/set). To deploy to a gcloud bucket:

```bash
yarn deploy-gcloud <your-bucket-name>
```

### Internationalization

To learn more about internationalization, please view [i18n.md](i18n.md).

### Discussion

For any concerns with the web3 front-end, open an issue or visit us on [Discord](https://discord.com/invite/compound) to discuss.

### License

Copyright 2022, Compound Labs, Inc. and repository contributors. This repository is licensed under GPLv3 (please see LICENSE for the full text of the license). All contributors to this repository must release contributed code under this GPLv3 license, free of any other encumbrance. Contributors also agree that contributions may be re-licensed under MIT or BSD-3 licenses in the future without notice. In such instances, all copyright notices will be retained.

Topics: ethereum, web3, defi | Label: front_end
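The network-map and env-var override behavior described in the palisade configuration can be sketched in a few lines. This is an illustrative helper, not code from the repo; the function name `resolve_data_provider` is hypothetical, while the mapping values come from the example config above.

```python
import json
import os

# Network name -> chain id mapping, as in the example configuration above.
NETWORK_MAP = {
    "mainnet": 1, "ropsten": 3, "rinkeby": 4,
    "goerli": 5, "kovan": 42, "development": 999,
}

def resolve_data_provider(config: dict, network: str) -> str:
    """Return the JSON-RPC URL for a network, letting the DATA_PROVIDERS
    env var (a JSON object) override the config file, as described above."""
    providers = dict(config.get("DATA_PROVIDERS", {}))
    env_override = os.environ.get("DATA_PROVIDERS")
    if env_override:
        providers.update(json.loads(env_override))
    return providers[network]
```

The same lookup logic applies regardless of which JSON-RPC provider (Infura or otherwise) backs each network.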
## nlprov

<p align="center"><img width="250" src="https://github.com/johnsonandjohnson/nlprov/raw/main/images/nlplogo.png"></p>

<p>nlprov: Natural Language Processing tool</p>

<p>
<a href="https://github.com/johnsonandjohnson/nlprov/actions"><img src="https://github.com/johnsonandjohnson/nlprov/workflows/Build%20Test%20and%20Package/badge.svg" alt=""></a>
<a href="https://codecov.io/gh/johnsonandjohnson/nlprov"><img src="https://codecov.io/gh/johnsonandjohnson/nlprov/branch/main/graph/badge.svg?token=hfbf8id1na" alt=""></a>
</p>

nlprov is a Python library developed by Johnson & Johnson's Advanced Analytics team that combines existing libraries for common natural language processing tasks. It combines several existing open-source libraries, such as pandas, spaCy, and scikit-learn, to make a pipeline that is ready to process text data. There are many user-defined parameters depending on your type of project, such as the ability to choose stemming or lemmatization, or to define explicitly what to substitute with NaN text fields. Overall, it is a way to get you started in your NLP task, no matter what you need. A tutorial on how to use this package can be found [here](tutorial.ipynb).

### Installation

Instructions using pip, with Python version 3.7 or higher:

```shell
pip install nlprov
```

For more information on installing packages using pip, click [here](https://pip.pypa.io/en/stable/reference/pip_install/).

### Contributing

To help develop this package, you'll need to install a conda virtual environment defined by our `dev_environment.yml` file using the below command:

```shell
conda env create -f dev_environment.yml
```

Then just activate the environment when attempting to develop or run tests, using the below command:

```shell
conda activate nlp_env
```

When you're all done developing or testing, just deactivate the environment with the below command:

```shell
conda deactivate
```

### Docker configuration

This codebase is Dockerized to build, run all of the unit tests using pytest, and perform pip packaging. In order to run the Docker container, ensure you have [Docker](https://www.docker.com/products/docker-desktop) installed and running on your local machine. To start the Docker container locally, simply navigate to the root of the project directory and type:

```shell
docker-compose up --build
```

Note: docker-compose is included in the Docker Desktop installation (link above) for macOS and Windows based systems. If you have issues executing docker-compose, navigate [here](https://docs.docker.com/compose/install/) to ensure docker-compose is supported on your system.

A note-ier note: you can use `docker-compose up --build` during development to quickly run the tests after code changes, without setting up or running a local conda environment.

### GitHub Action CI configuration

Every commit to this repository will trigger a build in GitHub Actions, following the `.github/workflows/pythonapp.yml` located in the root of this project. GitHub Actions is used to build and lint the nlprov package, run the tests, and perform pip packaging. If the environment name or version changes, the `pythonapp.yml` file will need to be updated to follow the new pattern.

### Our workflow, our methods and tools

- Style guide: PEP8 ([pycodestyle](https://www.python.org/dev/peps/pep-0008/))
- Git strategy: [Git Flow](https://www.atlassian.com/git/tutorials/comparing-workflows/gitflow-workflow)

### Upcoming features

Here is a roadmap of features to be implemented in this package. If you have any ideas for additional features, please let us know!

- Preprocessing
  - Ability to use custom stop words
  - Incorporation of bi-grams
  - Ability for user to choose which language-detection package to use
- Vectorization
  - spaCy pre-trained models
  - spaCy custom models
- Similarity metrics
  - Additional pairwise distances
  - Levenshtein distance
  - Word Mover's Distance
- Visualizations
  - TF-IDF
  - Jaccard

Topics: ai
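The preprocessing options mentioned above (custom stop words, NaN substitution for empty text fields) can be sketched in plain Python. This is an illustrative stand-in for the kind of pipeline nlprov wraps, not the library's actual API; the function and parameter names here are hypothetical.

```python
import re

DEFAULT_STOP_WORDS = {"the", "a", "an", "and", "of"}

def preprocess(texts, custom_stop_words=None, nan_sub="<NA>"):
    """Lowercase, strip punctuation, and drop stop words; texts that end up
    empty are replaced by `nan_sub`, mirroring the NaN-substitution option."""
    stops = DEFAULT_STOP_WORDS | set(custom_stop_words or ())
    out = []
    for text in texts:
        tokens = re.findall(r"[a-z0-9]+", text.lower())
        kept = [t for t in tokens if t not in stops]
        out.append(" ".join(kept) if kept else nan_sub)
    return out
```

For example, `preprocess(["The Cat!", "and of the"])` keeps only "cat" from the first string and maps the second (all stop words) to the NaN substitute.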
## 004-movie-industry

Original dataset from Kaggle: [danielgrijalvas/movies](https://www.kaggle.com/danielgrijalvas/movies); GitHub: [danielgrijalva/movie_stats](https://github.com/danielgrijalva/movie_stats).

### About the movie industry dataset

This dataset about the movie industry was scraped from IMDb and published on [Kaggle](https://www.kaggle.com/danielgrijalvas/movies) and [GitHub](https://github.com/danielgrijalva/movie_stats) by danielgrijalvas. The dataset contains 7,668 movies from 1980 to 2020 (7,668 observations and 15 variables). See the original dataset (`movie-industry/csv/original-dataset`) for more details.

### Overview

From the beginning of the film industry to the present, many movies from different studios have been released every year. Producing and shooting one movie requires a huge budget. In terms of business, if we invest in something, what is expected is profit, and so it is in the film industry. However, the difference is in the factors that affect the revenue of movies, such as feedback, ratings, reviews, film directors, screenwriters, and many more. We selected this dataset to analyze the factors affecting the profits of individual films, changes in movie budgets, duration of the movies, and the revenue of movies over the past four decades. This dataset was obtained from Kaggle, explored in Microsoft Excel, then cleaned and analyzed with the R language in RStudio. In addition, the data was analyzed to determine the correlations within the data.

### Steps

1. Define questions
2. Explore data from the original dataset
3. Data cleaning and data transformation
4. Exploratory data analysis
5. Analytical/inferential statistics
6. Data visualization

### Tools

- [RStudio Desktop](https://www.rstudio.com/)
- [Microsoft Excel](https://www.microsoft.com/en-us/microsoft-365/excel)
- [Power BI](https://powerbi.microsoft.com/en-us/desktop/)

### Languages

- [R](https://www.r-project.org/)

### Table of contents

1. [Define questions](https://docs.google.com/document/d/1w2onje5-9l69huw4t2cg-gm9jm1zd9pz5xbpab-oay4/edit?usp=sharing)
2. Exploratory data analysis (`movie-industry/data-exploration`)
3. Cleaning data and transformation (`movie-industry/data-cleaning`)
4. Data analysis (`movie-industry/data-analysis`)
5. Analytical/inferential statistics (`movie-industry/hypothesis-testing`)
6. [Data visualization (BI dashboard)](https://app.powerbi.com/view?r=eyjrijoiyzfjmjuxoditymmzos00mdzlltg0nmityti4y2zhmzywndviiiwidci6ijzmndqzmmrjltiwzditndqxzc1imwrilwfjmzm4mgjhnjmzzcisimmiojewfq%3D%3D&pageName=ReportSection)

### Resources

Important files in repository:

| file | path |
|---|---|
| Original dataset (`movies_original.csv`) | `movie-industry/csv/original-dataset/movies_original.csv` |
| Cleaned dataset (`movies_cleaned.csv`) | `movie-industry/csv/cleaned-dataset/movies_cleaned.csv` |

Important folders in repository:

| folder | link |
|---|---|
| Exploratory dataset | `movie-industry/data-exploration` |
| Cleaning dataset | `movie-industry/data-cleaning` |
| Data analysis | `movie-industry/data-analysis` |
| Hypothesis testing | `movie-industry/hypothesis-testing` |
| Data visualization | `movie-industry/data-visualization` |

### About us

This work is part of INT214: Statistics for Information Technology, semester 1/2021, School of Information Technology, KMUTT. Team: 39, 42, 46, 48, 53.

| group no. | name | student id |
|---|---|---|
| 1 | Denphum Nakglam | 63130500039 |
| 2 | Songglod Petchamras | 63130500042 |
| 3 | Thanakrit Paithun | 63130500046 |
| 4 | Thanaphon Sukkasem | 63130500048 |
| 5 | Thanatorn Roswan | 63130500053 |

Instructors: Atchara Tran-u-raikul, Jatawat Xie (git: [safesit23](https://github.com/safesit23))

Topics: statistics, int214, data-analysis, powerbi | Label: server
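The analysis above looks at correlations between movie variables (e.g. budget vs. gross revenue). Although the project itself uses R, a minimal Pearson-correlation sketch in pure Python (with toy numbers, not the project's data) shows the computation being described:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy budget (millions) vs. gross (millions); perfectly linear here, so r = 1.
budgets = [10, 20, 30, 40]
grosses = [25, 45, 65, 85]
```

In R the equivalent is a single call, `cor(budgets, grosses)`; the sketch just makes the underlying formula explicit.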
## deep-pwning

<p align="center"><img src="https://github.com/cchio/deep-pwning/blob/master/dpwn/repo-assets/dpwn-splash.png?raw=true" alt="Deep-pwning splash"></p>

Deep-pwning is a lightweight framework for experimenting with machine learning models, with the goal of evaluating their robustness against a motivated adversary. Note that deep-pwning in its current state is nowhere close to maturity or completion. It is meant to be experimented with, expanded upon, and extended by you. Only then can we help it truly become the go-to penetration-testing toolkit for statistical machine learning models.

### Background

Researchers have found that it is surprisingly trivial to trick a machine learning model (classifier, clusterer, regressor, etc.) into making objectively wrong decisions. This field of research is called [adversarial machine learning](https://people.eecs.berkeley.edu/~tygar/papers/SML2/Adversarial_AISEC.pdf). It is not hyperbole to claim that any motivated attacker can bypass any machine learning system, given enough information and time. However, this issue is often overlooked when architects and engineers design and build machine learning systems. The consequences are worrying when these systems are put into use in critical scenarios, such as in the medical, transportation, financial, or security-related fields. Hence, when one is evaluating the efficacy of applications using machine learning, their malleability in an adversarial setting should be measured alongside the system's precision and recall.

This tool was released at DEF CON 24 in Las Vegas, August 2016, during a talk titled [Machine Duping 101: Pwning Deep Learning Systems](https://www.defcon.org/html/defcon-24/dc-24-speakers.html#chio).

### Structure

This framework is built on top of [TensorFlow](https://www.tensorflow.org/), and many of the included examples in this repository are modified TensorFlow examples obtained from the [TensorFlow GitHub repository](https://github.com/tensorflow/tensorflow). All of the included examples and code implement deep neural networks, but they can be used to generate adversarial images for similarly tasked classifiers that are not implemented with deep neural networks. This is because of the phenomenon of transferability in machine learning, which Papernot et al. expounded upon expertly in [this paper](https://arxiv.org/abs/1605.07277). This means that adversarial samples crafted with a DNN model A may be able to fool another, distinctly structured DNN model B, as well as some other SVM model C. This figure (taken from the aforementioned paper by Papernot et al.) shows the percentage of successful adversarial misclassifications for a source model (used to generate the adversarial sample) on a target model (upon which the adversarial sample is tested).

<p align="center"><img src="https://github.com/cchio/deep-pwning/blob/master/dpwn/repo-assets/transferability-misclassificaiton-matrix.png?raw=true" alt="Transferability misclassification matrix"></p>

### Components

Deep-pwning is modularized into several components to minimize code repetition. Because of the vastly different nature of potential classification tasks, the current iteration of the code is optimized for classifying images and phrases (using word vectors). These are the code modules that make up the current iteration of deep-pwning:

1. **Drivers.** The drivers are the main execution point of the code. This is where you can tie the different modules and components together, and where you can inject more customizations into the adversarial generation processes.

2. **Models.** This is where the actual machine learning model implementations are located. For example, the provided LeNet5 model definition is located in the `model` function within `lenet5.py`. It defines the network as the following: input, convolutional layer 1, max-pooling layer 1, convolutional layer 2, max-pooling layer 2, dropout layer, softmax layer, output.

   <p align="center"><img src="https://github.com/cchio/deep-pwning/blob/master/dpwn/repo-assets/lenet5.png?raw=true" alt="LeNet-5"></p>

   (LeCun et al., [LeNet-5 convolutional neural network](http://yann.lecun.com/exdb/lenet/))

3. **Adversarial (advgen).** This module contains the code that generates adversarial output for the models. The `run` function defined in each of these advgen classes takes in an input dict that contains several predefined tensor operations for the machine learning model, defined in TensorFlow. If the model that you are generating the adversarial sample for is known, the variables in the input dict should be based off that model definition. Else, if the model is unknown (black-box generation), a substitute model should be used/implemented, and that model definition should be used. Variables that need to be passed in are the input tensor placeholder variables and labels (often referred to as `x_input` and `y_labels`), the model output (often referred to as `y_conv`), and the actual test data and labels that the adversarial images will be based off of.

4. **Config.** Application configurations.

5. **Utils.** Miscellaneous utilities that don't belong anywhere else. These include helper functions to read data, deal with TensorFlow queue inputs, etc.

These are the resource directories relevant to the application:

1. **checkpoints.** TensorFlow allows you to load a partially trained model to resume training, or to load a fully trained model into the application for evaluation or other operations. All these saved checkpoints are stored in this resource directory.

2. **data.** This directory stores all the input data, in whatever format the driver application takes in.

3. **output.** This is the output directory for all application output, including adversarial images that are generated.

### Getting started

#### Installation

Please follow the directions to install TensorFlow found [here](https://www.tensorflow.org/versions/r0.8/get_started/os_setup.html), which will allow you to pick the TensorFlow binary to install.

```bash
pip install -r requirements.txt
```

#### Execution (example with the MNIST driver)

To restore from a previously trained checkpoint (configuration in `config/mnist.conf`):

```bash
cd dpwn
python mnist_driver.py --restore_checkpoint
```

To train from scratch (note that any previous checkpoint(s) located in the folder specified in the configuration will be overwritten):

```bash
cd dpwn
python mnist_driver.py
```

### Task list

- [ ] Implement saliency graph method of generating adversarial samples
- [ ] Add defense module to the project for examples of some defenses proposed in literature
- [ ] Upgrade to TensorFlow 0.9.0
- [ ] Add support for using pretrained word2vec model in sentiment driver
- [ ] Add SVM & logistic regression support in models (+ example that uses them)
- [ ] Add non-image and non-phrase classifier example
- [ ] Add multi-GPU training support for faster training speeds

### Requirements

- [TensorFlow 0.8.0](https://www.tensorflow.org/versions/r0.8/get_started/os_setup.html)
- [matplotlib 1.5.1](http://matplotlib.org/faq/installing_faq.html)
- [numpy 1.11.1](https://pypi.python.org/pypi/numpy)
- [pandas 0.18.1](http://pandas.pydata.org/pandas-docs/stable/install.html)
- [six 1.10.0](https://pypi.python.org/pypi/six)

Note that dpwn requires TensorFlow 0.8.0; TensorFlow 0.9.0 introduces some incompatible changes.

### Contributing

(Borrowed from the amazing [requests](https://github.com/kennethreitz/requests) repository by kennethreitz.)

1. Check for open issues, or open a fresh issue to start a discussion around a feature idea or a bug.
2. Fork the repository on GitHub to start making your changes, to the master branch or a branch off of it.
3. Write a test which shows that the bug was fixed or that the feature works as expected.
4. Send a pull request and bug the maintainer until it gets merged and published. Make sure to add yourself to AUTHORS.md.

### Acknowledgements

There is so much impressive work from so many machine learning and security researchers that directly or indirectly contributed to this project and inspired this framework. This is an inconclusive list of resources that were used or referenced in one way or another:

#### Papers

- Szegedy et al. Intriguing properties of neural networks (https://arxiv.org/abs/1312.6199)
- Papernot et al. The Limitations of Deep Learning in Adversarial Settings (https://arxiv.org/abs/1511.07528)
- Papernot et al. Practical Black-Box Attacks against Deep Learning Systems using Adversarial Examples (https://arxiv.org/abs/1602.02697)
- Goodfellow et al. Explaining and Harnessing Adversarial Examples (https://arxiv.org/abs/1412.6572)
- Papernot et al. Transferability in Machine Learning: from Phenomena to Black-Box Attacks using Adversarial Samples (https://arxiv.org/abs/1605.07277)
- Grosse et al. Adversarial Perturbations Against Deep Neural Networks for Malware Classification (http://arxiv.org/abs/1606.04435)
- Nguyen et al. Deep Neural Networks are Easily Fooled: High Confidence Predictions for Unrecognizable Images (http://arxiv.org/abs/1412.1897)
- Xu et al. Automatically Evading Classifiers: A Case Study on PDF Malware Classifiers (https://www.cs.virginia.edu/~evans/pubs/ndss2016/)
- Kantchelian et al. Evasion and Hardening of Tree Ensemble Classifiers (https://arxiv.org/abs/1509.07892)
- Biggio et al. Support Vector Machines Under Adversarial Label Noise (http://www.jmlr.org/proceedings/papers/v20/biggio11/biggio11.pdf)
- Biggio et al. Poisoning Attacks against Support Vector Machines (http://arxiv.org/abs/1206.6389)
- Papernot et al. Distillation as a Defense to Adversarial Perturbations against Deep Neural Networks (http://arxiv.org/abs/1511.04508)
- Ororbia II et al. Unifying Adversarial Training Algorithms with Flexible Deep Data Gradient Regularization (http://arxiv.org/abs/1601.07213)
- Jin et al. Robust Convolutional Neural Networks under Adversarial Noise (http://arxiv.org/abs/1511.06306)
- Goodfellow et al. Deep Learning Adversarial Examples: Clarifying Misconceptions (http://www.kdnuggets.com/2015/07/deep-learning-adversarial-examples-misconceptions.html)

#### Code

- tensorflow/tensorflow (https://github.com/tensorflow/tensorflow)
- WildML: Implementing a CNN for Text Classification in TensorFlow (http://www.wildml.com/2015/12/implementing-a-cnn-for-text-classification-in-tensorflow/)
- wgrathwohl/captcha-crusher (https://github.com/wgrathwohl/captcha-crusher)
- josecl/cool-php-captcha (https://github.com/josecl/cool-php-captcha)

#### Datasets

- Krizhevsky et al. The CIFAR-10 dataset (https://www.cs.toronto.edu/~kriz/cifar.html)
- LeCun et al. The MNIST database of handwritten digits (http://yann.lecun.com/exdb/mnist/)
- Pang et al. Movie Review Data (v2.0, from Rotten Tomatoes) (http://www.cs.cornell.edu/people/pabo/movie-review-data/)

Topics: ai
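Several of the papers cited above (notably Goodfellow et al.'s "Explaining and Harnessing Adversarial Examples") center on the fast gradient sign method: perturb an input by a small step in the direction of the sign of the loss gradient, x' = x + eps * sign(grad_x L). A minimal pure-Python sketch on a toy logistic-regression model (not deep-pwning's TensorFlow code, and the helper name `fgsm` is ours) shows the idea:

```python
import math

def fgsm(x, w, b, y, eps):
    """Fast gradient sign method against a logistic model p = sigmoid(w.x + b).

    For cross-entropy loss, dL/dz = (p - y) and dz/dx_i = w_i, so the
    per-feature gradient is (p - y) * w_i; the attack adds eps times its sign.
    """
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    p = 1.0 / (1.0 + math.exp(-z))
    grad = [(p - y) * wi for wi in w]
    sign = lambda g: (g > 0) - (g < 0)
    return [xi + eps * sign(gi) for xi, gi in zip(x, grad)]
```

Each feature moves by exactly eps, yet the model's confidence in the true class drops; the deep-pwning advgen module applies the same principle to TensorFlow graphs via their gradient ops.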
blockchain-core | bitchian https user images githubusercontent com 119077822 208686842 6175b6a9 d652 4a4d aa91 5def6a55ad81 png audit report https github com bitindi bitindi blob main audit repots bitindichain audit report pdf introduction to bitindi chain bitindi chain bitindi is a decentralized high efficiency and energy saving public chain it is compatible with smart contracts and supports high performance transactions the endogenous token of bitindi is bni and it adopts the bpos consensus mechanism declaration to help developers evolvement at every stage vision technological innovation is the driving force behind the advancement of the blockchain industry but many innovative projects have been misunderstood and ignored at their early stages we have witnessed the growth process of great projects recalling that ethereum and polkadot were questioned as altcoins in the early days they all went through difficult times therefore bitindi s mission is not only a public chain but also to focus on the discovery and support of high potential developers and innovative projects relying on the world s largest trading ecosystem bitindi is committed to becoming the birthplace of innovative technologies and innovative businesses and building a complete ecological loop of technology development application promotion and trading bitindi s performance tps 2500 average block interval 3s consensus mechanism bpos consensus mechanism it has the characteristics of low transaction cost low transaction delay and high transaction concurrency the maximum number of validators supported is 21 economic model the endogenous token on the chain is bni the transactions consume bni as gas fee miners pledge bni to become validator nodes the reward of nodes is gas fee which is distributed according to the mortgage proportion cross chain assets such as btc eth and stable coins can be mapped to bitindi by an asset bridge the realization method is to lock a certain amount of tokens on the original 
chain then generate a corresponding number of tokens on bitindi hbitindi encourages developers to provide more decentralized cross chain solution meta transaction function the meta transaction function is supported which allows users to reduce gas fees step wise and bitindi will cover the payment of the reduced part the meta transaction function allows to minimize the migration cost of dapp developers as well as to effectively reduce the cost of dapp users bitindi technical characteristics an open and decentralized network to maintain the security of the network and assets support the programmability of evm the compatibility of smart contracts to reduce development or migration costs meta transaction function gas fee reduction effectively reducing the cost of developers and users on the chain support cross chain asset transfer to optimize users experience four stages of bitindi table tr style background rgba 0 0 0 0 th colspan 5 bitindi technical route th tr tr style background rgba 0 0 0 0 th stage th th features th th time th th sub stage th th technical points th tr tr style background rgba 0 0 0 0 tr style background rgba 0 0 0 0 td rowspan 9 tinder td td rowspan 9 the initial version of bitindi the system is stable and easy to use developers can develop and promote dapp at low cost users can participate in dapp on bitindi with a low threshold td td rowspan 9 2022 q4 2023 q1 td td rowspan 3 public beta td td higher transaction performance td tr tr style background rgba 0 0 0 0 td lower transaction costs td tr tr style background rgba 0 0 0 0 td meta transaction subsidy td tr tr style background rgba 0 0 0 0 td rowspan 3 node election td td more decentralized and safer td tr tr style background rgba 0 0 0 0 td complete mainstream assets td tr tr style background rgba 0 0 0 0 td basic tools in place td tr tr style background rgba 0 0 0 0 td rowspan 3 ecosystem incubation td td technical service systemization td tr tr style background rgba 0 0 0 0 td basic tool 
customization td tr tr style background rgba 0 0 0 0 td convenient asset transfer td tr tr style background rgba 0 0 0 0 td rowspan 5 spark td td rowspan 5 the protocol is further optimized bitindi will take the mission of connecting cefi and defi allowing more users to use defi applications at a low threshold td td rowspan 5 2023 q1 td td rowspan 5 to be announced td td complete developer tools td tr tr style background rgba 0 0 0 0 td complete developer forum blog and faq information td tr tr style background rgba 0 0 0 0 td chain ecological infrastructure booms td tr tr style background rgba 0 0 0 0 td innovative open id td tr tr style background rgba 0 0 0 0 td personalized portal accurately matches users and dapps td tr tr style background rgba 0 0 0 0 td rowspan 3 flame td td rowspan 3 enable layer2 technology expand performance while retaining the decentralized advantages of distributed protocols td td rowspan 3 2023 q2 td td rowspan 3 to be announced td td application of layer2 td tr tr style background rgba 0 0 0 0 td cross chain interoperability protocol td tr tr style background rgba 0 0 0 0 td cross chain interoperability integration td tr tr style background rgba 0 0 0 0 td rowspan 5 blaze td td rowspan 5 landing of large scale commercial applications support a variety of traditional businesses to run smoothly on the chain td td rowspan 5 2023 q4 td td rowspan 5 to be announced td td multiple virtual machine s supported td tr tr style background rgba 0 0 0 0 td multiple zero knowledge proofs and privacy protection capabilities td tr tr style background rgba 0 0 0 0 td multiple signature schemes td tr tr style background rgba 0 0 0 0 td storage compression and expansion solution td tr tr style background rgba 0 0 0 0 td multi dimensional sharding scheme td tr table current stage of bitindi in december 2022 the bitindi chain bitindi officially launched its tinder phase which will focus on improving the on chain infrastructure including but not limited to 
oracles voting tools anchor coins dex lending financial management insurance synthetic assets cross chain solutions data analysis smart contract innovation etc in jan 2023 bitindi will entered the spark phase which focuses on improving developer and user experience and infrastructure support plan of bitindi financial support bitindi will set up a special fund to invest support and incentivize high potential developers bitindi will launch a variety of developer activities and competitions to discover and fund potential developers in order to reduce the cost of users on bitindi dapp the bitindi meta transaction function will reduce the gas fee of users holding bni in a step wise manner resource support projects or developers that have received investment and support from bitindi have not only the opportunity to get official news report but also can apply for marketing service packages and promote their projects globally participate in test environment of bitindi official website s www bitindi com www bitindi org public mainnet chainid 4099 public testnet chainid 4096 prc mainnet https rpc mainnet bitindi org wss ws mainnet bitindi org prc testnet https rpc testnet bitindi org wss ws testnet bitindi org blockchain explorer https bitindiscan com test coin faucet https faucet bitindi org technical support email dev bitindi com dev bitindi org interact with us on social media twitter https twitter com bitindichain https twitter com bitindichain telegram channel https t me bitindichain https t me bitindichain medium https bitindi medium com https bitindi medium com github https github com bitindi https github com bitindi bitindi explorer https bitindiscan com https bitindiscan com documentation https docs bitindi com https docs bitindi com risk warning all users and developers can participate in the current test environment and subsequent stages of bitindi for free and there is no charging scenario all users must distinguish the test environment from the mainnet the 
assets generated in the test environment have no value be aware of counterfeit currency fraud bitindi announces authorization promotion and other collaborations only through its official social media platforms developers and users should check carefully to avoid losses do not mistake the official websites https bitindi com https bitindi org and be cautious of private key phishing | blockchain blockchain-explorer blockchain-platform blockchain-technology crypto cryptocurrencies cryptocurrency validators | blockchain |
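The network endpoints and chain ids listed above can be collected into a small client-side config. The sketch below is illustrative only, not part of any official Bitindi tooling, and the hyphenated/dotted URL forms are reconstructed from the flattened text above:

```python
# Illustrative sketch: the Bitindi networks described above as a lookup table.
# Chain ids (4099 mainnet, 4096 testnet) come from the README; the exact URL
# punctuation is an assumption reconstructed from the flattened text.
BITINDI_NETWORKS = {
    "mainnet": {"chain_id": 4099,
                "rpc": "https://rpc-mainnet.bitindi.org",
                "ws": "wss://ws-mainnet.bitindi.org"},
    "testnet": {"chain_id": 4096,
                "rpc": "https://rpc-testnet.bitindi.org",
                "ws": "wss://ws-testnet.bitindi.org"},
}

def rpc_url(network: str) -> str:
    """Return the JSON-RPC endpoint for a known network (raises KeyError otherwise)."""
    return BITINDI_NETWORKS[network]["rpc"]

def chain_id(network: str) -> int:
    """Return the expected chain id, useful as a sanity check after connecting."""
    return BITINDI_NETWORKS[network]["chain_id"]
```

A generic JSON-RPC client (for example web3.py) could then be pointed at rpc_url("testnet") and compare the node's reported chain id against chain_id("testnet") before sending transactions.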
Smart-City--Traffic-Density-Based-Crossroads
smart city traffic density based crossroads the project was implemented as part of an embedded system design course in this project my team members and i designed developed and implemented a working model of an automated traffic management system to reduce congestion at crossroads based on the observed traffic density data in the individual lanes we also integrated an android application and an rf module to implement override features and manually control traffic during emergencies apart from the code we used several hardware components like infrared ir sensors an atmega32 microcontroller an rf module etc | embedded-systems microcontroller c smart-city | os |
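The README describes splitting crossing time across lanes according to observed density, with a manual override from the Android app or RF module. A hypothetical sketch of that decision logic (the function and parameter names are ours, not taken from the project's firmware):

```python
# Hypothetical sketch of the density-based control described above: green time
# is allocated per lane in proportion to IR-sensor vehicle counts, and an
# override (e.g. triggered via the Android app or RF module) forces one lane green.
def green_times(lane_counts, cycle_s=60, min_green_s=5, override_lane=None):
    """Return a lane -> green-time (seconds) mapping for one signal cycle."""
    if override_lane is not None:
        # emergency override: give the whole cycle to the selected lane
        return {lane: (cycle_s if lane == override_lane else 0)
                for lane in lane_counts}
    total = sum(lane_counts.values())
    if total == 0:
        # no traffic observed: share the cycle equally
        share = cycle_s / len(lane_counts)
        return {lane: share for lane in lane_counts}
    # proportional allocation with a minimum green time per lane
    return {lane: max(min_green_s, cycle_s * count / total)
            for lane, count in lane_counts.items()}
```

On the actual hardware this logic would run on the ATmega32, with lane_counts fed by the IR sensors; the sketch only illustrates the allocation rule.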
mistica-design
mística design github resources mistica design light svg gh light mode only mistica design github resources mistica design dark svg gh dark mode only other mística repos description mistica web https github com telefonica mistica web repository with code libraries for mística in web mistica ios https github com telefonica mistica ios repository with code libraries for mística in ios mistica android https github com telefonica mistica android repository with code libraries for mística in android mistica icons https github com telefonica mistica icons the source of truth for icons in our digital products mística discussions https github com telefonica mistica design discussions the place to discuss all the topics to work in mística mística roadmap https github com orgs telefonica projects 20 views 2 an overview of mística planning mística issues https github com telefonica mistica design issues all the tasks with details contribute to mística https brandfactory telefonica com document 1846 contribute how to contribute current components status https brandfactory telefonica com d isp7b1dkyygv n a components overview prototype with real mística components in playroom https mistica web vercel app playroom mística storybook https mistica web vercel app mística catalog interactive components in ios android https brandfactory telefonica com d isp7b1dkyygv n a get started start to design mistica catalog native branch organization there are 2 important branches in this repo production this branch contains the currently live and stable version of the software or application that is being used by the android ios and web platforms pre production this branch is a temporary working branch used to test and make adjustments to features or token values when they are finished they are merged into the production branch figma how to sync design tokens if you want to sync design tokens with figma files you can use the figma tokens plugin https www figma com
community plugin 843461159747178978 figma tokens and set up the plugin with the following information 1 open the figma tokens plugin go to settings and select github as token storage 2 add new credentials name the name of the brand personal access token you have to generate a token from github read how to do it https docs github com en authentication keeping your account and data secure creating a personal access token creating a personal access token classic repository telefonica mistica design default branch production file path tokens brandname json see files here tokens image https user images githubusercontent com 6722153 166447592 e3d1b545 199d 4155 9024 2fb88351b444 png 3 finally go to tokens select global apply to document and click update
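The plugin reads token files at tokens/brandname.json. A loose sketch of what such a file might contain and how to walk it, where the nesting convention (leaf objects carrying value and type) is an assumption based on the Figma Tokens plugin format, not verified against the actual mistica-design token files:

```python
import json

# Assumed example of a tokens/<brandname>.json payload, for illustration only;
# the real mistica-design token files may differ in structure and naming.
sample = json.loads("""
{
  "global": {
    "color": {
      "primary": {"value": "#0066FF", "type": "color"}
    },
    "radius": {
      "container": {"value": "8", "type": "borderRadius"}
    }
  }
}
""")

def iter_tokens(node, path=()):
    """Yield (path, token) pairs for every leaf that looks like a design token."""
    if isinstance(node, dict) and "value" in node and "type" in node:
        yield path, node
        return
    if isinstance(node, dict):
        for key, child in node.items():
            yield from iter_tokens(child, path + (key,))

tokens = dict(iter_tokens(sample))
```

A helper like this could be used in CI to verify that every leaf entry in a brand's token file carries both a value and a type before the plugin syncs it.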
ROS-LLM | official https img shields io badge official 20 auromix blue style flat logo world logocolor white https github com auromix nbsp ros2 version https img shields io badge ros ros 202 20humble brightgreen http docs ros org en humble index html nbsp ubuntu version https img shields io badge ubuntu 22 04 green https ubuntu com nbsp license https img shields io badge license apache 2 0 informational https github com auromix ros llm blob ros2 humble license nbsp github repo stars https img shields io github stars auromix ros llm style social https github com auromix ros llm stargazers nbsp twitter follow https img shields io twitter follow hermanye233 style social https twitter com hermanye233 nbsp ros llm the ros llm project is a ros framework for embodied intelligence applications it enables natural language interactions and large model based control of robot motion and navigation for any robot operating on ros ros llm empowers you to utilize functionalities based on large language models such as gpt 4 and chatgpt for robot decision making and control this framework is designed to be easy to extend by simply providing a function interface for your robot following the provided example you can integrate and use ros llm within ten minutes ros llm offers a simple solution for quickly creating interactive and control experiences with any robot related schematics llm imgs flow diagram png features ros integration smoothly interacts with the robot operating system ros for expansive robotic control large language models support leverages gpt 4 and chatgpt for enhanced decision making and task management natural interaction facilitates intuitive communication with robots through conversational engagement flexible control utilizes llm based systems for tasks such as motion and navigation based on language model interpretation simplified extensibility provides an easy interface for seamless robot function integration quick development creates interactive robot control 
experiences swiftly sometimes in under ten minutes instructional examples offers comprehensive tutorials and examples for easier understanding and implementation history storage retains local chat histories for convenient review and reference quickstart guide follow the instructions below to set up ros llm 1 clone the repository use the command below to clone the repository bash git clone https github com auromix ros llm git 2 install dependencies navigate to the llm install directory and execute the installation script bash cd ros llm llm install bash dependencies install sh 3 configure openai settings if you don t have an openai api key you can obtain one from openai platform https platform openai com use the script below to configure your openai api key bash cd ros llm llm install bash config openai api key sh 4 configure aws settings optional for cloud natural interaction capabilities configure the aws settings if you prefer to use local asr this step can be skipped for low performance edge embedded platforms it is recommended to use asr cloud services to reduce computing pressure and for high performance personal hosts it is recommended to use local asr services to speed up response bash cd ros llm llm install bash config aws sh 4 configure openai whisper settings optional for local natural interaction capabilities configure the openai whisper settings if you prefer to use cloud asr this step can be skipped for low performance edge embedded platforms it is recommended to use asr cloud services to reduce computing pressure and for high performance personal hosts it is recommended to use local asr services to speed up response bash pip install u openai whisper pip install setuptools rust 5 build the workspace navigate to your workspace directory and build the workspace bash cd your ws rosdep install from paths src ignore src r y install dependencies colcon build symlink install 6 run the demo source the setup script and launch the turtlesim demo with cloud asr 
bash source your ws install setup bash ros2 launch llm bringup chatgpt with turtle robot launch py start listening bash ros2 topic pub llm state std msgs msg string data listening 1 configure your own robot optional to use the framework with your own robot modify the llm robot and llm config packages to suit your robot s specifications this allows you to customize the behavior of your robot future development plans we are continuously striving to enhance ros llm to better serve the developers and roboticists in the community below are the key developments we plan to undertake in the upcoming updates agent mechanism adding an agent mechanism allows long sequence tasks to be well divided feedback channel from external functions we plan to add a feedback mechanism for the robot to receive information from external functions this would significantly assist model based decision making processes navigation interface a new interface for robot navigation is also in the pipeline it will enable the utilization of this framework in navigation oriented tasks sensor input interface the addition of other sensor input interfaces is another major development this will incorporate environmental perception into model decision premises preparing for functionalities such as obstacle avoidance integration with vision based models like palm e we aim to extend the capabilities of ros llm by integrating models that allow for visual input like palm e this would enable the use of advanced computer vision technologies for better environment interaction continuous optimization last but not least we will focus on continuous optimization of the framework we are committed to improving the rationality and extensibility of ros llm to make it easier for developers to customize and extend the framework according to their needs keep an eye on this repo for updates your suggestions and contributions are always welcome to user if you find this project useful please consider giving it a star on github 
your support helps us improve the project and encourages further development don t forget to also share it with your friends and colleagues who might find it beneficial thank you for your support contributing contributions are welcome please read the contributing guidelines contributing md before submitting a pull request license copyright 2023 herman ye auromix licensed under the apache license version 2 0 the license you may not use this file except in compliance with the license you may obtain a copy of the license at http www apache org licenses license 2 0 unless required by applicable law or agreed to in writing software distributed under the license is distributed on an as is basis without warranties or conditions of any kind either express or implied see the license for the specific language governing permissions and limitations under the license | gpt llm ros embodied-intelligence | ai |
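The quickstart above recommends cloud ASR (AWS) for low-performance edge platforms and local ASR (openai-whisper) for high-performance hosts. A hypothetical helper encoding that selection rule, where the function and backend names are ours and not part of the ROS-LLM package:

```python
# Hypothetical helper (not part of ROS-LLM) that encodes the ASR recommendation
# from the quickstart: local Whisper on capable hosts for faster responses,
# cloud ASR on low-performance edge devices to reduce computing pressure.
def choose_asr_backend(high_performance_host: bool,
                       aws_configured: bool = True,
                       whisper_installed: bool = True) -> str:
    if high_performance_host and whisper_installed:
        return "local-whisper"   # speeds up response on powerful hardware
    if aws_configured:
        return "cloud-aws"       # offloads ASR compute from the edge device
    raise RuntimeError(
        "no ASR backend available: configure AWS or install openai-whisper")
```

In an actual deployment this choice is made once at configuration time (steps 4 above); the helper just makes the stated trade-off explicit.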
thingsboard-client-sdk | client sdk to connect with thingsboard iot platform from various iot devices arduino espressif etc mit license https img shields io badge license mit yellow svg style flat square https lbesson mit license org esp32 https img shields io badge esp 32 green svg style flat square https www espressif com en products socs esp32 esp8266 https img shields io badge esp 8266 blue svg style flat square https www espressif com en products socs esp8266 github release https img shields io github release thingsboard thingsboard arduino sdk all svg style flat square https github com thingsboard thingsboard arduino sdk releases github downloads https img shields io github downloads thingsboard thingsboard arduino sdk all svg style flat square https github com thingsboard thingsboard arduino sdk releases arduino actions status https github com thingsboard thingsboard arduino sdk actions workflows arduino compile yml badge svg https github com thingsboard thingsboard arduino sdk actions workflows arduino compile yml espressif idf v4 4 actions status https github com thingsboard thingsboard arduino sdk actions workflows espidf compile v4 4 yml badge svg https github com thingsboard thingsboard arduino sdk actions workflows espidf compile v4 4 yml espressif idf v5 1 actions status https github com thingsboard thingsboard arduino sdk actions workflows espidf compile v5 1 yml badge svg https github com thingsboard thingsboard arduino sdk actions workflows espidf compile v5 1 yml esp32 actions status https github com thingsboard thingsboard arduino sdk actions workflows esp32 compile yml badge svg https github com thingsboard thingsboard arduino sdk actions workflows esp32 compile yml esp8266 actions status https github com thingsboard thingsboard arduino sdk actions workflows esp8266 compile yml badge svg https github com thingsboard thingsboard arduino sdk actions workflows esp8266 compile yml github stars https img shields io github stars thingsboard 
thingsboard arduino sdk style social this library provides access to the thingsboard platform over the mqtt or http s protocols examples the sdk comes with a number of example sketches see files examples thingsboard within the arduino application please review the complete guide for esp32 pico kit gpio control and dht22 sensor monitoring available here https thingsboard io docs samples esp32 gpio control pico kit dht22 sensor supported frameworks thingsboardarduinosdk does not directly depend on any specific mqtt client or http client implementation instead any implementation of the imqtt client or ihttp client can be used because there are no further dependencies on arduino besides the client that communicates it allows us to use this library with arduino when using the arduino mqtt client or with espressif idf when using the espressif mqtt client example usage for espressif can be found in the examples 0014 espressif esp32 send data folder all other code portions can be implemented the same way only initialization of the needed dependencies is slightly different meaning the internal calls to thingsboard work the same on both espressif and arduino this is also the case because the only always used dependency that is remaining is arduinojson https arduinojson org which despite its name does not require any arduino component installation this project can be built with either platformio https platformio org esp idf extension https www espressif com or arduino ide https www arduino cc en software the project can be found in the platformio registry https registry platformio org libraries thingsboard thingsboard esp component registry https components espressif com components thingsboard thingsboard or the arduino libraries https www arduino cc reference en libraries thingsboard a description on how to include the library in your project can be found below for each of the aforementioned possible methods of integrating the project platformio to add an external library the
most important portion is the lib deps https docs platformio org en latest projectconf sections env options library lib deps html specification simply add thingsboard thingsboard there are multiple ways to define the version that should be fetched but the most basic is simply getting the last released version with the aforementioned line to learn more see package specifications https docs platformio org en latest core userguide pkg cmd install html package specifications lib deps thingsboard thingsboard esp idf extension to add an external library what needs to be done differs between versions if an esp idf https github com espressif esp idf version after and including v3 2 0 is used then the project can simply be added over the component manager https docs espressif com projects esp idf en latest esp32 api guides tools idf component manager html to do that we can simply call idf py add dependency with the name of the dependency as an argument similar to platformio there are multiple ways to define the version that should be fetched but the method below is the most basic to simply get the last released version to learn more see using component manager with a project https docs espressif com projects esp idf en latest esp32 api guides tools idf component manager html using with a project idf py add dependency thingsboard thingsboard if an esp idf https github com espressif esp idf version prior to v3 2 0 is used then the component has to be added as a git submodule meaning the repository has to first be a git project if that is not the case already simply install git and call git init in the folder containing your project similar to the other call there are multiple ways to define the version that should be fetched but the method below is the most basic to simply get the last released version to learn more see the git submodule help page https git scm com docs git submodule git submodule add https github com thingsboard thingsboard arduino sdk git
components thingsboard arduino ide to add an external library we simply have to open tools manage libraries and then search for thingsboard then press the install button for the wanted version see how to install library on arduino ide https arduinogetstarted com faq how to install library on arduino ide for more detailed information and some troubleshooting if the aforementioned method does not work dependencies the following dependencies are installed automatically or must be installed manually installed automatically arduinojson https github com bblanchon arduinojson needed for dealing with the json payload that is sent to and received by thingsboard needs to be installed manually mqtt pubsub client https github com thingsboard pubsubclient for interacting with mqtt when using the arduino mqtt client instance as an argument to thingsboard arduino http client https github com arduino libraries arduinohttpclient for interacting with http s when using the arduino http client instance as an argument to thingsboardhttp mbedtls library https github com seeed studio seeed arduino mbedtls needed to create hashes for the ota update esp8266 only already included in esp32 base firmware wifiesp client https github com bportaluri wifiesp needed when using an arduino uno with an esp8266 streamutils https github com bblanchon streamutils needed if define thingsboard enable stream utils 1 is set it allows sending an arbitrary amount of payload even if the buffer size is too small to hold that complete payload supported thingsboard features over mqtt all possible features are implemented over mqtt telemetry data upload https thingsboard io docs reference mqtt api telemetry upload api device attribute publish https thingsboard io docs reference mqtt api publish attribute update to the server server side rpc https thingsboard io docs reference mqtt api server side rpc client side rpc https thingsboard io docs reference mqtt api client side rpc request attribute values https thingsboard io docs
reference mqtt api request attribute values from the server attribute update subscription https thingsboard io docs reference mqtt api subscribe to attribute updates from the server device provisioning https thingsboard io docs reference mqtt api device provisioning device claiming https thingsboard io docs reference mqtt api claiming devices firmware ota update https thingsboard io docs reference mqtt api firmware api over http s the remaining features have to be implemented by hand with the sendgetrequest or sendpostrequest method see the thingsboard documentation https thingsboard io docs reference http api on how these features could be implemented telemetry data upload https thingsboard io docs reference http api telemetry upload api device attribute publish https thingsboard io docs reference http api publish attribute update to the server example implementations for all base features mentioned above can be found in the examples folder see the corresponding readme md to see which boards are supported and which functionality the example shows troubleshooting this troubleshooting guide contains common issues that are well known and can occur if the library is used wrongly make sure to read this section before creating a new github issue no progmem support causing crashes if the device is crashing with an exception especially exception 3 more specifically loadstoreerror or loadstoreerrorcause this might be because all constant variables are per default in flash memory to decrease the memory footprint of the library if the libraries used or the board itself don t support progmem this can cause crashes to mitigate that add a define thingsboard enable progmem 0 before including the thingsboard header file c if not set otherwise the value is 1 per default if the pgmspace include exists set to 0 if the board has problems with progmem variables and does not seem to work correctly define thingsboard enable progmem 0 include thingsboard h dynamic thingsboard usage the received
json payload as well as the sendattributes and sendtelemetry methods use the staticjsondocument https arduinojson org v6 api staticjsondocument by default this furthermore requires the maxfieldsamt template argument passed in the constructor to set the size the buffer should have where bigger messages will cause key value pairs to not be sent or received or de serialization to fail to remove the need for the maxfieldsamt template argument in the constructor and ensure the size the buffer should have is always enough to hold the sent or received messages instead define thingsboard enable dynamic 1 can be set before including the thingsboard header file this makes the library use the dynamicjsondocument https arduinojson org v6 api dynamicjsondocument instead of the default staticjsondocument https arduinojson org v6 api staticjsondocument c if not set otherwise the value is 0 per default set to 1 if the maxfieldsamt template argument should not be required define thingsboard enable dynamic 1 include thingsboard h not enough space for json serialization the buffer size for the serialized json is fixed to 64 bytes the sdk will not send data if the size of it is bigger than the size originally passed in the constructor as a template argument payloadsize respective logs in the serial monitor window will indicate the condition tb buffer size 64 to small for the given payloads size 83 increase with setbuffersize accordingly or set thingsboard enable stream utils to 1 before including thingsboard if that s the case the buffer size for serialization should be increased to do so the setbuffersize method can be used or the buffersize passed to the constructor can be increased as illustrated below cpp for the sake of example wifiespclient espclient initialize the mqtt client instance arduino mqtt client mqttclient espclient the sdk setup with 64 bytes for json buffer thingsboard tb mqttclient the sdk setup with 128 bytes for json buffer thingsboard tb mqttclient 128 void setup increase
internal buffer size after initial creation tb setbuffersize 128 alternatively it is possible to enable the mentioned thingsboard enable stream utils option which sends messages that are bigger than the given buffer size with a method that skips the internal buffer be aware though this only works for sent messages the internal buffer size still has to be big enough to receive the biggest possible message received by the client that is sent by the server cpp enable skipping usage of the buffer for sends that are bigger than the internal buffer size define thingsboard enable stream utils 1 for the sake of example wifiespclient espclient initialize the mqtt client instance arduino mqtt client mqttclient espclient the sdk setup with 64 bytes for json buffer thingsboard tb mqttclient too much data fields must be serialized a buffer allocated internally by the arduinojson library is fixed and is capable of processing not more than 8 fields if you are trying to send more than that you will get a respective log showing an error in the serial monitor window tb too many json fields passed 26 increase maxfieldsamt 8 accordingly the solution is to use the thingsboardsized class instead of thingsboard note that the serialized json buffer size must be specified explicitly as described here not enough space for json serialization see dynamic thingsboard usage above if the usage of maxfieldsamt should be replaced with automatic detection of the needed size cpp for the sake of example wifiespclient espclient initialize the mqtt client instance arduino mqtt client mqttclient espclient the sdk setup with 8 fields for json object thingsboard tb mqttclient the sdk setup with 128 bytes for json payload and 32 fields for json object thingsboardsized 32 tb mqttclient 128 tips and tricks custom updater instance when using the thingsboard class instance the class used to flash the binary data onto the device is not hard coded but instead the ota update callback class expects an optional argument the
iupdater implementation thanks to it being an interface it allows an arbitrary implementation meaning as long as the device can flash binary data and supports the c stl it supports ota updates with the thingsboard library currently implemented in the library itself are the esp32 updater which is used for flashing the binary data when using an esp32 and the esp8266 updater which is used with the esp8266 if another device wants to be supported a custom interface implementation needs to be created for that a class that inherits the iupdater interface needs to be created and has to override the needed methods c include iupdater h class custom updater public iupdater public bool begin const size t firmware size override return true size t write uint8 t payload const size t total bytes override return total bytes void reset override nothing to do bool end override return true custom http instance when using the thingsboardhttp class instance the protocol used to send the data to the http broker is not hard coded but instead the thingsboardhttp class expects an ihttp client implementation as argument thanks to it being an interface it allows an arbitrary implementation meaning the underlying http client can be whatever the user decides so it can for example be used to support platforms using arduino or even espressif idf currently implemented in the library itself is the arduino http client which is simply a wrapper around the arduinohttpclient https github com arduino libraries arduinohttpclient see dependencies https github com arduino libraries arduinohttpclient tab readme ov file dependencies for whether the board you are using is supported or not if another device wants to be supported a custom interface implementation needs to be created for that a class that inherits the ihttp client interface needs to be created and has to override the needed methods c include ihttp client h class custom http client public ihttp client public void set keep alive const bool keep alive override
nothing to do int connect const char host const uint16 t port override return 0 void stop override nothing to do int post const char url path const char content type const char request body override return 0 int get response status code override return 200 int get const char url path override return 0 string get response body override return string custom mqtt instance when using the thingsboard class instance the protocol used to send the data to the mqtt broker is not hard coded but instead the thingsboard class expects an imqtt client implementation as argument thanks to it being an interface it allows an arbitrary implementation meaning the underlying mqtt client can be whatever the user decides so it can for example be used to support platforms using arduino or even espressif idf currently implemented in the library itself is the arduino mqtt client which is simply a wrapper around the tbpubsubclient https github com thingsboard pubsubclient see compatible hardware https github com thingsboard pubsubclient tab readme ov file compatible hardware for whether the board you are using is supported or not if another device wants to be supported a custom interface implementation needs to be created for that a class that inherits the imqtt client interface needs to be created and has to override the needed methods c include imqtt client h class custom mqtt client public imqtt client public void set callback function cb override nothing to do bool set buffer size const uint16 t buffer size override return true uint16 t get buffer size override return 0u void set server const char domain const uint16 t port override nothing to do bool connect const char client id const char user name const char password override return true void disconnect override nothing to do bool loop override return true bool publish const char topic const uint8 t payload const size t length override return true bool subscribe const char topic override return true bool unsubscribe const char topic
override return true bool connected override return true if thingsboard enable stream utils bool begin publish const char topic const size t length override return true bool end publish override return true print interface size t write uint8 t payload byte override return 1u size t write const uint8 t buffer size t size override return size endif thingsboard enable stream utils custom logger instance to use your own logger you have to create a class and pass it as the second template parameter logger to your thingsboardsized class instance for example cpp class customlogger public static void log const char error serial print custom logger serial println error your class must have the method log with the same prototype as in the example it will be called if the library needs to print any log messages after that you can use it in place of the regular thingsboard class note that the serialized json buffer size must be specified explicitly as described here too much data fields must be serialized cpp for the sake of example wifiespclient espclient initialize the mqtt client instance arduino mqtt client mqttclient espclient the sdk setup with 8 fields for json object thingsboard tb mqttclient the sdk setup with 128 bytes for json payload and 32 fields for json object thingsboardsized 32 customlogger tb mqttclient 128 have a question or proposal you are welcome in our issues https github com thingsboard thingsboard arduino sdk issues and q a forum https groups google com forum forum thingsboard license this code is released under the mit license | thingsboard mqtt arduino iot embedded mcu esp32 esp8266 arduino-library arduino-nano-rp2040 arduino-nano-rp2040-connect m5stack-library raspberry-pi-pico-w esp esp8266-arduino rp2040 esp-idf espressif iot-platform | server |
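Both troubleshooting sections above reduce to the same pre-flight question: does the serialized telemetry fit the configured buffer size, and does it stay within maxfieldsamt fields? A sketch of that check, written in Python for illustration only (the SDK itself is C++); the 64-byte buffer and 8-field defaults are taken from the text above:

```python
import json

# Illustration (not SDK code) of the two limits described in the
# troubleshooting sections: serialized payload length vs. buffer size,
# and number of key-value fields vs. maxfieldsamt.
def fits_thingsboard_buffer(telemetry: dict, buffer_size: int = 64,
                            max_fields: int = 8) -> bool:
    # compact separators mirror a minimal JSON serialization
    payload = json.dumps(telemetry, separators=(",", ":"))
    return len(payload) <= buffer_size and len(telemetry) <= max_fields

small = {"temperature": 22.5}                      # fits the defaults
big = {f"sensor_{i}": i for i in range(26)}        # 26 fields, like the log example
```

Running a check like this against representative payloads before deployment tells you up front whether to raise the buffer size (setBufferSize / constructor argument) or the field count (ThingsBoardSized template argument).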
XTinyRTOS
xtinyrtos an rtos written in c 2020 10 22 lwip 2 1 2 fatfs 2020 4 11 v0 1 stm32f1 lwip fatfs emwin | os |
|
IIITS-website-mimic
iiits website mimic a mimic website for my college indian institute of information technology my team and i did this project to implement the concepts that we have learnt in our college we chose this topic so that we could resolve the issues in our college website and we also added a few student requested features after discussing among ourselves and with seniors the languages we used in our project were html css javascript php mysql bootstrap json note the last two were not much used in it the below two links will direct you to a short video that we made describing our whole project part1 https drive google com open id 1nu3tsckrcvlcnb5em x6iefxkofxdrar part2 https drive google com open id 1 yhv2asjbx4owcehnrfu4psohcqg0qyr | server |
|
Lona | p align center img src studio lonastudio assets xcassets appicon appiconset icon 256x256 2x png width 256 height 256 p h1 align center lona developer preview h1 br this project is in an early stage and lacks documentation however with some effort it can be used very effectively if you re interested in contributing or using it at your company feel free to open a github issue or get in touch with me on twitter dvnabbott https twitter com dvnabbott i m currently supporting a few early adopters airbnb doesn t provide support for this project the code and automated tests are not at the same degree of technical rigor as other airbnb projects overview build status https travis ci org airbnb lona svg branch master https travis ci org airbnb lona lona is a collection of tools for building design systems and using them to generate cross platform ui code sketch files and other artifacts lona consists primarily of 3 parts lona components lona components a data format component for cross platform components lona studio lona studio a gui tool for designing component files lona compiler lona compiler a cli tool api for generating ui code from component files why lona read more about challenges with cross platform design systems at scale and how lona solves them here docs overview background md lona components a design system is defined in json as a collection of components can be nested colors text styles gradients and shadows data types the specification for these files can be found in the docs docs file formats readme md lona studio lona studio provides a graphical interface for working with component files lona studio is primarily for building component systems but can also be used for quickly mocking up new screens from existing components viewing designs with real data from json files or apis experimenting with designs across multiple screen sizes automating design tasks e g localizing screenshots for different languages and exporting hundreds of images working with 
animations lottie and rendering videos from them and more if you have xcode installed you can try it out by following the installation instructions studio readme md wondering if this replaces sketch why a native mac app i answer some common questions here in the faq docs overview faq md lona compiler the lona compiler converts component files to ui code for various targets currently these targets are supported ios macos swift react dom react native react sketchapp javascript support is planned for android kotlin the target with the most features currently is swift however it s still fairly rough if you want to try it out check out the installation instructions compiler core readme md if you re looking for a sample of the generated code check out the test cases examples generated test core team created by dvnabbott https twitter com dvnabbott design tool integrations and infrastructure by mathieudutour https twitter com mathieudutour contributors lona compiler development by outdooricon https github com outdooricon design development help by ryngonzalez https twitter com ryngonzalez gorgeous logo by pablocar0 https twitter com pablocar0 lona studio development by nghia tran https github com nghiatranuit swift code generation help by laura skelton https twitter com skelovenko lona compiler development by jason zurita https twitter com jasonalexzurita | os |
|
vxl | introduction what is vxl vxl the vision something libraries is a collection of c libraries designed for computer vision research and implementation it was created from targetjr and the iue with the aim of making a light fast and consistent system vxl is written in ansi iso c and is designed to be portable over many platforms the core libraries in vxl are a comprehensive description of the vxl project can be viewed at https vxl github io | ai
|
udpipe | udpipe r package for tokenization tagging lemmatization and dependency parsing based on udpipe this repository contains an r package which is an rcpp wrapper around the udpipe c library http ufal mff cuni cz udpipe https github com ufal udpipe udpipe provides language agnostic tokenization tagging lemmatization and dependency parsing of raw text which is an essential part in natural language processing the techniques used are explained in detail in the paper tokenizing pos tagging lemmatizing and parsing ud 2 0 with udpipe available at https ufal mff cuni cz straka papers 2017 conll udpipe pdf in that paper you ll also find accuracies on different languages and process flow speed measured in words per second vignettes udpipe rlogo png general the udpipe r package was designed with the following things in mind when building the rcpp wrapper around the udpipe c library give r users simple access in order to easily tokenize tag lemmatize or perform dependency parsing on text in any language provide easy access to pre trained annotation models allow r users to easily construct your own annotation model based on data in conll u format as provided in more than 100 treebanks available at http universaldependencies org don t rely on python or java so that r users can easily install this package without configuration hassle no external r package dependencies except the strict necessary rcpp and data table no tidyverse installation license the package is available under the mozilla public license version 2 0 installation can be done as follows please visit the package documentation at https bnosac github io udpipe en and look at the r package vignettes for further details install packages udpipe vignette udpipe tryitout package udpipe vignette udpipe annotation package udpipe vignette udpipe universe package udpipe vignette udpipe usecase postagging lemmatisation package udpipe an overview of keyword extraction techniques https bnosac github io udpipe docs doc7 html 
vignette udpipe usecase topicmodelling package udpipe vignette udpipe parallel package udpipe vignette udpipe train package udpipe for installing the development version of this package remotes install github bnosac udpipe build vignettes true example currently the package allows you to do tokenisation tagging lemmatization and dependency parsing with one convenient function called udpipe library udpipe udmodel udpipe download model language dutch udmodel language file model dutch alpino c users jan dropbox work rforgebnosac bnosac udpipe dutch alpino ud 2 5 191206 udpipe x udpipe x ik ging op reis en ik nam mee mijn laptop mijn zonnebril en goed humeur object udmodel x doc id paragraph id sentence id start end term id token id token lemma upos xpos feats head token id dep rel misc doc1 1 1 1 2 1 1 ik ik pron vnw pers pron nomin vol 1 ev case nom person 1 prontype prs 2 nsubj na doc1 1 1 4 7 2 2 ging gaan verb ww pv verl ev number sing tense past verbform fin 0 root na doc1 1 1 9 10 3 3 op op adp vz init na 4 case na doc1 1 1 12 15 4 4 reis reis noun n soort ev basis zijd stan gender com number sing 2 obl na doc1 1 1 17 18 5 5 en en cconj vg neven na 7 cc na doc1 1 1 20 21 6 6 ik ik pron vnw pers pron nomin vol 1 ev case nom person 1 prontype prs 7 nsubj na doc1 1 1 23 25 7 7 nam nemen verb ww pv verl ev number sing tense past verbform fin 2 conj na doc1 1 1 27 29 8 8 mee mee adp vz fin na 7 compound prt spaceafter no doc1 1 1 30 30 9 9 punct let na 7 punct na pre trained models pre trained models build on universal dependencies treebanks are made available for more than 65 languages based on 101 treebanks namely afrikaans afribooms ancient greek perseus ancient greek proiel arabic padt armenian armtdp basque bdt belarusian hse bulgarian btb buryat bdt catalan ancora chinese gsd chinese gsdsimp classical chinese kyoto coptic scriptorium croatian set czech cac czech cltt czech fictree czech pdt danish ddt dutch alpino dutch lassysmall english ewt english gum english 
lines english partut estonian edt estonian ewt finnish ftb finnish tdt french gsd french partut french sequoia french spoken galician ctg galician treegal german gsd german hdt gothic proiel greek gdt hebrew htb hindi hdtb hungarian szeged indonesian gsd irish idt italian isdt italian partut italian postwita italian twittiro italian vit japanese gsd kazakh ktb korean gsd korean kaist kurmanji mg latin ittb latin perseus latin proiel latvian lvtb lithuanian alksnis lithuanian hse maltese mudt marathi ufal north sami giella norwegian bokmaal norwegian nynorsk norwegian nynorsklia old church slavonic proiel old french srcmf old russian torot persian seraji polish lfg polish pdb polish sz portuguese bosque portuguese br portuguese gsd romanian nonstandard romanian rrt russian gsd russian syntagrus russian taiga sanskrit ufal scottish gaelic arcosg serbian set slovak snk slovenian ssj slovenian sst spanish ancora spanish gsd swedish lines swedish talbanken tamil ttb telugu mtg turkish imst ukrainian iu upper sorbian ufal urdu udtb uyghur udt vietnamese vtb wolof wtb these have been made available easily to users of the package by using udpipe download model how good are these models accuracy statistics of models provided by the udpipe authors which you download with udpipe download model from the default repository are available at this link https github com jwijffels udpipe models ud 2 5 blob master inst udpipe ud 2 5 191206 readme accuracy statistics of models trained using this r package which you download with udpipe download model from the bnosac udpipe models ud repository are available at https github com bnosac udpipe models ud for a comparison between udpipe and spacy visit https github com jwijffels udpipe spacy comparison train your own models based on conll u data the package also allows you to build your own annotation model for this you need to provide data in conll u format these are provided for many languages at https universaldependencies org mostly 
under the cc by sa license how this is done is detailed in the package vignette vignette udpipe train package udpipe support in text mining need support in text mining contact bnosac http www bnosac be | udpipe nlp natural-language-processing r r-package pos-tagging dependency-parser tokenizer rcpp r-pkg conll lemmatization text-mining | ai |
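The udpipe row above leans on the CoNLL-U treebank format for both its annotation output and its training input. CoNLL-U is language-agnostic plain text, so it can be sketched in any language; JavaScript is used here only for consistency with the other examples in this collection, and the parser below is an illustrative sketch, not part of the udpipe package.

```javascript
// CoNLL-U is a tab-separated format with ten columns per token:
// ID, FORM, LEMMA, UPOS, XPOS, FEATS, HEAD, DEPREL, DEPS, MISC.
const FIELDS = ['id', 'form', 'lemma', 'upos', 'xpos', 'feats', 'head', 'deprel', 'deps', 'misc'];

function parseConlluToken(line) {
  const cols = line.split('\t');
  if (cols.length !== FIELDS.length) {
    throw new Error('expected 10 tab-separated columns, got ' + cols.length);
  }
  // Pair each column value with its field name.
  return Object.fromEntries(FIELDS.map((name, i) => [name, cols[i]]));
}

// Token adapted from the Dutch example sentence in the README.
const token = parseConlluToken(
  '2\tging\tgaan\tVERB\tWW|pv|verl|ev\tNumber=Sing|Tense=Past|VerbForm=Fin\t0\troot\t_\t_'
);
console.log(token.lemma, token.deprel); // gaan root
```

A HEAD of 0 marks the sentence root; other HEAD values are 1-based token IDs, which is how the dependency tree in the example output is encoded.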
front-end-interview | front end interview questions core js closures context function prototype bind polyfill classes in javascript es5 prototypal inheritance object create polyfill array prototype methods event loop promises async functions dom api event handling event delegation use strict hoisting html css selectors specificity rendering pipeline advanced es6 features css animations requestanimationframe browser rendering pipeline modules commonjs vs es6 v8 optimisations optimisation killers memory leaks resources closures http speakingjs com es5 ch16 html context http speakingjs com es5 ch17 html this as an implicit parameter of functions and methods function prototype bind https developer mozilla org en docs web javascript reference global objects function bind object create https developer mozilla org en docs web javascript reference global objects object create what the heck is event loop https www youtube com watch v 8aghzqkofbq event loop docs https github com atotic event loop microtasks https jakearchibald com 2015 tasks microtasks queues and schedules es6 features https github com lukehoban es6features how browser works https www html5rocks com en tutorials internals howbrowserswork requestanimationframe https www html5rocks com en tutorials speed animations memory leaks https auth0 com blog four types of leaks in your javascript code and how to get rid of them | front-end interview interview-questions | front_end
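One of the topics the interview list above names is a Function.prototype.bind polyfill, which also exercises closures and context. The sketch below is a minimal illustration of the idea; a complete polyfill would additionally support calling the bound function with `new` and preserving the prototype chain, which is omitted here.

```javascript
// Minimal sketch of what Function.prototype.bind does: fix `this`
// and partially apply leading arguments via a closure.
function bind(fn, thisArg, ...boundArgs) {
  return function (...callArgs) {
    return fn.apply(thisArg, boundArgs.concat(callArgs));
  };
}

const counter = { count: 2 };
function add(a, b) { return this.count + a + b; }

const boundAdd = bind(add, counter, 10); // `this` fixed, a = 10 pre-bound
console.log(boundAdd(5)); // 17 (2 + 10 + 5)
```

The returned function closes over `thisArg` and `boundArgs`, which is exactly the closure/context pairing interviewers probe with this question.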
Data-Engineering-Concepts-Practice-in-Python | data engineering concepts practice in python | server |
|
wavemaker | wavemaker copyright 2009 2013 the original author or authors licensed under the apache license version 2 0 the license you may not use this file except in compliance with the license you may obtain a copy of the license at http www apache org licenses license 2 0 unless required by applicable law or agreed to in writing software distributed under the license is distributed on an as is basis without warranties or conditions of any kind either express or implied see the license for the specific language governing permissions and limitations under the license notice the git version of this project is currently being restructured the project may not be completely stable in all configurations and uses the source code for the latest stable version of this project can be downloaded as a tar bundle from http dev wavemaker com wiki bin wmdoc 6 4 644download in the source code bundle section instructions building in order to build wavemaker you will need the following applications java jdk 6 http www oracle com technetwork java javase downloads index html apache maven 3 x http maven apache org to clone the wavemaker repository use git clone pre git clone https github com cloudjee wavemaker git pre to build the complete wavemaker distribution including installer run the maven from the root directory pre mvn clean install pre to build only wavemaker studio and required libraries build in the wavemaker sub folder pre cd wavemaker mvn clean install pre the resultant war file can be deployed to your own tomcat see also inplace deployment if git is in the path of the build environment the last commit sha1 object name will be recorded in boot js and the installer s version file dojo build use the dojobuild profile to build the gzip version of the libraries this will enable you to run without debug pre mvn pdojobuild clean install pre installer builds the installer package built will be dependent on your operating system the following operating systems are currently 
supported mac osx linux centos or ubuntu windows all of these need java in the class path importing into eclipse sts to develop wavemaker using eclipse or sts run the following from the root directory pre mvn eclipse eclipse pre from eclipse choose file import existing projects to load projects into eclipse wavemaker studio can be deployed using eclipse webtools to an apache tomcat 6 installation inplace deployment if you are primarily working on the javascript aspects of wavemaker you might want to use inplace deployment inplace deployment allows changes to be immediately updated in the deployed application to use live deployment follow these steps download and install apache tomcat 6 from http tomcat apache org download 60 cgi enable the manager application by editing conf tomcat users xml and adding the following pre lt tomcat users gt lt role rolename manager gt lt user username manager password manager roles manager gt lt tomcat users gt pre build and deploy the project using maven pre cd wavemaker mvn clean install cd wavemaker studio mvn pinplace resources resources war inplace tomcat inplace pre use the deployed application http localhost 8080 wavemaker debug | front_end |
|
Mobile-Apps | mobile development print hello github this repository is created in order to save the source codes of the mobile development projects | java swift4 | front_end |
NodeJs | node js getting started a barebones node js app using express 4 http expressjs com this application supports the getting started with node on heroku https devcenter heroku com articles getting started with nodejs article check it out running locally make sure you have node js http nodejs org and the heroku cli https cli heroku com installed sh git clone https github com heroku node js getting started git or clone your own fork cd node js getting started npm install npm start your app should now be running on localhost 5000 http localhost 5000 deploying to heroku heroku create git push heroku master heroku open or deploy to heroku https www herokucdn com deploy button png https heroku com deploy documentation for more information about using node js on heroku see these dev center articles getting started with node js on heroku https devcenter heroku com articles getting started with nodejs heroku node js support https devcenter heroku com articles nodejs support node js on heroku https devcenter heroku com categories nodejs best practices for node js development https devcenter heroku com articles node best practices using websockets on heroku with node js https devcenter heroku com articles node websockets | cloud |
|
coursera-fewdwr | this project was bootstrapped with create react app https github com facebookincubator create react app below you will find some information on how to perform common tasks br you can find the most recent version of this guide here https github com facebookincubator create react app blob master packages react scripts template readme md table of contents updating to new releases sending feedback folder structure available scripts npm start npm test npm run build npm run eject supported browsers supported language features and polyfills syntax highlighting in the editor displaying lint output in the editor debugging in the editor formatting code automatically changing the page title installing a dependency importing a component code splitting adding a stylesheet post processing css adding a css preprocessor sass less etc adding images fonts and files using the public folder changing the html adding
assets outside of the module system when to use the public folder using global variables adding bootstrap using a custom theme adding flow adding a router adding custom environment variables referencing environment variables in the html adding temporary environment variables in your shell adding development environment variables in env can i use decorators fetching data with ajax requests integrating with an api backend node ruby on rails proxying api requests in development invalid host header errors after configuring proxy configuring the proxy manually configuring a websocket proxy using https in development generating dynamic meta tags on the server pre rendering into static html files injecting data from the server into the page running tests filename conventions command line interface version control integration writing tests testing components using third party assertion libraries initializing test environment focusing and excluding tests coverage reporting continuous integration disabling jsdom snapshot testing editor integration debugging tests debugging tests in chrome debugging tests in visual studio code developing
components in isolation getting started with storybook getting started with styleguidist publishing components to npm making a progressive web app opting out of caching offline first considerations progressive web app metadata analyzing the bundle size deployment static server other solutions serving apps with client side routing building for relative paths azure firebase github pages heroku netlify now s3 and cloudfront surge advanced configuration troubleshooting npm start doesn t detect changes npm test hangs on macos sierra npm run build exits too early npm run build fails on heroku npm run build fails to minify moment js locales are missing alternatives to ejecting something missing updating to new releases create react app is divided into two packages create react app is a global command line utility that you use to create new projects react scripts is a development dependency in the generated projects including this one you almost never need to update create react app itself it delegates all the setup to react scripts when you run create react app it always creates the project with the latest version of react scripts so you ll get all the new features and improvements in newly created apps automatically to update an existing project to a new version of react scripts open the changelog https github com facebookincubator create react app blob master changelog md find the version you re currently on check package json in this folder if you re not sure and apply the migration instructions for
the newer versions in most cases bumping the react scripts version in package json and running npm install in this folder should be enough but it s good to consult the changelog https github com facebookincubator create react app blob master changelog md for potential breaking changes we commit to keeping the breaking changes minimal so you can upgrade react scripts painlessly sending feedback we are always open to your feedback https github com facebookincubator create react app issues folder structure after creation your project should look like this my app readme md node modules package json public index html favicon ico src app css app js app test js index css index js logo svg for the project to build these files must exist with exact filenames public index html is the page template src index js is the javascript entry point you can delete or rename the other files you may create subdirectories inside src for faster rebuilds only files inside src are processed by webpack br you need to put any js and css files inside src otherwise webpack won t see them only files inside public can be used from public index html br read instructions below for using assets from javascript and html you can however create more top level directories br they will not be included in the production build so you can use them for things like documentation available scripts in the project directory you can run npm start runs the app in the development mode br open http localhost 3000 http localhost 3000 to view it in the browser the page will reload if you make edits br you will also see any lint errors in the console npm test launches the test runner in the interactive watch mode br see the section about running tests running tests for more information npm run build builds the app for production to the build folder br it correctly bundles react in production mode and optimizes the build for the best performance the build is minified and the filenames include the hashes br your app is 
ready to be deployed see the section about deployment deployment for more information npm run eject note this is a one way operation once you eject you can t go back if you aren t satisfied with the build tool and configuration choices you can eject at any time this command will remove the single build dependency from your project instead it will copy all the configuration files and the transitive dependencies webpack babel eslint etc right into your project so you have full control over them all of the commands except eject will still work but they will point to the copied scripts so you can tweak them at this point you re on your own you don t have to ever use eject the curated feature set is suitable for small and middle deployments and you shouldn t feel obligated to use this feature however we understand that this tool wouldn t be useful if you couldn t customize it when you are ready for it supported browsers by default the generated project uses the latest version of react you can refer to the react documentation https reactjs org docs react dom html browser support for more information about supported browsers supported language features and polyfills this project supports a superset of the latest javascript standard br in addition to es6 https github com lukehoban es6features syntax features it also supports exponentiation operator https github com rwaldron exponentiation operator es2016 async await https github com tc39 ecmascript asyncawait es2017 object rest spread properties https github com sebmarkbage ecmascript rest spread stage 3 proposal dynamic import https github com tc39 proposal dynamic import stage 3 proposal class fields and static properties https github com tc39 proposal class public fields part of stage 3 proposal jsx https facebook github io react docs introducing jsx html and flow https flowtype org syntax learn more about different proposal stages https babeljs io docs plugins presets stage x experimental presets while we recommend 
using experimental proposals with some caution facebook heavily uses these features in the product code so we intend to provide codemods https medium com cpojer effective javascript codemods 5a6686bb46fb if any of these proposals change in the future note that the project only includes a few es6 polyfills https en wikipedia org wiki polyfill object assign https developer mozilla org en docs web javascript reference global objects object assign via object assign https github com sindresorhus object assign promise https developer mozilla org en us docs web javascript reference global objects promise via promise https github com then promise fetch https developer mozilla org en docs web api fetch api via whatwg fetch https github com github fetch if you use any other es6 features that need runtime support such as array from or symbol make sure you are including the appropriate polyfills manually or that the browsers you are targeting already support them also note that using some newer syntax features like for of or nonarrayvalue causes babel to emit code that depends on es6 runtime features and might not work without a polyfill when in doubt use babel repl https babeljs io repl to see what any specific syntax compiles down to syntax highlighting in the editor to configure the syntax highlighting in your favorite text editor head to the relevant babel documentation page https babeljs io docs editors and follow the instructions some of the most popular editors are covered displaying lint output in the editor note this feature is available with react scripts 0 2 0 and higher br it also only works with npm 3 or higher some editors including sublime text atom and visual studio code provide plugins for eslint they are not required for linting you should see the linter output right in your terminal as well as the browser console however if you prefer the lint results to appear right in your editor there are some extra steps you can do you would need to install an eslint 
plugin for your editor first then add a file called eslintrc to the project root js extends react app now your editor should report the linting warnings note that even if you edit your eslintrc file further these changes will only affect the editor integration they won t affect the terminal and in browser lint output this is because create react app intentionally provides a minimal set of rules that find common mistakes if you want to enforce a coding style for your project consider using prettier https github com jlongster prettier instead of eslint style rules debugging in the editor this feature is currently only supported by visual studio code https code visualstudio com and webstorm https www jetbrains com webstorm visual studio code and webstorm support debugging out of the box with create react app this enables you as a developer to write and debug your react code without leaving the editor and most importantly it enables you to have a continuous development workflow where context switching is minimal as you don t have to switch between tools visual studio code you would need to have the latest version of vs code https code visualstudio com and vs code chrome debugger extension https marketplace visualstudio com items itemname msjsdiag debugger for chrome installed then add the block below to your launch json file and put it inside the vscode folder in your app s root directory json version 0 2 0 configurations name chrome type chrome request launch url http localhost 3000 webroot workspaceroot src sourcemappathoverrides webpack src webroot note the url may be different if you ve made adjustments via the host or port environment variables advanced configuration start your app by running npm start and start debugging in vs code by pressing f5 or by clicking the green debug icon you can now write code set breakpoints make changes to the code and debug your newly modified code all from your editor having problems with vs code debugging please see their 
troubleshooting guide https github com microsoft vscode chrome debug blob master readme md troubleshooting webstorm you would need to have webstorm https www jetbrains com webstorm and jetbrains ide support https chrome google com webstore detail jetbrains ide support hmhgeddbohgjknpmjagkdomcpobmllji chrome extension installed in the webstorm menu run select edit configurations then click and select javascript debug paste http localhost 3000 into the url field and save the configuration note the url may be different if you ve made adjustments via the host or port environment variables advanced configuration start your app by running npm start then press d on macos or f9 on windows and linux or click the green debug icon to start debugging in webstorm the same way you can debug your application in intellij idea ultimate phpstorm pycharm pro and rubymine formatting code automatically prettier is an opinionated code formatter with support for javascript css and json with prettier you can format the code you write automatically to ensure a code style within your project see the prettier s github page https github com prettier prettier for more information and look at this page to see it in action https prettier github io prettier to format our code whenever we make a commit in git we need to install the following dependencies sh npm install save husky lint staged prettier alternatively you may use yarn sh yarn add husky lint staged prettier husky makes it easy to use githooks as if they are npm scripts lint staged allows us to run scripts on staged files in git see this blog post about lint staged to learn more about it https medium com okonetchnikov make linting great again f3890e1ad6b8 prettier is the javascript formatter we will run before commits now we can make sure every file is formatted correctly by adding a few lines to the package json in the project root add the following line to scripts section diff scripts precommit lint staged start react scripts start 
build react scripts build next we add a lint staged field to the package json for example diff dependencies lint staged src js jsx json css prettier single quote write git add scripts now whenever you make a commit prettier will format the changed files automatically you can also run node modules bin prettier single quote write src js jsx json css to format your entire project for the first time next you might want to integrate prettier in your favorite editor read the section on editor integration https prettier io docs en editors html on the prettier github page changing the page title you can find the source html file in the public folder of the generated project you may edit the title tag in it to change the title from react app to anything else note that normally you wouldn t edit files in the public folder very often for example adding a stylesheet adding a stylesheet is done without touching the html if you need to dynamically update the page title based on the content you can use the browser document title https developer mozilla org en us docs web api document title api for more complex scenarios when you want to change the title from react components you can use react helmet https github com nfl react helmet a third party library if you use a custom server for your app in production and want to modify the title before it gets sent to the browser you can follow advice in this section generating dynamic meta tags on the server alternatively you can pre build each page as a static html file which then loads the javascript bundle which is covered here pre rendering into static html files installing a dependency the generated project includes react and reactdom as dependencies it also includes a set of scripts used by create react app as a development dependency you may install other dependencies for example react router with npm sh npm install save react router alternatively you may use yarn sh yarn add react router this works for any library not just react 
## Importing a Component

This project setup supports ES6 modules thanks to Babel.

While you can still use `require()` and `module.exports`, we encourage you to use [`import` and `export`](http://exploringjs.com/es6/ch_modules.html) instead.

For example:

### `Button.js`

```js
import React, { Component } from 'react';

class Button extends Component {
  render() {
    // ...
  }
}

export default Button; // Don't forget to use export default!
```

### `DangerButton.js`

```js
import React, { Component } from 'react';
import Button from './Button'; // Import a component from another file

class DangerButton extends Component {
  render() {
    return <Button color="red" />;
  }
}

export default DangerButton;
```

Be aware of the [difference between default and named exports](http://stackoverflow.com/questions/36795819/react-native-es-6-when-should-i-use-curly-braces-for-import/36796281#36796281). It is a common source of mistakes.

We suggest that you stick to using default imports and exports when a module only exports a single thing (for example, a component). That's what you get when you use `export default Button` and `import Button from './Button'`.

Named exports are useful for utility modules that export several functions. A module may have at most one default export and as many named exports as you like.

Learn more about ES6 modules:

* [When to use the curly braces?](http://stackoverflow.com/questions/36795819/react-native-es-6-when-should-i-use-curly-braces-for-import/36796281#36796281)
* [Exploring ES6: Modules](http://exploringjs.com/es6/ch_modules.html)
* [Understanding ES6: Modules](https://leanpub.com/understandinges6/read#leanpub-auto-encapsulating-code-with-modules)

## Code Splitting

Instead of downloading the entire app before users can use it, code splitting allows you to split your code into small chunks which you can then load on demand.

This project setup supports code splitting via [dynamic `import()`](http://2ality.com/2017/01/import-operator.html#loading-code-on-demand). Its [proposal](https://github.com/tc39/proposal-dynamic-import) is in stage 3. The `import()` function-like form takes the module name as an argument and returns a [`Promise`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise) which always resolves to the namespace object of the module.

Here is an example:

### `moduleA.js`

```js
const moduleA = 'Hello';

export { moduleA };
```

### `App.js`

```js
import React, { Component } from 'react';

class App extends Component {
  handleClick = () => {
    import('./moduleA')
      .then(({ moduleA }) => {
        // Use moduleA
      })
      .catch(err => {
        // Handle failure
      });
  };

  render() {
    return (
      <div>
        <button onClick={this.handleClick}>Load</button>
      </div>
    );
  }
}

export default App;
```

This will make `moduleA.js` and all its unique dependencies a separate chunk that only loads after the user clicks the 'Load' button. You can also use it with `async` / `await` syntax if you prefer it.

### With React Router

If you are using React Router, check out [this tutorial](http://serverless-stack.com/chapters/code-splitting-in-create-react-app.html) on how to use code splitting with it. You can find the companion GitHub repository [here](https://github.com/AnomalyInnovations/serverless-stack-demo-client/tree/code-splitting-in-create-react-app).

Also check out the [Code Splitting](https://reactjs.org/docs/code-splitting.html) section in the React documentation.

## Adding a Stylesheet

This project setup uses [Webpack](https://webpack.js.org/) for handling all assets. Webpack offers a custom way of "extending" the concept of `import` beyond JavaScript. To express that a JavaScript file depends on a CSS file, you need to import the CSS from the JavaScript file:

### `Button.css`

```css
.Button {
  padding: 20px;
}
```

### `Button.js`

```js
import React, { Component } from 'react';
import './Button.css'; // Tell Webpack that Button.js uses these styles

class Button extends Component {
  render() {
    // You can use them as regular CSS styles
    return <div className="Button" />;
  }
}
```

This is not required for React, but many people find this feature convenient. You can read about the benefits of this approach [here](https://medium.com/seek-ui-engineering/block-element-modifying-your-javascript-components-d7f99fcab52b). However you should be aware that this makes your code less portable to other build tools and environments than Webpack.
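The promise-based `import()` call shown in the Code Splitting section above can also be written with `async`/`await`. Here is a minimal sketch, with the importer passed in as an argument so the pattern can run without a real `moduleA.js` on disk; the `loadAndUse` helper and its `importer` argument are hypothetical, not part of Create React App:

```js
// Hypothetical helper: "importer" stands in for () => import('./moduleA'),
// so the async/await flow can be shown without a real module file.
async function loadAndUse(importer) {
  try {
    const { moduleA } = await importer();
    return moduleA; // use moduleA here
  } catch (err) {
    return null; // handle failure
  }
}
```

In a component you would call it from an event handler as `loadAndUse(() => import('./moduleA'))`.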
In development, expressing dependencies this way allows your styles to be reloaded on the fly as you edit them. In production, all CSS files will be concatenated into a single minified `.css` file in the build output.

If you are concerned about using Webpack-specific semantics, you can put all your CSS right into `src/index.css`. It would still be imported from `src/index.js`, but you could always remove that import if you later migrate to a different build tool.

## Post-Processing CSS

This project setup minifies your CSS and adds vendor prefixes to it automatically through [Autoprefixer](https://github.com/postcss/autoprefixer) so you don't need to worry about it.

For example, this:

```css
.App {
  display: flex;
  flex-direction: row;
  align-items: center;
}
```

becomes this:

```css
.App {
  display: -webkit-box;
  display: -ms-flexbox;
  display: flex;
  -webkit-box-orient: horizontal;
  -webkit-box-direction: normal;
  -ms-flex-direction: row;
  flex-direction: row;
  -webkit-box-align: center;
  -ms-flex-align: center;
  align-items: center;
}
```

If you need to disable autoprefixing for some reason, [follow this section](https://github.com/postcss/autoprefixer#disabling).

## Adding a CSS Preprocessor (Sass, Less etc.)

Generally, we recommend that you don't reuse the same CSS classes across different components. For example, instead of using a `.Button` CSS class in `<AcceptButton>` and `<RejectButton>` components, we recommend creating a `<Button>` component with its own `.Button` styles, that both `<AcceptButton>` and `<RejectButton>` can render (but [not inherit](https://facebook.github.io/react/docs/composition-vs-inheritance.html)).

Following this rule often makes CSS preprocessors less useful, as features like mixins and nesting are replaced by component composition. You can, however, integrate a CSS preprocessor if you find it valuable. In this walkthrough, we will be using Sass, but you can also use Less, or another alternative.

First, let's install the command-line interface for Sass:

```sh
npm install --save node-sass-chokidar
```

Alternatively you may use `yarn`:

```sh
yarn add node-sass-chokidar
```

Then in `package.json`, add the following lines to `scripts`:

```diff
   "scripts": {
+    "build-css": "node-sass-chokidar src/ -o src/",
+    "watch-css": "npm run build-css && node-sass-chokidar src/ -o src/ --watch --recursive",
     "start": "react-scripts start",
     "build": "react-scripts build",
     "test": "react-scripts test --env=jsdom",
```

>Note: To use a different preprocessor, replace the `build-css` and `watch-css` commands according to your preprocessor's documentation.

Now you can rename `src/App.css` to `src/App.scss` and run `npm run watch-css`. The watcher will find every Sass file in `src` subdirectories, and create a corresponding CSS file next to it, in our case overwriting `src/App.css`. Since `src/App.js` still imports `src/App.css`, the styles become a part of your application. You can now edit `src/App.scss`, and `src/App.css` will be regenerated.

To share variables between Sass files, you can use Sass imports. For example, `src/App.scss` and other component style files could include `@import "./shared.scss";` with variable definitions.

To enable importing files without using relative paths, you can add the `--include-path` option to the command in `package.json`:

```
"build-css": "node-sass-chokidar --include-path ./src --include-path ./node_modules src/ -o src/",
"watch-css": "npm run build-css && node-sass-chokidar --include-path ./src --include-path ./node_modules src/ -o src/ --watch --recursive",
```

This will allow you to do imports like:

```scss
@import 'styles/_colors.scss'; // assuming a styles directory under src/
@import 'nprogress/nprogress'; // importing a css file from the nprogress node module
```

At this point you might want to remove all CSS files from the source control, and add `src/**/*.css` to your `.gitignore` file. It is generally a good practice to keep the build products outside of the source control.

As a final step, you may find it convenient to run `watch-css` automatically with `npm start`, and run `build-css` as a part of `npm run build`. You can use the `&&` operator to execute two scripts sequentially. However, there is no cross-platform way to run two scripts in parallel, so we will install a package for this:

```sh
npm install --save npm-run-all
```

Alternatively you may use `yarn`:

```sh
yarn add npm-run-all
```
Then we can change the `start` and `build` scripts to include the CSS preprocessor commands:

```diff
   "scripts": {
     "build-css": "node-sass-chokidar src/ -o src/",
     "watch-css": "npm run build-css && node-sass-chokidar src/ -o src/ --watch --recursive",
-    "start": "react-scripts start",
-    "build": "react-scripts build",
+    "start-js": "react-scripts start",
+    "start": "npm-run-all -p watch-css start-js",
+    "build-js": "react-scripts build",
+    "build": "npm-run-all build-css build-js",
     "test": "react-scripts test --env=jsdom",
     "eject": "react-scripts eject"
   }
```

Now running `npm start` and `npm run build` also builds Sass files.

### Why `node-sass-chokidar`?

`node-sass` has been reported as having the following issues:

* `node-sass --watch` has been reported to have performance issues in certain conditions when used in a virtual machine or with docker ([infinite styles compiling #1939](https://github.com/facebookincubator/create-react-app/issues/1939)).
* `node-sass` has been reported as having issues with detecting new files in a directory ([#1891](https://github.com/sass/node-sass/issues/1891)).

`node-sass-chokidar` is used here as it addresses these issues.

## Adding Images, Fonts, and Files

With Webpack, using static assets like images and fonts works similarly to CSS. You can **`import` a file right in a JavaScript module**. This tells Webpack to include that file in the bundle. Unlike CSS imports, importing a file gives you a string value. This value is the final path you can reference in your code, e.g. as the `src` attribute of an image or the `href` of a link to a PDF.

To reduce the number of requests to the server, importing images that are less than 10,000 bytes returns a [data URI](https://developer.mozilla.org/en-US/docs/Web/HTTP/Basics_of_HTTP/Data_URIs) instead of a path. This applies to the following file extensions: bmp, gif, jpg, jpeg, and png. SVG files are excluded due to [#1153](https://github.com/facebookincubator/create-react-app/issues/1153).

Here is an example:

```js
import React from 'react';
import logo from './logo.png'; // Tell Webpack this JS file uses this image

console.log(logo); // /logo.84287d09.png

function Header() {
  // Import result is the URL of your image
  return <img src={logo} alt="Logo" />;
}

export default Header;
```

This ensures that when the project is built, Webpack will correctly move the images into the build folder, and provide us with correct paths.

This works in CSS too:

```css
.Logo {
  background-image: url(./logo.png);
}
```

Webpack finds all relative module references in CSS (they start with `./`) and replaces them with the final paths from the compiled bundle. If you make a typo or accidentally delete an important file, you will see a compilation error, just like when you import a non-existent JavaScript module. The final filenames in the compiled bundle are generated by Webpack from content hashes. If the file content changes in the future, Webpack will give it a different name in production, so you don't need to worry about long-term caching of assets.

Please be advised that this is also a custom feature of Webpack. It is not required for React, but many people enjoy it (and React Native uses a similar mechanism for images).

An alternative way of handling static assets is described in the next section.

## Using the `public` Folder

>Note: this feature is available with `react-scripts@0.5.0` and higher.

### Changing the HTML

The `public` folder contains the HTML file, so you can tweak it, for example, to [set the page title](#changing-the-page-title). The `<script>` tag with the compiled code will be added to it automatically during the build process.

### Adding Assets Outside of the Module System

You can also add other assets to the `public` folder.

Note that we normally encourage you to `import` assets in JavaScript files instead. For example, see the sections on [adding a stylesheet](#adding-a-stylesheet) and [adding images and fonts](#adding-images-fonts-and-files). This mechanism provides a number of benefits:

* Scripts and stylesheets get minified and bundled together to avoid extra network requests.
* Missing files cause compilation errors instead of 404 errors for your users.
* Result filenames include content hashes so you don't need to worry about browsers caching their old versions.
However, there is an **escape hatch** that you can use to add an asset outside of the module system.

If you put a file into the `public` folder, it will **not** be processed by Webpack. Instead it will be copied into the build folder untouched. To reference assets in the `public` folder, you need to use a special variable called `PUBLIC_URL`.

Inside `index.html`, you can use it like this:

```html
<link rel="shortcut icon" href="%PUBLIC_URL%/favicon.ico">
```

Only files inside the `public` folder will be accessible by the `%PUBLIC_URL%` prefix. If you need to use a file from `src` or `node_modules`, you'll have to copy it there to explicitly specify your intention to make this file a part of the build.

When you run `npm run build`, Create React App will substitute `%PUBLIC_URL%` with a correct absolute path, so your project works even if you use client-side routing or host it at a non-root URL.

In JavaScript code, you can use `process.env.PUBLIC_URL` for similar purposes:

```js
render() {
  // Note: this is an escape hatch and should be used sparingly!
  // Normally we recommend using `import` for getting asset URLs
  // as described in "Adding Images and Fonts" above this section.
  return <img src={process.env.PUBLIC_URL + '/img/logo.png'} />;
}
```

Keep in mind the downsides of this approach:

* None of the files in the `public` folder get post-processed or minified.
* Missing files will not be called at compilation time, and will cause 404 errors for your users.
* Result filenames won't include content hashes, so you'll need to add query arguments or rename them every time they change.

### When to Use the `public` Folder

Normally we recommend importing [stylesheets](#adding-a-stylesheet), [images, and fonts](#adding-images-fonts-and-files) from JavaScript. The `public` folder is useful as a workaround for a number of less common cases:

* You need a file with a specific name in the build output, such as [`manifest.webmanifest`](https://developer.mozilla.org/en-US/docs/Web/Manifest).
* You have thousands of images and need to dynamically reference their paths.
* You want to include a small script like [`pace.js`](http://github.hubspot.com/pace/docs/welcome/) outside of the bundled code.
* Some library may be incompatible with Webpack and you have no other option but to include it as a `<script>` tag.

Note that if you add a `<script>` that declares global variables, you also need to read the next section on using them.

## Using Global Variables

When you include a script in the HTML file that defines global variables and try to use one of these variables in the code, the linter will complain because it cannot see the definition of the variable.

You can avoid this by reading the global variable explicitly from the `window` object, for example:

```js
const $ = window.$;
```

This makes it obvious you are using a global variable intentionally rather than because of a typo.

Alternatively, you can force the linter to ignore any line by adding `// eslint-disable-line` after it.

## Adding Bootstrap

You don't have to use [React Bootstrap](https://react-bootstrap.github.io) together with React, but it is a popular library for integrating Bootstrap with React apps. If you need it, you can integrate it with Create React App by following these steps:

Install React Bootstrap and Bootstrap from npm. React Bootstrap does not include Bootstrap CSS, so this needs to be installed as well:

```sh
npm install --save react-bootstrap bootstrap@3
```

Alternatively you may use `yarn`:

```sh
yarn add react-bootstrap bootstrap@3
```

Import Bootstrap CSS and optionally Bootstrap theme CSS in the beginning of your `src/index.js` file:

```js
import 'bootstrap/dist/css/bootstrap.css';
import 'bootstrap/dist/css/bootstrap-theme.css';
// Put any other imports below so that CSS from your
// components takes precedence over default styles.
```

Import required React Bootstrap components within the `src/App.js` file or your custom component files:

```js
import { Navbar, Jumbotron, Button } from 'react-bootstrap';
```

Now you are ready to use the imported React Bootstrap components within your component hierarchy defined in the render method. Here is an example [`App.js`](https://gist.githubusercontent.com/gaearon/85d8c067f6af1e56277c82d19fd4da7b/raw/6158dd991b67284e9fc8d70b9d973efe87659d72/App.js) redone using React Bootstrap.

### Using a Custom Theme

Sometimes you might need to tweak the visual styles of Bootstrap (or an equivalent package).

We suggest the following approach:

* Create a new package that depends on the package you wish to customize, e.g. Bootstrap.
* Add the necessary build steps to tweak the theme, and publish your package on npm.
* Install your own theme npm package as a dependency of your app.

Here is an example of adding a [customized Bootstrap](https://medium.com/@tacomanator/customizing-create-react-app-aa9ffb88165) that follows these steps.

## Adding Flow

Flow is a static type checker that helps you write code with fewer bugs. Check out this [introduction to using static types in JavaScript](https://medium.com/@preethikasireddy/why-use-static-types-in-javascript-part-1-8382da1e0adb) if you are new to this concept.

Recent versions of [Flow](http://flowtype.org/) work with Create React App projects out of the box.

To add Flow to a Create React App project, follow these steps:

1. Run `npm install --save flow-bin` (or `yarn add flow-bin`).
2. Add `"flow": "flow"` to the `scripts` section of your `package.json`.
3. Run `npm run flow init` (or `yarn flow init`) to create a [`.flowconfig` file](https://flowtype.org/docs/advanced-configuration.html) in the root directory.
4. Add `// @flow` to any files you want to type check (for example, to `src/App.js`).

Now you can run `npm run flow` (or `yarn flow`) to check the files for type errors. You can optionally use an IDE like [Nuclide](https://nuclide.io/docs/languages/flow/) for a better integrated experience. In the future we plan to integrate it into Create React App even more closely.

To learn more about Flow, check out [its documentation](https://flowtype.org/).

## Adding a Router

Create React App doesn't prescribe a specific routing solution, but [React Router](https://reacttraining.com/react-router/) is the most popular one.

To add it, run:

```sh
npm install --save react-router-dom
```

Alternatively you may use `yarn`:

```sh
yarn add react-router-dom
```

To try it, delete all the code in `src/App.js`
and replace it with any of the examples on its website. The [basic example](https://reacttraining.com/react-router/web/example/basic) is a good place to get started.

Note that you may need to [configure your production server to support client-side routing](#serving-apps-with-client-side-routing) before deploying your app.

## Adding Custom Environment Variables

>Note: this feature is available with `react-scripts@0.2.3` and higher.

Your project can consume variables declared in your environment as if they were declared locally in your JS files. By default you will have `NODE_ENV` defined for you, and any other environment variables starting with `REACT_APP_`.

**The environment variables are embedded during the build time.** Since Create React App produces a static HTML/CSS/JS bundle, it can't possibly read them at runtime. To read them at runtime, you would need to load HTML into memory on the server and replace placeholders in runtime, just like [described here](#injecting-data-from-the-server-into-the-page). Alternatively you can rebuild the app on the server anytime you change them.

>Note: You must create custom environment variables beginning with `REACT_APP_`. Any other variables except `NODE_ENV` will be ignored to avoid accidentally [exposing a private key on the machine that could have the same name](https://github.com/facebookincubator/create-react-app/issues/865#issuecomment-252199527). Changing any environment variables will require you to restart the development server if it is running.

These environment variables will be defined for you on `process.env`. For example, having an environment variable named `REACT_APP_SECRET_CODE` will be exposed in your JS as `process.env.REACT_APP_SECRET_CODE`.

There is also a special built-in environment variable called `NODE_ENV`. You can read it from `process.env.NODE_ENV`. When you run `npm start`, it is always equal to `'development'`, when you run `npm test` it is always equal to `'test'`, and when you run `npm run build` to make a production bundle, it is always equal to `'production'`. **You cannot override `NODE_ENV` manually.** This prevents developers from accidentally deploying a slow development build to production.
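Since variables that are not set come through as `undefined` on `process.env`, it can be handy to read them through a small helper with a fallback. A minimal sketch; the `readEnv` helper is hypothetical, not part of Create React App:

```js
// Hypothetical helper: read an environment variable with a fallback.
// Unset variables are undefined on process.env, so check explicitly.
function readEnv(name, fallback) {
  const value = process.env[name];
  return value === undefined ? fallback : value;
}

const secretCode = readEnv('REACT_APP_SECRET_CODE', 'no-secret-set');
```

Remember that because the variables are embedded at build time, this only changes what ends up in the bundle, not anything at runtime.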
These environment variables can be useful for displaying information conditionally based on where the project is deployed or consuming sensitive data that lives outside of version control.

First you need to have environment variables defined. For example, let's say you wanted to consume a secret defined in the environment inside a `<form>`:

```jsx
render() {
  return (
    <div>
      <small>You are running this application in <b>{process.env.NODE_ENV}</b> mode.</small>
      <form>
        <input type="hidden" defaultValue={process.env.REACT_APP_SECRET_CODE} />
      </form>
    </div>
  );
}
```

During the build, `process.env.REACT_APP_SECRET_CODE` will be replaced with the current value of the `REACT_APP_SECRET_CODE` environment variable. Remember that the `NODE_ENV` variable will be set for you automatically.

When you load the app in the browser and inspect the `<input>`, you will see its value set to `abcdef`, and the bold text will show the environment provided when using `npm start`:

```html
<div>
  <small>You are running this application in <b>development</b> mode.</small>
  <form>
    <input type="hidden" value="abcdef" />
  </form>
</div>
```

The above form is looking for a variable called `REACT_APP_SECRET_CODE` from the environment. In order to consume this value, we need to have it defined in the environment. This can be done using two ways: either in your shell or in a `.env` file. Both of these ways are described in the next few sections.

Having access to the `NODE_ENV` is also useful for performing actions conditionally:

```js
if (process.env.NODE_ENV !== 'production') {
  analytics.disable();
}
```

When you compile the app with `npm run build`, the minification step will strip out this condition, and the resulting bundle will be smaller.

## Referencing Environment Variables in the HTML

>Note: this feature is available with `react-scripts@0.9.0` and higher.

You can also access the environment variables starting with `REACT_APP_` in the `public/index.html`. For example:

```html
<title>%REACT_APP_WEBSITE_NAME%</title>
```

Note that the caveats from the above section apply:
* Apart from a few built-in variables (`NODE_ENV` and `PUBLIC_URL`), variable names must start with `REACT_APP_` to work.
* The environment variables are injected at build time. If you need to inject them at runtime, [follow this approach instead](#generating-dynamic-meta-tags-on-the-server).

## Adding Temporary Environment Variables In Your Shell

Defining environment variables can vary between OSes. It's also important to know that this manner is temporary for the life of the shell session.

### Windows (cmd.exe)

```cmd
set "REACT_APP_SECRET_CODE=abcdef" && npm start
```

(Note: Quotes around the variable assignment are required to avoid a trailing whitespace.)

### Windows (Powershell)

```Powershell
($env:REACT_APP_SECRET_CODE = "abcdef") -and (npm start)
```

### Linux, macOS (Bash)

```bash
REACT_APP_SECRET_CODE=abcdef npm start
```

## Adding Development Environment Variables In `.env`

>Note: this feature is available with `react-scripts@0.5.0` and higher.

To define permanent environment variables, create a file called `.env` in the root of your project:

```
REACT_APP_SECRET_CODE=abcdef
```

>Note: You must create custom environment variables beginning with `REACT_APP_`. Any other variables except `NODE_ENV` will be ignored to avoid accidentally [exposing a private key on the machine that could have the same name](https://github.com/facebookincubator/create-react-app/issues/865#issuecomment-252199527). Changing any environment variables will require you to restart the development server if it is running.

`.env` files should be checked into source control (with the exclusion of `.env*.local`).

### What other `.env` files can be used?

>Note: this feature is available with `react-scripts@1.0.0` and higher.

* `.env`: Default.
* `.env.local`: Local overrides. This file is loaded for all environments except test.
* `.env.development`, `.env.test`, `.env.production`: Environment-specific settings.
* `.env.development.local`, `.env.test.local`, `.env.production.local`: Local overrides of environment-specific settings.

Files on the left have more priority than files on the right:

* `npm start`: `.env.development.local`, `.env.development`, `.env.local`, `.env`
* `npm run build`: `.env.production.local`, `.env.production`, `.env.local`, `.env`
* `npm test`: `.env.test.local`, `.env.test`, `.env` (note `.env.local` is missing)

These variables will act as the defaults if the machine does not explicitly set them.

Please refer to the [dotenv documentation](https://github.com/motdotla/dotenv) for more details.

>Note: If you are defining environment variables for development, your CI and/or hosting platform will most likely need these defined as well. Consult their documentation how to do this. For example, see the documentation for [Travis CI](https://docs.travis-ci.com/user/environment-variables/) or [Heroku](https://devcenter.heroku.com/articles/config-vars).

### Expanding Environment Variables In `.env`

>Note: this feature is available with `react-scripts@1.1.0` and higher.

Expand variables already on your machine for use in your `.env` file (using [dotenv-expand](https://github.com/motdotla/dotenv-expand)).

For example, to get the environment variable `npm_package_version`:

```
REACT_APP_VERSION=$npm_package_version
# also works:
# REACT_APP_VERSION=${npm_package_version}
```

Or expand variables local to the current `.env` file:

```
DOMAIN=www.example.com
REACT_APP_FOO=$DOMAIN/foo
REACT_APP_BAR=$DOMAIN/bar
```

## Can I Use Decorators?

Many popular libraries use [decorators](https://medium.com/google-developers/exploring-es7-decorators-76ecb65fb841) in their documentation.

Create React App doesn't support decorator syntax at the moment because:

* It is an experimental proposal and is subject to change.
* The current specification version is not officially supported by Babel.
* If the specification changes, we won't be able to write a codemod because we don't use them internally at Facebook.

However in many cases you can rewrite decorator-based code without decorators just as fine.

Please refer to these two threads for reference:

* [#214](https://github.com/facebookincubator/create-react-app/issues/214)
* [#411](https://github.com/facebookincubator/create-react-app/issues/411)

Create React App will add decorator support when the specification advances to a stable stage.
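As an illustration of rewriting decorator-based code without decorators: a decorator is usually just a function applied to a class, so you can call it directly. A sketch, where `observer` is a hypothetical stand-in for a library-provided decorator function:

```js
// Hypothetical stand-in for a library-provided decorator function.
// A real one would wrap the class; identity is enough for illustration.
function observer(Component) {
  return Component;
}

// Instead of decorator syntax:
//   @observer
//   class TodoList { ... }
// apply the function directly:
class TodoList {}
const ObservedTodoList = observer(TodoList);
```

The wrapped class is then used wherever the decorated class would have been.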
## Fetching Data with AJAX Requests

React doesn't prescribe a specific approach to data fetching, but people commonly use either a library like [axios](https://github.com/axios/axios) or the [`fetch()` API](https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API) provided by the browser. Conveniently, Create React App includes a polyfill for `fetch()` so you can use it without worrying about the browser support.

The global `fetch` function allows you to easily make AJAX requests. It takes in a URL as an input and returns a `Promise` that resolves to a `Response` object. You can find more information about `fetch` [here](https://developer.mozilla.org/en-US/docs/Web/API/Fetch_API/Using_Fetch).

This project also includes a [Promise polyfill](https://github.com/then/promise) which provides a full implementation of Promises/A+. A Promise represents the eventual result of an asynchronous operation; you can find more information about Promises [here](https://www.promisejs.org/) and [here](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise). Both axios and `fetch()` use Promises under the hood. You can also use the [`async / await`](https://davidwalsh.name/async-await) syntax to reduce the callback nesting.

You can learn more about making AJAX requests from React components in [the FAQ entry on the React website](https://reactjs.org/docs/faq-ajax.html).

## Integrating with an API Backend

These tutorials will help you to integrate your app with an API backend running on another port, using `fetch()` to access it.

### Node

Check out [this tutorial](https://www.fullstackreact.com/articles/using-create-react-app-with-a-server/). You can find the companion GitHub repository [here](https://github.com/fullstackreact/food-lookup-demo).

### Ruby on Rails

Check out [this tutorial](https://www.fullstackreact.com/articles/how-to-get-create-react-app-to-work-with-your-rails-api/). You can find the companion GitHub repository [here](https://github.com/fullstackreact/food-lookup-demo-rails).

## Proxying API Requests in Development

>Note: this feature is available with `react-scripts@0.2.3` and higher.
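Returning to the `fetch()` pattern described above: a common shape is a small async wrapper that checks the response status and parses JSON. In this sketch the fetch implementation is passed in as an argument so the flow can be shown without a live server; the `getJson` name is hypothetical, and in the app you would pass the global `fetch`:

```js
// Hypothetical wrapper: fetchImpl stands in for the global fetch.
async function getJson(fetchImpl, url) {
  const response = await fetchImpl(url);
  if (!response.ok) {
    throw new Error('Request failed with status ' + response.status);
  }
  return response.json();
}
```

Usage would look like `getJson(fetch, '/api/todos').then(todos => ...)`.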
People often serve the front-end React app from the same host and port as their backend implementation.

For example, a production setup might look like this after the app is deployed:

```
/             - static server returns index.html with React app
/todos        - static server returns index.html with React app
/api/todos    - server handles any API requests using the backend implementation
```

Such setup is not required. However, if you do have a setup like this, it is convenient to write requests like `fetch('/api/todos')` without worrying about redirecting them to another host or port during development.

To tell the development server to proxy any unknown requests to your API server in development, add a `proxy` field to your `package.json`, for example:

```js
  "proxy": "http://localhost:4000",
```

This way, when you `fetch('/api/todos')` in development, the development server will recognize that it's not a static asset, and will proxy your request to `http://localhost:4000/api/todos` as a fallback. The development server will only attempt to send requests without `text/html` in its `Accept` header to the proxy.

Conveniently, this avoids [CORS issues](http://stackoverflow.com/questions/21854516/understanding-ajax-cors-and-security-considerations) and error messages like this in development:

```
Fetch API cannot load http://localhost:4000/api/todos. No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://localhost:3000' is therefore not allowed access. If an opaque response serves your needs, set the request's mode to 'no-cors' to fetch the resource with CORS disabled.
```

Keep in mind that `proxy` only has effect in development (with `npm start`), and it is up to you to ensure that URLs like `/api/todos` point to the right thing in production. You don't have to use the `/api` prefix. Any unrecognized request without a `text/html` accept header will be redirected to the specified `proxy`.

The `proxy` option supports HTTP, HTTPS and WebSocket connections.

If the `proxy` option is not flexible enough for you, alternatively you can:

* [Configure the proxy yourself](#configuring-the-proxy-manually)
* Enable CORS on your server ([here's how to do it for Express](http://enable-cors.org/server_expressjs.html)).
* Use [environment variables](#adding-custom-environment-variables) to inject the right server host and port into your app.

### "Invalid Host Header" Errors After Configuring Proxy

When you enable the `proxy` option, you opt into a more strict set of host checks. This is necessary because leaving the backend open to remote hosts makes your computer vulnerable to DNS rebinding attacks. The issue is explained in [this article](https://medium.com/webpack/webpack-dev-server-middleware-security-issues-1489d950874a) and [this issue](https://github.com/webpack/webpack-dev-server/issues/887).

This shouldn't affect you when developing on `localhost`, but if you develop remotely like [described here](https://github.com/facebookincubator/create-react-app/issues/2271), you will see this error in the browser after enabling the `proxy` option:

>Invalid Host header

To work around it, you can specify your public development host in a file called `.env.development` in the root of your project:

```
HOST=mypublicdevhost.com
```

If you restart the development server now and load the app from the specified host, it should work.

If you are still having issues or if you're using a more exotic environment like a cloud editor, you can bypass the host check completely by adding a line to `.env.development.local`. Note that this is dangerous and exposes your machine to remote code execution from malicious websites:

```
# NOTE: THIS IS DANGEROUS!
# It exposes your machine to attacks from the websites you visit.
DANGEROUSLY_DISABLE_HOST_CHECK=true
```

We don't recommend this approach.

### Configuring the Proxy Manually

>Note: this feature is available with `react-scripts@1.0.0` and higher.

If the `proxy` option is not flexible enough for you, you can specify an object in the following form (in `package.json`). You may also specify any configuration value [`http-proxy-middleware`](https://github.com/chimurai/http-proxy-middleware#options) or [`http-proxy`](https://github.com/nodejitsu/node-http-proxy#options) supports.

```js
{
  // ...
  "proxy": {
    "/api": {
      "target": "<url>",
      "ws": true
      // ...
    }
  }
  // ...
}
```

All requests matching this path will be proxied, no exceptions. This includes requests for `text/html`, which the standard `proxy` option does not proxy.

If you need to specify multiple proxies, you may do so by specifying additional entries. Matches are regular expressions, so that you can use a regexp to match multiple paths.

```js
{
  // ...
  "proxy": {
    // Matches any request starting with /api
    "/api": {
      "target": "<url_1>",
      "ws": true
      // ...
    },
    // Matches any request starting with /foo
    "/foo": {
      "target": "<url_2>",
      "ssl": true,
      "pathRewrite": {
        "^/foo": "/foo/beta"
      }
      // ...
    },
    // Matches /bar/abc.html but not /bar/sub/def.html
    "/bar/[^/]*[.]html": {
      "target": "<url_3>"
      // ...
    },
    // Matches /baz/abc.html and /baz/sub/def.html
    "/baz/.*/.*[.]html": {
      "target": "<url_4>"
      // ...
    }
  }
  // ...
}
```

### Configuring a WebSocket Proxy

When setting up a WebSocket proxy, there are some extra considerations to be aware of.

If you're using a WebSocket engine like [Socket.io](https://socket.io/), you must have a Socket.io server running that you can use as the proxy target. Socket.io will not work with a standard WebSocket server. Specifically, don't expect Socket.io to work with [the websocket.org echo test](http://websocket.org/echo.html).

There's some good documentation available for [setting up a Socket.io server](https://socket.io/docs/).

Standard WebSockets will work with a standard WebSocket server as well as the websocket.org echo test. You can use libraries like [ws](https://github.com/websockets/ws) for the server, with [native WebSockets in the browser](https://developer.mozilla.org/en-US/docs/Web/API/WebSocket).

Either way, you can proxy WebSocket requests manually in `package.json`:

```js
{
  // ...
  "proxy": {
    "/socket": {
      // Your compatible WebSocket server
      "target": "ws://<socket_url>",
      // Tell http-proxy-middleware that this is a WebSocket proxy.
      // Also allows you to proxy WebSocket requests without an additional HTTP request
      // https://github.com/chimurai/http-proxy-middleware#external-websocket-upgrade
      "ws": true
      // ...
    }
  }
  // ...
}
```

## Using HTTPS in Development

>Note: this feature is available with `react-scripts@0.4.0` and higher.

You may require the dev server to serve pages over HTTPS.
over https one particular case where this could be useful is when using the proxy feature proxying api requests in development to proxy requests to an api server when that api server is itself serving https to do this set the https environment variable to true then start the dev server as usual with npm start windows cmd exe cmd set https true npm start windows powershell powershell env https true and npm start note the lack of whitespace is intentional linux macos bash bash https true npm start note that the server will use a self signed certificate so your web browser will almost definitely display a warning upon accessing the page generating dynamic meta tags on the server since create react app doesn t support server rendering you might be wondering how to make meta tags dynamic and reflect the current url to solve this we recommend to add placeholders into the html like this html doctype html html lang en head meta property og title content og title meta property og description content og description then on the server regardless of the backend you use you can read index html into memory and replace og title og description and any other placeholders with values depending on the current url just make sure to sanitize and escape the interpolated values so that they are safe to embed into html if you use a node server you can even share the route matching logic between the client and the server however duplicating it also works fine in simple cases pre rendering into static html files if you re hosting your build with a static hosting provider you can use react snapshot https www npmjs com package react snapshot or react snap https github com stereobooster react snap to generate html pages for each route or relative link in your application these pages will then seamlessly become active or hydrated when the javascript bundle has loaded there are also opportunities to use this outside of static hosting to take the pressure off the server when generating and 
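The read-and-replace step for the `og:` placeholders described above can be sketched as follows. This is an illustrative sketch only: `renderIndex`, `escapeHtml`, and the `__OG_TITLE__` / `__OG_DESCRIPTION__` placeholder names are assumptions for illustration, not part of Create React App.

```javascript
// Hypothetical server-side helpers: fill the __OG_TITLE__ / __OG_DESCRIPTION__
// placeholders with HTML-escaped values before sending index.html.
function escapeHtml(value) {
  return value
    .replace(/&/g, '&amp;')   // must run first so it doesn't double-escape
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;');
}

function renderIndex(template, { title, description }) {
  return template
    .replace('__OG_TITLE__', escapeHtml(title))
    .replace('__OG_DESCRIPTION__', escapeHtml(description));
}

const template = '<meta property="og:title" content="__OG_TITLE__">';
console.log(renderIndex(template, { title: 'A "quoted" <title>', description: '' }));
// <meta property="og:title" content="A &quot;quoted&quot; &lt;title&gt;">
```

In a real server you would read `build/index.html` into memory once and run a function like this per request, choosing the values from the current URL.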
caching routes the primary benefit of pre rendering is that you get the core content of each page with the html payload regardless of whether or not your javascript bundle successfully downloads it also increases the likelihood that each route of your application will be picked up by search engines you can read more about zero configuration pre rendering also called snapshotting here https medium com superhighfives an almost static stack 6df0a2791319 injecting data from the server into the page similarly to the previous section you can leave some placeholders in the html that inject global variables for example js doctype html html lang en head script window server data server data script then on the server you can replace server data with a json of real data right before sending the response the client code can then read window server data to use it make sure to sanitize the json before sending it to the client https medium com node security the most common xss vulnerability in react js applications 2bdffbcc1fa0 as it makes your app vulnerable to xss attacks running tests note this feature is available with react scripts 0 3 0 and higher br read the migration guide to learn how to enable it in older projects https github com facebookincubator create react app blob master changelog md migrating from 023 to 030 create react app uses jest https facebook github io jest as its test runner to prepare for this integration we did a major revamp https facebook github io jest blog 2016 09 01 jest 15 html of jest so if you heard bad things about it years ago give it another try jest is a node based runner this means that the tests always run in a node environment and not in a real browser this lets us enable fast iteration speed and prevent flakiness while jest provides browser globals such as window thanks to jsdom https github com tmpvar jsdom they are only approximations of the real browser behavior jest is intended to be used for unit tests of your logic and your 
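The sanitization mentioned above for `window.__SERVER_DATA__` can be sketched like this. `serializeForScriptTag` is a hypothetical helper name; escaping `<` as `\u003c` is one common way to keep attacker-controlled strings from closing the inline `<script>` tag.

```javascript
// Hypothetical helper: serialize data for embedding in an inline <script> tag.
// Escaping "<" stops a "</script>" inside the data from terminating the tag;
// the JSON parser on the client turns "\u003c" back into "<".
function serializeForScriptTag(data) {
  return JSON.stringify(data).replace(/</g, '\\u003c');
}

const page = `<script>window.__SERVER_DATA__ = ${serializeForScriptTag({
  comment: '</script><script>alert(1)</script>',
})};</script>`;

// Only the legitimate closing tag is left in the generated HTML:
console.log(page.split('</script>').length - 1); // 1
```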
components rather than the dom quirks we recommend that you use a separate tool for browser end to end tests if you need them they are beyond the scope of create react app filename conventions jest will look for test files with any of the following popular naming conventions files with js suffix in tests folders files with test js suffix files with spec js suffix the test js spec js files or the tests folders can be located at any depth under the src top level folder we recommend to put the test files or tests folders next to the code they are testing so that relative imports appear shorter for example if app test js and app js are in the same folder the test just needs to import app from app instead of a long relative path colocation also helps find tests more quickly in larger projects command line interface when you run npm test jest will launch in the watch mode every time you save a file it will re run the tests just like npm start recompiles the code the watcher includes an interactive command line interface with the ability to run all tests or focus on a search pattern it is designed this way so that you can keep it open and enjoy fast re runs you can learn the commands from the watch usage note that the watcher prints after every run jest watch mode http facebook github io jest img blog 15 watch gif version control integration by default when you run npm test jest will only run the tests related to files changed since the last commit this is an optimization designed to make your tests run fast regardless of how many tests you have however it assumes that you don t often commit the code that doesn t pass the tests jest will always explicitly mention that it only ran tests related to the files changed since the last commit you can also press a in the watch mode to force jest to run all tests jest will always run all tests on a continuous integration continuous integration server or if the project is not inside a git or mercurial repository writing tests to 
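The filename conventions listed above can be expressed as a few simple patterns. This is an illustrative sketch, not Jest's actual matcher implementation:

```javascript
// Illustrative only: which files would the default Create React App
// Jest setup pick up as tests?
function isTestFile(filePath) {
  return (
    /(^|\/)__tests__\/.*\.js$/.test(filePath) || // any .js file inside a __tests__ folder
    /\.test\.js$/.test(filePath) ||              // files with a .test.js suffix
    /\.spec\.js$/.test(filePath)                 // files with a .spec.js suffix
  );
}

console.log(isTestFile('src/App.test.js'));          // true
console.log(isTestFile('src/__tests__/helpers.js')); // true
console.log(isTestFile('src/App.js'));               // false
```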
create tests add it or test blocks with the name of the test and its code you may optionally wrap them in describe blocks for logical grouping but this is neither required nor recommended jest provides a built in expect global function for making assertions a basic test could look like this js import sum from sum it sums numbers expect sum 1 2 toequal 3 expect sum 2 2 toequal 4 all expect matchers supported by jest are extensively documented here https facebook github io jest docs en expect html content br you can also use jest fn and expect fn tobecalled https facebook github io jest docs en expect html tohavebeencalled to create spies or mock functions testing components there is a broad spectrum of component testing techniques they range from a smoke test verifying that a component renders without throwing to shallow rendering and testing some of the output to full rendering and testing component lifecycle and state changes different projects choose different testing tradeoffs based on how often components change and how much logic they contain if you haven t decided on a testing strategy yet we recommend that you start with creating simple smoke tests for your components js import react from react import reactdom from react dom import app from app it renders without crashing const div document createelement div reactdom render app div this test mounts a component and makes sure that it didn t throw during rendering tests like this provide a lot of value with very little effort so they are great as a starting point and this is the test you will find in src app test js when you encounter bugs caused by changing components you will gain a deeper insight into which parts of them are worth testing in your application this might be a good time to introduce more specific tests asserting specific expected output or behavior if you d like to test components in isolation from the child components they render we recommend using shallow rendering api http airbnb io enzyme 
docs/api/shallow.html) from [Enzyme](http://airbnb.io/enzyme/). To install it, run:

```sh
npm install --save enzyme enzyme-adapter-react-16 react-test-renderer
```

Alternatively you may use `yarn`:

```sh
yarn add enzyme enzyme-adapter-react-16 react-test-renderer
```

As of Enzyme 3, you will need to install Enzyme along with an adapter corresponding to the version of React you are using. (The examples above use the adapter for React 16.)

The adapter will also need to be configured in your [global setup file](#initializing-test-environment):

#### `src/setupTests.js`
```js
import { configure } from 'enzyme';
import Adapter from 'enzyme-adapter-react-16';

configure({ adapter: new Adapter() });
```

>Note: Keep in mind that if you decide to "eject" before creating `src/setupTests.js`, the resulting `package.json` file won't contain any reference to it. [Read here](#initializing-test-environment) to learn how to add this after ejecting.

Now you can write a smoke test with it:

```js
import React from 'react';
import { shallow } from 'enzyme';
import App from './App';

it('renders without crashing', () => {
  shallow(<App />);
});
```

Unlike the previous smoke test using `ReactDOM.render()`, this test only renders `<App>` and doesn't go deeper. For example, even if `<App>` itself renders a `<Button>` that throws, this test will pass. Shallow rendering is great for isolated unit tests, but you may still want to create some full rendering tests to ensure the components integrate correctly. Enzyme supports [full rendering with `mount()`](http://airbnb.io/enzyme/docs/api/mount.html), and you can also use it for testing state changes and component lifecycle.

You can read the [Enzyme documentation](http://airbnb.io/enzyme/) for more testing techniques. Enzyme documentation uses Chai and Sinon for assertions, but you don't have to use them because Jest provides built-in `expect()` and `jest.fn()` for spies.

Here is an example from Enzyme documentation that asserts specific output, rewritten to use Jest matchers:

```js
import React from 'react';
import { shallow } from 'enzyme';
import App from './App';

it('renders welcome message', () => {
  const wrapper = shallow(<App />);
  const welcome = <h2>Welcome to React</h2>;
  // expect(wrapper.contains(welcome)).to.equal(true);
  expect(wrapper.contains(welcome)).toEqual(true);
});
```

All Jest matchers are [extensively documented here](http://facebook.github.io/jest/docs/en/expect.html).
Nevertheless you can use a third-party assertion library like [Chai](http://chaijs.com/) if you want to, as described below.

Additionally, you might find [jest-enzyme](https://github.com/blainekasten/enzyme-matchers) helpful to simplify your tests with readable matchers. The above `contains` code can be written more simply with jest-enzyme:

```js
expect(wrapper).toContainReact(welcome)
```

To enable this, install `jest-enzyme`:

```sh
npm install --save jest-enzyme
```

Alternatively you may use `yarn`:

```sh
yarn add jest-enzyme
```

Import it in [`src/setupTests.js`](#initializing-test-environment) to make its matchers available in every test:

```js
import 'jest-enzyme';
```

### Using Third Party Assertion Libraries

We recommend that you use `expect()` for assertions and `jest.fn()` for spies. If you are having issues with them please [file those against Jest](https://github.com/facebook/jest/issues/new), and we'll fix them. We intend to keep making them better for React, supporting, for example, [pretty-printing React elements as JSX](https://github.com/facebook/jest/pull/1566).

However, if you are used to other libraries, such as [Chai](http://chaijs.com/) and [Sinon](http://sinonjs.org/), or if you have existing code using them that you'd like to port over, you can import them normally like this:

```js
import sinon from 'sinon';
import { expect } from 'chai';
```

and then use them in your tests like you normally do.

### Initializing Test Environment

>Note: this feature is available with `react-scripts@0.4.0` and higher.

If your app uses a browser API that you need to mock in your tests, or if you just need a global setup before running your tests, add a `src/setupTests.js` to your project. It will be automatically executed before running your tests.

For example:

#### `src/setupTests.js`
```js
const localStorageMock = {
  getItem: jest.fn(),
  setItem: jest.fn(),
  clear: jest.fn()
};
global.localStorage = localStorageMock;
```

>Note: Keep in mind
that if you decide to "eject" before creating `src/setupTests.js`, the resulting `package.json` file won't contain any reference to it, so you should manually create the property `setupTestFrameworkScriptFile` in the configuration for Jest, something like the following:

```js
"jest": {
  // ...
  "setupTestFrameworkScriptFile": "<rootDir>/src/setupTests.js"
}
```

### Focusing and Excluding Tests

You can replace `it()` with `xit()` to temporarily exclude a test from being executed.
Similarly, `fit()` lets you focus on a specific test without running any other tests.

### Coverage Reporting

Jest has an integrated coverage reporter that works well with ES6 and requires no configuration.
Run `npm test -- --coverage` (note extra `--` in the middle) to include a coverage report like this:

![coverage report](http://i.imgur.com/5bFhnTS.png)

Note that tests run much slower with coverage, so it is recommended to run it separately from your normal workflow.

#### Configuration

The default Jest coverage configuration can be overridden by adding any of the following supported keys to a Jest config in your `package.json`.

Supported overrides:
 - [`collectCoverageFrom`](https://facebook.github.io/jest/docs/en/configuration.html#collectcoveragefrom-array)
 - [`coverageReporters`](https://facebook.github.io/jest/docs/en/configuration.html#coveragereporters-array-string)
 - [`coverageThreshold`](https://facebook.github.io/jest/docs/en/configuration.html#coveragethreshold-object)
 - [`snapshotSerializers`](https://facebook.github.io/jest/docs/en/configuration.html#snapshotserializers-array-string)

Example `package.json`:

```json
{
  "name": "your-package",
  "jest": {
    "collectCoverageFrom": [
      "src/**/*.{js,jsx}",
      "!<rootDir>/node_modules/",
      "!<rootDir>/path/to/dir/"
    ],
    "coverageThreshold": {
      "global": {
        "branches": 90,
        "functions": 90,
        "lines": 90,
        "statements": 90
      }
    },
    "coverageReporters": ["text"],
    "snapshotSerializers": ["my-serializer-module"]
  }
}
```

### Continuous Integration

By default `npm test` runs the watcher with interactive CLI. However, you can force it to run tests once and finish the process by setting an environment variable called `CI`.

When creating a build of your application with `npm run build`, linter warnings are
not checked by default like npm test you can force the build to perform a linter warning check by setting the environment variable ci if any warnings are encountered then the build fails popular ci servers already set the environment variable ci by default but you can do this yourself too on ci servers travis ci 1 following the travis getting started https docs travis ci com user getting started guide for syncing your github repository with travis you may need to initialize some settings manually in your profile https travis ci org profile page 1 add a travis yml file to your git repository language node js node js 6 cache directories node modules script npm run build npm test 1 trigger your first build with a git push 1 customize your travis ci build https docs travis ci com user customizing the build if needed circleci follow this article https medium com knowbody circleci and zeits now sh c9b7eebcd3c1 to set up circleci with a create react app project on your own environment windows cmd exe cmd set ci true npm test cmd set ci true npm run build note the lack of whitespace is intentional windows powershell powershell env ci true and npm test powershell env ci true and npm run build linux macos bash bash ci true npm test bash ci true npm run build the test command will force jest to run tests once instead of launching the watcher if you find yourself doing this often in development please file an issue https github com facebookincubator create react app issues new to tell us about your use case because we want to make watcher the best experience and are open to changing how it works to accommodate more workflows the build command will check for linter warnings and fail if any are found disabling jsdom by default the package json of the generated project looks like this js scripts start react scripts start build react scripts build test react scripts test env jsdom if you know that none of your tests depend on jsdom https github com tmpvar jsdom you can safely 
remove env jsdom and your tests will run faster diff scripts start react scripts start build react scripts build test react scripts test env jsdom test react scripts test to help you make up your mind here is a list of apis that need jsdom any browser globals like window and document reactdom render https facebook github io react docs top level api html reactdom render testutils renderintodocument https facebook github io react docs test utils html renderintodocument a shortcut https github com facebook react blob 34761cf9a252964abfaab6faf74d473ad95d1f21 src test reacttestutils js l83 l91 for the above mount http airbnb io enzyme docs api mount html in enzyme http airbnb io enzyme index html in contrast jsdom is not needed for the following apis testutils createrenderer https facebook github io react docs test utils html shallow rendering shallow rendering shallow http airbnb io enzyme docs api shallow html in enzyme http airbnb io enzyme index html finally jsdom is also not needed for snapshot testing http facebook github io jest blog 2016 07 27 jest 14 html snapshot testing snapshot testing is a feature of jest that automatically generates text snapshots of your components and saves them on the disk so if the ui output changes you get notified without manually writing any assertions on the component output read more about snapshot testing http facebook github io jest blog 2016 07 27 jest 14 html editor integration if you use visual studio code https code visualstudio com there is a jest extension https github com orta vscode jest which works with create react app out of the box this provides a lot of ide like features while using a text editor showing the status of a test run with potential fail messages inline starting and stopping the watcher automatically and offering one click snapshot updates vs code jest preview https cloud githubusercontent com assets 49038 20795349 a032308a b7c8 11e6 9b34 7eeac781003f png debugging tests there are various ways to setup a 
debugger for your Jest tests. We cover debugging in Chrome and [Visual Studio Code](https://code.visualstudio.com/).

>Note: debugging tests requires Node 8 or higher.

### Debugging Tests in Chrome

Add the following to the `scripts` section in your project's `package.json`:

```json
"scripts": {
  "test:debug": "react-scripts --inspect-brk test --runInBand --env=jsdom"
}
```

Place `debugger;` statements in any test and run:

```bash
npm run test:debug
```

This will start running your Jest tests, but pause before executing to allow a debugger to attach to the process.

Open the following in Chrome: `about:inspect`

After opening that link, the Chrome Developer Tools will be displayed. Select `inspect` on your process and a breakpoint will be set at the first line of the react script (this is done simply to give you time to open the developer tools and to prevent Jest from executing before you have time to do so). Click the button that looks like a "play" button in the upper right hand side of the screen to continue execution. When Jest executes the test that contains the `debugger` statement, execution will pause and you can examine the current scope and call stack.

>Note: the `--runInBand` cli option makes sure Jest runs tests in the same process rather than spawning processes for individual tests. Normally Jest parallelizes test runs across processes, but it is hard to debug many processes at the same time.

### Debugging Tests in Visual Studio Code

Debugging Jest tests is supported out of the box for [Visual Studio Code](https://code.visualstudio.com/).

Use the following [`launch.json`](https://code.visualstudio.com/docs/editor/debugging#_launch-configurations) configuration file:

```json
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Debug CRA Tests",
      "type": "node",
      "request": "launch",
      "runtimeExecutable": "${workspaceRoot}/node_modules/.bin/react-scripts",
      "args": ["test", "--runInBand", "--no-cache", "--env=jsdom"],
      "cwd": "${workspaceRoot}",
      "protocol": "inspector",
      "console": "integratedTerminal",
      "internalConsoleOptions": "neverOpen"
    }
  ]
}
```

## Developing Components in Isolation

Usually, in an app, you have a lot of UI components, and each of them has many different
states. For example, a simple button component could have the following states:

* In a regular state, with a text label.
* In the disabled mode.
* In a loading state.

Usually, it's hard to see these states without running a sample app or some examples.

Create React App doesn't include any tools for this by default, but you can easily add [Storybook for React](https://storybook.js.org) ([source](https://github.com/storybooks/storybook)) or [React Styleguidist](https://react-styleguidist.js.org/) ([source](https://github.com/styleguidist/react-styleguidist)) to your project. These are third-party tools that let you develop components and see all their states in isolation from your app.

![Storybook for React Demo](http://i.imgur.com/7CIAWpB.gif)

You can also deploy your Storybook or style guide as a static app. This way, everyone in your team can view and review different states of UI components without starting a backend server or creating an account in your app.

### Getting Started with Storybook

Storybook is a development environment for React UI components. It allows you to browse a component library, view the different states of each component, and interactively develop and test components.

First, install the following npm package globally:

```sh
npm install -g @storybook/cli
```

Then, run the following command inside your app's directory:

```sh
getstorybook
```

After that, follow the instructions on the screen.

Learn more about React Storybook:

* Screencast: [Getting Started with React Storybook](https://egghead.io/lessons/react-getting-started-with-react-storybook)
* [GitHub Repo](https://github.com/storybooks/storybook)
* [Documentation](https://storybook.js.org/basics/introduction/)
* [Snapshot Testing UI](https://github.com/storybooks/storybook/tree/master/addons/storyshots) with Storybook + addon/storyshot

### Getting Started with Styleguidist

Styleguidist combines a style guide, where all your components are presented on a single page with their props documentation and usage examples, with an environment for developing components in isolation, similar to Storybook. In
styleguidist you write examples in markdown where each code snippet is rendered as a live editable playground first install styleguidist sh npm install save react styleguidist alternatively you may use yarn sh yarn add react styleguidist then add these scripts to your package json diff scripts styleguide styleguidist server styleguide build styleguidist build start react scripts start then run the following command inside your app s directory sh npm run styleguide after that follow the instructions on the screen learn more about react styleguidist github repo https github com styleguidist react styleguidist documentation https react styleguidist js org docs getting started html publishing components to npm create react app doesn t provide any built in functionality to publish a component to npm if you re ready to extract a component from your project so other people can use it we recommend moving it to a separate directory outside of your project and then using a tool like nwb https github com insin nwb react components and libraries to prepare it for publishing making a progressive web app by default the production build is a fully functional offline first progressive web app https developers google com web progressive web apps progressive web apps are faster and more reliable than traditional web pages and provide an engaging mobile experience all static site assets are cached so that your page loads fast on subsequent visits regardless of network connectivity such as 2g or 3g updates are downloaded in the background your app will work regardless of network state even if offline this means your users will be able to use your app at 10 000 feet and on the subway on mobile devices your app can be added directly to the user s home screen app icon and all you can also re engage users using web push notifications this eliminates the need for the app store the sw precache webpack plugin https github com goldhand sw precache webpack plugin is integrated into production 
configuration and it will take care of generating a service worker file that will automatically precache all of your local assets and keep them up to date as you deploy updates the service worker will use a cache first strategy https developers google com web fundamentals instant and offline offline cookbook cache falling back to network for handling all requests for local assets including the initial html ensuring that your web app is reliably fast even on a slow or unreliable network opting out of caching if you would prefer not to enable service workers prior to your initial production deployment then remove the call to registerserviceworker from src index js src index js if you had previously enabled service workers in your production deployment and have decided that you would like to disable them for all your existing users you can swap out the call to registerserviceworker in src index js src index js first by modifying the service worker import javascript import unregister from registerserviceworker and then call unregister instead after the user visits a page that has unregister the service worker will be uninstalled note that depending on how service worker js is served it may take up to 24 hours for the cache to be invalidated offline first considerations 1 service workers require https https developers google com web fundamentals getting started primers service workers you need https although to facilitate local testing that policy does not apply to localhost http stackoverflow com questions 34160509 options for testing service workers via http 34161385 34161385 if your production web server does not support https then the service worker registration will fail but the rest of your web app will remain functional 1 service workers are not currently supported https jakearchibald github io isserviceworkerready in all web browsers service worker registration won t be attempted src registerserviceworker js on browsers that lack support 1 the service worker is 
only enabled in the production environment deployment e g the output of npm run build it s recommended that you do not enable an offline first service worker in a development environment as it can lead to frustration when previously cached assets are used and do not include the latest changes you ve made locally 1 if you need to test your offline first service worker locally build the application using npm run build and run a simple http server from your build directory after running the build script create react app will give instructions for one way to test your production build locally and the deployment instructions deployment have instructions for using other methods be sure to always use an incognito window to avoid complications with your browser cache 1 if possible configure your production environment to serve the generated service worker js with http caching disabled http stackoverflow com questions 38843970 service worker javascript update frequency every 24 hours if that s not possible github pages github pages for instance does not allow you to change the default 10 minute http cache lifetime then be aware that if you visit your production site and then revisit again before service worker js has expired from your http cache you ll continue to get the previously cached assets from the service worker if you have an immediate need to view your updated production deployment performing a shift refresh will temporarily disable the service worker and retrieve all assets from the network 1 users aren t always familiar with offline first web apps it can be useful to let the user know https developers google com web fundamentals instant and offline offline ux inform the user when the app is ready for offline consumption when the service worker has finished populating your caches showing a this web app works offline message and also let them know when the service worker has fetched the latest updates that will be available the next time they load the page showing 
a "New content is available; please refresh." message. Showing these messages is currently left as an exercise to the developer, but as a starting point, you can make use of the logic included in [`src/registerServiceWorker.js`](src/registerServiceWorker.js), which demonstrates which service worker lifecycle events to listen for to detect each scenario, and which, as a default, just logs appropriate messages to the JavaScript console.

1. By default, the generated service worker file will not intercept or cache any cross-origin traffic, like HTTP [API requests](#integrating-with-an-api-backend), images, or embeds loaded from a different domain. If you would like to use a runtime caching strategy for those requests, you can `eject` (`npm run eject`) and then configure the [`runtimeCaching`](https://github.com/GoogleChrome/sw-precache#runtimecaching-arrayobject) option in the `SWPrecacheWebpackPlugin` section of [`webpack.config.prod.js`](config/webpack.config.prod.js).

### Progressive Web App Metadata

The default configuration includes a web app manifest located at [`public/manifest.json`](public/manifest.json), that you can customize with details specific to your web application.

When a user adds a web app to their homescreen using Chrome or Firefox on Android, the metadata in [`manifest.json`](public/manifest.json) determines what icons, names, and branding colors to use when the web app is displayed. [The Web App Manifest guide](https://developers.google.com/web/fundamentals/engage-and-retain/web-app-manifest/) provides more context about what each field means, and how your customizations will affect your users' experience.

## Analyzing the Bundle Size

[Source map explorer](https://www.npmjs.com/package/source-map-explorer) analyzes JavaScript bundles using the source maps. This helps you understand where code bloat is coming from.

To add Source map explorer to a Create React App project, follow these steps:

```sh
npm install --save source-map-explorer
```

Alternatively you may use `yarn`:

```sh
yarn add source-map-explorer
```

Then in `package.json`, add the following line
to scripts diff scripts analyze source map explorer build static js main start react scripts start build react scripts build test react scripts test env jsdom then to analyze the bundle run the production build then run the analyze script npm run build npm run analyze deployment npm run build creates a build directory with a production build of your app set up your favorite http server so that a visitor to your site is served index html and requests to static paths like static js main hash js are served with the contents of the static js main hash js file static server for environments using node https nodejs org the easiest way to handle this would be to install serve https github com zeit serve and let it handle the rest sh npm install g serve serve s build the last command shown above will serve your static site on the port 5000 like many of serve https github com zeit serve s internal settings the port can be adjusted using the p or port flags run this command to get a full list of the options available sh serve h other solutions you don t necessarily need a static server in order to run a create react app project in production it works just as fine integrated into an existing dynamic one here s a programmatic example using node https nodejs org and express http expressjs com javascript const express require express const path require path const app express app use express static path join dirname build app get function req res res sendfile path join dirname build index html app listen 9000 the choice of your server software isn t important either since create react app is completely platform agnostic there s no need to explicitly use node the build folder with static assets is the only output produced by create react app however this is not quite enough if you use client side routing read the next section if you want to support urls like todos 42 in your single page app serving apps with client side routing if you use routers that use the html5 pushstate 
history api https developer mozilla org en us docs web api history api adding and modifying history entries under the hood for example react router https github com reacttraining react router with browserhistory many static file servers will fail for example if you used react router with a route for todos 42 the development server will respond to localhost 3000 todos 42 properly but an express serving a production build as above will not this is because when there is a fresh page load for a todos 42 the server looks for the file build todos 42 and does not find it the server needs to be configured to respond to a request to todos 42 by serving index html for example we can amend our express example above to serve index html for any unknown paths diff app use express static path join dirname build app get function req res app get function req res res sendfile path join dirname build index html if you re using apache http server https httpd apache org you need to create a htaccess file in the public folder that looks like this options multiviews rewriteengine on rewritecond request filename f rewriterule index html qsa l it will get copied to the build folder when you run npm run build if you re using apache tomcat http tomcat apache org you need to follow this stack overflow answer https stackoverflow com a 41249464 4878474 now requests to todos 42 will be handled correctly both in development and in production on a production build and in a browser that supports service workers https developers google com web fundamentals getting started primers service workers the service worker will automatically handle all navigation requests like for todos 42 by serving the cached copy of your index html this service worker navigation routing can be configured or disabled by eject ing npm run eject and then modifying the navigatefallback https github com googlechrome sw precache navigatefallback string and navigatefallbackwhitelist https github com googlechrome sw precache 
navigatefallbackwhitelist arrayregexp options of the swpreacheplugin configuration config webpack config prod js when users install your app to the homescreen of their device the default configuration will make a shortcut to index html this may not work for client side routers which expect the app to be served from edit the web app manifest at public manifest json public manifest json and change start url to match the required url scheme for example js start url building for relative paths by default create react app produces a build assuming your app is hosted at the server root br to override this specify the homepage in your package json for example js homepage http mywebsite com relativepath this will let create react app correctly infer the root path to use in the generated html file note if you are using react router 4 you can root link s using the basename prop on any router br more information here https reacttraining com react router web api browserrouter basename string br br for example js browserrouter basename calendar link to today renders a href calendar today serving the same build from different paths note this feature is available with react scripts 0 9 0 and higher if you are not using the html5 pushstate history api or not using client side routing at all it is unnecessary to specify the url from which your app will be served instead you can put this in your package json js homepage this will make sure that all the asset paths are relative to index html you will then be able to move your app from http mywebsite com to http mywebsite com relativepath or even http mywebsite com relative path without having to rebuild it azure https azure microsoft com see this https medium com to pe deploying create react app on microsoft azure c0f6686a4321 blog post on how to deploy your react app to microsoft azure see this https medium com strid host create react app on azure 986bc40d5bf2 pycfnafbg blog post or this https github com ulrikaugustsson azure 
appservice static repo for a way to use automatic deployment to azure app service firebase https firebase google com install the firebase cli if you haven t already by running npm install g firebase tools sign up for a firebase account https console firebase google com and create a new project run firebase login and login with your previous created firebase account then run the firebase init command from your project s root you need to choose the hosting configure and deploy firebase hosting sites and choose the firebase project you created in the previous step you will need to agree with database rules json being created choose build as the public directory and also agree to configure as a single page app by replying with y sh project setup first let s associate this project directory with a firebase project you can create multiple project aliases by running firebase use add but for now we ll just set up a default project what firebase project do you want to associate as default example app example app fd690 database setup firebase realtime database rules allow you to define how your data should be structured and when your data can be read from and written to what file should be used for database rules database rules json database rules for example app fd690 have been downloaded to database rules json future modifications to database rules json will update database rules when you run firebase deploy hosting setup your public directory is the folder relative to your project directory that will contain hosting assets to uploaded with firebase deploy if you have a build process for your assets use your build s output directory what do you want to use as your public directory build configure as a single page app rewrite all urls to index html yes wrote build index html i writing configuration info to firebase json i writing project information to firebaserc firebase initialization complete important you need to set proper http caching headers for service worker js 
file in your `firebase.json`, or you will not be able to see changes after the first deployment ([issue #2440](https://github.com/facebookincubator/create-react-app/issues/2440)). It should be added inside the `"hosting"` key, like so:

```json
{
  "hosting": {
    ...
    "headers": [
      {
        "source": "/service-worker.js",
        "headers": [
          {"key": "Cache-Control", "value": "no-cache"}
        ]
      }
    ]
    ...
  }
}
```

Now, after you create a production build with `npm run build`, you can deploy it by running `firebase deploy`.

```sh
=== Deploying to 'example-app-fd690'...

i  deploying database, hosting
✔  database: rules ready to deploy.
i  hosting: preparing build directory for upload...
Uploading: 75%
✔  hosting: build folder uploaded successfully
✔  hosting: 8 files uploaded successfully
i  starting release process (may take several minutes)...

✔  Deploy complete!

Project Console: https://console.firebase.google.com/project/example-app-fd690/overview
Hosting URL: https://example-app-fd690.firebaseapp.com
```

For more information see [Add Firebase to your JavaScript Project](https://firebase.google.com/docs/web/setup).

### [GitHub Pages](https://pages.github.com/)

>Note: this feature is available with `react-scripts@0.2.0` and higher.

#### Step 1: Add `homepage` to `package.json`

**The step below is important!** If you skip it, your app will not deploy correctly.

Open your `package.json` and add a `homepage` field for your project:

```json
"homepage": "https://myusername.github.io/my-app",
```

or for a GitHub user page:

```json
"homepage": "https://myusername.github.io",
```

Create React App uses the `homepage` field to determine the root URL in the built HTML file.

#### Step 2: Install `gh-pages` and add `deploy` to `scripts` in `package.json`

Now, whenever you run `npm run build`, you will see a cheat sheet with instructions on how to deploy to GitHub Pages.

To publish it at [https://myusername.github.io/my-app](https://myusername.github.io/my-app), run:

```sh
npm install --save gh-pages
```

Alternatively you may use `yarn`:

```sh
yarn add gh-pages
```

Add the following scripts in your `package.json`:

```diff
  "scripts": {
+   "predeploy": "npm run build",
+   "deploy": "gh-pages -d build",
    "start": "react-scripts start",
    "build": "react-scripts build",
```

The `predeploy` script will run automatically before
deploy is run if you are deploying to a github user page instead of a project page you ll need to make two additional modifications 1 first change your repository s source branch to be any branch other than master 1 additionally tweak your package json scripts to push deployments to master diff scripts predeploy npm run build deploy gh pages d build deploy gh pages b master d build step 3 deploy the site by running npm run deploy then run sh npm run deploy step 4 ensure your project s settings use gh pages finally make sure github pages option in your github project settings is set to use the gh pages branch img src http i imgur com hujer9l png width 500 alt gh pages branch setting step 5 optionally configure the domain you can configure a custom domain with github pages by adding a cname file to the public folder notes on client side routing github pages doesn t support routers that use the html5 pushstate history api under the hood for example react router using browserhistory this is because when there is a fresh page load for a url like http user github io todomvc todos 42 where todos 42 is a frontend route the github pages server returns 404 because it knows nothing of todos 42 if you want to add a router to a project hosted on github pages here are a couple of solutions you could switch from using html5 history api to routing with hashes if you use react router you can switch to hashhistory for this effect but the url will be longer and more verbose for example http user github io todomvc todos 42 k yknaj read more https reacttraining com react router web api router about different history implementations in react router alternatively you can use a trick to teach github pages to handle 404 by redirecting to your index html page with a special redirect parameter you would need to add a 404 html file with the redirection code to the build folder before deploying your project and you ll need to add code handling the redirect parameter to index html you can find 
a detailed explanation of this technique in [this guide](https://github.com/rafrex/spa-github-pages).

#### Troubleshooting

##### "/dev/tty: No such a device or address"

If, when deploying, you get `/dev/tty: No such a device or address` or a similar error, try the following:

1. Create a new [Personal Access Token](https://github.com/settings/tokens).
2. `git remote set-url origin https://<user>:<token>@github.com/<user>/<repo>`
3. Try `npm run deploy` again.

### [Heroku](https://www.heroku.com/)

Use the [Heroku Buildpack for Create React App](https://github.com/mars/create-react-app-buildpack). You can find instructions in [Deploying React with Zero Configuration](https://blog.heroku.com/deploying-react-with-zero-configuration).

#### Resolving Heroku Deployment Errors

Sometimes `npm run build` works locally but fails during deploy via Heroku. Following are the most common cases.

##### "Module not found: Error: Cannot resolve 'file' or 'directory'"

If you get something like this:

```
remote: Failed to create a production build. Reason:
remote: Module not found: Error: Cannot resolve 'file' or 'directory'
        MyDirectory in /tmp/build_1234/src
```

it means you need to ensure that the lettercase of the file or directory you `import` matches the one you see on your filesystem or on GitHub.

This is important because Linux (the operating system used by Heroku) is case sensitive. So `MyDirectory` and `mydirectory` are two distinct directories, and thus, even though the project builds locally, the difference in case breaks the `import` statements on Heroku remotes.

##### "Could not find a required file."

If you exclude or ignore necessary files from the package, you will see an error similar to this one:

```
remote: Could not find a required file.
remote:   Name: index.html
remote:   Searched in: /tmp/build_a2875fc163b209225122d68916f1d4df/public
remote:
remote: npm ERR! Linux 3.13.0-105-generic
remote: npm ERR! argv "/tmp/build_a2875fc163b209225122d68916f1d4df/.heroku/node/bin/node" "/tmp/build_a2875fc163b209225122d68916f1d4df/.heroku/node/bin/npm" "run" "build"
```

In this case, ensure that the file is there with the proper lettercase and that's not
ignored on your local gitignore or gitignore global netlify https www netlify com to do a manual deploy to netlify s cdn sh npm install netlify cli g netlify deploy choose build as the path to deploy to setup continuous delivery with this setup netlify will build and deploy when you push to git or open a pull request 1 start a new netlify project https app netlify com signup 2 pick your git hosting service and select your repository 3 set yarn build as the build command and build as the publish directory 4 click deploy site support for client side routing to support pushstate make sure to create a public redirects file with the following rewrite rules index html 200 when you build the project create react app will place the public folder contents into the build output now https zeit co now now offers a zero configuration single command deployment you can use now to deploy your app for free 1 install the now command line tool either via the recommended desktop tool https zeit co download or via node with npm install g now 2 build your app by running npm run build 3 move into the build directory by running cd build 4 run now name your project name from within the build directory you will see a now sh url in your output like this ready https your project name tpspyhtdtk now sh copied to clipboard paste that url into your browser when the build is complete and you will see your deployed app details are available in this article https zeit co blog unlimited static s3 https aws amazon com s3 and cloudfront https aws amazon com cloudfront see this blog post https medium com omgwtfmarc deploying create react app to s3 or cloudfront 48dae4ce0af on how to deploy your react app to amazon web services s3 and cloudfront surge https surge sh install the surge cli if you haven t already by running npm install g surge run the surge command and log in you or create a new account when asked about the project path make sure to specify the build folder for example sh project path path 
to project build note that in order to support routers that use html5 pushstate api you may want to rename the index html in your build folder to 200 html before deploying to surge this ensures that every url falls back to that file https surge sh help adding a 200 page for client side routing advanced configuration you can adjust various development and production settings by setting environment variables in your shell or with env adding development environment variables in env variable development production usage browser white check mark x by default create react app will open the default system browser favoring chrome on macos specify a browser https github com sindresorhus opn app to override this behavior or set it to none to disable it completely if you need to customize the way the browser is launched you can specify a node script instead any arguments passed to npm start will also be passed to this script and the url where your app is served will be the last argument your script s file name must have the js extension host white check mark x by default the development web server binds to localhost you may use this variable to specify a different host port white check mark x by default the development web server will attempt to listen on port 3000 or prompt you to attempt the next available port you may use this variable to specify a different port https white check mark x when set to true create react app will run the development server in https mode public url x white check mark create react app assumes your application is hosted at the serving web server s root or a subpath as specified in package json homepage building for relative paths normally create react app ignores the hostname you may use this variable to force assets to be referenced verbatim to the url you provide hostname included this may be particularly useful when using a cdn to host your application ci large orange diamond white check mark when set to true create react app treats warnings 
as failures in the build it also makes the test runner non watching most cis set this flag by default react editor white check mark x when an app crashes in development you will see an error overlay with clickable stack trace when you click on it create react app will try to determine the editor you are using based on currently running processes and open the relevant source file you can send a pull request to detect your editor of choice https github com facebookincubator create react app issues 2636 setting this environment variable overrides the automatic detection if you do it make sure your systems path https en wikipedia org wiki path variable environment variable points to your editor s bin folder you can also set it to none to disable it completely chokidar usepolling white check mark x when set to true the watcher runs in polling mode as necessary inside a vm use this option if npm start isn t detecting changes generate sourcemap x white check mark when set to false source maps are not generated for a production build this solves oom issues on some smaller machines node path white check mark white check mark same as node path in node js https nodejs org api modules html modules loading from the global folders but only relative folders are allowed can be handy for emulating a monorepo setup by setting node path src troubleshooting npm start doesn t detect changes when you save a file while npm start is running the browser should refresh with the updated code br if this doesn t happen try one of the following workarounds if your project is in a dropbox folder try moving it out if the watcher doesn t see a file called index js and you re referencing it by the folder name you need to restart the watcher https github com facebookincubator create react app issues 1164 due to a webpack bug some editors like vim and intellij have a safe write feature that currently breaks the watcher you will need to disable it follow the instructions in adjusting your text editor 
https webpack js org guides development adjusting your text editor if your project path contains parentheses try moving the project to a path without them this is caused by a webpack watcher bug https github com webpack watchpack issues 42 on linux and macos you might need to tweak system settings https github com webpack docs wiki troubleshooting not enough watchers to allow more watchers if the project runs inside a virtual machine such as a vagrant provisioned virtualbox create an env file in your project directory if it doesn t exist and add chokidar usepolling true to it this ensures that the next time you run npm start the watcher uses the polling mode as necessary inside a vm if none of these solutions help please leave a comment in this thread https github com facebookincubator create react app issues 659 npm test hangs on macos sierra if you run npm test and the console gets stuck after printing react scripts test env jsdom to the console there might be a problem with your watchman https facebook github io watchman installation as described in facebookincubator create react app 713 https github com facebookincubator create react app issues 713 we recommend deleting node modules in your project and running npm install or yarn if you use it first if it doesn t help you can try one of the numerous workarounds mentioned in these issues facebook jest 1767 https github com facebook jest issues 1767 facebook watchman 358 https github com facebook watchman issues 358 ember cli ember cli 6259 https github com ember cli ember cli issues 6259 it is reported that installing watchman 4 7 0 or newer fixes the issue if you use homebrew http brew sh you can run these commands to update it watchman shutdown server brew update brew reinstall watchman you can find other installation methods https facebook github io watchman docs install html build install on the watchman documentation page if this still doesn t help try running launchctl unload f library launchagents com 
github facebook watchman plist there are also reports that uninstalling watchman fixes the issue so if nothing else helps remove it from your system and try again npm run build exits too early it is reported that npm run build can fail on machines with limited memory and no swap space which is common in cloud environments even with small projects this command can increase ram usage in your system by hundreds of megabytes so if you have less than 1 gb of available memory your build is likely to fail with the following message the build failed because the process exited too early this probably means the system ran out of memory or someone called kill 9 on the process if you are completely sure that you didn t terminate the process consider adding some swap space https www digitalocean com community tutorials how to add swap on ubuntu 14 04 to the machine you re building on or build the project locally npm run build fails on heroku this may be a problem with case sensitive filenames please refer to this section resolving heroku deployment errors moment js locales are missing if you use a moment js https momentjs com you might notice that only the english locale is available by default this is because the locale files are large and you probably only need a subset of all the locales provided by moment js https momentjs com multiple locale support to add a specific moment js locale to your bundle you need to import it explicitly br for example js import moment from moment import moment locale fr if import multiple locales this way you can later switch between them by calling moment locale with the locale name js import moment from moment import moment locale fr import moment locale es moment locale fr this will only work for locales that have been explicitly imported before npm run build fails to minify some third party packages don t compile their code to es5 before publishing to npm this often causes problems in the ecosystem because neither browsers except for most 
modern versions nor some tools currently support all es6 features we recommend to publish code on npm as es5 at least for a few more years br to resolve this 1 open an issue on the dependency s issue tracker and ask that the package be published pre compiled note create react app can consume both commonjs and es modules for node js compatibility it is recommended that the main entry point is commonjs however they can optionally provide an es module entry point with the module field in package json note that even if a library provides an es modules version it should still precompile other es6 features to es5 if it intends to support older browsers 2 fork the package and publish a corrected version yourself 3 if the dependency is small enough copy it to your src folder and treat it as application code in the future we might start automatically compiling incompatible third party modules but it is not currently supported this approach would also slow down the production builds alternatives to ejecting ejecting npm run eject lets you customize anything but from that point on you have to maintain the configuration and scripts yourself this can be daunting if you have many similar projects in such cases instead of ejecting we recommend to fork react scripts and any other packages you need this article https auth0 com blog how to configure create react app dives into how to do it in depth you can find more discussion in this issue https github com facebookincubator create react app issues 682 something missing if you have ideas for more how to recipes that should be on this page let us know https github com facebookincubator create react app issues or contribute some https github com facebookincubator create react app edit master packages react scripts template readme md | front_end |
|
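The Create React App section above ends with a table of advanced-configuration environment variables; as that text notes, they can be set in the shell or in a `.env` file. A hypothetical `.env` combining several of them; the variable names come from the table, the values are purely illustrative:

```
# Illustrative values only; variable names are from the advanced-configuration table.
# Don't auto-open a browser on `npm start`:
BROWSER=none
# Development server port (the default is 3000):
PORT=3001
# Run the development server in HTTPS mode:
HTTPS=true
# Use the polling watcher, e.g. when the project runs inside a VM:
CHOKIDAR_USEPOLLING=true
# Skip source map generation for production builds:
GENERATE_SOURCEMAP=false
```
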
Full-Stack-Development-Learning-Path | full stack development learning path full stack development learning path https socialify git ci kaiwalyakoparkar full stack development learning path image description 1 font inter forks 1 issues 1 language 1 name 1 owner 1 pattern solid pulls 1 stargazers 1 theme dark this repo contains all the things which i practice while on learning path to become a full stack developer the learning module is divided in to 54 weeks and into front end and backend this repo contains the 2 main folders frontend backend and every folder will contain the week wise and topic wise segregation of the tasks performed by me index front end html css practice project front end html css practice project social proof section master front end html css practice project social proof section master html5 css advance front end html5 css advance html5 css3 basics front end html5 css3 basics google landing page project front end html5 css3 basics google landing page project http basics front end http basics javascript basics front end javascript basics javascript intermidiate front end javascript intermidiate api front end javascript intermidiate api material ui front end material 20ui practice projects front end material 20ui practice 20projects anime suggester front end material 20ui practice 20projects anime suggester spotify clone front end material 20ui practice 20projects spotify clone the net ninja tutorial front end material 20ui the 20net 20ninja 20tutorial react js front end react 20js code with harry tutorial front end react 20js code with harry tutorial textutils front end react 20js code with harry tutorial textutils practice projects front end react 20js practice projects dog info front end react 20js practice projects dog info news monkey front end react 20js practice projects news monkey portfolio front end react 20js practice projects portfolio portfolio2 front end react 20js practice projects portfolio2 tailwind css front end tailwind 20css 
code with harry tutorial front end tailwind 20css code 20with 20harry 20tutorial react js x tailwind css front end tailwind 20css react 20js 20x 20tailwind 20css landing page vanilla js playlist front end vanilla js playlist back end jonas node mongodb course back end jonas 20node 2bmongodb 20course nodejs expressjs back end jonas 20node 2bmongodb 20course nodejs 20 2b 20expressjs express api back end jonas 20node 2bmongodb 20course nodejs 20 2b 20expressjs express api mongodb back end mongodb mongoosetutorial back end mongodb mongoosetutorial nodejs back end nodejs api front end part back end nodejs api 28front end 20part 29 json tutorial back end nodejs json tutorial practice projects back end nodejs practice projects blogsite back end nodejs practice projects blogsite blogsite1 back end nodejs practice projects blogsite1 generalstore back end nodejs practice projects generalstore movieflix back end nodejs practice projects movieflix tutorial 1 codewithharry 1 video back end nodejs tutorial 1 codewithharry 281 video 29 tutorial 1 codewithharry back end nodejs tutorial 1 codewithharry tutorial 2 codewithharry series back end nodejs tutorial 2 codewithharry 28series 29 1 creating server back end nodejs tutorial 2 codewithharry 28series 29 1 creating server 2 node modules back end nodejs tutorial 2 codewithharry 28series 29 2 node modules 3 serving html files back end nodejs tutorial 2 codewithharry 28series 29 3 serving html files 4 creating custom backend back end nodejs tutorial 2 codewithharry 28series 29 4 creating custom backend 5 creating custom modules back end nodejs tutorial 2 codewithharry 28series 29 5 creating custom modules 6 npm package manager back end nodejs tutorial 2 codewithharry 28series 29 6 npm package manager 7 create first express app back end nodejs tutorial 2 codewithharry 28series 29 7 create first express app 8 starting with template engine back end nodejs tutorial 2 codewithharry 28series 29 8 starting with template engine tutorial 1 
codewithharry back end tutorial 1 codewithharry api practice back end api practice bookstore back end api practice bookstore learning kubernetes deployment back end api practice learning kubernetes deployment | frontend backend learning-by-doing learning 365daysofcode docker docker-compose kubernetes k8s fullstack | front_end |
IOTFuzzer_Full

## TriforceAFL_New

A tool for simulation, dynamic analysis and fuzzing of IoT firmware; a combination of TriforceAFL, Firmadyne and DECAF. DECAF is upgraded to the newest QEMU version (2.10.1); it is included in the `qemu_mode/qemu` dir. In our case, run:

```
./configure --target-list=mipsel-softmmu
make
```

## Firmadyne

We use its custom kernel and libnvram to emulate IoT firmware.

```
cd firmadyne
```

See the README in `firmadyne` and do as it says. Notice: you need to set `FIRMWARE_DIR` in `firmadyne.config`. Here we test `DIR-815 firmware 1.01.zip`, a router firmware image based on the MIPSel CPU arch. Run:

```
qemu_mode/qemu/qemu-img convert -f raw -O qcow2 scratch/2/image.raw scratch/2/image.qcow2
```

Finally, we replace the `run.sh` in `scratch/<num>` with our modified one in the `firmadyne_dev` dir.

## TriforceAFL

AFL fuzzing with full-system emulation; run `make`.

## Usage

```
cd firmadyne
scratch/<num>/run.sh
```

In another terminal, run `telnet 127.0.0.1 4444` to get into the QEMU monitor console.

### FirmFuzzer plugin

```
load_plugin qemu_mode/qemu/plugins/callbacktests/callbacktests.so
do_callbacktests httpd
do_callbacktests hedwig.cgi
```

When firmware system initialization is completed and the `poll` system call is executed, open a browser and type a request (`192.168.0.1/hedwig.cgi`) in the URL; the fuzz process will be started.

### MalScalpel plugin

```
load_plugin qemu_mode/qemu/plugins/unpacker/unpacker.so
trace_by_name mirai.mpsl
```

Then telnet into the system (`telnet 192.168.0.1`, with username `alphanetworks` and password `wrgnd08 dlob dir815`), load and run `mirai.mpsl`; the plugin works.
 | server
|
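The HTTP trigger described in the IOTFuzzer_Full notes above (requesting `192.168.0.1/hedwig.cgi` once the emulated firmware's web server is up) can also be scripted instead of typed into a browser. A sketch using Python's standard `urllib`; the URL comes from the text, while the timeout and error handling are assumptions:

```python
from urllib import request, error

def trigger_fuzz(url="http://192.168.0.1/hedwig.cgi", timeout=5):
    """Send the initial request that kicks off the fuzzing run.

    Returns the HTTP status code, or None if the emulated device is
    not reachable yet (for example, the firmware is still booting).
    """
    try:
        with request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except (error.URLError, OSError):
        return None
```

Calling `trigger_fuzz()` in a retry loop is a convenient way to wait for the emulated web server to come up before starting a run.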
LLM_HDL_Design | llm hdl design some hardware design experiments using large language models | ai |
|
StudSysCppLibrary | studsyscpplibrary the library for a student management system coded in c this repository provides the library for the code behind the student management system it provides a means for separation of backend implementation from ui implementation this has been done so the library can be used with any ui whether it is cli or gui rather than duplicating code how to compile change directory into the root of project you cloned and open a terminal there and type make you can then copy the libstudsys so file to usr lib studsys or in a separate directory and specify the path to it with l path flag how to use you can copy all the headers into a studsys folder in usr include or into a different directory and then include the headers required when compiling the ui code make sure you specify lstudsys and if necessary l path where path is the location of the libstudsys so for the sql files when setting up the database you need to set up a user that has full access to the database that will be used you can then import the sql files using on linux mysql u user password password database name sql file sql see database setup database setup you need a mysql server either mysql or mariadb steps 1 login to mysql using mysql u user name p this user must be able to create databases and users 2 type create database database name 3 exit the mysql shell 4 now type mysql u user name password user password database name file name sql you will need to do this with schema sql triggers sql and procedures sql 5 log back into the mysql shell and create a user for the system to use and password by typing create user user name localhost identified by user password 6 then type grant all privileges on database name to user name 7 then type flush privileges and then exit 8 ensure bind address in mysql config is set to 0 0 0 0 on the database server machine the mysql server is now set up and ready to use with the system logging to enable logging you need to set the environment 
variable `STUD_LOGS` with the location of where you want the log file saved. On Linux, you can do this with:

```
export STUD_LOGS=/path/to/store/log
```

The log is saved in this path as `studsys.log`. To disable it, unset the environment variable (on Linux: `unset STUD_LOGS`).

## Documentation

To generate documentation you must have doxygen and graphviz installed. From the root of the project, run `doxygen doxy.cfg`.
 | student-management | server
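The database-setup steps described in the StudSysCppLibrary notes above can be collected into a single session. A sketch in the repository's own SQL, where `studsys_db`, `studsys_user` and `change_me` are hypothetical example values standing in for the placeholders in the text:

```sql
-- Run as a MySQL/MariaDB user that may create databases and users.
CREATE DATABASE studsys_db;
CREATE USER 'studsys_user'@'localhost' IDENTIFIED BY 'change_me';
GRANT ALL PRIVILEGES ON studsys_db.* TO 'studsys_user'@'localhost';
FLUSH PRIVILEGES;
```

The schema is then imported from a shell, e.g. `mysql -u studsys_user --password=change_me studsys_db < schema.sql`, repeated for `triggers.sql` and `procedures.sql`.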
Altschool-cloud-exercises | # AltSchool cloud engineering exercises

Documenting my cloud engineering journey at AltSchool Africa.

- **Exercise 1.** Task: set up Ubuntu 20.04 on your local machine using Vagrant. Instruction: customize your Vagrantfile as necessary, with the private network set to DHCP. Once the machine is up, run `ifconfig` and share the output in your submission, along with your Vagrantfile, in a folder for this exercise.
- **Exercise 2.** Task: research online for 10 more Linux commands aside from the ones already mentioned in this module. Submit using your altschool-cloud-exercises project, explaining what each command is used for, with examples of how to use each and example screenshots of using each of them. Instruction: submit your work in a folder for this exercise in your altschool-cloud-exercises project; you will need to learn how to embed images in Markdown files.
- **Exercise 3.** Task: create 2 Vagrant machines, A and B, on the same private network; create an SSH key and log into B from A using the SSH key. Instruction: submit your script.
- **Exercise 4.** Task: create 3 groups (admin, support, engineering) and add the admin group to sudoers; create a user in each of the groups; generate SSH keys for the user in the admin group. Instruction: submit the content of /etc/passwd, /etc/group and /etc/sudoers.
- **Exercise 5.** Task: install PHP 7.4 on your local Linux machine using the ppa:ondrej/php package repo. Instruction: learn how to use the `add-apt-repository` command; submit the content of /etc/apt/sources.list and the output of the `php -v` command.
- **Exercise 6.** Task: you already have a GitHub account; also set up a GitLab account if you don't have one already. You already have an altschool-cloud-exercises project; clone it to your local system and set up your name and email in the git global config. Instruction: submit the output of `git config -l`, `git remote -v` and `git log`.
- **Exercise 7.** Task: review the CIS benchmark for Ubuntu and try to implement at least 10 of the recommendations made within the benchmark. Instruction: n/a.
- **Exercise 8.** Task: create a bash script that runs every hour, saving system memory (RAM) usage to a specified file; at midnight it sends the content of the file to a specified email address, then starts over for the new day. Instruction: submit the content of your script, the cronjob, and a sample of the email sent, all in the folder for this exercise.
- **Exercise 9.** Task: create an Ansible playbook to set up a server with Apache. The server should be set to the Africa/Lagos timezone and should host an index.php file, as the main file on the server, whose content prints the server time via the PHP `date("F d, Y h:i:s A e")` format. Instruction: submit the Ansible playbook, the output of `systemctl status apache2` after deploying the playbook, and a screenshot of the rendered page.
- **Exercise 10.** Task: for 193.16.20.35/29, what is (1) the network IP, (2) the number of hosts, (3) the range of IP addresses, and (4) the broadcast IP of this subnet? Instruction: submit all your answers as a Markdown file in the folder for this exercise.
- **Mini project.**

| cloud |
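Exercise 10 above is a mechanical subnet calculation; the following POSIX shell sketch works through 193.16.20.35/29 (the variable names are mine, not part of the course material):

```shell
#!/bin/sh
# /29 leaves 32 - 29 = 3 host bits, so the block size in the last octet is 2^3 = 8.
prefix=29
host_bits=$((32 - prefix))          # 3
block=$((1 << host_bits))           # 8 addresses per subnet
net=$((35 / block * block))         # last octet rounded down to a multiple of 8 -> 32
bcast=$((net + block - 1))          # 39
echo "network:   193.16.20.$net/29"
echo "broadcast: 193.16.20.$bcast"
echo "hosts:     $((block - 2)) usable (193.16.20.$((net + 1)) - 193.16.20.$((bcast - 1)))"
```

So the answers are: network 193.16.20.32, broadcast 193.16.20.39, 6 usable hosts in the range 193.16.20.33-193.16.20.38.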
bootplus | # Bootplus v1.0.5

http://aozora.github.io/bootplus - please pay attention that this project is no longer maintained.

Bootplus is a front-end framework for faster and easier web development, inspired by the latest Google look & feel. Created and maintained by aozora (https://twitter.com/aozoralabs). Bootplus is based on Twitter Bootstrap (http://twitter.github.io/bootstrap). To get started, check out http://aozora.github.io/bootplus.

## Quick start

Three quick-start options are available:

- Download the latest release (http://aozora.github.io/bootplus/zipball/master).
- Clone the repo: `git clone git://github.com/aozora/bootplus.git`.
- Read the Getting Started page (http://aozora.github.io/bootplus/getting-started) for information on the framework contents, templates, examples, and more.

## Versioning

For transparency and insight into our release cycle, and for striving to maintain backward compatibility, Bootplus will be maintained under the Semantic Versioning guidelines as much as possible. Releases will be numbered with the format `major.minor.patch`, constructed with the following guidelines:

- Breaking backward compatibility bumps the major (and resets the minor and patch).
- New additions without breaking backward compatibility bump the minor (and reset the patch).
- Bug fixes and misc changes bump the patch.

For more information on SemVer, please visit http://semver.org.

## Bug tracker

Have a bug or a feature request? Please open a new issue (https://github.com/twitter/bootplus/issues). Before opening any issue, please search for existing issues and read the issue guidelines (https://github.com/necolas/issue-guidelines), written by Nicolas Gallagher (https://github.com/necolas).

## Community

Keep track of development and community news: follow @aozoralabs on Twitter (http://twitter.com/aozoralabs).

## Compiling CSS and JavaScript

Bootplus includes a makefile with convenient methods for working with the framework. Before getting started, be sure to install the necessary local dependencies (package.json) with `npm install`. When completed, you'll be able to run the various make commands provided:

- `make` - runs the Recess compiler to rebuild the `.less` files and compiles the docs; requires recess and uglify-js.
- `make test` - runs JSHint and QUnit tests headlessly in PhantomJS (http://code.google.com/p/phantomjs), used for CI; depends on having PhantomJS installed.
- `make watch` - a convenience method for watching just the `.less` files and automatically building them whenever you save; requires the watchr gem.

Should you encounter problems with installing dependencies or running the makefile commands, be sure to first uninstall any previous versions (global and local) you may have installed, and then rerun `npm install`.

## Authors

Marcello Palmitessa (http://twitter.com/aozoralabs, https://github.com/aozora)

## Copyright and license

Bootplus is dual licensed GPL-2 and Apache-2; see the LICENSE file. | bootstrap css | front_end |
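The SemVer bump rules described above can be sketched as a tiny shell helper (`bump` is hypothetical, not a bootplus tool):

```shell
# bump MAJOR.MINOR.PATCH according to the SemVer guidelines above
bump() {  # usage: bump <major.minor.patch> <major|minor|patch>
  maj=${1%%.*}
  rest=${1#*.}
  min=${rest%%.*}
  pat=${rest#*.}
  case "$2" in
    major) maj=$((maj + 1)); min=0; pat=0 ;;  # breaking backward compatibility
    minor) min=$((min + 1)); pat=0 ;;         # backward-compatible addition
    patch) pat=$((pat + 1)) ;;                # bug fix / misc change
  esac
  echo "$maj.$min.$pat"
}
bump 1.0.5 patch   # 1.0.6
bump 1.0.5 minor   # 1.1.0
bump 1.0.5 major   # 2.0.0
```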
CISE_Team_33_SEER | # CISE Team 33 - SEER

SEER database project for assignment 1B of Contemporary Methods of Software Engineering. The purpose of this assignment was to learn and practice the techniques, tools, and skills needed to collaboratively develop software using an agile way of working, from requirements discovery to application deployment. This project was developed with MongoDB, React, Express, and Node.js. We also used Scrum as our way of working for this project.

## What is SEER?

SEER is the Software Engineering Evidence Repository: a searchable database of evidence, curated by SE experts, on the different SE practices that developers can use to develop their product.

## What does SEER do?

A user is able to search and add evidence records to the SEER database. For example, a user who wants to find out whether TDD is a good way of working can search for TDD in the database and get a list of peer-reviewed research articles about TDD, with a summary of what measures and results came from each study on TDD. Secondly, a user is able to add an evidence record to the database: the user provides details for the record, such as the author, the link to the peer-reviewed article, etc. Once the user submits the record, a moderator makes sure that it is peer reviewed and relevant to SE. If so, the moderator accepts the record, and it is passed on to the analyst, who extracts/creates a summary of the information and uploads it to the SEER database. | server |
gccrs | # GCC Rust (gccrs)

Badges: C/C++ CI, GCC Bootstrap Build, Build Docker image, Docker pulls, project chat (https://gcc-rust.zulipchat.com), bors enabled, justforfunnoreally.dev.

**Please note: the compiler is in a very early stage and not usable yet for compiling real Rust programs.**

gccrs is a full alternative implementation of the Rust language on top of GCC, with the goal of becoming fully upstream with the GNU toolchain. The origin of this project was a community effort several years ago, when Rust was still at version 0.9; the language was subject to so much change that it became difficult for a community effort to play catch-up. Now that the language is stable, it is an excellent time to create alternative compilers. The developers of the project are keen Rustaceans with a desire to give back to the Rust community and to learn what GCC is capable of when it comes to a modern language.

## Build farm status

Buildbot status at https://builder.sourceware.org for: debian-i386, debian-ppc64, debian-testing-x86_64, fedora-arm64, fedora-ppc64le, fedora-s390x, fedora-x86_64, opensuse-leap-x86_64, opensuse-tw-x86_64, rawhide-x86_64.

## FAQ

Please find the answers to frequently asked questions over on https://github.com/Rust-GCC/gccrs/wiki/Frequently-Asked-Questions.

## Development environment

### Building

Fetch dependencies for Ubuntu:

```bash
apt install build-essential libgmp3-dev libmpfr-dev libmpc-dev flex bison autogen gcc-multilib dejagnu
```

Clone the repository:

```bash
git clone https://github.com/Rust-GCC/gccrs
```

**Linux:** it is important to remember that GNU toolchain projects are designed to be built outside of their source directory, which is why a build directory is created:

```bash
mkdir gccrs-build
cd gccrs-build
../gccrs/configure --prefix=$HOME/gccrs-install --disable-bootstrap --enable-multilib --enable-languages=rust
make
```

**macOS:** the path of the header dir and the sysroot should be specified when you configure the project:

```bash
mkdir mac-build
cd mac-build
../gccrs/configure --prefix=$HOME/gccrs-install --disable-bootstrap --enable-multilib --enable-languages=rust --with-native-system-header-dir=/usr/include --with-sysroot=/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk
make
```

### Running GCC Rust

Without `make install` we can simply invoke the compiler proper (crab1):

```bash
./gcc/crab1 test.rs -frust-debug -frust-dump-parse -Warray-bounds -dumpbase test.rs \
  -mtune=generic -march=x86-64 -O0 -version -fdump-tree-gimple -o test.s \
  -L/lib/x86_64-linux-gnu -L/lib/../lib64 -L/usr/lib/x86_64-linux-gnu -L/usr/lib/../lib64 \
  -frust-incomplete-and-experimental-compiler-do-not-use
```

To invoke the compiler driver (gccrs), we need to `make install` and then invoke the compiler from the installation directory:

```bash
~/gccrs-install/bin/gccrs -g -O2 -c test.rs -o test.o -frust-incomplete-and-experimental-compiler-do-not-use
~/gccrs-install/bin/gccrs -o test test.o
```

You can also set up your shell to automatically find the installed compiler; for example, for bash, add the following to your ~/.bashrc:

```bash
export PATH=$HOME/gccrs-install/bin:$PATH
```

### Testsuite

Invoke the full testsuite from the build directory (gccrs-build in the previous commands):

```bash
make check-rust
```

Invoke a subset of the testsuite, for example to only run tests that are currently known/expected to fail:

```bash
make check-rust RUNTESTFLAGS="xfail.exp"
```

There are the following sets of tests: compile.exp (compilation tests), execute.exp (execution tests), and xfail.exp (tests that are currently known/expected to fail). Invoke only a specific test:

```bash
make check-rust RUNTESTFLAGS="--all compile.exp=continue1.rs"
```

Logs with the corresponding commands can be found in gccrs-build/gcc/testsuite/rust/rust.log. See the GCC testing documentation (https://gcc.gnu.org/install/test.html) for more details. Test cases are located within gcc/testsuite/rust; please contribute your specific test cases, referencing any issues on GitHub.

### Debugging

**Enabling internal checks.** GCC has several internal checks that can be enabled during configuration; in the case of gccrs, you can enable the following:

```bash
../gccrs/configure --prefix=$HOME/gccrs-install --disable-bootstrap --enable-multilib --enable-languages=rust --enable-checking=gimple,tree,types
```

**gdb.** You can directly invoke gdb on the crab1 compiler process. You can find the exact command by adding -v (verbose) to your gccrs invocation:

```bash
gccrs test.rs -O0 -S -o arithmetic_expressions1.s -v
# => <some path>/crab1 test.rs -quiet -dumpbase arithmetic_expressions1.rs -dumpbase-ext .rs \
#      -mtune=generic -march=x86-64 -O0 -w -version -fdiagnostics-color=never \
#      -fno-diagnostics-show-caret -fno-diagnostics-show-line-numbers -fdiagnostics-urls=never \
#      -fdiagnostics-path-format=separate-events -o test.s \
#      -L/lib/x86_64-linux-gnu -L/lib/../lib64 -L/usr/lib/x86_64-linux-gnu
gdb --args <some path>/crab1 test.rs -quiet ...   # same arguments as printed above
```

Or simply add the `-wrapper gdb,--args` option; this will call each subcommand in gdb, and you simply have to break/debug in crab1:

```bash
gccrs test.rs -O0 -S -o arithmetic_expressions1.s -wrapper gdb,--args
```

### Docker image

There is a Docker image hosted over on https://hub.docker.com/repository/docker/philberty/gccrs:

```bash
docker pull philberty/gccrs
```

Or you can build your own image:

```bash
docker build -t gccrs-dev .
```

If you want to build an object file:

```bash
docker run --rm -v $PWD:/usr/src/myapp -w /usr/src/myapp gccrs-dev:latest gccrs -g -O2 -c gcc/testsuite/rust/compile/torture/type_infer1.rs -o type_infer1.o
```

If you want to build an executable file:

```bash
docker run --rm -v $PWD:/usr/src/myapp -w /usr/src/myapp gccrs-dev:latest gccrs -g -O2 gcc/testsuite/rust/compile/torture/type_infer1.rs -o type_infer1
```

To emit assembly:

```bash
docker run --rm -v $PWD:/usr/src/myapp -w /usr/src/myapp gccrs-dev:latest gccrs -g -O2 gcc/testsuite/rust/compile/torture/type_infer1.rs -S -o type_infer1.s
```

To emit Rust front-end debug output, you may add options like -frust-debug or -frust-dump-all.

## Contributing

If you want to contribute to GCC Rust, you can find more information in CONTRIBUTING.md. Please be aware that this project is designed to be pushed upstream to GCC when we reach certain milestones, which means we require copyright assignment or the Developer's Certificate of Origin sign-off; please see the Contributing to GCC guide (https://gcc.gnu.org/contribute.html) or the DCO sign-off guide (https://gcc.gnu.org/dco.html). Not all contributions must be code: we would love to see new test cases, or bugs and issues reported. Feel free to add any comments on open PRs.

## Continuous integration

When submitting or updating a GitHub pull request, several automated checks are run; generally, a green status is necessary before merge. "Compiler diagnostics" here means diagnostics emitted by the initial compiler used to build GCC Rust. If building a native toolchain, GCC by default does a 3-stage bootstrap build (https://gcc.gnu.org/install/configure.html). In addition to making sure that GCC is able to reproduce itself bit by bit, this also means that stages 2+ are built with -Werror, turning most warnings into error diagnostics (see --enable-werror, possibly enabled by default). This helps to catch a good number of bugs, because it enforces that GCC compiles without compiler diagnostics; it's a requirement for upstream patch submission (https://gcc.gnu.org/contribute.html#testing) that GCC generally is expected to be warning-clean. That only applies without --disable-bootstrap (that is, with the default --enable-bootstrap) for a native build, and not for the initial stage, where the initial compiler is used; otherwise we're at the mercy of whatever initial compiler we're using. Doing a --disable-bootstrap build is much faster, of course, so we're often doing that (for example, per the instructions above, or in the standard CI). With that, we're missing out on the aspect that enforces that GCC compiles without compiler diagnostics. To counter that, the default CI has a "check for new warnings" step (https://github.com/Rust-GCC/gccrs/pull/1026) that verifies, in the CI --disable-bootstrap build configuration, that no new warnings are introduced. If that step fails, it usually points out a new warning you've introduced erroneously and should address. Occasionally it simply means that the .github/bors_log_expected_warnings file needs to be updated, for example due to any kind of environmental change (such as a CI initial compiler change). Unless diligently reproducing the CI configuration (in particular the initial compiler GCC version), it's not really feasible to reproduce this check locally; if in doubt, do a local --enable-bootstrap build, or submit your changes and wait for the CI system's results.

## Community

We can be found on all the usual Rust channels, such as Zulip, but we also have our own channels: GCC Rust Zulip (https://gcc-rust.zulipchat.com), Twitter (https://twitter.com/gcc_rust), the GCC mailing list (https://gcc.gnu.org/mailman/listinfo/gcc-rust), and IRC (irc.oftc.net #gccrust). | gcc gcc-rust rust rust-lang compiler hacktoberfest | front_end |
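The testsuite section above uses a few `make check-rust` invocation shapes; a hypothetical helper that just prints the right command for each case (the wrapper and its behavior are mine, not part of gccrs):

```shell
# print the testsuite invocation for: everything, one .exp set, or one test case
rust_check() {  # usage: rust_check [exp-file] [testcase.rs]
  if [ $# -eq 0 ]; then
    echo 'make check-rust'
  elif [ $# -eq 1 ]; then
    echo "make check-rust RUNTESTFLAGS=\"$1\""
  else
    echo "make check-rust RUNTESTFLAGS=\"$1=$2\""
  fi
}
rust_check                           # full testsuite
rust_check xfail.exp                 # only the expected-failure set
rust_check compile.exp continue1.rs  # a single test case
```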
blockchain | # Rust blockchain

Crates.io (https://crates.io/crates/blockchain), documentation (https://docs.rs/blockchain).

Rust blockchain is an unopinionated blockchain framework that helps you to develop a blockchain project.

## Chain

The chain module handles block import and state storage. The assumptions we have in this module: we have a *block*, which consists of a hash and has a parent block; this forms a chain. At each block there is a corresponding *state*. An *executor* takes a block and the parent block's state; executing it yields the current block's state. | blockchain |
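The chain-module assumptions above (each block commits to its parent's hash, and state is threaded from parent to child the same way) can be illustrated with a toy hash chain; this is a concept sketch in shell, not the blockchain crate's API:

```shell
# toy hash chain: each block's hash covers its parent's hash, which forms the chain
parent="genesis"
for data in "block-1" "block-2" "block-3"; do
  hash=$(printf '%s:%s' "$parent" "$data" | sha256sum | cut -d' ' -f1)
  echo "$data parent=$parent"
  parent="$hash"   # the child chains off this block, like state threaded by the executor
done
echo "head=$parent"
```

Changing any earlier block changes every later hash, which is what makes the parent linkage a chain.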
netx | # Azure RTOS NetX

A high-performance implementation of TCP/IP protocol standards. Azure RTOS NetX is fully integrated with Azure RTOS ThreadX and available for all supported processors. It has a unique piconet architecture; combined with a zero-copy API, this makes it a perfect fit for today's deeply embedded applications that require network connectivity.

## Documentation

Documentation for this library can be found here: http://docs.microsoft.com/azure/rtos/netx

## Understanding inter-component dependencies

The main components of Azure RTOS are each provided in their own repository, but there are dependencies between them, shown in the dependency graph (docs/deps.png), that are important to understand when setting up your builds.

## Building and using the library

### Prerequisites

Install the following tools:

- CMake (https://cmake.org/download) version 3.0 or later
- GCC compilers for arm-none-eabi (https://developer.arm.com/tools-and-software/open-source-software/developer-tools/gnu-toolchain/gnu-rm/downloads)
- Ninja (https://ninja-build.org)

### Cloning the repo

```bash
git clone https://github.com/azure-rtos/netx.git
```

### Building as a static library

Each component of Azure RTOS comes with a composable CMake-based build system that supports many different MCUs and host systems. Integrating any of these components into your device app code is as simple as adding a git submodule and then including it in your build using the CMake command `add_subdirectory`. While the typical usage pattern is to include ThreadX into your device code source tree to be built/linked with your code, you can compile this project as a standalone static library to confirm your build is set up correctly:

```bash
cmake -Bbuild -DCMAKE_TOOLCHAIN_FILE=cmake/cortex_m4.cmake -GNinja .
cmake --build ./build
```

Note: you will have to take the dependency graph above into account when building anything other than ThreadX itself.

## Repository structure and usage

**Branches & releases.** The master branch has the most recent code with all new features and bug fixes; it does not represent the latest general availability (GA) release of the library. Each official release (preview or GA) will be tagged to mark the commit and push it into the GitHub releases tab, e.g. v6.0-rel.

**Directory layout:**

- addons: auto_ip, bsd, dhcp, dns, ftp, http, mdns, mqtt, nat, pop3, ppp, pppoe, smtp, sntp, telnet, tftp, web
- cmake
- common: inc, src
- ports: cortex_m0/gnu (inc, src), cortex_m3/gnu (inc, src), cortex_m4/gnu (inc, src), cortex_m7/gnu (inc, src)
- samples

## Security

Azure RTOS provides OEMs with components to secure communication and to create code and data isolation using underlying MCU/MPU hardware protection mechanisms. It is ultimately the responsibility of the device builder to ensure the device fully meets the evolving security requirements associated with its specific use case.

## Contribution, feedback and issues

If you encounter any bugs, have suggestions for new features, or would like to become an active contributor to this project, please follow the instructions provided in the contribution guideline for the corresponding repo. For general support, please post a question to Stack Overflow (http://stackoverflow.com/questions/tagged/azure-rtos+threadx) using the threadx and azure-rtos tags. | os |
blockchain-hyperledger-fabric-electronic-patient-records | # Blockchain (Hyperledger Fabric) electronic patient records

The project report is available here (docs/Final Report.pdf), along with the presentation (docs/Final Presentation.pdf). Demo: YouTube (https://youtu.be/x-fx1znmxgy). Project management: ClickUp (https://share.clickup.com/l/h/4-6668897-1/8922645f320f3c0); from April 1st, 2021, all project management is done on GitHub, and all issues are to be created on GitHub.

After successful installation of the application, the login details for each user are as follows:

| Role | Username | Password |
| --- | --- | --- |
| admin | hosp1admin | hosp1lithium |
| admin | hosp2admin | hosp2lithium |
| patient | pid0 | pid0 |
| patient | pid1 | pid1 |
| patient | pid2 | pid2 |
| patient | pid3 | pid3 |
| patient | pid4 | pid4 |
| patient | pid5 | pid5 |
| doctor | hosp1-doc0 | password |
| doctor | hosp2-doc1 | password |
| doctor | hosp1-doc2 | password |
| doctor | hosp2-doc3 | password |

| blockchain |
metamask-mobile | metamask logo logo png raw true metamask ci https github com metamask metamask mobile actions workflows ci yml badge svg branch main https github com metamask metamask mobile actions workflows ci yml cla https github com metamask metamask mobile actions workflows cla yml badge svg branch main https github com metamask metamask mobile actions workflows cla yml metamask is a mobile wallet that provides easy access to websites that use the ethereum https ethereum org blockchain for up to the minute news follow our twitter https twitter com metamask or medium https medium com metamask pages to learn how to develop metamask compatible applications visit our developer docs https docs metamask io metamask mobile environment setup the code is built using react native and running code locally requires a mac or linux os install sentry cli https github com getsentry sentry cli tools brew install getsentry tools sentry cli install node js https nodejs org version 16 if you are using nvm https github com creationix nvm installation recommended running nvm use will automatically choose the right node version for you install yarn v1 https yarnpkg com en docs install one way to install yarn v1 is by using brew bash brew install yarn 1 22 19 to check you ve installed the right version bash yarn version install the shared react native dependencies https reactnative dev docs environment setup installing dependencies react native cli not expo cli xcode version 14 2 or below install cocoapods https guides cocoapods org using getting started html by running bash sudo gem install cocoapods v 1 12 1 install python https www python org downloads version 3 10 note m1 user might need to stay with version 3 10 device environment setup android install java https www java com en download to check if java is already installed run java version install the android sdk via android studio https developer android com studio metamask only to create production builds you need to 
install google play licensing library via the sdk manager in android studio install the android ndk version 21 4 7075529 via android studio https developer android com studio s sdk manager go to settings appearance behavior system settings android sdk shortcut selecting more actions sdk manager from the welcome to android studio page will also bring you here select sdk tools tab check show package details option below the tools list to show available versions locate ndk side by side option in the tools list check ndk version 21 4 7075529 locate cmake option in the tools list check cmake version 3 22 1 click apply or ok to download linux only ensure that you have the secret tool binary on your machine part of the libsecret tools https launchpad net ubuntu bionic package libsecret tools package on debian ubuntu based distributions install the correct emulator follow the instructions at react native getting started android https reactnative dev docs environment setup installing dependencies react native cli quickstart your os android fyi as of today 7 18 23 there is currently an issue when running detox on android 12 and 13 api 32 33 which prevents the tests from running the issue is the tap action is treated like a tapandhold action see the open issue in wix detox here https github com wix detox issues 3762 more details can be found on the android developer site https developer android com studio run emulator you should use the following android os version latest unless told otherwise device google pixel 5 finally start the emulator from android studio open virtual device manager launch emulator for pixel 5 relevant api version mentioned in react native getting started https reactnative dev docs environment setup installing dependencies ios install the ios dependencies react native getting started ios https reactnative dev docs environment setup installing dependencies react native cli quickstart your os ios install the correct simulator ios os version latest unless 
told otherwise device iphone 12 pro building locally clone this repo bash git clone cd metamask mobile metamask only rename the env example files remove the example in the root of the project and fill in the appropriate values for each key get the values from another metamask mobile developer non metamask only in the project root folder run if you intend to use walletconnect v2 during your development you should register to get a projectid from walletconnect website and set the wallet connect project id value accordingly in js env file cp ios env example ios env cp android env example android env cp js env example js env non metamask only create an account and generate your own api key at infura https infura io in order to connect to main and test nets fill mm infura project id in js env app will run without it but will not be able to connect to actual network non metamask only fill mm sentry dsn in js env if you want the app to emit logs to your own sentry project install the app yarn setup not the usual install command this will run a lengthy postinstall flow then in one terminal run bash yarn watch android bash yarn start android ios bash yarn start ios build troubleshooting unfortunately the build system may fail to pick up local changes such as installing new npm packages or yarn link ing a dependency if the app is behaving strangely or not picking up your local changes it may be due to build issues to ensure that you re starting with a clean slate close all emulators simulators stop the yarn watch process and run bash yarn clean if you re going to yarn link any packages do that here before the next command yarn watch clean and then in another terminal yarn start ios or yarn start android if yarn link fails after going through these steps try directly yarn add ing the local files instead debugging first make sure you have the following running yarn watch your android emulator or ios simulator yarn start android or yarn start ios next install the flipper https 
fbflipper com desktop app verified working with v0 127 0 once flipper is installed configure your system as follows install react devtools npm i g react devtools 4 22 1 update android sdk location settings by accessing flipper s settings via the gear icon settings example sdk path users user name library android sdk finally check that the debugger is working open your emulator or simulator alongside the flipper app flipper should auto detect the device and the application to debug you should now be able to access features such as logs debugging physical ios devices debugging physical ios devices requires idb to be installed which consists of 2 parts install the two idb parts 1 brew tap facebook fb brew install idb companion 2 pip3 9 install fb idb this step may require that you install python3 via python m pip3 install upgrade pip debug a website inside the webview in app browser android run the app in debug mode for example in a simulator open chrome on your desktop go to chrome inspect devices look for the device and click inspect ios run the app in debug mode for example in a simulator open safari on your desktop go to the menu develop your device website you should see the console for the website that is running inside the webview miscellaneous troubleshooting for react native https facebook github io react native docs troubleshooting content flipper documentation https fbflipper com docs features react native running tests unit tests bash yarn test unit e2e tests platforms for both ios and android platforms our chosen e2e test framework is detox we also utilize appium for android wdio folder test wallet e2e tests use a wallet able to access testnet and mainnet on bitrise ci the wallet is created using the secret recovery phrase from secret env var for local testing the wallet is created using the secret recovery phrase from the e2e env file detox all tests live within the e2e specs folder ios prerequisites for running tests make sure to install detox cli by 
referring to the instructions mentioned here https wix github io detox docs introduction getting started detox prerequisites additionally install applesimutils by following the guidelines provided here https github com wix applesimulatorutils before running any tests it s recommended to refer to the ios section above and check the latest simulator device specified under install the correct simulator the default device for ios is the iphone 12 pro and android the pixel 5 ensure you have these set up make sure that metro is running use this command to launch the metro server bash yarn watch you can trigger the tests against a release or debug build it recommended that you trigger the tests against a debug build to trigger the tests on a debug build run this command for ios bash yarn test e2e ios debug and on android bash yarn test e2e android debug if you choose to run tests against a release build you can do so by running this command for ios bash yarn test e2e ios and on android bash yarn test e2e android if you have already built the application for detox and want to run a specific test from the test folder you can use this command for ios bash yarn test e2e ios debug single e2e specs test name spec js and on android bash yarn test e2e android debug single e2e specs test name spec js to run tests associated with a certain tag you can do so using the testnamepattern flag for example bash yarn test e2e ios debug testnamepattern smoke bash yarn test e2e android debug testnamepattern smoke this runs all tests that are tagged smoke appium the appium tests lives within the wdio feature folder by default the tests use an avd named android 11 pixel 4a api 31 with api level 30 android 11 you can modify the emulator and platform version by navigating to wdio config android config debug js and adjusting the values of devicename to match your emulator s name and platformversion to match your operating system s version make sure to verify that the config file accurately 
represents your emulator settings before executing any tests the sequence in which you should run tests create a test build using this command bash yarn start android qa then run tests using this command bash yarn test wdio android if you want to run a specific test you can include the spec flag in the aforementioned command for example bash yarn test wdio android spec wdio features onboarding createnewwallet feature changing dependencies whenever you change dependencies adding removing or updating either in package json or yarn lock there are various files that must be kept up to date yarn lock run yarn setup again after your changes to ensure yarn lock has been properly updated the allow scripts configuration in package json run yarn allow scripts auto to update the allow scripts configuration automatically this config determines whether the package s install postinstall scripts are allowed to run review each new package to determine whether the install script needs to run or not testing if necessary unfortunately yarn allow scripts auto will behave inconsistently on different platforms macos and windows users may see extraneous changes relating to optional dependencies architecture to get a better understanding of the internal architecture of this app take a look at this diagram https github com metamask metamask mobile blob main architecture svg storybook we have begun documenting our components using storybook please read the documentation guidelines storybook documentation guidelines md to get up and running other docs adding confirmations docs confirmations md | react-native web3 dapps-browser javascript ios android metamask | blockchain |
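The `testNamePattern` flag described above works by matching each test's full name against a regular expression; a minimal Python sketch of that selection logic (the test names below are invented for illustration, not taken from the project's suite):

```python
import re

def filter_tests_by_pattern(test_names, pattern):
    """Return only the test names matching the given regex pattern,
    mimicking how a --testNamePattern flag selects tagged tests."""
    regex = re.compile(pattern)
    return [name for name in test_names if regex.search(name)]

tests = [
    "Smoke: create wallet",
    "Smoke: import wallet",
    "Regression: swap tokens",
]
# Selecting only the tests tagged "Smoke", as in the yarn commands above.
print(filter_tests_by_pattern(tests, "Smoke"))
```

This is why a tag is just a substring of the test title: any name containing "Smoke" is selected, everything else is skipped.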
AWS-Step_Functions | aws step functions api engineering cloud computing assignment 4 a state machine under aws step functions that is triggered by the upload of an image to an s3 bucket it uses lambda functions to perform actions such as extension check label detection and saving content into dynamodb and an s3 bucket | cloud |
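A hedged sketch of the first Lambda step (the extension check): the event dict below follows the general shape of an S3 put notification, but the bucket name, key, and allowed extensions are assumptions for illustration, not this project's actual code:

```python
import os

# Hypothetical event modeled on the S3 put-notification structure;
# the bucket name and object key are made up for illustration.
SAMPLE_EVENT = {
    "Records": [
        {"s3": {"bucket": {"name": "uploads"}, "object": {"key": "photos/cat.jpg"}}}
    ]
}

ALLOWED_EXTENSIONS = {".jpg", ".jpeg", ".png"}

def check_extension(event):
    """First step of the pipeline: reject uploads that are not images."""
    record = event["Records"][0]["s3"]
    key = record["object"]["key"]
    ext = os.path.splitext(key)[1].lower()
    return {"bucket": record["bucket"]["name"], "key": key,
            "is_image": ext in ALLOWED_EXTENSIONS}

print(check_extension(SAMPLE_EVENT))
```

In the state machine, a result with `is_image` false would branch to a failure state instead of continuing to label detection.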
CTSE_game_reviewer_mobile_application-Epic_Games | game reviewer application a new flutter application getting started this project is a starting point for a flutter application a few resources to get you started if this is your first flutter project lab write your first flutter app https flutter dev docs get started codelab cookbook useful flutter samples https flutter dev docs cookbook for help getting started with flutter view our online documentation https flutter dev docs which offers tutorials samples guidance on mobile development and a full api reference | flutter dart firebase firestore-database firebase-storage firebase-auth | front_end |
Online-Banking-System-Database-Design | online banking system database design database design for an online banking system this project was implemented as a part of the inst 733 database design course at the university of maryland college park the purpose of this project is to develop an online banking database design that provides banks with the facility to organize information related to the employees customers and other relevant information in an efficient manner a database design is an important part of any system application design traditionally consists of two steps you develop a logical model of the business process you re automating then you map that model to the database by creating a physical model which is implemented as a series of tables thus a database design definitely makes an impact on the overall efficiency of a system in this technologically advanced era most of the banking transactions are done online hence this is an increasingly interesting topic to study as this project deals with banking database design the project report consists of information on logical and physical design of the database tables created queries used to generate different views recommendations to improve the database and problems faced and lessons learnt are also a part of the project report | os |
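The customer/account tables and the aggregate query below are a toy illustration of the kind of logical-to-physical mapping the report describes, using Python's built-in sqlite3; the schema and data are invented, not the project's actual design:

```python
import sqlite3

# A toy slice of the logical model: customers own accounts.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer (
    customer_id INTEGER PRIMARY KEY,
    name        TEXT NOT NULL
);
CREATE TABLE account (
    account_id  INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
    balance     REAL NOT NULL DEFAULT 0
);
""")
conn.execute("INSERT INTO customer VALUES (1, 'Alice')")
conn.execute("INSERT INTO account VALUES (100, 1, 2500.0)")

# A view-style query: each customer with their total balance.
row = conn.execute("""
SELECT c.name, SUM(a.balance)
FROM customer c JOIN account a ON a.customer_id = c.customer_id
GROUP BY c.customer_id
""").fetchone()
print(row)
```

Queries like this one are what the report means by "views" generated from the physical tables.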
BKecomm | bkecomm project my project setting up the local environment and workspace this guide explains how to set up your environment for angular development using the angular cli tool it includes information about prerequisites installing the cli creating an initial workspace and starter app and running that app locally to verify your setup try angular without local setup if you are new to angular you might want to start with try it now which introduces the essentials of angular in the context of a ready made basic online store app that you can examine and modify this standalone tutorial takes advantage of the interactive stackblitz environment for online development you don t need to set up your local environment until you re ready prerequisites to use the angular framework you should be familiar with the following javascript html css knowledge of typescript is helpful but not required to install angular on your local system you need the following node js angular requires a current active lts or maintenance lts version of node js for information about specific version requirements see the engines key in the package json file for more information on installing node js see nodejs org if you are unsure what version of node js runs on your system run node v in a terminal window npm package manager angular the angular cli and angular applications depend on npm packages for many features and functions to download and install npm packages you need an npm package manager this guide uses the npm client command line interface which is installed with node js by default to check that you have the npm client installed run npm v in a terminal window install the angular cli you use the angular cli to create projects generate application and library code and perform a variety of ongoing development tasks such as testing bundling and deployment to install the angular cli open a terminal window and run the following command content copy npm install g angular cli create a 
workspace and initial application you develop apps in the context of an angular workspace to create a new workspace and initial starter app run the cli command ng new and provide the name my app as shown here content copy ng new my app the ng new command prompts you for information about features to include in the initial app accept the defaults by pressing the enter or return key the angular cli installs the necessary angular npm packages and other dependencies this can take a few minutes the cli creates a new workspace and a simple welcome app ready to run you also have the option to use angular s strict mode which can help you write better more maintainable code for more information see strict mode run the application the angular cli includes a server so that you can build and serve your app locally navigate to the workspace folder such as my app run the following command content copy cd my app ng serve open the ng serve command launches the server watches your files and rebuilds the app as you make changes to those files the open or just o option automatically opens your browser to http localhost 4200 if your installation and setup was successful you should see a page similar to the following welcome to my app next steps for a more thorough introduction to the fundamental concepts and terminology of angular single page app architecture and design principles read the angular concepts section work through the tour of heroes tutorial a complete hands on exercise that introduces you to the app development process using the angular cli and walks through important subsystems to learn more about using the angular cli see the cli overview in addition to creating the initial workspace and app scaffolding you can use the cli to generate angular code such as components and services the cli supports the full development cycle including building testing bundling and deployment for more information about the angular files generated by ng new see workspace and project file 
structure deploying an application deploying your application is the process of compiling or building your code and hosting the javascript css and html on a web server this section builds on the previous steps in the getting started tutorial and shows you how to deploy your application prerequisites a best practice is to run your project locally before you deploy it to run your project locally you need the following installed on your computer node js the angular cli from the terminal install the angular cli globally with content copy npm install g angular cli with the angular cli you can use the command ng to create new workspaces new projects serve your application during development or produce builds to share or distribute running your application locally download the source code from your stackblitz project by clicking the download project icon in the left menu across from project to download your files create a new angular cli workspace using the ng new command where my project name is what you would like to call your project content copy ng new my project name this command displays a series of configuration prompts for this tutorial accept the default settings for each prompt in your newly cli generated application replace the src folder with the src folder from your stackblitz download use the following cli command to run your application locally content copy ng serve to see your application in the browser go to http localhost 4200 if the default port 4200 is not available you can specify another port with the port flag as in the following example content copy ng serve port 4201 while serving your application you can edit your code and see the changes update automatically in the browser to stop the ng serve command press ctrl c building and hosting your application to build your application for production use the build command with the prod flag content copy ng build prod this command creates a dist folder in the application root directory with all the files 
that a hosting service needs for serving your application if the above ng build command throws an error about missing packages append the missing dependencies in your local project s package json file to match the one in the downloaded stackblitz project copy the contents of the dist my project name folder to your web server because these files are static you can host them on any web server capable of serving files such as node js java net or any backend such as firebase google cloud or app engine for more information see building amp serving and deployment what s next in this tutorial you ve laid the foundation to explore the angular world in areas such as mobile development ux ui development and server side rendering you can go deeper by studying more of angular s features engaging with the vibrant community and exploring the robust ecosystem learning more angular for a more in depth tutorial that leads you through building an application locally and exploring many of angular s most popular features see tour of heroes to explore angular s foundational concepts see the guides in the understanding angular section such as angular components overview or template syntax joining the community tweet that you ve completed this tutorial tell us what you think or submit suggestions for future editions keep current by following the angular blog exploring the angular ecosystem to support your ux ui development see angular material to test your angular applications see angular protractor the angular community also has an extensive network of third party tools and libraries | server |
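The docs above suggest rerunning `ng serve` with `--port 4201` when 4200 is taken; the availability check behind that choice can be sketched with Python's stdlib socket module (a sketch of the idea, not part of the Angular CLI):

```python
import socket

def pick_port(preferred, fallback):
    """Return `preferred` if it can be bound, otherwise `fallback`,
    the manual equivalent of retrying `ng serve --port 4201`
    when 4200 is already in use."""
    for port in (preferred, fallback):
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            try:
                s.bind(("127.0.0.1", port))
                return port
            except OSError:
                continue
    raise RuntimeError("both ports busy")
```

Port 0 is a useful fallback in scripts because the OS then assigns any free ephemeral port.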
portofolio | portofolio portfolio web dev automation design embedded systems data analysis explore my diverse projects open to collaborations | os |
ComputerVision-with-PyTorch-Learning-Program | computervision with pytorch learning program computer vision using pytorch learning program by tinkerhub foundation pytorch https pytorch org is an open source deep learning framework created by facebook ai research https ai facebook com this learning program will cover the following computer vision pytorch framework torchvision library image classification and object detection transfer learning we will be extensively using the pytorch docs for conducting this program participants criteria should know object oriented programming and python should know git and github should know what is machine learning and some basics different categories of ml what is training what is testing what is dataset etc all the resources to get you started with the program is given in the resources folder https github com tinkerhub org computervision with pytorch learning program tree master resources you can learn it and finish the task for joining the program join the program this learning program need you to have knowledge in following areas 1 python 2 github 3 basics of machine learning we have a coding task for selction process you could check out the task and how to submit task here https github com tinkerhub org computervision with pytorch learning program tree master tasks registration task watch the following video for reference watch this video for reference https github com tinkerhub org computervision with pytorch learning program blob master resources th pytorch reference png https youtu be y ikkxfbwic selection process once the task is completed you could fill this google form we will update the selction status latest by 5th june program starts on 6th june general program structure every 2 days we will share a chapter of the pytorch docs in the morning participants should go through it in the evening of the odd days we will give a task on the following day there will be hangouts sessions with mentors program schedule day 1 
google meets with the following contents welcome note to tinkerhub computer vision using pytorch program intro to pytorch navigate through pytorch website reading documentations explaining the structure of pytorch docs general structure of the program prerequisites installations getting to know each other day 2 morning in the telegram channel we share the following content 1 what is pytorch https pytorch org tutorials beginner blitz tensor tutorial html sphx glr beginner blitz tensor tutorial py 2 autograd automatic differentiation https pytorch org tutorials beginner blitz autograd tutorial html sphx glr beginner blitz autograd tutorial py participants should go through the docs and learn try out each of the code snippets in the docs note down the doubts ask doubts in the github repo issues evening a task will be given in the telegram channel participants should try to do the task upload the code to the specified github repo day 3 morning in the telegram channel we share the following content 1 neural networks https pytorch org tutorials beginner blitz neural networks tutorial html sphx glr beginner blitz neural networks tutorial py 2 training a classifier https pytorch org tutorials beginner blitz cifar10 tutorial html sphx glr beginner blitz cifar10 tutorial py participants should go through the docs and learn try out each of the code snippets in the docs note down the doubts ask doubts in the github repo issues evening a task will be given in the telegram channel participants should try to do the task upload the code to the specified github repo day 4 google meets sessions with the mentor in the evening doubts about the last 2 days topics can be asked solution of the task will be discussed day 5 morning in the telegram channel we share the following content 1 what is torch nn really https pytorch org tutorials beginner nn tutorial html 2 visualising models data and training with tensorboard https pytorch org tutorials intermediate tensorboard tutorial html 
participants should go through the docs and learn try out each of the code snippets in the docs note down the doubts ask doubts in the github repo issues evening a task will be given in the telegram channel participants should try to do the task upload the code to the specified github repo day 6 google meets session with mentor s in the evening explain the concepts of content shared last day explain the code line by line clear doubts day 7 google meets session on 1 what is computer vision 2 algorithms used for computer vision 3 torchvision library day 8 in the morning the following content will be shared in telegram 1 torchvision fine tuning for computer vision tutorial https pytorch org tutorials intermediate torchvision tutorial html 2 transfer learning for computer vision tutorial https pytorch org tutorials beginner transfer learning tutorial html participants should go through the doc and try to understand note down the doubts try out the code ask doubts in the github repo issues day 9 google meets sessions with the mentor in the evening doubts about the last days topics can be asked solution of the task will be discussed day 10 google meets session with mentor s on choosing projects participants can choose transfer learning projects or object detection projects finding a dataset can find a dataset from kaggle etc day 11 participants figuring out the project and dataset participants should create a github repo for the project update the readme file with the details of the project find the model and dataset they are going to implement fill the project form with the project repo link day 12 mentor s verify the project ideas participants make the changes day 13 14 project days participants do the project ask doubts to the mentors in github issues and via call upload the code and trained models to the repo day 15 mentors verify the projects provide feedback as github issues day 16 final google meets session with discussion on next steps from here project demos 
certificates distribution program partner facebook developer circle kochi https www facebook com groups devckochi about is a forum for developers in kochi india and its surroundings who are interested in building on the facebook platform to interact and collaborate other developers who share similar interests contributors gopikrishnansasikumar https github com gopikrishnansasikumar | pytorch course deep-learning | ai |
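As a companion to the neural-network and backpropagation readings in the schedule above, here is a dependency-free sketch of the forward and backward pass for a single sigmoid neuron, checked against finite differences (plain Python rather than PyTorch so it runs anywhere; all values are illustrative):

```python
import math

# One sigmoid neuron on a single example: the forward and
# backward passes that autograd automates.
def forward(w, b, x):
    z = w * x + b
    return 1.0 / (1.0 + math.exp(-z))

def grad_w(w, b, x, target):
    """d(loss)/dw for squared-error loss, by the chain rule:
    2*(y - t) * y*(1 - y) * x."""
    y = forward(w, b, x)
    return 2 * (y - target) * y * (1 - y) * x

# Numerical check: central differences should agree with the formula.
w, b, x, t = 0.5, -0.2, 1.5, 1.0
eps = 1e-6
numeric = ((forward(w + eps, b, x) - t) ** 2
           - (forward(w - eps, b, x) - t) ** 2) / (2 * eps)
print(abs(numeric - grad_w(w, b, x, t)))
```

This gradient check is the same sanity test used when implementing backpropagation by hand before trusting a framework's autograd.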
AI_Learning_Hub | ai learning hub license https img shields io badge license mit lightgrey svg https raw githubusercontent com wei2624 ai learning hub master license a href url img src https raw githubusercontent com wei2624 ai learning hub master ai jpg align left height 450 width 900 a br br br br photo credit liam kay https www thirdsector co uk author 4626 liam kay ai learning hub is an open sourced machine learning handbook we contribute to this repo by summarizing interesting blog course and or notes of machine learning deep learning computer vision robotics and or statistics we also intend to provide each post with chinese version we do this because we love ai and sharing excellent materials are the step stone for learning ai we think everyone is deserved a chance to study ai with excellent materials we welcome anyone to join us to make it better and you own whatever you write here what notes are can be posted here we are looking for any related notes that are genuinely created by your own by genuinity we mean one of the following 1 you create and write the contents of notes from scratch everything is original 2 you summarize contents from related course s book s and note s you can merge contents from multiple sources although this is expected to be a summary your summary should be original 3 you translate one of the notes in this repo view contents we provide with two ways to view and learn the blogs view author s homepage highly recommended the best way to view the contents of any blog is to view the homepage of the author of that blog that especially interests you the information of author s homepage of each blog is listed in this readme and will be updated as any changes happen we highly recommend this way to view the contents of any blog use jekyll and ruby to view locally not recommended 1 install ruby environment instructions can be found here https jekyllrb com docs installation 2 run gem install jekyll bundler 3 run git clone https github com 
wei2624 ai learning hub git cd ai learning hub bundle install bundle exec jekyll build 4 in site directory you can find html file then you are able to view them locally join us you are very welcome to join us to improve this repo more write blog the easiest way to contribute is to fork https help github com articles fork a repo this project and write your own contents remember that you own whatever you write to unify the style of each blog you should use markdown as the syntax with mathjax as a plugin for math of course you can insert html code whenever you want an example of header of a blog can be as below layout single mathjax true title regularization and model selection share true permalink machinelearning sv regularization model selection for layout you better either choose single where comments are enabled or archive where comments are disabled for more layout options you can view here https mmistakes github io minimal mistakes docs layouts permalink is a slef defined relative url path if you want to host up your blog you can append permalink to your site url you better follow this procedure so that people can run ruby command to generate local page for view host blog you can put up your own blog the easiest way to do this is to use submodule https git scm com book en v2 git tools submodules from git essentially you have your own repo then you can run git submodule command to add this repo as a subdirectory to your original repo this repo will just become one of the folders in your repo you can access whatever you write here distribution of contents distribution of contents without author s permission is strictly prohibited please respect the authorship of each blog there if you want to distribute them you can ask the author for permission every author here has all the rights to their written blog and is fully responsible for their written blogs blog information blogs in english module blog title lang author contact ml generative algorithm https wei2624 
github io machinelearning sv generative model en wei zhang https wei2624 github io weiuw2624 gmail com ml discriminative algorithm https wei2624 github io machinelearning sv discriminative model en wei zhang https wei2624 github io weiuw2624 gmail com ml support vector machine https wei2624 github io machinelearning sv svm en wei zhang https wei2624 github io weiuw2624 gmail com ml bias varaince and error analysis https wei2624 github io machinelearning sv bias variance tradeoff en wei zhang https wei2624 github io weiuw2624 gmail com ml learning theory https wei2624 github io machinelearning sv learning theory en wei zhang https wei2624 github io weiuw2624 gmail com ml regularization and model selection https wei2624 github io machinelearning sv regularization model selection en wei zhang https wei2624 github io weiuw2624 gmail com ml online learning and perceptron algorithm https wei2624 github io machinelearning sv online learning perceptron en wei zhang https wei2624 github io weiuw2624 gmail com ml k means https wei2624 github io machinelearning usv kmeans en wei zhang https wei2624 github io weiuw2624 gmail com ml em algorithm https wei2624 github io machinelearning usv em en wei zhang https wei2624 github io weiuw2624 gmail com ml variational inference https wei2624 github io machinelearning bayes vi en wei zhang https wei2624 github io weiuw2624 gmail com dl nerual networks https wei2624 github io machinelearning dl neural network en wei zhang https wei2624 github io weiuw2624 gmail com dl backpropagation https wei2624 github io machinelearning dl propagtion en wei zhang https wei2624 github io weiuw2624 gmail com blogs in chinese module blog title lang author contact ml generative algorithm https air yan github io machine 20learning generative learning algorithm ch zishi yan https air yan github io wechat air sowhat ml discriminative algorithm https dark417 github io machinelearning sv discriminative model ch ch xiaoxiao lei https dark417 github io wechat 
dark417 ml support vector machine https air yan github io machinelearning sv svm ch ch zishi yan https air yan github io wechat air sowhat ml bias varaince and error analysis https dark417 github io machinelearning sv bias variance tradeoff ch ch xiaoxiao lei https dark417 github io wechat dark417 ml regularization and model selection https dark417 github io machinelearning sv regularization model selection ch ch xiaoxiao lei https dark417 github io wechat dark417 | machine-learning machine-learning-algorithms deep-learning deep-neural-networks statistics svm-classifier discriminant-analysis generative-model expectation-maximization-algorithm variational-inference learningnotes learning-theory regularization perceptron-learning-algorithm k-means-clustering gaussian-mixture-models | ai |
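A small sketch of checking the front-matter rules quoted above (layout must be `single` for comments or `archive` without them, and a permalink is appended to the site URL); the field values below are illustrative reconstructions, not an actual blog header:

```python
# Validate the kind of front-matter header the contribution guide
# describes; layouts other than "single"/"archive" are rejected.
VALID_LAYOUTS = {"single", "archive"}

def validate_front_matter(fm):
    problems = []
    if fm.get("layout") not in VALID_LAYOUTS:
        problems.append("layout must be 'single' or 'archive'")
    if not fm.get("title"):
        problems.append("title is required")
    if not fm.get("permalink", "").startswith("/"):
        problems.append("permalink should be an absolute path")
    return problems

header = {
    "layout": "single",
    "mathjax": True,
    "title": "Regularization and Model Selection",
    "share": True,
    "permalink": "/MachineLearning/sv_regularization_model_selection/",
}
print(validate_front_matter(header))
```

An empty problem list means the header follows the procedure, so `jekyll build` can generate the page and the permalink can be appended to the site URL.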
LLM_training_strategies | large language models training strategies over the last few months as i ve been reading up on llms i have realised that deep within the actual papers lie some great advice on training strategies tweaks made to data choice of hyperparamters and such which get lost in the noise of hype this repository will serve as an index detailing various strategies tweaks and recent architectural or hyperparameter choices made by different large language models strategies tweaks and techniques data preparation strategies minhashlsh with jaccard similarity for de duplicating dataset dataset distillation a process where a smaller student dataset is created that can be used to train a model to similar performance as the original teacher dataset dynamic mixing mixing data from different datasets or domains at different ratios during training model architectural choices transformer architecture the fundamental building block of most modern llms consisting of self attention and feed forward layers sparse and dense attention layers in alternation a combination of sparse attention for computational efficiency and dense attention for representational power flash attention layer a method to increase the efficiency of attention mechanisms rotary embeddings for positional embeddings a new technique that injects positional information more effectively into transformer models layernorm a type of normalization technique often used in transformer models to stabilize the inputs to each layer training strategies unpadding a technique to reduce the computational cost of training by removing unnecessary padding tokens knowledge distillation training a smaller student model to imitate the behavior of a larger teacher model pre training tasks token masking token deletion or masking is the most common strategy in autoregressive models like gpt it masks next token while in encoder decoder models like bert it randomly masks some percentage of tokens mlm next sentence prediction 
nsp the model is given two sentences and must predict whether the second sentence follows the first in the original document text infilling parts of the text are removed and the model is tasked with filling in the missing parts sentence permutation another form of sequence corruption where the sentences in a document are permuted and the model is tasked with sorting them back into the original order document rotation the document is rotated and the model s task is to understand the correct ordering of the document hyperparameter choices tba optimization strategies adamw optimizer a variant of the adam optimizer that decouples weight decay from the rest of the optimization algorithm gradient clipping a method to prevent gradients from becoming too large and causing numerical instability during training parallelism strategies tensor parallelism splitting the model s parameters across multiple gpus to allow for larger models data parallelism e g pytorch fsdp and ddp zero sharding strategy splitting the input data across multiple gpus to allow for larger batch sizes pipeline parallelism splitting the model into several stages that are each run on different gpus allowing for larger models and better hardware utilization fine tuning approaches p tuning a method of parameter efficient fine tuning where additional positional parameters are introduced and learned adapter tuning a method of fine tuning where additional smaller layers adapters are added to the model and trained while the original model parameters are frozen contributions contributions are welcome | ai |
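The MinHash-with-Jaccard deduplication strategy listed above can be sketched in pure Python; this shows only the shingling, signature, and similarity-estimate steps, and omits the LSH banding that makes candidate lookup sublinear:

```python
import hashlib

def shingles(text, k=3):
    """Break a document into overlapping k-token shingles."""
    tokens = text.split()
    return {" ".join(tokens[i:i + k]) for i in range(len(tokens) - k + 1)}

def minhash_signature(shingle_set, num_hashes=64):
    """One MinHash value per seeded hash function; similar sets
    collide on many signature positions."""
    return [min(int(hashlib.md5(f"{seed}:{s}".encode()).hexdigest(), 16)
                for s in shingle_set)
            for seed in range(num_hashes)]

def estimated_jaccard(sig_a, sig_b):
    matches = sum(a == b for a, b in zip(sig_a, sig_b))
    return matches / len(sig_a)

doc_a = "the quick brown fox jumps over the lazy dog"
doc_b = "the quick brown fox jumps over a lazy dog"
doc_c = "completely unrelated text about training language models"
sig = minhash_signature
print(estimated_jaccard(sig(shingles(doc_a)), sig(shingles(doc_b))))
print(estimated_jaccard(sig(shingles(doc_a)), sig(shingles(doc_c))))
```

A dedup pass would drop any pair whose estimated Jaccard similarity exceeds a threshold (often around 0.8 in practice).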
xmrig-C3 | xmrig github all releases https img shields io github downloads xmrig xmrig total svg https github com c3pool xmrig releases github release https img shields io github release xmrig xmrig all svg https github com c3pool xmrig releases github release date https img shields io github release date xmrig xmrig svg https github com c3pool xmrig releases github license https img shields io github license xmrig xmrig svg https github com c3pool xmrig blob master license github stars https img shields io github stars xmrig xmrig svg https github com c3pool xmrig stargazers github forks https img shields io github forks xmrig xmrig svg https github com c3pool xmrig network xmrig is a high performance open source cross platform randomx kawpow cryptonight and ghostrider https github com xmrig xmrig tree master src crypto ghostrider readme unified cpu gpu miner and randomx benchmark https xmrig com benchmark official binaries are available for windows linux macos and freebsd mining backends cpu x86 x64 armv7 armv8 opencl for amd gpus cuda for nvidia gpus via external cuda plugin https github com c3pool xmrig cuda download binary releases https github com c3pool xmrig releases build from source https xmrig com docs miner build usage the preferred way to configure the miner is the json config file https xmrig com docs miner config as it is more flexible and human friendly the command line interface https xmrig com docs miner command line options does not cover all features such as mining profiles for different algorithms important options can be changed during runtime without miner restart by editing the config file or executing api https xmrig com docs miner api calls wizard https xmrig com wizard helps you create initial configuration for the miner workers http workers xmrig info helps manage your miners via http api donations default donation 1 1 minute in 100 minutes can be increased via option donate level or disabled in source code xmr 
48edfhu7v9z84yzzma6fuueoelz9zrxq9vetwzygzkt52xu5xvqgzyndk9urnrojmk1j8nlwevsaswj4fhduyzijbguicod developers xmrig https github com xmrig sech1 https github com schernykh contacts support xmrig com reddit https www reddit com user xmrig twitter https twitter com xmrig dev | os |
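A hedged sketch of generating the JSON config file the README calls the preferred configuration method; treat the exact key names here (`donate-level`, `pools`, `url`, `user`) as assumptions to verify against the configuration wizard, and the pool URL and wallet as placeholders:

```python
import json

# Key names modeled on the miner's documented JSON config; verify them
# against the config wizard before use.
config = {
    "donate-level": 1,  # the 1% default mentioned above
    "pools": [
        {"url": "pool.example.com:3333", "user": "YOUR_WALLET_ADDRESS"}
    ],
}

def validate(cfg):
    """Basic sanity checks before writing config.json to disk."""
    assert 0 <= cfg["donate-level"] <= 100, "donate-level is a percentage"
    assert cfg["pools"], "at least one pool is required"
    return json.dumps(cfg, indent=2)

print(validate(config))
```

Because the README notes that important options can be changed at runtime by editing the config file, keeping the file valid JSON like this is what makes live edits safe.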
F407freeRTOS | f407freertos based on f407 freertosv9 0 mongo canfestival 3 asc 1a25f5151a8d | os |
maven-pe-for-llms-4 | prompt engineering for llms notebooks and exercises welcome to the notebooks and exercises for the prompt engineering for llms course 1 sign up for an openai key to run the notebooks in this repo you are required to sign up for an openai paid account sign up a paid account here https platform openai com once done you can generate an api key 2 check out the repo sh git clone https github com dair ai maven pe for llms 4 git cd maven pe for llms 4 if you already have the repo go into it and make sure you have the latest sh cd maven pe for llms 4 git pull origin master if you have downloaded the zipped file instead unzip it and go into the directory sh cd maven pe for llms 4 3 setup the environment conda if you don t have conda you can install it here https docs conda io projects conda en latest user guide install once installed run the following command to create a new environment called pe for llms sh conda create n pe for llms next activate the conda environment sh conda activate pe for llms finally add the kernel to jupyter sh python m ipykernel install user name pe for llms python environment if you don t want to use conda you can create a virtual environment using python s venv module sh python3 m venv venv next activate your environment the command below is for linux sh source venv bin activate 4 install the packages next install the dependencies inside the requirements txt file sh pip install r requirements txt 5 run the preparation exercise run the prepare exercise notebook found inside the exercises folder https github com dair ai maven pe for llms 4 blob main exercises pe for llms preparation exercise ipynb before attempting the preparation exercise add a env file to your root folder and add your open api key that s it you re all setup to start working on the notebooks and exercises | ai |
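A minimal stand-in for the `.env` loading step in the setup above (roughly what packages like python-dotenv do under the hood); the variable name `OPENAI_API_KEY` and the key value are illustrative:

```python
import pathlib
import tempfile

def load_env(path):
    """Read KEY=VALUE lines from a .env file, skipping blanks
    and comments; a minimal sketch of dotenv-style loading."""
    env = {}
    for line in pathlib.Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip('"')
    return env

# Write a throwaway .env file and load it back.
with tempfile.NamedTemporaryFile("w", suffix=".env", delete=False) as f:
    f.write("# local secrets\nOPENAI_API_KEY=sk-...your-key...\n")
    path = f.name
print(load_env(path))
```

Keeping the key in `.env` (and out of version control) is the point of this step: the notebooks read it from the environment instead of hard-coding it.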
|
pcs-remote-monitoring-webui | build build badge build url issues issues badge issues url gitter gitter badge gitter url intro azure pcs remote monitoring webui prerequisites prerequisites build run and test locally build run and test locally contributing to the solution contributing to the solution azure remote monitoring webui web app for azure iot remote monitoring solution dotnet https github com azure azure iot pcs remote monitoring dotnet and java https github com azure azure iot pcs remote monitoring java prerequisites 1 deploy a remote monitoring solution deploy to azure http azuredeploy net deploybutton png https www azureiotsolutions com accelerators solutions types rm2 the easiest way to test the web ui is against a deployed remote monitoring solution the solution can be deployed via the web interface https docs microsoft com azure iot suite iot suite remote monitoring deploy or via the command line https docs microsoft com azure iot suite iot suite remote monitoring deploy cli it is also possible to deploy the solution locally https docs microsoft com azure iot suite iot suite remote monitoring deploy local deploy the azure services 2 setup dependencies 1 install node js https nodejs org 2 for development you can use your preferred editor visual studio code https code visualstudio com atom https atom io sublime text https www sublimetext com or other preferred editor 3 environment variables required to run the web ui in order to run the web ui the following environment variables need to be created at least once more information on configuring environment variables here envvars howto url react app base service url your remote monitoring endpoint the endpoint given above is the base url you navigate to in order to see your deployed solution the web ui configuration is stored in app config js src app config js you can edit this file to update the endpoints build run and test locally cd pcs remote monitoring webui npm install npm start launches the project 
in browser watches for code changes and refreshes the page npm run build creates a production ready build npm test runs tests in watch mode press q to quit project structure the web ui contains the following sections under src src assets contains assets used across the application these include fonts icons images etc components contains all the application react components these include containers and presentational components services contains the logic for making ajax calls as well as mapping request response objects to front end models store contains all logic related to the redux store styles contains sass used across the application mixins theming variables etc utilities contains helper scripts used across the application contributing to the solution please follow our contribution guidelines contributing md and the code style conventions customizing the solution please see our developer walkthroughs docs walkthrough readme md for more information on customizing and adding to the application references this project was bootstrapped with create react app https github com facebookincubator create react app you can find a guide to using it here https github com facebookincubator create react app blob master packages react scripts template readme md core technologies overview reactjs https reactjs org react router v4 https github com reacttraining react router redux https redux js org redux observable https redux observable js org rxjs http reactivex io rxjs sass http sass lang com react i18nnext https github com i18next react i18next build badge https solutionaccelerators visualstudio com remotemonitoring apis build status pcs remote monitoring webui build url https solutionaccelerators visualstudio com remotemonitoring build latest definitionid 32 issues badge https img shields io github issues azure pcs remote monitoring webui svg issues url https github com azure pcs remote monitoring webui issues new gitter badge https img shields io gitter room azure iot
solutions js svg gitter url https gitter im azure iot solutions windows envvars howto url https superuser com questions 949560 how do i set system environment variables in windows 10 envvars howto url https www schrodinger com kb 1842 | server |
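The web UI cannot run without the `REACT_APP_BASE_SERVICE_URL` environment variable. The project itself is JavaScript, but the fail-fast pattern for required configuration is language-agnostic; sketched here in Python purely for illustration (the variable name comes from the README above, the error text is invented):

```python
import os

def require_env(name):
    """Return a required environment variable, or raise an error that
    names exactly which variable is missing."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(
            f"Missing required environment variable: {name}. "
            "Set it to your Remote Monitoring endpoint before starting the UI."
        )
    return value

# base_url = require_env("REACT_APP_BASE_SERVICE_URL")
```

Failing at startup with a named variable beats a vague AJAX error later, which is why the README insists the variables exist before `npm start`.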
|
Black-Box-Tuning | black box tuning for language model as a service updates 2022 10 14 release the latest version of bbtv2 check out the updated results https docs google com spreadsheets d 1fa9zmw613ooski fbxuznnbvo7rfv5ovdz0yhna2j5q edit usp sharing mag 2022 07 05 release a paper list https github com txsun1997 lmaas papers on lmaas check out other awesome papers bookmark tabs 2022 06 05 support t5 and gpt 2 model clap 2022 05 15 support bert and bart model clap 2022 05 04 release bbtv2 check out our paper https arxiv org abs 2205 11200 and try it with deepbbt py tada 2022 02 18 support onnx runtime optimization training speed is doubled rocket 2022 01 13 release the first version of bbt check out our paper https arxiv org abs 2201 03514 tada quick links introduction introduction prepare your environment prepare your environment using bbt using bbt using bbtv2 using bbtv2 inference optimization inference optimization citation citation introduction black box tuning bbt is a gradient free method to drive large language models llms for few shot learning it optimizes a sequence of soft prompt tokens prepended to the input of llms without requiring gradients back propagation of the llms therefore pre trained general purposed llms can be viewed as black box models and deployed efficiently on some inference servers in such a scenario which we call language model as a service lmaas bbt can achieve comparable performance to full model tuning by only accessing model inference apis generally bbt can achieve considerable results on most language understanding datasets within 8k model forward passes more details are provided in our icml paper black box tuning for language model as a service https arxiv org abs 2201 03514 and our emnlp paper bbtv2 towards a gradient free future with large language models https arxiv org abs 2205 11200 to help reproduce results reported in the paper we also release a google sheets https docs google com spreadsheets d 1fa9zmw613ooski 
fbxuznnbvo7rfv5ovdz0yhna2j5q edit usp sharing recording bbtv2 performance on each dataset using each random seed feel free to reach out to me if you cannot obtain similar results prepare your environment the implementation of black box tuning is quite simple you can check our code and easily implement it in your own environment or you can create a new environment to run our implementation based on pycma transformers and fastnlp optionally you can use fitlog to monitor experimental results you can uncomment the fitlog related lines in our code to use it bash conda create name bbt python 3 8 conda activate bbt pip install transformers 4 1 1 pip install fastnlp 0 6 0 pip install datasets pip install cma pip install sklearn git clone https github com txsun1997 black box tuning cd black box tuning using bbt now you can run black box tuning with run sh bash bash run sh in general you will obtain the following results in 13 minutes tested on nvidia 3090 gpu sst 2 split best accuracy train 100 dev 96 88 test 90 48 to reproduce other experiments in our paper change the arguments of bbt py for example bash python bbt py task name sst2 n prompt tokens 50 intrinsic dim 500 k shot 16 device cuda 0 seed 42 loss type ce cat or add add budget 8000 print every 50 eval every 100 to obtain similar results as reported in the original paper we recommend using loss type hinge for sentence pair tasks i e mrpc snli and rte and using budget 20000 for dbpedia in addition black box tuning also supports parallel evaluation that is you can evaluate a population of solutions in parallel by putting them into a single large batch for example bash python bbt py task name sst2 n prompt tokens 50 intrinsic dim 500 k shot 16 device cuda 0 seed 42 loss type ce cat or add add budget 300 print every 10 eval every 20 parallel using bbtv2 bbtv2 is an improved version of bbt instead of optimizing the prompt merely in the input layer bbtv2 adopts a divide and conquer algorithm to alternately optimize 
prompts in every layer i e deep prompt you can simply try bbtv2 using the following command bash python deepbbt py model name roberta large task name agnews n prompt tokens 50 intrinsic dim 500 k shot 16 device cuda 0 seed 42 loss type ce cat or add add random proj normal sigma 0 2 alpha 0 2 popsize 20 bound 0 budget 8000 print every 50 eval every 100 bbtv2 usually confers better results on many label classification tasks e g dbpedia and entailment tasks e g mrpc snli rte etc check our google sheets https docs google com spreadsheets d 1fa9zmw613ooski fbxuznnbvo7rfv5ovdz0yhna2j5q edit usp sharing if you have problem reproducing the results of bbtv2 inference optimization in contrast to training with gradient descent bbt and bbtv2 only requires model forward computation and therefore can be significantly accelerated using onnx runtime https onnxruntime ai or nvidia tensorrt https developer nvidia com tensorrt here we provide an implementation of inference optimization using onnx runtime you can obtain 2x speedup using only one line of code sdk onnxruntime gpu is required for optimization installation of this package can be troublesome and there may be some environment specific errors or unexpected performance but in real world scenarios this is a part of the black box on the server side the following code works well to configure the environment on an nvidia geforce rtx 3090 gpu with driver version 470 82 00 and cuda version 11 4 bash pip install transformers 4 1 1 pip install datasets pip install fastnlp pip install cma pip install sklearn pip3 install torch extra index url https download pytorch org whl cu113 pip install onnx pip install onnxruntime gpu 1 10 0 pip install coloredlogs pip install sympy to export a bbt model based on pytorch to an onnx model you can run export and optimize py with all arguments set to default to get a demo onnx model bash python export and optimize py two models will be saved to onnx models namely exported not accelerated and 
optimized model then you can modify run sh by setting parameter inference framework to ort and onnx model path to your model path a faster version of bbt is ready here is an example bash python bbt py task name sst2 n prompt tokens 50 intrinsic dim 500 k shot 16 device cuda 0 seed 42 loss type ce cat or add add budget 8000 print every 50 eval every 100 inference framework ort onnx model path onnx models optimized model onnx to add some flexibility to model optimization we provided some options in export and optimize py you can adjust these arguments in export and optimize sh here is an example bash python export and optimize py batch size 32 max seq len 128 n prompt tokens 50 prompt embed dim 1024 cat or add add exported model name model optimized model name optimized model onnx models are static but to cat or to add is a branch in the model during building phase unused nodes in the model graph are removed for better performance so you have to build one for each mode you can get the following results in 4 3 0 1 minutes compared to pytorch version of bbt whose training time is 8 9 0 15 minutes depends on hardware settings you can get the following results by running bbt 100 times on sst2 with random seed set from 1 to 100 fp16 optimization does not hurt performance on all tasks sst 2 split best accuracy test 88 0 citation if you find this work helpful please cite bibtex inproceedings sun2022bbt title black box tuning for language model as a service author tianxiang sun and yunfan shao and hong qian and xuanjing huang and xipeng qiu booktitle proceedings of icml year 2022 bibtex inproceedings sun2022bbtv2 title bbtv2 towards a gradient free future with large language models author tianxiang sun and zhengfu he and hong qian and yunhua zhou and xuanjing huang and xipeng qiu booktitle proceedings of emnlp year 2022 | black-box-optimization deep-learning few-shot-learning language-model natural-language-processing pytorch | ai |
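The key idea behind the intrinsic dim argument above is that the optimizer never searches the full prompt-embedding space: it optimizes a small vector z and maps it through a fixed random projection A before every model forward pass, so only intrinsic dim parameters are ever tuned. As a toy illustration only (the real code uses CMA-ES from the cma package against the model's loss; the quadratic loss, the dimensions, and the plain random-search loop below are invented stand-ins):

```python
import random

def project(A, z):
    """Map the low-dimensional z into prompt space via the fixed matrix A."""
    return [sum(a_ij * z_j for a_ij, z_j in zip(row, z)) for row in A]

def toy_loss(prompt, target):
    """Stand-in for the black-box model's loss: squared distance to a target."""
    return sum((p - t) ** 2 for p, t in zip(prompt, target))

def random_search(dim_z, dim_prompt, budget=500, sigma=0.3, seed=42):
    rng = random.Random(seed)
    # The projection is sampled once and then frozen, as in BBT.
    A = [[rng.gauss(0, 1) for _ in range(dim_z)] for _ in range(dim_prompt)]
    target = [rng.gauss(0, 1) for _ in range(dim_prompt)]
    best_z = [0.0] * dim_z
    best = toy_loss(project(A, best_z), target)
    for _ in range(budget):  # each step costs one black-box forward pass
        cand = [z + rng.gauss(0, sigma) for z in best_z]
        loss = toy_loss(project(A, cand), target)
        if loss < best:
            best, best_z = loss, cand
    return best
```

The optimizer only ever touches `dim_z` numbers even though the prompt lives in `dim_prompt` dimensions, which is what makes gradient-free search tractable for large models.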
farwest | farwest farwest is a modern web application development platform goals farwest allows people with no erlang knowledge to painlessly build frontends to erlang applications farwest also provides ui components for allowing subsequent users to perform various tasks without the help of a programmer usage the farwest project is an empty template you can use to build applications by default it provides you with nothing but an administration ui to build you need erlang r16b relx and gnu make installed bash make farwest temporarily uses riak for storing content you need to setup a working cluster on your local environment by following the steps found in the documentation for farwest to work http docs basho com riak latest quickstart the farwest server can then be started bash rel farwest bin farwest console you can now access the administration ui at the following url http localhost 8080 farwest farwest is still very unstable features may or may not work at this point in time support official irc channel ninenines on irc freenode net mailing lists http lists ninenines eu commercial support http ninenines eu support | front_end |
|
Route-Planning-App-Project-FinalVersion | development of mobile application for route planning based on pollution detection the purpose of this project is to develop an application based on the android system that can optimize travel routes from selected locations by incorporating information on air pollution data the air pollution data are based upon the data measured and made available by the aircasting project http www aircasting org to access this data the application sends a request to aircasting s data server the server parses the relevant data and returns them to the mobile application the application integrates and calculates the processed data and finally calculates and draws different routes for visual display in the process of implementing this project there are many challenges related to application development the report also details the development process undertaken throughout this work flow chart of route planning application p align center img src https github com kobespirit route planning app project finalversion blob master sample 20pics 30 png width 550 p implementation design start interface p align center img src https github com kobespirit route planning app project finalversion blob master sample 20pics 1 png width 250 p welcome guide interface p align center img src https github com kobespirit route planning app project finalversion blob master sample 20pics 31 png width 300 p p align center img src https github com kobespirit route planning app project finalversion blob master sample 20pics 32 png width 300 p p align center img src https github com kobespirit route planning app project finalversion blob master sample 20pics 33 png width 300 p main interface p align center img src https github com kobespirit route planning app project finalversion blob master sample 20pics 11 png width 300 p choose map style p align center img src https github com kobespirit route planning app project finalversion blob master sample 20pics 12 png width 300 
p different map style display p align center img src https github com kobespirit route planning app project finalversion blob master sample 20pics 34 png width 300 p p align center img src https github com kobespirit route planning app project finalversion blob master sample 20pics 35 png width 300 p current location display p align center img src https github com kobespirit route planning app project finalversion blob master sample 20pics 17 png width 300 p address auto complete p align center img src https github com kobespirit route planning app project finalversion blob master sample 20pics 18 png width 300 p jump to google to check information p align center img src https github com kobespirit route planning app project finalversion blob master sample 20pics 36 png width 300 p route planning for different mode p align center img src https github com kobespirit route planning app project finalversion blob master sample 20pics 37 png width 300 p about us interface p align center img src https github com kobespirit route planning app project finalversion blob master sample 20pics 24 png width 300 p link to social network p align center img src https github com kobespirit route planning app project finalversion blob master sample 20pics 38 png width 300 p link to aircasting and epa vic p align center img src https github com kobespirit route planning app project finalversion blob master sample 20pics 39 png width 300 p future work some aspects of this project can be improved in the future first of all regarding the material design which is a new interface design language it is based on traditional design principles combined with rich creative and scientific technology it contains visual sports interactive effects and other characteristics it is a new design specification supporting android applications therefore it is necessary to have this application completely added to the material design concept which includes the drawing of the user interface the matching of 
the application colors the design of the interface layout and the addition of the jump animation effect in addition it is expected that in the future time a function called multiple choice route mode could be added when a user starts from one location to another there are multiple routing options available to the user to select the first one is the shortest distance between the two places the second function is between the two sites the lowest air pollution index line which may be the shortest distance the last one is a more comprehensive and reasonable trade off model of the route planning this model needs to be analyzed and calculated for the location selected by the user resulting in a route where the pollution index is within acceptable limits and the distance is also within an acceptable range this feature needs to optimize and improve the existing algorithm finally in the future the memory footprint and compatibility of this mobile application could have substantial improvements this requires a higher level of technology and experience conclusions in conclusion the project has developed a mobile application on the android system which can detect air pollution and plan the routes to minimize pollution exposure the app uses air pollution data from the aircasting project by calling its api interface the application uses google maps and through air pollution data analysis processing and calculation obtains the shortest routes between two points at the same time the optimal route between two places is visualised this route is the integration of the distance and the pollution coefficient it can offer the user a valuable travel reference in the process of implementing this project many challenges and problems have arisen since it is my first time to develop an android application the emergence of a variety of problems made the implementation of the application difficult compared to experienced android programmers for example when the application is running care 
needs to be taken to avoid loading too much data through the launcher in the layout file another issue is that in the map activity layout file the user must accurately identify the relevant functions a key consideration is how to improve user experience every challenge and problem encountered in the process of implementing this project resulted in a solution but other solutions are also possible in addition there are some features and interface performance which could be improved and enhanced in the future references 1 shen my liu cy huang b 2013 a development strategy of m commerce against mobile interne chapter 3 pp 1092 1096 2 nica e 2015 ict innovation internet sustainability and economic development addleton academic publishers 3 barry m vinton g david d 2012 http www internetsociety org internet what internet history internet brief history internet 4 the history of android http www androidcentral com android history 5 epa vic about us http www epa vic gov au about us who we are 6 aircasting project about us http www aircasting org about 7 the difference between functional and non functional requirements http reqtest com requirements blog understanding the difference between functional and non functional requirements 8 hartley s 1998 concurrent programming the java programming language oxford university press new york 9 may t 2016 the beginner s guide to flat design http www creativebloq com graphic design what flat design 3132112 10 delling d sanders p schultes d wagner d 2009 algorithmics of large and complex networks design analysis and simulation springer pp 117 139 appendix application video demo https www youtube com watch v ifty9vtkmrs google maps api instruction https developers google com maps google maps api demo for android https github com googlemaps android samples aircasting project instruction and api https github com habitatmap aircasting | front_end |
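The "trade-off model" described in Future Work, a route whose distance and pollution exposure are both acceptable, can be expressed as an ordinary shortest-path search over a combined edge weight of distance plus alpha times pollution. A sketch with Dijkstra on a toy graph (the graph, the alpha weighting, and all numbers are invented for illustration; the real app would feed in AirCasting measurements):

```python
import heapq

def best_route(graph, start, goal, alpha=1.0):
    """Dijkstra over combined cost; each edge is (neighbor, distance, pollution).

    alpha=0 gives the pure shortest-distance route; larger alpha
    penalizes polluted segments more heavily.
    """
    pq = [(0.0, start, [start])]
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, dist, poll in graph.get(node, []):
            if nbr not in seen:
                heapq.heappush(pq, (cost + dist + alpha * poll, nbr, path + [nbr]))
    return None

# Toy graph: A->B->D is short but polluted, A->C->D is longer but clean.
graph = {
    "A": [("B", 1.0, 5.0), ("C", 2.0, 0.5)],
    "B": [("D", 1.0, 5.0)],
    "C": [("D", 2.0, 0.5)],
}
```

With alpha set to zero the search returns the short polluted route; with pollution weighted in, it prefers the longer clean one, which is exactly the behavior the multiple-choice route mode would expose to the user.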
|
rust-blockchain-example | rust blockchain example simple example for building a blockchain in rust start using bash rust log info cargo run this starts the client locally the blockchain is not persisted anywhere you can start it in multiple terminals to get multiple connected peer to peer clients in each client you can enter the following commands ls p list peers ls c print local chain create b data data is just a string here this creates mines a new block with the data entry data and broadcasts it once a block is created by a node it s broadcasted and the blockchain in all other nodes is updated if it s a valid block on startup a node asks another node on the network for their blockchain and if it s valid and longer than the current local blockchain it updates it s own chain to the longest one it receives this is a very overly simplified offline running highly inefficient and insecure blockchain implementation if a node gets out of sync it s broken this is an example for showing some of the concepts behind building a blockchain system in rust so it shouldn t be used anywhere near a production scenario but you can have fun with it and learn something | blockchain |
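The README's sync rule, adopt a peer's chain only if it is valid and longer than the local one, is easy to state precisely. The repo itself is Rust; the sketch below is a Python illustration of just the hash-linking and choose-chain logic (mining and difficulty are omitted, and the block layout is an assumption, not the repo's actual struct):

```python
import hashlib

def block_hash(index, prev_hash, data):
    return hashlib.sha256(f"{index}|{prev_hash}|{data}".encode()).hexdigest()

def make_block(prev_block, data):
    index = prev_block["index"] + 1
    prev = prev_block["hash"]
    return {"index": index, "prev_hash": prev, "data": data,
            "hash": block_hash(index, prev, data)}

def is_valid(chain):
    """Every block's hash must match its contents and link to its parent."""
    for i, blk in enumerate(chain):
        if blk["hash"] != block_hash(blk["index"], blk["prev_hash"], blk["data"]):
            return False
        if i > 0 and blk["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

def choose_chain(local, remote):
    """Longest valid chain wins; ties and invalid remotes keep the local chain."""
    if is_valid(remote) and len(remote) > len(local):
        return remote
    return local
```

This also shows why a tampered block breaks sync: changing any block's data invalidates its hash, so the whole remote chain is rejected no matter how long it is.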
|
evaluate | p align center br img src https huggingface co datasets evaluate media resolve main evaluate banner png width 400 br p p align center a href https github com huggingface evaluate actions workflows ci yml query branch 3amain img alt build src https github com huggingface evaluate actions workflows ci yml badge svg branch main a a href https github com huggingface evaluate blob master license img alt github src https img shields io github license huggingface evaluate svg color blue a a href https huggingface co docs evaluate index img alt documentation src https img shields io website http huggingface co docs evaluate index svg down color red down message offline up message online a a href https github com huggingface evaluate releases img alt github release src https img shields io github release huggingface evaluate svg a a href code of conduct md img alt contributor covenant src https img shields io badge contributor 20covenant 2 0 4baaaa svg a p evaluate is a library that makes evaluating and comparing models and reporting their performance easier and more standardized it currently contains implementations of dozens of popular metrics the existing metrics cover a variety of tasks spanning from nlp to computer vision and include dataset specific metrics for datasets with a simple command like accuracy load accuracy get any of these metrics ready to use for evaluating a ml model in any framework numpy pandas pytorch tensorflow jax comparisons and measurements comparisons are used to measure the difference between models and measurements are tools to evaluate datasets an easy way of adding new evaluation modules to the hub you can create new evaluation modules and push them to a dedicated space in the hub with evaluate cli create metric name which allows you to see easily compare different metrics and their outputs for the same sets of references and predictions documentation https huggingface co docs evaluate find a metric https huggingface co evaluate 
metric comparison https huggingface co evaluate comparison measurement https huggingface co evaluate measurement on the hub add a new evaluation module https huggingface co docs evaluate evaluate also has lots of useful features like type checking the input types are checked to make sure that you are using the right input formats for each metric metric cards each metric comes with a card that describes the values limitations and their ranges as well as providing examples of their usage and usefulness community metrics metrics live on the hugging face hub and you can easily add your own metrics for your project or to collaborate with others installation with pip evaluate can be installed from pypi and has to be installed in a virtual environment venv or conda for instance bash pip install evaluate usage evaluate s main methods are evaluate list evaluation modules to list the available metrics comparisons and measurements evaluate load module name kwargs to instantiate an evaluation module results module compute kwargs to compute the result of an evaluation module adding a new evaluation module first install the necessary dependencies to create a new metric with the following command bash pip install evaluate template then you can get started with the following command which will create a new folder for your metric and display the necessary steps bash evaluate cli create awesome metric see this step by step guide https huggingface co docs evaluate creating and sharing in the documentation for detailed instructions credits thanks to marella https github com marella for letting us use the evaluate namespace on pypi previously used by his library https github com marella evaluate | evaluation machine-learning | ai
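The load/compute pattern above is the library's whole surface area for most use: `evaluate.load("accuracy")` returns a module whose `compute(predictions=..., references=...)` yields a dict of scores. As a dependency-free sketch of the quantity the accuracy module reports (a stand-in for illustration, not the library's code):

```python
def accuracy(predictions, references):
    """Fraction of positions where prediction equals reference --
    the same quantity evaluate's accuracy module returns, as
    {"accuracy": value}."""
    if len(predictions) != len(references):
        raise ValueError("predictions and references must be the same length")
    correct = sum(p == r for p, r in zip(predictions, references))
    return {"accuracy": correct / len(references)}
```

Returning a dict rather than a bare number mirrors the real API, which lets modules report several values (for example precision and recall) under named keys.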
M2_HomeSecurityWithDoorSensor | m2 homesecuritywithdoorsensor design and simulation of circuits and embedded systems here is a project developed as mini project in embedded c as home security with door sensor door sensors are an essential component of your home security system in this mini project you can get status of our home door which is closed or open if the led connected to atmega port is high means if led is flashing then you can get the idea about status of door i e our door is open similarly if the led connected to atmega controller port is low means led is not flashing then you can get that the door is closed this is all becoz of i have made a system that if door is closed then there will be no supply to the led so led will not glow similarly for open of door there will be supply of 5v that i have made so that led will glow there so this is how you can get status of your home door with this embedded project badges static analysis 1 codacy codacy badge https app codacy com project badge grade 058261b0584f4b10a16f4491a2b5a1b7 https www codacy com gh naiksandesh7175 m2 homesecuritywithdoorsensor dashboard utm source github com amp utm medium referral amp utm content naiksandesh7175 m2 homesecuritywithdoorsensor amp utm campaign badge grade 2 codiga codiga badge https api codiga io project 32911 status svg codiga badge https api codiga io project 32911 score svg | os |
|
Deploy-a-NLP-Similarity-API-using-Docker | restful api for similarity check using natural language processing introduction build a similarity check api using nlp run and deploy using docker docker compose documents similarity document similarity or distance between documents is a one of the central themes in information retrieval how humans usually define how similar are documents usually documents treated as similar if they are semantically close and describe similar concepts on other hand similarity can be used in context of duplicate detection we will review several common approaches imagesimilarity https miro medium com max 1838 1 l bzlw3juhd1mzbnq1mjqa png objective the objective of this api is to handle similarity of text plagiarism check api architecture resources url path method parameters statuscode register a user register post username password 200 ok 301 invalid username detect similarity of docs detect post username password text1 text2 200 ok return similarity 301 invalid username 302 invalid password 303 out of tokens refill refill post username admin pw refill amount 200 ok 301 invalid username 304 invalid admin pw requirements spacy io https spacy io models en is an open source software library for advanced natural language processing written in the programming languages python it is very easy python processing module download the spacy model from here https github com explosion spacy models releases tag en core web sm 2 1 0 flask framework see how to install and run the flask framework here https github com pallets flask for more details https www fullstackpython com flask html pymongo pymongo is a python distribution containing tools for working with mongodb download and install pymongo from here https api mongodb com python current docker https www docker com docker compose yml contributing please feel free to fork this package and contribute by submitting a pull request to enhance the functionalities how can i thank you why not star the github 
repo i d love the attention why not share the link for this repository on twitter hackernews or destructoid spread the word don t forget to follow me on twitter https twitter com thecraftman thanks ore aruwaji tola | python3 flask spacy restful-api natural-language-processing docker docker-compose pymongo api-rest natural-language | ai |
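Under the hood, spaCy's document similarity is the cosine of the two documents' vectors. A minimal stand-in using term-frequency vectors shows the same computation without the model download (illustration only; the API's real scores come from spaCy's trained word vectors, so the numbers will differ):

```python
import math
from collections import Counter

def similarity(text1, text2):
    """Cosine similarity of simple term-frequency vectors, in [0, 1]."""
    a, b = Counter(text1.lower().split()), Counter(text2.lower().split())
    common = set(a) & set(b)
    dot = sum(a[w] * b[w] for w in common)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0
```

In the API's /detect handler, this score is what gets returned to the caller after the username, password, and token balance checks pass.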
fecs | fecs fecs node js build status https img shields io travis ecomfe fecs svg style flat http travis ci org ecomfe fecs build status https img shields io appveyor ci chriswong fecs svg style flat https ci appveyor com project chriswong fecs npm version https img shields io npm v fecs svg style flat https www npmjs com package fecs coverage status https img shields io coveralls ecomfe fecs svg style flat https coveralls io r ecomfe fecs dependencies https img shields io david ecomfe fecs svg style flat https david dm org ecomfe fecs devdependencies https img shields io david dev ecomfe fecs svg style flat https david dm org ecomfe fecs greenkeeper badge https badges greenkeeper io ecomfe fecs svg https greenkeeper io sudo npm install fecs g fecs fecs v fecs check help fecs format help wiki cli https github com ecomfe fecs wiki cli api fecs leadname fecs javascript var fecs require fecs fecs leadname edp fecs getoptions array argv minimist fecs check fecs format javascript var options fecs getoptions process argv slice 2 console log options command check fecs check object options function done javascript options path to check stream options stream yourreadablestream options type js css callback after check finish param boolean success true as all files ok or false param object errors data for check result function done success errors blablabla fecs check options done fecs format object options javascript fecs check options x vim https github com hushicai fecs vim x webstorm https github com leeight baidu fe code style webstorm x eclipse https github com ecomfe fecs eclipse x sublime text 2 3 baidu fe code style https github com leeight baidu fe code style sublime helper https github com baidu lbs opn fe sublime fecshelper sublimelinter contrib fecs https github com robbenmu sublimelinter contrib fecs x visual studio code fecs visual studio code https github com 21paradox fecs visual studio code vscode fecs https github com marxjiao vscode fecs x atom https github 
com 8427003 atom fecs x emacs https github com niandalu flycheck fecs x grunt https github com ecomfe fecs grunt x gulp https github com ecomfe fecs gulp x git hook https github com cxtom fecs git hooks https github com ecomfe fecs wiki faq https github com ecomfe fecs wiki howtofix https github com ecomfe fecs wiki | front_end |
FXTools | for english readme zh md user fxtools a practical tool developed using javafx software development for pc or mobile image tools color tools svg tools font tools library and so on br download download from github windows https github com leewyatt fxtools releases macos https github com leewyatt fxtools releases linux https github com leewyatt fxtools releases download from gitee windows https gitee com leewyatt fxtools releases macos https gitee com leewyatt fxtools releases linux https gitee com leewyatt fxtools releases youtube video introduction https youtu be ldj1wa 2ifm fxtools doc tools doc screenshots screenshots image tools image tools color tools color tools svg tools svg tools font tools font tools library library tips tips thanks thanks donate donate donors donors i wrote an idea plug in named java fxtools before but the api of idea will change and i don t have the time and energy to maintain it all the time so i made this stand alone version rewrite the code re layout and add new functions br the project uses java17 for development but is trying to be compatible with java8 except for screenshot related apis so it doesn t use too many advanced syntax features br the jdk used is liberica 17 now br span id tools doc span a practical tool developed using java fx software development for pc or mobile the main functions are as follows span id screenshots span readme imgs yl 1 png readme imgs yl 2 png readme imgs yl 3 png readme imgs yl 4 png readme imgs yl cn 4 png readme imgs yl cn 1 png readme imgs yl cn 2 png readme imgs yl cn 3 png image tools span id image tools span 1 app icon generator supports icon generation for windows macos linux iphone ipad watchos android and other systems icon icns png 2 image sets generator support to generate multiple images eg 1x 2x 3x of javafx ios android 3 format converter support common image format conversion eg svg webp png bmp jpg gif 4 gif decoder decompose the gif animation into frame by frame pictures 
mainly convenient for game engines such as fxgl 5 image stitching splicing multiple pictures into one picture improving efficiency and reducing the number of io mainly convenient for game engines such as fxgl 6 screenshot take a picture of a specified location on the screen the screenshots of the java 9 version are clear java 8 cannot capture high definition resolution screens so the screenshots are too small under high resolution screens if modify the source code to support java 8 only need to modify a few lines of code related to the screenshot color tools span id color tools span 1 absorb the color of the specified position on the screen 2 the selected color can generate fx css code or java code 3 20 pages of color matching reference 4 convert between multiple color formats hsb rgb hsl hex svg tools span id svg tools span 1 support preview of svg path easy to view the display effect of svg under fx 2 it is convenient to extract the path attribute in the svg file which is convenient for use in fx 3 generate fx css code or java code font tools span id font tools span 1 preview the font effect that comes with the system 2 support adding external fonts preview 3 generate fx css code or java code library span id library span 1 reference awesomejavafx https github com mhrimaz awesomejavafx lists many great open source libraries books etc tips span id tips span 1 when processing images multi threading is supported the number of threads can be set on the settings page the default is 2 threads 2 turn off the image preview option and turn off the parse image size option in the settings page will get a faster loading speed of the image 3 turn off parsing image size and generating thumbnails when loading images which can speed up image processing 4 after the image processing is completed the output directory of the image will be opened by default 5 this tool support darkmode and lightmode thanks span id thanks span img src https www ej technologies com images product 
banners install4j large png width 128 thanks to ej technologies https www ej technologies com for their open source license https www ej technologies com buy install4j opensource we use install4j to build installers img src https gluonhq com wp content uploads 2015 01 gluon logo 2x png width 5 thanks to gluon https gluonhq com for documents img src https gluonhq com wp content uploads 2015 02 scenebuilderlogo 2x png width 5 thanks to scenebuilder https github com gluonhq scenebuilder used colorpicker doubletextfield etc img src https controlsfx github io images controlsfx png width 15 thanks to controlsfx https github com controlsfx controlsfx thanks to abhinay agarwal https github com abhinayagarwal for help thanks to https blog thetbw xyz for providing the storage thanks to anivie https github com anivie for testing documentation etc thanks to openjfx https openjfx io for documents sample project maven plugin etc thanks to awesomejavafx https github com mhrimaz awesomejavafx thanks to guava https github com google guava thanks to gson https github com google gson thanks to webp imageio https github com sejda pdf webp imageio for image processing thanks to thumbnailator https github com coobird thumbnailator for image processing thanks to image4j https github com imcdonagh image4j for image processing thanks to apache commons imaging https github com apache commons imaging for image processing thanks to batik https github com apache xmlgraphics batik for svg processing thanks to animated gif lib https github com rtyley animated gif lib for java for image processing thanks to twelvemonkeys https github com haraldk twelvemonkeys for image processing thanks to icns https github com gino0631 icns for image processing thanks to web color https gitee com song xiansen web color for color matching thanks to various references on the internet br support and donations span id donate span you can contribute and support this project by doing any of the following star the 
project on github give feedback commit pr contribute your ideas suggestions share fxtools with your friends colleagues if you like fxtools please consider donating br a href https www buymeacoffee com fxtools target blank img src https cdn buymeacoffee com buttons v2 default yellow png alt buy me a coffee style height 60px important width 217px important a br src main resources images donate wx png br src main resources images donate zfb png br note after using alipay wechat to pay for your donation please provide your name nickname and website by leaving a message or via email in the following format name nickname website message website and message are optional example leewyatt github com leewyatt i like fxtools if you choose to send an email please also provide the following information text donation amount amount payment platform alipay wechat pay payment number last 5 digits number email address leewyatt7788 gmail com mailto click to send email the name website and total donation amount you provide will be added to the donor list br thank you for your support donors span id donors span name website message amount https secaitong tmall com shop view shop htm spm a230r 1 14 10 65396cf7wjjng 288 00 cny cierconnor 200 cny 188 88 cny mnefo 50 cny 35 cny 30 cny 10 cny ik 10 cny gio 8 cny 6 66 cny andy97 6 66 cny 6g 5 cny 5 cny kiki 3 cny 6666 0 01 cny | javafx font image tools svg app color icon java | front_end |
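The FXTools color tools above convert between multiple color formats (HSB, RGB, HSL, HEX). As a language-neutral illustration of what such a conversion involves (this is not FXTools' own Java code; it is a sketch using Python's standard `colorsys` module, and the function names here are invented for the example):

```python
import colorsys

def hex_to_hsv(hex_color):
    """Convert a '#rrggbb' string to an (h, s, v) tuple, each in [0, 1]."""
    hex_color = hex_color.lstrip('#')
    r, g, b = (int(hex_color[i:i + 2], 16) / 255.0 for i in (0, 2, 4))
    return colorsys.rgb_to_hsv(r, g, b)

def hsv_to_hex(h, s, v):
    """Convert (h, s, v) components in [0, 1] back to a '#rrggbb' string."""
    r, g, b = colorsys.hsv_to_rgb(h, s, v)
    return '#%02x%02x%02x' % tuple(round(c * 255) for c in (r, g, b))
```

For example, `hex_to_hsv('#ff0000')` yields pure red as `(0.0, 1.0, 1.0)`, and `hsv_to_hex` maps it back to `'#ff0000'`.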
opencv-gui-parameter-tuner | OpenCV parameter tuner: building a GUI tool to find the right parameters visually and quickly for various computer vision algorithms. This is a work in progress. Currently you can find parameters for the following algorithms.
Smoothing an image: you can smooth (blur) an image with one filter or with a combination of up to 4 different types of blur filters. Provides: average filter size, Gaussian filter size, median filter size, bilateral filter size.
Canny edge detection: provides upper threshold, lower threshold, Gaussian filter size.
Read more in my blog post: https medium com maunesh finding the right parameters for your computer vision algorithm d55643b6f954 ss8w8scjf | computer-vision opencv computer-vision-algorithms | ai
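The upper/lower threshold pair that this tuner exposes for Canny edge detection works by hysteresis: gradient magnitudes above the upper threshold are strong edges, and magnitudes between the two thresholds survive only if connected to a strong edge. The toy below (a hedged, pure-Python 1-D sketch of that thresholding idea, not the tuner's code, which uses OpenCV's `cv2.Canny`) shows how moving the two thresholds changes which values are kept:

```python
def hysteresis_1d(magnitudes, low, high):
    """Classify gradient magnitudes with double thresholding.

    Values > high are strong edges and always kept; values in (low, high]
    are weak and kept only if connected (directly, or through a chain of
    other kept weak values) to a strong edge. Everything else is dropped.
    """
    strong = [m > high for m in magnitudes]
    weak = [low < m <= high for m in magnitudes]
    keep = strong[:]
    changed = True
    while changed:  # propagate "edge-ness" outward from strong values
        changed = False
        for i, is_weak in enumerate(weak):
            if is_weak and not keep[i]:
                if (i > 0 and keep[i - 1]) or (i + 1 < len(keep) and keep[i + 1]):
                    keep[i] = True
                    changed = True
    return keep
```

With `low=50, high=100`, the run `[10, 60, 120, 60, 10]` keeps the middle three values (the 60s attach to the strong 120), while `[10, 60, 60, 60, 10]` keeps nothing, because no value crosses the upper threshold.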
wp-react-starter | wp react starter wordpress react boilerplate deprecated wp react starter was a research project of devowl io https devowl io for the development of our wordpress plugins unfortunately we don t have enough resources to regularly contribute the developments of our private monorepo to wp react starter you are welcome to continue using or forking this project but it will no longer be updated or extended with new features structures etc img align right src https assets devowl io git wp react starter logo png alt wp react starter logo height 180 create multiple wordpress plugins that use react typescript and object oriented php in a fully customizable docker development environment commited in a monorepo wow i didn t know the wordpress plugin development could look like this instant no config plugin creation with create wp react app https github com devowlio create wp react app github stars https img shields io github stars devowlio wp react starter style flat logo github https github com devowlio wp react starter join on slack https img shields io badge slack join green svg style flat logo slack https matthias web com slack codecov https codecov io gl devowlio wp reactjs starter branch master graph badge svg https codecov io gl devowlio wp reactjs starter gitlab ci cd https img shields io badge ci 20 2f 20cd see 20history green logo gitlab https gitlab com devowlio wp reactjs starter pipelines why wordpress plugin development is fun with wp react starter everyone tells us wordpress plugins are a mess our answer is always let s take this opportunity to make the system that powers every third website on the internet better with wp react starter we have created a modern wordpress development boilerplate which contains everything you are used to from modern web development projects react frontend for reactive user interfaces with php fallback for server side rendering react is a part of wordpress since the gutenberg https wordpress org gutenberg release 
typescript for typesafe frontend development php in an object oriented style with namespaces for better backend code docker development environment to develop all you plugins without manual setup steps ci cd integration for automated code quality checks and release management publish on wordpress org https wordpress org plugins developers or wherever you want does that sound like crappy wordpress plugin development or what you really have been looking for for your plugins for a long time let s start today with your first wordpress plugin create it within 5 minutes thanks to our cli create wp react app https github com devowlio create wp react app client side features familiar react api patterns es6 with typescript react https reactjs org with babel env preset hooks mobx https github com mobxjs mobx for state management webpack https webpack js org build for assets core js https github com zloirock core js puts automatically needed polyfills to your distribution files sourcemap https www html5rocks com en tutorials developertools sourcemaps generation for debugging purposes css and typescript files sass http sass lang com stylesheets compiler scss files for next gen css postcss http postcss org for transforming scss including autoprefixing to css minified sources automatically generated for production js css grunt https gruntjs com for automation tasks build the installable plugin eslint https eslint org predefined configuration for proper linting typedoc https typedoc org guides doccomments for javascript documentation wp hookdoc https github com matzeeable wp hookdoc for filters actions documentation translation i18n with automatic generation of pot files add on development multiple wordpress plugins based on a predefined utils package that allows you to share typescript types across plugins admin backend components in this case an own page with a button admin ts frontend components in this case a simple widget widget ts server side features oop style for building 
a high quality php development php gt 5 6 required an admin notice is showed when not available wordpress gt 5 2 required an admin notice is showed when not available with a link to the updater php codesniffer https github com squizlabs php codesniffer predefined configuration for proper linting namespace http php net manual en language namespaces rationale php support autloading http php net manual en language oop5 autoload php classes in connection with namespaces wp rest api v2 http v2 wp api org for api programming no longer use admin ajax php for crud operations script debug https codex wordpress org debugging in wordpress script debug enables not minified sources for debug sources use in connection with yarn build js development cachebuster http www adopsinsider com ad ops basics what is a cache buster and how does it work for public resources automatic generation of pot files for translating i18n the backend plugin phpdocumentor https github com phpdocumentor phpdocumentor2 for php documentation apidoc http apidocjs com for api documentation automation features avoid repetitive work and develop more feature workspace creation with end to end setup create wp react app create workspace plugin creation with monorepo integration create wp react app create plugin package creation with monorepo integration create wp react app create package predefined gitlab ci https about gitlab com product continuous integration example for continous integration read more using ci cd scoping https github com humbug php scoper your php coding and dependencies so they are isolated avoid dependency version conflicts packaging and publishing of you plugin wordpress org https wordpress org plugins developers read more https devowlio gitbook io wp react starter gitlab integration deploy wp org license checker https www npmjs com package license checker for automated 3th party code license scanning and compliance check developer experience features providing the right development 
environment for high quality plugins built on top of visual studio code https code visualstudio com extensions are automatically installed all your plugins within yarn workspaces https yarnpkg com lang en docs workspaces prettier https prettier io for automatic javascript typescript code formatting on save vscode required php codesniffer s cbf https github com squizlabs php codesniffer wiki fixing errors automatically for automatic php code formatting on save vscode required husky https github com typicode husky integration for code formatting before git commit never have ugly code in your repository husky is also used for commitlint https github com conventional changelog commitlint to become a common commit message style in your repository lerna https lerna js org for semantic versioning and changelog generation webpackbar https github com nuxt webpackbar so you can get a real progress bar while development docker https www docker com for a local development environment predefined wordpress stubs so you get autocompletion for wordpress classes and functions e g add action within the docker environment you have wp cli https developer wordpress org cli commands available predefined review apps https docs gitlab com ee ci review apps example for branch deployment read more here using ci cd predefined vscode php debugging environment testing features cover your source code with test code to to guarantee the last piece quality phpunit https phpunit de for php unit testing jest https jestjs io for typescript unit and snapshot testing collect code coverage reports with a single command in each package automatically push coverage reports to codecov io https codecov io cypress https www cypress io for end to end e2e tests gherkin https cucumber io docs gherkin syntax to write e2e features combined with cypress automatically failure a gitlab ci pipeline if a coverage percent is not reached threshold the complete test suite is integrated in gitlab ci documentation you want 
to dive deep into the documentation of wp react starter check we convinced another developer to write high quality wordpress plugins usage getting started https devowlio gitbook io wp react starter usage getting started folder structure https devowlio gitbook io wp react starter usage folder structure root https devowlio gitbook io wp react starter usage folder structure root plugin https devowlio gitbook io wp react starter usage folder structure plugin available commands https devowlio gitbook io wp react starter usage available commands root https devowlio gitbook io wp react starter usage available commands root plugin https devowlio gitbook io wp react starter usage available commands plugin package https devowlio gitbook io wp react starter usage available commands package php development predefined constants https devowlio gitbook io wp react starter php development predefined constants predefined classes https devowlio gitbook io wp react starter php development predefined classes example implementations https devowlio gitbook io wp react starter php development example implementations add new classes hooks and libraries https devowlio gitbook io wp react starter php development add classes hooks libraries localization https devowlio gitbook io wp react starter php development localization debugging https devowlio gitbook io wp react starter php development debugging typescript development utils package https devowlio gitbook io wp react starter typescript development utils package example implementations https devowlio gitbook io wp react starter typescript development example implementations add external library https devowlio gitbook io wp react starter typescript development add external library consume php variable https devowlio gitbook io wp react starter typescript development consume php variable using entrypoints https devowlio gitbook io wp react starter typescript development using entrypoints localization https devowlio gitbook io wp react 
starter typescript development localization advanced build production plugin https devowlio gitbook io wp react starter advanced build production plugin how cachebuster works https devowlio gitbook io wp react starter advanced how cachebuster works tests https devowlio gitbook io wp react starter advanced tests extend compose and webpack https devowlio gitbook io wp react starter advanced extend compose webpack create package https devowlio gitbook io wp react starter advanced create package create add on https devowlio gitbook io wp react starter advanced create add on persistent database snapshot https devowlio gitbook io wp react starter advanced persistent database snapshot showcase https devowlio gitbook io wp react starter advanced showcase license checker https devowlio gitbook io wp react starter advanced license checker gitlab integration predefined pipeline https devowlio gitbook io wp react starter gitlab integration predefined pipeline extend gitlab ci pipeline https devowlio gitbook io wp react starter gitlab integration extend gitlab ci pipeline use own runner https devowlio gitbook io wp react starter gitlab integration use own runner review applications https devowlio gitbook io wp react starter gitlab integration review applications deploy wordpress org https devowlio gitbook io wp react starter gitlab integration deploy wp org licensing thank you for your interest in wp react starter this boilerplate was developed organically over years and we at devowl io https devowl io bring all our experience from best selling wordpress plugins like wordpress real media library https codecanyon net item wordpress real media library media categories folders 13155134 as well as customer web development orders to this project with wp react starter you get dozens of hundred working hours compressed into one easy to use solution we would like to share our knowledge and solution with you to make the development of wordpress plugins more professional but we are even 
happier if you also share your knowledge to make this project even better wp react starter is licensed partly under gnu general public license v3 0 gpl v3 0 or later https www gnu org licenses gpl 3 0 en html and partly under our isc license isc https opensource org licenses isc feel free to develop high quality wordpress plugins at light speed with wp react starter in real projects don t worry it s free to use for all non commercial and commercial wordpress plugins | wordpress-plugin-boilerplate wordpress-boilerplate wordpress-plugin-skeleton wordpress-skeleton wordpress-plugin-development wordpress-plugin wordpress-react wordpress-typescript wordpress-object-oriented-php wordpress-oop-php wordpress-docker wordpress-plugin-monorepo wordpress-gitlab-ci wordpress-continuous-integration wordpress-mobx wordpress-saas wordpress-i18n wordpress-cypress wordpress-jest wordpress-phpunit | front_end |
DeployableComputerVisionApp | Deploying. Install the IBM Cloud CLI (https cloud ibm com docs cli topic cli install ibmcloud cli) and the Cloud Foundry CLI (https github com cloudfoundry cli wiki v6 cli installation guide).
1. Git clone existing repo (done)
2. Delete package-lock and yarn.lock (done)
3. Create .env, .cfignore, manifest.yml (done)
4. Update model pointer (done)
5. Delete package-lock.json and yarn.lock, and npm install (done)
6. Build app (done): npm run build
7. Log into IBM Cloud CLI (done): ibmcloud login; ibmcloud target --cf; ibmcloud target -r au-syd; ibmcloud cf push tensorflowapp
9. Login, create VCAP variables (done): ibmcloud cf logs tfcvapp | ai
roadmap-for-NLP | A Natural Language Processing roadmap for beginners 🤖🤓
img src https media0 giphy com media 13k4vsc3nglpuy giphy gif cid ecf05e47x8xl40r6pvs1rx7ou2c6i1hh0zaas3i77f4zdp2i rid giphy gif ct g width 500px
📚 Summary: 1. What is NLP? 2. Why NLP, and what are the benefits? 3. Main uses of NLP. 4. Recommendation of papers on the use of NLP. 5. Best programming language. 6. Roadmap. 7. Open datasets for practical hands-on projects. 8. Books you should have. 9. Important researchers in this field that you need to follow.
💻 1. What is NLP? According to IBM, natural language processing (NLP) refers to the branch of computer science, and more specifically the branch of artificial intelligence (AI), concerned with giving computers the ability to understand text and spoken words in much the same way human beings can. NLP combines computational linguistics (rule-based modeling of human language) with statistical, machine learning, and deep learning models. Together, these technologies enable computers to process human language in the form of text or voice data and to understand its full meaning, complete with the speaker's or writer's intent and sentiment.
👩‍💻 2. Why NLP, and what are the benefits? You have probably wondered what the benefit of using NLP in systems is. Let's see a little bit about it 🤩: 1. Perform large-scale analysis: NLP technology allows for text analysis at scale on all manner of documents, internal systems, emails, social media data, online reviews, and more. 2. Get a more objective and accurate analysis. 3. Improve customer satisfaction. 4. Better understand your market. 5. Get actionable insights.
💻 3. Main uses of NLP: email filters, smart assistants, search results, autocomplete and autocorrect text, language translation, chatbots.
📃 4. Recommendation of papers on the use of NLP. In addition to the examples
above i m bringing some super cool research carried out throughout the year that used nlp for different purposes it s really worth looking at these papers and getting inspired wink 1 balakrishnan v khan s fernandez t arabnia h r 2019 cyberbullying detection on twitter using big five and dark triad features personality and individual differences 141 252 257 https www sciencedirect com science article pii s0191886919300364 casa token 3q6htjpu5nuaaaaa agichfotss lr2id911kd9olc6 ag0tfcmvssscgvm6mhhc zzazv3cmzcfrszteobgnmujrda 2 yang x mcewen r ong l r zihayat m 2020 a big data analytics framework for detecting user level depression from social networks international journal of information management 54 102141 https www sciencedirect com science article pii s0268401219313325 casa token zaa4sd4fmz4aaaaa r4s6ppk5rbqkc9uaynkbib1phgj00tudc9v6 3j2d5lsifsb7ew7auih2mdf8pqtrnfda1h4iw 3 shi a qu z jia q lyu c 2020 november rumor detection of covid 19 pandemic on online social networks in 2020 ieee acm symposium on edge computing sec pp 376 381 ieee https ieeexplore ieee org abstract document 9355716 4 vo t sharma r kumar r son l h pham b t tien bui d le t 2020 crime rate detection using social media of different crime locations and twitter part of speech tagger with brown clustering journal of intelligent fuzzy systems 38 4 4287 4299 https content iospress com articles journal of intelligent and fuzzy systems ifs190870 5 shu k zhou x wang s zafarani r liu h 2019 august the role of user profiles for fake news detection in proceedings of the 2019 ieee acm international conference on advances in social networks analysis and mining pp 436 439 https dl acm org doi abs 10 1145 3341161 3342927 a name programming a keyboard 5 best programming language for nlp 1 python python is very popular in this field because of its versatility and it offers developers a lot of libraries which handle many nlp related tasks like topic modeling document classification sentiment analysis etc 2 java java 
is another commonly used programming language in the field of natural language processing with the help of this language you can explore how to organize text utilizing full text search information extraction clustering and tagging you can use this libs opennlp lingpipe and stanford corenlp 3 r while r is popular for being used in statistical learning it s widely used for natural language processing i recommend you start with python because there re lots of things that make python the best programming language for a natural language processing project popular libraries in python for you learn nltk spacy core nlp text blob pynlpi gensim pattern a name roadmap a white check mark 6 roadmap prerequisite it s very importat you have some knowledge in machine learning especially supervised learning img src https github com lauradamacenoalmeida roadmap for nlp blob main roadmap png width 800px a name databases a mag 7 open datasets for pratical hands on projects general kaggle http kaggle com enron dataset https www cs cmu edu enron recommender systems datasets https cseweb ucsd edu jmcauley datasets html project gutenberg https www gutenberg org sentiment analysis yelp reviews https www yelp com dataset dictionaries for movies and finance https github com nproellochs sentimentdictionaries opinrank dataset http kavita ganesan com entity ranking data y0wjms 5rmd amazon reviews https www kaggle com datasets bittlingmayer amazonreviews portuguese tweets https www kaggle com datasets augustop portuguese tweets for sentiment analysis financial news https www kaggle com datasets ankurzing sentiment analysis for financial news women s e commerce clothing reviews https www kaggle com datasets nicapotato womens ecommerce clothing reviews text rick morty scripts https www kaggle com datasets andradaolteanu rickmorty scripts the wikiqa corpus https www microsoft com en us download details aspx id 52419 from http 3a 2f 2fresearch microsoft com 2fapps 2fmobile 2fdownload aspx 3fp 
3d4495da01 db8c 4041 a7f6 7984a4f6a905 european parliament proceedings parallel corpus https www statmt org europarl jeopardy https www reddit com r datasets comments 1uyd0t 200000 jeopardy questions in a json file legal case reports dataset https archive ics uci edu ml datasets legal case reports scifi stories text corpus https www kaggle com datasets jannesklaas scifi stories text corpus ecommerce text classification https www kaggle com datasets saurabhshahane ecommerce text classification tweets about lord of the rings the rings of power https www kaggle com datasets paulk1992 tweets about lord of the rings the rings of power a name books a sunglasses 8 books you should have natural language processing with python pratical natural language processing natural language processing in action text mining with r applied text analysis with python natural language processing with pytorch deep learning with text a name researchers a woman scientist 9 important researchers in this field that you need to follow all these researchers had more important contributions to the academic environment and were highly cited in google scholar diana maynard https scholar google com citations hl pt br user yzzpj2oaaaaj nigel collier https scholar google com citations hl pt br user zmelba0aaaaj siegfried handschuh https scholar google com citations hl pt br user zl 3hgqaaaaj hoifung poon https scholar google com citations hl pt br user yqqmvbkaaaaj anders s gaard https scholar google com citations hl pt br user x3i4cryaaaaj references https www analyticsvidhya com blog 2022 01 roadmap to master nlp in 2022 https www linkedin com pulse nlp roadmap machine learning 2022 arya soni trk articles directory https www ibm com cloud learn natural language processing https monkeylearn com blog nlp benefits https odsc medium com 20 open datasets for natural language processing 538fbfaf8e38 | ai |
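The roadmap above recommends Python and lists libraries such as NLTK, spaCy, and Gensim. Before reaching for any of them, it helps to see the most basic preprocessing step in miniature. This is a hedged, stdlib-only sketch (a real project would use the libraries from the list; the tiny stop-word set here is illustrative only): tokenize, drop stop words, and count a bag of words.

```python
import re
from collections import Counter

# Deliberately tiny stop-word list for illustration; real projects use
# the curated lists shipped with NLTK or spaCy.
STOP_WORDS = {'the', 'a', 'an', 'is', 'are', 'of', 'to', 'and'}

def bag_of_words(text):
    """Lowercase, tokenize on alphanumeric runs, drop stop words, count."""
    tokens = re.findall(r'[a-z0-9]+', text.lower())
    return Counter(t for t in tokens if t not in STOP_WORDS)
```

For example, `bag_of_words("NLP is the analysis of text, and text is data.")` counts `text` twice while discarding `is`, `the`, `of`, and `and`.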
seniorProject | Senior design project final report and accompanying poster (Fall 2015): encrypted radio communication via an embedded system. The actual code is listed in a separate project for the Tiny Encryption Algorithm. | os
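The Tiny Encryption Algorithm mentioned above is compact enough to sketch in full. This is a hedged reference sketch of standard TEA (64-bit block as two 32-bit words, 128-bit key as four 32-bit words, 32 rounds) written in Python for illustration; the project's actual code lives in its separate repository and presumably targets the embedded platform in C.

```python
DELTA = 0x9E3779B9   # TEA's key-schedule constant
MASK = 0xFFFFFFFF    # keep arithmetic in 32 bits

def tea_encrypt(block, key, rounds=32):
    """Encrypt (v0, v1) with key (k0, k1, k2, k3)."""
    v0, v1 = block
    s = 0
    for _ in range(rounds):
        s = (s + DELTA) & MASK
        v0 = (v0 + (((v1 << 4) + key[0]) ^ (v1 + s) ^ ((v1 >> 5) + key[1]))) & MASK
        v1 = (v1 + (((v0 << 4) + key[2]) ^ (v0 + s) ^ ((v0 >> 5) + key[3]))) & MASK
    return v0, v1

def tea_decrypt(block, key, rounds=32):
    """Invert tea_encrypt by running the rounds backwards."""
    v0, v1 = block
    s = (DELTA * rounds) & MASK
    for _ in range(rounds):
        v1 = (v1 - (((v0 << 4) + key[2]) ^ (v0 + s) ^ ((v0 >> 5) + key[3]))) & MASK
        v0 = (v0 - (((v1 << 4) + key[0]) ^ (v1 + s) ^ ((v1 >> 5) + key[1]))) & MASK
        s = (s - DELTA) & MASK
    return v0, v1
```

A round trip (`tea_decrypt(tea_encrypt(block, key), key)`) returns the original block, which is the standard sanity check for a block cipher implementation.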
Tap-the-Red | Tap the Red: final project for CS 298 (Embedded Systems), a mini-game designed on a breadboard connected to an Arduino. This project entailed the usage of different pieces of hardware studied in the embedded systems course in order to create any machine or any working computer of our choosing. Tap the Red is a simple game where a 4x4 display continually changes between colors and players must hit the red button when the display color changes to red. At every successful hit the game gets progressively faster, and the player wins once every level (differentiated by the speed at which colors change) is passed. The project was coded in C, and the components used were: breadboard, Arduino Mini, 4x4 LED display, 7-segment display, toggle button, cardboard, red tape. The .ino file is the file that was transferred to the Arduino board in order to run the game. | os
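The game loop described above is easy to model abstractly. This hedged Python sketch (the real project is Arduino C; the class and parameter names here are invented for illustration) captures the core rules: the display cycles colors, a button press scores only while the color is red, and each completed level shortens the color-change interval so the game speeds up.

```python
import random

COLORS = ['red', 'green', 'blue', 'yellow']

class TapTheRed:
    def __init__(self, start_interval_ms=1000, speedup=0.8, hits_per_level=5):
        self.interval_ms = start_interval_ms   # time between color changes
        self.speedup = speedup                 # interval multiplier per level
        self.hits_per_level = hits_per_level
        self.hits = 0
        self.level = 1
        self.color = 'green'

    def next_color(self):
        """Advance the display to a random color (the timed tick)."""
        self.color = random.choice(COLORS)
        return self.color

    def press(self):
        """Handle a button press; scores only if the display is red."""
        if self.color != 'red':
            return False
        self.hits += 1
        if self.hits % self.hits_per_level == 0:
            self.level += 1
            self.interval_ms *= self.speedup   # the game gets faster
        return True
```

With `hits_per_level=2`, two successful red presses advance the player to level 2 and shrink the interval from 1000 ms to 800 ms.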
AppBackend | AppBackend: development of the backend of an Instagram-like app using Go. | server
Angular-Firebase-CURD | Angular Firebase CRUD with a Redux pipeline. Firebase is a mobile and web application development platform (developed by Firebase, now part of Google); we used it as a backend API in our Angular 8 project. | agular firebase-database json asynchronous-api curd | front_end
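CRUD (the repository name's "CURD") stands for create, read, update, delete. As a framework-neutral illustration of those four operations (this is not the app's Angular/Firebase code; it is a hedged sketch against an in-memory store, with invented names):

```python
class CrudStore:
    """Minimal in-memory CRUD store keyed by auto-incrementing ids."""

    def __init__(self):
        self._items = {}
        self._next_id = 1

    def create(self, data):
        """Insert a record and return its new id."""
        item_id = self._next_id
        self._next_id += 1
        self._items[item_id] = dict(data)
        return item_id

    def read(self, item_id):
        """Return the record for item_id, or None if absent."""
        return self._items.get(item_id)

    def update(self, item_id, data):
        """Merge data into an existing record; False if absent."""
        if item_id not in self._items:
            return False
        self._items[item_id].update(data)
        return True

    def delete(self, item_id):
        """Remove a record; True if it existed."""
        return self._items.pop(item_id, None) is not None
```

A backend like Firebase maps the same four verbs onto persisted documents; the control flow is identical even though the storage is remote and asynchronous.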
ruby-nlp | ruby natural language processing resources a collection of natural language processing nlp ruby libraries tools and software suggestions and contributions are welcome categories apis apis bitext alignment bitext alignment books books case case chatbot chatbot classification classification date and time date and time emoji emoji error correction error correction full text search full text search keyword ranking keyword ranking language detection language detection language localization language localization lexical databases and ontologies lexical databases and ontologies machine learning machine learning machine translation machine translation miscellaneous miscellaneous multipurpose tools multipurpose tools named entity recognition named entity recognition ngrams ngrams numbers numbers parsers parsers part of speech taggers part of speech taggers readability readability regular expressions regular expressions ruby nlp presentations ruby nlp presentations sentence generation sentence generation sentence segmentation sentence segmentation speech to text speech to text stemmers stemmers stop words stop words summarization summarization text extraction text extraction text similarity text similarity text to speech text to speech tokenizers tokenizers word count word count apis 3rd party nlp services client libraries to various 3rd party nlp api services alchemy api https github com dbalatero alchemy api provides a client api library for alchemyapi s nlp services aylien textapi ruby https github com aylien aylien textapi ruby aylien s officially supported ruby client library for accessing text api biffbot https github com tevren biffbot ruby gem for diffbot http www diffbot com s apis that extract articles products images videos and discussions from any web page gengo ruby https github com gengo gengo ruby a ruby library to interface with the gengo api for translation monkeylearn ruby https github com monkeylearn monkeylearn ruby build and consume machine 
learning models for language processing from your ruby apps poliqarpr https github com apohllo poliqarpr ruby client for poliqarp text corpus server wlapi https github com arbox wlapi ruby based api for the project wortschatz leipzig instant messaging bots client server libraries to various 3rd party instant messengers chat bots apis facebook messenger botstack https github com davidmann4 botstack rapid fb chatbot development with ruby on rails facebook messenger https github com hyperoslo facebook messenger definitely the best ruby client for bots on messenger messenger ruby https github com netguru messenger ruby a simple library for supporting implementation of facebook messenger bot in ruby on rails kik kik https github com muaad kik build www kik com bots in ruby microsoft bot framework skype botbuilder https dev botframework com rest apis for skype and others instant messaging apps botframework ruby https github com tachyons botframework ruby microsoft bot framework ruby client slack slack bot server https github com dblock slack bot server a grape api serving a slack bot to multiple teams slack ruby bot https github com dblock slack ruby bot the easiest way to write a slack bot in ruby slack ruby client https github com dblock slack ruby client a ruby and command line client for the slack web and real time messaging apis slack ruby gem https github com aki017 slack ruby gem a ruby wrapper for the slack api telegram messenger botserver https github com solyaris botserver telegram bot api webhooks framework for rubyists telegrambot https github com eljojo telegram bot a charismatic ruby client for telegram s bot api telegrambotruby https github com shouya telegram bot yet another client for telegram s bot api telegram bot ruby https github com atipugin telegram bot ruby ruby wrapper for telegram s bot api wechat wechat https github com eric guo wechat api command and message handling for wechat http admin wechat com wiki index php title guide for message api 
in rails wechat api https github com lazing wechat api api natural language understanding tools dialogflow ruby client https github com dialogflow dialogflow ruby client a ruby sdk to the https dialogflow com natural language processing service expando https github com expando lang expando a translation language for defining user utterance examples in conversational interfaces for dialogflow https dialogflow com and similars wit ruby https github com wit ai wit ruby easy interface for wit ai natural language parsing voice based devices bots client server libraries to various 3rd party voice based devices apis amazon echo alexa skills alexa home https github com zachfeldman alexa home using amazon echo to control the home alexa hue https github com sarkonovich alexa hue control hue lights with alexa alexa ruby https github com mulev alexa ruby ruby toolkit for amazon alexa service alexa rubykit https github com damianfc alexa rubykit amazon echo alexa s app kit ruby implementation alexa skill https github com skierkowski alexa skill a ruby based dsl to create new alexa skills alexa skills ruby https github com danelbert alexa skills ruby simple library to interface with the alexa skills kit books mastering regular expressions http isbn directory book 9780596528126 by jeffrey e f friedl regular expressions cookbook http isbn directory book 9781449319434 by jan goyvaerts steven levithan regular expression pocket reference http isbn directory book 9780596514273 by tony stubblebine text processing with ruby https pragprog com book rmtpruby text processing with ruby by rob miller thoughtful machine learning a test driven approach http www amazon com thoughtful machine learning test driven approach dp 1449374069 ref sr 1 1 ie utf8 qid 1410923833 sr 8 1 keywords thoughtful machine learning by matthew kirk understanding computation http isbn directory book 9781449329273 by tom stuart bitext alignment bitext alignment is the process of aligning two parallel documents on a 
segment by segment basis in other words if you have one document in english and its translation in spanish bitext alignment is the process of matching each segment from document a with its corresponding translation in document b alignment https github com bloomrain alignment alignment functions for corpus linguistics gale church implementation case active support https github com rails rails tree master activesupport lib active support the rails active support gem has various string extensions that can handle case e g mb chars upcase to s or transliterate string pl https github com apohllo string pl additional support for polish encodings in ruby 1 9 twitter cldr rb https github com twitter twitter cldr rb blob master lib twitter cldr shared casefolder rb casefolding u http disu se software u 1 0 u extends ruby s unicode support unicode https github com blackwinter unicode unicode normalization library unicode utils https github com lang unicode utils unicode algorithms for ruby 1 9 chatbot chatterbot https github com muffinista chatterbot a straightforward ruby based twitter bot framework using oauth to authenticate jeffbot https github com armmaster17 jeffbot yet another comical and extensible chat bot lita https github com jimmycuadra lita lita is a chat bot written in ruby with persistent storage provided by redis megahal https github com jasonhutchens megahal megahal is a learning chatterbot markov chain bot module https github com lavirthewhiolet markov chain bot module a chat bot utilizing markov chains it speaks russian and english stealth https github com hellostealth stealth an open source ruby framework for conversational voice and text chatbots classification classification aims to assign a document or piece of text to one or more classes or categories making it easier to manage or sort classifier https github com cardmagic classifier a general module to allow bayesian and other types of classifications classifier reborn https github com jekyll 
classifier reborn a fork of cardmagic classifier a general classifier module to allow bayesian and other types of classifications fasttext ruby https github com ankane fasttext efficient text classification and representation learning for ruby latent dirichlet allocation https github com ealdent lda ruby used to automatically cluster documents into topics liblinear ruby swig https github com tomz liblinear ruby swig ruby interface to liblinear much more efficient than libsvm for text classification and other large linear classifications linnaeus https github com djcp linnaeus a redis backed bayesian classifier maxent string classifier https github com mccraigmccraig maxent string classifier a jruby maximum entropy classifier for string data based on the opennlp maxent framework naive bayes https github com reddavis naive bayes simple naive bayes classifier nbayes https github com oasic nbayes a full featured ruby implementation of naive bayes omnicat https github com mustafaturan omnicat a generalized rack framework for text classifications omnicat bayes https github com mustafaturan omnicat bayes naive bayes text classification implementation as an omnicat classifier strategy stuff classifier https github com alexandru stuff classifier a library for classifying text into multiple categories date and time chronic https github com mojombo chronic a pure ruby natural language date parser chronic between https github com jrobertson chronic between a simple ruby natural language parser for date and time ranges chronic duration https github com hpoydar chronic duration a simple ruby natural language parser for elapsed time dotiw https github com radar dotiw better distance of time in words for rails http ryanbigg com kronic https github com xaviershay kronic a dirt simple library for parsing and formatting human readable dates nickel https github com iainbeeston nickel extracts date time and message information from naturally worded text tickle https github com yb66 
tickle a natural language parser for recurring events time ago in words https github com elgalu time ago in words humanize elapsed time from some time instance to time now time lord https github com krainboltgreene time lord adds extra functionality to the time class emoji active emoji https github com sferik active emoji a collection of emoji aliases for core ruby methods emoji https github com wpeterson emoji a gem for emoji for everyone gemoji https github com github gemoji emoji images and names gemoji parser https github com gmac gemoji parser the missing helper methods for github s gemoji gem rumoji https github com mwunsch rumoji encode and decode emoji unicode characters into emoji cheat sheet form article http mwunsch tumblr com post 34721548842 we need to talk about emoji error correction chat correct https github com diasks2 chat correct shows the errors and error types when a correct english sentence is diffed with an incorrect english sentence gingerice https github com subosito gingerice ruby wrapper for correcting spelling and grammar mistakes based on the context of complete sentences full text search ferret https github com jkraemer ferret an information retrieval library in the same vein as apache lucene ranguba http ranguba org a project to provide a full text search system built on groonga thinking sphinx https github com pat thinking sphinx sphinx plugin for activerecord rails keyword ranking graph rank https github com louismullie graph rank ruby implementation of the pagerank and textrank algorithms highscore https github com domnikl highscore find and rank keywords in text language detection compact language detection https github com jtoy cld blazing fast language detection for ruby provided by google chrome s compact language detector detect language api client https github com detectlanguage detectlanguage ruby detects language of given text and returns detected language codes and scores whatlanguage https github com peterc whatlanguage a 
language detection library for ruby that uses bloom filters for speed language localization fast gettext https github com grosser fast gettext ruby gettext but 3 5x faster 560x less memory simple clean namespace threadsafe extendable multiple backends rails3 ready ruby gettext https github com ruby gettext gettext pure ruby localization l10n library and tool which is modeled after the gnu gettext package lexical databases and ontologies lexical databases knowledge base common sense multilingual lexicalized semantic networks and ontologies babelnet babelnet api client http babelnet org guide api with ruby examples for babelnet http babelnet org multilingual lexicalized semantic network and ontology conceptnet conceptnet api https github com commonsense conceptnet5 wiki api rest api for conceptnet https github com commonsense conceptnet5 wiki mediawiki wikipedia mediawiki ruby api https github com wikimedia mediawiki ruby api github mirror of mediawiki ruby api our actual code is hosted with gerrit please see https www mediawiki org wiki developer access for contributing wikipedia client https github com kenpratt wikipedia client ruby client for the wikipedia api http github com kenpratt wikipedia client wordnet ruby wordnet https github com ged ruby wordnet a ruby interface to the wordnet lexical database http deveiate org projects ruby wordnet rwordnet https github com doches rwordnet a pure ruby interface to the wordnet lexical semantic database machine learning decision tree https github com igrigorik decisiontree a ruby library which implements id3 information gain algorithm for decision tree learning rb libsvm https github com febeling rb libsvm implementation of svm a machine learning and classification algorithm rubyfann https github com tangledpath ruby fann a ruby gem that binds to fann fast artificial neural network from within a ruby rails environment tensorflow rb https github com somaticio tensorflow rb tensorflow for ruby tensor stream https github com
jedld tensor stream a ground up and standalone reimplementation of tensorflow for ruby machine translation google api client https github com google google api ruby client google api ruby client microsoft translator https github com ikayzo microsoft translator ruby client for the microsoft translator api termit https github com pawurb termit google translate with speech synthesis in your terminal as ruby gem miscellaneous abbrev http ruby doc org stdlib 2 0 0 libdoc abbrev rdoc abbrev html calculates the set of unique abbreviations for a given set of strings calyx https github com maetl calyx a ruby library for generating text with declarative recursive grammars dialable https github com chorn dialable a ruby gem that provides parsing and output of north american numbering plan nanp phone numbers and includes location time zones gibber https github com timonv gibber gibber replaces text with nonsensical latin with a maximum size difference of 30 hiatus https github com ahanba hiatus a localization qa tool language filter https github com chrisvfritz language filter a ruby gem to detect and optionally filter multiple categories of language naturally https github com dogweather naturally natural version number sorting with support for legal document numbering college course codes and unicode rltk https github com chriswailes rltk the ruby language toolkit http chriswailes github io rltk ruby spacy https github com yohasebe ruby spacy a wrapper module for using spacy https spacy io natural language processing library from the ruby programming language via pycall https github com mrkn pycall rb shellwords http ruby doc org stdlib 2 0 0 libdoc shellwords rdoc shellwords html manipulates strings like the unix bourne shell sort alphabetical https github com grosser sort alphabetical sort utf8 strings alphabetical via enumerable extension spintax parser https github com flintinatux spintax parser a mixin to parse spintax a text format used for automated article generation 
can handle nested spintax stringex https github com rsl stringex some hopefully useful extensions to ruby s string class twitter text https github com twitter twitter text tree master rb gem that provides text processing routines for twitter tweets nameable https github com chorn nameable a ruby gem that provides parsing and output of person names as well as gender ethnicity matching multipurpose tools the following are libraries that integrate multiple nlp tools or functionality nlp https github com knife nlp nlp tools for the polish language nlptoolz https github com lefnord nlp toolz basic nlp tools mostly based on opennlp at this time sentence finder tokenizer and pos tagger implemented plus berkeley parser open nlp ruby bindings https github com louismullie open nlp stanford core nlp ruby bindings https github com louismullie stanford core nlp treat https github com louismullie treat natural language processing framework for ruby twitter cldr rb https github com twitter twitter cldr rb twittercldr uses unicode s common locale data repository cldr to format certain types of text into their localized equivalents ve https github com kimtaro ve a linguistic framework that s easy to use zipf https github com pks zipf a collection of various nlp tools and libraries named entity recognition confidential info redactor https github com diasks2 confidential info redactor a ruby gem to semi automatically redact confidential information from a text ruby ner https github com mblongii ruby ner named entity recognition with stanford ner and ruby ruby nlp https github com tiendung ruby nlp ruby binding for stanford pos tagger and named entity recognizer ngrams n gram https github com reddavis n gram n gram generator in ruby ngram https github com tkellen ruby ngram break words and phrases into ngrams raingrams https github com postmodern raingrams a flexible and general purpose ngrams library written in ruby numbers humanize https github com radar humanize takes your numbers
and makes them fancy numbers and words https github com kslazarev numbers and words convert numbers to words using i18n numbers in words https github com markburns numbers in words to convert numbers into english words and vice versa parsers a natural language parser is a program that works out the grammatical structure of sentences for instance which groups of words go together as phrases and which words are the subject or object of a verb linkparser https github com ged linkparser a ruby binding for the abiword version of cmu s link grammar a syntactic parser of english parslet http kschiess github io parslet a small peg based parser library rley https github com famished tiger rley ruby gem implementing a general context free grammar parser based on earley s algorithm treetop https github com cjheath treetop a ruby based parsing dsl based on parsing expression grammars part of speech taggers engtagger https github com yohasebe engtagger english part of speech tagger library a ruby port of lingua en tagger rbtagger http rbtagger rubyforge org a simple ruby rule based part of speech tagger treetagger for ruby https github com lefnord rstt ruby based wrapper for the treetagger by helmut schmid treetagger ruby https github com arbox treetagger ruby the ruby based wrapper for the treetagger by helmut schmid readability lingua https github com dbalatero lingua lingua en readability is a ruby module which calculates statistics on english text regular expressions commonregexruby https github com talyssonoc commonregexruby find a lot of kinds of common information in a string regexp examples https github com tom lord regexp examples generate strings that match a given regular expression verbal expressions https github com ryan endacott verbal expressions make difficult regular expressions easy online resources explain regular expression http regexdoc com re explain pl breakdown and explanation of each part of your regular expression rubular http rubular com a ruby 
regular expression editor ruby nlp presentations quickly create a telegram bot in ruby tutorial http www sitepoint com quickly create a telegram bot in ruby ardian haxha 2016 n gram analysis for fun and profit tutorial http www blackbytes info 2015 09 ngram analysis ruby jesus castello https github com matugm 2015 machine learning made simple with ruby tutorial http www leanpanda com blog 2015 08 24 machine learning automatic classification lorenzo masini https github com rugginoso 2015 using ruby machine learning to find paris hilton quotes tutorial http datamelon io blog 2015 using ruby machine learning id paris hilton quotes html rick carlino https github com rickcarlino 2015 exploring natural language processing in ruby slides http www slideshare net diasks2 exploring natural language processing in ruby kevin dias https github com diasks2 2015 natural language parsing with ruby tutorial http blog glaucocustodio com 2014 11 10 natural language parsing with ruby glauco custódio https github com glaucocustodio 2014 demystifying data science analyzing conference talks with rails and ngrams video railsconf 2014 https www youtube com watch v 2zdcxwb29bg repo from the video https github com genius abstractogram todd schneider https github com toddwschneider 2014 natural language processing with ruby video arrrrcamp 2014 https www youtube com watch v 5u86qvh8r0m video ruby conf india https www youtube com watch v ofmy qbq5du konstantin tennhard https github com t6d 2014 how to parse go natural language processing in ruby slides http www slideshare net tomcartwright natual language processing in ruby tom cartwright https github com tomcartwrightuk 2013 natural language processing in ruby slides https speakerdeck com brandonblack natural language processing in ruby video http confreaks tv videos railsconf2013 natural language processing with ruby brandon black https github com brandonblack 2013 natural language processing with ruby n grams tutorial http www sitepoint com
natural language processing ruby n grams nathan kleyn https github com nathankleyn 2013 a tour through random ruby tutorial http www sitepoint com tour random ruby robert qualls 2013 sentence generation gabbler https github com michaeldv gabbler gab bler noun rapid unintelligible talk faker https github com stympy faker a library for generating fake data such as names addresses and phone numbers kusari https github com takuti kusari japanese random sentence generator based on markov chain literate randomizer https github com imikimi literate randomizer using markov chains this generates near english prose markov sentence generator https github com hrs markov sentence generator generates a random locally correct sentence using textual input and a markov model marky markov https github com zolrath marky markov markov chain generator poem generator https github com mindreframer poem generator a generator for gothic poems poetry https github com adimichele poetry poetry generator pwqgen rb https github com iphoting pwqgen rb ruby implementation of passwdqc s pwqgen a random pronounceable password generator ramble https github com saaadhu ramble library for generating sentences from a yacc grammar token phrase https github com genericsteele token phrase a token phrase generator sentence segmentation sentence segmentation aka sentence boundary disambiguation sentence boundary detection is the problem in natural language processing of deciding where sentences begin and end sentence segmentation is the foundation of many common nlp tasks machine translation bitext alignment summarization etc pragmatic segmenter https github com diasks2 pragmatic segmenter punkt segmenter https github com lfcipriani punkt segmenter tactfultokenizer https github com zencephalon tactful tokenizer scalpel https github com louismullie scalpel srx english https github com apohllo srx english speech to text att speech https github com adhearsion att speech a ruby library for consuming the at t
speech api for speech to text pocketsphinx ruby https github com watsonbox pocketsphinx ruby ruby speech recognition with pocketsphinx speech2text https github com taf2 speech2text using google speech to text api provide a simple interface to convert audio files stemmers stemming is the term used in linguistic morphology and information retrieval to describe the process for reducing inflected or sometimes derived words to their word stem base or root form greek stemmer https github com skroutz greek stemmer a greek stemmer ruby stemmer https github com aurelian ruby stemmer ruby stemmer exposes the snowball api to ruby sastrawi https github com meisyal sastrawi ruby ruby bindings for sastrawi a library which allows you to stem words in bahasa indonesia turkish stemmer https github com skroutz turkish stemmer a turkish stemmer uea stemmer https github com ealdent uea stemmer a conservative stemmer for search and indexing stop words clarifier https github com meducation clarifier stopwords https github com brez stopwords really just a list of stopwords with some helpers stopwords filter https github com brenes stopwords filter a very simple and naive implementation of a stopwords filter that remove a list of banned words stopwords from a sentence summarization automatic summarization is the process of reducing a text document with a computer program in order to create a summary that retains the most important points of the original document epitome https github com mcfreely epitome a small gem to make your text shorter an implementation of the lexrank algorithm ots https github com deepfryed ots ruby bindings to open text summarizer summarize https github com ssoper summarize ruby c wrapper for open text summarizer text extraction docsplit http documentcloud github io docsplit docsplit is a command line utility and ruby library for splitting apart documents into their component parts rtesseract https github com dannnylo rtesseract ruby library for working with the 
tesseract ocr ruby readability https github com cantino ruby readability a tool for extracting the primary readable content of a webpage ruby tesseract https github com meh ruby tesseract ocr this wrapper binds the tessbaseapi object through ffi inline which means it will work on jruby too and then proceeds to wrap said api in a more ruby esque engine class yomu https github com erol yomu a library for extracting text and metadata from files and documents using the apache tika content analysis toolkit text similarity amatch https github com flori amatch collection of five types of distances between strings including levenshtein sellers jaro winkler pair distance last one seems to work well to find similarity in long phrases damerau levenshtein https github com globalnamesarchitecture damerau levenshtein calculates edit distance using the damerau levenshtein algorithm fuzzymatch https github com seamusabshere fuzzy match find a needle in a haystack based on string similarity and regular expression rules fuzzy string match https github com kiyoka fuzzy string match fuzzy string matching library for ruby fuzzytools https github com brianhempel fuzzy tools in memory tf idf fuzzy document finding with a fancy default tokenizer tuned on diverse record linkage datasets for easy out of the box use going the distance https github com schneems going the distance contains scripts that do various distance calculations hotwater https github com colinsurprenant hotwater fast ruby ffi string edit distance algorithms levenshtein ffi https github com dbalatero levenshtein ffi fast string edit distance computation using the damerau levenshtein algorithm soundex https github com mindaslab soundex a soundex function coded in ruby text https github com threedaymonk text collection of text algorithms tf idf https github com reddavis tf idf term frequency inverse document frequency in ruby tf idf similarity https github com jpmckinney tf idf similarity calculate the similarity between
texts using tf idf text to speech espeak ruby https github com dejan espeak ruby small ruby api for utilizing espeak and lame to create text to speech mp3 files isabella https github com chrisvfritz isabella a voice computing assistant built in ruby tts https github com c2h2 tts a ruby gem for converting text to speech using the google translate service tokenizers jieba https github com mimosa jieba jruby chinese tokenizer and segmenter jruby mecab https github com markburns mecab japanese morphological analyzer mecab heroku buildpack https github com diasks2 heroku buildpack mecab nlp pure https github com parhamr nlp pure natural language processing algorithms implemented in pure ruby with minimal dependencies pragmatic tokenizer https github com diasks2 pragmatic tokenizer a multilingual tokenizer to split a string into tokens rseg https github com yzhang rseg a chinese word segmentation routine in pure ruby textoken https github com manorie textoken simple and customizable text tokenization gem thailang4r https github com veer66 thailang4r thai tokenizer tiny segmenter https github com 6 tiny segmenter ruby port of tinysegmenter js for tokenizing japanese text tokenizer https github com arbox tokenizer a simple multilingual tokenizer word count wc https github com thesp0nge wc a rubygem to count word occurrences in a given text word count https github com atelierconvivialite word count a word counter for string and hash in ruby word count analyzer https github com diasks2 word count analyzer analyzes a string for potential areas of the text that might cause word count discrepancies depending on the tool used wordscounted https github com abitdodgy words counted a highly customisable ruby text analyser | ai |
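The classification section above lists several naive Bayes gems (classifier-reborn, nbayes, naive_bayes); as a minimal pure-Ruby sketch of the idea they implement, here is a multinomial naive Bayes classifier with Laplace smoothing. It uses only the standard library, and the `TinyBayes` name and its API are illustrative, not taken from any of the listed gems:

```ruby
# Minimal multinomial naive Bayes text classifier (illustrative only;
# gems like classifier-reborn or nbayes are far more complete).
class TinyBayes
  def initialize
    @word_counts = Hash.new { |h, k| h[k] = Hash.new(0) }  # category => word => count
    @doc_counts  = Hash.new(0)                             # category => number of docs
  end

  def train(category, text)
    @doc_counts[category] += 1
    text.downcase.scan(/[a-z']+/).each { |w| @word_counts[category][w] += 1 }
  end

  def classify(text)
    words = text.downcase.scan(/[a-z']+/)
    total_docs = @doc_counts.values.sum.to_f
    vocab = @word_counts.values.flat_map(&:keys).uniq.size
    @doc_counts.keys.max_by do |cat|
      cat_total = @word_counts[cat].values.sum.to_f
      score = Math.log(@doc_counts[cat] / total_docs)  # log prior
      words.each do |w|
        # Laplace smoothing so unseen words don't zero out the product
        score += Math.log((@word_counts[cat][w] + 1) / (cat_total + vocab))
      end
      score
    end
  end
end

bayes = TinyBayes.new
bayes.train(:spam, "buy cheap pills now")
bayes.train(:ham,  "meeting notes for tuesday")
puts bayes.classify("cheap pills")  # => spam
```

Working in log space avoids floating-point underflow when multiplying many small word probabilities, which is why real implementations do the same.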
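The ngrams section above lists generator gems (n_gram, ruby-ngram, raingrams); the core operation is small enough to sketch with `Enumerable#each_cons` from Ruby's standard library. This is a word-level sketch independent of those gems, and the `ngrams` helper name is illustrative:

```ruby
# Build word-level n-grams using Enumerable#each_cons from the stdlib.
def ngrams(text, n)
  text.downcase.scan(/[a-z']+/).each_cons(n).map { |g| g.join(" ") }
end

p ngrams("the quick brown fox", 2)
# => ["the quick", "quick brown", "brown fox"]
```

Character-level n-grams work the same way by calling `each_cons` on `text.chars` instead of the word array.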
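The sentence segmentation section above defines the task as deciding where sentences begin and end; a naive regex split shows both the idea and why dedicated gems like pragmatic_segmenter exist — the sketch below (the `naive_sentences` helper is illustrative) breaks on abbreviations such as "Dr.":

```ruby
# Naive sentence segmentation: split after ., !, or ? followed by whitespace.
# Real segmenters (e.g. pragmatic_segmenter) handle abbreviations, decimals,
# ellipses, and many other edge cases this regex cannot.
def naive_sentences(text)
  text.split(/(?<=[.!?])\s+/)
end

p naive_sentences("It works. Does it? Yes!")
# => ["It works.", "Does it?", "Yes!"]
```

On "Dr. Smith left." this sketch wrongly splits after "Dr.", which is exactly the class of error the listed segmenters are built to avoid.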
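Several of the text similarity gems above (amatch, damerau-levenshtein, levenshtein-ffi) compute Levenshtein edit distance; the classic dynamic-programming version fits in a few lines of plain Ruby. This is a sketch for illustration — the C- and FFI-backed gems are much faster on real workloads:

```ruby
# Classic O(m*n) dynamic-programming Levenshtein edit distance,
# keeping only a single row of the DP table in memory.
def levenshtein(a, b)
  row = (0..b.length).to_a
  a.each_char.with_index(1) do |ca, i|
    prev_diag = row[0]
    row[0] = i
    b.each_char.with_index(1) do |cb, j|
      cost = ca == cb ? 0 : 1
      # min of deletion, insertion, and substitution/match
      prev_diag, row[j] = row[j], [row[j] + 1, row[j - 1] + 1, prev_diag + cost].min
    end
  end
  row[-1]
end

p levenshtein("kitten", "sitting")  # => 3
```

The single-row formulation uses O(n) memory instead of the full O(m*n) table, a common optimization in the gems listed above.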
developerlife.com | i don t have jekyll and ruby installed installing ruby jekyll and running this project to use jekyll on macos you ll need ruby an interpreter for the ruby programming language plus gems software libraries containing the jekyll static site framework run the following commands in your terminal app 1 xcode select install 1 brew install ruby 1 go to the folder in which you ve cloned this repo 1 run bundle install like npm install and will download deps 1 and run jekyll serve like npm run serve will launch server creating a new project using jekyll 1 in order to create a new project using jekyll 1 go to a folder that you
want to create your new website under eg github 1 run jekyll new jekyll test 1 your new site will be created in github jekyll test 1 run jekyll serve to run it 1 point your web browser to http localhost 4000 i have jekyll and ruby installed and want to run this project running the site if you already have ruby installed after you clone the repo go to the jekyll test folder and 1 run bundler takes the gemfile imports and installs them 1 run jekyll serve builds the static site and serves it on port 4000 1 open http localhost 4000 in your browser rss readers and hero image handling using the hero image property in the yaml header of each md file in posts folder doesn t work with rss readers feedly and fluent rss reader here is the new preferred way of adding hero images so they work w rss readers instead of using this yaml hero image property key it is better to remove it and just add the image directly using an img tag like so img class post hero image src assets hero image here relative url i successfully tested this using fluent rss reader on http 127 0 0 1 4000 feed xml customize minima theme jekyll is configured to use minima theme this means that there are some files that are pulled from this dependency by jekyll when it builds the static site this dependency is located on your computer in echo bundle info path minima save the path to an environment variable called minima home using set minima home bundle info path minima here is an example of the files in the minima home folder text home nazmul ruby gems minima 2 5 1 assets main scss minima social icons svg includes disqus comments html layouts default html sass minima base scss layout scss syntax highlighting scss minima scss as you can imagine in order to customize this theme you can simply provide a file that is in your repo that is located on a similar path to the path that is in minima home more info here https ouyi github io post 2017 12 23 jekyll customization html if you edit these minima files by accident
you will need sudo access to edit them you can simply regenerate them by running bundle install force overriding files in the base theme the interesting files are 1 minima scss sass minima scss 2 styles scss sass styles scss which is imported by the minima scss notes if we provide our own copy of these files in a similar path in this repo then they will simply be considered overrides by jekyll when it builds the static site think of this as operator overloading but for files so if the minima scss file is found in this repo then it overrides the equivalent one in the base theme located in minima home look at the bottom of the minima scss file and you will see imports that pull in styles scss and syntax scss used for syntax highlighting i ve created a file sass minima scss which overrides the corresponding file in the base theme this is where i do a lot of big customizations like creating variables and using import to bring in other scss files here are some examples of this scss font face font family jetbrains mono src url assets jetbrainsmono jetbrainsmono regular woff2 format woff2 brand color 2f9ece default text color e6e6e6 default import other scss files minima base override the minima theme files minima layout override the minima theme files syntax custom syntax highlighting not using minima defaults styles custom styles not using minima defaults import minima base minima layout syntax styles these import statements bring in lots of other scss files one of them handles syntax highlighting more on this below how to customize syntax highlighting here s a site assets main css map file that is generated as part of the build process which is driven by some key value pairs in the config yml file which has a list of all the scss files that are actually imported to give a clear picture of what files are actually used to generate the single site assets main css file every time jekyll generates the static site css version 3 file main css sources main scss sass minima
scss sass minima base scss sass minima layout scss sass syntax scss sass styles scss sourcescontent import minima n charset utf 8 n n font face n font family jetbrains mono n names mappings acea uaau how to customize syntax highlighting the syntax scss sass syntax scss file actually contains all the syntax highlighting scss this overrides whatever comes w minima it does come w some defaults in minima home sass minima syntax highlighting scss there s a repo called pygments css https github com richleland pygments css which i simply copy from in this repo find the styling that you like and just copy paste the contents of that file into the syntax scss file as described in the comments in this file and it will be applied when jekyll builds the static site documentation and references on jekyll styling minima customization and sass jekyll docs on styling https jekyllrb com docs step by step 07 assets minima docs https github com jekyll minima tutorial on customization https ouyi github io post 2017 12 23 jekyll customization html sass basics https sass lang com guide add support for mermaid diagrams more info on mermaid mermaid install guide https github com mermaid js mermaid blob develop docs n00b gettingstarted md mermaid theming guide https mermaid js github io mermaid theming mermaid live editor https mermaid live edit to add mermaid diagrams to markdown files on the site you add snippets like the following div class mermaid graph td a christmas get money b go shopping b c let me think b g another c one d laptop c two e iphone c three f fa fa car car subgraph section c d e f g end div by default the dark theme font and color overrides are provided in mermaid html includes mermaid html if you wish to override them you can do as follows some of these theme variables don t work in overrides via init or specifying them in mermaid initialize block here s a snippet that overrides the default theme and font family div class mermaid init theme dark themevariables fontfamily
fira mono sequencediagram autonumber participant created not running created not running running startticking activate running participant running rect rgb 83 82 101 0 25 loop ticking running running ontick end end running stopped stopticking alt duration is set running stopped duration has passed end deactivate running div mailchimp form for newsletter sign up https us14 admin mailchimp com account connected sites app selection https www youtube com watch v zhhy4twpfz4 forms on mailchimp make sure to remove address this tutorial shows how to remove the mailing address that is automatically added to many things on mailchimp remove your address https www denisejoanne com remove address from mailchimp footer confirmation i new email template https us14 admin mailchimp com campaigns edit id 8994713 0 the mailing address was removed from here subscribe embedded form https us14 admin mailchimp com audience forms embedded form editor id 491533 this form is located in subscribe html popup subscribe form https us14 admin mailchimp com signup forms popup forms editor id 70673 site id 81293 this form is configured to popup when 1 2 of a page is scrolled and appears on the right side of the page not a modal subscribe confirmation form https us14 admin mailchimp com lists designer id 491533 the mailing address was removed from here references running github pages locally setting github pages site locally with jekyll http tinyurl com yytw8hus more info on jekyll and liquid printing debug variables http tinyurl com y763y5lx true and false in http tinyurl com ya793347 control flow http tinyurl com yd9ls9ut change master to main the internet engineering task force ietf points out https tools ietf org id draft knodel terminology 00 html rfc section 1 1 1 that master slave is an oppressive metaphor that will and should never become fully detached from history as well as in addition to being inappropriate and arcane the master slave metaphor https github com bitkeeper scm bitkeeper 
blob master doc howto ask wt mc id blog scottha l231 l232 is both technically and historically inaccurate there s lots of more accurate options depending on context and it costs me nothing to change my vocabulary especially if it is one less little speed bump to getting a new person excited about tech you might say i m all for not using master in master slave technical relationships but this is clearly an instance of master copy not master slave but that may not be the case https mail gnome org archives desktop devel list 2019 may msg00066 html turns out the original usage of master in git very likely came from another version control system bitkeeper that explicitly had a notion of slave branches https dev to lukeocodes change git s default branch from master 19le https www hanselman com blog easilyrenameyourgitdefaultbranchfrommastertomain aspx blacklivesmatter https blacklivesmatter com | kotlin kotlin-android android jetbrains-plugin javascript typescript react redux xml-parsing annotation-processing testing tdd ux-engineering webservice rust tui | cloud |
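To make the syntax-highlighting override above concrete: the approach described is to paste a pygments-css style sheet into the overriding `_sass` syntax file so Jekyll prefers it over minima's copy. A minimal illustrative fragment follows — the `.highlight` selectors are the standard Rouge/Pygments token classes, but the specific colors shown here are placeholders, not the theme's actual values.

```scss
// _sass syntax override -- illustrative fragment in the pygments-css style;
// Jekyll picks this file over the copy shipped with minima because it
// sits on the same relative path inside the repo
.highlight .k  { color: #66d9ef; }                     // keywords
.highlight .s  { color: #e6db74; }                     // strings
.highlight .c  { color: #75715e; font-style: italic; } // comments
.highlight .nf { color: #a6e22e; }                     // function names
```

Any complete pygments-css theme file can be dropped in the same way; the selectors stay stable, only the color values change.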
Kodluyoruz-MGD-Bootcamp-Homework3 | kodluyoruz mgd bootcamp homework3 kodluyoruz mobile game development bootcamp 3rd homework br a simple game that contains checkpoints the game contains 3 scenes main scene game scene and winning scene the game starts with the main scene after the player clicks the play button the game scene opens and the game starts the player can move with the wasd keys or the arrow keys and jump with the space key the player s goal is to pass the checkpoints in order the order is from dark to light color in this case the main color is blue when the game starts the position of the checkpoints is randomly assigned there are 25 possible positions in total where they can be if the player passes through the correct checkpoint the color of the checkpoint will be permanently green if the player passes through the wrong checkpoint the color of the checkpoint will be red for a short time and returns to its original color if the player passes the final checkpoint the game ends and the winning scene opens br screenshots table tr th main screen th th gameplay screen th tr tr td img src https user images githubusercontent com 55920002 96650918 647fe180 133c 11eb 817e ab2d9e221564 png td td img src https user images githubusercontent com 55920002 96650997 8c6f4500 133c 11eb 83fb 9784a69e74f8 png td tr tr th if the player passes wrong checkpoint th th winning screen th tr tr td img src https user images githubusercontent com 55920002 96651018 9abd6100 133c 11eb 9aab 850871aada00 png td td img src https user images githubusercontent com 55920002 96651035 a27d0580 133c 11eb 9c3a 8563e8eeb087 png td tr table can i play the game too of course you can play it by running the homework3 exe file under the build folder | csharp unity unity3d unity-3d unity3d-games | front_end |
Avant | avant avant information technologies | server |
|
ervell | ervell front end are na client built using artsy s ezel https github com artsy ezel also made possible by artsy s very generous decision to open source their front end client many patterns also bits of code probably adopted from force https github com artsy force public thanks y all the general idea here is something minimal utilitarian unobtrusive and adaptible to many different situations we try to make use of re usable components and views as often as possible getting started local fork this repo obtain a set of env files env env staging env prod from dzucconi https github com dzucconi or broskoski https github com broskoski install yarn http brewformulas org yarn install dependencies yarn install running the server bash yarn start dev or yarn start staging yarn start prod listening on port 5000 deploying production url www are na https www are na create a new pull request from the master branch against the deploy branch merges into deploy are automatically deployed to our production instance start a deploy https github com aredotna ervell compare deploy master expand 1 deploying staging url staging are na https staging are na merges into master are automatically deployed to our staging instance you can deploy your current local branch to staging if need be by exporting out the appropriate s3 keys and running yarn run deploy staging | front_end |
|
APP-Develop | app develop the best app developers for contract in gurgaon another versatile application structure and format again and again the more intricate the undertaking is the better is released out of our portable application engineers we have the capacity to convey the best portable application revealing motor and versatile application backend system to satisfy the requests of clients the android app developers are likewise advancing better approaches to make new highlights they are including new functionalities into their applications it is expanding the degrees of these applications and the stages as well individuals can get more highlights applications and accomplish their work simpler than previously thus in the event that you need a best portable application in your business the best ios and android application advancement organization ought to be your first need it will assist you with reaching the most profitable crowd in your field to go for a site and application as a matter of first importance there are include your applications works and versatile application advancement organization data appslure is a brilliant portable application development company in delhi delhi ncr noida gurgaon india our application engineers group are master in ios android iphone app service appslure not just needs to be an offshore improvement organization however your extended it group who you can depend on top points of mobile app development improves efficiency offers high scalability makes sure about your app data coordinates with existing software simple to maintain improves customer relationship encourages new client data retrieval gives real time project access straightforwardness in project management record digital files for accountability | server |
|
auto-cot | auto cot automatic chain of thought prompting in large language models iclr 2023 open auto cot in colab https colab research google com assets colab badge svg https colab research google com github amazon science auto cot blob main try cot colab ipynb cheer ai up with the let s think step by step prompt more plz let s think not just step by step but also one by one auto cot uses more cheers diversity to save huge manual efforts in chain of thought prompt design matching or even exceeding performance of manual design on gpt 3 check out our 25 page paper https arxiv org pdf 2210 03493 pdf for more information https user images githubusercontent com 22279212 194787183 a1f8dff8 a0ad 43a1 827f 819671503860 png https user images githubusercontent com 22279212 194787130 d28c9191 588c 41d2 a259 62377f19c934 png requirements python 3 8 pip install torch 1 8 2 cu111 torchtext 0 9 2 f https download pytorch org whl lts 1 8 torch lts html pip install r requirements txt datasets download the datasets from the following https github com kojima takeshi188 zero shot cot tree main dataset https github com kojima takeshi188 zero shot cot tree main log quick start see try cot ipynb instructions construct demos python run demo py task multiarith pred file log multiarith zero shot cot log demo save dir demos multiarith run inference python run inference py dataset multiarith demo path demos multiarith output dir experiment multiarith citing auto cot inproceedings zhang2023automatic title automatic chain of thought prompting in large language models author zhang zhuosheng and zhang aston and li mu and smola alex booktitle the eleventh international conference on learning representations iclr 2023 year 2023 security see contributing contributing md security issue notifications for more information license this project is licensed under the apache 2 0 license | large-language-models prompt-engineering gpt-3 gpt3-prompts gpt3-resources reasoning chain-of-thought | ai |
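To make the idea above concrete, here is a minimal sketch of how a chain-of-thought prompt with demonstrations can be assembled. The function name and the demo tuple layout are hypothetical, not the repository's API; in Auto-CoT the demos themselves would be selected by clustering the questions and eliciting each rationale with the zero-shot "Let's think step by step" trigger.

```python
# Illustrative sketch of Auto-CoT-style prompt assembly (names are
# hypothetical, not the repository's API). Each demo is a (question,
# rationale, answer) triple; the rationale would normally be generated
# by the model itself via the zero-shot trigger.
TRIGGER = "Let's think step by step."

def build_prompt(demos, question):
    """Concatenate demos, then the new question followed by the trigger."""
    parts = [
        f"Q: {q}\nA: {TRIGGER} {rationale} The answer is {answer}."
        for q, rationale, answer in demos
    ]
    parts.append(f"Q: {question}\nA: {TRIGGER}")
    return "\n\n".join(parts)

demos = [("There are 3 cars and 2 more arrive. How many cars are there?",
          "3 cars plus 2 arriving cars makes 5 cars.", "5")]
print(build_prompt(demos, "I have 4 apples and eat 1. How many remain?"))
```

The completion returned for the trailing trigger is then parsed for the final answer, which is the same extraction step the zero-shot-CoT baseline uses.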
AzureRTOS-ThreadX-For-Arduino azure rtos threadx for arduino arduino ci https github com xiongyu0523 azurertos threadx for arduino workflows arduino ci badge svg https github com marketplace actions arduino ci this is a port of azure rtos threadx to arduino as a library for more information about azure rtos please visit microsoft doc and source code on github a new azure rtos threadx for arduino 101 threads https www hackster io 485734 azure rtos threadx for arduino 101 threads 963a8d is available on hackster io hardware support the port and provided demo is verified on the following boards and arduino cores board chip architecture verified arduino core seeeduino xiao https wiki seeedstudio com seeeduino xiao atsamd21 https www microchip com en us products microcontrollers and microprocessors 32 bit mcus sam 32 bit mcus sam d cortex m0 seeed studio arduinocore samd 1 8 3 https github com seeed studio arduinocore samd seeeduino wio terminal https wiki seeedstudio com wio terminal getting started atsamd51 https www microchip com en us products microcontrollers and microprocessors 32 bit mcus sam 32 bit mcus sam d cortex m4 seeed studio arduinocore samd 1 8 3 https github com seeed studio arduinocore samd b l4s5i iot01a https www st com en evaluation tools b l4s5i iot01a html stm32l4s5 https www st com zh microcontrollers microprocessors stm32l4r5 s5 html cortex m4 stm32duino arduino core stm32 2 3 0 https github com stm32duino arduino core stm32 32f746gdiscovery https www st com en evaluation tools 32f746gdiscovery html stm32f746 https www st com en microcontrollers microprocessors stm32f7x6 html cortex m7 stm32duino arduino core stm32 2 3 0 https github com stm32duino arduino core stm32 version the version of this library is evolving independently of the azure rtos threadx version here is a tracking table library version threadx version note v1 0 0 v6 1 7 https github com azure rtos threadx tree v6 1 7 rel initial release v1 0 1 v6 1 12 https github com azure rtos threadx
tree v6 1 12 rel update threadx version v1 0 2 v6 1 12 https github com azure rtos threadx tree v6 1 12 rel add arduino ide keywords v1 0 3 v6 1 12 https github com azure rtos threadx tree v6 1 12 rel add new example for 101 course license this repository inherit azure rtos license from microsoft see license txt license txt and licensed hardware txt licensed hardware txt | os |
|
safety-on-streets | safety on streets award winning ios app designed for the 2018 dallas regional science amp engineering fair according to the association for safe international road travel asirt nearly 1 3 million people die in road accidents each year about 3 287 deaths per day and 20 50 million people are injured or disabled because of road hazards to my surprise i found that no real time hazard reporting solution exists which public safety departments across the nation can use to clear road hazards in real time this shocking problem along with the near death situation of my father and brother motivated me to develop an ios app to report the gps coordinates of road hazards straight to a public safety agency like the txdps with just a single tap on a smartphone screen i used tools like xcode the swift programming language and several devices to develop and test my app the app consists mainly of a real time map view of the current location and a hazard reporting button to report gps coordinates to a public safety server as well as a feature to make an sos call in case of emergency after multiple reports of a hazard from a certain location a staff can be sent to clear the hazard with minimal delay specific measures can also be taken based on collected data to improve road safety and save lives in the future i plan to present the app to the irving city council members with a proposal to officially launch it for city of irving residents | os |
|
smile | smile join the chat at https gitter im haifengl smile https badges gitter im haifengl smile svg https gitter im haifengl smile utm source badge utm medium badge utm campaign pr badge utm content badge maven central https maven badges herokuapp com maven central com github haifengl smile core badge svg https maven badges herokuapp com maven central com github haifengl smile core smile statistical machine intelligence and learning engine https haifengl github io is a fast and comprehensive machine learning nlp linear algebra graph interpolation and visualization system in java and scala with advanced data structures and algorithms smile delivers state of art performance smile is well documented and please check out the project website https haifengl github io for programming guides and more information smile covers every aspect of machine learning including classification regression clustering association rule mining feature selection manifold learning multidimensional scaling genetic algorithms missing value imputation efficient nearest neighbor search etc smile implements the following major machine learning algorithms classification support vector machines decision trees adaboost gradient boosting random forest logistic regression neural networks rbf networks maximum entropy classifier knn na ve bayesian fisher linear quadratic regularized discriminant analysis regression support vector regression gaussian process regression trees gradient boosting random forest rbf networks ols lasso elasticnet ridge regression feature selection genetic algorithm based feature selection ensemble learning based feature selection treeshap signal noise ratio sum squares ratio clustering birch clarans dbscan denclue deterministic annealing k means x means g means neural gas growing neural gas hierarchical clustering sequential information bottleneck self organizing maps spectral clustering minimum entropy clustering association rule frequent itemset mining fp growth mining 
algorithm manifold learning isomap lle laplacian eigenmap t sne umap pca kernel pca probabilistic pca gha random projection ica multi dimensional scaling classical mds isotonic mds sammon mapping nearest neighbor search bk tree cover tree kd tree simhash lsh sequence learning hidden markov model conditional random field natural language processing sentence splitter and tokenizer bigram statistical test phrase extractor keyword extractor stemmer pos tagging relevance ranking you can use the libraries through maven central repository by adding the following to your project pom xml file dependency groupid com github haifengl groupid artifactid smile core artifactid version 3 0 1 version dependency for nlp use the artifactid smile nlp for scala api please use librarydependencies com github haifengl smile scala 3 0 1 for kotlin api add the below into the dependencies section of gradle build script implementation com github haifengl smile kotlin 3 0 1 for clojure api add the following dependency to your project or build file org clojars haifengl smile 3 0 1 some algorithms rely on blas and lapack e g manifold learning some clustering algorithms gaussian process regression mlp etc to use these algorithms you should include openblas for optimized matrix computation librarydependencies seq org bytedeco javacpp 1 5 8 classifier macosx x86 64 classifier windows x86 64 classifier linux x86 64 classifier linux arm64 classifier linux ppc64le classifier android arm64 classifier ios arm64 org bytedeco openblas 0 3 21 1 5 8 classifier macosx x86 64 classifier windows x86 64 classifier linux x86 64 classifier linux arm64 classifier linux ppc64le classifier android arm64 classifier ios arm64 org bytedeco arpack ng 3 8 0 1 5 8 classifier macosx x86 64 classifier windows x86 64 classifier linux x86 64 classifier linux arm64 classifier linux ppc64le in this example we include all supported 64 bit platforms and filter out 32 bit platforms the user should include only the needed platforms 
to save spaces if you prefer other blas implementations you can use any library found on the java library path or on the class path by specifying it with the org bytedeco openblas load system property for example to use the blas library from the accelerate framework on mac os x we can pass options such as djava library path usr lib dorg bytedeco openblas load blas for a default installation of mkl that would be dorg bytedeco openblas load mkl rt or you may simply include smile mkl module in your project which includes mkl binaries with smile mkl module in the class path smile will automatically switch to mkl librarydependencies com github haifengl smile mkl 3 0 1 shell smile comes with interactive shells for java scala and kotlin download pre packaged smile from the releases page https github com haifengl smile releases in the home directory of smile type bin smile to enter the scala shell you can run any valid scala expressions in the shell in the simplest case you can use it as a calculator besides all high level smile operators are predefined in the shell by default the shell uses up to 75 memory if you need more memory to handle large data use the option j xmx or xx maxrampercentage for example bin smile j xmx30g you can also modify the configuration file conf smile ini for the memory and other jvm settings to use java s jshell type bin jshell sh which has smile s jars in the classpath similarly run bin kotlin sh to enter kotlin repl model serialization most models support the java serializable interface all classifiers do support serializable interface so that you can use them in spark protostuff http code google com p protostuff is a nice alternative that supports forward backward compatibility schema evolution and validation beyond xml protostuff supports many other formats such as json yaml protobuf etc visualization smile provides a swing based data visualization library smileplot which provides scatter plot line plot staircase plot bar plot box plot 
histogram 3d histogram dendrogram heatmap hexmap qq plot contour plot surface and wireframe to use smileplot add the following to dependencies dependency groupid com github haifengl groupid artifactid smile plot artifactid version 3 0 1 version dependency smile also support data visualization in declarative approach with smile plot vega package we can create a specification that describes visualizations as mappings from data to properties of graphical marks e g points or bars the specification is based on vega lite https vega github io vega lite the vega lite compiler automatically produces visualization components including axes legends and scales it then determines properties of these components based on a set of carefully designed rules gallery table class center width 100 tr td width 50 figure a href http haifengl github io gallery smile demo kpca png img src http haifengl github io gallery smile demo kpca small png alt kernel pca a figcaption h2 kernel pca h2 figcaption figure td td width 50 figure a href http haifengl github io gallery smile demo isomap png img src http haifengl github io gallery smile demo isomap small png alt isomap a figcaption h2 isomap h2 figcaption figure td tr tr td width 50 figure a href http haifengl github io gallery smile demo mds png img src http haifengl github io gallery smile demo mds small png alt mds a figcaption h2 multi dimensional scaling h2 figcaption figure td td width 50 figure a href http haifengl github io gallery smile demo som png img src http haifengl github io gallery smile demo som small png alt som a figcaption h2 som h2 figcaption figure td tr tr td width 50 figure a href http haifengl github io gallery smile demo ann png img src http haifengl github io gallery smile demo ann small png alt neural network a figcaption h2 neural network h2 figcaption figure td td width 50 figure a href http haifengl github io gallery smile demo svm png img src http haifengl github io gallery smile demo svm small png alt svm a 
figcaption h2 svm h2 figcaption figure td tr tr td width 50 figure a href http haifengl github io gallery smile demo agglomerative clustering png img src http haifengl github io gallery smile demo agglomerative clustering small png alt agglomerative clustering a figcaption h2 agglomerative clustering h2 figcaption figure td td width 50 figure a href http haifengl github io gallery smile demo xmeans png img src http haifengl github io gallery smile demo xmeans small png alt x means a figcaption h2 x means h2 figcaption figure td tr tr td width 50 figure a href http haifengl github io gallery smile demo dbscan png img src http haifengl github io gallery smile demo dbscan small png alt dbscan a figcaption h2 dbscan h2 figcaption figure td td width 50 figure a href http haifengl github io gallery smile demo neural gas png img src http haifengl github io gallery smile demo neural gas small png alt neural gas a figcaption h2 neural gas h2 figcaption figure td tr tr td width 50 figure a href http haifengl github io gallery smile demo wavelet png img src http haifengl github io gallery smile demo wavelet small png alt wavelet a figcaption h2 wavelet h2 figcaption figure td td width 50 figure a href http haifengl github io gallery smile demo mixture png img src http haifengl github io gallery smile demo mixture small png alt mixture a figcaption h2 exponential family mixture h2 figcaption figure td tr table | machine-learning regression clustering manifold-learning nlp visualization classification nearest-neighbor-search interpolation wavelet graph linear-algebra computer-algebra-system multidimensional-scaling deep-learning statistics dataframe data-science genetic-algorithm | ai |
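The serialization note above (most smile models implement java.io.Serializable, so they can be saved and later reused, e.g. in Spark) boils down to the standard object-stream round trip sketched below. A plain HashMap stands in for a trained model here, since the pattern is identical for any Serializable object; the helper names are illustrative, not smile API.

```java
import java.io.*;
import java.util.HashMap;

public class Main {
    // save any Serializable object (e.g. a trained model) to bytes
    public static byte[] save(Serializable model) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bos)) {
            out.writeObject(model);
        }
        return bos.toByteArray();
    }

    // restore the object from bytes
    public static Object load(byte[] bytes) throws IOException, ClassNotFoundException {
        try (ObjectInputStream in = new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            return in.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        HashMap<String, Double> model = new HashMap<>();
        model.put("learned.weight", 0.75);
        Object restored = load(save(model));
        System.out.println(model.equals(restored)); // prints true
    }
}
```

Writing the bytes to a file instead of a byte array gives simple on-disk persistence; schema-evolving formats such as protostuff, mentioned above, are the alternative when class layouts may change between versions.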
cortex | docs https docs cortexlabs com slack https community cortexlabs com br img src https cortex public s3 us west 2 amazonaws com logo png height 32 br note this project is no longer actively maintained by its original authors production infrastructure for machine learning at scale deploy manage and scale machine learning models in production br serverless workloads realtime respond to requests in real time and autoscale based on in flight request volumes async process requests asynchronously and autoscale based on request queue length batch run distributed and fault tolerant batch processing jobs on demand br automated cluster management autoscaling elastically scale clusters with cpu and gpu instances spot instances run workloads on spot instances with automated on demand backups environments create multiple clusters with different configurations br ci cd and observability integrations provisioning provision clusters with declarative configuration or a terraform provider metrics send metrics to any monitoring tool or use pre built grafana dashboards logs stream logs to any log management tool or use the pre built cloudwatch integration br built for aws eks cortex runs on top of eks to scale workloads reliably and cost effectively vpc deploy clusters into a vpc on your aws account to keep your data private iam integrate with iam for authentication and authorization workflows | machine-learning infrastructure | ai |
esp32-iot-uno | esp32 iot uno hardware features esp32 wifi bluetooth le soc 240mhz module esp wroom 32 automatic select 3 power sources dc6 28v usb and battery auto download flash mode integrated sdcard slot support 1 bit mode open hardware design with kicad cc by sa license i2c oled display header lithium ion battery charger 1 reset button 1 programable button 1 power led 1 programable led 1 charger led compatible with shields for esp32 in the future gateway gsm gprs gps and lora shield connectivity can rs485 rs232 shield audio shield images esp32 iot uno images1 assets esp32 uno 0776 jpg assets esp32 uno 0776 jpg esp32 iot uno images2 assets esp32 uno 0826 jpg assets esp32 uno 0826 jpg pinout esp32 iot pinout assets esp32 pinout png assets esp32 pinout png schematic esp32 iot uno schematic assets esp32 iot uno sch png assets esp32 iot uno sch svg pcb esp32 iot uno pcb assets esp32 iot uno pcb png assets esp32 iot uno pcb svg 3d top esp32 iot uno 3d top assets esp32 iot uno 3d top png assets esp32 iot uno pcb svg down esp32 iot uno 3d down assets esp32 iot uno 3d down png assets esp32 iot uno pcb svg gerber download assets gerber zip license cc by sa this creative commons license requires you to release all derivative works under this same license you must give credit to the original author of the work state their name and the title of the original work say that you modified the work if you did and include the attribution logo website https esp32 vn | esp32 iot arduino | server |
aepp-sdk-js | ternity https aeternity com s javascript sdk main action https github com aeternity aepp sdk js actions workflows main yml badge svg https github com aeternity aepp sdk js actions workflows main yml codecov https codecov io gh aeternity aepp sdk js branch develop graph badge svg token won6gocirp https codecov io gh aeternity aepp sdk js docs https github com aeternity aepp sdk js actions workflows docs develop yml badge svg https github com aeternity aepp sdk js actions workflows docs develop yml npm https img shields io npm v aeternity aepp sdk svg https www npmjs com package aeternity aepp sdk npm https img shields io npm l aeternity aepp sdk svg https www npmjs com package aeternity aepp sdk javascript sdk for the revolutionary ternity blockchain targeting the ternity node implementation the aepp sdk is hosted on github ternity https aeternity com ternity node https github com aeternity aeternity hosted on github https github com aeternity aepp sdk js guides examples introduction installation docs readme md quick start docs quick start md usage guides aens docs guides aens md ternity naming system contracts docs guides contracts md contract events docs guides contract events md oracles docs guides oracles md payingfortx docs guides paying for tx md meta transactions batch transactions docs guides batch requests md error handling docs guides error handling md low vs high level api docs guides low vs high usage md wallet interaction connect an pp to a wallet docs guides connect aepp to wallet md how to build a wallet docs guides build wallet md there are also examples examples readme md that you can directly use and play with cli command line interface to quickly test all of ternity s blockchain features from your terminal you can install and use the cli https github com aeternity aepp cli js by running 1 npm i g aeternity aepp cli to globally install the cli 2 aecli help to get a list of possible commands contributing for advanced use to get a 
deeper understanding of the sdk or to contribute to its development it is advised to read the contributing guidelines docs contrib readme md section changelog we keep our changelog docs changelog md up to date license isc license isc copyright 2023 ternity developers permission to use copy modify and or distribute this software for any purpose with or without fee is hereby granted provided that the above copyright notice and this permission notice appear in all copies the software is provided as is and the author disclaims all warranties with regard to this software including all implied warranties of merchantability and fitness in no event shall the author be liable for any special direct indirect or consequential damages or any damages whatsoever resulting from loss of use data or profits whether in an action of contract negligence or other tortious action arising out of or in connection with the use or performance of this software | aeternity blockchain browser nodejs sdk | blockchain |
lg.ring.nlnog.net nlnog looking glass this is the nlnog looking glass for openbgpd written by teun vink this code is used as a front end to the a href https openbgpd org openbgpd a route collector operated by nlnog the looking glass is hosted at https lg ring nlnog net the code was inspired by bird lg https github com sileht bird lg by mehdi abaakouk please note this code is not intended as general purpose looking glass code it is custom made for this specific use case questions about the status of the looking glass or its peers should be directed at ring admins nlnog net known communities where possible the looking glass tries to provide information on communities used on routes this is done using information stored in the communities communities folder this folder contains a file per asn for known communities each line should contain a community entry followed by a comma followed by the description of the community any line not matching this format is ignored the following types of entries are accepted for communities exact matches for example 65535 666 only matching this exact community ranges for example 65535 0 100 matching anything from 65535 0 up to 65535 100 single digit wildcards for example 65535 x0 matching for 65535 00 65535 10 65535 20 etc any number for example 65535 nnn which matches any community starting with 65535 followed by any number large communities are supported as well they can be formatted similar to regular communities only contain three parts separated by colons for example 65535 0 12345 65535 nnn 0 65535 123 100x extended communities can be specified by the label number or label number number notation using the same wildcard options for example soo 65535 0 soo 65535 nnn when using wildcards wildcard values can be replaced in the description by referencing them by number for example 65535 0 nnn do not announce to as 0 65535 x nnn prepend 0 times to as 1 additions and updates to the list of communities are welcome if possible please
provide a source for the data as a comment on the first line of the file and name the file asnnn txt where nnn is the asn license copyright c 2022 stichting nlnog stichting nlnog net permission to use copy modify and distribute this software for any purpose with or without fee is hereby granted provided that the above copyright notice and this permission notice appear in all copies the software is provided as is and the author disclaims all warranties with regard to this software including all implied warranties of merchantability and fitness in no event shall the author be liable for any special direct indirect or consequential damages or any damages whatsoever resulting from loss of use data or profits whether in an action of contract negligence or other tortious action arising out of or in connection with the use or performance of this software | front_end |
mgl | a id x 28mgl 3a 40mgl manual 20mgl pax 3asection 29 a mgl manual table of contents 1 mgl asdf system dbc7 2 introduction f7aa 2 1 overview 9192 2 2 links 00ee 2 3 dependencies e7ea 2 4 code organization 443c 2 5 glossary 4a8e 3 datasets 109e 3 1 samplers 7bc3 3 1 1 function sampler be8d 4 resampling a39b 4 1 partitions f790 4 2 cross validation f17b 4 3 bagging b647 4 4 cv bagging 3f9f 4 5 miscellaneous operations 59c2 5 core f257 5 1 persistence 29a1 5 2 batch processing ff82 5 3 executors 4476 5 3 1 parameterized executor cache ada2 6 monitoring e668 6 1 monitors c701 6 2 measurers cd3b 6 3 counters be95 6 3 1 attributes 6da5 6 3 2 counter classes 7ee3 7 classification 60e3 7 1 classification monitors c573 7 2 classification measurers 0ba7 7 3 classification counters 6598 7 3 1 confusion matrices 07c7 8 features c8db 8 1 feature selection 1b5e 8 2 feature encoding 24aa 9 gradient based optimization c74a 9 1 iterative optimizer 779d 9 2 cost function e746 9 3 gradient descent 10e7 9 3 1 batch based optimizers 2c39 9 3 2 segmented gd optimizer 989a 9 3 3 per weight optimization a884 9 3 4 utilities c40e 9 4 conjugate gradient 83e6 9 5 extension api 6a6f 9 5 1 implementing optimizers 5748 9 5 2 implementing gradient sources c58b 9 5 3 implementing gradient sinks a210 10 differentiable functions 2981 11 backpropagation neural networks 8788 11 1 backprop overview 56b2 11 2 clump api 7a28 11 3 bpn s d1e0 11 3 1 training 0d82 11 3 2 monitoring 4f0e 11 3 3 feed forward nets 1355 11 3 4 recurrent neural nets 871e 11 4 lumps 9641 11 4 1 lump base class 3045 11 4 2 inputs 207b 11 4 3 weight lump 6872 11 4 4 activations 9105 11 4 5 activation functions 5d86 11 4 6 losses 93a7 11 4 7 stochasticity aa2e 11 4 8 arithmetic 2fe9 11 4 9 operations for rnn s 51f7 11 5 utilities 91f3 12 boltzmann machines 332e 13 gaussian processes 60b3 14 natural language processing 0d6a 14 1 bag of words 0784 in package mgl a id x 28 22mgl 22 20asdf 2fsystem 3asystem 29 a 1 mgl asdf system 
- Version: 0.1.0
- Description: MGL is a machine learning library for backpropagation neural networks, boltzmann machines, gaussian processes and more.
- Licence: MIT, see COPYING.
- Author: Gábor Melis [mega@retes.hu](mailto:mega@retes.hu)
- Homepage: [http://melisgl.github.io/mgl](http://melisgl.github.io/mgl)
- Bug tracker: [https://github.com/melisgl/mgl/issues](https://github.com/melisgl/mgl/issues)
- Source control: [GIT](https://github.com/melisgl/mgl.git)

## 2 Introduction

### 2.1 Overview

MGL is a Common Lisp machine learning library by [Gábor Melis](http://quotenil.com) with some parts originally contributed by Ravenpack International. It mainly concentrates on various forms of neural networks (boltzmann machines, feed-forward and recurrent backprop nets). Most of MGL is built on top of MGL-MAT, so it has BLAS and CUDA support.

In general, the focus is on power and performance, not on ease of use. Perhaps one day there will be a cookie cutter interface with restricted functionality if a reasonable compromise is found between power and utility.

### 2.2 Links

Here is the [official repository](https://github.com/melisgl/mgl) and the [HTML documentation](http://melisgl.github.io/mgl-pax-world/mgl-manual.html) for the latest version.

### 2.3 Dependencies

MGL used to rely on [LLA](https://github.com/tpapp/lla) to interface to BLAS and LAPACK. That's mostly history by now, but configuration of foreign libraries is still done via LLA. See the README in LLA on how to set things up. Note that these days OpenBLAS is easier to set up and just as fast as ATLAS.

[CL-CUDA](https://github.com/takagi/cl-cuda) and [MGL-MAT](https://github.com/melisgl/mgl) are the two main dependencies, and also the ones not yet in quicklisp, so just drop them into `quicklisp/local-projects/`. If there is no suitable GPU on the system, or the CUDA SDK is not installed, MGL will simply fall back on using BLAS and Lisp code. Wrapping code in MGL-MAT:WITH-CUDA* is basically all that's needed to run on the GPU, and with MGL-MAT:CUDA-AVAILABLE-P one can check whether the GPU is really being used.

### 2.4 Code Organization

MGL consists of several packages dedicated to different tasks. For example, package MGL-RESAMPLE is about Resampling, and MGL-GD is about Gradient Descent, and so on. On one hand, having many packages makes it easier to cleanly separate API and implementation, and also to explore a specific task. At other times they can be a hassle, so the MGL package itself reexports every external symbol found in all the other packages that make up MGL, as well as MGL-MAT (see the MGL-MAT manual), on which it heavily relies.

One exception to this rule is the bundled, but independent, MGL-GNUPLOT library.

The built-in tests can be run with:

    (ASDF:OOS 'ASDF:TEST-OP '#:MGL)

Note that most of the tests are rather stochastic and can fail once in a while.

### 2.5 Glossary

Ultimately, machine learning is about creating models of some domain. The observations in the modelled domain are called instances (also known as examples or samples). Sets of instances are called datasets. Datasets are used when fitting a model or when making predictions. Sometimes the word predictions is too specific, and the results obtained from applying a model to some instances are simply called results.

## 3 Datasets

\[in package MGL-DATASET\]

An instance can often be any kind of object of the user's choice. It is typically represented by a set of numbers, which is called a feature vector, or by a structure holding the feature vector, the label, etc. A dataset is a SEQUENCE of such instances, or a sampler object (see Samplers) that produces instances.
- [function] **MAP-DATASET** *FN DATASET*

    Call FN with each instance in DATASET. This is basically equivalent to iterating over the elements of a sequence or a sampler (see Samplers).

- [function] **MAP-DATASETS** *FN DATASETS &KEY (IMPUTE NIL IMPUTEP)*

    Call FN with a list of instances, one from each dataset in DATASETS. Return nothing. If IMPUTE is specified, then iterate until the largest dataset is consumed, imputing IMPUTE for missing values. If IMPUTE is not specified, then iterate until the smallest dataset runs out.

    ```common-lisp
    (map-datasets #'prin1 '((0 1 2) (:a :b)))
    .. (0 :A)(1 :B)

    (map-datasets #'prin1 '((0 1 2) (:a :b)) :impute nil)
    .. (0 :A)(1 :B)(2 NIL)
    ```

    It is of course allowed to mix sequences with samplers:

    ```common-lisp
    (map-datasets #'prin1
                  (list '(0 1 2)
                        (make-sequence-sampler '(:a :b)
                                               :max-n-samples 2)))
    .. (0 :A)(1 :B)
    ```

### 3.1 Samplers

Some algorithms do not need random access to the entire dataset and can work with a stream of observations. Samplers are simple generators providing two functions: SAMPLE and FINISHEDP.

- [generic-function] **SAMPLE** *SAMPLER*

    If SAMPLER has not run out of data (see FINISHEDP), SAMPLE returns an object that represents a sample from the world to be experienced, or, in other words, simply something that can be used as input for training or prediction. It is not allowed to call SAMPLE if SAMPLER is FINISHEDP.

- [generic-function] **FINISHEDP** *SAMPLER*

    See if SAMPLER has run out of examples.

- [function] **LIST-SAMPLES** *SAMPLER MAX-SIZE*

    Return a list of samples of length at most MAX-SIZE, or less if SAMPLER runs out.

- [function] **MAKE-SEQUENCE-SAMPLER** *SEQ &KEY MAX-N-SAMPLES*

    Create a sampler that returns elements of SEQ in their original order. If MAX-N-SAMPLES is non-NIL, then at most MAX-N-SAMPLES are sampled.
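Tying the above together, the SAMPLE / FINISHEDP protocol can also be driven by hand. A minimal sketch using only the functions documented in this section (the three-element dataset is made up for illustration):

```common-lisp
(let ((sampler (make-sequence-sampler '(:a :b :c) :max-n-samples 3)))
  ;; Drain the sampler manually. LIST-SAMPLES does the same thing:
  ;; (list-samples sampler 3).
  (loop until (finishedp sampler)
        collect (sample sampler)))
;; => (:A :B :C), in the original order of the sequence
```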
- [function] **MAKE-RANDOM-SAMPLER** *SEQ &KEY MAX-N-SAMPLES (REORDER #'MGL-RESAMPLE:SHUFFLE)*

    Create a sampler that returns elements of SEQ in random order. If MAX-N-SAMPLES is non-NIL, then at most MAX-N-SAMPLES are sampled. The first pass is over a shuffled copy of SEQ, and this copy is reshuffled whenever the sampler reaches the end of it. Shuffling is performed by calling the REORDER function.

- [variable] **\*INFINITELY-EMPTY-DATASET\*** *#\<FUNCTION-SAMPLER "infinitely empty"\>*

    This is the default dataset for MGL-OPT:MINIMIZE. It's an infinite stream of NILs.

#### 3.1.1 Function Sampler

- [class] **FUNCTION-SAMPLER**

    A sampler with a function in its GENERATOR that produces a stream of samples, which may or may not be finite depending on MAX-N-SAMPLES. FINISHEDP returns T iff MAX-N-SAMPLES is non-NIL and it's not greater than the number of samples generated (N-SAMPLES).

    ```common-lisp
    (list-samples (make-instance 'function-sampler
                                 :generator (lambda () (random 10))
                                 :max-n-samples 5)
                  10)
    => (3 5 2 3 3)
    ```

- [reader] **GENERATOR** *FUNCTION-SAMPLER (:GENERATOR)*

    A generator function of no arguments that returns the next sample.

- [accessor] **MAX-N-SAMPLES** *FUNCTION-SAMPLER (:MAX-N-SAMPLES = NIL)*

- [reader] **NAME** *FUNCTION-SAMPLER (:NAME = NIL)*

    An arbitrary object naming the sampler. Only used for printing the sampler object.

- [reader] **N-SAMPLES** *FUNCTION-SAMPLER (:N-SAMPLES = 0)*

## 4 Resampling
\[in package MGL-RESAMPLE\]

The focus of this package is on resampling methods such as cross-validation and bagging, which can be used for model evaluation, model selection, and also as a simple form of ensembling. Data partitioning and sampling functions are also provided because they tend to be used together with resampling.

### 4.1 Partitions

The following functions partition a dataset (currently only SEQUENCEs are supported) into a number of partitions. For each element in the original dataset there is exactly one partition that contains it.

- [function] **FRACTURE** *FRACTIONS SEQ &KEY WEIGHT*

    Partition SEQ into a number of subsequences. FRACTIONS is either a positive integer or a list of non-negative real numbers. WEIGHT is NIL or a function that returns a non-negative real number when called with an element from SEQ. If FRACTIONS is a positive integer, then return a list of that many subsequences with equal sum of weights, bar rounding errors; else partition SEQ into subsequences where the sum of weights of subsequence I is proportional to element I of FRACTIONS. If WEIGHT is NIL, then every element is assumed to have the same weight.

    To split into 5 sequences:

    ```common-lisp
    (fracture 5 '(0 1 2 3 4 5 6 7 8 9))
    => ((0 1) (2 3) (4 5) (6 7) (8 9))
    ```

    To split into two sequences whose lengths are proportional to 2 and 3:

    ```common-lisp
    (fracture '(2 3) '(0 1 2 3 4 5 6 7 8 9))
    => ((0 1 2 3) (4 5 6 7 8 9))
    ```

- [function] **STRATIFY** *SEQ &KEY (KEY #'IDENTITY) (TEST #'EQL)*

    Return the list of strata of SEQ. SEQ is a sequence of elements for which the function KEY returns the class they belong to. Such classes are opaque objects compared for equality with TEST. A stratum is a sequence of elements with the same key (under TEST).

    ```common-lisp
    (stratify '(0 1 2 3 4 5 6 7 8 9) :key #'evenp)
    => ((0 2 4 6 8) (1 3 5 7 9))
    ```
- [function] **FRACTURE-STRATIFIED** *FRACTIONS SEQ &KEY (KEY #'IDENTITY) (TEST #'EQL) WEIGHT*

    Similar to FRACTURE, but also makes sure that keys are evenly distributed among the partitions (see STRATIFY). It can be useful for classification tasks to partition the data set while keeping the distribution of classes the same. Note that the sets returned are not in random order; in fact, they are sorted internally by KEY.

    For example, to make two splits with approximately the same number of even and odd numbers:

    ```common-lisp
    (fracture-stratified 2 '(0 1 2 3 4 5 6 7 8 9) :key #'evenp)
    => ((0 2 1 3) (4 6 8 5 7 9))
    ```

### 4.2 Cross-validation

- [function] **CROSS-VALIDATE** *DATA FN &KEY (N-FOLDS 5) (FOLDS (ALEXANDRIA:IOTA N-FOLDS)) (SPLIT-FN #'SPLIT-FOLD/MOD) PASS-FOLD*

    Map FN over the FOLDS of DATA split with SPLIT-FN and collect the results in a list. The simplest demonstration is:

    ```common-lisp
    (cross-validate '(0 1 2 3 4)
                    (lambda (test training)
                      (list test training))
                    :n-folds 5)
    => (((0) (1 2 3 4))
        ((1) (0 2 3 4))
        ((2) (0 1 3 4))
        ((3) (0 1 2 4))
        ((4) (0 1 2 3)))
    ```

    Of course, in practice one would typically train a model and return the trained model and/or its score on TEST. Also, sometimes one may want to do only some of the folds and remember which ones they were:

    ```common-lisp
    (cross-validate '(0 1 2 3 4)
                    (lambda (fold test training)
                      (list :fold fold test training))
                    :folds '(2 3)
                    :pass-fold t)
    => ((:fold 2 (2) (0 1 3 4))
        (:fold 3 (3) (0 1 2 4)))
    ```

    Finally, the way the data is split can be customized. By default, SPLIT-FOLD/MOD is called with the arguments DATA, the fold (from among FOLDS), and N-FOLDS. SPLIT-FOLD/MOD returns two values, which are then passed on to FN. One can use SPLIT-FOLD/CONT or SPLIT-STRATIFIED, or any other function that works with these arguments. The only real constraint is that FN has to take as many arguments (plus the fold argument if PASS-FOLD) as SPLIT-FN returns.
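As a concrete illustration of plugging in a different SPLIT-FN, here is a sketch of stratified 2-fold cross-validation using a closure over SPLIT-STRATIFIED (documented under Cross-validation below; the dataset and the choice of EVENP as the stratification key are made up):

```common-lisp
;; Each fold's test set keeps the even/odd ratio of the full
;; dataset roughly intact, because the split is stratified.
(cross-validate '(0 1 2 3 4 5 6 7 8 9)
                (lambda (test training)
                  (list test training))
                :n-folds 2
                :split-fn (lambda (seq fold n-folds)
                            (split-stratified seq fold n-folds
                                              :key #'evenp)))
```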
- [function] **SPLIT-FOLD/MOD** *SEQ FOLD N-FOLDS*

    Partition SEQ into two sequences: one with elements of SEQ with indices whose remainder is FOLD when divided with N-FOLDS, and a second one with the rest. The second one is the larger set. The order of elements remains stable. This function is suitable as the SPLIT-FN argument of CROSS-VALIDATE.

- [function] **SPLIT-FOLD/CONT** *SEQ FOLD N-FOLDS*

    Imagine dividing SEQ into N-FOLDS subsequences of the same size (bar rounding). Return the subsequence of index FOLD as the first value, and all the other subsequences concatenated into one as the second value. The order of elements remains stable. This function is suitable as the SPLIT-FN argument of CROSS-VALIDATE.

- [function] **SPLIT-STRATIFIED** *SEQ FOLD N-FOLDS &KEY (KEY #'IDENTITY) (TEST #'EQL) WEIGHT*

    Split SEQ into N-FOLDS partitions (as in FRACTURE-STRATIFIED). Return the partition of index FOLD as the first value, and the concatenation of the rest as the second value. This function is suitable as the SPLIT-FN argument of CROSS-VALIDATE (most likely as a closure with KEY, TEST, WEIGHT bound).

### 4.3 Bagging

- [function] **BAG** *SEQ FN &KEY (RATIO 1) N WEIGHT (REPLACEMENT T) KEY (TEST #'EQL) (RANDOM-STATE \*RANDOM-STATE\*)*

    Sample from SEQ with SAMPLE-FROM (passing RATIO, WEIGHT, REPLACEMENT), or SAMPLE-STRATIFIED if KEY is not NIL. Call FN with the sample. If N is NIL, then keep repeating this until FN performs a non-local exit. Else N must be a non-negative integer: N iterations will be performed, the primary values returned by FN collected into a list, and that list returned. See SAMPLE-FROM and SAMPLE-STRATIFIED for examples.
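To make the iteration logic concrete, here is a minimal sketch (the data and the choice of IDENTITY as FN are made up for illustration) that collects three bootstrap samples, relying on the default of sampling with replacement:

```common-lisp
;; FN just returns its sample unchanged, so BAG collects the three
;; bootstrap samples themselves into a list of three sequences.
(bag '(0 1 2 3 4 5) #'identity :n 3)
```

In practice, FN would train a model on the sample and return the model or its score, giving a list of ensemble members.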
- [function] **SAMPLE-FROM** *RATIO SEQ &KEY WEIGHT REPLACEMENT (RANDOM-STATE \*RANDOM-STATE\*)*

    Return a sequence constructed by sampling with or without REPLACEMENT from SEQ. The sum of weights in the result sequence will approximately be the sum of weights of SEQ times RATIO. If WEIGHT is NIL, then elements are assumed to have equal weights; else WEIGHT should return a non-negative real number when called with an element of SEQ.

    To randomly select half of the elements:

    ```common-lisp
    (sample-from 1/2 '(0 1 2 3 4 5))
    => (5 3 2)
    ```

    To randomly select some elements such that the sum of their weights constitutes about half of the sum of weights across the whole sequence:

    ```common-lisp
    (sample-from 1/2 '(0 1 2 3 4 5 6 7 8 9) :weight #'identity)
    => (9 4 1 6 8) ; sums to 28, that's near 45/2
    ```

    To sample with replacement (that is, allowing an element to be sampled multiple times):

    ```common-lisp
    (sample-from 1 '(0 1 2 3 4 5) :replacement t)
    => (1 1 5 1 4 4)
    ```

- [function] **SAMPLE-STRATIFIED** *RATIO SEQ &KEY WEIGHT REPLACEMENT (KEY #'IDENTITY) (TEST #'EQL) (RANDOM-STATE \*RANDOM-STATE\*)*

    Like SAMPLE-FROM, but makes sure that the weighted proportion of classes in the result is approximately the same as the proportion in SEQ. See STRATIFY for the description of KEY and TEST.

### 4.4 CV Bagging

- [function] **BAG-CV** *DATA FN &KEY N (N-FOLDS 5) (FOLDS (ALEXANDRIA:IOTA N-FOLDS)) (SPLIT-FN #'SPLIT-FOLD/MOD) PASS-FOLD (RANDOM-STATE \*RANDOM-STATE\*)*

    Perform cross-validation on different shuffles of DATA, N times, and collect the results. Since CROSS-VALIDATE collects the return values of FN, the return value of this function is a list of lists of FN results. If N is NIL, don't collect anything, just keep doing repeated CVs until FN performs a non-local exit.

    The following example simply collects the test and training sets for 2-fold CV repeated 3 times with shuffled data:

    ```common-lisp
    ;;; This is non-deterministic.
    (bag-cv '(0 1 2 3 4) #'list :n 3 :n-folds 2)
    => ((((2 3 4) (1 0)) ((1 0) (2 3 4)))
        (((2 1 0) (4 3)) ((4 3) (2 1 0)))
        (((1 0 3) (2 4)) ((2 4) (1 0 3))))
    ```

    CV bagging is useful when a single CV is not producing stable results. As an ensemble method, CV bagging has the advantage over bagging that each
example will occur the same number of times, and after the first CV is complete there is a complete but less reliable estimate for each example, which gets refined by further CVs.

### 4.5 Miscellaneous Operations

- [function] **SPREAD-STRATA** *SEQ &KEY (KEY #'IDENTITY) (TEST #'EQL)*

    Return a sequence that's a reordering of SEQ such that elements belonging to different strata (under KEY and TEST, see STRATIFY) are distributed evenly. The order of elements belonging to the same stratum is unchanged.

    For example, to make sure that even and odd numbers are distributed evenly:

    ```common-lisp
    (spread-strata '(0 2 4 6 8 1 3 5 7 9) :key #'evenp)
    => (0 1 2 3 4 5 6 7 8 9)
    ```

    Same thing with unbalanced classes:

    ```common-lisp
    (spread-strata (vector 0 2 3 5 6 1 4)
                   :key (lambda (x)
                          (if (member x '(1 4)) t nil)))
    => #(0 1 2 3 4 5 6)
    ```

- [function] **ZIP-EVENLY** *SEQS &KEY RESULT-TYPE*

    Make a single sequence out of the sequences in SEQS so that, in the returned sequence, indices of elements belonging to the same source sequence are spread evenly across the whole range. The result is a list if RESULT-TYPE is LIST; it's a vector if RESULT-TYPE is VECTOR. If RESULT-TYPE is NIL, then it's determined by the type of the first sequence in SEQS.

    ```common-lisp
    (zip-evenly '((0 2 4) (1 3)))
    => (0 1 2 3 4)
    ```

## 5 Core

\[in package MGL-CORE\]

### 5.1 Persistence

- [function] **LOAD-STATE** *FILENAME OBJECT*

    Load weights of OBJECT from FILENAME. Return OBJECT.

- [function] **SAVE-STATE** *FILENAME OBJECT &KEY (IF-EXISTS :ERROR) (ENSURE T)*

    Save weights of OBJECT to FILENAME. If ENSURE, then ENSURE-DIRECTORIES-EXIST is called on FILENAME. IF-EXISTS is passed on to OPEN. Return OBJECT.
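A sketch of how the two functions above pair up in a checkpoint/restore cycle. `*MODEL*` stands for any object that implements the persistence protocol (for example a trained BPN); both the variable and the path are made up for illustration:

```common-lisp
;; Write the learnt weights; :IF-EXISTS is passed on to OPEN, so
;; :SUPERSEDE overwrites an existing checkpoint.
(save-state "/tmp/model.weights" *model* :if-exists :supersede)

;; ... later, possibly in a fresh image, after recreating a model
;; with the same architecture:
(load-state "/tmp/model.weights" *model*)
;; LOAD-STATE fills in the weights and returns *MODEL*.
```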
- [function] **READ-STATE** *OBJECT STREAM*

    Read the weights of OBJECT from the bivalent STREAM, where weights mean the learnt parameters. There is currently no sanity checking of data, which will most certainly change in the future, together with the serialization format. Return OBJECT.

- [function] **WRITE-STATE** *OBJECT STREAM*

    Write weights of OBJECT to the bivalent STREAM. Return OBJECT.

- [generic-function] **READ-STATE\*** *OBJECT STREAM CONTEXT*

    This is the extension point for READ-STATE. It is guaranteed that primary READ-STATE\* methods will be called only once for each OBJECT (under EQ). CONTEXT is an opaque object and must be passed on to any recursive READ-STATE\* calls.

- [generic-function] **WRITE-STATE\*** *OBJECT STREAM CONTEXT*

    This is the extension point for WRITE-STATE. It is guaranteed that primary WRITE-STATE\* methods will be called only once for each OBJECT (under EQ). CONTEXT is an opaque object and must be passed on to any recursive WRITE-STATE\* calls.

### 5.2 Batch Processing

Processing instances one by one during training or prediction can be slow. The models that support batch processing for greater efficiency are said to be striped.

Typically, during or after creating a model, one sets MAX-N-STRIPES on it, a positive integer. When a batch of instances is to be fed to the model, it is first broken into subbatches of length that's at most MAX-N-STRIPES. For each subbatch, SET-INPUT is called and a :BEFORE method takes care of setting N-STRIPES to the actual number of instances in the subbatch. When MAX-N-STRIPES is set, internal data structures may be resized, which is an expensive operation. Setting N-STRIPES is a comparatively cheap operation, often implemented as matrix reshaping.

Note that for models made of different parts (for example,
MGL-BP:BPN consists of MGL-BP:LUMPs) setting these values affects the constituent parts, but one should never change the number of stripes of the parts directly, because that would lead to an internal inconsistency in the model.

- [generic-function] **MAX-N-STRIPES** *OBJECT*

    The number of stripes with which OBJECT is capable of dealing simultaneously.

- [generic-function] **SET-MAX-N-STRIPES** *MAX-N-STRIPES OBJECT*

    Allocate the necessary stuff to allow for MAX-N-STRIPES number of stripes to be worked with simultaneously in OBJECT. This is called when MAX-N-STRIPES is SETFed.

- [generic-function] **N-STRIPES** *OBJECT*

    The number of stripes currently present in OBJECT. This is at most MAX-N-STRIPES.

- [generic-function] **SET-N-STRIPES** *N-STRIPES OBJECT*

    Set the number of stripes (out of MAX-N-STRIPES) that are in use in OBJECT. This is called when N-STRIPES is SETFed.

- [macro] **WITH-STRIPES** *SPECS &BODY BODY*

    Bind start and optionally end indices belonging to stripes in striped objects.

        (WITH-STRIPES ((STRIPE1 OBJECT1 START1 END1)
                       (STRIPE2 OBJECT2 START2)
                       ...)
          ...)

    This is how one's supposed to find the index range corresponding to the Nth input in an input lump of a bpn:

        (with-stripes ((n input-lump start end))
          (loop for i upfrom start below end
                do (setf (mref (nodes input-lump) i) 0d0)))

    Note how the input lump is striped, but the matrix into which we are indexing (NODES) is not known to WITH-STRIPES. In fact, for lumps the same stripe indices work with NODES and MGL-BP:DERIVATIVES.

- [generic-function] **STRIPE-START** *STRIPE OBJECT*

    Return the start index of STRIPE in some array or matrix of OBJECT.
- [generic-function] **STRIPE-END** *STRIPE OBJECT*

    Return the end index (exclusive) of STRIPE in some array or matrix of OBJECT.

- [generic-function] **SET-INPUT** *INSTANCES MODEL*

    Set INSTANCES as inputs in MODEL. INSTANCES is always a SEQUENCE of instances, even for models not capable of batch operation. It sets N-STRIPES to (LENGTH INSTANCES) in a :BEFORE method.

- [function] **MAP-BATCHES-FOR-MODEL** *FN DATASET MODEL*

    Call FN with batches of instances from DATASET suitable for MODEL. The number of instances in a batch is MAX-N-STRIPES of MODEL, or less if there are no more instances left.

- [macro] **DO-BATCHES-FOR-MODEL** *(BATCH (DATASET MODEL)) &BODY BODY*

    Convenience macro over MAP-BATCHES-FOR-MODEL.

### 5.3 Executors

- [generic-function] **MAP-OVER-EXECUTORS** *FN INSTANCES PROTOTYPE-EXECUTOR*

    Divide INSTANCES between executors that perform the same function as PROTOTYPE-EXECUTOR, and call FN with the instances and the executor for which the instances are.

    Some objects conflate function and call: the forward pass of an MGL-BP:BPN computes output from inputs, so it is like a function, but it also doubles as a function call in the sense that the bpn (the function object) changes state during the computation of the output. Hence, not even the forward pass of a bpn is thread safe. There is also the restriction that all inputs must be of the same size.

    For example, if we have a function that builds bpn A for an input of a certain size, then we can create a factory that creates bpns for a particular call. The factory probably wants to keep the weights the same, though. In Parameterized Executor Cache, MAKE-EXECUTOR-WITH-PARAMETERS is this factory.

    Parallelization of execution is another possibility MAP-OVER-EXECUTORS allows, but there is no
prebuilt solution for it yet. The default implementation simply calls FN with INSTANCES and PROTOTYPE-EXECUTOR.

- [macro] **DO-EXECUTORS** *(INSTANCES OBJECT) &BODY BODY*

    Convenience macro on top of MAP-OVER-EXECUTORS.

#### 5.3.1 Parameterized Executor Cache

- [class] **PARAMETERIZED-EXECUTOR-CACHE-MIXIN**

    Mix this into a model, implement INSTANCE-TO-EXECUTOR-PARAMETERS and MAKE-EXECUTOR-WITH-PARAMETERS, and DO-EXECUTORS will be able to build executors suitable for different instances. The canonical example is using a bpn to compute the means and covariances of a gaussian process: since each instance is made of a variable number of observations, the size of the input is not constant; thus we have a bpn (an executor) for each input dimension (the parameters).

- [generic-function] **MAKE-EXECUTOR-WITH-PARAMETERS** *PARAMETERS CACHE*

    Create a new executor for PARAMETERS. CACHE is a PARAMETERIZED-EXECUTOR-CACHE-MIXIN. In the bpn gaussian process example, PARAMETERS would be a list of input dimensions.

- [generic-function] **INSTANCE-TO-EXECUTOR-PARAMETERS** *INSTANCE CACHE*

    Return the parameters for an executor able to handle INSTANCE. Called by MAP-OVER-EXECUTORS on CACHE (that's a PARAMETERIZED-EXECUTOR-CACHE-MIXIN). The returned parameters are keys in an EQUAL parameters->executor hash table.

## 6 Monitoring

\[in package MGL-CORE\]

When training or applying a model, one often wants to track various statistics. For example, in the case of training a neural network with cross-entropy loss, these statistics could be the average cross-entropy loss itself, classification accuracy, or even the entire
confusion matrix, and sparsity levels in hidden layers. Also, there is the question of what to do with the measured values (log and forget, add to some counter or a list).

So there may be several phases of operation we want to keep an eye on. Let's call these events. There can also be many fairly independent things to do in response to an event. Let's call these monitors. Some monitors are a composition of two operations: one that extracts some measurements, and another that aggregates those measurements. Let's call these two measurers and counters, respectively.

For example, consider training a backpropagation neural network. We want to look at the state of the network just after the backward pass. MGL-BP:BP-LEARNER has a MONITORS event hook corresponding to the moment after backpropagating the gradients. Suppose we are interested in how the training cost evolves:

    (push (make-instance 'monitor
                         :measurer (lambda (instances bpn)
                                     (declare (ignore instances))
                                     (mgl-bp:cost bpn))
                         :counter (make-instance 'basic-counter))
          (monitors learner))

During training, this monitor will track the cost of training examples behind the scenes. If we want to print and reset this monitor periodically, we can put another monitor on MGL-OPT:ITERATIVE-OPTIMIZER's MGL-OPT:ON-N-INSTANCES-CHANGED accessor:

    (push (lambda (optimizer gradient-source n-instances)
            (declare (ignore optimizer))
            (when (zerop (mod n-instances 1000))
              (format t "n-instances: ~S~%" n-instances)
              (dolist (monitor (monitors gradient-source))
                (when (counter monitor)
                  (format t "~A~%" (counter monitor))
                  (reset-counter (counter monitor))))))
          (mgl-opt:on-n-instances-changed optimizer))

Note that the monitor we push can be anything, as long as APPLY-MONITOR is implemented on it with the appropriate signature. Also note that the ZEROP + MOD logic is fragile, so you will likely want to use MGL-OPT:MONITOR-OPTIMIZATION-PERIODICALLY instead of doing the above.

So that's the general idea. Concrete events are documented where they are signalled. Often there are task specific utilities
that create a reasonable set of default monitors (see Classification Monitors).

- [function] **APPLY-MONITORS** *MONITORS &REST ARGUMENTS*

    Call APPLY-MONITOR on each monitor in MONITORS and ARGUMENTS. This is how an event is fired.

- [generic-function] **APPLY-MONITOR** *MONITOR &REST ARGUMENTS*

    Apply MONITOR to ARGUMENTS. This sounds fairly generic, because it is: MONITOR can be anything, even a simple function or symbol, in which case this is just CL:APPLY. See Monitors for more.

- [generic-function] **COUNTER** *MONITOR*

    Return an object representing the state of MONITOR, or NIL if it doesn't have any (say, because it's a simple logging function). Most monitors have counters into which they accumulate results until they are printed and reset. See Counters for more.

- [function] **MONITOR-MODEL-RESULTS** *FN DATASET MODEL MONITORS*

    Call FN with batches of instances from DATASET until it runs out (as in DO-BATCHES-FOR-MODEL). FN is supposed to apply MODEL to the batch and return some kind of result (for neural networks, the result is the model state itself). Apply MONITORS to each batch and the result returned by FN for that batch. Finally, return the list of counters of MONITORS.

    The purpose of this function is to collect various results and statistics (such as error measures) efficiently by applying the model only once, leaving extraction of quantities of interest from the model's results to MONITORS. See the model specific versions of this function, such as MGL-BP:MONITOR-BPN-RESULTS.

- [generic-function] **MONITORS** *OBJECT*

    Return monitors associated with OBJECT. See various methods, such as MONITORS, for more documentation.
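Firing an event by hand can make the measurer/counter split concrete. A sketch using only names documented in this section (the batch-length "measurement" and the `'(a b c)` batch are made up; a real measurer would compute, say, a cost from the model):

```common-lisp
(let ((monitor (make-instance
                'monitor
                :measurer (lambda (instances model)
                            (declare (ignore model))
                            ;; Pretend measurement: the batch size.
                            (length instances))
                :counter (make-instance 'basic-counter))))
  ;; Fire the event with a batch of three instances and no model.
  ;; The measurer's return value is aggregated into the counter.
  (apply-monitors (list monitor) '(a b c) nil)
  (counter-values (counter monitor)))
```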
class monitor a monitor that has another monitor called measurer eb05 embedded in it when this monitor is applied it applies the measurer and passes the returned values to add to counter 62de called on its counter a077 slot one may further specialize apply monitor bbdf to change that this class is useful when the same event monitor is applied repeatedly over a period and its results must be aggregated such as when training statistics are being tracked or when predictions are being made note that the monitor must be compatible with the event it handles that is the embedded measurer must be prepared to take the arguments that are documented to come with the event a id x 28mgl core 3ameasurer 20 28mgl pax 3areader 20mgl core 3amonitor 29 29 a reader measurer monitor measurer this must be a monitor itself which only means that apply monitor bbdf is defined on it but see monitoring e668 the returned values are aggregated by counter 5752 see measurers cd3b for a library of measurers a id x 28mgl core 3acounter 20 28mgl pax 3areader 20mgl core 3amonitor 29 29 a reader counter monitor counter the counter of a monitor carries out the aggregation of results returned by measurer eb05 see counters be95 for a library of counters a id x 28mgl core 3a 40mgl measurer 20mgl pax 3asection 29 a 6 2 measurers measurer eb05 is a part of monitor 7068 objects an embedded monitor that computes a specific quantity e g classification accuracy from the arguments of the event it is applied to e g the model results measurers are often implemented by combining some kind of model specific extractor with a generic measurer function all generic measurer functions return their results as multiple values matching the arguments of add to counter 62de for a counter of a certain type see counters be95 so as to make them easily used in a monitor (multiple-value-call #'add-to-counter some-counter (call-to-some-measurer)) which counter class is compatible with the measurer this way is noted for each function for a
list of measurer functions see classification measurers 0ba7 a id x 28mgl core 3a 40mgl counter 20mgl pax 3asection 29 a 6 3 counters a id x 28mgl core 3aadd to counter 20generic function 29 a generic function add to counter counter rest args add args to counter in some way see specialized methods for type specific documentation the kind of arguments to be supported is what the measurer functions see measurers cd3b intended to be paired with the counter return as multiple values a id x 28mgl core 3acounter values 20generic function 29 a generic function counter values counter return any number of values representing the state of counter see specialized methods for type specific documentation a id x 28mgl core 3acounter raw values 20generic function 29 a generic function counter raw values counter return any number of values representing the state of counter in such a way that passing the returned values as arguments to add to counter 62de on a fresh instance of the same type recreates the original state a id x 28mgl core 3areset counter 20generic function 29 a generic function reset counter counter restore state of counter to what it was just after creation a id x 28mgl core 3a 40mgl attributes 20mgl pax 3asection 29 a 6 3 1 attributes a id x 28mgl core 3aattributed 20class 29 a class attributed this is a utility class that all counters subclass the attributes cc37 plist can hold basically anything currently the attributes are only used when printing and they can be specified by the user the monitor maker functions such as those in classification monitors c573 also add attributes of their own to the counters they create with the prepend attributes initarg one can easily add new attributes without clobbering those in the initform type rmse in this case

```commonlisp
(princ (make-instance 'rmse-counter
                      :prepend-attributes '(:event "pred."
                                            :dataset "test")))
;; pred. test rmse: 0.000e+0 (0)
=> #<RMSE-COUNTER pred. test rmse: 0.000e+0 (0)>
```

a id x 28mgl core 3aattributes 20 28mgl pax 3aaccessor 20mgl core
3aattributed 29 29 a accessor attributes attributed attributes nil a plist of attribute keys and values a id x 28mgl common 3aname 20 28method 20nil 20 28mgl core 3aattributed 29 29 29 a method name attributed attributed return a string assembled from the values of the attributes cc37 of attributed if there are multiple entries with the same key then they are printed near together values may be padded according to an enclosing with padded attribute printing 2e8b a id x 28mgl core 3awith padded attribute printing 20mgl pax 3amacro 29 a macro with padded attribute printing attributeds body body note the width of values for each attribute key which is the number of characters in the value s princ to string d397 ed representation in body if attributes with the same key are printed they are forced to be at least this wide this allows for nice table like output

```commonlisp
(let ((attributeds
        (list (make-instance 'basic-counter
                             :attributes '(:a 1 :b 23 :c 456))
              (make-instance 'basic-counter
                             :attributes '(:a 123 :b 45 :c 6)))))
  (with-padded-attribute-printing (attributeds)
    (map nil (lambda (attributed)
               (format t "~A~%" attributed))
         attributeds)))
;; 1   23 456: 0.000e+0 (0)
;; 123 45 6  : 0.000e+0 (0)
```

a id x 28mgl core 3alog padded 20function 29 a function log padded attributeds log see log msg attributeds non escaped as in princ f47b or a with the output being as table like as possible a id x 28mgl core 3a 40mgl counter classes 20mgl pax 3asection 29 a 6 3 2 counter classes in addition to the really basic ones here also see classification counters 6598 a id x 28mgl core 3abasic counter 20class 29 a class basic counter attributed 9715 a simple counter whose add to counter 62de takes two additional parameters increments to the internal sums called the numerator c1d9 and denominator 1abb counter values 20e8 returns two values numerator divided by denominator or 0 if denominator is 0 and denominator here is an example that computes the mean of 5 things received in two batches

```commonlisp
(let ((counter (make-instance 'basic-counter)))
  (add-to-counter counter 6.5 3)
  (add-to-counter counter 3.5 2)
  counter)
=> #<BASIC-COUNTER 2.00000e+0 (5)>
```

a id x 28mgl core 3armse counter 20class 29 a class rmse counter basic counter 5979 a basic counter 5979 whose numerator accumulates the square of some statistic it has the attribute type rmse counter values 20e8 returns the square root of what basic counter s counter values would return

```commonlisp
(let ((counter (make-instance 'rmse-counter)))
  (add-to-counter counter (+ (* 3 3) (* 4 4)) 2)
  counter)
=> #<RMSE-COUNTER rmse: 3.53553e+0 (2)>
```

a id x 28mgl core 3aconcat counter 20class 29 a class concat counter attributed 9715 a counter that simply concatenates sequences

```commonlisp
(let ((counter (make-instance 'concat-counter)))
  (add-to-counter counter '(1 2 3) '(4 5))
  (add-to-counter counter '(6 7))
  (counter-values counter))
=> (1 2 3 4 5 6 7)
```

a id x 28mgl core 3aconcatenation type 20 28mgl pax 3areader 20mgl core 3aconcat counter 29 29 a reader concatenation type concat counter concatenation type list a type designator suitable as the result type argument to concatenate 8cb9 a id x 28mgl core 3a 40mgl classification 20mgl pax 3asection 29 a 7 classification in package mgl core to be able to measure classification related quantities we need to define what the label of an instance is customization is possible by implementing a method for a specific type of instance but these functions only ever appear as defaults that can be overridden a id x 28mgl core 3alabel index 20generic function 29 a generic function label index instance return the label of instance as a non negative integer a id x 28mgl core 3alabel index distribution 20generic function 29 a generic function label index distribution instance return a one dimensional array of probabilities representing the distribution of labels the probability of the label with label index cc80 i is the element at index i of the returned array the following two functions are basically the same as the previous two but in batch mode they return a sequence of label indices or distributions these are called on results produced by
models implement these for a model and the monitor maker functions below will automatically work see fixdoc for bpn and boltzmann a id x 28mgl core 3alabel indices 20generic function 29 a generic function label indices results return a sequence of label indices for results produced by some model for a batch of instances this is akin to label index cc80 a id x 28mgl core 3alabel index distributions 20generic function 29 a generic function label index distributions result return a sequence of label index distributions for results produced by some model for a batch of instances this is akin to label index distribution caec a id x 28mgl core 3a 40mgl classification monitor 20mgl pax 3asection 29 a 7 1 classification monitors the following functions return a list of monitors the monitors are for events of signature instances model such as those produced by monitor model results e50c and its various model specific variations they are model agnostic functions extensible to new classifier types a id x 28mgl core 3amake classification accuracy monitors 20function 29 a function make classification accuracy monitors model key operation mode attributes label index fn label index return a list of monitor 7068 objects associated with classification accuracy counter 430d s label index fn is a function like label index cc80 see that function for more implemented in terms of make classification accuracy monitors 2aa3 a id x 28mgl core 3amake cross entropy monitors 20function 29 a function make cross entropy monitors model key operation mode attributes label index distribution fn label index distribution return a list of monitor 7068 objects associated with cross entropy counter b186 s label index distribution fn is a function like label index distribution caec see that function for more implemented in terms of make cross entropy monitors e46f a id x 28mgl core 3amake label monitors 20function 29 a function make label monitors model key operation mode attributes label index fn label
index label index distribution fn label index distribution return classification accuracy and cross entropy monitors see make classification accuracy monitors 911c and make cross entropy monitors 6004 for a description of parameters the monitor makers above can be extended to support new classifier types via the following generic functions a id x 28mgl core 3amake classification accuracy monitors 2a 20generic function 29 a generic function make classification accuracy monitors model operation mode label index fn attributes identical to make classification accuracy monitors 911c bar the keyword arguments specialize this to add support for new model types the default implementation also allows for some extensibility if label indices 31ed is defined on model then it will be used to extract label indices from model results a id x 28mgl core 3amake cross entropy monitors 2a 20generic function 29 a generic function make cross entropy monitors model operation mode label index distribution fn attributes identical to make cross entropy monitors 6004 bar the keyword arguments specialize this to add support for new model types the default implementation also allows for some extensibility if label index distributions 9385 is defined on model then it will be used to extract label distributions from model results a id x 28mgl core 3a 40mgl classification measurer 20mgl pax 3asection 29 a 7 2 classification measurers the functions here compare some known good solution also known as ground truth or target to a prediction or approximation and return some measure of their dis similarity they are model independent hence one has to extract the ground truths and predictions first rarely used directly they are mostly hidden behind classification monitors c573 a id x 28mgl core 3ameasure classification accuracy 20function 29 a function measure classification accuracy truths predictions key test eql truth key prediction key weight return the number of correct
classifications and as the second value the number of instances equal to length of truths in the non weighted case truths keyed by truth key is a sequence of opaque class labels compared with test to another sequence of class labels in predictions keyed by prediction key if weight is non nil then it is a function that returns the weight of an element of truths weighted cases add their weight to both counts returned as the first and second values instead of 1 as in the non weighted case note how the returned values are suitable for multiple value call bf1a with add to counter 62de and a classification accuracy counter 430d a id x 28mgl core 3ameasure cross entropy 20function 29 a function measure cross entropy truths predictions key truth key prediction key min prediction pr 1 0d 15 return the sum of the cross entropy between pairs of elements with the same index of truths and predictions truth key is a function that when applied to an element of truths returns a sequence representing some kind of discrete target distribution p in the definition below truth key may be nil which is equivalent to the identity 8804 function prediction key is the same kind of key for predictions but the sequence it returns represents a distribution that approximates q below the true one cross entropy of the true and approximating distributions is defined as cross-entropy(p, q) = -sum_i p_i log(q_i) of which this function returns the sum over the pairs of elements of truths and predictions keyed by truth key and prediction key due to the logarithm if q_i is close to zero we run into numerical problems to prevent this all q_i that are less than min prediction pr are treated as if they were min prediction pr the second value returned is the sum of p_i over all truths and all i this is normally equal to length truths since elements of truths represent a probability distribution but this is not enforced which allows relative importance of elements to be controlled the third value returned is a
plist that maps each index occurring in the distribution sequences to a list of two elements -sum_j p_j,i log(q_j,i) and sum_j p_j,i where j indexes into truths and predictions

```commonlisp
(measure-cross-entropy '((0 1 0)) '((0.1 0.7 0.2)))
=> 0.35667497
   1
   (2 (0.0 0)
    1 (0.35667497 1)
    0 (0.0 0))
```

note how the returned values are suitable for multiple value call bf1a with add to counter 62de and a cross entropy counter b186 a id x 28mgl core 3ameasure roc auc 20function 29 a function measure roc auc predictions pred key key identity weight return the area under the roc curve for predictions representing predictions for a binary classification problem pred is a predicate function for deciding whether a prediction belongs to the so called positive class key returns a number for each element which is the predictor s idea of how much that element is likely to belong to the class although it s not necessarily a probability if weight is nil then all elements of predictions count as 1 towards the unnormalized sum within auc else weight must be a function like key but it should return the importance a positive real number of elements if the weight of a prediction is 2 then it s as if there were another identical copy of that prediction in predictions the algorithm is based on algorithm 2 in the paper an introduction to roc analysis by tom fawcett roc auc is equal to the probability of a randomly chosen positive having higher key score than a randomly chosen negative element with equal scores in mind a more precise version is auc is the expectation of the above probability over all possible sequences sorted by scores a id x 28mgl core 3ameasure confusion 20function 29 a function measure confusion truths predictions key test eql truth key prediction key weight create a confusion matrix 60d2 from truths and predictions truths keyed by truth key is a sequence of class labels compared with test to another sequence of class labels in predictions keyed by prediction key if weight is non nil then it is a function that
returns the weight of an element of truths weighted cases add their weight to both counts returned as the first and second values note how the returned confusion matrix can be added to another with add to counter 62de a id x 28mgl core 3a 40mgl classification counter 20mgl pax 3asection 29 a 7 3 classification counters a id x 28mgl core 3aclassification accuracy counter 20class 29 a class classification accuracy counter basic counter 5979 a basic counter 5979 with acc as its type attribute and a print object eafc method that prints percentages a id x 28mgl core 3across entropy counter 20class 29 a class cross entropy counter basic counter 5979 a basic counter 5979 with xent as its type attribute a id x 28mgl core 3a 40mgl confusion matrix 20mgl pax 3asection 29 a 7 3 1 confusion matrices a id x 28mgl core 3aconfusion matrix 20class 29 a class confusion matrix a confusion matrix keeps count of classification results the correct class is called target and the output of the classifier is called prediction a id x 28mgl core 3amake confusion matrix 20function 29 a function make confusion matrix key test eql classes are compared with test a id x 28mgl core 3asort confusion classes 20generic function 29 a generic function sort confusion classes matrix classes return a list of classes sorted for presentation purposes a id x 28mgl core 3aconfusion class name 20generic function 29 a generic function confusion class name matrix class name of class for presentation purposes a id x 28mgl core 3aconfusion count 20generic function 29 a generic function confusion count matrix target prediction a id x 28mgl core 3amap confusion matrix 20generic function 29 a generic function map confusion matrix fn matrix call fn with target prediction count 5c66 parameters for each cell in the confusion matrix cells with a zero count may be omitted a id x 28mgl core 3aconfusion matrix classes 20generic function 29 a generic function confusion matrix classes matrix a list of all classes the
default is to collect classes from the counts this can be overridden if for instance some classes are not present in the results a id x 28mgl core 3aconfusion matrix accuracy 20function 29 a function confusion matrix accuracy matrix key filter return the overall accuracy of the results in matrix it s computed as the number of correctly classified cases hits divided by the number of cases return the number of hits and the number of cases as the second and third value if the filter function is given then call it with the target and the prediction of the cell disregard cells for which filter returns nil precision and recall can be easily computed by giving the right filter although those are provided in separate convenience functions a id x 28mgl core 3aconfusion matrix precision 20function 29 a function confusion matrix precision matrix prediction return the accuracy over the cases when the classifier said prediction a id x 28mgl core 3aconfusion matrix recall 20function 29 a function confusion matrix recall matrix target return the accuracy over the cases when the correct class is target a id x 28mgl core 3aadd confusion matrix 20function 29 a function add confusion matrix matrix result matrix add matrix into result matrix a id x 28mgl core 3a 40mgl features 20mgl pax 3asection 29 a 8 features in package mgl core a id x 28mgl core 3a 40mgl feature selection 20mgl pax 3asection 29 a 8 1 feature selection the following scoring functions all return an equal 96d0 hash table that maps features to scores a id x 28mgl core 3acount features 20function 29 a function count features documents mapper key key identity return scored features as an equal 96d0 hash table whose keys are features of documents and values are counts of occurrences of features mapper takes a function and a document and calls function with features of the document

```commonlisp
(sort (alexandria:hash-table-alist
       (count-features '(("hello" "world")
                         ("this" "is" "our" "world"))
                       (lambda (fn document)
                         (map nil fn document))))
      #'string< :key #'car)
=> (("hello" . 1) ("is" . 1) ("our" . 1) ("this" . 1) ("world" . 2))
```

a id x 28mgl core 3afeature llrs 20function 29 a function feature llrs documents mapper class fn key classes all document classes documents class fn return scored features as an equal 96d0 hash table whose keys are features of documents and values are their log likelihood ratios mapper takes a function and a document and calls function with features of the document

```commonlisp
(sort (alexandria:hash-table-alist
       (feature-llrs '((:a "hello" "world")
                       (:b "this" "is" "our" "world"))
                     (lambda (fn document)
                       (map nil fn (rest document)))
                     #'first))
      #'string< :key #'car)
=> (("hello" . 2.6032386) ("is" . 2.6032386) ("our" . 2.6032386)
    ("this" . 2.6032386) ("world" . 4.8428774e-8))
```

a id x 28mgl core 3afeature disambiguities 20function 29 a function feature disambiguities documents mapper class fn key classes all document classes documents class fn return scored features as an equal 96d0 hash table whose keys are features of documents and values are their disambiguities mapper takes a function and a document and calls function with features of the document from the paper using ambiguity measure feature selection algorithm for support vector machine classifier a id x 28mgl core 3a 40mgl feature encoding 20mgl pax 3asection 29 a 8 2 feature encoding features can rarely be fed directly to algorithms as is they need to be transformed in some way suppose we have a simple language model that takes a single word as input and predicts the next word however both input and output are to be encoded as float vectors of length 1000 what we do is find the top 1000 words by some measure see feature selection 1b5e and associate these words with the integers in 0 999 this is encode fedd ing by using for example one hot http en wikipedia org wiki one hot encoding we translate a word into a float vector when passing in the input when the model outputs the probability distribution of the next word we find the index of the max and find the word associated with it this is decode 1339 ing a id x 28mgl core 3aencode 20generic function 29 a generic
function encode encoder decoded encode decoded with encoder this interface is generic enough to be almost meaningless see encoder decoder 1beb for a simple mgl nlp bag of words encoder cbb4 for a slightly more involved example if encoder is a function designator then it s simply funcall 6b4a ed with decoded a id x 28mgl core 3adecode 20generic function 29 a generic function decode decoder encoded decode encoded with decoder for an encoder decoder pair (decode decoder (encode encoder object)) must be equal in some sense to object if decoder is a function designator then it s simply funcall 6b4a ed with encoded a id x 28mgl core 3aencoder 2fdecoder 20class 29 a class encoder decoder implements o 1 encode fedd and decode 1339 by having an internal decoded to encoded and an encoded to decoded equal 96d0 hash table encoder decoder objects can be saved and loaded see persistence 29a1 as long as the elements in the hash tables have read write consistency

```commonlisp
(let ((indexer
        (make-indexer (alexandria:alist-hash-table
                       '(("i" . 3) ("me" . 2) ("mine" . 1)))
                      2)))
  (values (encode indexer "i")
          (encode indexer "me")
          (encode indexer "mine")
          (decode indexer 0)
          (decode indexer 1)
          (decode indexer 2)))
=> 0
   1
   NIL
   "i"
   "me"
   NIL
```

a id x 28mgl core 3amake indexer 20function 29 a function make indexer scored features n key start 0 class encoder decoder take the top n features from scored features see feature selection 1b5e assign indices to them starting from start return an encoder decoder 1beb or another class that converts between objects and indices also see bag of words 0784 a id x 28mgl opt 3a 40mgl opt 20mgl pax 3asection 29 a 9 gradient based optimization in package mgl opt we have a real valued differentiable function f and the task is to find the parameters that minimize its value optimization starts from a single point in the parameter space of f and this single point is updated iteratively based on the gradient and value of f at or around the current point note that while the stated problem is that of global optimization for non
convex functions most algorithms will tend to converge to a local optimum currently there are two optimization algorithms gradient descent 10e7 with several variants and conjugate gradient 83e6 both of which are first order methods they do not need second order gradients but more can be added with the extension api 6a6f a id x 28mgl opt 3aminimize 20function 29 a function minimize optimizer gradient source key weights list segments gradient source dataset infinitely empty dataset minimize the value of the real valued function represented by gradient source by updating some of its parameters in weights a mat or a sequence of mat s return weights dataset see datasets 109e is a set of unoptimized parameters of the same function for example weights may be the weights of a neural network while dataset is the training set consisting of inputs suitable for set input 0c9e the default dataset infinitely empty dataset ad8f is suitable for when all parameters are optimized so there is nothing left to come from the environment optimization terminates if dataset is a sampler and it runs out or when some other condition is met see termination 9006 for example if dataset is a sequence b9c1 then it is reused over and over again examples for various optimizers are provided in gradient descent 10e7 and conjugate gradient 83e6 a id x 28mgl opt 3a 40mgl opt iterative optimizer 20mgl pax 3asection 29 a 9 1 iterative optimizer a id x 28mgl opt 3aiterative optimizer 20class 29 a class iterative optimizer an abstract base class of gradient descent 10e7 and conjugate gradient 83e6 based optimizers that iterate over instances until a termination condition is met a id x 28mgl opt 3an instances 20 28mgl pax 3areader 20mgl opt 3aiterative optimizer 29 29 a reader n instances iterative optimizer n instances 0 the number of instances this optimizer has seen so far incremented automatically during optimization a id x 28mgl opt 3atermination 20 28mgl pax 3aaccessor 20mgl opt 3aiterative optimizer 29
29 a accessor termination iterative optimizer termination nil if a number it s the number of instances to train on in the sense of n instances 4c73 if n instances is equal or greater than this value optimization stops if termination is nil then optimization will continue if it is t then optimization will stop if it is a function of no arguments then its return value is processed as if it was returned by termination a id x 28mgl opt 3aon optimization started 20 28mgl pax 3aaccessor 20mgl opt 3aiterative optimizer 29 29 a accessor on optimization started iterative optimizer on optimization started nil an event hook with parameters optimizer gradient source n instances called after initializations are performed initialize optimizer initialize gradient source but before optimization is started a id x 28mgl opt 3aon optimization finished 20 28mgl pax 3aaccessor 20mgl opt 3aiterative optimizer 29 29 a accessor on optimization finished iterative optimizer on optimization finished nil an event hook with parameters optimizer gradient source n instances called when optimization has finished a id x 28mgl opt 3aon n instances changed 20 28mgl pax 3aaccessor 20mgl opt 3aiterative optimizer 29 29 a accessor on n instances changed iterative optimizer on n instances changed nil an event hook with parameters optimizer gradient source n instances called when optimization of a batch of instances is done and n instances 4c73 is incremented now let s discuss a few handy utilities a id x 28mgl opt 3amonitor optimization periodically 20function 29 a function monitor optimization periodically optimizer periodic fns for each periodic function in the list of periodic fns add a monitor to optimizer s on optimization started ebd4 on optimization finished 0072 and on n instances changed 4f0b hooks the monitors are simple functions that just call each periodic function with the event parameters optimizer gradient source n instances 4c73 return optimizer to log and reset the monitors of the 
gradient source after every 1000 instances seen by optimizer

```commonlisp
(monitor-optimization-periodically
 optimizer '((:fn log-my-test-error
              :period 2000)
             (:fn reset-optimization-monitors
              :period 1000
              :last-eval 0)))
```

note that it s allowed to just pass the initargs for a periodic fn instead of a periodic fn itself the last eval 0 bit prevents reset optimization monitors ca09 from being called at the start of the optimization when the monitors are empty anyway a id x 28mgl opt 3areset optimization monitors 20generic function 29 a generic function reset optimization monitors optimizer gradient source report the state of monitors 8f37 of optimizer and gradient source and reset their counters see monitor optimization periodically 4528 for an example of how this is used a id x 28mgl opt 3areset optimization monitors 20 28method 20nil 20 28mgl opt 3aiterative optimizer 20t 29 29 29 a method reset optimization monitors optimizer iterative optimizer gradient source log the counters of the monitors of optimizer and gradient source and reset them a id x 28mgl opt 3areport optimization parameters 20generic function 29 a generic function report optimization parameters optimizer gradient source a utility that s often called at the start of optimization from on optimization started ebd4 the default implementation logs the description of gradient source as in describe 38a0 and optimizer and calls log mat room a id x 28mgl opt 3a 40mgl opt cost 20mgl pax 3asection 29 a 9 2 cost function the function being minimized is often called the cost or the loss function a id x 28mgl common 3acost 20generic function 29 a generic function cost model return the value of the cost function being minimized calling this only makes sense in the context of an ongoing optimization see minimize 46a4 the cost is that of a batch of instances a id x 28mgl opt 3amake cost monitors 20function 29 a function make cost monitors model key operation mode attributes return a list of monitor 7068 objects each associated with
one basic counter 5979 with attribute type cost implemented in terms of make cost monitors 3815 a id x 28mgl opt 3amake cost monitors 2a 20generic function 29 a generic function make cost monitors model operation mode attributes identical to make cost monitors 46c2 bar the keyword arguments specialize this to add support for new model types a id x 28mgl gd 3a 40mgl gd 20mgl pax 3asection 29 a 9 3 gradient descent in package mgl gd gradient descent is a first order optimization algorithm relying completely on first derivatives it does not even evaluate the function to be minimized let s see how to minimize a numerical lisp function with respect to some of its parameters a id x 28mgl gd 3asgd 2elisp 20 28mgl pax 3ainclude 20 23p 22 2fhome 2fmelisgl 2fown 2fmgl 2fexample 2fsgd 2elisp 22 20 3aheader nl 20 22 60 60 60commonlisp 22 20 3afooter nl 20 22 60 60 60 22 29 29 a

```commonlisp
(cl:defpackage :mgl-example-sgd
  (:use #:common-lisp #:mgl))

(in-package :mgl-example-sgd)

;;; Create an object representing the sine function.
(defparameter *diff-fn-1*
  (make-instance 'mgl-diffun:diffun
                 :fn #'sin
                 ;; We are going to optimize its only parameter.
                 :weight-indices '(0)))

;;; Minimize SIN. Note that there is no dataset involved because all
;;; parameters are being optimized.
(minimize (make-instance 'sgd-optimizer :termination 1000)
          *diff-fn-1*
          :weights (make-mat 1))
;;; => A MAT with a single value of about -pi/2.

;;; Create a differentiable function for f(x,y) = (x - y)^2. X is a
;;; parameter whose values come from the DATASET argument passed to
;;; MINIMIZE. Y is a parameter to be optimized (a 'weight').
(defparameter *diff-fn-2*
  (make-instance 'mgl-diffun:diffun
                 :fn (lambda (x y)
                       (expt (- x y) 2))
                 :parameter-indices '(0)
                 :weight-indices '(1)))

;;; Find the Y that minimizes the distance from the instances
;;; generated by the sampler.
(minimize (make-instance 'sgd-optimizer :batch-size 10)
          *diff-fn-2*
          :weights (make-mat 1)
          :dataset (make-instance 'function-sampler
                                  :generator (lambda ()
                                               (list (+ 10
                                                        (gaussian-random-1))))
                                  :max-n-samples 1000))
;;; => A MAT with a single value of about 10, the expected value of
;;; the instances in the dataset.

;;; The dataset can be a SEQUENCE in which case we'd better set
;;; TERMINATION else optimization would never finish.
(minimize (make-instance 'sgd-optimizer :termination 1000)
          *diff-fn-2*
          :weights (make-mat 1)
          :dataset '(0 1 2 3 4 5))
;;; => A MAT with a single value of about 2.5.
```

we are going to see a number of accessors for optimizer parameters in general it s allowed to setf 17b7 real slot accessors as opposed to readers and writers at any time during optimization and so is defining a method on an optimizer subclass that computes the value in any way for example to decay the learning rate on a per mini batch basis

```commonlisp
(defmethod learning-rate ((optimizer my-sgd-optimizer))
  (* (slot-value optimizer 'learning-rate)
     (expt 0.998
           (/ (n-instances optimizer) 60000))))
```

a id x 28mgl gd 3a 40mgl gd batch gd optimizer 20mgl pax 3asection 29 a 9 3 1 batch based optimizers first let s see everything common to all batch based optimizers then discuss sgd optimizer 25fd adam optimizer bd13 and normalized batch optimizer 0c91 all batch based optimizers are iterative optimizer 8da0 s so see iterative optimizer 779d too a id x 28mgl gd 3abatch gd optimizer 20class 29 a class batch gd optimizer another abstract base class for gradient based optimizers that updates all weights simultaneously after chewing through batch size ba18 inputs see subclasses sgd optimizer 2a2f adam optimizer e0e6 and normalized batch gd optimizer f6ae per weight batch gd optimizer 5a43 may be a better choice when some weights can go unused for instance due to missing input values a id x 28mgl common 3abatch size 20 28mgl pax 3aaccessor 20mgl gd 3a 3agd optimizer 29 29 a accessor batch size gd optimizer batch size 1 after having gone through batch size number of inputs weights are updated with batch size 1 one gets stochastic gradient descent with batch size equal to the number of instances in the dataset one gets standard batch gradient descent with batch size between these two extremes one gets the most practical mini batch
compromise a id x 28mgl gd 3alearning rate 20 28mgl pax 3aaccessor 20mgl gd 3a 3agd optimizer 29 29 a accessor learning rate gd optimizer learning rate 0 1 this is the step size along the gradient decrease it if optimization diverges increase it if it doesn t make progress a id x 28mgl gd 3amomentum 20 28mgl pax 3aaccessor 20mgl gd 3a 3agd optimizer 29 29 a accessor momentum gd optimizer momentum 0 a value in the 0 1 interval momentum times the previous weight change is added to the gradient 0 means no momentum a id x 28mgl gd 3amomentum type 20 28mgl pax 3areader 20mgl gd 3a 3agd optimizer 29 29 a reader momentum type gd optimizer momentum type normal one of normal nesterov or none for pure optimization nesterov s momentum may be better but it may also increase chances of overfitting using none is equivalent to 0 momentum but it also uses less memory note that with none momentum af05 is ignored even if it is non zero a id x 28mgl gd 3aweight decay 20 28mgl pax 3aaccessor 20mgl gd 3a 3agd optimizer 29 29 a accessor weight decay gd optimizer weight decay 0 an l2 penalty it discourages large weights much like a zero mean gaussian prior weight decay weight is added to the gradient to penalize large weights it s as if the function whose minimum is sought had weight decay sum i 0 5 weight i 2 added to it a id x 28mgl gd 3aweight penalty 20 28mgl pax 3aaccessor 20mgl gd 3a 3agd optimizer 29 29 a accessor weight penalty gd optimizer weight penalty 0 an l1 penalty it encourages sparsity sign weight weight penalty is added to the gradient pushing the weight towards negative infinity it s as if the function whose minimum is sought had weight penalty sum i abs weight i added to it putting it on feature biases constitutes a sparsity constraint on the features a id x 28mgl gd 3ause segment derivatives p 20 28mgl pax 3areader 20mgl gd 3a 3agd optimizer 29 29 a reader use segment derivatives p gd optimizer use segment derivatives p nil save memory if both the gradient source the
model being optimized and the optimizer support this feature it works like this the accumulator into which the gradient source is asked to place the derivatives of a segment will be segment derivatives 9a5b of the segment this allows the optimizer not to allocate an accumulator matrix into which the derivatives are summed a id x 28mgl gd 3aafter update hook 20 28mgl pax 3aaccessor 20mgl gd 3a 3agd optimizer 29 29 a accessor after update hook gd optimizer after update hook nil a list of functions with no arguments called after each weight update a id x 28mgl gd 3abefore update hook 20 28mgl pax 3aaccessor 20mgl gd 3abatch gd optimizer 29 29 a accessor before update hook batch gd optimizer before update hook nil a list of functions of no parameters each function is called just before a weight update takes place after accumulated gradients have been divided by the length of the batch convenient to hang some additional gradient accumulating code on a id x 28mgl gd 3a 40mgl gd sgd optimizer 20mgl pax 3asection 29 a sgd optimizer a id x 28mgl gd 3asgd optimizer 20class 29 a class sgd optimizer batch gd optimizer d94e with batch size 0 8916 1 ba18 2 c918 1 this is stochastic gradient descent with higher batch sizes one gets mini batch and batch gradient descent assuming that accumulator has the sum of gradients for a mini batch the weight update looks like this $$\Delta_w^{t+1} = momentum \cdot \Delta_w^t + \frac{accumulator}{batchSize} + l_2 w + l_1 \operatorname{sign}(w)$$ $$w^{t+1} = w^t - learningRate \cdot \Delta_w^{t+1}$$ which is the same as the more traditional formulation $$\Delta_w^{t+1} = momentum \cdot \Delta_w^t + learningRate \left( \frac{\frac{df}{dw}}{batchSize} + l_2 w + l_1 \operatorname{sign}(w) \right)$$ $$w^{t+1} = w^t - \Delta_w^{t+1}$$ but the former works better when batch size momentum or learning rate change during the course of optimization the above is with normal momentum nesterov s momentum see momentum type 5611 momentum is also available see batch based optimizers 2c39 for the description of the various options common to all batch based optimizers a id x 28mgl gd 3a
40mgl gd adam optimizer 20mgl pax 3asection 29 a adam optimizer a id x 28mgl gd 3aadam optimizer 20class 29 a class adam optimizer batch gd optimizer d94e adam is a first order stochastic gradient descent optimizer it maintains an internal estimation for the mean and raw variance of each derivative as exponential moving averages the step it takes is basically m / (sqrt(v) + e) where m is the estimated mean v is the estimated variance and e is a small adjustment factor to prevent the gradient from blowing up see version 5 of the paper http arxiv org abs 1412 6980 for more note that using momentum is not supported with adam in fact an error is signalled if it s not none see batch based optimizers 2c39 for the description of the various options common to all batch based optimizers a id x 28mgl gd 3alearning rate 20 28mgl pax 3aaccessor 20mgl gd 3aadam optimizer 29 29 a accessor learning rate adam optimizer 2 0e 4 same thing as learning rate 09ed but with the default suggested by the adam paper a id x 28mgl gd 3amean decay 20 28mgl pax 3aaccessor 20mgl gd 3aadam optimizer 29 29 a accessor mean decay adam optimizer mean decay 0 9 a number between 0 and 1 that determines how fast the estimated mean of derivatives is updated 0 basically gives you rmsprop if variance decay 0900 is not too large or adagrad if variance decay is close to 1 and the learning rate is annealed this is beta 1 in the paper a id x 28mgl gd 3amean decay decay 20 28mgl pax 3aaccessor 20mgl gd 3aadam optimizer 29 29 a accessor mean decay decay adam optimizer mean decay decay 1 1 0d 7 a value that should be close to 1 mean decay 011d is multiplied by this value after each update this is lambda in the paper a id x 28mgl gd 3avariance decay 20 28mgl pax 3aaccessor 20mgl gd 3aadam optimizer 29 29 a accessor variance decay adam optimizer variance decay 0 999 a number between 0 and 1 that determines how fast the estimated variance of derivatives is updated this is beta 2 in the paper a id x 28mgl gd 3avariance
adjustment 20 28mgl pax 3aaccessor 20mgl gd 3aadam optimizer 29 29 a accessor variance adjustment adam optimizer variance adjustment 1 0d 7 within the bowels of adam the estimated mean is divided by the square root of the estimated variance per weight which can lead to numerical problems if the denominator is near zero to avoid this variance adjustment which should be a small positive number is added to the denominator this is epsilon in the paper a id x 28mgl gd 3a 40mgl gd normalized batch gd optimizer 20mgl pax 3asection 29 a normalized batch optimizer a id x 28mgl gd 3anormalized batch gd optimizer 20class 29 a class normalized batch gd optimizer batch gd optimizer d94e like batch gd optimizer d94e but keeps count of how many times each weight was used in the batch and divides the accumulated gradient by this count instead of dividing by n instances in batch this only makes a difference if there are missing values in the learner that s being trained the main feature that distinguishes this class from per weight batch gd optimizer 5a43 is that batches end at the same time for all weights a id x 28mgl gd 3an weight uses in batch 20 28mgl pax 3aaccessor 20mgl gd 3anormalized batch gd optimizer 29 29 a accessor n weight uses in batch normalized batch gd optimizer number of uses of the weight in its current batch a id x 28mgl gd 3a 40mgl gd segmented gd optimizer 20mgl pax 3asection 29 a 9 3 2 segmented gd optimizer a id x 28mgl gd 3asegmented gd optimizer 20class 29 a class segmented gd optimizer an optimizer that delegates training of segments to other optimizers useful to delegate training of different segments to different optimizers capable of working with segmentables or simply to not train all segments a id x 28mgl gd 3asegmenter 20 28mgl pax 3areader 20mgl gd 3asegmented gd optimizer 29 29 a reader segmenter segmented gd optimizer segmenter when this optimizer is initialized it loops over the segments of the learner with map segments 2312 segmenter is a function
that is called with each segment and returns an optimizer or nil several segments may be mapped to the same optimizer after the segment optimizer mappings are collected each optimizer is initialized by initialize optimizer with the list of segments mapped to it a id x 28mgl opt 3asegments 20 28mgl pax 3areader 20mgl gd 3asegmented gd optimizer 29 29 a reader segments segmented gd optimizer segmented gd optimizer 3ce0 inherits from iterative optimizer 8da0 so see iterative optimizer 779d too a id x 28mgl gd 3a 40mgl gd per weight optimization 20mgl pax 3asection 29 a 9 3 3 per weight optimization a id x 28mgl gd 3aper weight batch gd optimizer 20class 29 a class per weight batch gd optimizer this is much like batch based optimizers 2c39 but it is more clever about when to update weights basically every weight has its own batch independent from the batches of others this has desirable properties one can for example put two neural networks together without adding any connections between them and the learning will produce results equivalent to the separated case also adding inputs with only missing values does not change anything due to its very non batch nature there is no cuda implementation of this optimizer a id x 28mgl gd 3an weight uses in batch 20 28mgl pax 3aaccessor 20mgl gd 3aper weight batch gd optimizer 29 29 a accessor n weight uses in batch per weight batch gd optimizer number of uses of the weight in its current batch a id x 28mgl gd 3a 40mgl gd utilities 20mgl pax 3asection 29 a 9 3 4 utilities a id x 28mgl gd 3aclip l2 norm 20function 29 a function clip l2 norm mats l2 upper bound key callback scale mats so that their l 2 norm does not exceed l2 upper bound compute the norm of mats as if they were a single vector if the norm is greater than l2 upper bound then scale each matrix destructively by the norm divided by l2 upper bound and if non nil call the function callback with the scaling factor a id x 28mgl gd 3aarrange for clipping gradients
20function 29 a function arrange for clipping gradients batch gd optimizer l2 upper bound key callback make it so that the norm of the batch normalized gradients accumulated by batch gd optimizer is clipped to l2 upper bound before every update see clip l2 norm af6b a id x 28mgl cg 3a 40mgl cg 20mgl pax 3asection 29 a 9 4 conjugate gradient in package mgl cg conjugate gradient is a first order optimization algorithm it s more advanced than gradient descent as it does line searches which unfortunately also makes it unsuitable for non deterministic functions let s see how to minimize a numerical lisp function with respect to some of its parameters create an object representing the sine function defparameter diff fn 1 make instance mgl diffun diffun fn sin we are going to optimize its only parameter weight indices 0 minimize sin note that there is no dataset involved because all parameters are being optimized minimize make instance cg optimizer batch size 1 termination 1 diff fn 1 weights make mat 1 a mat with a single value of about pi 2 create a differentiable function for f x y x y 2 x is a parameter whose values come from the dataset argument passed to minimize y is a parameter to be optimized a weight defparameter diff fn 2 make instance mgl diffun diffun fn lambda x y expt x y 2 parameter indices 0 weight indices 1 find the y that minimizes the distance from the instances generated by the sampler minimize make instance cg optimizer batch size 10 diff fn 2 weights make mat 1 dataset make instance function sampler generator lambda list 10 gaussian random 1 max n samples 1000 a mat with a single value of about 10 the expected value of the instances in the dataset the dataset can be a sequence in which case we d better set termination else optimization would never finish note how a single epoch suffices minimize make instance cg optimizer termination 6 diff fn 2 weights make mat 1 dataset 0 1 2 3 4 5 a mat with a single value of about 2 5 a id x 28mgl cg 3acg 
20function 29 a function cg fn w key max n line searches default max n line searches max n evaluations per line search default max n evaluations per line search max n evaluations default max n evaluations sig default sig rho default rho int default int ext default ext ratio default ratio spare vectors cg optimizer ee97 passes each batch of data to this function with its cg args 9749 passed on minimize a differentiable multivariate function with conjugate gradient the polak ribiere flavour of conjugate gradients is used to compute search directions and a line search using quadratic and cubic polynomial approximations and the wolfe powell stopping criteria is used together with the slope ratio method for guessing initial step sizes additionally a bunch of checks are made to make sure that exploration is taking place and that extrapolation will not be unboundedly large fn is a function of two parameters weights 0 3c96 1 a896 and derivatives weights is a mat of the same size as w that is where the search starts from derivatives is also a mat of that size and it is where fn shall place the partial derivatives fn returns the value of the function that is being minimized cg performs a number of line searches and invokes fn at each step a line search invokes fn at most max n evaluations per line search number of times and can succeed in improving the minimum by a sufficient margin or it can fail note that even a failed line search may improve further and hence change the weights it s just that the improvement was deemed too small cg stops when either two line searches fail in a row max n line searches is reached max n evaluations is reached cg returns a mat that contains the best weights the minimum the number of line searches performed the number of successful line searches and the number of evaluations when using max n evaluations remember that there is an extra evaluation of fn before the first line search spare vectors is a list of preallocated mat s of the same size as
w passing 6 of them covers the current need of the algorithm and it will not cons up vectors of size w at all note if the function terminates within a few iterations it could be an indication that the function values and derivatives are not consistent ie there may be a bug in the implementation of fn function sig and rho are the constants controlling the wolfe powell conditions sig is the maximum allowed absolute ratio between previous and new slopes derivatives in the search direction thus setting sig to low positive values forces higher precision in the line searches rho is the minimum allowed fraction of the expected from the slope at the initial point in the linesearch constants must satisfy 0 rho sig 1 tuning of sig depending on the nature of the function to be optimized may speed up the minimization it is probably not worth playing much with rho a id x 28mgl cg 3a 2adefault int 2a 20variable 29 a variable default int 0 1 don t reevaluate within int of the limit of the current bracket a id x 28mgl cg 3a 2adefault ext 2a 20variable 29 a variable default ext 3 extrapolate maximum ext times the current step size a id x 28mgl cg 3a 2adefault sig 2a 20variable 29 a variable default sig 0 1 sig and rho are the constants controlling the wolfe powell conditions sig is the maximum allowed absolute ratio between previous and new slopes derivatives in the search direction thus setting sig to low positive values forces higher precision in the line searches a id x 28mgl cg 3a 2adefault rho 2a 20variable 29 a variable default rho 0 05 rho is the minimum allowed fraction of the expected from the slope at the initial point in the linesearch constants must satisfy 0 rho sig 1 a id x 28mgl cg 3a 2adefault ratio 2a 20variable 29 a variable default ratio 10 maximum allowed slope ratio a id x 28mgl cg 3a 2adefault max n line searches 2a 20variable 29 a variable default max n line searches nil a id x 28mgl cg 3a 2adefault max n evaluations per line search 2a 20variable 29 a 
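The default variables above only supply fall-back values; per-optimizer overrides go through CG-ARGS, which CG-OPTIMIZER passes on to the CG function. A minimal sketch of that pattern, reusing DIFF-FN-1 from the earlier example (it assumes MGL is loaded and that :CG-ARGS is the initarg corresponding to the CG-ARGS accessor):

```commonlisp
;; Sketch: tightening the line search limits for one optimizer via
;; CG-ARGS instead of rebinding the *DEFAULT-...* variables globally.
;; The keyword names mirror CG's lambda list shown above.
(minimize (make-instance 'cg-optimizer
                         :batch-size 1
                         :termination 1
                         :cg-args (list :max-n-line-searches 3
                                        :max-n-evaluations-per-line-search 10))
          diff-fn-1
          :weights (make-mat 1))
```

Rebinding *DEFAULT-MAX-N-LINE-SEARCHES* and friends would have the same effect for every CG call in the dynamic extent, so CG-ARGS is preferable when only one optimizer needs different limits.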
variable default max n evaluations per line search 20 a id x 28mgl cg 3a 2adefault max n evaluations 2a 20variable 29 a variable default max n evaluations nil a id x 28mgl cg 3acg optimizer 20class 29 a class cg optimizer iterative optimizer 8da0 updates all weights simultaneously after chewing through batch size 0 8916 1 ba18 2 c918 inputs a id x 28mgl common 3abatch size 20 28mgl pax 3aaccessor 20mgl cg 3acg optimizer 29 29 a accessor batch size cg optimizer batch size after having gone through batch size number of instances weights are updated normally cg 4ffb operates on all available data but it may be useful to introduce some noise into the optimization to reduce overfitting by using smaller batch sizes if batch size is not set it is initialized to the size of the dataset at the start of optimization a id x 28mgl cg 3acg args 20 28mgl pax 3aaccessor 20mgl cg 3acg optimizer 29 29 a accessor cg args cg optimizer cg args nil a id x 28mgl cg 3aon cg batch done 20 28mgl pax 3aaccessor 20mgl cg 3acg optimizer 29 29 a accessor on cg batch done cg optimizer on cg batch done nil an event hook called when processing a conjugate gradient batch is done the handlers on the hook are called with 8 arguments optimizer gradient source instances best w best f n line searches n successful line searches n evaluations the latter 5 of which are the return values of the cg 4ffb function a id x 28mgl cg 3alog cg batch done 20generic function 29 a generic function log cg batch done optimizer gradient source instances best w best f n line searches n successful line searches n evaluations this is a function that can be added to on cg batch done d10a the default implementation simply logs the event arguments a id x 28mgl cg 3asegment filter 20 28mgl pax 3areader 20mgl cg 3acg optimizer 29 29 a reader segment filter cg optimizer segment filter constantly t a predicate function on segments that filters out uninteresting segments called from initialize optimizer 7c2f a id x 28mgl opt 3a 40mgl opt
extension api 20mgl pax 3asection 29 a 9 5 extension api a id x 28mgl opt 3a 40mgl opt optimizer 20mgl pax 3asection 29 a 9 5 1 implementing optimizers the following generic functions must be specialized for new optimizer types a id x 28mgl opt 3aminimize 2a 20generic function 29 a generic function minimize optimizer gradient source weights dataset called by minimize 46a4 after initialize optimizer 7c2f and initialize gradient source dd95 this generic function is the main extension point for writing optimizers a id x 28mgl opt 3ainitialize optimizer 2a 20generic function 29 a generic function initialize optimizer optimizer gradient source weights dataset called automatically before training starts this function sets up optimizer to be suitable for optimizing gradient source it typically creates appropriately sized accumulators for the gradients a id x 28mgl opt 3asegments 20generic function 29 a generic function segments optimizer several weight matrices known as segments can be optimized by a single optimizer this function returns them as a list the rest are just useful for utilities for implementing optimizers a id x 28mgl opt 3aterminate optimization p 20function 29 a function terminate optimization p n instances termination utility function for subclasses of iterative optimizer 8da0 it returns whether optimization is to be terminated based on n instances and termination that are values of the respective accessors of iterative optimizer a id x 28mgl opt 3aset n instances 20function 29 a function set n instances optimizer gradient source n instances set n instances 4c73 of optimizer and fire on n instances changed 4f0b iterative optimizer 8da0 subclasses must call this to increment n instances 4c73 a id x 28mgl opt 3asegment set 20class 29 a class segment set this is a utility class for optimizers that have a list of segments f00d and the weights being optimized is able to copy back and forth between those segments and a single mat the accumulator a id x 28mgl 
opt 3asegments 20 28mgl pax 3areader 20mgl opt 3asegment set 29 29 a reader segments segment set segments a list of weight matrices a id x 28mgl common 3asize 20 28mgl pax 3areader 20mgl opt 3asegment set 29 29 a reader size segment set the sum of the sizes of the weight matrices of segments f00d a id x 28mgl opt 3ado segment set 20mgl pax 3amacro 29 a macro do segment set segment optional start segment set body body iterate over segments f00d in segment set if start is specified then it is bound to the start index of segment within segment set the start index is the sum of the sizes of previous segments a id x 28mgl opt 3asegment set 3c mat 20function 29 a function segment set mat segment set mat copy the values of mat to the weight matrices of segment set as if they were concatenated into a single mat a id x 28mgl opt 3asegment set 3emat 20function 29 a function segment set mat segment set mat copy the values of segment set to mat as if they were concatenated into a single mat a id x 28mgl opt 3a 40mgl opt gradient source 20mgl pax 3asection 29 a 9 5 2 implementing gradient sources weights can be stored in a multitude of ways optimizers need to update weights so it is assumed that weights are stored in any number of mat objects called segments the generic functions in this section must all be specialized for new gradient sources except where noted a id x 28mgl opt 3amap segments 20generic function 29 a generic function map segments fn gradient source apply fn to each segment of gradient source a id x 28mgl opt 3amap segment runs 20generic function 29 a generic function map segment runs fn segment call fn with start and end of intervals of consecutive indices that are not missing in segment called by optimizers that support partial updates the default implementation assumes that all weights are present this only needs to be specialized if one plans to use an optimizer that knows how to deal with unused missing weights such as mgl gd normalized batch gd optimizer f6ae
and optimizer mgl gd per weight batch gd optimizer 5a43 a id x 28mgl opt 3asegment weights 20generic function 29 a generic function segment weights segment return the weight matrix of segment a segment doesn t need to be a mat object itself for example it may be a mgl bm chunk of a mgl bm bm or a mgl bp lump c1ac of a mgl bp bpn 5187 whose nodes cc1c slot holds the weights a id x 28mgl opt 3asegment weights 20 28method 20nil 20 28mgl mat 3amat 29 29 29 a method segment weights mat mat when the segment is really a mat then just return it a id x 28mgl opt 3asegment derivatives 20generic function 29 a generic function segment derivatives segment return the derivatives matrix of segment a segment doesn t need to be a mat object itself for example it may be a mgl bm chunk of a mgl bm bm or a mgl bp lump c1ac of a mgl bp bpn 5187 whose derivatives slot holds the gradient a id x 28mgl opt 3alist segments 20function 29 a function list segments gradient source a utility function that returns the list of segments from map segments 2312 on gradient source a id x 28mgl opt 3ainitialize gradient source 2a 20generic function 29 a generic function initialize gradient source optimizer gradient source weights dataset called automatically before minimize ae3d is called this function may be specialized if gradient source needs some kind of setup a id x 28mgl opt 3ainitialize gradient source 2a 20 28method 20nil 20 28t 20t 20t 20t 29 29 29 a method initialize gradient source optimizer gradient source weights dataset the default method does nothing a id x 28mgl opt 3aaccumulate gradients 2a 20generic function 29 a generic function accumulate gradients gradient source sink batch multiplier valuep add multiplier times the sum of first order gradients to accumulators of sink normally accessed with do gradient sink 20ca and if valuep return the sum of values of the function being optimized for a batch of instances gradient source is the object representing the function being optimized sink 
is gradient sink note the number of instances in batch may be larger than what gradient source process in one go in the sense of say max n stripes 16c4 so do batches for model faaa or something like group batch max n stripes can be handy a id x 28mgl opt 3a 40mgl opt gradient sink 20mgl pax 3asection 29 a 9 5 3 implementing gradient sinks optimizers call accumulate gradients 4bf1 on gradient sources one parameter of accumulate gradients is the sink a gradient sink knows what accumulator matrix if any belongs to a segment sinks are defined entirely by map gradient sink aabd a id x 28mgl opt 3amap gradient sink 20generic function 29 a generic function map gradient sink fn sink call fn of lambda list segment accumulator on each segment and their corresponding accumulator mat in sink a id x 28mgl opt 3ado gradient sink 20mgl pax 3amacro 29 a macro do gradient sink segment accumulator sink body body a convenience macro on top of map gradient sink aabd a id x 28mgl diffun 3a 40mgl diffun 20mgl pax 3asection 29 a 10 differentiable functions in package mgl diffun a id x 28mgl diffun 3adiffun 20class 29 a class diffun diffun dresses a lisp function in its fn f491 slot as a gradient source see implementing gradient sources c58b which allows it to be used in minimize 46a4 see the examples in gradient descent 10e7 and conjugate gradient 83e6 a id x 28mgl common 3afn 20 28mgl pax 3areader 20mgl diffun 3adiffun 29 29 a reader fn diffun fn a real valued lisp function it may have any number of parameters a id x 28mgl diffun 3aparameter indices 20 28mgl pax 3areader 20mgl diffun 3adiffun 29 29 a reader parameter indices diffun parameter indices nil the list of indices of parameters that we don t optimize values for these will come from the dataset argument of minimize 46a4 a id x 28mgl diffun 3aweight indices 20 28mgl pax 3areader 20mgl diffun 3adiffun 29 29 a reader weight indices diffun weight indices nil the list of indices of parameters to be optimized the values of which will 
come from the weights argument of minimize 46a4 a id x 28mgl bp 3a 40mgl bp 20mgl pax 3asection 29 a 11 backpropagation neural networks in package mgl bp a id x 28mgl bp 3a 40mgl bp overview 20mgl pax 3asection 29 a 11 1 backprop overview backpropagation neural networks are just functions with lots of parameters called weights and a layered structure when presented as a computational graph http en wikipedia org wiki automatic differentiation the network is trained to minimize 46a4 some kind of loss function whose value the network computes in this implementation a bpn 5187 is assembled from several lump c1ac s roughly corresponding to layers both feed forward and recurrent neural nets are supported fnn 9de4 and rnn b0f3 respectively bpn s can contain not only lump s but other bpn s too as we see networks are composite objects and the abstract base class for composite and simple parts is called clump a4fe a id x 28mgl bp 3aclump 20class 29 a class clump a clump is a lump c1ac or a bpn 5187 it represents a differentiable function arguments of clumps are given during instantiation some arguments are clumps themselves so they get permanently wired together like this commonlisp v m input size 10 name input weight dimensions 10 20 name weight name activation the above creates three clumps the vector matrix multiplication clump called activation which has a reference to its operands input and weight note that the example just defines a function no actual computation has taken place yet this wiring of clump s is how one builds feed forward nets fnn 9de4 or recurrent neural networks rnn b0f3 that are clump s themselves so one can build nets in a hierarchical style if desired non composite clump s are called lump note the loss of c that stands for composite the various lump subtypes correspond to different layer types sigmoid 83f9 dropout 441b relu 9d3a tanh 5309 etc at this point you may want to jump ahead to get a feel for how things work by reading the fnn tutorial 6b38 a
id x 28mgl bp 3a 40mgl bp extension api 20mgl pax 3asection 29 a 11 2 clump api these are mostly for extension purposes about the only thing needed from here for normal operation is nodes cc1c when clamping inputs or extracting predictions a id x 28mgl bp 3astripedp 20generic function 29 a generic function stripedp clump for efficiency forward and backprop phases do their stuff in batch mode passing a number of instances through the network in batches thus clumps must be able to store values of and gradients for each of these instances however some clumps produce the same result for each instance in a batch these clumps are the weights the parameters of the network stripedp returns true iff clump does not represent weights i e it s not a weight b76f for striped clumps their nodes cc1c and derivatives a81b are mat objects with a leading dimension number of rows in the 2d case equal to the number of instances in the batch non striped clumps have no restriction on their shape apart from what their usage dictates a id x 28mgl common 3anodes 20generic function 29 a generic function nodes object returns a mat object representing the state or result of object the first dimension of the returned matrix is equal to the number of stripes clump a4fe s nodes cc1c holds the result computed by the most recent forward c1ae for input f54e lumps this is where input values shall be placed see set input 0c9e currently the matrix is always two dimensional but this restriction may go away in the future a id x 28mgl bp 3aderivatives 20generic function 29 a generic function derivatives clump return the mat object representing the partial derivatives of the function clump computes the returned partial derivatives were accumulated by previous backward 5bd4 calls this matrix is shaped like the matrix returned by nodes cc1c a id x 28mgl bp 3aforward 20generic function 29 a generic function forward clump compute the values of the function represented by clump for all stripes and place the 
results into nodes cc1c of clump a id x 28mgl bp 3abackward 20generic function 29 a generic function backward clump compute the partial derivatives of the function represented by clump and add them to derivatives a81b of the corresponding argument clumps the derivatives of clump contains the sum of partial derivatives of all clumps by the corresponding output this function is intended to be called after a forward c1ae pass take the sigmoid 83f9 clump for example when the network is being applied to a batch of two instances x1 and x2 x1 and x2 are set in the input f54e lump x the sigmoid computes 1 1 exp x where x is its only argument clump f x 1 1 exp x when backward is called on the sigmoid lump its derivatives is a 2x1 mat object that contains the partial derivatives of the loss function dl x1 df dl x2 df now the backward method of the sigmoid needs to add dl x1 dx1 and dl x2 dx2 to derivatives of x now dl x1 dx1 dl x1 df df x1 dx1 and the first term is what we have in derivatives of the sigmoid so it only needs to calculate the second term in addition to the above clumps also have to support size 0 85d3 1 b99d n stripes 8dd7 max n stripes 16c4 and the setf 17b7 methods of the latter two which can be accomplished just by inheriting from bpn 5187 fnn 9de4 rnn b0f3 or a lump c1ac a id x 28mgl bp 3a 40mgl bpn 20mgl pax 3asection 29 a 11 3 bpn s a id x 28mgl bp 3abpn 20class 29 a class bpn clump a4fe abstract base class for fnn 9de4 and rnn b0f3 a id x 28mgl core 3an stripes 20 28mgl pax 3areader 20mgl bp 3abpn 29 29 a reader n stripes bpn n stripes 1 the current number of instances the network has this is automatically set to the number of instances passed to set input 0c9e so it rarely has to be manipulated directly although it can be set when set n stripes of all clumps 0 f7c1 1 a4fe get set to the same value a id x 28mgl core 3amax n stripes 20 28mgl pax 3areader 20mgl bp 3abpn 29 29 a reader max n stripes bpn max n stripes nil the maximum number of instances the 
network can operate on in parallel within build fnn 606c or build rnn 764b it defaults to max n stripes of that parent network else it defaults to 1 when set max n stripes of all clumps 0 f7c1 1 a4fe get set to the same value a id x 28mgl bp 3aclumps 20 28mgl pax 3areader 20mgl bp 3abpn 29 29 a reader clumps bpn clumps make array 0 element type clump adjustable t fill pointer t a topologically sorted adjustable array with a fill pointer that holds the clumps that make up the network clumps are added to it by add clump 82d8 or more often automatically when within a build fnn 606c or build rnn 764b rarely needed find clump 175f takes care of most uses a id x 28mgl bp 3afind clump 20function 29 a function find clump name bpn key errorp t find the clump with name among clumps 0 f7c1 1 a4fe of bpn as always names are compared with equal 96d0 if not found then return nil or signal an error depending on errorp a id x 28mgl bp 3aadd clump 20function 29 a function add clump clump bpn add clump to bpn max n stripes 16c4 of clump gets set to that of bpn it is an error to add a clump with a name already used by one of the clumps f7c1 of bpn a id x 28mgl bp 3a 40mgl bp training 20mgl pax 3asection 29 a 11 3 1 training bpn 5187 s are trained to minimize the loss function they compute before a bpn is passed to minimize 46a4 as its gradient source argument it must be wrapped in a bp learner 00a0 object bp learner has a monitors 6202 slot which is used for example by reset optimization monitors e4f9 without the bells and whistles the basic shape of training is this commonlisp minimize optimizer make instance bp learner bpn bpn dataset dataset a id x 28mgl bp 3abp learner 20class 29 a class bp learner a id x 28mgl bp 3abpn 20 28mgl pax 3areader 20mgl bp 3abp learner 29 29 a reader bpn bp learner bpn the bpn for which this bp learner 00a0 provides the gradients a id x 28mgl core 3amonitors 20 28mgl pax 3aaccessor 20mgl bp 3abp learner 29 29 a accessor monitors bp learner monitors nil a
list of monitor 7068 s a id x 28mgl bp 3a 40mgl bp monitoring 20mgl pax 3asection 29 a 11 3 2 monitoring a id x 28mgl bp 3amonitor bpn results 20function 29 a function monitor bpn results dataset bpn monitors for every batch of size max n stripes 16c4 of bpn of instances in dataset set the batch as the next input with set input 0c9e perform a forward c1ae pass and apply monitors to the bpn with apply monitors 0 989c 1 bbdf finally return the counters of monitors this is built on top of monitor model results e50c a id x 28mgl bp 3amake step monitor monitors 20function 29 a function make step monitor monitors rnn key counter values fn counter raw values make counter make step monitor monitor counter return a list of monitors one for every monitor in step monitors 71f9 of rnn these monitors extract the results from their warp counterparts with counter values fn and add them to their own counter that s created by make counter the idea is that one does something like this do monitor warped prediction commonlisp let warp time t setf step monitors rnn make cost monitors rnn attributes event warped pred monitor bpn results dataset rnn just collect and reset the warp monitors after each batch of instances make step monitor monitors rnn a id x 28mgl bp 3amake step monitor monitor counter 20generic function 29 a generic function make step monitor monitor counter step counter in an rnn b0f3 step counter aggregates results of all the time steps during the processing of instances in the current batch return a new counter into which results from step counter can be accumulated when the processing of the batch is finished the default implementation creates a copy of step counter a id x 28mgl bp 3a 40mgl fnn 20mgl pax 3asection 29 a 11 3 3 feed forward nets fnn 9de4 and rnn b0f3 have a lot in common see their common superclass bpn 5187 there is very limited functionality that s specific to fnn s so let s get them out of the way before we study a full example a id x 28mgl bp
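The monitoring calls described above are easiest to see in a tiny snippet. The following is a hypothetical sketch, not taken from the MGL examples: `trained-fnn` and `test-dataset` are assumed to already exist, and the call shape mirrors the `log-test-error` function of the tutorial further down.

```commonlisp
;;; Hypothetical sketch: TRAINED-FNN and TEST-DATASET are assumed to
;;; exist. Run forward passes over TEST-DATASET in batches of
;;; MAX-N-STRIPES, apply cost monitors, and log the counters returned
;;; by MONITOR-BPN-RESULTS.
(log-padded
 (monitor-bpn-results test-dataset trained-fnn
                      (make-cost-monitors trained-fnn
                                          :attributes '(:event "pred"))))
```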
3afnn 20class 29 a class fnn bpn 5187 a feed forward neural net as opposed to a recurrent one see rnn b0f3 a id x 28mgl bp 3abuild fnn 20mgl pax 3amacro 29 a macro build fnn key fnn class fnn initargs max n stripes name body clumps syntactic sugar to assemble fnn s from clump a4fe s like let 2739 it is a sequence of bindings of symbols to clump s the names of the clumps created default to the symbol of the binding in case a clump is not bound to a symbol because it was created in a nested expression the local function clump can be used to find the clump with the given name in the fnn being built example build fnn features input size n features biases weight size n features weights weight size n hiddens n features activations0 v m weights weights x clump features activations args list biases activations0 output sigmoid x activations a id x 28mgl bp 3a 40mgl fnn tutorial 20mgl pax 3asection 29 a fnn tutorial hopefully this example from example digit fnn lisp illustrates the concepts involved if it s too dense despite the comments then read up on datasets 109e gradient based optimization c74a and come back a id x 28mgl bp 3adigit fnn 2elisp 20 28mgl pax 3ainclude 20 23p 22 2fhome 2fmelisgl 2fown 2fmgl 2fexample 2fdigit fnn 2elisp 22 20 3aheader nl 20 22 60 60 60commonlisp 22 20 3afooter nl 20 22 60 60 60 22 29 29 a commonlisp cl defpackage mgl example digit fnn use common lisp mgl in package mgl example digit fnn there are 10 possible digits used as inputs defparameter n inputs 10 and we want to learn the rule that maps the input digit d to mod 1 d 3 defparameter n outputs 3 we define a feed forward net to be able to specialize how inputs are translated by adding a set input method later defclass digit fnn fnn build a digit fnn with a single hidden layer of rectified linear units and a softmax output defun make digit fnn key n hiddens 5 build fnn class digit fnn input input size n inputs hidden activation activation input size n hiddens hidden relu hidden activation 
output activation activation hidden size n outputs output softmax xe loss output activation this method is called with batches of instances input digits in this case by minimize and also by monitor bpn results before performing a forward pass i e computing the value of the function represented by the network its job is to encode the inputs by populating rows of the nodes matrix of the input clump each input is encoded as a row of zeros with a single 1 at index determined by the input digit this is called one hot encoding the target could be encoded the same way but instead we use the sparse option supported by target of softmax xe loss defmethod set input digits fnn digit fnn let input nodes find clump input fnn output lump find clump output fnn fill 0 input loop for i upfrom 0 for digit in digits do setf mref input i digit 1 setf target output lump mapcar lambda digit mod 1 digit n outputs digits train the network by minimizing the loss cross entropy here with stochastic gradient descent defun train digit fnn let optimizer first create the optimizer for minimize make instance segmented gd optimizer segmenter we train each weight lump with the same parameters and in fact the same optimizer but it need not be so in general constantly make instance sgd optimizer learning rate 1 momentum 0 9 batch size 100 fnn make digit fnn the number of instances the fnn can work with in parallel it s usually equal to the batch size or a divisor of it setf max n stripes fnn 50 initialize all weights randomly map segments lambda weights gaussian random nodes weights stddev 0 01 fnn arrange for training and test error to be logged monitor optimization periodically optimizer fn log test error period 10000 fn reset optimization monitors period 1000 finally start the optimization minimize optimizer dress fnn in a bp learner and attach monitors for the cost to it these monitors are going to be logged and reset after every 1000 training instances by reset optimization monitors above make
instance bp learner bpn fnn monitors make cost monitors fnn attributes event train training stops when the sampler runs out after 10000 instances dataset make sampler 10000 return a sampler object that produces max n samples number of random inputs numbers between 0 and 9 defun make sampler max n samples make instance function sampler max n samples max n samples generator lambda random n inputs log the test error also describe the optimizer and the bpn at the beginning of training called periodically during training see above defun log test error optimizer learner when zerop n instances optimizer describe optimizer describe bpn learner log padded monitor bpn results make sampler 1000 bpn learner make cost monitors bpn learner attributes event pred transcript follows repeatably let log time nil train digit fnn training at n instances 0 train cost 0 000e 0 0 segmented gd optimizer 100e112e93 segmented gd optimizer description n instances 0 optimizers sgd optimizer segment set weight size 15 1 1 norm 0 04473 weight size 3 1 1 norm 0 01850 weight size 50 1 1 norm 0 07159 weight size 5 1 1 norm 0 03056 100e335b73 100e06df83 segments weight hidden output activation size 15 1 1 norm 0 04473 weight bias output activation size 3 1 1 norm 0 01850 weight input hidden activation size 50 1 1 norm 0 07159 weight bias hidden activation size 5 1 1 norm 0 03056 sgd optimizer 100e06df83 gd optimizer description n instances 0 segment set segment set weight hidden output activation size 15 1 1 norm 0 04473 weight bias output activation size 3 1 1 norm 0 01850 weight input hidden activation size 50 1 1 norm 0 07159 weight bias hidden activation size 5 1 1 norm 0 03056 100e335b73 learning rate 1 00000e 0 momentum 9 00000e 1 momentum type normal weight decay 0 00000e 0 weight penalty 0 00000e 0 n after upate hook 0 batch size 100 batch gd optimizer description n before upate hook 0 digit fnn 100e11a423 bpn description clumps input input size 10 1 50 norm 0 00000 activation hidden 
activation activation stripes 1 50 clumps 4 relu hidden size 5 1 50 norm 0 00000 activation output activation activation stripes 1 50 clumps 4 softmax xe loss output size 3 1 50 norm 0 00000 n stripes 1 max n stripes 50 pred cost 1 100d 0 1000 00 training at n instances 1000 train cost 1 093d 0 1000 00 training at n instances 2000 train cost 5 886d 1 1000 00 training at n instances 3000 train cost 3 574d 3 1000 00 training at n instances 4000 train cost 1 601d 7 1000 00 training at n instances 5000 train cost 1 973d 9 1000 00 training at n instances 6000 train cost 4 882d 10 1000 00 training at n instances 7000 train cost 2 771d 10 1000 00 training at n instances 8000 train cost 2 283d 10 1000 00 training at n instances 9000 train cost 2 123d 10 1000 00 training at n instances 10000 train cost 2 263d 10 1000 00 pred cost 2 210d 10 1000 00 weight bias hidden activation size 5 1 1 norm 2 94294 weight input hidden activation size 50 1 1 norm 11 48995 weight bias output activation size 3 1 1 norm 3 39103 weight hidden output activation size 15 1 1 norm 11 39339 a id x 28mgl bp 3a 40mgl rnn 20mgl pax 3asection 29 a 11 3 4 recurrent neural nets a id x 28mgl bp 3a 40mgl rnn tutorial 20mgl pax 3asection 29 a rnn tutorial hopefully this example from example sum sign fnn lisp illustrates the concepts involved make sure you are comfortable with fnn tutorial 6b38 before reading this a id x 28mgl bp 3asum sig rnn 2elisp 20 28mgl pax 3ainclude 20 23p 22 2fhome 2fmelisgl 2fown 2fmgl 2fexample 2fsum sign rnn 2elisp 22 20 3aheader nl 20 22 60 60 60commonlisp 22 20 3afooter nl 20 22 60 60 60 22 29 29 a commonlisp cl defpackage mgl example sum sign rnn use common lisp mgl in package mgl example sum sign rnn there is a single input at each time step defparameter n inputs 1 and we want to learn the rule that outputs the sign of the sum of inputs so far in the sequence defparameter n outputs 3 generate a training example that s a sequence of random length between 1 and length elements 
of the sequence are lists of two elements 1 the input for the network a single random number 2 the sign of the sum of inputs so far encoded as 0 1 2 for negative zero and positive values to add a twist the sum is reset whenever a negative input is seen defun make sum sign instance key length 10 let length max 1 random length sum 0 loop for i below length collect let x 1 2 random 2 incf sum x when x 0 setq sum x list x cond minusp sum 0 zerop sum 1 t 2 build an rnn with a single lstm hidden layer and softmax output for each time step a sum sign fnn will be instantiated defun make sum sign rnn key n hiddens 1 build rnn build fnn class sum sign fnn input input size 1 h lstm input name h size n hiddens prediction softmax xe loss activation h name prediction size n outputs we define this class to be able to specialize how inputs are translated by adding a set input method later defclass sum sign fnn fnn we have a batch of instances from make sum sign instance for the rnn this function is invoked with elements of these instances belonging to the same time step i e at the same index and sets the input and target up defmethod set input instances fnn sum sign fnn let input nodes nodes find clump input fnn setf target find clump prediction fnn loop for stripe upfrom 0 for instance in instances collect sequences in the batch are not of equal length the rnn sends a nil our way if a sequence has run out when instance destructuring bind input target instance setf mref input nodes stripe 0 input target train the network by minimizing the loss cross entropy here with the adam optimizer defun train sum sign rnn let rnn make sum sign rnn setf max n stripes rnn 50 initialize the weights in the usual sqrt 1 fan in style map segments lambda weights let fan in mat dimension nodes weights 0 limit sqrt 6 fan in uniform random nodes weights limit 2 limit limit nodes weights rnn minimize monitor optimization periodically make instance adam optimizer learning rate 0 2 mean decay 0 9 mean 
decay decay 0 9 variance decay 0 9 batch size 100 fn log test error period 30000 fn reset optimization monitors period 3000 make instance bp learner bpn rnn monitors make cost monitors rnn dataset make sampler 30000 return a sampler object that produces max n samples number of random inputs defun make sampler max n samples key length 10 make instance function sampler max n samples max n samples generator lambda make sum sign instance length length log the test error also describe the optimizer and the bpn at the beginning of training called periodically during training see above defun log test error optimizer learner when zerop n instances optimizer describe optimizer describe bpn learner let rnn bpn learner log padded append monitor bpn results make sampler 1000 rnn make cost monitors rnn attributes event pred same result in a different way monitor predictions for sequences up to length 20 but don t unfold the rnn unnecessarily to save memory let warp time t monitor bpn results make sampler 1000 length 20 rnn just collect and reset the warp monitors after each batch of instances make cost monitors rnn attributes event warped pred verify that no further unfoldings took place assert length clumps rnn 10 log mat room transcript follows let backprop nets do not need double float using single floats is faster and needs less memory default mat ctype float enable moving data in and out of gpu memory so that the rnn can work with sequences so long that the unfolded network wouldn t otherwise fit in the gpu cuda window start time 1 log time nil seed the random number generators repeatably enable cuda if available with cuda train sum sign rnn training at n instances 0 cost 0 000e 0 0 adam optimizer 1006cd5663 gd optimizer description n instances 0 segment set segment set weight h size 1 1 1 norm 1 73685 weight h size 1 1 1 norm 0 31893 weight 1 2 peephole size 1 1 1 norm 1 81610 weight h 2 size 1 1 1 norm 0 21965 weight 1 3 peephole size 1 1 1 norm 1 74939 weight h 3 size 1 
1 1 norm 0 40377 weight h prediction size 3 1 1 norm 2 15898 weight bias prediction size 3 1 1 norm 2 94470 weight 1 4 peephole size 1 1 1 norm 0 97601 weight input 4 size 1 1 1 norm 0 65261 weight bias 4 size 1 1 1 norm 0 37653 weight input 1 size 1 1 1 norm 0 92334 weight bias 1 size 1 1 1 norm 0 01609 weight input 5 size 1 1 1 norm 1 09995 weight bias 5 size 1 1 1 norm 1 41244 weight input 6 size 1 1 1 norm 0 40475 weight bias 6 size 1 1 1 norm 1 75358 1006cd8753 learning rate 2 00000e 1 momentum none momentum type none weight decay 0 00000e 0 weight penalty 0 00000e 0 n after upate hook 0 batch size 100 batch gd optimizer description n before upate hook 0 adam optimizer description mean decay rate 1 00000e 1 mean decay rate decay 9 00000e 1 variance decay rate 1 00000e 1 variance adjustment 1 00000d 7 rnn 10047c77e3 bpn description clumps sum sign fnn stripes 1 50 clumps 4 sum sign fnn stripes 1 50 clumps 4 n stripes 1 max n stripes 50 rnn description max lag 1 pred cost 1 223e 0 4455 00 warped pred cost 1 228e 0 9476 00 foreign memory usage foreign arrays 162 used bytes 39 600 cuda memory usage device arrays 114 used bytes 220 892 pooled bytes 19 200 host arrays 162 used bytes 39 600 host device copies 6 164 device host copies 4 490 training at n instances 3000 cost 3 323e 1 13726 00 training at n instances 6000 cost 3 735e 2 13890 00 training at n instances 9000 cost 1 012e 2 13872 00 training at n instances 12000 cost 3 026e 3 13953 00 training at n instances 15000 cost 9 267e 4 13948 00 training at n instances 18000 cost 2 865e 4 13849 00 training at n instances 21000 cost 8 893e 5 13758 00 training at n instances 24000 cost 2 770e 5 13908 00 training at n instances 27000 cost 8 514e 6 13570 00 training at n instances 30000 cost 2 705e 6 13721 00 pred cost 1 426e 6 4593 00 warped pred cost 1 406e 6 9717 00 foreign memory usage foreign arrays 216 used bytes 52 800 cuda memory usage device arrays 148 used bytes 224 428 pooled bytes 19 200 host arrays 216 used 
bytes 52 800 host device copies 465 818 device host copies 371 990 weight h h output size 1 1 1 norm 0 10624 weight h h cell size 1 1 1 norm 0 94460 weight h cell h forget peephole size 1 1 1 norm 0 61312 weight h h forget size 1 1 1 norm 0 38093 weight h cell h input peephole size 1 1 1 norm 1 17956 weight h h input size 1 1 1 norm 0 88011 weight h prediction size 3 1 1 norm 49 93808 weight bias prediction size 3 1 1 norm 10 98112 weight h cell h output peephole size 1 1 1 norm 0 67996 weight input h output size 1 1 1 norm 0 65251 weight bias h output size 1 1 1 norm 10 23003 weight input h cell size 1 1 1 norm 5 98116 weight bias h cell size 1 1 1 norm 0 10681 weight input h forget size 1 1 1 norm 4 46301 weight bias h forget size 1 1 1 norm 1 57195 weight input h input size 1 1 1 norm 0 36401 weight bias h input size 1 1 1 norm 8 63833 a id x 28mgl bp 3arnn 20class 29 a class rnn bpn 5187 a recurrent neural net as opposed to a feed forward one it is typically built with build rnn 764b that s no more than a shallow convenience macro an rnn takes instances as inputs that are sequences of variable length at each time step the next unprocessed elements of these sequences are set as input until all input sequences in the batch run out to be able to perform backpropagation all intermediate lump c1ac s must be kept around so the recursive connections are transformed out by unfolding http en wikipedia org wiki backpropagation through time the network just how many lumps this means depends on the length of the sequences when an rnn is created max lag 1 bpn 5187 s are instantiated so that all weights are present and one can start training it a id x 28mgl bp 3aunfolder 20 28mgl pax 3areader 20mgl bp 3arnn 29 29 a reader unfolder rnn unfolder the unfolder of an rnn b0f3 is function of no arguments that builds and returns a bpn 5187 the unfolder is allowed to create networks with arbitrary topology even different ones for different time step 6e96 s with the help of lag ff5a 
or nested rnn s weights of the same name are shared between the folds that is if a weight b76f lump were to be created and a weight lump of the same name already exists then the existing lump will be added to the bpn created by unfolder a id x 28mgl bp 3amax lag 20 28mgl pax 3areader 20mgl bp 3arnn 29 29 a reader max lag rnn max lag 1 the networks built by unfolder 8e53 may contain new weights up to time step max lag beyond that point all weight lumps must be reappearances of weight lumps with the same name at previous time steps most recurrent networks reference only the state of lumps at the previous time step with the function lag ff5a hence the default of 1 but it is possible to have connections to arbitrary time steps the maximum connection lag must be specified when creating the rnn b0f3 a id x 28mgl bp 3acuda window start time 20 28mgl pax 3aaccessor 20mgl bp 3arnn 29 29 a accessor cuda window start time rnn cuda window start time cuda window start time due to unfolding the memory footprint of an rnn b0f3 is almost linear in the number of time steps i e the max sequence length for prediction this is addressed by time warp d0e3 for training we cannot discard results of previous time steps because they are needed for backpropagation but we can at least move them out of gpu memory if they are not going to be used for a while and copy them back before they are needed obviously this is only relevant if cuda is being used if cuda window start time is nil then this feature is turned off else during training at cuda window start time or later time steps matrices belonging to non weight lumps may be forced out of gpu memory and later brought back as needed this feature is implemented in terms of mgl mat with syncing cuda facets that uses cuda host memory also known as page locked or pinned memory to do asynchronous copies concurrently with normal computation the consequence of this is that it is now main memory usage that s unbounded which together with page
locking makes it a potent weapon to bring a machine to a halt you were warned a id x 28mgl bp 3a 2acuda window start time 2a 20variable 29 a variable cuda window start time nil the default for cuda window start time f573 a id x 28mgl bp 3abuild rnn 20mgl pax 3amacro 29 a macro build rnn key rnn class rnn name initargs max n stripes max lag 1 body body create an rnn with max n stripes and max lag whose unfolder 8e53 is body wrapped in a lambda bind symbol given as the rnn argument to the rnn object so that body can see it a id x 28mgl bp 3alag 20function 29 a function lag name key lag 1 rnn path in rnn or if it s nil the rnn being extended with another bpn 5187 called unfolding look up the clump a4fe with name in the bpn that s lag number of time steps before the bpn being added if this function is called from unfolder 8e53 of an rnn which is what happens behind the scenes in the body of build rnn 764b then it returns an opaque object representing a lagged connection to a clump else it returns the clump itself fixdoc path a id x 28mgl bp 3atime step 20function 29 a function time step key rnn rnn return the time step rnn is currently executing or being unfolded for it is 0 when the rnn is being unfolded for the first time a id x 28mgl core 3aset input 20 28method 20nil 20 28t 20mgl bp 3arnn 29 29 29 a method set input instances rnn rnn rnn s operate on batches of instances just like fnn 9de4 s but the instances here are like datasets sequences or samplers and they are turned into sequences of batches of instances with map datasets 0 765c 1 17d3 impute nil the batch of instances at index i is clamped onto the bpn 5187 at time step i with set input when the input sequences in the batch are not of the same length already exhausted sequences will produce nil due to impute nil above when such a nil is clamped with set input on a bpn of the rnn set input must set the importance 038e of the error lumps to 0 else training would operate on the noise left there by previous
invocations a id x 28mgl bp 3a 40mgl rnn time warp 20mgl pax 3asection 29 a time warp the unbounded memory usage of rnn b0f3 s with one bpn 5187 allocated per time step can become a problem for training where the gradients often have to be backpropagated from the last time step to the very beginning this is hard to solve but with cuda window start time f573 the limit is no longer gpu memory for prediction on the other hand one doesn t need to keep old steps around indefinitely they can be discarded when future time steps will never reference them again a id x 28mgl bp 3a 2awarp time 2a 20variable 29 a variable warp time nil controls whether warping is enabled see time warp d0e3 don t enable it for training as it would make backprop impossible a id x 28mgl bp 3awarped time 20function 29 a function warped time key rnn rnn time time step rnn rnn lag 0 return the index of the bpn 5187 in clumps 0 f7c1 1 a4fe of rnn whose task it is to execute computation at time step rnn lag this is normally the same as time step 6e96 disregarding lag that is clumps can be indexed by time step to get the bpn however when warp time ed4f is true execution proceeds in a cycle as the structure of the network allows suppose we have a typical rnn that only ever references the previous time step so its max lag 084d is 1 its unfolder 8e53 returns bpn s of identical structure bar a shift in their time lagged connections except for the very first so warp start d6e0 and warp length 51d5 are both 1 if warp time is nil then the mapping from time step to the bpn in clumps is straightforward time 0 1 2 3 4 5 warped 0 1 2 3 4 5 bpn b0 b1 b2 b3 b4 b5 when warp time is true we reuse the b1 b2 bpns in a loop time 0 1 2 3 4 5 warped 0 1 2 1 2 1 bpn b0 b1 b2 b1* b2* b1* here b1* is the same bpn as b1 but its connections created by lag go through warped time and end up referencing b2 this way memory consumption is independent of the number of time steps needed to process a sequence or make predictions to be able to
pull this trick off warp start and warp length must be specified when the rnn is instantiated in general with warp time warp start max 2 warp length bpns are needed the 2 comes from the fact that with cycle length 1 a bpn would need to take its input from itself which is problematic because it has nodes cc1c for only one set of values a id x 28mgl bp 3awarp start 20 28mgl pax 3areader 20mgl bp 3arnn 29 29 a reader warp start rnn warp start 1 the time step 6e96 from which unfolder 8e53 will create bpn 5187 s that essentially repeat every warp length 51d5 steps a id x 28mgl bp 3awarp length 20 28mgl pax 3areader 20mgl bp 3arnn 29 29 a reader warp length rnn warp length 1 an integer such that the bpn 5187 unfolder 8e53 creates at time step i where warp start i is identical to the bpn created at time step warp start mod i warp start warp length except for a shift in its time lagged connections a id x 28mgl bp 3astep monitors 20 28mgl pax 3aaccessor 20mgl bp 3arnn 29 29 a accessor step monitors rnn step monitors nil during training unfolded bpn 5187 s corresponding to previous time steps may be expensive to get at because they are no longer in gpu memory this consideration also applies to making prediction with the additional caveat that with warp time ed4f true previous states are discarded so it s not possible to gather statistics after forward c1ae finished add monitor objects to this slot and they will be automatically applied to the rnn b0f3 after each step when forward ing the rnn during training or prediction to be able to easily switch between sets of monitors in addition to a list of monitors this can be a symbol or a function too if it s a symbol then it s a designator for its symbol value 920f if it s a function then it must have no arguments and it s a designator for its return value a id x 28mgl bp 3a 40mgl bp lumps 20mgl pax 3asection 29 a 11 4 lumps a id x 28mgl bp 3a 40mgl bp lump 20mgl pax 3asection 29 a 11 4 1 lump base class a id x 28mgl bp 3alump
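The index mapping described in the time warp section above can be summed up as a small standalone function. This is an illustrative sketch only, not MGL's actual implementation: it assumes the documented warp-start and warp-length readers and a cycle of length (max 2 warp-length), as implied by the bpn count formula above.

```commonlisp
;;; Illustrative sketch only (not MGL's implementation): identity
;;; before WARP-START, then a cycle of length (max 2 warp-length).
(defun warped-index (time warp-start warp-length)
  (if (< time warp-start)
      time
      (+ warp-start (mod (- time warp-start)
                         (max 2 warp-length)))))

;;; With warp-start = 1 and warp-length = 1 as in the example above:
;;; (loop for time below 6 collect (warped-index time 1 1))
;;; => (0 1 2 1 2 1)
```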
20class 29 a class lump clump a4fe a lump is a simple layerlike component of a neural network there are many kinds of lumps each of which performs a specific operation or just stores inputs and weights by convention the names of lumps start with the -> prefix defined as classes they also have a function of the same name as the class to create them easily these maker functions typically have keyword arguments corresponding to initargs of the class with some mainly the input lumps turned into normal positional arguments so instead of having to do make instance ->tanh x some input name my tanh one can simply write ->tanh some input name my tanh lumps instantiated in any way within a build fnn 606c or build rnn 764b are automatically added to the network being built a lump has its own nodes cc1c and derivatives a81b matrices allocated for it in which the results of the forward and backward passes are stored this is in contrast to a bpn 5187 whose nodes and derivatives are those of its last constituent clump a4fe since lumps almost always live within a bpn their n stripes 07fb and max n stripes 91a3 are handled automagically behind the scenes a id x 28mgl common 3asize 20 28mgl pax 3areader 20mgl bp 3alump 29 29 a reader size lump size the number of values in a single stripe a id x 28mgl common 3adefault value 20 28mgl pax 3areader 20mgl bp 3alump 29 29 a reader default value lump default value 0 upon creation or resize the lump s nodes get filled with this value a id x 28mgl bp 3adefault size 20generic function 29 a generic function default size lump return a default for the size 85d3 of lump if one is not supplied at instantiation the value is often computed based on the sizes of the inputs this function is for implementing new lump types a id x 28mgl common 3anodes 20 28mgl pax 3areader 20mgl bp 3alump 29 29 a reader nodes lump nil the values computed by the lump in the forward pass are stored here it is an n stripes size matrix that has storage allocated for max n stripes
size elements for non weight lumps weight b76f lumps have no stripes nor restrictions on their shape a id x 28mgl bp 3aderivatives 20 28mgl pax 3areader 20mgl bp 3alump 29 29 a reader derivatives lump the derivatives computed in the backward pass are stored here this matrix is very much like nodes d699 in shape and size a id x 28mgl bp 3a 40mgl bp inputs 20mgl pax 3asection 29 a 11 4 2 inputs a id x 28mgl bp 3a 40mgl bp input lump 20mgl pax 3asection 29 a input lump a id x 28mgl bp 3a 3einput 20class 29 a class input dropout 441b a lump that has no input lumps does not change its values in the forward pass except when dropout e7f6 is non zero and does not compute derivatives clamp inputs on nodes cc1c of input lumps in set input 0c9e for convenience input can perform dropout itself although it defaults to no dropout common lisp input size 10 name some input input some input size 10 1 1 norm 0 00000 a id x 28mgl bp 3adropout 20 28mgl pax 3aaccessor 20mgl bp 3a 3einput 29 29 a accessor dropout input nil see dropout 2481 a id x 28mgl bp 3a 40mgl bp embedding lump 20mgl pax 3asection 29 a embedding lump this lump is like an input and a simple activation molded together in the name of efficiency a id x 28mgl bp 3a 3eembedding 20class 29 a class embedding lump c1ac select rows of weights 0 3c96 1 a896 one row for each index in input row indices 5e58 this lump is equivalent to adding an input f54e lump with a one hot encoding scheme and a v m dbc4 lump on top of it but it is more efficient in execution and in memory usage because it works with a sparse representation of the input the size 0 85d3 1 b99d of this lump is the number of columns of weights which is determined automatically common lisp embedding weights weight name embedding weights dimensions 3 5 name embeddings embedding embeddings size 5 1 1 norm 0 00000 a id x 28mgl common 3aweights 20 28mgl pax 3areader 20mgl bp 3a 3eembedding 29 29 a reader weights embedding weights a weight lump whose rows indexed by 
input row indices 5e58 are copied to the output of this lump a id x 28mgl bp 3ainput row indices 20 28mgl pax 3areader 20mgl bp 3a 3eembedding 29 29 a reader input row indices embedding input row indices a sequence of batch size length of row indices to be set in set input 0c9e a id x 28mgl bp 3a 40mgl bp weight lump 20mgl pax 3asection 29 a 11 4 3 weight lump a id x 28mgl bp 3a 3eweight 20class 29 a class weight lump c1ac a set of optimizable parameters of some kind when a bpn 5187 is trained see training 0d82 the nodes cc1c of weight lumps will be changed weight lumps perform no computation weights can be created by specifying the total size or the dimensions common lisp dimensions weight size 10 name w 1 10 dimensions weight dimensions 5 10 name w 5 10 a id x 28mgl bp 3adimensions 20 28mgl pax 3areader 20mgl bp 3a 3eweight 29 29 a reader dimensions weight dimensions nodes cc1c and derivatives a81b of this lump will be allocated with these dimensions a id x 28mgl bp 3awith weights copied 20mgl pax 3amacro 29 a macro with weights copied from bpn body body in body weight b76f will first look up if a weight lump of the same name exists in from bpn and return that or else create a weight lump normally if from bpn is nil then no weights are copied a id x 28mgl bp 3a 40mgl bp activations 20mgl pax 3asection 29 a 11 4 4 activations a id x 28mgl bp 3a 40mgl bp activation subnet 20mgl pax 3asection 29 a activation subnet so we have some inputs usually the next step is to multiply the input vector with a weight matrix and add biases this can be done directly with v m dbc4 and weight b76f but it s more convenient to use activation subnets to reduce the clutter a id x 28mgl bp 3a 3eactivation 20class 29 a class activation bpn 5187 activation subnetworks are built by the function activation b602 and they have a number of lumps hidden inside them ultimately this subnetwork computes a sum like sum i x i w i sum j y j v j biases where x i are input lumps w i are dense
matrices representing connections while v j are peephole connection vectors that are multiplied in an elementwise manner with their corresponding input y j a id x 28mgl bp 3a 3eactivation 20function 29 a function activation inputs key name gensym size peepholes add bias p t create a subnetwork of class activation 7162 that computes the over activation from dense connection from lumps in inputs and elementwise connection from lumps in peepholes create new weight b76f lumps as necessary inputs and peepholes can be a single lump or a list of lumps finally if add bias p then add an elementwise bias too size must be specified explicitly because it is not possible to determine it unless there are peephole connections common lisp activation input size 10 name input name h1 size 4 activation h1 activation stripes 1 1 clumps 4 this is the basic workhorse of neural networks which takes care of the linear transformation whose results are then fed to some non linearity sigmoid 83f9 tanh 5309 etc the name of the subnetwork clump is name activation the bias weight lump if any is named bias name dense connection weight lumps are named after the input and name name input name while peepholes weight lumps are named name input name peephole this is useful to know if for example they are to be initialized differently a id x 28mgl bp 3a 40mgl bp batch normalization 20mgl pax 3asection 29 a batch normalization a id x 28mgl bp 3a 3ebatch normalized 20class 29 a class batch normalized lump c1ac this is an implementation of v3 of the batch normalization paper http arxiv org abs 1502 03167 the output of batch normalized is its input normalized so that for all elements the mean across stripes is zero and the variance is 1 that is the mean of the batch is subtracted from the inputs and they are rescaled by their sample stddev actually after the normalization step the values are rescaled and shifted but this time with learnt parameters in order to keep the representational power of
the model the same the primary purpose of this lump is to speed up learning but it also acts as a regularizer see the paper for the details to normalize the output of lump with no additional regularizer effect commonlisp batch normalized lump batch size use population the above uses an exponential moving average to estimate the mean and variance of batches and these estimations are used at both training and test time in contrast to this the published version uses the sample mean and variance of the current batch at training time which injects noise into the process the noise is higher for lower batch sizes and has a regularizing effect this is the default behavior equivalent to batch size nil commonlisp batch normalized lump for performance reasons one may wish to process a higher number of instances in a batch in the sense of n stripes 8dd7 and get the regularization effect associated with a lower batch size this is possible by setting batch size to a divisor of the number of stripes say the number of stripes is 128 but we want as much regularization as we would get with 32 commonlisp batch normalized lump batch size 32 the primary input of batch normalized is often an activation 0 7162 1 b602 and its output is fed into an activation function see activation functions 5d86 a id x 28mgl bp 3abatch normalization 20 28mgl pax 3areader 20mgl bp 3a 3ebatch normalized 29 29 a reader batch normalization batch normalized normalization the batch normalization c469 of this lump may be shared between multiple batch normalized 9da9 lumps batch normalization is special in that it has state apart from the computed results nodes cc1c and its derivatives derivatives a81b this state is the estimated mean and variance of its inputs and they are encapsulated by batch normalization if normalization is not given at instantiation then a new batch normalization object will be created automatically passing batch size variance adjustment and population decay arguments on to batch
normalization see batch size c918 variance adjustment aa86 and population decay 46c4 new scale and shift weight lumps will be created with names name scale name shift where name is the name 0 0414 1 f3bc of this lump this default behavior covers the use case where the statistics kept by batch normalization are to be shared only between time steps of an rnn b0f3 a id x 28mgl bp 3a 3ebatch normalization 20class 29 a class batch normalization weight b76f the primary purpose of this class is to hold the estimated mean and variance of the inputs to be normalized and allow them to be shared between multiple batch normalized 9da9 lumps that carry out the computation these estimations are saved and loaded by save state c102 and load state 6bd7 commonlisp batch normalization weight name h1 scale size 10 weight name h1 shift size 10 name h1 batch normalization a id x 28mgl common 3ascale 20 28mgl pax 3areader 20mgl bp 3a 3ebatch normalization 29 29 a reader scale batch normalization scale a weight lump of the same size as shift 7960 this is gamma in the paper a id x 28mgl bp 3ashift 20 28mgl pax 3areader 20mgl bp 3a 3ebatch normalization 29 29 a reader shift batch normalization shift a weight lump of the same size as scale c9d1 this is beta in the paper a id x 28mgl common 3abatch size 20 28mgl pax 3areader 20mgl bp 3a 3ebatch normalization 29 29 a reader batch size batch normalization batch size nil normally all stripes participate in the batch lowering the number of stripes may increase the regularization effect but it also makes the computation less efficient by setting batch size to a divisor of n stripes 8dd7 one can decouple the concern of efficiency from that of regularization the default value nil is equivalent to n stripes batch size only affects training with the special value use population instead of the mean and the variance of the current batch use the population statistics for normalization this effectively cancels the regularization effect leaving only the 
faster learning a id x 28mgl gd 3avariance adjustment 20 28mgl pax 3areader 20mgl bp 3a 3ebatch normalization 29 29 a reader variance adjustment batch normalization variance adjustment 1 0e 4 a small positive real number that s added to the sample variance this is epsilon in the paper a id x 28mgl bp 3apopulation decay 20 28mgl pax 3areader 20mgl bp 3a 3ebatch normalization 29 29 a reader population decay batch normalization population decay 0 99 while training an exponential moving average of batch means and standard deviations termed population statistics is updated when making predictions normalization is performed using these statistics these population statistics are persisted by save state c102 a id x 28mgl bp 3a 3ebatch normalized activation 20function 29 a function batch normalized activation inputs key name gensym size peepholes batch size variance adjustment population decay a utility function that creates and wraps an activation 0 7162 1 b602 in batch normalized 9da9 and with its batch normalization eaf1 the two weight lumps for the scale and shift parameters batch normalized activation inputs name h1 size 10 is equivalent to commonlisp batch normalized activation inputs name h1 size 10 add bias p nil name h1 batch normalized activation note how biases are turned off since normalization will cancel them anyway but a shift is added which amounts to the same effect a id x 28mgl bp 3a 40mgl bp activation functions 20mgl pax 3asection 29 a 11 4 5 activation functions now we are moving on to the most important non linearities to which activations are fed a id x 28mgl bp 3a 40mgl bp sigmoid lump 20mgl pax 3asection 29 a sigmoid lump a id x 28mgl bp 3a 3esigmoid 20class 29 a class sigmoid dropout 441b lump c1ac applies the function 1 / (1 + e^(-x)) elementwise to its inputs this is one of the classic non linearities for neural networks for convenience sigmoid can perform dropout itself although it defaults to no dropout common lisp sigmoid activation input size 10 size
5 name this sigmoid this size 5 1 1 norm 0 00000 the size 0 85d3 1 b99d of this lump is the size of its input which is determined automatically a id x 28mgl bp 3adropout 20 28mgl pax 3aaccessor 20mgl bp 3a 3esigmoid 29 29 a accessor dropout sigmoid nil see dropout 2481 a id x 28mgl bp 3a 40mgl bp tanh lump 20mgl pax 3asection 29 a tanh lump a id x 28mgl bp 3a 3etanh 20class 29 a class tanh lump c1ac applies the tanh 7582 function to its input in an elementwise manner the size 0 85d3 1 b99d of this lump is the size of its input which is determined automatically a id x 28mgl bp 3a 40mgl bp scaled tanh lump 20mgl pax 3asection 29 a scaled tanh lump a id x 28mgl bp 3a 3escaled tanh 20class 29 a class scaled tanh lump c1ac pretty much like tanh 7582 but its input and output are scaled in such a way that the variance of its output is close to 1 if the variance of its input is close to 1 which is a nice property to combat vanishing gradients the actual function is 1.7159 * tanh(2/3 * x) the size 0 85d3 1 b99d of this lump is the size of its input which is determined automatically a id x 28mgl bp 3a 40mgl bp relu lump 20mgl pax 3asection 29 a relu lump we are somewhere around year 2007 by now a id x 28mgl bp 3a 3erelu 20class 29 a class relu lump c1ac the max(0, x) activation function be careful relu units can get stuck in the off state if they move too far into negative territory it can be very difficult to get out of it the size 0 85d3 1 b99d of this lump is the size of its input which is determined automatically a id x 28mgl bp 3a 40mgl bp max lump 20mgl pax 3asection 29 a max lump we are in about year 2011 a id x 28mgl bp 3a 3emax 20class 29 a class max lump c1ac this is basically maxout without dropout see http arxiv org abs 1302 4389 it groups its inputs by group size 59dd and outputs the maximum of each group the size 0 85d3 1 b99d of the output is automatically calculated it is the size of the input divided by group size 59dd common lisp max input size 120 group size 3 name my
max max my max size 40 1 1 norm 0 00000 group size 3 the advantage of max over relu 9d3a is that gradient flow is never stopped so there is no problem of units getting stuck in off state a id x 28mgl common 3agroup size 20 28mgl pax 3areader 20mgl bp 3a 3emax 29 29 a reader group size max group size the number of inputs in each group a id x 28mgl bp 3a 40mgl bp min lump 20mgl pax 3asection 29 a min lump a id x 28mgl bp 3a 3emin 20class 29 a class min lump c1ac same as max f652 but it computes the min 40e2 of groups rarely useful a id x 28mgl common 3agroup size 20 28mgl pax 3areader 20mgl bp 3a 3emin 29 29 a reader group size min group size the number of inputs in each group a id x 28mgl bp 3a 40mgl bp max channel lump 20mgl pax 3asection 29 a max channel lump a id x 28mgl bp 3a 3emax channel 20class 29 a class max channel lump c1ac called lwta local winner take all or channel out see http arxiv org abs 1312 1909 in the literature it is basically max f652 but instead of producing one output per group it just produces zeros for all units but the one with the maximum value in the group this allows the next layer to get some information about the path along which information flowed the size 0 85d3 1 b99d of this lump is the size of its input which is determined automatically a id x 28mgl common 3agroup size 20 28mgl pax 3areader 20mgl bp 3a 3emax channel 29 29 a reader group size max channel group size the number of inputs in each group a id x 28mgl bp 3a 40mgl bp losses 20mgl pax 3asection 29 a 11 4 6 losses ultimately we need to tell the network what to learn which means that the loss function to be minimized needs to be constructed as part of the network a id x 28mgl bp 3a 40mgl bp loss lump 20mgl pax 3asection 29 a loss lump a id x 28mgl bp 3a 3eloss 20class 29 a class loss sum edcf calculate the loss for the instances in the batch the main purpose of this lump is to provide a training signal an error lump is usually a leaf in the graph of lumps i e there are no
other lumps whose input is this one the special thing about error lumps is that 1 but see importance 038e is added automatically to their derivatives error lumps have exactly one node per stripe whose value is computed as the sum of nodes in their input lump a id x 28mgl bp 3aimportance 20 28mgl pax 3aaccessor 20mgl bp 3a 3eloss 29 29 a accessor importance loss importance nil this is to support weighted instances that is when not all training instances are equally important if non nil a 1d mat with the importances of stripes of the batch when importance is given typically in set input 0c9e then instead of adding 1 to the derivatives of all stripes importance is added elementwise a id x 28mgl bp 3a 40mgl bp squared difference lump 20mgl pax 3asection 29 a squared difference lump in regression the squared error loss is most common the squared error loss can be constructed by combining squared difference e8d2 with a loss 2171 a id x 28mgl bp 3a 3esquared difference 20class 29 a class squared difference lump c1ac this lump takes two input lumps and calculates their squared difference (x - y)^2 in an elementwise manner the size 0 85d3 1 b99d of this lump is automatically determined from the size of its inputs this lump is often fed into loss 2171 that sums the squared differences and makes it part of the function to be minimized common lisp loss squared difference activation input size 100 size 10 input name target size 10 name squared error loss squared error size 1 1 1 norm 0 00000 currently this lump is not cudaized but it will copy data from the gpu if it needs to a id x 28mgl bp 3a 40mgl bp softmax xe loss lump 20mgl pax 3asection 29 a softmax cross entropy loss lump a id x 28mgl bp 3a 3esoftmax xe loss 20class 29 a class softmax xe loss lump c1ac a specialized lump that computes the softmax of its input in the forward pass and backpropagates a cross entropy loss the advantage of doing these together is numerical stability the total cross entropy is the sum of cross
entropies per group of group size a437 elements xe(x) = - sum(i = 1..g, t_i * ln(s_i)) where g is the number of classes group size a437 t_i are the targets i e the true probabilities of the class often all zero but one s_i is the output of softmax calculated from input x s_i = softmax(x_1, x_2, ..., x_g)_i = e^(x_i) / sum(j = 1..g, e^(x_j)) in other words in the forward phase this lump takes input x computes its elementwise exp 45be normalizes each group of group size a437 elements to sum to 1 to get the softmax which is the result that goes into nodes cc1c in the backward phase there are two sources of gradients the lumps that use the output of this lump as their input currently not implemented and would result in an error and an implicit cross entropy loss one can get the cross entropy calculated in the most recent forward pass by calling cost 410c on this lump this is the most common loss function for classification in fact it is nearly ubiquitous see the fnn tutorial 6b38 and the rnn tutorial 9700 for how this loss and set input 0c9e work together a id x 28mgl common 3agroup size 20 28mgl pax 3areader 20mgl bp 3a 3esoftmax xe loss 29 29 a reader group size softmax xe loss group size the number of elements in a softmax group this is the number of classes for classification often group size is equal to size 0 85d3 1 b99d it is the default but in general the only constraint is that size is a multiple of group size a id x 28mgl common 3atarget 20 28mgl pax 3aaccessor 20mgl bp 3a 3esoftmax xe loss 29 29 a accessor target softmax xe loss target nil set in set input 0c9e this is either a mat of the same size as the input lump x or if the target is very sparse this can also be a sequence of batch size length that contains the index value pairs of non zero entries first instance in batch has two non zero targets class 10 has 30% expected probability 10 0 3 class 2 has 70% expected probability 2 0 7 second instance in batch puts 100% on class 7 7 more instances in the batch follow actually in the rare case
where group size a437 is not size 0 85d3 1 b99d i e there are several softmax normalization groups for every example the length of the above target sequence is batch size 0 8916 1 ba18 2 c918 * n groups indices are always relative to the start of the group if group size a437 is large for example in neural language models with a huge number of words using sparse targets can make things go much faster because calculation of the derivative is no longer quadratic giving different weights to training instances is implicitly supported while target values in a group should sum to 1 multiplying all target values with a weight w is equivalent to training that w times on the same example a id x 28mgl bp 3aensure softmax target matrix 20function 29 a function ensure softmax target matrix softmax xe loss n set target b5c7 of softmax xe loss to a mat capable of holding the dense target values for n stripes a id x 28mgl bp 3a 40mgl bp stochasticity 20mgl pax 3asection 29 a 11 4 7 stochasticity a id x 28mgl bp 3a 40mgl bp dropout lump 20mgl pax 3asection 29 a dropout lump a id x 28mgl bp 3a 3edropout 20class 29 a class dropout lump c1ac the output of this lump is identical to its input except it randomly zeroes out some of them during training which acts as a very strong regularizer see geoffrey hinton s improving neural networks by preventing co adaptation of feature detectors the size 0 85d3 1 b99d of this lump is the size of its input which is determined automatically a id x 28mgl bp 3adropout 20 28mgl pax 3aaccessor 20mgl bp 3a 3edropout 29 29 a accessor dropout dropout dropout 0 5 if non nil then in the forward pass zero out each node in this chunk with dropout probability a id x 28mgl bp 3a 40mgl bp gaussian random lump 20mgl pax 3asection 29 a gaussian random lump a id x 28mgl bp 3a 3egaussian random 20class 29 a class gaussian random lump c1ac this lump has no input it produces normally distributed independent random numbers with mean d96a and variance 404c or variance for
prediction 80e2 this is a useful building block for noise based regularization methods common lisp gaussian random size 10 name normal mean 1 variance 2 gaussian random normal size 10 1 1 norm 0 00000 a id x 28mgl bp 3amean 20 28mgl pax 3aaccessor 20mgl bp 3a 3egaussian random 29 29 a accessor mean gaussian random mean 0 the mean of the normal distribution a id x 28mgl bp 3avariance 20 28mgl pax 3aaccessor 20mgl bp 3a 3egaussian random 29 29 a accessor variance gaussian random variance 1 the variance of the normal distribution a id x 28mgl bp 3avariance for prediction 20 28mgl pax 3aaccessor 20mgl bp 3a 3egaussian random 29 29 a accessor variance for prediction gaussian random variance for prediction 0 if not nil then this value overrides variance 404c when not in training i e when making predictions a id x 28mgl bp 3a 40mgl bp sample binary lump 20mgl pax 3asection 29 a binary sampling lump a id x 28mgl bp 3a 3esample binary 20class 29 a class sample binary lump c1ac treating values of its input as probabilities sample independent binomials turn true into 1 and false into 0 the size 0 85d3 1 b99d of this lump is determined automatically from the size of its input common lisp sample binary input size 10 name binarized input sample binary binarized input size 10 1 1 norm 0 00000 a id x 28mgl bp 3a 40mgl bp arithmetic 20mgl pax 3asection 29 a 11 4 8 arithmetic a id x 28mgl bp 3a 40mgl bp sum lump 20mgl pax 3asection 29 a sum lump a id x 28mgl bp 3a 3esum 20class 29 a class sum lump c1ac computes the sum of all nodes of its input per stripe the size 0 85d3 1 b99d of this lump is always 1 a id x 28mgl bp 3a 40mgl bp v 2am lump 20mgl pax 3asection 29 a vector matrix multiplication lump a id x 28mgl bp 3a 3ev 2am 20class 29 a class v m lump c1ac performs x * weights where x the input is of size m and weights 0 3c96 1 a896 is a weight b76f whose single stripe is taken to be of dimensions m x n stored in row major order n is the size of this lump if transpose weights p 533e
then weights is n x m and x * weights' is computed a id x 28mgl common 3aweights 20 28mgl pax 3areader 20mgl bp 3a 3ev 2am 29 29 a reader weights v m weights a weight b76f lump a id x 28mgl bp 3atranspose weights p 20 28mgl pax 3areader 20mgl bp 3a 3ev 2am 29 29 a reader transpose weights p v m transpose weights p nil determines whether the input is multiplied by weights 0 3c96 1 a896 or its transpose a id x 28mgl bp 3a 40mgl bp 2b lump 20mgl pax 3asection 29 a elementwise addition lump a id x 28mgl bp 3a 3e 2b 20class 29 a class lump c1ac performs elementwise addition on its input lumps the size 0 85d3 1 b99d of this lump is automatically determined from the size of its inputs if there is at least one if one of the inputs is a weight b76f lump then it is added to every stripe common lisp list input size 10 weight size 10 name bias name plus plus size 10 1 1 norm 0 00000 a id x 28mgl bp 3a 40mgl bp 2a lump 20mgl pax 3asection 29 a elementwise multiplication lump a id x 28mgl bp 3a 3e 2a 20class 29 a class lump c1ac performs elementwise multiplication on its two input lumps the size 0 85d3 1 b99d of this lump is automatically determined from the size of its inputs either input can be a weight b76f lump common lisp input size 10 weight size 10 name scale name mult mult size 10 1 1 norm 0 00000 a id x 28mgl bp 3a 40mgl bp abs lump 20mgl pax 3asection 29 a abs lump a id x 28mgl bp 3a 3eabs 20class 29 a class abs lump c1ac a id x 28mgl bp 3a 40mgl bp exp lump 20mgl pax 3asection 29 a exp lump a id x 28mgl bp 3a 3eexp 20class 29 a class exp lump c1ac a id x 28mgl bp 3a 40mgl bp normalized lump 20mgl pax 3asection 29 a normalized lump a id x 28mgl bp 3a 3enormalized 20class 29 a class normalized lump c1ac a id x 28mgl bp 3a 40mgl bp rnn operations 20mgl pax 3asection 29 a 11 4 9 operations for rnn s a id x 28mgl bp 3a 40mgl bp lstm subnet 20mgl pax 3asection 29 a lstm subnet a id x 28mgl bp 3a 3elstm 20class 29 a class lstm bpn 5187 long short term memory subnetworks are
built by the function lstm 2823 and they have many lumps hidden inside them these lumps are packaged into a subnetwork to reduce clutter a id x 28mgl bp 3a 3elstm 20function 29 a function lstm inputs key name cell init output init size activation fn activation gate fn sigmoid input fn tanh output fn tanh peepholes t create an lstm layer consisting of input forget output gates with which input cell state and output are scaled lots of lumps are created the final one representing the output of the lstm has name the rest of the lumps are named automatically based on name this function returns only the output lump m but all created lumps are added automatically to the bpn 5187 being built there are many papers and tutorials on lstms this version is well described in long short term memory recurrent neural network architectures for large scale acoustic modeling 2014 hasim sak andrew senior francoise beaufays using the notation from that paper i_t = s(w_ix x_t + w_im m_t-1 + w_ic ⊙ c_t-1 + b_i), f_t = s(w_fx x_t + w_fm m_t-1 + w_fc ⊙ c_t-1 + b_f), c_t = f_t ⊙ c_t-1 + i_t ⊙ g(w_cx x_t + w_cm m_t-1 + b_c), o_t = s(w_ox x_t + w_om m_t-1 + w_oc ⊙ c_t + b_o), m_t = o_t ⊙ h(c_t), where i f and o are the input forget and output gates c is the cell state and m is the actual output weight matrices for connections from c w_ic w_fc and w_oc are diagonal and represented by just the vector of diagonal values these connections are only added if peepholes is true a notable difference from the paper is that in addition to being a single lump x_t inputs can also be a list of lumps whenever some activation is to be calculated based on x_t it is going to be the sum of individual activations for example w_ix x_t is really sum_j w_ix,j inputs_j if cell init is non nil then it must be a clump a4fe of size form which stands for the initial state of the value cell c_-1 cell init being nil is equivalent to the state of all zeros activation fn defaults to activation 0 7162 1 b602 but it can be for example batch normalized
activation 0f0f in general functions like the aforementioned two with signature like inputs key name size peepholes can be passed as activation fn a id x 28mgl bp 3a 40mgl bp seq barrier lump 20mgl pax 3asection 29 a sequence barrier lump a id x 28mgl bp 3a 3eseq barrier 20class 29 a class seq barrier lump c1ac in an rnn b0f3 processing of stripes instances in the batch may require a different number of time steps so the final state for stripe 0 is in stripe 0 of some lump l at time step 7 while for stripe 1 it is in stripe 1 of some lump l at time step 42 this lump copies the per stripe states from different lumps into a single lump so that further processing can take place typically when the rnn is embedded in another network the size 0 85d3 1 b99d of this lump is automatically set to the size of the lump returned by funcall seq elt fn 0 a id x 28mgl bp 3aseq elt fn 20 28mgl pax 3areader 20mgl bp 3a 3eseq barrier 29 29 a reader seq elt fn seq barrier seq elt fn a function of an index argument that returns the lump with that index in some sequence a id x 28mgl bp 3aseq indices 20 28mgl pax 3aaccessor 20mgl bp 3a 3eseq barrier 29 29 a accessor seq indices seq barrier a sequence of length batch size of indices the element at index i is the index to be passed to seq elt fn 29c0 to find the lump whose stripe i is copied to stripe i of this lump a id x 28mgl bp 3a 40mgl bp utilities 20mgl pax 3asection 29 a 11 5 utilities a id x 28mgl bp 3arenormalize activations 20function 29 a function renormalize activations v m lumps l2 upper bound if the l2 norm of the incoming weight vector of a unit is larger than l2 upper bound then renormalize it to l2 upper bound the list of v m lumps is assumed to be eventually fed to the same lump to use it group the activation clumps into the same gd optimizer and hang this function on after update hook 124f the latter of which is done for you by arrange for renormalizing activations 8b55 see improving neural networks by preventing co
adaptation of feature detectors hinton 2012 http arxiv org pdf 1207 0580 pdf a id x 28mgl bp 3aarrange for renormalizing activations 20function 29 a function arrange for renormalizing activations bpn optimizer l2 upper bound by pushing a lambda to after update hook 124f of optimizer arrange for all weights being trained by optimizer to be renormalized as in renormalize activations c7fa with l2 upper bound it is assumed that the weights either belong to an activation lump or are simply added to the activations i e they are biases a id x 28mgl 3a 40mgl bm 20mgl pax 3asection 29 a 12 boltzmann machines a id x 28mgl 3a 40mgl gp 20mgl pax 3asection 29 a 13 gaussian processes a id x 28mgl nlp 3a 40mgl nlp 20mgl pax 3asection 29 a 14 natural language processing in package mgl nlp this is nothing more than a couple of utilities for now which may grow into a more serious toolset for nlp eventually a id x 28mgl nlp 3amake n gram mappee 20function 29 a function make n gram mappee function n make a function of a single argument that s suitable as the function argument to a mapper function it calls function with every n elements common lisp map nil make n gram mappee print 3 a b c d e a b c b c d c d e a id x 28mgl nlp 3ableu 20function 29 a function bleu candidates references key candidate key reference key n 4 compute the bleu score http en wikipedia org wiki bleu for a bilingual corpus bleu measures how good a translation is compared to human reference translations candidates keyed by candidate key and references keyed by reference key are sequences of sentences a sentence is a sequence of words words are compared with equal 96d0 and may be any kind of object not necessarily strings currently there is no support for multiple reference translations n determines the largest n grams to consider the first return value is the bleu score between 0 and 1 not as a percentage the second value is the sum of the lengths of candidates divided by the sum of the lengths of references or
nil if the denominator is 0 the third is a list of n gram precisions also between 0 and 1 or nil one for each element in 1 n this is basically a reimplementation of multi bleu perl https github com moses smt mosesdecoder blob master scripts generic multi bleu perl common lisp bleu 1 2 3 4 a b 1 2 3 4 1 2 0 8408964 1 1 gram precision 4 6 2 3 2 gram precision 3 4 3 4 3 gram precision 2 2 1 4 gram precision 1 1 1 a id x 28mgl nlp 3a 40mgl nlp bag of words 20mgl pax 3asection 29 a 14 1 bag of words a id x 28mgl nlp 3abag of words encoder 20class 29 a class bag of words encoder encode fedd all features of a document with a sparse vector get the features of document from mapper encode each feature with feature encoder 96d0f feature encoder may return nil if the feature is not used the result is a vector of encoded feature value conses encoded features are unique under encoded feature test 21ca within the vector but are in no particular order depending on kind value is calculated in various ways for frequency it is the number of times the corresponding feature was found in document for binary it is always 1 normalized frequency and normalized binary are like the unnormalized counterparts except that as the final step values in the assembled sparse vector are normalized to sum to 1 finally compacted binary is like binary but the return value is not a vector of conses but a vector of element type encoded feature type d443 common lisp let feature indexer make indexer alexandria alist hash table i 3 me 2 mine 1 2 bag of words encoder make instance bag of words encoder feature encoder feature indexer feature mapper lambda fn document map nil fn document kind frequency encode bag of words encoder all through day i me mine i me mine i me mine 0 3 0d0 1 3 0d0 a id x 28mgl nlp 3afeature encoder 20 28mgl pax 3areader 20mgl nlp 3abag of words encoder 29 29 a reader feature encoder bag of words encoder feature encoder a id x 28mgl nlp 3afeature mapper 20 28mgl pax 3areader 20mgl
nlp 3abag of words encoder 29 29 a reader feature mapper bag of words encoder feature mapper a id x 28mgl nlp 3aencoded feature test 20 28mgl pax 3areader 20mgl nlp 3abag of words encoder 29 29 a reader encoded feature test bag of words encoder encoded feature test eql a id x 28mgl nlp 3aencoded feature type 20 28mgl pax 3areader 20mgl nlp 3abag of words encoder 29 29 a reader encoded feature type bag of words encoder encoded feature type t a id x 28mgl nlp 3abag of words kind 20 28mgl pax 3areader 20mgl nlp 3abag of words encoder 29 29 a reader bag of words kind bag of words encoder kind binary 0072 x 28mgl opt 3aon optimization finished 20 28mgl pax 3aaccessor 20mgl opt 3aiterative optimizer 29 29 mgl opt on optimization finished mgl pax accessor mgl opt iterative optimizer 0078 x 28mgl core 3ainstance to executor parameters 20generic function 29 mgl core instance to executor parameters generic function 00a0 x 28mgl bp 3abp learner 20class 29 mgl bp bp learner class 00ee x 28mgl 3a 40mgl links 20mgl pax 3asection 29 links 011d x 28mgl gd 3amean decay 20 28mgl pax 3aaccessor 20mgl gd 3aadam optimizer 29 29 mgl gd mean decay mgl pax accessor mgl gd adam optimizer 038e x 28mgl bp 3aimportance 20 28mgl pax 3aaccessor 20mgl bp 3a 3eloss 29 29 mgl bp importance mgl pax accessor mgl bp loss 0414 x 28mgl common 3aname 20 28method 20nil 20 28mgl core 3aattributed 29 29 29 mgl common name method nil mgl core attributed 0784 x 28mgl nlp 3a 40mgl nlp bag of words 20mgl pax 3asection 29 bag of words 07c7 x 28mgl core 3a 40mgl confusion matrix 20mgl pax 3asection 29 confusion matrices 07fb x 28mgl core 3an stripes 20 28mgl pax 3areader 20mgl bp 3abpn 29 29 mgl core n stripes mgl pax reader mgl bp bpn 084d x 28mgl bp 3amax lag 20 28mgl pax 3areader 20mgl bp 3arnn 29 29 mgl bp max lag mgl pax reader mgl bp rnn 08ac x 28mgl dataset 3agenerator 20 28mgl pax 3areader 20mgl dataset 3afunction sampler 29 29 mgl dataset generator mgl pax reader mgl dataset function sampler 0900 x 
symbol index an auto generated mgl pax link index of mgl symbols and common lisp hyperspec references omitted here generated by mgl pax https github com melisgl mgl pax | ai |
|
cuit-course | div align center img src https github com ooyq cuit course assets 120553430 333e8a85 fec1 458b 97f1 d07cb07ab7ea alt image div h1 align center h1 p align center a href https github com ooyq cuit course stargazers img src https img shields io github stars ooyq cuit course svg alt stars a a href https github com ooyq cuit course network members img src https img shields io github forks ooyq cuit course svg alt forks a img src https img shields io github repo size ooyq cuit course svg alt github repo size a href https github com ooyq cuit course issues img src https img shields io github issues ooyq cuit course svg alt issues a img src https img shields io github issues pr ooyq cuit course svg alt github pull requests p p google a4 cc98 p https github com qsctech zju icicles br heart https github com ooyq cuit helper br downgit https minhaskamal github io downgit home download br issue pr br https qm qq com cgi bin qm qr k zayq5mcgb5ichs1npmmzts02icam4 em authkey ynxbwmdar2vyv9wul3u5jpuqbzhlmg o9e5r7f dztavahqainxoj7ppilk1bmhl noverify 0 a target blank href mailto cuit email com img src https img shields io badge email blue style for the badge logo email logocolor white a br div align center img src https github com ooyq cuit course assets 120553430 5c727622 3719 4b59 9d12 af403a17c67c alt image div | server |
|
nanocrawler | nanocrawler a lightweight web ui for viewing realtime information about your nano node and exploring the nano network what is nano nano s goal is to become a global currency with instantaneous transactions and zero fees over a secure decentralized network more information is available over on the official nano repository https github com nanocurrency raiblocks installation first clone this repository onto the server where you intend to host the site it doesn t have to be the same server as the nano node but it certainly can be if you want to run yarn to install dependencies there are 2 config files you need to update api server config the server is responsible for proxying requests between the site and your nano node you should never expose your nano node rpc to the public which is why we have a server that exposes only certain endpoints it also does a little processing on the raw response from the nano rpc and caches responses with redis there is a full default config available in the examples folder copy server config json to the project root update all of the values to fit your environment redis support is optional but recommended if you wish to skip it you can safely delete the config entry client config the web front end needs to know where the api server can be reached copy client config json from the examples into the src folder and update the config file to fit your environment the websocket server https github com meltingice nanovault ws is optional but you re welcome to use the hosted websocket server that s set as the default in the config depending on the sync status of your node you may receive blocks from the websocket server before your node confirms them which is why hosting one yourself is ideal remove the config entry to disable the websocket altogether development to run nanocrawler in development mode simply run yarn start this will start both the api server and the webpack development server for the front end this does not start 
any of the recurring jobs but you can run those manually if you need the data they provide see below production hosting once the config has been set you can build the project with yarn deploy this will compile and output all of the static site files into the html directory this is preferred over using yarn build because the build process starts by deleting the build directory which can cause the site to break for any visitors during the build process any time you change the client config or pull down any changes from git you will need to rebuild the project from here you can host the static site files anywhere it can be on a home server heroku https github com mars create react app buildpack a digitalocean droplet etc if you re not on heroku i highly recommend hosting the static files with nginx one important thing to note is that all of the site routing is done client side this means you need to do one of two things either configure your web server to always serve index html regardless of the url path or switch to hash based routing example nginx config nginx server root path to build dir index index html server name nano yourdomain com this is the important part that will fall back to loading index html location try files uri uri index html 404 switching to hash based router while i highly recommend hosting via a proper webserver as a last resort you can switch to hash based routing open up src index js and change browserrouter to hashrouter run yarn build to get an updated version of the site now instead of explorer the url will be explorer hosting the server the server side components are broken up into multiple processes in order to separate the api server from some long running recurring jobs they consist of server api js the api server server peers js recurring job for discovering and fetching data from other nano monitors server top accounts js recurring job for building the top accounts list server tps js recurring job for recording the
current block count to calculate blocks sec there are multiple options for hosting a nodejs server if you have experience with one option feel free to use it all of the scripts can be run directly with node and they all use the same server config json i use and recommend pm2 https www npmjs com package pm2 for managing nodejs servers there is an ecosystem config js file included so all you have to do is run pm2 start ecosystem config js env production to start all the processes the api server will start in cluster mode with 4 processes by default feel free to tweak this in the ecosystem config js file localization nanocrawler aims to be available in as many languages as possible if you would like to contribute translations please see the instructions below and send a pull request when ready contributing translations all strings that are used on the site are defined in the translations files in src translations these translation files consist of a simple json object the keys are the stable ids for each of the strings which are used in the site code the values are the corresponding translation if a string contains a value between two brackets e g count that is a dynamic value that is populated in the code it should be present as is in all available translations english is the fallback language for nanocrawler if a particular translation is not present to contribute to nanocrawler s translations please email meltingice see github profile to get an invite to our poeditor https poeditor com project while discouraged we will accept pull requests with translation updates as well | nano nano-node nano-rpc nanocurrency cryptocurrency nodejs javascript react | front_end |
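The nanocrawler entry above mentions a recurring job (server tps js) that records the current block count to calculate blocks/sec. A minimal sketch of that calculation, assuming recordings arrive as (timestamp-in-seconds, block-count) pairs; the function name and sample format are invented here for illustration and are not taken from the project:

```python
def blocks_per_second(samples):
    """Estimate blocks/sec from recorded (timestamp, block_count) pairs.

    Illustrative sketch only: the real job lives in server/tps.js and may
    differ. The rate is the change in block count divided by the elapsed
    seconds between the first and last recording.
    """
    (t0, c0), (t1, c1) = samples[0], samples[-1]
    if t1 == t0:
        raise ValueError("need recordings from two distinct times")
    return (c1 - c0) / (t1 - t0)
```

For example, recordings of 100 blocks at t=0 and 160 blocks at t=10 give 6.0 blocks/sec.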
nlp-uncertainty-zoo | robot speech balloon question nlp uncertainty zoo this repository contains implementations of several models used for uncertainty estimation in natural language processing implemented in pytorch you can install the repository using pip pip3 install nlp uncertainty zoo if you are using the repository in your academic research please cite the paper below inproceedings ulmer etal 2022 exploring title exploring predictive uncertainty and calibration in nlp a study on the impact of method data scarcity author ulmer dennis and frellsen jes and hardmeier christian booktitle findings of the association for computational linguistics emnlp 2022 month dec year 2022 address abu dhabi united arab emirates publisher association for computational linguistics url https aclanthology org 2022 findings emnlp 198 pages 2707 2735 to learn more about the package consult the documentation here http dennisulmer eu nlp uncertainty zoo check a jupyter notebook demo here https github com kaleidophon nlp uncertainty zoo blob main demo ipynb or a google collab here https colab research google com drive 1 pl5lvcnpbgl2zxlgddnqvjb7ew8uiss usp sharing included models the following models are implemented in the repository they can all be imported by using from nlp uncertainty zoo import model for transformer based model furthermore a version of a model is available that uses a pre trained bert from the huggingface transformers name description implementation paper lstm vanilla lstm lstm hochreiter schmidhuber 1997 http citeseerx ist psu edu viewdoc download doi 10 1 1 676 4320 rep rep1 type pdf lstm ensemble ensemble of lstms lstmensemble lakshminarayanan et al 2017 https proceedings neurips cc paper 2017 file 9ef2ed4b7fd2c810847ffa5fa85bce38 paper pdf bayesian lstm lstm implementing bayes by backprop blundell et al 2015 http proceedings mlr press v37 blundell15 pdf bayesianlstm fortunato et al 2017 https arxiv org pdf 1704 02798 pdf st tau lstm lstm modelling transitions of a 
finite state automaton sttaulstm wang et al 2021 https openreview net pdf id 9ekhn1jola variational lstm lstm with mc dropout gal ghahramani 2016a http proceedings mlr press v48 gal16 pdf variationallstm gal ghahramani 2016b https proceedings neurips cc paper 2016 file 076a0c97d09cf1a0ec3e19c7f2529f2b paper pdf ddu transformer ddu bert transformer bert with gaussian mixture model fit to hidden features ddutransformer ddubert mukhoti et al 2021 https arxiv org pdf 2102 11582 pdf variational transformer variational bert transformer bert with mc dropout gal ghahramani 2016a http proceedings mlr press v48 gal16 pdf variationaltransformer variationalbert xiao et al 2021 https arxiv org pdf 2006 08344 pdf dpp transformer dpp bert transformer bert using determinantal point process dropout dpptransformer dppbert shelmanov et al 2021 https aclanthology org 2021 eacl main 157 sngp transformer sngp bert spectrally normalized transformer bert using a gaussian process output layer sngptransformer sngpbert liu et al 2022 http arxiv org abs 2205 00403 contributions to include even more approaches are much appreciated usage each model comes in two versions for instance lstmensemble and lstmensemblemodule the first one is supposed to be used as an out of the box solution encapsulating all training logic and convenience functions these include fitting the model prediction getting the uncertainty for an input batch using a specific metric python model lstmensemble network params ensemble size 10 is sequence classifier false model fit train split train dataloader model get logits x model get predictions x model get sequence representation x model available uncertainty metrics model get uncertainty x model get uncertainty x metric name mutual information in comparison the module class is supposed to be more simple and bare bones only containing the core model logic it is intended for research purposes and for others who would like to embed the model into their own code base while the
model class e g lstmensemble inherits from model and would require to implement certain methods any module class sticks closely to torch nn module to check what arguments are required to initialize and use different models check the documentation here http nlpuncertaintyzoo dennisulmer eu also check out the demo provided as a jupyter notebook here https github com kaleidophon nlp uncertainty zoo blob main demo ipynb or a google collab here https colab research google com drive 1 pl5lvcnpbgl2zxlgddnqvjb7ew8uiss usp sharing repository structure the repository has the following structure models all model implementations tests unit tests so far only contains rudimentary tests to check that all output shapes are consistent between models and functions utils utility code see below utils custom types py custom types used in the repository for type annotations utils data py module containing data collators and data builders which build the dataloaders for a type of task and a specific dataset currently language modelling sequence labeling and sequence classification are supported utils metrics py implementations of uncertainty metrics utils samplers py dataset subsamplers for language modelling sequence labelling and sequence classification utils task eval py functions used to evaluate task performance utils uncertainty eval py function used to evaluate uncertainty quality utils calibration eval py function used to evaluate calibration quality config py define available datasets model and tasks defaults py define default config parameters for sequence classification and language modelling note these might not be very good parameters other features weights biases integration you can track your experiments easily with weights biases by passing a wandb run argument to model fit easy fine tuning via huggingface you can fine tune arbitrary bert models using their name from huggingface s transformers contributing this repository is by no means perfect nor complete if you find 
any bugs please report them using the issue template and if you also happen to provide a fix create a pull request a github template is provided for that as well you would like to make a new addition to the repository follow the steps below adding a new model to add a new model add a new module in the models directory you will also need to implement a corresponding model and module class inheriting from the classes of the same name in models model py and implementing all required functions model is supposed to be an out of the box solution that you can start experimenting right away whil module should only include the most basic model logic in order to be easy to integrate into other codebases and allow tinkering adding a new uncertainty metric to add a new uncertainty metric add the function to utils metrics py the function should take the logits of a model and output an uncertainty score the higher the score the more uncertain the model the function should output a batch size x sequence length matrix with batch size x 1 for sequence classification tasks after finishing the implementation you can add the metric to the single prediction uncertainty metrics of the models model model class and multi prediction uncertainty metrics of models model multipredictionmixin if applicable you would like to add something else create an issue or contact me at dennis dot ulmer at mailbox dot org | deep-learning lstm nlp nlp-machine-learning package python pytorch rnn transformers uncertainty-estimation uncertainty-neural-networks uncertainty-quantification | ai |
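The usage section of this README mentions `model.get_uncertainty(x, metric_name="mutual_information")` for ensembles. As a rough illustration of the math behind such an ensemble metric — not the nlp-uncertainty-zoo implementation itself, all names here are illustrative — predictive entropy and mutual information can be computed from the per-member class probabilities:

```typescript
// Sketch of two ensemble uncertainty metrics mentioned in the README
// (predictive entropy and mutual information). This illustrates the
// underlying math only; it is NOT the package's actual code.

/** Shannon entropy (natural log) of a probability distribution. */
function entropy(p: number[]): number {
  return -p.reduce((s, pi) => s + (pi > 0 ? pi * Math.log(pi) : 0), 0);
}

/** Mean of the per-member distributions: the ensemble's predictive distribution. */
function meanDistribution(members: number[][]): number[] {
  const k = members[0].length;
  const mean = new Array(k).fill(0);
  for (const p of members) {
    for (let i = 0; i < k; i++) mean[i] += p[i] / members.length;
  }
  return mean;
}

/** Total uncertainty: entropy of the averaged predictive distribution. */
function predictiveEntropy(members: number[][]): number {
  return entropy(meanDistribution(members));
}

/** Epistemic uncertainty: predictive entropy minus the mean per-member entropy. */
function mutualInformation(members: number[][]): number {
  const meanMemberEntropy =
    members.reduce((s, p) => s + entropy(p), 0) / members.length;
  return predictiveEntropy(members) - meanMemberEntropy;
}
```

An ensemble whose members agree has near-zero mutual information even when each member is individually uncertain; disagreement between members is what drives the score up, which is why it is used as an epistemic-uncertainty signal.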
node-red-contrib-uibuilder | discussion https img shields io static v1 svg label discussion message node red 20forum color green https discourse nodered org tag node red contrib uibuilder static badge https img shields io badge uibuilder homepage 0d85d7 https totallyinformation github io node red contrib uibuilder npm version https img shields io npm v node red contrib uibuilder svg https www npmjs com package node red contrib uibuilder npm total downloads https img shields io npm dt node red contrib uibuilder svg https www npmjs com package node red contrib uibuilder npm downloads per month https img shields io npm dm node red contrib uibuilder svg https www npmjs com package node red contrib uibuilder github last commit https img shields io github last commit totallyinformation node red contrib uibuilder svg https github com totallyinformation node red contrib uibuilder github stars https img shields io github stars totallyinformation node red contrib uibuilder svg https github com totallyinformation node red contrib uibuilder watchers github watchers https img shields io github watchers totallyinformation node red contrib uibuilder svg https github com totallyinformation node red contrib uibuilder stargazers github license https img shields io github license totallyinformation node red contrib uibuilder svg https github com totallyinformation node red contrib uibuilder blob master license min node version https img shields io node v node red contrib uibuilder svg https www npmjs com package node red contrib uibuilder package quality http npm packagequality com shield node red contrib uibuilder png http packagequality com package node red contrib uibuilder deepscan grade https deepscan io api teams 13157 projects 16160 branches 340901 badge grade svg https deepscan io dashboard view project tid 13157 pid 16160 bid 340901 codeql https github com totallyinformation node red contrib uibuilder actions workflows codeql analysis yml badge svg https github com 
totallyinformation node red contrib uibuilder actions workflows codeql analysis yml open issues https img shields io github issues raw totallyinformation node red contrib uibuilder svg https github com totallyinformation node red contrib uibuilder issues closed issues https img shields io github issues closed raw totallyinformation node red contrib uibuilder svg https github com totallyinformation node red contrib uibuilder issues q is 3aissue is 3aclosed img class dhide align right style width 124px src front end images node blue svg title uibuilder icon node red contrib uibuilder uibuilder for node red allows the easy creation of data driven front end web applications it includes many helper features that can reduce or eliminate the need to write code for building data driven web applications and user interfaces integrated with node red installation uibuilder is best installed using node red s palette manager details summary manual installs and other versions summary to install manually from a command line on your node red server bash cd node red npm install node red contrib uibuilder to install old versions bash cd node red npm install node red contrib uibuilder v5 1 1 to install development branches please install from github https github com totallyinformation node red contrib uibuilder bash cd node red npm install totallyinformation node red contrib uibuilder branchname you will need to restart node red if installing manually details updates the current changelog https totallyinformation github io node red contrib uibuilder changelog md contains all of the changes and requirement details for each version older changes can be found in the previous change documents changelog v5 https totallyinformation github io node red contrib uibuilder archived changelog v5 md changelog v3 v4 docs https totallyinformation github io node red contrib uibuilder archived changelog v3 v4 md changelog v2 https totallyinformation github io node red contrib uibuilder archived 
changelog v2 md and changelog v2 https totallyinformation github io node red contrib uibuilder archived changelog v1 md getting started once installed the following is a typical simple flow to get going 1 add a uibuilder node open its settings and give it a url which is used as the identifying name close the settings and click on the deploy button 2 add an inject node for some simple input data and two debug nodes on the two output ports so that you can see everything that is going on deploy the flow 3 re open the uibuilder node s settings and click the open button to see the resulting web page you are now ready to edit the front end html javascript css if you wish and to add logic in node red to provide inputs and handle outputs please see the first timers walkthrough https totallyinformation github io node red contrib uibuilder walkthrough1 in the documentation and the introduction video https www youtube com watch v ivwr 3cx05a for more help to get started also try out the built in example flows examples within node red use the hamburger menu click import click examples select the node red contrib uibuilder folder and choose an example the templates feature in uibuilder provides working front end code of various configurations other examples can be found on the node red flows site https flows nodered org search term uibuilder and the uibuilder wiki https github com totallyinformation node red contrib uibuilder wiki also see the faq s and answered questions on the node red forum https discourse nodered org tag node red contrib uibuilder documentation and other links please refer to the documentation web site https totallyinformation github io node red contrib uibuilder this can also be accessed from within uibuilder nodes even without an internet connection there is a library of official video tutorials on youtube https www youtube com playlist list pl9ieadrqaal3mg3rcf0cjaaxigfh3gdrq other folk have also produced uibuilder related content https www youtube com 
results search query uibuilder node red questions issues and suggestions the best place to ask questions or discuss possible enhancements is the node red forum https discourse nodered org alternatively use the github issues log https github com totallyinformation node red contrib uibuilder issues for raising issues or contributing suggestions and enhancements and the github discussions page https github com totallyinformation node red contrib uibuilder discussions for general questions suggestions etc other links uib https github com totallyinformation node red contrib uibuilder raw main front end images node blue ico uibuilder for node red https github com totallyinformation node red contrib uibuilder ideas questions general help https discourse nodered org tag node red contrib uibuilder ask your question on the node red forum using the node red contrib uibuilder tag documentation https totallyinformation github io node red contrib uibuilder go to the latest documentation flows https flows nodered org search term uibuilder example flows nodes and collections related to uibuilder wiki https github com totallyinformation node red contrib uibuilder wiki more documentation and examples example svelte external template https github com totallyinformation uib template svelte simple in case you want to build your own svelte app example simple external template https github com totallyinformation uib template test in case you want to build your own external template uplot uibuilder extension https github com totallyinformation nr uibuilder uplot useful charts but also demonstrates how to build your own extension event handler module used by uibuilder https github com totallyinformation ti common event handler so you can see some of the inner workings ui library module used by uibuilder https github com totallyinformation ui js can be used stand alone for turning ui standard config json into html node red contrib moment https github com totallyinformation node red contrib 
moment nodes to make use of the momentjs date time handling library in node red test nodes for node red https github com totallyinformation uib template test some test nodes for node red that help you understand how everything works hotnipi gauge web component https github com totallyinformation gauge hotnipi a really nice looking gauge component works with node red uibuilder or stand alone experimental web components https github com totallyinformation web components have some node red uibuilder specific enhancements but also work well stand alone array grouper https github com totallyinformation groupit stand alone function to reshape an array of objects purpose the purpose of uibuilder is to support easy methods for creating and delivering data driven web apps and web pages also known as web user interfaces be a conduit between node red and front end browser ui web apps be ui framework agnostic no framework is needed to use uibuilder but it will work with them where desired uibuilder aims to reduce the requirement for a framework by making it easier to work with vanilla html css provide interface data standards for exchanging data and controls between node red and the web pages enable the creation and management of multiple web apps from a single node red instance reduce the amount of front end code html javascript needed to create and manage a web app reduce the knowledge required for creating reliable accessible web apps by providing low code and no code features make it easy to install and serve front end libraries to support the development of more complex web apps features the core features of uibuilder provides nodes to enable zero code translation of input data to usable and accessible web elements provides capability for low code configuration driven data driven ui s creating a framework for describing a ui and translating to actual code without having to write code provides a 2 way communications channel between the node red server back end and front 
end ui provides a node red node to act as the focus for communications with other nodes for additional ease of use provides a front end library to do the complex parts of the communications in the client browser make manipulation of the ui easier and more consistent make it easy to get data back to node red as needed both automatically and manually provides easy to use templates and examples for front end code to enable people to get a quick start on creating web apps provides management and serving of npm packages that provide front end libraries consumable easily by front end code allows editing of front end code from the node red editor designed for small changes use web development tools generally enables the use of external authentication and authorisation methods and services to control multi user access to web apps provides various server middleware and api options for additional custom capabilities allows as many uibuilder node instances as you like each instance allows the creation of many web pages and sub folders for easy management each uibuilder node instance provides a private 2 way communications channel between the node red server back end and browser front end ui code supports the use of standard web development workflows allows the creation of a dedicated web service to facilitate independent security provides a caching capability allowing newly joining clients to receive the latest data and configurations joining leaving clients create notifications in node red details summary no code ui s summary uibuilder is still growing towards offering more no code capabilities like node red s dashboard extension does however it is starting to offer these features via the new client available since v5 v6 1 introduced the new uib element and uib update nodes that offer the first usable no code features uib element takes in simple data and outputs configuration data this can then be sent to the front end via the uibuilder node alternatively it can be saved and 
the result used in an initial load several simple options such as tables and lists are available in uibuilder v6 1 additional elements and structures will be made available in future versions the uibuilder front end client takes the configuration information and dynamically builds html elements and inserts them to the web page or removes updates as needed while this is not the most efficient processing approach since updates are mostly replacing the whole element which could be quite large for things like big tables it is very efficient from an authoring perspective so the uib update node provides a more targetted approach to updating and changing specific attributes and slot content for elements it is important to note that no front end 3rd party frameworks such as vuejs or react are needed for this approach everything uses vanilla html javascript and css under the skin and so is compatible with current and future web standards details details summary low code ui s summary the data that uib element outputs is a format that you can use in your own flows in node red and even in front end code if desired it describes a set of html ui elements but does not need you to actually write html code the configuration schema is very flexible and even allows you to load configuration data html scripts and new ecma modules components from external files the schema and the ui creator functions built into the front end client are specifically designed to work with current and future html standards in order to avoid the kinds of issues commonly encountered when using 3rd party front end frameworks e g major version changes forcing rewrites of all of your tooling so es modules ecma components and future ecma versions should all be supported details details summary future direction summary the general direction of uibuilder or associated modules is likely to include provide more no code and low code ui creation and update capabilities as of v6 1 these are now starting to be 
delivered v6 2 will extend these the ability to save updated html from the front end via node red so that ui building can be done once and loaded as efficient static html the ability to use the zero code features to produce html for other tools to use the ability within node red to for each uibuilder node run npm scripts such as build processes and to manage instance level npm packages be able to install update remove instance level npm packages as can already be done for uibuilder level packages provide a development server capability that auto reloads connected clients when code changes are made a ui designer allowing users without html css js skills to create reasonable web apps without code details details summary feature details and benefits summary designed as an alternative to the node red official dashboard without the overheads and restrictions control everything from the node red admin ui edit your front end resource files manage front end packages no need to access the servers command line manage startup templates internal templates for vanilla html svelte vuejs v2 v3 and vuejs bootstrap vue are provided load templates from other repositories via degit makes it easy to share templates that provide a whole app or just deal with boilerplate have as many custom user interfaces as you want just 1 node is needed for each entry point use link nodes to send data from other parts of your flows an entry point can be contain multiple web pages has a control interface separate to the message interface know when a browser tab connects or disconnects send cached data and more provide a stable client id that identifies a specific browser profile until it is restarted a tabid is provided that identifies a specific browser tab on a client device provide information to node red about the client that is sending a msg so that security and other processing can identify the client the user and so on can be a lot lighter in weight and more mobile friendly than the node red 
official dashboard use any front end framework you like simply install via the built in library manager use without any front end framework if you prefer keep it light and simple try this out with the blank template and the uib element node the included front end libraries uibuilder iife js uibuilder esm js provide connectivity to node red and msg event handling along with some helper utility functions write your own html css and javascript to define the perfect front end user interface for your needs or define it using a json config description edit your custom front end code from within the node red editor auto reload your clients on changes to the code great for rapid development note that this is designed for quick edits it is recommended to use your normal web development toolchain for larger edits needs almost no boilerplate in your front end code in order to work optional index web page listing of available files two detailed admin info web pages are included to help authors understand where everything is and what is available uses node red s own expressjs webservers by default switch to a custom expressjs server if desired when using a custom server pages can also include ejb server side templating has middleware for expressjs for web services and socket io for communications both at initial connection and per message so that you can add your own custom features including security can create custom api s for each uibuilder instance details details summary current limitations summary you may need to write some of your own html you have to know the front end library locations for installed 3rd party packages and edit your html accordingly the uibindex admin api accessible from any node s admin ui shows you all of the root folders and what the package authors report as the main entry point for all active packages there is now also a simplified information page for the currently viewed uibuilder node instance this is access from a button in the configuration 
panel note that this is a limitation of npm and module authors not of uibuilder unless module authors correctly identify the browser entrypoint for their libraries uibuilder can only guess you cannot yet compile compress your custom front end code hmtl js scss etc for efficiency this will be added soon this will use a local package json file that contains a build script if it exists uibuilder will expose a build button that will run the script details contributing if you would like to contribute to this node you can contact totally information via github https github com totallyinformation or raise a request in the github issues log https github com totallyinformation node red contrib uibuilder issues pull requests both for code and documentation are welcomed and the wiki is open to new entries and corrections but please let me know if you make a change please refer to the contributing guidelines https github com totallyinformation node red contrib uibuilder blob master github contributing md for more information you can also support the development of uibuilder by sponsoring the development ko fi https ko fi com img githubbutton sm svg https ko fi com a0a3ppmrj github sponsorship https github com sponsors totallyinformation paypal sponsorship https paypal me totallyinformation developers contributors julian knight https github com totallyinformation the designer and main author colin law https github com colinl many thanks for testing corrections and pull requests steve rickus https github com shrickus many thanks for testing corrections contributed code and design ideas ellie lee https github com ellieejlee many thanks for the pr fixing duplicate msgs thomas wagner https github com thomseeen thanks for the steer and pr on using projects folder if active arlena derksen https github com boisei0 thanks for suggestions bug checks and issue 59 pr 60 cflurin https discourse nodered org u cflurin thanks for the cache example scott page indysoft https github com 
scottpageindysoft thanks for issue 73 pr 74 stephen mclaughlin steve mcl https discourse nodered org u steve mcl thanks for the fix for issue 71 https github com totallyinformation node red contrib uibuilder issues 71 and for the enhancement idea issue 102 https github com totallyinformation node red contrib uibuilder issues 102 sergio rius https github com sergiorius thanks for reporting issue 121 https github com totallyinformation node red contrib uibuilder issues 121 and providing pr 122 https github com totallyinformation node red contrib uibuilder pull 122 as a fix thorsten von eicken https github com tve thanks for providing pr 131 https github com totallyinformation node red contrib uibuilder pull 131 to improve cors handling for socket io meeki007 https github com meeki007 thanks for supplying various documentation improvements and code fixes scott talltechdude https github com talltechdude thanks for supplying pr 170 calum knott https github com calumk thanks for the tidied up node blue logo harold peters inskipp https github com haroldpetersinskipp thanks for the logging examples dczysz https github com dczysz thanks for reporting issue 186 https github com totallyinformation node red contrib uibuilder issues 186 and helping work through the complex async bug colin j mudwalkercj https github com mudwalkercj thanks for helping with the documentation marcus davies https discourse nodered org u marcus j davies many thanks for the encouragement and for the 3d logo many other people have contributed ideas and suggestions thanks to everyone who does they are most welcome a href https stackexchange com users 1375993 julian knight img src https stackexchange com users flair 1375993 png width 208 height 58 alt profile for julian knight on stack exchange a network of free community driven q amp a sites title profile for julian knight on stack exchange a network of free community driven q amp a sites a please also check out my blog much ado about it https it 
knightnet org uk it has information about all sorts of topics mainly it related including node red | node-red websockets javascript html ui website webapp web-application web-app dashboard data-driven flow-based-programming css low-code no-code zero-code | front_end |
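The README above describes uibuilder's low-code approach: a JSON configuration describing UI elements that the front-end client turns into HTML. The sketch below shows the general shape of such a configuration message; the field names follow the broad outline given in uibuilder's documentation, but treat the exact schema here as an assumption and check the documentation site for the authoritative format:

```typescript
// Rough sketch of the kind of low-code UI description a uib-element node
// emits and the uibuilder front-end client renders. The schema shown here
// (method/components/type/slot/attributes) is illustrative, not authoritative.

interface UiComponent {
  type: string;                        // HTML tag to create, e.g. "li"
  slot?: string;                       // inner content of the element
  attributes?: Record<string, string>; // attributes to set on the element
  components?: UiComponent[];          // nested child elements
}

interface UiInstruction {
  method: "add" | "update" | "remove";
  components: UiComponent[];
}

/** Build an instruction that adds a <ul> listing the given items. */
function buildList(id: string, items: string[]): UiInstruction {
  return {
    method: "add",
    components: [{
      type: "ul",
      attributes: { id },
      components: items.map((slot) => ({ type: "li", slot })),
    }],
  };
}

// A Node-RED function node could attach something like this to a msg
// before sending it on to the uibuilder node for the client to render.
const msg = { _ui: buildList("status-list", ["sensor online", "cache primed"]) };
```

The point of the design is visible even in this toy version: the description is plain data, so it can be generated, cached, and replayed by ordinary Node-RED flows without any front-end framework.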
IoTDemos | overview welcome to the github repo for end to end iot demos in this repo you ll find examples of end to end use cases and the iot architectures that enable to those use cases each demo has it s own folder with setup instructions contributing this project welcomes contributions and suggestions most contributions require you to agree to a contributor license agreement cla declaring that you have the right to and actually do grant us the rights to use your contribution for details visit https cla opensource microsoft com when you submit a pull request a cla bot will automatically determine whether you need to provide a cla and decorate the pr appropriately e g status check comment simply follow the instructions provided by the bot you will only need to do this once across all repos using our cla this project has adopted the microsoft open source code of conduct https opensource microsoft com codeofconduct for more information see the code of conduct faq https opensource microsoft com codeofconduct faq or contact opencode microsoft com mailto opencode microsoft com with any additional questions or comments | server |
|
LLM-Serve | serving llm based chatbot this repository provides a generalized serving framework to serve llm based application currently aiming chatbot this is a fork project of alpaca lora serve https github com deep diver alpaca lora serve alpaca lora serve was initially written heavily dependent on alpaca model so it was not so easy to extend to support other models in this project i am going to refactor the code base a lot to make it concise and clean with the help of the other library that i built ping pong or bong bong https github com deep diver pingpong screenshots https raw githubusercontent com deep diver llm serve main assets preview png todo structure project support alpaca based application support gpt alpaca based application support flan based application support dolly based application support backend only option w fastapi dockerfile | ai |
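This README describes serving Alpaca-style chatbot applications, delegating conversation handling to the author's ping-pong library. As an illustration of the prompt-building step such a backend performs before calling the model — using the widely known Alpaca instruction template, not the project's actual code — a conversation can be flattened into a single prompt like this:

```typescript
// Minimal sketch of Alpaca-style prompt construction for a chatbot backend.
// The template is the common Alpaca instruction format; LLM-Serve itself
// delegates this to the ping-pong library, so treat this as illustrative.

interface Turn { user: string; bot: string }

function buildAlpacaPrompt(history: Turn[], instruction: string): string {
  const header =
    "Below is an instruction that describes a task. " +
    "Write a response that appropriately completes the request.\n\n";
  // Replay earlier turns so the model sees the whole conversation.
  const context = history
    .map((t) => `### Instruction:\n${t.user}\n\n### Response:\n${t.bot}`)
    .join("\n\n");
  // The prompt ends awaiting the model's next response.
  const current = `### Instruction:\n${instruction}\n\n### Response:\n`;
  return header + (context ? context + "\n\n" : "") + current;
}
```

Because every request replays the prior turns, the model needs no server-side memory beyond the turn list itself.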
|
P2_Cloud_Computing_Udagram_Image_Filtering_Microservice | udagram image filtering microservice udagram is a simple cloud application developed alongside the udacity cloud engineering nanodegree it allows users to register and log into a web client post photos to the feed and process photos using an image filtering microservice the project is split into three parts 1 the simple frontend https github com remuant udacity c2 frontend a basic ionic client web application which consumes the restapi backend covered in the course 2 the restapi backend https github com remuant udacity c2 restapi a node express server which can be deployed to a cloud service covered in the course 3 the image filtering microservice https github com remuant p2 cloud computing udagram image filtering microservice a node express application which runs a simple script to process images project 2 tasks setup node environment for local development create a new node server 1 initialize a new project npm i 2 run the development server with npm run dev create a new endpoint in the server ts file the task to be completed is the implementation of an endpoint in src server ts which uses a query parameter to download an image from a public url filter the image and return the result helper functions have been included to handle some of these concepts and are imported at the top of the src server ts file typescript import filterimagefromurl deletelocalfiles from util util deployment follow the process described in the course to eb init a new application and eb create a new environment to deploy the image filter service if it is necessary to push changes use eb deploy eb endpoint udagram bex p2 prod final us east 1 elasticbeanstalk com | cloud |
|
Quiz-App | app makes use of firebase camera plugin provider for question data mapping the data with json files angular 2 two way data binding angular 2 such as ngfor ngif answer card component alert sheet ion slides tabs geolocation app allows user to input their name and age take a photo select from variety of quizes choose from four different options for each question needs to answer within 30 seconds upload their name age score type of quiz location and photo to firebase later this data will be displayed in highscore tab app s additional information requires internet connection to avail use of firebase if user doesn t choose a quiz then default one will be chosen default quiz has questions from other type of quizes uncaught exception undefined might show up after installing plugins it causes no issues what so ever with the app so don t worry about it after last update of ionic view camera plugin firebase sometimes don t work might require few times of loading the app how to run it press windows key x select search and then type cmd and press enter there input following commands desktop git clone https github com danielcregggmit 2nd year software ionic 2 assignment ltrmex git cd 2nd year software ionic 2 assignment ltrmex npm install mkdir www ionic state restore ionic serve | front_end |
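The README above describes the app's core rule: each question offers four options and must be answered within 30 seconds, and the final score is uploaded with the player's name, age, and quiz type. A toy model of that scoring logic — names and the exact rule are illustrative, the real app wires this into Ionic/Angular and Firebase — looks like this:

```typescript
// Toy model of the quiz rule described above: four options per question,
// a 30-second answer window, and a result payload for upload. Illustrative
// only; not the app's actual code.

interface Question {
  options: [string, string, string, string]; // exactly four choices
  correct: number;                           // index of the right option
}

const TIME_LIMIT_MS = 30_000;

class QuizRound {
  score = 0;

  /** Late or wrong answers score nothing; the quiz simply moves on. */
  answer(q: Question, choice: number, elapsedMs: number): boolean {
    const ok = elapsedMs <= TIME_LIMIT_MS && choice === q.correct;
    if (ok) this.score += 1;
    return ok;
  }

  /** Shape of the record the app uploads (name, age, score, quiz type). */
  result(name: string, age: number, quizType: string) {
    return { name, age, quizType, score: this.score };
  }
}
```

Keeping the timer check inside the scoring step means a stalled network or a slow answer can never inflate the uploaded score.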
|
oreilly-hands-on-gpt-llm | oreilly logo images oreilly png deploying gpt large language models this repository contains code for the o reilly live online training for deploying gpt llms https learning oreilly com live events deploying gpt and large language models 0636920087375 0636920087374 in this training you learn how to use gpt 3 chatgpt openai embeddings and other large language models to build applications for both experimenting and production we cover the fundamentals of gpt and its applications and explore alternative generative models such as cohere and gpt j you gain practical experience in building a variety of applications with these models including text generation summarization question answering and more learn how to leverage prompt engineering context stuffing and few shot learning to get the most out of gpt like models then the focus shifts to deploying these models in production including best practices and debugging techniques by the end of the training you have a working knowledge of gpt and other large language models as well as the skills to start building your own applications with them notebooks training bert for classification notebooks bert classification ipynb introduction to prompt engineering notebooks intro prompt engineering ipynb advanced to prompt engineering notebooks advanced prompt engineering ipynb finetuning with openai notebooks openai fine tuned classification ipynb semantic search notebooks semantic search ipynb instructor sinan ozdemir is the founder and cto of loopgenius where he uses state of the art ai to help people create and run their businesses sinan is a former lecturer of data science at johns hopkins university and the author of multiple textbooks on data science and machine learning additionally he is the founder of the recently acquired kylie ai an enterprise grade conversational ai platform with rpa capabilities he holds a master s degree in pure mathematics from johns hopkins university and is based in san 
francisco ca | ai |
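The course outline above mentions prompt engineering and few-shot learning; a minimal sketch of how a few-shot prompt might be assembled is shown below. The task, examples, and formatting are hypothetical illustrations (not taken from the course notebooks), and in practice the resulting string would be sent to an LLM API.

```python
# Hedged sketch of few-shot prompt construction. The sentiment task and the
# example pairs below are made up for illustration; the point is only the
# structure: task description, labeled examples, then the unanswered query.

def build_few_shot_prompt(task, examples, query):
    """Prepend labeled input/output pairs so the model can infer the task."""
    lines = [task]
    for text, label in examples:
        lines.append(f"Text: {text}\nSentiment: {label}")
    # leave the final label blank for the model to complete
    lines.append(f"Text: {query}\nSentiment:")
    return "\n\n".join(lines)

prompt = build_few_shot_prompt(
    task="Classify the sentiment of each text as Positive or Negative.",
    examples=[("I loved this course!", "Positive"),
              ("The API kept timing out.", "Negative")],
    query="The examples were clear and practical.",
)
print(prompt)
```

With more examples the model sees more of the task before the query, which is the "few-shot" part; "context stuffing" follows the same pattern but prepends retrieved documents instead of labeled pairs.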
|
super-gradients | div align center markdown 1 img src documentation assets sg img sg horizontal glow 2 png width 600 br br build train and fine tune production ready deep learning sota vision models tweet https img shields io twitter url http shields io svg style social https twitter com intent tweet text easily 20train 20or 20fine tune 20sota 20computer 20vision 20models 20from 20one 20training 20repository url https github com deci ai super gradients via deci ai hashtags ai deeplearning computervision training opensource version 3 is out notebooks have been updated div div align center p align center a href https www supergradients com website a a href https docs deci ai super gradients documentation source welcome html docs a a href getting started getting started a a href implemented model architectures pretrained models a a href community community a a href license license a a href deci platform deci platform a p p align center a href https github com deci ai super gradients prerequisites img src https img shields io badge python 3 7 20 7c 203 8 20 7c 203 9 blue a href https github com deci ai super gradients prerequisites img src https img shields io badge pytorch 1 9 20 7c 201 10 blue a href https pypi org project super gradients img src https img shields io pypi v super gradients a href https github com deci ai super gradients computer vision models pretrained checkpoints img src https img shields io badge pre trained 20models 34 brightgreen a href https github com deci ai super gradients releases img src https img shields io github v release deci ai super gradients a href https join slack com t supergradients comm52 shared invite zt 10vz6o1ia b 0w5jepenuhxm087k t8q img src https img shields io badge slack community blueviolet a href https github com deci ai super gradients blob master license md img src https img shields io badge license apache 202 0 blue a href https docs deci ai super gradients documentation source welcome html img src https img shields 
io badge docs mkdocs brightgreen a p div build with supergradients support various computer vision tasks div align center img src https github com deci ai super gradients raw master documentation assets sg img segmentation 1500x900 png width 250px img src https github com deci ai super gradients raw master documentation assets sg img object detection 1500x900 png width 250px img src https github com deci ai super gradients raw master documentation assets sg img classification 1500x900 png width 250px img src https github com deci ai super gradients raw master documentation assets sg img poseestimation jpg width 250px div ready to deploy pre trained sota models yolo nas architecture is out the new yolo nas delivers state of the art performance with the unparalleled accuracy speed performance outperforming other models such as yolov5 yolov6 yolov7 and yolov8 check it out here yolo nas yolonas md div align center img src documentation source images yolo nas frontier png width 800px div python load model with pretrained weights from super gradients training import models from super gradients common object names import models model models get models yolo nas m pretrained weights coco all computer vision models pretrained checkpoints can be found in the model zoo http bit ly 41dkt89 classification div align center img src documentation assets sg img classification 2xdark png width 800px div semantic segmentation div align center img src documentation assets sg img semantic segmentation 2xdark png width 800px div object detection div align center img src documentation assets sg img object detection 2xdark png width 800px div easy to train sota models easily load and fine tune production ready pre trained sota models that incorporate best practices and validated hyper parameters for achieving best in class accuracy for more information on how to do it go to getting started getting started plug and play recipes bash python m super gradients train from recipe architecture 
regnety800 dataset interface data dir your imagenet local path ckpt root dir checkpoint directory more examples on how and why to use recipes can be found in recipes recipes production readiness all supergradients models are production ready in the sense that they are compatible with deployment tools such as tensorrt nvidia and openvino intel and can be easily taken into production with a few lines of code you can easily integrate the models into your codebase python load model with pretrained weights from super gradients training import models from super gradients common object names import models model models get models yolo nas m pretrained weights coco prepare model for conversion input size is in format of batch x channels x width x height where 640 is the standard coco dataset dimensions model eval model prep model for conversion input size 1 3 640 640 create dummy input convert model to onnx torch onnx export model dummy input yolo nas m onnx more information on how to take your model to production can be found in getting started getting started notebooks quick installation bash pip install super gradients what s new version 3 1 3 july 19 2023 pose estimation task support https docs deci ai super gradients documentation source poseestimation html check out fine tuning notebook example https colab research google com drive 1nmgzx8ndycizqnrlzkjzrioqyj0mfzje scrollto 3uzjqtehg0on pre trained modified dekr https github com deci ai super gradients blob master src super gradients recipes coco2017 pose dekr w32 no dc yaml model for pose estimation tensorrt compatible support for python 3 10 support for torch compile other bugfixes minor improvements check out release notes https github com deci ai super gradients releases tag 3 1 3 30th of may quantization aware training yolonas on custom dataset https bit ly 3mikdty version 3 1 1 may 3rd yolo nas https bit ly 41wenpz new predict function https bit ly 3ozfaea predict on any image video url path stream roboflow100
https bit ly 40yoj5z datasets integration a new documentation hub https docs deci ai super gradients documentation source welcome html integration with dagshub for experiment monitoring https bit ly 3alfukq support darknet yolo format detection dataset https bit ly 41vx6qu used by yolo v5 v6 v7 v8 segformer https bit ly 3oyu6jp model and recipe post training quantization and quantization aware training notebooks http bit ly 3krn6an check out sg full release notes https github com deci ai super gradients releases coming soon pre trained pose estimation model test time augmentations tta recipe to train dekr model convertible to trt key points rescoring for pose estimation lr finder data analysis tools table of content toc getting started getting started advanced features advanced features installation methods installation methods prerequisites prerequisites quick installation quick installation implemented model architectures implemented model architectures contributing contributing citation citation community community license license deci platform deci platform tocstop getting started start training with just 1 command line the simplest and most straightforward way to start training sota performance models with supergradients reproducible recipes just define your dataset path and where you want your checkpoints to be saved and you are good to go from your terminal just make sure that you set up your dataset https github com deci ai super gradients blob master src super gradients training datasets dataset setup instructions md according to the data dir specified in the recipe bash python m super gradients train from recipe config name imagenet regnety architecture regnety800 dataset interface data dir your imagenet local path ckpt root dir checkpoint directory quickly load pre trained weights for your desired model with sota performance want to try our pre trained models on your machine import supergradients initialize your trainer and load your desired architecture and
pre trained weights from our sota model zoo http bit ly 41dkt89 python the pretrained weights argument will load a pre trained architecture on the provided dataset import super gradients model models get model name pretrained weights pretrained model name classification transfer learning table class tfo notebook buttons align left td width 500 a target blank href https bit ly 3xziutb img src documentation assets sg img colab logo png classification transfer learning a td td width 200 a target blank href https bit ly 3xwyen1 img src documentation assets sg img github logo png github source a td table br br semantic segmentation quick start table class tfo notebook buttons align left td width 500 a target blank href https bit ly 3qkx9m8 img src documentation assets sg img colab logo png segmentation quick start a td table br br transfer learning table class tfo notebook buttons align left td width 500 a target blank href https bit ly 3qkwmbe img src documentation assets sg img colab logo png segmentation transfer learning a td table br br how to connect custom dataset table class tfo notebook buttons align left td width 500 a target blank href https bit ly 3qqbvjp img src documentation assets sg img colab logo png segmentation how to connect custom dataset a td table br br pose estimation transfer learning table class tfo notebook buttons align left td width 500 a target blank href https colab research google com drive 1nmgzx8ndycizqnrlzkjzrioqyj0mfzje scrollto 3uzjqtehg0on img src documentation assets sg img colab logo png pose estimation transfer learning a td table br br object detection transfer learning table class tfo notebook buttons align left td width 500 a target blank href https bit ly 3skmohx img src documentation assets sg img colab logo png detection transfer learning a td table br br how to connect custom dataset table class tfo notebook buttons align left td width 500 a target blank href https bit ly 3dqdlg3 img src documentation assets sg img colab 
logo png detection how to connect custom dataset a td table br br how to predict using pre trained model segmentation detection and classification prediction table class tfo notebook buttons align left td width 500 a target blank href https bit ly 3f4mssd img src documentation assets sg img colab logo png how to predict using pre trained model a td table br br advanced features post training quantization and quantization aware training quantization involves representing weights and biases in lower precision resulting in reduced memory and computational requirements making it useful for deploying models on devices with limited resources the process can be done during training called quantization aware training or after training called post training quantization a full tutorial can be found here http bit ly 41hc8ui table class tfo notebook buttons align left td width 500 a target blank href http bit ly 3krn6an img src documentation assets sg img colab logo png post training quantization and quantization aware training a td table quantization aware training yolonas on custom dataset this tutorial provides a comprehensive guide on how to fine tune a yolonas model using a custom dataset it also demonstrates how to utilize sg s qat quantization aware training support additionally it offers step by step instructions on deploying the model and performing benchmarking table class tfo notebook buttons align left td width 500 a target blank href https bit ly 3mikdty img src documentation assets sg img colab logo png quantization aware training yolonas on custom dataset a td table knowledge distillation training knowledge distillation is a training technique that uses a large model teacher model to improve the performance of a smaller model the student model learn more about supergradients knowledge distillation training with our pre trained beit base teacher model and resnet18 student model on cifar10 example notebook on google colab for an easy to use tutorial using free gpu 
hardware table class tfo notebook buttons align left td width 500 a target blank href https bit ly 3bla5or img src documentation assets sg img colab logo png knowledge distillation training a td table br br recipes to train a model it is necessary to configure 4 main components these components are aggregated into a single main recipe yaml file that inherits the aforementioned dataset architecture training and checkpoint params it is also possible and recommended for flexibility to override default settings with custom ones all recipes can be found here http bit ly 3gflw07 br recipes support out of the box every model metric or loss that is implemented in supergradients but you can easily extend this to any custom object that you need by registering it check out this http bit ly 3tq4izb tutorial for more information table class tfo notebook buttons align left td width 500 a target blank href https bit ly 3uiy5ab img src documentation assets sg img colab logo png how to use recipes a td table br br br details markdown 1 summary h3 using distributed data parallel ddp h3 summary why use ddp recent deep learning models are growing larger and larger to an extent that training on a single gpu can take weeks in order to train models in a timely fashion it is necessary to train them with multiple gpus using 100s of gpus can reduce training time of a model from a week to less than an hour how does it work each gpu has its own process which controls a copy of the model and which loads its own mini batch from disk and sends it to its gpu during training after the forward pass is completed on every gpu the gradient is reduced across all gpus so that every gpu holds the same gradient locally this keeps the model weights synchronized across all gpus after the backward pass how to use it you can use supergradients to train your model with ddp in just a few lines main py python from super gradients import init trainer trainer from super gradients common import
multigpumode from super gradients training utils distributed training utils import setup device initialize the environment init trainer launch ddp on 4 gpus setup device multi gpu multigpumode distributed data parallel num gpus 4 call the trainer trainer experiment name everything you do below will run on 4 gpus trainer train finally you can launch your distributed training with a simple python call bash python main py please note that if you work with torch 1 9 0 deprecated you will have to launch your training with either torch distributed launch or torchrun in which case nproc per node will overwrite the value set with gpu mode bash python m torch distributed launch nproc per node 4 main py bash torchrun nproc per node 4 main py calling functions on a single node it is often in ddp training that we want to execute code on the master rank i e rank 0 in sg users usually execute their own code by triggering phase callbacks see using phase callbacks section below one can make sure the desired code will only be run on rank 0 using ddp silent mode or the multi process safe decorator for example consider the simple phase callback below that uploads the first 3 images of every batch during training to tensorboard python from super gradients training utils callbacks import phasecallback phasecontext phase from super gradients common environment env helpers import multi process safe class upload3trainimagescalbback phasecallback def init self super init phase phase train batch end multi process safe def call self context phasecontext batch imgs context inputs cpu detach numpy tag batch str context batch idx images context sg logger add images tag tag images batch imgs 3 global step context epoch the multi process safe decorator ensures that the callback will only be triggered by rank 0 alternatively this can also be done by the sg trainer boolean attribute which the phase context has access to ddp silent mode which is set to false iff the current process rank is zero
even after the process group has been killed python from super gradients training utils callbacks import phasecallback phasecontext phase class upload3trainimagescalbback phasecallback def init self super init phase phase train batch end def call self context phasecontext if not context ddp silent mode batch imgs context inputs cpu detach numpy tag batch str context batch idx images context sg logger add images tag tag images batch imgs 3 global step context epoch note that ddp silent mode can be accessed through sgtrainer ddp silent mode hence it can be used in scripts after calling sgtrainer train when some part of it should be run on rank 0 only good to know your total batch size will be number of gpus x batch size so you might want to increase your learning rate there is no clear rule but a rule of thumb seems to be to linearly increase the learning rate with the number of gpus https arxiv org pdf 1706 02677 pdf details details markdown 1 summary h3 easily change architectures parameters h3 summary python from super gradients training import models instantiate default pretrained resnet18 default resnet18 models get model name resnet18 num classes 100 pretrained weights imagenet instantiate pretrained resnet18 turning droppath on with probability 0 5 droppath resnet18 models get model name resnet18 arch params droppath prob 0 5 num classes 100 pretrained weights imagenet instantiate pretrained resnet18 without classifier head output will be from the last stage before global pooling backbone resnet18 models get model name resnet18 arch params backbone mode true pretrained weights imagenet details details markdown 1 summary h3 using phase callbacks h3 summary python from super gradients import trainer from torch optim lr scheduler import reducelronplateau from super gradients training utils callbacks import phase lrschedulercallback from super gradients training metrics classification metrics import accuracy define pytorch train and validation loaders and
optimizer define what to be called in the callback rop lr scheduler reducelronplateau optimizer mode max patience 10 verbose true define phase callbacks they will fire as defined in phase phase callbacks lrschedulercallback scheduler rop lr scheduler phase phase validation epoch end metric name accuracy create a trainer object see the declaration for more parameters trainer trainer experiment name define phase callbacks as part of the training parameters train params phase callbacks phase callbacks details details markdown 1 summary h3 integration to dagshub h3 summary open in colab https colab research google com assets colab badge svg https colab research google com drive 11fw56pmpwomhqsbqw6xxmryvw1mec t usp sharing python from super gradients import trainer trainer trainer experiment name model training params your training params sg logger dagshub sg logger dagshub logger see class super gradients common sg loggers dagshub sg logger dagshubsglogger for details sg logger params params that will be passed to init of the logger super gradients common sg loggers dagshub sg logger dagshubsglogger dagshub repository repo owner repo name optional your dagshub project name consisting of the owner name followed by and the repo name if this is left empty you ll be prompted in your run to fill it in manually log mlflow only false optional change to true to bypass logging to dvc and log all artifacts only to mlflow save checkpoints remote true save tensorboard remote true save logs remote true details details summary h3 integration to weights and biases h3 summary python from super gradients import trainer create a trainer object see the declaration for more parameters trainer trainer experiment name train params training parameters sg logger wandb sg logger weights biases logger see class wandbsglogger for details sg logger params parameters that will be passed to init of the logger project name project name w b project name save checkpoints remote true save
tensorboard remote true save logs remote true details details markdown 1 summary h3 integration to clearml h3 summary python from super gradients import trainer create a trainer object see the declaration for more parameters trainer trainer experiment name train params training parameters sg logger clearml sg logger clearml logger see class clearmlsglogger for details sg logger params parameters that will be passed to init of the logger project name project name clearml project name save checkpoints remote true save tensorboard remote true save logs remote true details installation methods prerequisites details markdown 1 summary general requirements summary python 3 7 3 8 or 3 9 installed 1 9 0 torch 1 14 https pytorch org get started locally the python packages that are specified in requirements txt details details markdown 1 summary to train on nvidia gpus summary nvidia cuda toolkit 11 2 https developer nvidia com cuda 11 2 0 download archive target os linux target arch x86 64 target distro ubuntu cudnn 8 1 x nvidia driver with cuda 11 2 support 460 x details quick installation details markdown 1 summary install stable version using pypi summary see in pypi https pypi org project super gradients bash pip install super gradients that s it details details markdown 1 summary install using github summary bash pip install git https github com deci ai super gradients git stable details implemented model architectures all computer vision models pretrained checkpoints can be found in the model zoo http bit ly 41dkt89 image classification densenet densely connected convolutional networks https github com deci ai super gradients blob master src super gradients training models classification models densenet py dpn https github com deci ai super gradients blob master src super gradients training models classification models dpn py efficientnet https github com deci ai super gradients blob master src super gradients training models classification models efficientnet py
lenet https github com deci ai super gradients blob master src super gradients training models classification models lenet py mobilenet https github com deci ai super gradients blob master src super gradients training models classification models mobilenet py mobilenet v2 https github com deci ai super gradients blob master src super gradients training models classification models mobilenetv2 py mobilenet v3 https github com deci ai super gradients blob master src super gradients training models classification models mobilenetv3 py pnasnet https github com deci ai super gradients blob master src super gradients training models classification models pnasnet py pre activation resnet https github com deci ai super gradients blob master src super gradients training models classification models preact resnet py regnet https github com deci ai super gradients blob master src super gradients training models classification models regnet py repvgg https github com deci ai super gradients blob master src super gradients training models classification models repvgg py resnet https github com deci ai super gradients blob master src super gradients training models classification models resnet py resnext https github com deci ai super gradients blob master src super gradients training models classification models resnext py senet https github com deci ai super gradients blob master src super gradients training models classification models senet py shufflenet https github com deci ai super gradients blob master src super gradients training models classification models shufflenet py shufflenet v2 https github com deci ai super gradients blob master src super gradients training models classification models shufflenetv2 py vgg https github com deci ai super gradients blob master src super gradients training models classification models vgg py semantic segmentation pp liteseg https bit ly 3rrtmmo ddrnet deep dual resolution networks https github com deci ai super gradients blob 
master src super gradients training models segmentation models ddrnet py laddernet https github com deci ai super gradients blob master src super gradients training models segmentation models laddernet py regseg https github com deci ai super gradients blob master src super gradients training models segmentation models regseg py shelfnet https github com deci ai super gradients blob master src super gradients training models segmentation models shelfnet py stdc https github com deci ai super gradients blob master src super gradients training models segmentation models stdc py object detection csp darknet https github com deci ai super gradients blob master src super gradients training models detection models csp darknet53 py darknet 53 https github com deci ai super gradients blob master src super gradients training models detection models darknet53 py ssd single shot detector https github com deci ai super gradients blob master src super gradients training models detection models ssd py yolox https github com deci ai super gradients blob master src super gradients training models detection models yolox py pose estimation dekr w32 no dc https github com deci ai super gradients blob master src super gradients training models pose estimation models dekr hrnet py implemented datasets deci provides implementation for various datasets if you need to download any of the dataset you can find instructions https github com deci ai super gradients blob master src super gradients training datasets dataset setup instructions md image classification cifar10 https github com deci ai super gradients blob master src super gradients training datasets classification datasets cifar py imagenet https github com deci ai super gradients blob master src super gradients training datasets classification datasets imagenet dataset py semantic segmentation cityscapes https github com deci ai super gradients blob master src super gradients training datasets segmentation datasets cityscape 
segmentation py coco https github com deci ai super gradients blob master src super gradients training datasets segmentation datasets coco segmentation py pascalvoc 2012 pascalaug 2012 https github com deci ai super gradients blob master src super gradients training datasets segmentation datasets pascal voc segmentation py superviselypersons https github com deci ai super gradients blob master src super gradients training datasets segmentation datasets supervisely persons segmentation py mapillary vistas dataset https github com deci ai super gradients blob master src super gradients training datasets segmentation datasets mapillary dataset py object detection coco https github com deci ai super gradients blob master src super gradients training datasets detection datasets coco detection py pascalvoc 2007 2012 https github com deci ai super gradients blob master src super gradients training datasets detection datasets pascal voc detection py pose estimation coco https github com deci ai super gradients blob cadcfdd64e7808d21cccddbfaeb26acb8267699b src super gradients recipes dataset params coco pose estimation dekr dataset params yaml documentation check supergradients docs https docs deci ai super gradients documentation source welcome html for full documentation user guide and examples contributing to learn about making a contribution to supergradients please see our contribution page contributing md our awesome contributors a href https github com deci ai super gradients graphs contributors img src https contrib rocks image repo deci ai super gradients a br made with contrib rocks https contrib rocks citation if you are using the supergradients library or benchmarks in your research please cite supergradients deep learning training library community if you want to be a part of supergradients growing community hear about all the exciting news and updates need help request advanced features or want to file a bug or issue report we would love to welcome you aboard
discord is the place to be and ask questions about supergradients and get support click here to join our discord community https discord gg 2v6cegmren to report a bug file an issue https github com deci ai super gradients issues on github join the sg newsletter https www supergradients com newsletter for staying up to date with new features and models important announcements and upcoming events for a short meeting with us use this link https calendly com ofer baratz deci 15min and choose your preferred time license this project is released under the apache 2 0 license license citing bibtex bibtex misc supergradients doi 10 5281 zenodo 7789328 url https zenodo org record 7789328 author aharon shay and louis dupont and ofri masad and yurkova kate and lotem fridman and lkdci and khvedchenya eugene and rubin ran and bagrov natan and tymchenko borys and keren tomer and zhilko alexander and eran deci title super gradients publisher github journal github repository year 2021 latest doi doi https zenodo org badge doi 10 5281 zenodo 7789328 svg https doi org 10 5281 zenodo 7789328 deci platform deci platform is our end to end platform for building optimizing and deploying deep learning models to production request free trial https bit ly 3qo3icq to enjoy immediate improvement in throughput latency memory footprint and model size features automatically compile and quantize your models with just a few clicks tensorrt openvino gain up to 10x improvement in throughput latency memory and model size easily benchmark your models performance on different hardware and batch sizes invite co workers to collaborate on models and communicate your progress deci supports all common frameworks and hardware from intel cpus to nvidia s gpus and jetsons request free trial here https bit ly 3qo3icq | deep-learning neural-network pretrained-weights pretrained-models pytorch object-detection image-classification computer-vision semantic-segmentation imagenet transfer-learning | ai |
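The super-gradients notes above describe knowledge distillation as training a small student model to mimic a large teacher. A minimal pure-Python sketch of the usual temperature-softened objective is given below; it is an illustration of the general technique only, not SuperGradients' internal implementation, and all logits are made-up values.

```python
import math

# Hedged sketch of the knowledge-distillation objective: soften the teacher's
# logits with a temperature, then penalize the student for diverging from
# that soft target distribution (KL divergence).

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; higher temperature gives softer targets."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)                      # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=4.0):
    """KL(teacher || student) on temperature-softened distributions."""
    p = softmax(teacher_logits, temperature)   # soft targets from the teacher
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

teacher = [8.0, 2.0, 1.0]    # confident teacher prediction (illustrative)
aligned = [7.5, 2.5, 1.0]    # student close to the teacher
opposed = [1.0, 8.0, 2.0]    # student disagreeing with the teacher

loss_aligned = distillation_loss(teacher, aligned)
loss_opposed = distillation_loss(teacher, opposed)
# the closer the student's distribution is to the teacher's, the lower the loss
```

In practice this soft-target term is mixed with the ordinary cross-entropy on the true labels, with the temperature and mixing weight treated as hyperparameters.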
TextModels.jl | textmodels a julia package for natural language neural network models https github com juliatext textmodels jl actions workflows ci yml badge svg https github com juliatext textmodels jl actions workflows ci yml https img shields io badge docs stable blue svg https juliahub com docs textmodels warning the models in this repo are no longer state of the art the field has moved on very quickly see transformers jl https github com chengchingwen transformers jl for more modern methods introduction the textmodels package enhances the textanalysis package with end user focussed practical natural language models typically based on neural networks in this case flux https fluxml ai please see the documentation https juliahub com docs textmodels for more license mit license https github com juliatext textanalysis jl blob master license md installation julia pkg add textmodels some of the models require data files to run which are downloaded on demand therefore internet access is required at runtime for certain functionality in this package contributing and reporting bugs contributions in the form of bug reports pull requests additional documentation are encouraged they can be made to the github repository all contributions and communications should abide by the julia community standards https julialang org community standards support feel free to ask for help on the julia discourse forum https discourse julialang org or in the natural language channel on julia slack https julialang slack com which you can join here https slackinvite julialang org you can also raise issues in this repository to request new features and or improvements to the documentation and codebase | ai |
|
wdsdu | web development setup debian ubuntu wdsdu overview an installation script and instructions for debian ubuntu web development environment deployment requirements in order to run your own copy of the project one must fulfill the following requirements supported operating systems debian based linux https en wikipedia org wiki list of linux distributions debian based core dependencies bash 5 https www gnu org software bash the script can be used to deploy a debian ubuntu web development environment which includes the following guake http guake project org tmux https en wikipedia org wiki tmux freerdp x11 client https github com freerdp freerdp tree master client x11 chromium browser https www chromium org google chrome stable https www google com chrome index html git https git scm com docker https www docker com kubernetes minikube https minikube sigs k8s io docs kubernetes kubectl https kubernetes io docs reference kubectl kubectl helm https helm sh nodejs v16 build essential latest npm https nodejs org en global npm dependencies angular cli https cli angular io bazel bazelisk https www npmjs com package bazel bazelisk compodoc compodoc https compodoc app nestjs cli https docs nestjs com ngxs cli https www ngxs io plugins cli nrwl cli https cli angular io asar https www npmjs com package asar clang format https github com angular clang format commitizen https www npmjs com package commitizen corepack https www npmjs com package corepack cz conventional changelog https github com commitizen cz conventional changelog firebase tools https firebase google com docs cli grpcc https www npmjs com package grpcc madge https www npmjs com package madge npm check updates https github com tjunnone npm check updates svgo https github com svg svgo typescript https www typescriptlang org yarn https yarnpkg com flutter https flutter dev vscode https code visualstudio com documentation references docker docker https docs docker com docker engine command line reference https docs
docker com engine reference commandline docker docker configure and use docker https docs docker com engine reference commandline docker flutter flutter https flutter dev docs firebase firebase web getting started https firebase google com docs web setup firebase web api reference https firebase google com docs reference js firebase web codelabs https codelabs developers google com codelabs firebase web 0 kubernetes kubernetes https kubernetes io docs home kubernetes minikube https minikube sigs k8s io docs kubernetes kubectl https kubernetes io docs tasks tools kubectl | web-development-tools web-development-environment guake tmux chromium-browser google-chrome-stable git docker nodejs angular-cli typescript installation-script installation-automation installation-help firebase-tools debian ubuntu flutter vscode kubernetes | front_end |
HoneyBadgerMPC | # HoneyBadgerMPC

[Travis CI (dev branch)](https://travis-ci.org/initc3/honeybadgermpc) · [Codecov (dev branch)](https://codecov.io/github/initc3/honeybadgermpc)

HoneyBadgerMPC is a robust MPC-based confidentiality layer for blockchains.

Compared to other blockchain privacy techniques like zero-knowledge proofs, the main appeal of HoneyBadgerMPC is that it is much more flexible: MPC can be used to write arbitrary smart contracts that compute on secret data, while providing availability, integrity, and confidentiality guarantees. While there are many MPC implementations, HoneyBadgerMPC is uniquely suited for blockchain integration because of its focus on robustness; it is the first MPC toolkit to provide guaranteed output in spite of Byzantine faults.

## How to use HoneyBadgerMPC

HoneyBadgerMPC is a research prototype, and is best used for prototyping, proofs of concept, and benchmarking experiments. As a library, HoneyBadgerMPC provides a Python-based programming environment for writing custom MPC programs. The programs can be run in several ways: for testing and development, they can run in a single-process simulated network; to test the entire protocol implementation, including socket communications, they can run in a Docker network with multiple containers; for distributed benchmarks, HoneyBadgerMPC can also be deployed to different cloud datacenters.

To try out HoneyBadgerMPC, we recommend following the instructions in docs/development/getting-started.rst ("Managing your development environment with Docker Compose") to set up the Docker-based development environment, then checking out apps/tutorial for a walkthrough that explains some sample MPC programs and shows how to run them in different modes.

## How it works

Secure multiparty computation (MPC) is about computing on secret-shared data. For each piece of confidential data x, each of the n server nodes stores a different share [x]. Any t of the servers can be compromised without revealing any information about the confidential data. However, as long as n - t parties are running correctly, they can work together to perform arbitrary computations and disclose only the outputs, leaking no additional information about the inputs. The HoneyBadgerMPC protocol is based on known MPC techniques, carefully selected to achieve the robustness goals. HoneyBadgerMPC consists of three main phases:

1. **Collecting client inputs.** Clients submit input to the servers through an input-masking technique [COPS15]. The servers start with a secret sharing of a random mask [r]. A client retrieves shares of this mask from the servers and reconstructs r, then publishes their masked message (m + r). The servers obtain their share of the input as [m] = (m + r) - [r]. Our programming model is mainly inspired by VIFF; in fact, our codebase began as a rewrite of VIFF, porting it to Python 3/asyncio rather than Python 2/Twisted. The programming model is based on Python mixins, which extend a Share that represents a pipelined MPC computation. To reach agreement on the published inputs, we make use of either the built-in asynchronous broadcast protocol (a port of HoneyBadgerBFT) or else an external blockchain service (see the Fabric integration and Ethereum integration).

2. **Online phase.** Once clients provide input, the protocol computes an MPC program layer by layer. An MPC program consists of linear operations on secret-shared data, as well as batch reconstruction. Linear operations can be computed locally; non-linear operations (like multiplication) must be emulated using share reconstruction and preprocessed values. See mpc.py, batch_reconstruction.py, taskprogramrunner.py, reed_solomon.py. Our share-reconstruction implementation is aggressively batched, and uses the NTL library (C++) for performance. The bottleneck operation in MPC is generally batch reconstruction of secret-shared values; here the batch operations are implemented in C++ using the NTL library and wrapped using Cython. We implement both FFT-based (quasilinear) and matrix-based (superlinear) approaches. (TODO: link to benchmarks about FFT vs matrix multiplication.)

3. **Offline phase: generating preprocessing ingredients.** Since the online phase consumes preprocessed random values, we need to generate these values ahead of time. For a continuously running service, we need to generate such values at the same rate we consume them. We use a linear-overhead method, RanDouSha [BH08]; see offline_randousha.py.

For more detail on the HoneyBadgerMPC components, see docs/subprotocols.rst.

## Comparison with other MPC implementations

Compared to other [MPC toolkit implementations](http://www.multipartycomputation.com/mpc-software) ([awesome-mpc software](https://github.com/rdragos/awesome-mpc)), HoneyBadgerMPC is unique in that it focuses on robustness. In a network of n server nodes, assuming at most t < n/3 are compromised, HoneyBadgerMPC provides confidentiality, integrity, and availability guarantees. In MPC terminology, it is asynchronous, provides active security, has linear communication overhead, and guarantees output delivery. Other MPC toolkits, such as SCALE-MAMBA, VIFF, EMP, SPDZ, and others, do not provide guaranteed output delivery, and so if even a single node crashes, they stop providing output at all. Linear communication overhead is about scaling to large network sizes: HoneyBadgerMPC implements aggressive batching and amortization, so as more server nodes are added, the communication cost per server approaches a constant.

## Comparison with zero-knowledge proofs

So far, one of the main ways to provide privacy for blockchains is to use commitments and zero-knowledge proofs. As an alternative, the main advantage of MPC is that it can provide better availability guarantees. With commitments, there is usually a single party (the prover) who knows the witness. This becomes a problem when writing general smart contract applications like auctions or mixers: if the prover aborts the protocol, then the committed data is inaccessible. In MPC, the secret data is stored as secret shares on the server nodes, so if fewer than 1/3 of the server nodes fail, the data is still available.

## HoneyBadgerMPC applications

**AsynchroMix.** As an illustration of the use of HoneyBadgerMPC, we include a mixnet application called AsynchroMix, which provides an anonymous communication service. Compared to alternative mixnet protocols such as CoinJoin and PathShuffle, the main benefit of AsynchroMix is its robustness: any t of the servers can fail and the system will still produce guaranteed output. (AsynchroMix explanation is coming soon. TODO.)

**MPC-friendly cryptography.** HoneyBadgerMPC is designed to be a useful starting point for other research prototypes involving cryptography and MPC. We parameterize HoneyBadgerMPC with a finite field based on the BLS12-381 pairing-friendly elliptic curve (the same one used in Zcash Sapling). The codebase comes with a Python wrapper for the Rust-based implementation of BLS12-381 (see betterpairing.py). This is used to provide constant-size polynomial commitments (see hbavss.py). HoneyBadgerMPC also includes an MPC implementation of MiMC symmetric cryptography (honeybadgermpc/progs/mimc.py) and the Jubjub elliptic curve for public-key cryptography (honeybadgermpc/progs/jubjub.py).

**Blockchain integration.** HoneyBadgerMPC is designed to be linked up with an external transparent blockchain, so it can provide a privacy-preserving extension. The following integration ideas have been explored so far: a built-in asynchronous BFT protocol (HoneyBadgerBFT); an example web3/Ethereum (Ropsten) integration in the AsynchroMix application (apps/asynchromix/asynchromix.py), run with `python apps/asynchromix/asynchromix.py`; and (coming soon) an integration with Hyperledger Fabric.

## More documentation

[Library documentation](docs): the HoneyBadgerMPC library documentation is under the docs/ directory.

## [Contributing to HoneyBadgerMPC](CONTRIBUTING.md)

This is an open project. We welcome contributions from others! See [CONTRIBUTING.md](CONTRIBUTING.md) to get started.

## Acknowledgements

See [CHANGELOG.md](CHANGELOG.md) for credits to individual contributors. Work on HoneyBadgerMPC has been funded in part by grants from the [Initiative for Cryptocurrencies and Contracts (IC3)](https://www.initc3.org), the [Center for Cognitive Systems and Research](https://c3sr.com), the National Science Foundation ([1801321](https://www.nsf.gov/awardsearch/showaward?awd_id=1801321), [1943499](https://nsf.gov/awardsearch/showaward?awd_id=1943499)), and the [Cyber Resilient Energy Delivery Consortium (CREDC)](https://iti.illinois.edu/research/energy-systems/cyber-resilient-energy-delivery-consortium-credc). | blockchain
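The input-masking flow described above ([m] = (m + r) - [r]) can be illustrated with a toy additive secret sharing over a prime field. This is a minimal sketch for intuition only, not HoneyBadgerMPC's actual API: the real system uses Shamir sharing, asyncio networking, and NTL-backed batch reconstruction, and the modulus here is an illustrative choice rather than the BLS12-381 field.

```python
import random

P = 2**61 - 1  # a prime modulus (toy choice for illustration)

def share(value, n):
    """Additively secret-share `value` among n servers: shares sum to value mod P."""
    shares = [random.randrange(P) for _ in range(n - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def reconstruct(shares):
    return sum(shares) % P

n = 4
r = random.randrange(P)        # random mask, pre-shared among the servers
r_shares = share(r, n)

# Client: reconstructs r from the servers' shares, then publishes m + r.
m = 42
masked = (m + reconstruct(r_shares)) % P

# Each server i locally derives its share of m from the public masked value:
# one designated server adds `masked`, every server subtracts its mask share.
m_shares = [((masked if i == 0 else 0) - r_shares[i]) % P for i in range(n)]

assert reconstruct(m_shares) == m  # servers jointly hold [m] without learning m
```

The design point is that `masked` is the only value ever made public, and it is uniformly random because r is, so observers learn nothing about m.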
React-Netflix-Clone | <h1 align="center">Netflix Clone, built using React.js & Firebase</h1>

This is a clone of the Netflix website built using **React.js** as the front end and **Firebase** as the back end. It's not a replica and it doesn't have all the features of the Netflix website; it's a similar version of Netflix with my own design touch, showing my abilities in React.js to build something advanced like Netflix. It contains the home page, sign-in page, sign-up page, browse page, and movie player.

**Take a look at the live version here:** https://react-netflix-clone-beta.vercel.app :octocat: :heart_eyes:

## Table of contents

- Project walk-through (home page, sign-in page, sign-up page, browse page)
- Live demo
- Technology used
- How to use
- Show your support
- Acknowledgments
- License

## Project walk-through

### Home page

(Screenshots: public/images/readme-5.jpg through readme-10.jpg)

The home page consists of 5 main sections:

1. **Header**, which includes: the logo (redirects you to the home page when you click on it), a sign-in button (redirects you to the sign-in page), and the feature title & subtitle (shows the main sentences of the website).
2. **OptForm**: a text input field and a button; it redirects you to the sign-up page once you click on the button.
3. **Jumbotron**: contains some images with paragraphs beside them, showing the advantages of Netflix. The data of this jumbotron comes from the jumbo.json file.
4. **Frequently asked questions**: contains the FAQs in the form of an accordion. When you click anywhere in the gray area of a question, the answer appears below it; you can close the answer by clicking again on the same gray area. The data of these FAQs comes from the faqs.json file.
5. **Footer**: contains useful links users may need.

The page is fully responsive on all mobile devices, even the small ones.

### Sign-in page

(Screenshots: readme-11.jpg, readme-12.jpg)

The sign-in page consists of 3 main sections:

1. **Header**, which includes the logo (redirects you to the home page when you click on it).
2. **Sign-in form**, which includes: an email address input field, a password input field, and a sign-in button with validation (if any field in the form is empty the button is disabled; if the form fields have data it is active and sends the data to the Firebase backend for authentication, with an error-handling function), plus a link that redirects you to the sign-up page.
3. **Footer**: contains useful links users may need.

The page is fully responsive on all mobile devices, even the small ones.

### Sign-up page

(Screenshots: readme-13.jpg, readme-12.jpg)

The sign-up page consists of 3 main sections:

1. **Header**, which includes the logo (redirects you to the home page when you click on it).
2. **Sign-up form**, which includes: a first name input field, an email address input field, a password input field, and a sign-up button with the same validation behavior (disabled when any field is empty; otherwise it sends the data to the Firebase backend for registration, with an error-handling function), plus a link that redirects you to the sign-in page.
3. **Footer**: contains useful links users may need.

The page is fully responsive on all mobile devices, even the small ones.

### Browse page

(Screenshots: readme-1.jpg through readme-4.jpg)

The browse page consists of 5 main sections:

1. **Header**, which includes: the logo (redirects you to the home page whenever you click it); category links that show the movies of a specific category when clicked (for example, clicking the Films link makes it active and the browse page shows only films; clicking the Series link makes it active and the page shows only series); the featured movie title & description; and a play button that shows the video player.
2. **Movies slides**: slides showing the movies grouped by genre. The genres and all movie information are retrieved from the Firebase database.
3. **Movie card**: an image representing the movie; when you hover over it, it becomes bigger, and it shows its card feature if you click on it.
4. **Card feature**: another section that appears under the movie slide when you click any movie card. It contains more information about the movie, such as the title, description, a special background representing the movie, and a play button that shows the video player. You can close the card feature by clicking on the close icon in its top-right corner.
5. **Video player**: a video player with full controls that appears in the middle of the screen when you click any play button. You have to click the play icon after it shows up, because it doesn't have an autoplay option currently. When the video player shows up, the whole screen becomes an overlay with only the video in the middle, and the player moves with you as you scroll up and down. The video player should show the video of the movie you clicked on, but for this project's purpose it shows only one video as a sample for all movies. You can close the video player at any time by clicking anywhere else on the screen. (Screenshot: readme-14.jpg)
6. **Footer**: contains useful links users may need.

The page is fully responsive on all mobile devices, even the small ones.

## Live demo

Take a look at the live version here: https://react-netflix-clone-red.vercel.app :octocat: :heart_eyes:

## Technology used

I built this project using the following tools and techniques: React.js, React Router, React forms, React hooks (useState, useContext, useEffect, useHistory), compound components, JSX, CSS modules, Firebase, VSCode, Stylelint, ESLint, GitHub Actions, GitHub Pages.

## How to use

To be able to use this React app locally in a development environment, you will need the following:

1. [Git](https://git-scm.com) and [Node.js](https://nodejs.org/en/download) installed on your computer.
2. An account on [Firebase](https://firebase.com), with a project on your Firebase account dedicated to this Netflix project.
3. The seed.js file (included in this repo) to seed your Firebase backend with movie information, or your own seed file with your information if you prefer.
4. Then, from your terminal:

```cmd
# Clone this repository
git clone https://github.com/ahmedtohamy01/React-Netflix-Clone
# Go into the repository
cd React-Netflix-Clone
# Install dependencies
npm install
```

5. Then you need to create the src/lib/firebase.prod.js file in your local repo. Its content will be like the following:

```js
import firebase from 'firebase/app';
import 'firebase/firestore';
import 'firebase/auth';
// 1) When seeding the database you'll have to uncomment this!
// import { seedDatabase } from '../seed';

const config = {
  apiKey: '',
  authDomain: '',
  databaseURL: '',
  projectId: '',
  storageBucket: '',
  messagingSenderId: '',
  appId: '',
};

const firebase = firebase.initializeApp(config);
// 2) When seeding the database you'll have to uncomment this!
// seedDatabase(firebase);
// 3) Once you have populated the database (only run once!),
//    re-comment it so you don't get duplicate data.

export { firebase };
```

6. Then use your Firebase project information to fill in the config fields (apiKey, authDomain, databaseURL, projectId, storageBucket, messagingSenderId, appId) in firebase.prod.js.
7. Then seed your Firebase database with the information in the seed.js file: (1) uncomment the `import { seedDatabase }` line and the `seedDatabase(firebase)` call; (2) save firebase.prod.js; (3) wait ~2 minutes and check your Firebase database; if you find the data there, re-comment those two lines. If you don't re-comment them after the seeding process, you will get duplicated data in your Firebase database.
8. After seeding your Firebase database with the movie information (and reverting the GitHub Pages changes), you can run the app:

```cmd
# Run the app
npm start
```

9. Now you can see the project in your browser, as in the live demo link. Happy hacking!

## Show your support

Give a ⭐ if you like this project!

## Acknowledgments

Hat tip to everyone who helped me learn the techniques used in building this project.

## License

MIT License | react reactjs react-router react-hooks react-components compound-components firebase firestore react-forms jsx | front_end
Cloud-Management-Engineering-Ecommerce | IS458 Cloud Management and Engineering group project. | cloud
IoT | # IoT README

This repository contains code and other resources for the Azure end-to-end IoT proof of concept found at [this website](http://www.azurelaunchpad.net). Instructions on setting up the IoT example in your own environment can be found [on the wiki](https://github.com/remixed123/IoT/wiki).

- **azure** folder contents: C# example code for the web-based examples.
- **launchpads** folder contents: cc3200-azure, a Code Composer Studio project with code written in C for the CC3200; msp432-azure, a Code Composer Studio project with code written in C for the MSP432 + CC3100.
- **misc** folder contents: architectural diagram.
- **security** folder contents: SasTokenGenerate, a Windows Forms application project written in C# that will generate a SAS token; root certificate authority for Azure services. | server
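The SAS token that SasTokenGenerate produces follows Azure's shared-access-signature scheme: HMAC-SHA256 over the URL-encoded resource URI plus an expiry timestamp, signed with the (base64-encoded) access key. A sketch of that scheme in Python is below; the hub name and key are placeholders, and this is an illustration of the format rather than the repo's C# tool.

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

def generate_sas_token(uri, key_name, key_b64, ttl_seconds=3600):
    """Build an Azure-style SAS token string; `key_b64` is the base64 access key."""
    expiry = int(time.time()) + ttl_seconds
    encoded_uri = urllib.parse.quote_plus(uri)
    to_sign = f"{encoded_uri}\n{expiry}".encode("utf-8")
    sig = base64.b64encode(
        hmac.new(base64.b64decode(key_b64), to_sign, hashlib.sha256).digest()
    ).decode("utf-8")
    return (f"SharedAccessSignature sr={encoded_uri}"
            f"&sig={urllib.parse.quote_plus(sig)}&se={expiry}&skn={key_name}")

# Placeholder hub name and key, for illustration only.
token = generate_sas_token("myhub.azure-devices.net", "device",
                           base64.b64encode(b"secret-key").decode("utf-8"))
print(token[:30])
```

The token is then sent by the device in its authorization header; it is valid until the `se` timestamp, after which a fresh one must be generated.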
blockchain-anomaly-detection | # Blockchain Anomaly Detection

## Description

Anomaly detection has been a well-studied area for a long time. Its applications in the financial sector have aided in identifying suspicious activities of hackers. However, with advancements in the financial domain such as blockchain and artificial intelligence, it is more challenging to deceive financial systems. Despite these technological advancements, many fraudulent cases have still emerged. Many artificial intelligence techniques have been proposed to deal with the anomaly detection problem; some results appear to be considerably assuring, but there is no explicit superior solution. This repo is for research that aims to bridge the gap between artificial intelligence and blockchain by pursuing various anomaly detection techniques on transactional network data of a public financial blockchain (Bitcoin).

This repository is an implementation for research that presents anomaly detection in the light of blockchain technology and its applications in the financial sector. It extracts the transactional data of the Bitcoin blockchain and analyses it for malicious transactions using unsupervised machine learning techniques. A range of algorithms, such as Isolation Forest, Histogram-Based Outlier Detection (HBOS), Cluster-Based Local Outlier Factor (CBLOF), Principal Component Analysis (PCA), K-means, deep autoencoder networks, and an ensemble method, are evaluated and compared.

## Publication

URL: http://urn.fi/URN:NBN:fi:tuni-201912056592

## Data

IEEE DataPort:
- [Bitcoin transaction network metadata dataset (2011-2013)](https://ieee-dataport.org/open-access/bitcoin-transaction-network-metadata-2011-2013)
- [Bitcoin transactions dataset (2011-2013)](https://ieee-dataport.org/open-access/bitcoin-transactions-data-2011-2013)
- [Bitcoin hacks/frauds dataset (2010-2013)](https://ieee-dataport.org/open-access/bitcoin-hacked-transactions-2010-2013)

Kaggle:
- [Bitcoin hacked transactions 2010-2013](https://www.kaggle.com/omershafiq/bitcoin-hacks-2010to2013)
- [Bitcoin network transactional metadata 2011-2013](https://www.kaggle.com/omershafiq/bitcoin-network-transactional-metadata)

## Data license

This data is licensed under a [Creative Commons Attribution-ShareAlike 4.0 International License](http://creativecommons.org/licenses/by-sa/4.0). | blockchain
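One of the compared detectors, HBOS, has a particularly simple core idea: build a histogram per feature and score each point by the summed log-inverse densities of the bins it falls into (higher means more anomalous). The dependency-free sketch below illustrates that idea; the repo itself presumably relies on library implementations, and the bin count and toy "transaction" features here are illustrative, not the thesis data.

```python
import math

def hbos_scores(data, bins=5):
    """data: list of equal-length feature rows. Returns one HBOS-style score per row."""
    n_features = len(data[0])
    histograms = []
    for j in range(n_features):
        col = [row[j] for row in data]
        lo, hi = min(col), max(col)
        width = (hi - lo) / bins or 1.0  # guard against a constant feature
        counts = [0] * bins
        for v in col:
            counts[min(int((v - lo) / width), bins - 1)] += 1
        densities = [c / len(col) for c in counts]
        histograms.append((lo, width, densities))
    scores = []
    for row in data:
        s = 0.0
        for j, (lo, width, dens) in enumerate(histograms):
            idx = min(int((row[j] - lo) / width), bins - 1)
            d = dens[idx] or 1e-9  # avoid log(0) for empty bins
            s += math.log(1.0 / d)
        scores.append(s)
    return scores

# Toy "transactions": (amount, fan-out); the last row is an obvious outlier.
txs = [[1.0, 2], [1.2, 3], [0.9, 2], [1.1, 2], [1.0, 3], [50.0, 40]]
scores = hbos_scores(txs)
print(scores.index(max(scores)))  # 5: the anomalous transaction gets the top score
```

Because each feature is scored independently, HBOS is fast on wide transaction tables, which is one reason it is a common baseline alongside Isolation Forest.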
carbon-icons-svelte | carbon icons svelte npm npm npm url github https img shields io github license ibm carbon icons svelte color 262626 style for the badge npm downloads to date https img shields io npm dt carbon icons svelte color 262626 style for the badge carbon design system https github com carbon design system svg icons as svelte components this zero dependency icon library builds carbon design system icons https www carbondesignsystem com guidelines icons library as svelte components although best paired with carbon components svelte https github com ibm carbon components svelte these icons can be consumed standalone try it in the svelte repl https svelte dev repl 931e6a3461434622adad0557579c0a29 preview https carbon icons svelte onrender com icon index icon index md installation install carbon icons svelte as a development dependency sh yarn yarn add d carbon icons svelte npm npm i d carbon icons svelte pnpm pnpm i d carbon icons svelte usage basic import the icon from the carbon icons svelte lib folder see the icon index icon index md for a list of supported icons svelte script import add from carbon icons svelte lib add svelte script add custom size use the size prop to specify the icon size supported icon sizes include 16 20 24 and 32 the default size is 16 svelte add size 16 add size 20 add size 24 add size 32 custom props restprops are forwarded to the svg element you can use fill to customize the color or pass any other valid svg attribute to the component svelte add fill red class icon labelled svelte add aria label add labelled icon that is focusable svelte add aria label add tabindex 0 labelled by svelte label id add file add file label add aria labelledby add file api props all props are optional name type default value size code 16 124 20 124 24 124 32 code 16 title string undefined changelog changelog md contributing contributing md deploying the icon preview is deployed to render https render com as a static site see render yaml render yaml 
for details deploy to render https render com images deploy to render button svg https render com deploy repo https github com carbon design system carbon icons svelte license apache 2 0 license npm https img shields io npm v carbon icons svelte svg color 262626 style for the badge npm url https npmjs com package carbon icons svelte | ibm icons carbon carbon-design-system svelte components svg svelte-components svg-icons typescript-definitions | os |
hotel-management | # Hotel Management

This is a project for the Databases class in the NTUA Electrical and Computer Engineering department. The project was graded with a perfect score (100/100).

## Contributors (listed alphabetically)

1. Elina Syrri ([elinasyr](https://github.com/elinasyr))
2. George Papadoulis ([g-papad](https://github.com/g-papad))
3. Nick Bellos ([nickbel7](https://github.com/nickbel7))

## Tools used

Python v3.7, Flask v2.0.1, pypyodbc v1.3.4, SQL Server v2019 (see requirements.txt).

## ER diagram / Relational model

See the diagrams folder (ERD and relational diagram images).

## Installation

1. First, make sure you have SQL Server 2019 Express installed on your computer ([download page](https://www.microsoft.com/en-us/download/details.aspx?id=101064)).
2. Connect to the server through a DBMS (preferably Microsoft SQL Management Studio) with sa (system administrator) credentials, and run the following SQL scripts inside the DBMS in this specific order:
3. create_tables.sql, to create the database and the tables.
4. create_indexes.sql, to create the indexes.
5. create_views_1.sql and create_views_2.sql, to create the required views.

### Insert mock data in the database

6. Insert the data from the Excel file (mock_data/hotelmanagement_v2.xlsx) through the import/export wizard of Microsoft Management Studio. **Attention:** insert the data table by table, strictly in the following order, enabling the "identity insert" option in "edit mappings" for each table: Reservations, HotelServices, HotelLocations, Doors, HotelRooms, ReservationCustomers, ReservationServices, ReservationRooms, DoorAccessLog. Or, directly insert the backup (.bak) file of the database with all the data, located at db_backup/hotelmanagement_v2.bak (Databases → right click → Restore Database).

### Download and run the web app

7. Run:

```bash
git clone https://github.com/nickbel7/hotel-management.git
cd hotel-management
```

9. Add your database credentials (preferably use the sa user, to have all privileges) at the top of the project/app.py file:

```bash
sql_user = ""
sql_password = ""
sql_server_name = ""
sql_database_name = "hotelmanagement"
```

10. Run the following to download all required libraries:

```bash
pip install -r requirements.txt
```

11. Run the following to enter the project folder and start the web server:

```bash
cd project
python -m flask run
```

12. Open your browser and type http://127.0.0.1:5000 to preview the website.

## SQL queries

Here we show all the queries (project/queries.sql) used in the site, at each page. Find the questions for the queries attached in the docs PDF. A YouTube video (in Greek) explains how to use our web application and what queries are used on each page: https://youtu.be/qy2ix3ab5gi | lambda ece ntua databases | server
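The credentials from step 9 end up in an ODBC connection string that pypyodbc passes to SQL Server. A small sketch of assembling such a string is below; the `build_conn_str` helper and the driver name are illustrative assumptions (check which ODBC driver is actually installed on your machine), not code from this repo.

```python
def build_conn_str(server, database, user, password,
                   driver="ODBC Driver 17 for SQL Server"):
    """Assemble an ODBC connection string for pypyodbc/pyodbc (hypothetical helper)."""
    return (f"DRIVER={{{driver}}};SERVER={server};DATABASE={database};"
            f"UID={user};PWD={password}")

conn_str = build_conn_str("localhost", "hotelmanagement", "sa", "secret")
print(conn_str)

# Usage (requires a reachable SQL Server instance):
# import pypyodbc
# conn = pypyodbc.connect(conn_str)
# cursor = conn.cursor()
```

Keeping the string assembly in one place makes it easy to swap the four credential values at the top of app.py without touching the rest of the code.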
DAQ-System | DAQ System: embedded systems design, firmware source code. | os
actionview-fe | # ActionView front end

Framework: React.js + Redux · License: Apache-2.0

The React.js/Redux front end for [ActionView](https://github.com/lxerxa/actionview). Demo: https://actionview.cn

## How to install

Requires node >= 4.2.6 and npm >= 3.5.2.

```sh
git clone https://github.com/lxerxa/actionview-fe.git
cd actionview-fe
npm install
```

Then apply the following node_modules fixes:

1. node_modules/react-image-lightbox: in react-image-lightbox.js, line 830, close the class.
2. node_modules/dropzone: copy react-dropzone-component/node_modules/dropzone into node_modules.
3. node_modules/cropperjs: copy react-cropper/node_modules/cropperjs into node_modules.

For development, run `npm run dev` and open http://localhost:3002. To build, run `npm run build` and `sh deploy.sh`.

## Contributing

Report ActionView bugs and feature requests on the [issue board](https://github.com/lxerxa/actionview/issues), or email actionview@126.com.

## License

ActionView is open-sourced software licensed under the [Apache License, Version 2.0](https://www.apache.org/licenses/license-2.0). | jira kanban kanban-board workflow react redux | front_end
Identifying-Gentrification-Factors | # Identifying Gentrification Factors

Neighborhoods in big cities are in a constant state of change due to the movement of people and capital. This trend affects certain demographic groups that are forced to leave due to rising housing prices, increasing cost of living, and cultural segregation. The goal is to consolidate data to identify factors of gentrification and create indices to detect early signs, so that city planners and public officials can take necessary measures and mitigate the effects.

1. Scrape data from the American Community Survey, building permits, and Zillow to collect data on demographic indicators, transportation accessibility and use, and the housing market; plan the data pipeline. (datapipeline.png)
2. Clean the data from all 3 sources using SAS, selecting relevant variables for gentrification purposes; the available selected features at block-group level come to around 900.
3. Extract, transform, and load the data into a PostgreSQL database located on GCP cloud. Create an ER data model and store the data using automated Python scripts and SQL scripts. (etl.png)
4. Create a data warehouse dimensional model for analysis. (analysis.png)
5. Deploy Tableau Server on top of GCP for reporting purposes, fitting statistical models and producing actionable insights through data visualization tools. (Chicago dashboard: Bronzeville gentrification risk. Los Angeles dashboard.)
6. Summarize the areas at a high risk of gentrification so city planners can take necessary measures to mitigate gentrification effects:
   - a. Gentrification areas identified in the results: Chicago (Bronzeville, Little Italy) and LA (Boyle Heights, Arlington Heights, Koreatown).
   - b. Combined influencing factors into a gentrification index: the Gini index among those factors, income, rent, and the difference between the actual and asking price of a housing unit.
   - c. City planners can take necessary actions to mitigate these effects based on the indices calculated for all block groups that indicate early signs of gentrification. | cloud
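Point 6b folds a Gini index into the gentrification index. One standard way to compute a Gini coefficient over block-group values (incomes, rents, and so on) is the sorted, mean-normalized form below; this is a sketch of the general formula, not necessarily the project's exact computation.

```python
def gini(values):
    """Gini coefficient of non-negative numbers: 0 = perfect equality, ->1 = concentrated."""
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    if n == 0 or total == 0:
        return 0.0
    # G = (2 * sum_i i*x_i) / (n * sum(x)) - (n + 1) / n, with x sorted and i = 1..n
    weighted = sum(i * x for i, x in enumerate(xs, start=1))
    return (2 * weighted) / (n * total) - (n + 1) / n

print(gini([1, 1, 1, 1]))    # 0.0: equal incomes across block groups
print(gini([0, 0, 0, 10]))   # 0.75: income highly concentrated in one group
```

A rising Gini across census releases for the same block group is exactly the kind of early signal the dashboards are meant to surface.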
Machine-Learning-Guide | <h1 align="center">Machine Learning Guide</h1>

A guide covering Machine Learning, including the applications, libraries, and tools that will make you better and more efficient with Machine Learning development.

Note: you can easily convert this markdown file to a PDF in [VSCode](https://code.visualstudio.com) using this handy extension: [Markdown PDF](https://marketplace.visualstudio.com/items?itemname=yzane.markdown-pdf).

(Image: Machine Learning / Deep Learning frameworks.)

## Table of Contents

1. Learning Resources for ML (Developer Resources, Courses & Certifications, Books)
2. ML Frameworks, Libraries, and Tools (Running Large Language Models (LLMs) Locally)
3. Algorithms
4. PyTorch Development
5. TensorFlow Development
6. Core ML Development
7. Deep Learning Development
8. Reinforcement Learning Development
9. Computer Vision Development
10. Natural Language Processing (NLP) Development
11. Bioinformatics
12. CUDA Development
13. MATLAB Development
14. C/C++ Development
15. Java Development
16. Python Development
17. Scala Development
18. R Development
19. Julia Development

## Learning Resources for ML

[Machine learning](https://www.ibm.com/cloud/learn/machine-learning) is a branch of artificial intelligence (AI) focused on building apps using algorithms that learn from data models and improve their accuracy over time, without needing to be programmed.

### Developer Resources

- [Natural Language Processing (NLP) Best Practices by Microsoft](https://github.com/microsoft/nlp-recipes)
- [The Autonomous Driving Cookbook by Microsoft](https://github.com/microsoft/autonomousdrivingcookbook)
- [Azure Machine Learning: ML as a Service | Microsoft Azure](https://azure.microsoft.com/en-us/services/machine-learning)
- [How to run Jupyter Notebooks in your Azure Machine Learning workspace](https://docs.microsoft.com/en-us/azure/machine-learning/how-to-run-jupyter-notebooks)
- [Machine Learning and Artificial Intelligence | Amazon Web Services](https://aws.amazon.com/machine-learning)
- [Scheduling Jupyter notebooks on Amazon SageMaker ephemeral instances](https://aws.amazon.com/blogs/machine-learning/scheduling-jupyter-notebooks-on-sagemaker-ephemeral-instances)
- [AI & Machine Learning | Google Cloud](https://cloud.google.com/products/ai)
- [Using Jupyter Notebooks with Apache Spark on Google Cloud](https://cloud.google.com/blog/products/gcp/google-cloud-platform-for-data-scientists-using-jupyter-notebooks-with-apache-spark-on-google-cloud)
- [Machine Learning | Apple Developer](https://developer.apple.com/machine-learning)
- [Artificial Intelligence & Autopilot | Tesla](https://www.tesla.com/ai)
- [Meta AI Tools | Facebook](https://ai.facebook.com/tools)
- [PyTorch Tutorials](https://pytorch.org/tutorials)
- [TensorFlow Tutorials](https://www.tensorflow.org/tutorials)
- [JupyterLab](https://jupyterlab.readthedocs.io)
- [Stable Diffusion with Core ML on Apple Silicon](https://machinelearning.apple.com/research/stable-diffusion-coreml-apple-silicon)

### Courses & Certifications

- [Machine Learning by Stanford University (Andrew Ng) | Coursera](https://www.coursera.org/learn/machine-learning)
- [AWS Training and Certification for Machine Learning (ML) Courses](https://aws.amazon.com/training/learning-paths/machine-learning)
- [Machine Learning Scholarship Program for Microsoft Azure | Udacity](https://www.udacity.com/scholarships/machine-learning-scholarship-microsoft-azure)
- [Microsoft Certified: Azure Data Scientist Associate](https://docs.microsoft.com/en-us/learn/certifications/azure-data-scientist)
- Microsoft Certified: Azure AI Engineer Associate: https://docs.microsoft.com/en-us
learn certifications azure ai engineer azure machine learning training and deployment https docs microsoft com en us azure devops pipelines targets azure machine learning learning machine learning and artificial intelligence from google cloud training https cloud google com training machinelearning ai machine learning crash course for google cloud https developers google com machine learning crash course machine learning courses online udemy https www udemy com topic machine learning machine learning courses online coursera https www coursera org courses query machine 20learning learn machine learning with online courses and classes edx https www edx org learn machine learning books back to the top https github com mikeroyal machine learning guide table of contents introduction to machine learning pdf https ai stanford edu nilsson mlbook pdf artificial intelligence a modern approach by stuart j russel and peter norvig https www amazon com artificial intelligence a modern approach dp 0134610997 ref sr 1 1 dchild 1 keywords artificial intelligence a modern approach qid 1626728093 sr 8 1 deep learning by ian goodfellow yoshoua bengio and aaron courville https www deeplearningbook org the hundred page machine learning book by andriy burkov https themlbook com wiki doku php hundred page machine learning book on github https github com aburkov themlbook machine learning by tom m mitchell https www cs cmu edu tom newchapters html programming collective intelligence building smart web 2 0 applications by toby segaran https www amazon com programming collective intelligence building applications dp 0596529325 ref sr 1 1 crid 8ei42xmxesgb keywords programming collective intelligence 3a building smart web 2 0 applications qid 1654318595 sprefix programming collective intelligence building smart web 2 0 applications 2caps 2c194 sr 8 1 machine learning an algorithmic perspective second edition https www amazon com machine learning algorithmic perspective recognition dp 
1466583282 ref sr 1 8 crid 2riq8ommass3 keywords pattern recognition and machine learning qid 1654318681 sprefix pattern recognition and machine learning 2caps 2c184 sr 8 8 pattern recognition and machine learning by christopher m bishop https www amazon com pattern recognition learning information statistics dp 1493938436 ref sr 1 4 crid 2riq8ommass3 keywords pattern recognition and machine learning qid 1654318681 sprefix pattern recognition and machine learning 2caps 2c184 sr 8 4 natural language processing with python by steven bird ewan klein and edward loper https www amazon com natural language processing python analyzing dp 0596516495 ref sr 1 1 crid o4xscf3cnibn keywords natural language processing with python qid 1654318757 sprefix natural language processing with python 2caps 2c285 sr 8 1 python machine learning a technical approach to machine learning for beginners by leonard eddison https www amazon com python machine learning technical beginners dp 1986340872 ref sr 1 1 crid 1w5x2wv05gdqk keywords python machine learning 3a a technical approach to machine learning for beginners qid 1654318782 sprefix python machine learning a technical approach to machine learning for beginners 2caps 2c212 sr 8 1 bayesian reasoning and machine learning by david barber https www amazon com bayesian reasoning machine learning barber dp 0521518148 ref sr 1 1 crid 1j054t5mucd20 keywords bayesian reasoning and machine learning qid 1654318807 sprefix bayesian reasoning and machine learning 2caps 2c179 sr 8 1 machine learning for absolute beginners a plain english introduction by oliver theobald https www amazon com machine learning absolute beginners introduction ebook dp b08rwbskqb ref sr 1 1 crid 1jbs4kehty6i5 keywords machine learning for absolute beginners 3a a plain english introduction qid 1654318861 sprefix machine learning for absolute beginners a plain english introduction 2caps 2c168 sr 8 1 machine learning in action by ben wilson https www amazon com machine 
learning engineering action wilson dp 1617298719 ref sr 1 1 crid 6s9f2mjhaqx1 keywords machine learning in action qid 1654318897 sprefix machine learning in action 2caps 2c174 sr 8 1 hands on machine learning with scikit learn keras and tensorflow concepts tools and techniques to build intelligent systems by aur lien g ron https www amazon com hands machine learning scikit learn tensorflow dp 1492032646 ref sr 1 6 crid 2riq8ommass3 keywords pattern recognition and machine learning qid 1654318681 sprefix pattern recognition and machine learning 2caps 2c184 sr 8 6 introduction to machine learning with python a guide for data scientists by andreas c m ller sarah guido https www amazon com introduction machine learning python scientists dp 1449369413 ref sr 1 1 crid 3sgfhbbu06gb6 keywords introduction to machine learning with python 3a a guide for data scientists qid 1654318969 sprefix introduction to machine learning with python a guide for data scientists 2caps 2c181 sr 8 1 machine learning for hackers case studies and algorithms to get you started by drew conway and john myles white https www amazon com machine learning hackers studies algorithms dp 1449303714 ref sr 1 1 crid 2pqabq4t9b8k5 keywords machine learning for hackers 3a case studies and algorithms to get you started qid 1654318629 sprefix machine learning for hackers case studies and algorithms to get you started 2caps 2c162 sr 8 1 the elements of statistical learning data mining inference and prediction by trevor hastie robert tibshirani and jerome friedman https www amazon com elements statistical learning prediction statistics dp 0387848576 ref sr 1 1 crid 1hok9m9gfhtk9 keywords the elements of statistical learning 3a data mining 2c inference 2c and prediction qid 1654318661 sprefix the elements of statistical learning data mining 2c inference 2c and prediction 2caps 2c215 sr 8 1 distributed machine learning patterns https github com terrytangyuan distributed ml patterns book free to read online code 
real world machine learning https www manning com books real world machine learning free chapters an introduction to statistical learning https www bcf usc edu gareth isl book r code elements of statistical learning https web stanford edu hastie elemstatlearn book think bayes https greenteapress com wp think bayes book python code mining massive datasets https infolab stanford edu ullman mmds book pdf a first encounter with machine learning https www ics uci edu welling teaching 273aspring10 intromlbook pdf introduction to machine learning https alex smola org drafts thebook pdf alex smola and s v n vishwanathan a probabilistic theory of pattern recognition https www szit bme hu gyorfi pbook pdf introduction to information retrieval https nlp stanford edu ir book pdf irbookprint pdf forecasting principles and practice https otexts com fpp2 introduction to machine learning https arxiv org pdf 0904 3664v1 pdf amnon shashua reinforcement learning https www intechopen com books reinforcement learning machine learning https www intechopen com books machine learning a quest for ai https ai stanford edu nilsson qai qai pdf r programming for data science https leanpub com rprogramming data mining practical machine learning tools and techniques https cdn preterhuman net texts science and technology artificial intelligence data 20mining 20practical 20machine 20learning 20tools 20and 20techniques 202d 20ed 20 20morgan 20kaufmann pdf machine learning with tensorflow https www manning com books machine learning with tensorflow machine learning systems https www manning com books machine learning systems foundations of machine learning https cs nyu edu mohri mlbook mehryar mohri afshin rostamizadeh and ameet talwalkar ai powered search https www manning com books ai powered search trey grainger doug turnbull max irwin ensemble methods for machine learning https www manning com books ensemble methods for machine learning gautam kunapuli machine learning engineering in action 
https www manning com books machine learning engineering in action ben wilson privacy preserving machine learning https www manning com books privacy preserving machine learning j morris chang di zhuang g dumindu samaraweera automated machine learning in action https www manning com books automated machine learning in action qingquan song haifeng jin and xia hu distributed machine learning patterns https www manning com books distributed machine learning patterns yuan tang managing machine learning projects from design to deployment https www manning com books managing machine learning projects simon thompson causal machine learning https www manning com books causal machine learning robert ness bayesian optimization in action https www manning com books bayesian optimization in action quan nguyen machine learning algorithms in depth https www manning com books machine learning algorithms in depth vadim smolyakov optimization algorithms https www manning com books optimization algorithms alaa khamis practical gradient boosting https www amazon com dp b0bl1hrd6z by guillaume saupin ml frameworks libraries and tools back to the top https github com mikeroyal machine learning guide table of contents tensorflow https www tensorflow org is an end to end open source platform for machine learning it has a comprehensive flexible ecosystem of tools libraries and community resources that lets researchers push the state of the art in ml and developers easily build and deploy ml powered applications keras https keras io is a high level neural networks api written in python developed with a focus on enabling fast experimentation it is capable of running on top of tensorflow microsoft cognitive toolkit cntk theano or plaidml pytorch https pytorch org is an open source machine learning framework based on the torch library that accelerates the path from research prototyping to production deployment primarily developed by facebook s ai research lab amazon sagemaker https
aws amazon com sagemaker is a fully managed service that provides every developer and data scientist with the ability to build train and deploy machine learning ml models quickly sagemaker removes the heavy lifting from each step of the machine learning process to make it easier to develop high quality models azure databricks https azure microsoft com en us services databricks is a fast and collaborative apache spark based big data analytics service designed for data science and data engineering azure databricks sets up your apache spark environment in minutes autoscale and collaborate on shared projects in an interactive workspace azure databricks supports python scala r java and sql as well as data science frameworks and libraries including tensorflow pytorch and scikit learn microsoft cognitive toolkit cntk https docs microsoft com en us cognitive toolkit is an open source toolkit for commercial grade distributed deep learning it describes neural networks as a series of computational steps via a directed graph cntk allows the user to easily realize and combine popular model types such as feed forward dnns convolutional neural networks cnns and recurrent neural networks rnns lstms cntk implements stochastic gradient descent sgd error backpropagation learning with automatic differentiation and parallelization across multiple gpus and servers apple coreml https developer apple com documentation coreml is a framework that helps integrate machine learning models into your app core ml provides a unified representation for all models your app uses core ml apis and user data to make predictions and to train or fine tune models all on the user s device a model is the result of applying a machine learning algorithm to a set of training data you use a model to make predictions based on new input data apache opennlp https opennlp apache org is an open source library for a machine learning based toolkit used in the processing of natural language text it features an api for 
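The sgd-with-error-backpropagation training loop that the cntk entry above describes can be sketched in a few lines of numpy; this is a hypothetical toy example (xor with one hidden layer), not cntk code:

```python
import numpy as np

# hypothetical toy example: a tiny network trained on xor with
# manually derived backpropagation and a plain full-batch sgd update.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

first_loss = None
for step in range(5000):
    h = np.tanh(X @ W1 + b1)                 # forward pass
    out = sigmoid(h @ W2 + b2)
    loss = float(np.mean((out - y) ** 2))
    if first_loss is None:
        first_loss = loss
    # backward pass: chain rule, i.e. error backpropagation
    d_out = 2.0 * (out - y) / len(X) * out * (1.0 - out)
    d_W2 = h.T @ d_out; d_b2 = d_out.sum(0)
    d_h = (d_out @ W2.T) * (1.0 - h ** 2)
    d_W1 = X.T @ d_h; d_b1 = d_h.sum(0)
    # sgd parameter update
    W1 -= lr * d_W1; b1 -= lr * d_b1
    W2 -= lr * d_W2; b2 -= lr * d_b2
```

Frameworks like cntk, keras and pytorch automate exactly the backward-pass bookkeeping done by hand here, via automatic differentiation.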
use cases like named entity recognition https en wikipedia org wiki named entity recognition sentence detection pos part of speech tagging https en wikipedia org wiki part of speech tagging tokenization https en wikipedia org wiki tokenization data security feature extraction https en wikipedia org wiki feature extraction chunking https en wikipedia org wiki chunking psychology parsing https en wikipedia org wiki parsing and coreference resolution https en wikipedia org wiki coreference apache airflow https airflow apache org is an open source workflow management platform created by the community to programmatically author schedule and monitor workflows airflow has a modular architecture and uses a message queue to orchestrate an arbitrary number of workers so it is ready to scale to infinity open neural network exchange onnx https github com onnx is an open ecosystem that empowers ai developers to choose the right tools as their project evolves onnx provides an open source format for ai models both deep learning and traditional ml it defines an extensible computation graph model as well as definitions of built in operators and standard data types apache mxnet https mxnet apache org is a deep learning framework designed for both efficiency and flexibility it allows you to mix symbolic and imperative programming to maximize efficiency and productivity at its core mxnet contains a dynamic dependency scheduler that automatically parallelizes both symbolic and imperative operations on the fly a graph optimization layer on top of that makes symbolic execution fast and memory efficient mxnet is portable and lightweight scaling effectively to multiple gpus and multiple machines with support for python r julia scala go javascript and more autogluon https autogluon mxnet io index html is a toolkit for deep learning that automates machine learning tasks enabling you to easily achieve strong predictive performance in your applications with just a few
lines of code you can train and deploy high accuracy deep learning models on tabular image and text data anaconda https www anaconda com is a very popular data science platform for machine learning and deep learning that enables users to develop models train them and deploy them plaidml https github com plaidml plaidml is an advanced and portable tensor compiler for enabling deep learning on laptops embedded devices or other devices where the available computing hardware is not well supported or the available software stack contains unpalatable license restrictions opencv https opencv org is a highly optimized library with focus on real time computer vision applications the c python and java interfaces support linux macos windows ios and android scikit learn https scikit learn org stable index html is a python module for machine learning built on top of scipy numpy and matplotlib making it easier to apply robust and simple implementations of many popular machine learning algorithms weka https www cs waikato ac nz ml weka is an open source machine learning software that can be accessed through a graphical user interface standard terminal applications or a java api it is widely used for teaching research and industrial applications contains a plethora of built in tools for standard machine learning tasks and additionally gives transparent access to well known toolboxes such as scikit learn r and deeplearning4j caffe https github com bvlc caffe is a deep learning framework made with expression speed and modularity in mind it is developed by berkeley ai research bair the berkeley vision and learning center bvlc and community contributors theano https github com theano theano is a python library that allows you to define optimize and evaluate mathematical expressions involving multi dimensional arrays efficiently including tight integration with numpy ngraph https github com nervanasystems ngraph is an open source c library compiler and runtime for deep learning the 
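The fit/predict estimator pattern that scikit-learn (described above) is built around can be illustrated without any dependency beyond numpy; this is a hypothetical 1-nearest-neighbor classifier whose class name and data are made up for illustration:

```python
import numpy as np

# hypothetical sketch of the scikit-learn style estimator api:
# fit() learns from data (and returns self), predict() labels new points.
class OneNearestNeighbor:
    def fit(self, X, y):
        self.X_ = np.asarray(X, dtype=float)
        self.y_ = np.asarray(y)
        return self  # scikit-learn estimators conventionally return self

    def predict(self, X):
        X = np.asarray(X, dtype=float)
        # pairwise squared distances between queries and training points
        d = ((X[:, None, :] - self.X_[None, :, :]) ** 2).sum(-1)
        return self.y_[d.argmin(axis=1)]

clf = OneNearestNeighbor().fit([[0, 0], [0, 1], [5, 5]], [0, 0, 1])
pred = clf.predict([[0.2, 0.1], [4.8, 5.1]])
# nearest training points are [0, 0] and [5, 5], so labels 0 and 1
```

Because so many libraries in this list (cuml, mlpack bindings, xgboost) mirror this same interface, estimators are often drop-in replaceable.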
ngraph compiler aims to accelerate developing ai workloads using any deep learning framework and deploying to a variety of hardware targets it provides the freedom performance and ease of use to ai developers nvidia cudnn https developer nvidia com cudnn is a gpu accelerated library of primitives for deep neural networks https developer nvidia com deep learning cudnn provides highly tuned implementations for standard routines such as forward and backward convolution pooling normalization and activation layers cudnn accelerates widely used deep learning frameworks including caffe2 https caffe2 ai chainer https chainer org keras https keras io matlab https www mathworks com solutions deep learning html mxnet https mxnet incubator apache org pytorch https pytorch org and tensorflow https www tensorflow org huginn https github com huginn huginn is a self hosted system for building agents that perform automated tasks for you online it can read the web watch for events and take actions on your behalf huginn s agents create and consume events propagating them along a directed graph think of it as a hackable version of ifttt or zapier on your own server netron https netron app is a viewer for neural network deep learning and machine learning models it supports onnx tensorflow lite caffe keras darknet paddlepaddle ncnn mnn core ml rknn mxnet mindspore lite tnn barracuda tengine cntk tensorflow js caffe2 and uff dopamine https github com google dopamine is a research framework for fast prototyping of reinforcement learning algorithms dali https github com nvidia dali is a gpu accelerated library containing highly optimized building blocks and an execution engine for data processing to accelerate deep learning training and inference applications mindspore lite https github com mindspore ai mindspore is a new open source deep learning training inference framework that could be used for mobile edge and cloud scenarios darknet https github com pjreddie darknet is an open source 
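The forward convolution and pooling routines that cudnn (above) ships highly tuned gpu kernels for can be written naively in numpy; a hypothetical single-channel sketch:

```python
import numpy as np

# hypothetical naive versions of two primitives cudnn accelerates:
# a valid 2-d cross-correlation ("convolution" in dl usage) and max pooling.
def conv2d(img, kernel):
    kh, kw = kernel.shape
    H, W = img.shape
    out = np.empty((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (img[i:i + kh, j:j + kw] * kernel).sum()
    return out

def max_pool2d(x, size=2):
    H, W = x.shape
    x = x[:H - H % size, :W - W % size]       # crop to a multiple of size
    return x.reshape(H // size, size, W // size, size).max(axis=(1, 3))

img = np.arange(16, dtype=float).reshape(4, 4)
edge = np.array([[1.0, -1.0]])                # simple horizontal difference filter
feat = conv2d(img, edge)                      # shape (4, 3)
pooled = max_pool2d(feat)                     # shape (2, 1)
```

Real libraries replace these python loops with fused, hardware-specific kernels; the semantics are the same.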
neural network framework written in c and cuda it is fast easy to install and supports cpu and gpu computation paddlepaddle https github com paddlepaddle paddle is an easy to use efficient flexible and scalable deep learning platform originally developed by baidu scientists and engineers for the purpose of applying deep learning to many products at baidu google notebooklm https blog google technology ai notebooklm google ai is an experimental ai tool using the power of language models paired with your existing content to gain critical insights faster similar to a virtual research assistant that can summarize facts explain complex ideas and brainstorm new connections based on the sources you select unilm https github com microsoft unilm provides large scale self supervised pre training across tasks languages and modalities semantic kernel sk https aka ms semantic kernel is a lightweight sdk enabling integration of ai large language models llms with conventional programming languages the sk extensible programming model combines natural language semantic functions traditional code native functions and embeddings based memory unlocking new potential and adding value to applications with ai pandas ai https github com gventuri pandas ai is a python library that integrates generative artificial intelligence capabilities into pandas making dataframes conversational ncnn https github com tencent ncnn is a high performance neural network inference framework optimized for the mobile platform mnn https github com alibaba mnn is a blazing fast lightweight deep learning framework battle tested by business critical use cases in alibaba mediapipe https mediapipe dev is a framework optimized for end to end ml performance on a wide array of platforms abstracting away the complexities of making on device ml customizable production ready and accessible across platforms megengine https github com megengine is a fast scalable and user
friendly deep learning framework whose key features include a unified framework for both training and inference ml net https dot net ml is a machine learning library that is designed as an extensible platform so that you can consume other popular ml frameworks tensorflow onnx infer net and more and have access to even more machine learning scenarios like image classification object detection and more ludwig https ludwig ai is a declarative machine learning framework https ludwig ai github io ludwig docs latest user guide what is ludwig why declarative machine learning systems that makes it easy to define machine learning pipelines using a simple and flexible data driven configuration system mmdnn https github com microsoft mmdnn is a comprehensive and cross framework tool to convert visualize and diagnose deep learning dl models the mm stands for model management and dnn is the acronym of deep neural network it converts models between caffe keras mxnet tensorflow cntk pytorch onnx and coreml horovod https github com horovod horovod is a distributed deep learning training framework for tensorflow keras pytorch and apache mxnet vaex https vaex io is a high performance python library for lazy out of core dataframes similar to pandas to visualize and explore big tabular datasets gluonts https ts gluon ai is a python package for probabilistic time series modeling focusing on deep learning based models based on pytorch https pytorch org and mxnet https mxnet apache org mindsdb http mindsdb com is an ml sql server that enables machine learning workflows for the most powerful databases and data warehouses using sql jupyter notebook https jupyter org is an open source web application that allows you to create and share documents that contain live code equations visualizations and narrative text jupyter is used widely in industries that do data cleaning and transformation numerical simulation statistical modeling data visualization data science and machine learning apache spark https spark
apache org is a unified analytics engine for large scale data processing it provides high level apis in scala java python and r and an optimized engine that supports general computation graphs for data analysis it also supports a rich set of higher level tools including spark sql for sql and dataframes mllib for machine learning graphx for graph processing and structured streaming for stream processing apache spark connector for sql server and azure sql https github com microsoft sql spark connector is a high performance connector that enables you to use transactional data in big data analytics and persists results for ad hoc queries or reporting the connector allows you to use any sql database on premises or in the cloud as an input data source or output data sink for spark jobs apache predictionio https predictionio apache org is an open source machine learning framework for developers data scientists and end users it supports event collection deployment of algorithms evaluation querying predictive results via rest apis it is based on scalable open source services like hadoop hbase and other dbs elasticsearch spark and implements what is called a lambda architecture cluster manager for apache kafka cmak https github com yahoo cmak is a tool for managing apache kafka https kafka apache org clusters bigdl https bigdl project github io is a distributed deep learning library for apache spark with bigdl users can write their deep learning applications as standard spark programs which can directly run on top of existing spark or hadoop clusters eclipse deeplearning4j dl4j https deeplearning4j konduit ai is a set of projects intended to support all the needs of a jvm based scala kotlin clojure and groovy deep learning application this means starting with the raw data loading and preprocessing it from wherever and whatever format it is in to building and tuning a wide variety of simple and complex deep learning networks tensorman https github com pop os tensorman is a 
utility for easy management of tensorflow containers developed by system76 https system76 com tensorman allows tensorflow to operate in an isolated environment that is contained from the rest of the system this virtual environment can operate independently of the base system allowing you to use any version of tensorflow on any version of a linux distribution that supports the docker runtime numba https github com numba numba is an open source numpy aware optimizing compiler for python sponsored by anaconda inc it uses the llvm compiler project to generate machine code from python syntax numba can compile a large subset of numerically focused python including many numpy functions additionally numba has support for automatic parallelization of loops generation of gpu accelerated code and creation of ufuncs and c callbacks chainer https chainer org is a python based deep learning framework aiming at flexibility it provides automatic differentiation apis based on the define by run approach dynamic computational graphs as well as object oriented high level apis to build and train neural networks it also supports cuda cudnn using cupy https github com cupy cupy for high performance training and inference xgboost https xgboost readthedocs io is an optimized distributed gradient boosting library designed to be highly efficient flexible and portable it implements machine learning algorithms under the gradient boosting framework xgboost provides a parallel tree boosting also known as gbdt gbm that solves many data science problems in a fast and accurate way it supports distributed training on multiple machines including aws gce azure and yarn clusters and it can be integrated with flink spark and other cloud dataflow systems cuml https github com rapidsai cuml is a suite of libraries that implement machine learning algorithms and mathematical primitives functions that share compatible apis with other rapids projects cuml enables data scientists researchers and software
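The gradient-boosting loop that xgboost implements at scale can be sketched in plain python; this is a hypothetical 1-d regression example using decision stumps and shrinkage, not xgboost's actual algorithm (which adds second-order gradients, regularization and parallel tree construction):

```python
# hypothetical sketch of gradient boosting for squared error:
# each round fits a weak learner (a decision stump) to the residuals
# of the current ensemble, then adds a shrunken copy of it.
X = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
y = [1.2, 1.1, 1.3, 4.0, 4.2, 3.9]

def fit_stump(X, resid):
    # exhaustively pick the split minimizing squared error on residuals
    best = None
    for split in X:
        left = [r for x, r in zip(X, resid) if x <= split]
        right = [r for x, r in zip(X, resid) if x > split]
        if not left or not right:
            continue
        lmean, rmean = sum(left) / len(left), sum(right) / len(right)
        err = sum((r - (lmean if x <= split else rmean)) ** 2
                  for x, r in zip(X, resid))
        if best is None or err < best[0]:
            best = (err, split, lmean, rmean)
    _, s, lm, rm = best
    return lambda x: lm if x <= s else rm

pred = [sum(y) / len(y)] * len(X)   # start from the mean prediction
lr = 0.5                            # shrinkage / learning rate
stumps = []
for _ in range(20):
    resid = [yi - pi for yi, pi in zip(y, pred)]
    stump = fit_stump(X, resid)
    stumps.append(stump)
    pred = [p + lr * stump(x) for p, x in zip(pred, X)]

mse = sum((yi - pi) ** 2 for yi, pi in zip(y, pred)) / len(y)
```

For squared-error loss the residuals are (up to a constant) the negative gradients of the loss, which is where the name gradient boosting comes from.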
engineers to run traditional tabular ml tasks on gpus without going into the details of cuda programming in most cases cuml s python api matches the api from scikit learn emu https calebwin github io emu is a gpgpu library for rust with a focus on portability modularity and performance it s a cuda esque compute specific abstraction over webgpu providing specific functionality to make webgpu feel more like cuda scalene https github com plasma umass scalene is a high performance cpu gpu and memory profiler for python that does a number of things that other python profilers do not and cannot do it runs orders of magnitude faster than many other profilers while delivering far more detailed information mlpack https mlpack org is a fast flexible c++ machine learning library written in c++ and built on the armadillo https arma sourceforge net linear algebra library the ensmallen https ensmallen org numerical optimization library and parts of boost https boost org lightning https github com lightning ai lightning is a tool that builds and trains pytorch models and connects them to the ml lifecycle using lightning app templates without handling diy infrastructure cost management scaling etc opennn https www opennn net is an open source neural networks library for machine learning it contains sophisticated algorithms and utilities to deal with many artificial intelligence solutions h2o https h2o ai is an ai cloud platform that solves complex business problems and accelerates the discovery of new ideas with results you can understand and trust gensim https github com rare technologies gensim is a python library for topic modelling document indexing and similarity retrieval with large corpora its target audience is the natural
language processing nlp and information retrieval ir community llama cpp https github com ggerganov llama cpp is a port of facebook s llama model in c/c++ hmmlearn https github com hmmlearn hmmlearn is a set of algorithms for unsupervised learning and inference of hidden markov models https en wikipedia org wiki hidden markov model nextjournal https nextjournal com is a notebook for reproducible research it runs anything you can put into a docker container improve your workflow with polyglot notebooks automatic versioning and real time collaboration save time and money with on demand provisioning including gpu support ipython https ipython org provides a rich architecture for interactive computing with a powerful interactive shell a kernel for jupyter https jupyter org support for interactive data visualization and use of gui toolkits https ipython org ipython doc stable interactive reference html gui event loop support flexible embeddable https ipython org ipython doc stable interactive reference html embedding ipython interpreters to load into your own projects and easy to use high performance tools for parallel computing https ipyparallel readthedocs io en latest veles https github com samsung veles is a distributed platform for rapid deep learning application development currently developed by samsung dynet https github com clab dynet is a neural network library developed by carnegie mellon university and many others it is written in c++ with bindings in python and is designed to be efficient when run on either cpu or gpu and to work well with networks that have dynamic structures that change for every training instance these kinds of networks are particularly important in natural language processing tasks and dynet has been used to build state of the art systems for syntactic parsing machine translation morphological inflection and many other application areas ray https github com ray project ray is a unified framework for scaling ai and python applications it consists
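The define-by-run dynamic-graph idea behind chainer and dynet, mentioned above, can be sketched with a minimal hypothetical reverse-mode autodiff class: the computation graph is recorded as python executes, then walked backwards for gradients.

```python
# hypothetical minimal "define-by-run" sketch (not chainer/dynet code):
# each Var remembers its parents and the local derivative of each edge.
class Var:
    def __init__(self, value, parents=()):
        self.value, self.parents, self.grad = value, parents, 0.0

    def __add__(self, other):
        return Var(self.value + other.value,
                   [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def backward(self, seed=1.0):
        # accumulate the incoming gradient, then push it to parents
        self.grad += seed
        for parent, local in self.parents:
            parent.backward(seed * local)

x = Var(3.0)
y = Var(4.0)
z = x * y + x      # graph is built on the fly as this line runs
z.backward()
# dz/dx = y + 1 = 5.0 and dz/dy = x = 3.0
```

Because the graph is rebuilt on every run, control flow (loops, recursion, per-example structure) just works, which is why this style suits the dynamic nlp networks dynet targets.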
of a core distributed runtime and a toolkit of libraries ray air for accelerating ml workloads whisper cpp https github com ggerganov whisper cpp is a high performance inference of openai s whisper automatic speech recognition asr model chatgpt plus https openai com blog chatgpt plus is a pilot subscription plan 20 month for chatgpt a conversational ai that can chat with you answer follow up questions and challenge incorrect assumptions auto gpt https github com significant gravitas auto gpt is an ai agent that given a goal in natural language can attempt to achieve it by breaking it into sub tasks and using the internet and other tools in an automatic loop it uses openai s gpt 4 or gpt 3 5 apis and is among the first examples of an application using gpt 4 to perform autonomous tasks chatbot ui by mckaywrigley https github com mckaywrigley chatbot ui is an advanced chatbot kit for openai s chat models built on top of chatbot ui lite using next js typescript and tailwind css this version of chatbot ui supports both gpt 3 5 and gpt 4 models conversations are stored locally within your browser you can export and import conversations to safeguard against data loss see a demo https twitter com mckaywrigley status 1636103188733640704 chatbot ui lite by mckaywrigley https github com mckaywrigley chatbot ui lite is a simple chatbot starter kit for openai s chat model using next js typescript and tailwind css see a demo https twitter com mckaywrigley status 1636103188733640704 minigpt 4 https minigpt 4 github io is a model that enhances vision language understanding with advanced large language models gpt4all https github com nomic ai gpt4all is an ecosystem of open source chatbots trained on a massive collection of clean assistant data including code stories and dialogue based on llama https github com facebookresearch llama gpt4all ui https github com nomic ai gpt4all ui is a flask web application that provides a chat ui for interacting with the gpt4all chatbot alpaca cpp https
github com antimatter15 alpaca cpp is a fast chatgpt like model locally on your device it combines the llama foundation model https github com facebookresearch llama with an open reproduction https github com tloen alpaca lora of stanford alpaca https github com tatsu lab stanford alpaca a fine tuning of the base model to obey instructions akin to the rlhf https huggingface co blog rlhf used to train chatgpt and a set of modifications to llama cpp https github com ggerganov llama cpp to add a chat interface llama cpp https github com ggerganov llama cpp is a port of facebook s llama model in c c openplayground https github com nat openplayground is a playground for running chatgpt like models locally on your device vicuna https vicuna lmsys org is an open source chatbot trained by fine tuning llama it apparently achieves more than 90 quality of chatgpt and costs 300 to train yeagar ai https github com yeagerai yeagerai agent is a langchain agent creator designed to help you build prototype and deploy ai powered agents with ease vicuna https vicuna lmsys org is created by fine tuning a llama base model using approximately 70k user shared conversations gathered from sharegpt com with public apis to ensure data quality it converts the html back to markdown and filters out some inappropriate or low quality samples sharegpt https sharegpt com is a place to share your wildest chatgpt conversations with one click with 198 404 conversations shared so far fastchat https github com lm sys fastchat is an open platform for training serving and evaluating large language model based chatbots haystack https haystack deepset ai is an open source nlp framework to interact with your data using transformer models and llms gpt 4 chatgpt and the like it offers production ready tools to quickly build complex decision making question answering semantic search text generation applications and more stablelm stability ai language models https github com stability ai stablelm is the stablelm series
of language models and will be continuously updated with new checkpoints databricks dolly https github com databrickslabs dolly is an instruction following large language model trained on the databricks machine learning platform that is licensed for commercial use gptcache https gptcache readthedocs io is a library for creating semantic cache for llm queries aiac https github com gofireflyio aiac is an artificial intelligence infrastructure as code generator adrenaline https useadrenaline com is a tool that lets you talk to your codebase it s powered by static analysis vector search and large language models openassistant https open assistant io is a chat based assistant that understands tasks can interact with third party systems and retrieve information dynamically to do so doctorgpt https github com ingyamilmolinar doctorgpt is a lightweight self contained binary that monitors your application logs for problems and diagnoses them httpgpt https github com lucoiso uehttpgpt releases is an unreal engine 5 plugin that facilitates integration with openai s gpt based services chatgpt and dall e through asynchronous rest requests making it easy for developers to communicate with these services it also includes editor tools to integrate chat gpt and dall e image generation directly in the engine palm 2 https ai google discover palm2 is a next generation large language model that builds on google s legacy of breakthrough research in machine learning and responsible ai it excels at advanced reasoning tasks including code and math classification and question answering translation and multilingual proficiency and natural language generation better than google s previous state of the art llms med palm https sites research google med palm is a large language model llm designed to provide high quality answers to medical questions it harnesses the power of google s large language models which have been aligned to the medical domain with a set of carefully curated medical expert
demonstrations sec palm https cloud google com blog products identity security rsa google cloud security ai workbench generative ai is a large language model llm that accelerates the ability to help people who are responsible for keeping their organizations safe these new models give people a more natural and creative way to understand and manage security running llms locally back to the top table of contents a comprehensive guide to running llama 2 locally https replicate com blog run llama locally leaderboard by lmsys org https chat lmsys org leaderboard llm leaderboard https github com ludwigstumpp llm leaderboard open llm leaderboard by hugging face https huggingface co spaces huggingfaceh4 open llm leaderboard holistic evaluation of language models helm https crfm stanford edu helm latest groups 1 textsynth server benchmarks https bellard org ts server localai https localai io is a self hosted community driven local openai compatible api drop in replacement for openai running llms on consumer grade hardware with no gpu required it s an api to run ggml compatible models llama gpt4all rwkv whisper vicuna koala gpt4all j cerebras falcon dolly starcoder and many others llama cpp https github com ggerganov llama cpp is a port of facebook s llama model in c c ollama https ollama ai is a tool to get up and running with llama 2 and other large language models locally serge https github com serge chat serge is a web interface for chatting with alpaca through llama cpp fully self hosted dockerized with an easy to use api openllm https github com bentoml openllm is an open platform for operating large language models llms in production fine tune
serve deploy and monitor any llms with ease llama gpt https github com getumbrel llama gpt is a self hosted offline chatgpt like chatbot powered by llama 2 100 private with no data leaving your device llama2 webui https github com liltom eth llama2 webui is a tool to run any llama 2 locally with gradio ui on gpu or cpu from anywhere linux windows mac use llama2 wrapper as your local llama2 backend for generative agents apps llama2 c https github com karpathy llama2 c is a tool to train the llama 2 llm architecture in pytorch then inference it with one simple 700 line c file run c https github com karpathy llama2 c blob master run c alpaca cpp https github com antimatter15 alpaca cpp is a fast chatgpt like model locally on your device it combines the llama foundation model https github com facebookresearch llama with an open reproduction https github com tloen alpaca lora of stanford alpaca https github com tatsu lab stanford alpaca a fine tuning of the base model to obey instructions akin to the rlhf https huggingface co blog rlhf used to train chatgpt and a set of modifications to llama cpp https github com ggerganov llama cpp to add a chat interface gpt4all https github com nomic ai gpt4all is an ecosystem of open source chatbots trained on a massive collection of clean assistant data including code stories and dialogue based on llama https github com facebookresearch llama minigpt 4 https minigpt 4 github io is a model that enhances vision language understanding with advanced large language models lollms webui https github com parisneo lollms webui is the hub for llm large language model models it aims to provide a user friendly interface to access and utilize various llm models for a wide range of tasks whether you need help with writing coding organizing data generating images or seeking answers to your questions lm studio https lmstudio ai is a tool to discover download and run local llms gradio web ui https github com oobabooga text generation webui is a tool for
large language models supports transformers gptq llama cpp ggml gguf llama models openplayground https github com nat openplayground is a playground for running chatgpt like models locally on your device vicuna https vicuna lmsys org is an open source chatbot trained by fine tuning llama it apparently achieves more than 90 quality of chatgpt and costs 300 to train yeagar ai https github com yeagerai yeagerai agent is a langchain agent creator designed to help you build prototype and deploy ai powered agents with ease koboldcpp https github com lostruins koboldcpp is an easy to use ai text generation software for ggml models it s a single self contained distributable from concedo that builds off llama cpp and adds a versatile kobold api endpoint additional format support backward compatibility as well as a fancy ui with persistent stories editing tools save formats memory world info author s note characters and scenarios algorithms back to the top https github com mikeroyal machine learning guide table of contents fuzzy logic https www investopedia com terms f fuzzy logic asp is a heuristic approach that allows for more advanced decision tree processing and better integration with rules based programming p align center img src https user images githubusercontent com 45159366 123861872 858dce80 d8dc 11eb 9a2c 51205d1541e9 png br p architecture of a fuzzy logic system source researchgate https www researchgate net figure architecture of a fuzzy logic system fig2 309452475 support vector machine svm https web stanford edu hastie mooc slides svm pdf is a supervised machine learning model that uses classification algorithms for two group classification problems p align center img src https user images githubusercontent com 45159366 123858065 ec5cb900 d8d7 11eb 81c5 c6a8feefa84f png br p support vector machine svm source openclipart https openclipart org detail 182977 svm support vector machines neural networks https www ibm com cloud learn neural networks are a subset of
machine learning and are at the heart of deep learning algorithms their name and structure are inspired by the human brain mimicking the way that biological neurons signal to one another p align center img src https user images githubusercontent com 45159366 123858036 e5ce4180 d8d7 11eb 8c52 43d7c7e6e3c4 png br p deep neural network source ibm https www ibm com cloud learn neural networks convolutional neural networks r cnn https stanford edu shervine teaching cs 230 cheatsheet convolutional neural networks is an object detection algorithm that first segments the image to find potential relevant bounding boxes and then runs the detection algorithm to find the most probable objects in those bounding boxes p align center img src https user images githubusercontent com 45159366 123858026 e36be780 d8d7 11eb 9034 8859d6f09490 png br p convolutional neural networks source cs231n https cs231n github io convolutional networks conv recurrent neural networks rnns https www ibm com cloud learn recurrent neural networks are a type of artificial neural network which uses sequential data or time series data p align center img src https user images githubusercontent com 45159366 123858062 ebc42280 d8d7 11eb 9252 97e058bda8bd png br p recurrent neural networks source slideteam https www slideteam net recurrent neural networks rnns ppt powerpoint presentation file templates html multilayer perceptrons mlps https deepai org machine learning glossary and terms multilayer perceptron are multi layer neural networks composed of multiple layers of perceptrons https en wikipedia org wiki perceptron with a threshold activation p align center img src https user images githubusercontent com 45159366 123858053 e8c93200 d8d7 11eb 844c 60463ecf662c png br p multilayer perceptrons source deepai https deepai org machine learning glossary and terms multilayer perceptron random forest https www ibm com cloud learn random forest is a commonly used machine learning algorithm which combines the output of
multiple decision trees to reach a single result a decision tree in a forest cannot be pruned for sampling and therefore prediction selection its ease of use and flexibility have fueled its adoption as it handles both classification and regression problems p align center img src https user images githubusercontent com 45159366 124398881 fe21d000 dccc 11eb 8f5f 0a0730d85d55 png br p random forest source wikimedia https community tibco com wiki random forest template tibco spotfirer wiki page decision trees https www cs cmu edu bhiksha courses 10 601 decisiontrees are tree structured models for classification and regression p align center img src https user images githubusercontent com 45159366 124398883 ffeb9380 dccc 11eb 9adb 66729a353132 png br p decision trees source cmu http www cs cmu edu bhiksha courses 10 601 decisiontrees naive bayes https en wikipedia org wiki naive bayes classifier is a machine learning algorithm that is used to solve classification problems it s based on applying bayes theorem https www mathsisfun com data bayes theorem html with strong independence assumptions between the features p align center img src https user images githubusercontent com 45159366 124398885 00842a00 dccd 11eb 89c1 bd4c1adbf305 png br p bayes theorem source mathisfun https www mathsisfun com data bayes theorem html pytorch development back to the top https github com mikeroyal machine learning guide table of contents p align center img src https user images githubusercontent com 45159366 135770888 33dc24b8 f341 4bbe a264 cf335273c587 png br p pytorch learning resources pytorch https pytorch org is an open source deep learning framework that accelerates the path from research to production used for applications such as computer vision and natural language processing pytorch is developed by facebook s ai research https ai facebook com research lab getting started with pytorch https pytorch org get started locally pytorch documentation https pytorch org docs stable index
html pytorch discussion forum https discuss pytorch org top pytorch courses online coursera https www coursera org courses query pytorch page 1 top pytorch courses online udemy https www udemy com topic pytorch learn pytorch with online courses and classes edx https www edx org learn pytorch pytorch fundamentals learn microsoft docs https docs microsoft com en us learn paths pytorch fundamentals intro to deep learning with pytorch udacity https www udacity com course deep learning pytorch ud188 pytorch development in visual studio code https code visualstudio com docs datascience pytorch support pytorch on azure deep learning with pytorch microsoft azure https azure microsoft com en us develop pytorch pytorch azure databricks microsoft docs https docs microsoft com en us azure databricks applications machine learning train model pytorch deep learning with pytorch amazon web services aws https aws amazon com pytorch getting started with pytorch on google cloud https cloud google com ai platform training docs getting started pytorch pytorch tools libraries and frameworks pytorch mobile https pytorch org mobile home is an end to end ml workflow from training to deployment for ios and android mobile devices torchscript https pytorch org docs stable jit html is a way to create serializable and optimizable models from pytorch code this allows any torchscript program to be saved from a python process and loaded in a process where there is no python dependency torchserve https pytorch org serve is a flexible and easy to use tool for serving pytorch models keras https keras io is a high level neural networks api written in python and capable of running on top of tensorflow cntk or theano it was developed with a focus on enabling fast experimentation it is capable of running on top of tensorflow microsoft cognitive toolkit r theano or plaidml onnx runtime https github com microsoft onnxruntime is a cross platform high performance ml inferencing and training accelerator it 
supports models from deep learning frameworks such as pytorch and tensorflow keras as well as classical machine learning libraries such as scikit learn lightgbm xgboost etc kornia https kornia github io is a differentiable computer vision library that consists of a set of routines and differentiable modules to solve generic cv computer vision problems pytorch nlp https pytorchnlp readthedocs io en latest is a library for natural language processing nlp in python it s built with the very latest research in mind and was designed from day one to support rapid prototyping pytorch nlp comes with pre trained embeddings samplers dataset loaders metrics neural network modules and text encoders ignite https pytorch org ignite is a high level library to help with training and evaluating neural networks in pytorch flexibly and transparently hummingbird https github com microsoft hummingbird is a library for compiling trained traditional ml models into tensor computations it allows users to seamlessly leverage neural network frameworks such as pytorch to accelerate traditional ml models deep graph library dgl https www dgl ai is a python package built for easy implementation of graph neural network model family on top of pytorch and other frameworks tensorly http tensorly org stable home html is a high level api for tensor methods and deep tensorized neural networks in python that aims to make tensor learning simple gpytorch https cornellius gp github io is a gaussian process library implemented using pytorch designed for creating scalable flexible gaussian process models poutyne https poutyne org is a keras like framework for pytorch and handles much of the boilerplating code needed to train neural networks forte https github com asyml forte tree master docs is a toolkit for building nlp pipelines featuring composable components convenient data interfaces and cross task interaction torchmetrics https github com pytorchlightning metrics is a collection of machine learning metrics for
distributed scalable pytorch applications captum https captum ai is an open source extensible library for model interpretability built on pytorch transformer https github com huggingface transformers is a state of the art natural language processing for pytorch tensorflow and jax hydra https hydra cc is a framework for elegantly configuring complex applications accelerate https huggingface co docs accelerate is a simple way to train and use pytorch models with multi gpu tpu mixed precision ray https github com ray project ray is a fast and simple framework for building and running distributed applications parlai http parl ai is a unified platform for sharing training and evaluating dialog models across many tasks pytorchvideo https pytorchvideo org is a deep learning library for video understanding research hosts various video focused models datasets training pipelines and more opacus https opacus ai is a library that enables training pytorch models with differential privacy pytorch lightning https github com williamfalcon pytorch lightning is a keras like ml library for pytorch it leaves core training and validation logic to you and automates the rest pytorch geometric temporal https github com benedekrozemberczki pytorch geometric temporal is a temporal dynamic extension library for pytorch geometric pytorch geometric https github com rusty1s pytorch geometric is a library for deep learning on irregular input data such as graphs point clouds and manifolds raster vision https docs rastervision io is an open source framework for deep learning on satellite and aerial imagery crypten https github com facebookresearch crypten is a framework for privacy preserving ml its goal is to make secure computing techniques accessible to ml practitioners optuna https optuna org is an open source hyperparameter optimization framework to automate hyperparameter search pyro http pyro ai is a universal probabilistic programming language ppl written in python and supported by pytorch 
on the backend albumentations https github com albu albumentations is a fast and extensible image augmentation library for different cv tasks like classification segmentation object detection and pose estimation skorch https github com skorch dev skorch is a high level library for pytorch that provides full scikit learn compatibility mmf https mmf sh is a modular framework for vision language multimodal research from facebook ai research fair adaptdl https github com petuum adaptdl is a resource adaptive deep learning training and scheduling framework polyaxon https github com polyaxon polyaxon is a platform for building training and monitoring large scale deep learning applications textbrewer http textbrewer hfl rc com is a pytorch based knowledge distillation toolkit for natural language processing advertorch https github com borealisai advertorch is a toolbox for adversarial robustness research it contains modules for generating adversarial examples and defending against attacks nemo https github com nvidia nemo is a toolkit for conversational ai clinicadl https clinicadl readthedocs io is a framework for reproducible classification of alzheimer s disease stable baselines3 sb3 https github com dlr rm stable baselines3 is a set of reliable implementations of reinforcement learning algorithms in pytorch torchio https github com fepegar torchio is a set of tools to efficiently read preprocess sample augment and write 3d medical images in deep learning applications written in pytorch pysyft https github com openmined pysyft is a python library for encrypted privacy preserving deep learning flair https github com flairnlp flair is a very simple framework for state of the art natural language processing nlp glow https github com pytorch glow is a ml compiler that accelerates the performance of deep learning frameworks on different hardware platforms fairscale https github com facebookresearch fairscale is a pytorch extension library for high performance and large
scale training on one or multiple machines nodes monai https monai io is a deep learning framework that provides domain optimized foundational capabilities for developing healthcare imaging training workflows pfrl https github com pfnet pfrl is a deep reinforcement learning library that implements various state of the art deep reinforcement algorithms in python using pytorch einops https github com arogozhnikov einops is a flexible and powerful tensor operations for readable and reliable code pytorch3d https pytorch3d org is a deep learning library that provides efficient reusable components for 3d computer vision research with pytorch ensemble pytorch https ensemble pytorch readthedocs io is a unified ensemble framework for pytorch to improve the performance and robustness of your deep learning model lightly https github com lightly ai lightly is a computer vision framework for self supervised learning higher https github com facebookresearch higher is a library which facilitates the implementation of arbitrarily complex gradient based meta learning algorithms and nested optimisation loops with near vanilla pytorch horovod http horovod ai is a distributed training library for deep learning frameworks horovod aims to make distributed dl fast and easy to use pennylane https pennylane ai is a library for quantum ml automatic differentiation and optimization of hybrid quantum classical computations detectron2 https github com facebookresearch detectron2 is fair s next generation platform for object detection and segmentation fastai https docs fast ai is a library that simplifies training fast and accurate neural nets using modern best practices tensorflow development back to the top https github com mikeroyal machine learning guide table of contents p align center img src https user images githubusercontent com 45159366 135770877 ba05b184 89d5 4762 9142 4e0640d52bae png br p tensorflow learning resources tensorflow https www tensorflow org is an end to end open source 
platform for machine learning it has a comprehensive flexible ecosystem of tools libraries and community resources that lets researchers push the state of the art in ml and developers easily build and deploy ml powered applications getting started with tensorflow https www tensorflow org learn tensorflow tutorials https www tensorflow org tutorials tensorflow developer certificate tensorflow https www tensorflow org certificate tensorflow community https www tensorflow org community tensorflow models datasets https www tensorflow org resources models datasets tensorflow cloud https www tensorflow org cloud machine learning education tensorflow https www tensorflow org resources learn ml top tensorflow courses online coursera https www coursera org courses query tensorflow top tensorflow courses online udemy https www udemy com courses search src ukw q tensorflow deep learning with tensorflow udemy https www udemy com course deep learning with tensorflow certification training deep learning with tensorflow edx https www edx org course deep learning with tensorflow intro to tensorflow for deep learning udacity https www udacity com course intro to tensorflow for deep learning ud187 intro to tensorflow machine learning crash course google developers https developers google com machine learning crash course first steps with tensorflow toolkit train and deploy a tensorflow model azure machine learning https docs microsoft com en us azure machine learning how to train tensorflow apply machine learning models in azure functions with python and tensorflow microsoft azure https docs microsoft com en us azure azure functions functions machine learning tensorflow tabs bash deep learning with tensorflow amazon web services aws https aws amazon com tensorflow tensorflow amazon emr aws documentation https docs aws amazon com emr latest releaseguide emr tensorflow html tensorflow enterprise google cloud https cloud google com tensorflow enterprise tensorflow tools libraries and 
frameworks tensorflow lite https www tensorflow org lite is an open source deep learning framework for deploying machine learning models on mobile and iot devices tensorflow js https www tensorflow org js is a javascript library that lets you develop or execute ml models in javascript and use ml directly in the browser client side server side via node js mobile native via react native desktop native via electron and even on iot devices via node js on raspberry pi tensorflow macos https github com apple tensorflow macos is a mac optimized version of tensorflow and tensorflow addons for macos 11 0 accelerated using apple s ml compute framework google colaboratory https colab sandbox google com notebooks welcome ipynb is a free jupyter notebook environment that requires no setup and runs entirely in the cloud allowing you to execute tensorflow code in your browser with a single click what if tool https pair code github io what if tool is a tool for code free probing of machine learning models useful for model understanding debugging and fairness available in tensorboard and jupyter or colab notebooks tensorboard https www tensorflow org tensorboard is a suite of visualization tools to understand debug and optimize tensorflow programs keras https keras io is a high level neural networks api written in python and capable of running on top of tensorflow cntk or theano it was developed with a focus on enabling fast experimentation it is capable of running on top of tensorflow microsoft cognitive toolkit r theano or plaidml xla accelerated linear algebra https www tensorflow org xla is a domain specific compiler for linear algebra that optimizes tensorflow computations the results are improvements in speed memory usage and portability on server and mobile platforms ml perf https mlperf org is a broad ml benchmark suite for measuring performance of ml software frameworks ml hardware accelerators and ml cloud platforms tensorflow playground https playground tensorflow org 
activation tanh batchsize 10 dataset circle regdataset reg plane learningrate 0 03 regularizationrate 0 noise 0 networkshape 4 2 seed 0 04620 showtestdata false discretize false perctraindata 50 x true y true xtimesy false xsquared false ysquared false cosx false sinx false cosy false siny false collectstats false problem classification initzero false hidetext false is an interactive environment for tinkering with a neural network in your browser tpu research cloud trc https sites research google trc is a program that enables researchers to apply for access to a cluster of more than 1 000 cloud tpus at no charge to help them accelerate the next wave of research breakthroughs mlir https www tensorflow org mlir is a new intermediate representation and compiler framework lattice https www tensorflow org lattice is a library for flexible controlled and interpretable ml solutions with common sense shape constraints tensorflow hub https www tensorflow org hub is a library for reusable machine learning download and reuse the latest trained models with a minimal amount of code tensorflow cloud https www tensorflow org cloud is a library to connect your local environment to google cloud tensorflow model optimization toolkit https www tensorflow org model optimization is a suite of tools for optimizing ml models for deployment and execution tensorflow recommenders https www tensorflow org recommenders is a library for building recommender system models tensorflow text https www tensorflow org text is a collection of text and nlp related classes and ops ready to use with tensorflow 2 tensorflow graphics https www tensorflow org graphics is a library of computer graphics functionalities ranging from cameras lights and materials to renderers tensorflow federated https www tensorflow org federated is an open source framework for machine learning and other computations on decentralized data tensorflow probability https www tensorflow org probability is a library for probabilistic
reasoning and statistical analysis tensor2tensor https github com tensorflow tensor2tensor is a library of deep learning models and datasets designed to make deep learning more accessible and accelerate ml research tensorflow privacy https www tensorflow org responsible ai privacy is a python library that includes implementations of tensorflow optimizers for training machine learning models with differential privacy tensorflow ranking https github com tensorflow ranking is a library for learning to rank ltr techniques on the tensorflow platform tensorflow agents https www tensorflow org agents is a library for reinforcement learning in tensorflow tensorflow addons https github com tensorflow addons is a repository of contributions that conform to well established api patterns but implement new functionality not available in core tensorflow maintained by sig addons https groups google com a tensorflow org g addons tensorflow natively supports a large number of operators layers metrics losses and optimizers tensorflow i o https github com tensorflow io is a set of dataset streaming and file system extensions maintained by sig io tensorflow quantum https www tensorflow org quantum is a quantum machine learning library for rapid prototyping of hybrid quantum classical ml models dopamine https github com google dopamine is a research framework for fast prototyping of reinforcement learning algorithms trfl https deepmind com blog trfl is a library for reinforcement learning building blocks created by deepmind mesh tensorflow https github com tensorflow mesh is a language for distributed deep learning capable of specifying a broad class of distributed tensor computations raggedtensors https www tensorflow org guide ragged tensor is an api that makes it easy to store and manipulate data with non uniform shape including text words sentences characters and batches of variable length unicode ops https www tensorflow org tutorials load data unicode is an api that supports working
with unicode text directly in tensorflow magenta https magenta tensorflow org is a research project exploring the role of machine learning in the process of creating art and music nucleus https github com google nucleus is a library of python and c code designed to make it easy to read write and analyze data in common genomics file formats like sam and vcf sonnet https github com deepmind sonnet is a library from deepmind for constructing neural networks neural structured learning https www tensorflow org neural structured learning is a learning framework to train neural networks by leveraging structured signals in addition to feature inputs model remediation https www tensorflow org responsible ai model remediation is a library to help create and train models in a way that reduces or eliminates user harm resulting from underlying performance biases fairness indicators https www tensorflow org responsible ai fairness indicators guide is a library that enables easy computation of commonly identified fairness metrics for binary and multiclass classifiers decision forests https www tensorflow org decision forests is a collection of state of the art algorithms for training serving and interpreting models that use decision forests for classification regression and ranking core ml development back to the top https github com mikeroyal machine learning guide table of contents p align center img src https user images githubusercontent com 45159366 132140794 0d4bf654 0884 4068 b5bb eaee36c36797 png br p core ml learning resources core ml https developer apple com documentation coreml is an apple framework for integrating machine learning models into apps running on apple devices including ios watchos macos and tvos core ml introduces a public file format mlmodel for a broad set of ml methods including deep neural networks both convolutional and recurrent tree ensembles with boosting and generalized linear models models in this format can be directly integrated into apps through xcode
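Generalized linear models are one of the model families the mlmodel format can represent. As a rough, hedged illustration of that kind of model (not Core ML itself), here is a minimal pure-Python logistic regression trained with gradient descent on a hypothetical toy dataset; in practice one would train with a real library and convert the fitted model using the coremltools converters, which are not exercised here.

```python
import math
import random

def train_logistic_regression(points, labels, lr=1.0, epochs=500):
    """Fit a 2-feature logistic regression (a simple GLM) with batch gradient descent."""
    w = [0.0, 0.0]
    b = 0.0
    n = len(points)
    for _ in range(epochs):
        gw = [0.0, 0.0]
        gb = 0.0
        for (x1, x2), y in zip(points, labels):
            z = w[0] * x1 + w[1] * x2 + b
            p = 1.0 / (1.0 + math.exp(-z))  # sigmoid link function
            err = p - y                     # gradient of the log loss wrt z
            gw[0] += err * x1
            gw[1] += err * x2
            gb += err
        w[0] -= lr * gw[0] / n
        w[1] -= lr * gw[1] / n
        b -= lr * gb / n
    return w, b

def predict(w, b, x1, x2):
    """Classify a point with the learned linear decision boundary."""
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

# hypothetical linearly separable data: label is 1 when x1 + x2 > 1
random.seed(0)
pts = [(random.random(), random.random()) for _ in range(200)]
ys = [1 if x1 + x2 > 1 else 0 for x1, x2 in pts]
w, b = train_logistic_regression(pts, ys)
accuracy = sum(predict(w, b, *p) == y for p, y in zip(pts, ys)) / len(pts)
```

The fitted weights and bias are exactly the kind of compact model parameters a converter can serialize into an mlmodel file for on-device inference.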
introduction to core ml https coremltools readme io docs integrating a core ml model into your app https developer apple com documentation coreml integrating a core ml model into your app core ml models https developer apple com machine learning models core ml api reference https apple github io coremltools index html core ml specification https apple github io coremltools mlmodel index html apple developer forums for core ml https developer apple com forums tags core ml top core ml courses online udemy https www udemy com topic core ml top core ml courses online coursera https www coursera org courses query core 20ml ibm watson services for core ml ibm https www ibm com watson stories coreml generate core ml assets using ibm maximo visual inspection ibm https developer ibm com technologies iot tutorials ibm maximo visual inspection apple devices core ml tools libraries and frameworks core ml tools https coremltools readme io is a project that contains supporting tools for core ml model conversion editing and validation create ml https developer apple com machine learning create ml is a tool that provides new ways of training machine learning models on your mac it takes the complexity out of model training while producing powerful core ml models tensorflow macos https github com apple tensorflow macos is a mac optimized version of tensorflow and tensorflow addons for macos 11 0 accelerated using apple s ml compute framework apple vision https developer apple com documentation vision is a framework that performs face and face landmark detection text detection barcode recognition image registration and general feature tracking vision also allows the use of custom core ml models for tasks like classification or object detection keras https keras io is a high level neural networks api written in python and capable of running on top of tensorflow cntk or theano it was developed with a focus on enabling fast experimentation it is capable of running on top of tensorflow 
microsoft cognitive toolkit r theano or plaidml xgboost https xgboost readthedocs io is an optimized distributed gradient boosting library designed to be highly efficient flexible and portable it implements machine learning algorithms under the gradient boosting framework xgboost provides a parallel tree boosting also known as gbdt gbm that solves many data science problems in a fast and accurate way it supports distributed training on multiple machines including aws gce azure and yarn clusters also it can be integrated with flink spark and other cloud dataflow systems libsvm https www csie ntu edu tw cjlin libsvm is an integrated software for support vector classification c svc nu svc regression epsilon svr nu svr and distribution estimation one class svm it supports multi class classification scikit learn https scikit learn org stable index html is a simple and efficient tool for data mining and data analysis it is built on numpy scipy and matplotlib xcode https developer apple com xcode includes everything developers need to create great applications for mac iphone ipad apple tv and apple watch xcode provides developers a unified workflow for user interface design coding testing and debugging xcode is built as a universal app that runs 100% natively on intel based cpus and apple silicon it includes a unified macos sdk that features all the frameworks compilers debuggers and other tools you need to build apps that run natively on apple silicon and the intel x86 64 cpu swiftui https developer apple com documentation swiftui is a user interface toolkit that provides views controls and layout structures for declaring your app s user interface the swiftui framework provides event handlers for delivering taps gestures and other types of input to your application uikit https developer apple com documentation uikit is a framework that provides the required infrastructure for your ios or tvos apps it provides the window and view architecture for implementing your interface
the event handling infrastructure for delivering multi touch and other types of input to your app and the main run loop needed to manage interactions among the user the system and your app appkit https developer apple com documentation appkit is a graphical user interface toolkit that contains all the objects you need to implement the user interface for a macos app such as windows panels buttons menus scrollers and text fields and it handles all the details for you as it efficiently draws on the screen communicates with hardware devices and screen buffers clears areas of the screen before drawing and clips views arkit https developer apple com augmented reality arkit is a set of software development tools to enable developers to build augmented reality apps for ios developed by apple the latest version arkit 3 5 takes advantage of the new lidar scanner and depth sensing system on ipad pro 2020 to support a new generation of ar apps that use scene geometry for enhanced scene understanding and object occlusion realitykit https developer apple com documentation realitykit is a framework to implement high performance 3d simulation and rendering with information provided by the arkit framework to seamlessly integrate virtual objects into the real world scenekit https developer apple com scenekit is a high level 3d graphics framework that helps you create 3d animated scenes and effects in your ios apps instruments https help apple com instruments mac current dev7b09c84f5 is a powerful and flexible performance analysis and testing tool that s part of the xcode tool set it s designed to help you profile your ios watchos tvos and macos apps processes and devices in order to better understand and optimize their behavior and performance cocoapods https cocoapods org is a dependency manager for swift and objective c used in xcode projects by specifying the dependencies for your project in a simple text file cocoapods then recursively resolves dependencies between libraries
fetches source code for all dependencies and creates and maintains an xcode workspace to build your project appcode https www jetbrains com objc is constantly monitoring the quality of your code it warns you of errors and smells and suggests quick fixes to resolve them automatically appcode provides lots of code inspections for objective c swift c c and a number of code inspections for other supported languages deep learning development back to the top https github com mikeroyal machine learning guide table of contents p align center img src https user images githubusercontent com 45159366 133943699 6dcfcb40 ddf7 4501 86e0 41e8aee91fe2 png br p deep learning learning resources deep learning https www ibm com cloud learn deep learning is a subset of machine learning which is essentially a neural network with three or more layers these neural networks attempt to simulate the behavior of the human brain though far from matching its ability this allows the neural networks to learn from large amounts of data the learning can be supervised https en wikipedia org wiki supervised learning semi supervised https en wikipedia org wiki semi supervised learning or unsupervised https en wikipedia org wiki unsupervised learning deep learning online courses nvidia https www nvidia com en us training online top deep learning courses online coursera https www coursera org courses query deep 20learning top deep learning courses online udemy https www udemy com topic deep learning learn deep learning with online courses and lessons edx https www edx org learn deep learning deep learning online course nanodegree udacity https www udacity com course deep learning nanodegree nd101 machine learning course by andrew ng coursera https www coursera org learn machine learning machine learning engineering for production mlops course by andrew ng coursera https www coursera org specializations machine learning engineering for production mlops data science deep learning and neural networks in 
python udemy https www udemy com course data science deep learning in python understanding machine learning with python pluralsight https www pluralsight com courses python understanding machine learning how to think about machine learning algorithms pluralsight https www pluralsight com courses machine learning algorithms deep learning courses stanford online https online stanford edu courses cs230 deep learning deep learning uw professional continuing education https www pce uw edu courses deep learning deep learning online courses harvard university https online learning harvard edu course deep learning 0 machine learning for everyone courses datacamp https www datacamp com courses introduction to machine learning with r artificial intelligence expert course platinum edition udemy https www udemy com course artificial intelligence exposed future 10 extreme edition top artificial intelligence courses online coursera https www coursera org courses query artificial 20intelligence learn artificial intelligence with online courses and lessons edx https www edx org learn artificial intelligence professional certificate in computer science for artificial intelligence edx https www edx org professional certificate harvardx computer science for artifical intelligence artificial intelligence nanodegree program https www udacity com course ai artificial intelligence nanodegree nd898 artificial intelligence ai online courses udacity https www udacity com school of ai intro to artificial intelligence course udacity https www udacity com course intro to artificial intelligence cs271 edge ai for iot developers course udacity https www udacity com course intel edge ai for iot developers nanodegree nd131 reasoning goal trees and rule based expert systems mit opencourseware https ocw mit edu courses electrical engineering and computer science 6 034 artificial intelligence fall 2010 lecture videos lecture 3 reasoning goal trees and rule based expert systems expert systems and 
applied artificial intelligence https www umsl edu joshik msis480 chapt11 htm autonomous systems microsoft ai https www microsoft com en us ai autonomous systems introduction to microsoft project bonsai https docs microsoft com en us learn autonomous systems intro to project bonsai machine teaching with the microsoft autonomous systems platform https docs microsoft com en us azure architecture solution ideas articles autonomous systems autonomous maritime systems training amc search https www amcsearch com au ams training top autonomous cars courses online udemy https www udemy com topic autonomous cars applied control systems 1 autonomous cars math pid mpc udemy https www udemy com course applied systems control for engineers modelling pid mpc learn autonomous robotics with online courses and lessons edx https www edx org learn autonomous robotics artificial intelligence nanodegree program https www udacity com course ai artificial intelligence nanodegree nd898 autonomous systems online courses programs udacity https www udacity com school of autonomous systems edge ai for iot developers course udacity https www udacity com course intel edge ai for iot developers nanodegree nd131 autonomous systems mooc and free online courses mooc list https www mooc list com tags autonomous systems robotics and autonomous systems graduate program standford online https online stanford edu programs robotics and autonomous systems graduate program mobile autonomous systems laboratory mit opencourseware https ocw mit edu courses electrical engineering and computer science 6 186 mobile autonomous systems laboratory january iap 2005 lecture notes deep learning tools libraries and frameworks nvidia cudnn https developer nvidia com cudnn is a gpu accelerated library of primitives for deep neural networks https developer nvidia com deep learning cudnn provides highly tuned implementations for standard routines such as forward and backward convolution pooling normalization and activation 
layers cudnn accelerates widely used deep learning frameworks including caffe2 https caffe2 ai chainer https chainer org keras https keras io matlab https www mathworks com solutions deep learning html mxnet https mxnet incubator apache org pytorch https pytorch org and tensorflow https www tensorflow org nvidia dlss deep learning super sampling https developer nvidia com dlss is a temporal image upscaling ai rendering technology that increases graphics performance using dedicated tensor core ai processors on geforce rtx gpus dlss uses the power of a deep learning neural network to boost frame rates and generate beautiful sharp images for your games amd fidelityfx super resolution fsr https www amd com en technologies radeon software fidelityfx is an open source high quality solution for producing high resolution frames from lower resolution inputs it uses a collection of cutting edge algorithms with a particular emphasis on creating high quality edges giving large performance improvements compared to rendering at native resolution directly fsr enables practical performance for costly render operations such as hardware ray tracing for the amd rdna and amd rdna 2 architectures intel xe super sampling xess https www youtube com watch v y9hfpf sqeg is a temporal image upscaling ai rendering technology that increases graphics performance similar to nvidia s dlss deep learning super sampling https developer nvidia com dlss intel s arc gpu architecture early 2022 will have gpus that feature dedicated xe cores to run xess the gpus will have xe matrix extensions matrix xmx engines for hardware accelerated ai processing xess will be able to run on devices without xmx including integrated graphics though the performance of xess will be lower on non intel graphics cards because it will be powered by the dp4a instruction https www intel com content dam www public us en documents reference guides 11th gen quick reference guide pdf jupyter notebook https jupyter org
is an open source web application that allows you to create and share documents that contain live code equations visualizations and narrative text jupyter is used widely in industries that do data cleaning and transformation numerical simulation statistical modeling data visualization data science and machine learning apache spark https spark apache org is a unified analytics engine for large scale data processing it provides high level apis in scala java python and r and an optimized engine that supports general computation graphs for data analysis it also supports a rich set of higher level tools including spark sql for sql and dataframes mllib for machine learning graphx for graph processing and structured streaming for stream processing apache spark connector for sql server and azure sql https github com microsoft sql spark connector is a high performance connector that enables you to use transactional data in big data analytics and persists results for ad hoc queries or reporting the connector allows you to use any sql database on premises or in the cloud as an input data source or output data sink for spark jobs apache predictionio https predictionio apache org is an open source machine learning framework for developers data scientists and end users it supports event collection deployment of algorithms evaluation querying predictive results via rest apis it is based on scalable open source services like hadoop hbase and other dbs elasticsearch spark and implements what is called a lambda architecture cluster manager for apache kafka cmak https github com yahoo cmak is a tool for managing apache kafka https kafka apache org clusters bigdl https bigdl project github io is a distributed deep learning library for apache spark with bigdl users can write their deep learning applications as standard spark programs which can directly run on top of existing spark or hadoop clusters eclipse deeplearning4j dl4j https deeplearning4j konduit ai is a set of projects 
intended to support all the needs of a jvm based scala kotlin clojure and groovy deep learning application this means starting with the raw data loading and preprocessing it from wherever and whatever format it is in to building and tuning a wide variety of simple and complex deep learning networks deep learning toolbox https www mathworks com products deep learning html is a tool that provides a framework for designing and implementing deep neural networks with algorithms pretrained models and apps you can use convolutional neural networks convnets cnns and long short term memory lstm networks to perform classification and regression on image time series and text data you can build network architectures such as generative adversarial networks gans and siamese networks using automatic differentiation custom training loops and shared weights with the deep network designer app you can design analyze and train networks graphically it can exchange models with tensorflow and pytorch through the onnx format and import models from tensorflow keras and caffe the toolbox supports transfer learning with darknet 53 resnet 50 nasnet squeezenet and many other pretrained models reinforcement learning toolbox https www mathworks com products reinforcement learning html is a tool that provides an app functions and a simulink block for training policies using reinforcement learning algorithms including dqn ppo sac and ddpg you can use these policies to implement controllers and decision making algorithms for complex applications such as resource allocation robotics and autonomous systems deep learning hdl toolbox https www mathworks com products deep learning hdl html is a tool that provides functions and tools to prototype and implement deep learning networks on fpgas and socs it provides pre built bitstreams for running a variety of deep learning networks on supported xilinx and intel fpga and soc devices profiling and estimation tools let you customize a deep learning network by 
exploring design performance and resource utilization tradeoffs parallel computing toolbox https www mathworks com products matlab parallel server html is a tool that lets you solve computationally and data intensive problems using multicore processors gpus and computer clusters high level constructs such as parallel for loops special array types and parallelized numerical algorithms enable you to parallelize matlab applications without cuda or mpi programming the toolbox lets you use parallel enabled functions in matlab and other toolboxes you can use the toolbox with simulink to run multiple simulations of a model in parallel programs and models can run in both interactive and batch modes xgboost https xgboost readthedocs io is an optimized distributed gradient boosting library designed to be highly efficient flexible and portable it implements machine learning algorithms under the gradient boosting framework xgboost provides a parallel tree boosting also known as gbdt gbm that solves many data science problems in a fast and accurate way it supports distributed training on multiple machines including aws gce azure and yarn clusters also it can be integrated with flink spark and other cloud dataflow systems libsvm https www csie ntu edu tw cjlin libsvm is an integrated software for support vector classification c svc nu svc regression epsilon svr nu svr and distribution estimation one class svm it supports multi class classification scikit learn https scikit learn org stable index html is a simple and efficient tool for data mining and data analysis it is built on numpy scipy and matplotlib tensorflow https www tensorflow org is an end to end open source platform for machine learning it has a comprehensive flexible ecosystem of tools libraries and community resources that lets researchers push the state of the art in ml and developers easily build and deploy ml powered applications keras https keras io is a high level neural networks api written in python and
capable of running on top of tensorflow microsoft cognitive toolkit r theano or plaidml pytorch https pytorch org is an open source machine learning framework that accelerates the path from research prototyping to production deployment developed primarily by facebook s ai research lab azure databricks https azure microsoft com en us services databricks is a fast and collaborative apache spark based big data analytics service designed for data science and data engineering azure databricks sets up your apache spark environment in minutes autoscale and collaborate on shared projects in an interactive workspace azure databricks supports python scala r java and sql as well as data science frameworks and libraries including tensorflow pytorch and scikit learn microsoft cognitive toolkit cntk https docs microsoft com en us cognitive toolkit is an open source toolkit for commercial grade distributed deep learning it describes neural networks as a series of computational steps via a directed graph cntk allows the user to easily realize and combine popular model types such as feed forward dnns convolutional neural networks cnns and recurrent neural networks rnns lstms cntk implements stochastic gradient descent sgd error backpropagation learning with automatic differentiation and parallelization across multiple gpus and servers tensorflow macos https github com apple tensorflow macos is a mac optimized version of tensorflow and tensorflow addons for macos 11 0 accelerated using apple s ml compute framework apache airflow https airflow apache org is an open source workflow management platform created by the community to programmatically author schedule and monitor workflows airflow has a modular architecture and uses a message queue to orchestrate an arbitrary number of workers so it is ready to scale to infinity open neural network exchange onnx
https github com onnx is an open ecosystem that empowers ai developers to choose the right tools as their project evolves onnx provides an open source format for ai models both deep learning and traditional ml it defines an extensible computation graph model as well as definitions of built in operators and standard data types apache mxnet https mxnet apache org is a deep learning framework designed for both efficiency and flexibility it allows you to mix symbolic and imperative programming to maximize efficiency and productivity at its core mxnet contains a dynamic dependency scheduler that automatically parallelizes both symbolic and imperative operations on the fly a graph optimization layer on top of that makes symbolic execution fast and memory efficient mxnet is portable and lightweight scaling effectively to multiple gpus and multiple machines support for python r julia scala go javascript and more autogluon https autogluon mxnet io index html is a toolkit for deep learning that automates machine learning tasks enabling you to easily achieve strong predictive performance in your applications with just a few lines of code you can train and deploy high accuracy deep learning models on tabular image and text data anaconda https www anaconda com is a very popular data science platform for machine learning and deep learning that enables users to develop models train them and deploy them plaidml https github com plaidml plaidml is an advanced and portable tensor compiler for enabling deep learning on laptops embedded devices or other devices where the available computing hardware is not well supported or the available software stack contains unpalatable license restrictions opencv https opencv org is a highly optimized library with focus on real time computer vision applications the c python and java interfaces support linux macos windows ios and android scikit learn https scikit learn org stable index html is a python module for machine learning built on top of
scipy numpy and matplotlib making it easier to apply robust and simple implementations of many popular machine learning algorithms weka https www cs waikato ac nz ml weka is an open source machine learning software that can be accessed through a graphical user interface standard terminal applications or a java api it is widely used for teaching research and industrial applications contains a plethora of built in tools for standard machine learning tasks and additionally gives transparent access to well known toolboxes such as scikit learn r and deeplearning4j caffe https github com bvlc caffe is a deep learning framework made with expression speed and modularity in mind it is developed by berkeley ai research bair the berkeley vision and learning center bvlc and community contributors theano https github com theano theano is a python library that allows you to define optimize and evaluate mathematical expressions involving multi dimensional arrays efficiently including tight integration with numpy microsoft project bonsai https azure microsoft com en us services project bonsai is a low code ai platform that speeds ai powered automation development and part of the autonomous systems suite from microsoft bonsai is used to build ai components that can provide operator guidance or make independent decisions to optimize process variables improve production efficiency and reduce downtime microsoft airsim https microsoft github io airsim lidar html is a simulator for drones cars and more built on unreal engine with an experimental unity release airsim is open source cross platform and supports software in the loop simulation https www mathworks com help ecoder software in the loop sil simulation html with popular flight controllers such as px4 ardupilot and hardware in loop https www ni com en us innovations white papers 17 what is hardware in the loop html with px4 for physically and visually realistic simulations it is developed as an unreal plugin that can simply be 
dropped into any unreal environment airsim is being developed as a platform for ai research to experiment with deep learning computer vision and reinforcement learning algorithms for autonomous vehicles carla https github com carla simulator carla is an open source simulator for autonomous driving research carla has been developed from the ground up to support development training and validation of autonomous driving systems in addition to open source code and protocols carla provides open digital assets urban layouts buildings vehicles that were created for this purpose and can be used freely ros ros2 bridge for carla package https github com carla simulator ros bridge is a bridge that enables two way communication between ros and carla the information from the carla server is translated to ros topics in the same way the messages sent between nodes in ros get translated to commands to be applied in carla ros toolbox https www mathworks com products ros html is a tool that provides an interface connecting matlab and simulink with the robot operating system ros and ros 2 enabling you to create a network of ros nodes the toolbox includes matlab functions and simulink blocks to import analyze and play back ros data recorded in rosbag files you can also connect to a live ros network to access ros messages robotics toolbox https www mathworks com products robotics html provides a toolbox that brings robotics specific functionality designing simulating and testing manipulators mobile robots and humanoid robots to matlab exploiting the native capabilities of matlab linear algebra portability graphics the toolbox also supports mobile robots with functions for robot motion models bicycle path planning algorithms bug distance transform d prm kinodynamic planning lattice rrt localization ekf particle filter map building ekf and simultaneous localization and mapping ekf and a simulink model of a non holonomic vehicle the toolbox also includes a detailed simulink model for a
quadrotor flying robot image processing toolbox https www mathworks com products image html is a tool that provides a comprehensive set of reference standard algorithms and workflow apps for image processing analysis visualization and algorithm development you can perform image segmentation image enhancement noise reduction geometric transformations image registration and 3d image processing computer vision toolbox https www mathworks com products computer vision html is a tool that provides algorithms functions and apps for designing and testing computer vision 3d vision and video processing systems you can perform object detection and tracking as well as feature detection extraction and matching you can automate calibration workflows for single stereo and fisheye cameras for 3d vision the toolbox supports visual and point cloud slam stereo vision structure from motion and point cloud processing model predictive control toolbox https www mathworks com products model predictive control html is a tool that provides functions an app and simulink blocks for designing and simulating controllers using linear and nonlinear model predictive control mpc the toolbox lets you specify plant and disturbance models horizons constraints and weights by running closed loop
simulations you can evaluate controller performance predictive maintenance toolbox https www mathworks com products predictive maintenance html is a tool that lets you manage sensor data design condition indicators and estimate the remaining useful life rul of a machine the toolbox provides functions and an interactive app for exploring extracting and ranking features using data based and model based techniques including statistical spectral and time series analysis vision hdl toolbox https www mathworks com products vision hdl html is a tool that provides pixel streaming algorithms for the design and implementation of vision systems on fpgas and asics it provides a design framework that supports a diverse set of interface types frame sizes and frame rates the image processing video and computer vision algorithms in the toolbox use an architecture appropriate for hdl implementations automated driving toolbox https www mathworks com products automated driving html is a matlab tool that provides algorithms and tools for designing simulating and testing adas and autonomous driving systems you can design and test vision and lidar perception systems as well as sensor fusion path planning and vehicle controllers visualization tools include a bird s eye view plot and scope for sensor coverage detections and tracks and displays for video lidar and maps the toolbox lets you import and work with here hd live map data and opendrive road networks it also provides reference application examples for common adas and automated driving features including fcw aeb acc lka and parking valet the toolbox supports c c code generation for rapid prototyping and hil testing with support for sensor fusion tracking path planning and vehicle controller algorithms uav toolbox https www mathworks com products uav html is an application that provides tools and reference applications for designing simulating testing and deploying unmanned aerial vehicle uav and drone applications you can design 
autonomous flight algorithms uav missions and flight controllers the flight log analyzer app lets you interactively analyze 3d flight paths telemetry information and sensor readings from common flight log formats navigation toolbox https www mathworks com products navigation html is a tool that provides algorithms and analysis tools for motion planning simultaneous localization and mapping slam and inertial navigation the toolbox includes customizable search and sampling based path planners as well as metrics for validating and comparing paths you can create 2d and 3d map representations generate maps using slam algorithms and interactively visualize and debug map generation with the slam map builder app lidar toolbox https www mathworks com products lidar html is a tool that provides algorithms functions and apps for designing analyzing and testing lidar processing systems you can perform object detection and tracking semantic segmentation shape fitting lidar registration and obstacle detection lidar toolbox supports lidar camera cross calibration for workflows that combine computer vision and lidar processing mapping toolbox https www mathworks com products mapping html is a tool that provides algorithms and functions for transforming geographic data and creating map displays you can visualize your data in a geographic context build map displays from more than 60 map projections and transform data from a variety of sources into a consistent geographic coordinate system reinforcement learning development back to the top https github com mikeroyal machine learning guide table of contents p align center img src https user images githubusercontent com 45159366 133943722 059d8f26 44d7 40c3 9850 4b3a88d46b01 png br p reinforcement learning learning resources reinforcement learning https www ibm com cloud learn deep learning toc deep learn md q of3 is a subset of machine learning in which an agent learns to perform actions in an environment based on feedback in order to maximize a reward unlike supervised https en wikipedia org wiki supervised learning semi supervised https en wikipedia org wiki semi supervised learning and unsupervised https en wikipedia org wiki unsupervised learning approaches it learns from trial and error interaction with the environment rather than from labeled examples top reinforcement learning courses coursera https www coursera org courses query reinforcement 20learning top reinforcement learning courses udemy https www udemy com topic reinforcement learning top reinforcement learning courses udacity https www udacity com course reinforcement learning ud600 reinforcement learning courses stanford online https online stanford edu courses xcs234 reinforcement learning deep learning online courses nvidia https www nvidia com en us training online top deep learning courses online coursera https www coursera org courses query deep 20learning top deep learning courses online udemy https www udemy com topic deep learning learn deep learning with online courses and lessons edx https www edx org learn deep learning deep learning online course nanodegree udacity https www udacity com course deep learning nanodegree nd101 machine learning course by andrew ng coursera https www coursera org learn machine learning machine learning engineering for production mlops course by andrew ng coursera https www coursera org specializations machine learning engineering for production mlops data science deep learning and neural networks in python udemy https www udemy com course data science deep learning in python understanding machine learning with python pluralsight https www pluralsight com courses python understanding machine learning how to think about machine learning algorithms pluralsight https www pluralsight com courses machine learning algorithms deep learning
courses stanford online https online stanford edu courses cs230 deep learning deep learning uw professional continuing education https www pce uw edu courses deep learning deep learning online courses harvard university https online learning harvard edu course deep learning 0 machine learning for everyone courses datacamp https www datacamp com courses introduction to machine learning with r artificial intelligence expert course platinum edition udemy https www udemy com course artificial intelligence exposed future 10 extreme edition top artificial intelligence courses online coursera https www coursera org courses query artificial 20intelligence learn artificial intelligence with online courses and lessons edx https www edx org learn artificial intelligence professional certificate in computer science for artificial intelligence edx https www edx org professional certificate harvardx computer science for artifical intelligence artificial intelligence nanodegree program https www udacity com course ai artificial intelligence nanodegree nd898 artificial intelligence ai online courses udacity https www udacity com school of ai intro to artificial intelligence course udacity https www udacity com course intro to artificial intelligence cs271 edge ai for iot developers course udacity https www udacity com course intel edge ai for iot developers nanodegree nd131 reasoning goal trees and rule based expert systems mit opencourseware https ocw mit edu courses electrical engineering and computer science 6 034 artificial intelligence fall 2010 lecture videos lecture 3 reasoning goal trees and rule based expert systems expert systems and applied artificial intelligence https www umsl edu joshik msis480 chapt11 htm autonomous systems microsoft ai https www microsoft com en us ai autonomous systems introduction to microsoft project bonsai https docs microsoft com en us learn autonomous systems intro to project bonsai machine teaching with the microsoft autonomous systems 
platform https docs microsoft com en us azure architecture solution ideas articles autonomous systems autonomous maritime systems training amc search https www amcsearch com au ams training top autonomous cars courses online udemy https www udemy com topic autonomous cars applied control systems 1 autonomous cars math pid mpc udemy https www udemy com course applied systems control for engineers modelling pid mpc learn autonomous robotics with online courses and lessons edx https www edx org learn autonomous robotics artificial intelligence nanodegree program https www udacity com course ai artificial intelligence nanodegree nd898 autonomous systems online courses programs udacity https www udacity com school of autonomous systems edge ai for iot developers course udacity https www udacity com course intel edge ai for iot developers nanodegree nd131 autonomous systems mooc and free online courses mooc list https www mooc list com tags autonomous systems robotics and autonomous systems graduate program stanford online https online stanford edu programs robotics and autonomous systems graduate program mobile autonomous systems laboratory mit opencourseware https ocw mit edu courses electrical engineering and computer science 6 186 mobile autonomous systems laboratory january iap 2005 lecture notes reinforcement learning tools libraries and frameworks openai gym https gym openai com is an open source python library for developing and comparing reinforcement learning algorithms by providing a standard api to communicate between learning algorithms and environments as well as a standard set of environments compliant with that api reinforcementlearning jl https juliareinforcementlearning org is a collection of tools for doing reinforcement learning research in julia reinforcement learning toolbox https www mathworks com products reinforcement learning html is a tool that provides an app functions and a simulink block for training policies using reinforcement learning
algorithms including dqn ppo sac and ddpg you can use these policies to implement controllers and decision making algorithms for complex applications such as resource allocation robotics and autonomous systems amazon sagemaker https aws amazon com sagemaker is a fully managed service that provides every developer and data scientist with the ability to build train and deploy machine learning ml models quickly aws robomaker https aws amazon com robomaker is a service that provides a fully managed scalable infrastructure for simulation that customers use for multi robot simulation and ci cd integration with regression testing in simulation tensorflow https www tensorflow org is an end to end open source platform for machine learning it has a comprehensive flexible ecosystem of tools libraries and community resources that lets researchers push the state of the art in ml and developers easily build and deploy ml powered applications keras https keras io is a high level neural networks api written in python developed with a focus on enabling fast experimentation it is capable of running on top of tensorflow microsoft cognitive toolkit cntk theano or plaidml pytorch https pytorch org is an open source machine learning framework that accelerates the path from research prototyping to production deployment primarily developed by facebook s ai research lab scikit learn https scikit learn org stable index html is a simple and efficient tool for data mining and data analysis it is built on numpy scipy and matplotlib nvidia cudnn https developer nvidia com cudnn is a gpu accelerated library of primitives for deep neural networks https developer nvidia com deep learning cudnn provides highly tuned implementations for standard routines such as forward and backward convolution pooling normalization and activation layers cudnn accelerates widely used deep learning frameworks including caffe2 https caffe2 ai chainer https
keras io matlab https www mathworks com solutions deep learning html mxnet https mxnet incubator apache org pytorch https pytorch org and tensorflow https www tensorflow org jupyter notebook https jupyter org is an open source web application that allows you to create and share documents that contain live code equations visualizations and narrative text jupyter is used widely in industries that do data cleaning and transformation numerical simulation statistical modeling data visualization data science and machine learning apache spark https spark apache org is a unified analytics engine for large scale data processing it provides high level apis in scala java python and r and an optimized engine that supports general computation graphs for data analysis it also supports a rich set of higher level tools including spark sql for sql and dataframes mllib for machine learning graphx for graph processing and structured streaming for stream processing apache spark connector for sql server and azure sql https github com microsoft sql spark connector is a high performance connector that enables you to use transactional data in big data analytics and persists results for ad hoc queries or reporting the connector allows you to use any sql database on premises or in the cloud as an input data source or output data sink for spark jobs apache predictionio https predictionio apache org is an open source machine learning framework for developers data scientists and end users it supports event collection deployment of algorithms evaluation querying predictive results via rest apis it is based on scalable open source services like hadoop hbase and other dbs elasticsearch spark and implements what is called a lambda architecture cluster manager for apache kafka cmak https github com yahoo cmak is a tool for managing apache kafka https kafka apache org clusters bigdl https bigdl project github io is a distributed deep learning library for apache spark with bigdl users can write their 
deep learning applications as standard spark programs which can directly run on top of existing spark or hadoop clusters eclipse deeplearning4j dl4j https deeplearning4j konduit ai is a set of projects intended to support all the needs of a jvm based scala kotlin clojure and groovy deep learning application this means starting with the raw data loading and preprocessing it from wherever and whatever format it is in to building and tuning a wide variety of simple and complex deep learning networks deep learning toolbox https www mathworks com products deep learning html is a tool that provides a framework for designing and implementing deep neural networks with algorithms pretrained models and apps you can use convolutional neural networks convnets cnns and long short term memory lstm networks to perform classification and regression on image time series and text data you can build network architectures such as generative adversarial networks gans and siamese networks using automatic differentiation custom training loops and shared weights with the deep network designer app you can design analyze and train networks graphically it can exchange models with tensorflow and pytorch through the onnx format and import models from tensorflow keras and caffe the toolbox supports transfer learning with darknet 53 resnet 50 nasnet squeezenet and many other pretrained models deep learning hdl toolbox https www mathworks com products deep learning hdl html is a tool that provides functions and tools to prototype and implement deep learning networks on fpgas and socs it provides pre built bitstreams for running a variety of deep learning networks on supported xilinx and intel fpga and soc devices profiling and estimation tools let you customize a deep learning network by exploring design performance and resource utilization tradeoffs parallel computing toolbox https www mathworks com products matlab parallel server html is a tool that lets you solve computationally and data 
intensive problems using multicore processors gpus and computer clusters high level constructs such as parallel for loops special array types and parallelized numerical algorithms enable you to parallelize matlab applications without cuda or mpi programming the toolbox lets you use parallel enabled functions in matlab and other toolboxes you can use the toolbox with simulink to run multiple simulations of a model in parallel programs and models can run in both interactive and batch modes xgboost https xgboost readthedocs io is an optimized distributed gradient boosting library designed to be highly efficient flexible and portable it implements machine learning algorithms under the gradient boosting framework xgboost provides a parallel tree boosting also known as gbdt gbm that solves many data science problems in a fast and accurate way it supports distributed training on multiple machines including aws gce azure and yarn clusters it can also be integrated with flink spark and other cloud dataflow systems libsvm https www csie ntu edu tw cjlin libsvm is an integrated software for support vector classification c svc nu svc regression epsilon svr nu svr and distribution estimation one class svm it supports multi class classification azure databricks https azure microsoft com en us services databricks is a fast and collaborative apache spark based big data analytics service designed for data science and data engineering azure databricks sets up your apache spark environment in minutes and lets you autoscale and collaborate on shared projects in an interactive workspace azure databricks supports python scala r java and sql as well as data science frameworks and libraries including tensorflow pytorch and scikit learn microsoft cognitive toolkit cntk https docs microsoft com en us cognitive toolkit is an open source toolkit for commercial grade distributed deep learning it describes neural networks as a series of computational steps via a directed graph cntk allows the user to easily
realize and combine popular model types such as feed forward dnns convolutional neural networks cnns and recurrent neural networks rnns lstms cntk implements stochastic gradient descent sgd error backpropagation learning with automatic differentiation and parallelization across multiple gpus and servers tensorflow macos https github com apple tensorflow macos is a mac optimized version of tensorflow and tensorflow addons for macos 11 0 accelerated using apple s ml compute framework apache airflow https airflow apache org is an open source workflow management platform created by the community to programmatically author schedule and monitor workflows airflow has a modular architecture and uses a message queue to orchestrate an arbitrary number of workers airflow is ready to scale to infinity open neural network exchange onnx https github com onnx is an open ecosystem that empowers ai developers to choose the right tools as their project evolves onnx provides an open source format for ai models both deep learning and traditional ml it defines an extensible computation graph model as well as definitions of built in operators and standard data types apache mxnet https mxnet apache org is a deep learning framework designed for both efficiency and flexibility it allows you to mix symbolic and imperative programming to maximize efficiency and productivity at its core mxnet contains a dynamic dependency scheduler that automatically parallelizes both symbolic and imperative operations on the fly a graph optimization layer on top of that makes symbolic execution fast and memory efficient mxnet is portable and lightweight scaling effectively to multiple gpus and multiple machines with support for python r julia scala go javascript and more autogluon https autogluon mxnet io index html is a toolkit for deep learning that automates machine learning tasks enabling you to easily achieve strong predictive performance in your applications with just a few lines
of code you can train and deploy high accuracy deep learning models on tabular image and text data anaconda https www anaconda com is a very popular data science platform for machine learning and deep learning that enables users to develop models train them and deploy them plaidml https github com plaidml plaidml is an advanced and portable tensor compiler for enabling deep learning on laptops embedded devices or other devices where the available computing hardware is not well supported or the available software stack contains unpalatable license restrictions opencv https opencv org is a highly optimized library with focus on real time computer vision applications the c python and java interfaces support linux macos windows ios and android scikit learn https scikit learn org stable index html is a python module for machine learning built on top of scipy numpy and matplotlib making it easier to apply robust and simple implementations of many popular machine learning algorithms weka https www cs waikato ac nz ml weka is an open source machine learning software that can be accessed through a graphical user interface standard terminal applications or a java api it is widely used for teaching research and industrial applications contains a plethora of built in tools for standard machine learning tasks and additionally gives transparent access to well known toolboxes such as scikit learn r and deeplearning4j caffe https github com bvlc caffe is a deep learning framework made with expression speed and modularity in mind it is developed by berkeley ai research bair the berkeley vision and learning center bvlc and community contributors theano https github com theano theano is a python library that allows you to define optimize and evaluate mathematical expressions involving multi dimensional arrays efficiently including tight integration with numpy microsoft project bonsai https azure microsoft com en us services project bonsai is a low code ai platform that speeds ai 
powered automation development and is part of the autonomous systems suite from microsoft bonsai is used to build ai components that can provide operator guidance or make independent decisions to optimize process variables improve production efficiency and reduce downtime microsoft airsim https microsoft github io airsim lidar html is a simulator for drones cars and more built on unreal engine with an experimental unity release airsim is open source cross platform and supports software in the loop simulation https www mathworks com help ecoder software in the loop sil simulation html with popular flight controllers such as px4 ardupilot and hardware in the loop https www ni com en us innovations white papers 17 what is hardware in the loop html with px4 for physically and visually realistic simulations it is developed as an unreal plugin that can simply be dropped into any unreal environment airsim is being developed as a platform for ai research to experiment with deep learning computer vision and reinforcement learning algorithms for autonomous vehicles carla https github com carla simulator carla is an open source simulator for autonomous driving research carla has been developed from the ground up to support development training and validation of autonomous driving systems in addition to open source code and protocols carla provides open digital assets urban layouts buildings vehicles that were created for this purpose and can be used freely ros ros2 bridge for carla package https github com carla simulator ros bridge is a bridge that enables two way communication between ros and carla the information from the carla server is translated to ros topics in the same way the messages sent between nodes in ros get translated to commands to be applied in carla ros toolbox https www mathworks com products ros html is a tool that provides an interface connecting matlab and simulink with the robot operating system ros and ros 2 enabling you to create a network of ros nodes the
toolbox includes matlab functions and simulink blocks to import analyze and play back ros data recorded in rosbag files you can also connect to a live ros network to access ros messages robotics toolbox https www mathworks com products robotics html provides a toolbox that brings robotics specific functionality designing simulating and testing manipulators mobile robots and humanoid robots to matlab exploiting the native capabilities of matlab linear algebra portability graphics the toolbox also supports mobile robots with functions for robot motion models bicycle path planning algorithms bug distance transform d prm kinodynamic planning lattice rrt localization ekf particle filter map building ekf and simultaneous localization and mapping ekf and a simulink model of a non holonomic vehicle the toolbox also includes a detailed simulink model for a quadrotor flying robot image processing toolbox https www mathworks com products image html is a tool that provides a comprehensive set of reference standard algorithms and workflow apps for image processing analysis visualization and algorithm development you can perform image segmentation image enhancement noise reduction geometric transformations image registration and 3d image processing computer vision toolbox https www mathworks com products computer vision html is a tool that provides algorithms functions and apps for designing and testing computer vision 3d vision and video processing systems you can perform object detection and tracking as well as feature detection extraction and matching you can automate calibration workflows for single stereo and fisheye cameras for 3d vision the toolbox supports visual and point cloud slam stereo vision structure from motion and point cloud processing model predictive control toolbox https www mathworks com products model predictive control html is a tool that provides functions an app and simulink blocks for designing and simulating controllers using linear and nonlinear model predictive control mpc the toolbox lets you specify plant and disturbance models horizons constraints and weights by running closed loop simulations you can evaluate controller performance predictive maintenance toolbox https www mathworks com products predictive maintenance html is a tool that lets you manage sensor data design condition indicators and estimate the remaining useful life rul of a machine the toolbox provides functions and an interactive app for exploring extracting and ranking features using data based and model based techniques including statistical spectral and time series analysis vision hdl toolbox https www mathworks com products vision hdl html is a tool that provides pixel streaming algorithms for the design and implementation of vision systems on fpgas and asics it provides a design framework that supports a diverse set of interface types frame sizes and frame rates the image processing video and computer vision algorithms in the toolbox use an architecture appropriate for hdl implementations automated driving toolbox https www mathworks com products automated driving html is a matlab tool that provides algorithms and tools for designing simulating and testing adas and autonomous driving systems you can design and test vision and lidar
perception systems as well as sensor fusion path planning and vehicle controllers visualization tools include a bird s eye view plot and scope for sensor coverage detections and tracks and displays for video lidar and maps the toolbox lets you import and work with here hd live map data and opendrive road networks it also provides reference application examples for common adas and automated driving features including fcw aeb acc lka and parking valet the toolbox supports c c code generation for rapid prototyping and hil testing with support for sensor fusion tracking path planning and vehicle controller algorithms navigation toolbox https www mathworks com products navigation html is a tool that provides algorithms and analysis tools for motion planning simultaneous localization and mapping slam and inertial navigation the toolbox includes customizable search and sampling based path planners as well as metrics for validating and comparing paths you can create 2d and 3d map representations generate maps using slam algorithms and interactively visualize and debug map generation with the slam map builder app uav toolbox https www mathworks com products uav html is an application that provides tools and reference applications for designing simulating testing and deploying unmanned aerial vehicle uav and drone applications you can design autonomous flight algorithms uav missions and flight controllers the flight log analyzer app lets you interactively analyze 3d flight paths telemetry information and sensor readings from common flight log formats lidar toolbox https www mathworks com products lidar html is a tool that provides algorithms functions and apps for designing analyzing and testing lidar processing systems you can perform object detection and tracking semantic segmentation shape fitting lidar registration and obstacle detection lidar toolbox supports lidar camera cross calibration for workflows that combine computer vision and lidar processing mapping toolbox 
https www mathworks com products mapping html is a tool that provides algorithms and functions for transforming geographic data and creating map displays you can visualize your data in a geographic context build map displays from more than 60 map projections and transform data from a variety of sources into a consistent geographic coordinate system computer vision development back to the top https github com mikeroyal machine learning guide table of contents p align center img src https user images githubusercontent com 45159366 129494417 b0ee8192 ac41 4a6d 8e1d 4761ffc8bab1 png br p computer vision learning resources computer vision https azure microsoft com en us overview what is computer vision is a field of artificial intelligence ai that focuses on enabling computers to identify and understand objects and people in images and videos opencv courses https opencv org courses exploring computer vision in microsoft azure https docs microsoft com en us learn paths explore computer vision microsoft azure top computer vision courses online coursera https www coursera org courses languages en query computer 20vision top computer vision courses online udemy https www udemy com topic computer vision learn computer vision with online courses and lessons edx https www edx org learn computer vision computer vision and image processing fundamentals edx https www edx org course computer vision and image processing fundamentals introduction to computer vision courses udacity https www udacity com course introduction to computer vision ud810 computer vision nanodegree program udacity https www udacity com course computer vision nanodegree nd891 machine vision course mit open courseware https ocw mit edu courses electrical engineering and computer science 6 801 machine vision fall 2004 computer vision training courses nobleprog https www nobleprog com computer vision training visual computing graduate program stanford online https online stanford edu programs visual computing 
graduate program computer vision tools libraries and frameworks opencv https opencv org is a highly optimized library with focus on real time computer vision applications the c python and java interfaces support linux macos windows ios and android microsoft computer vision recipes https github com microsoft computervision recipes is a project that provides examples and best practice guidelines for building computer vision systems the project offers a comprehensive set of tools and examples that leverage recent advances in computer vision algorithms and neural architectures building on existing state of the art libraries and adding utility around loading image data optimizing and evaluating models and scaling up to the cloud microsoft cognitive toolkit cntk https docs microsoft com en us cognitive toolkit is an open source toolkit for commercial grade distributed deep learning it describes neural networks as a series of computational steps via a directed graph cntk allows the user to easily realize and combine popular model types such as feed forward dnns convolutional neural networks cnns and recurrent neural networks rnns lstms cntk implements stochastic gradient descent sgd error backpropagation learning with automatic differentiation and parallelization across multiple gpus and servers scikit learn https scikit learn org stable index html is a python module for machine learning built on top of scipy numpy and matplotlib making it easier to apply robust and simple implementations of many popular machine learning algorithms nvidia cudnn https developer nvidia com cudnn is a gpu accelerated library of primitives for deep neural networks https developer nvidia com deep learning cudnn provides highly tuned implementations for standard routines such as forward and backward convolution pooling normalization and activation layers cudnn accelerates widely used deep learning frameworks including caffe2 https caffe2 ai
chainer https chainer org keras https keras io matlab https www mathworks com solutions deep learning html mxnet https mxnet incubator apache org pytorch https pytorch org and tensorflow https www tensorflow org automated driving toolbox https www mathworks com products automated driving html is a matlab tool that provides algorithms and tools for designing simulating and testing adas and autonomous driving systems you can design and test vision and lidar perception systems as well as sensor fusion path planning and vehicle controllers visualization tools include a bird s eye view plot and scope for sensor coverage detections and tracks and displays for video lidar and maps the toolbox lets you import and work with here hd live map data and opendrive road networks it also provides reference application examples for common adas and automated driving features including fcw aeb acc lka and parking valet the toolbox supports c c code generation for rapid prototyping and hil testing with support for sensor fusion tracking path planning and vehicle controller algorithms lrslibrary https github com andrewssobral lrslibrary provides low rank and sparse tools for background modeling and subtraction in videos the library was designed for moving object detection in videos but it can also be used for other computer vision and machine learning problems image processing toolbox https www mathworks com products image html is a tool that provides a comprehensive set of reference standard algorithms and workflow apps for image processing analysis visualization and algorithm development you can perform image segmentation image enhancement noise reduction geometric transformations image registration and 3d image processing computer vision toolbox https www mathworks com products computer vision html is a tool that provides algorithms functions and apps for designing and testing computer vision 3d vision and video processing systems you can perform object detection and tracking as well as
feature detection extraction and matching you can automate calibration workflows for single stereo and fisheye cameras for 3d vision the toolbox supports visual and point cloud slam stereo vision structure from motion and point cloud processing statistics and machine learning toolbox https www mathworks com products statistics html is a tool that provides functions and apps to describe analyze and model data you can use descriptive statistics visualizations and clustering for exploratory data analysis fit probability distributions to data generate random numbers for monte carlo simulations and perform hypothesis tests regression and classification algorithms let you draw inferences from data and build predictive models either interactively using the classification and regression learner apps or programmatically using automl lidar toolbox https www mathworks com products lidar html is a tool that provides algorithms functions and apps for designing analyzing and testing lidar processing systems you can perform object detection and tracking semantic segmentation shape fitting lidar registration and obstacle detection lidar toolbox supports lidar camera cross calibration for workflows that combine computer vision and lidar processing mapping toolbox https www mathworks com products mapping html is a tool that provides algorithms and functions for transforming geographic data and creating map displays you can visualize your data in a geographic context build map displays from more than 60 map projections and transform data from a variety of sources into a consistent geographic coordinate system uav toolbox https www mathworks com products uav html is an application that provides tools and reference applications for designing simulating testing and deploying unmanned aerial vehicle uav and drone applications you can design autonomous flight algorithms uav missions and flight controllers the flight log analyzer app lets you interactively analyze 3d flight paths telemetry 
information and sensor readings from common flight log formats parallel computing toolbox https www mathworks com products matlab parallel server html is a tool that lets you solve computationally and data intensive problems using multicore processors gpus and computer clusters high level constructs such as parallel for loops special array types and parallelized numerical algorithms enable you to parallelize matlab applications without cuda or mpi programming the toolbox lets you use parallel enabled functions in matlab and other toolboxes you can use the toolbox with simulink to run multiple simulations of a model in parallel programs and models can run in both interactive and batch modes partial differential equation toolbox https www mathworks com products pde html is a tool that provides functions for solving structural mechanics heat transfer and general partial differential equations pdes using finite element analysis ros toolbox https www mathworks com products ros html is a tool that provides an interface connecting matlab and simulink with the robot operating system ros and ros 2 enabling you to create a network of ros nodes the toolbox includes matlab functions and simulink blocks to import analyze and play back ros data recorded in rosbag files you can also connect to a live ros network to access ros messages robotics toolbox https www mathworks com products robotics html is a toolbox that brings robotics specific functionality for designing simulating and testing manipulators mobile robots and humanoid robots to matlab exploiting the native capabilities of matlab linear algebra portability graphics the toolbox also supports mobile robots with functions for robot motion models bicycle path planning algorithms bug distance transform d prm kinodynamic planning lattice rrt localization ekf particle filter map building ekf and simultaneous localization and mapping ekf and a simulink model of a non holonomic vehicle the toolbox also includes a detailed
simulink model for a quadrotor flying robot deep learning toolbox https www mathworks com products deep learning html is a tool that provides a framework for designing and implementing deep neural networks with algorithms pretrained models and apps you can use convolutional neural networks convnets cnns and long short term memory lstm networks to perform classification and regression on image time series and text data you can build network architectures such as generative adversarial networks gans and siamese networks using automatic differentiation custom training loops and shared weights with the deep network designer app you can design analyze and train networks graphically it can exchange models with tensorflow and pytorch through the onnx format and import models from tensorflow keras and caffe the toolbox supports transfer learning with darknet 53 resnet 50 nasnet squeezenet and many other pretrained models reinforcement learning toolbox https www mathworks com products reinforcement learning html is a tool that provides an app functions and a simulink block for training policies using reinforcement learning algorithms including dqn ppo sac and ddpg you can use these policies to implement controllers and decision making algorithms for complex applications such as resource allocation robotics and autonomous systems deep learning hdl toolbox https www mathworks com products deep learning hdl html is a tool that provides functions and tools to prototype and implement deep learning networks on fpgas and socs it provides pre built bitstreams for running a variety of deep learning networks on supported xilinx and intel fpga and soc devices profiling and estimation tools let you customize a deep learning network by exploring design performance and resource utilization tradeoffs model predictive control toolbox https www mathworks com products model predictive control html is a tool that provides functions an app and simulink blocks for designing and simulating 
controllers using linear and nonlinear model predictive control mpc the toolbox lets you specify plant and disturbance models horizons constraints and weights by running closed loop simulations you can evaluate controller performance vision hdl toolbox https www mathworks com products vision hdl html is a tool that provides pixel streaming algorithms for the design and implementation of vision systems on fpgas and asics it provides a design framework that supports a diverse set of interface types frame sizes and frame rates the image processing video and computer vision algorithms in the toolbox use an architecture appropriate for hdl implementations data acquisition toolbox https www mathworks com products data acquisition html is a tool that provides apps and functions for configuring data acquisition hardware reading data into matlab and simulink and writing data to daq analog and digital output channels the toolbox supports a variety of daq hardware including usb pci pci express pxi and pxi express devices from national instruments and other vendors microsoft airsim https microsoft github io airsim lidar html is a simulator for drones cars and more built on unreal engine with an experimental unity release airsim is open source cross platform and supports software in the loop simulation https www mathworks com help ecoder software in the loop sil simulation html with popular flight controllers such as px4 ardupilot and hardware in loop https www ni com en us innovations white papers 17 what is hardware in the loop html with px4 for physically and visually realistic simulations it is developed as an unreal plugin that can simply be dropped into any unreal environment airsim is being developed as a platform for ai research to experiment with deep learning computer vision and reinforcement learning algorithms for autonomous vehicles nlp development back to the top https github com mikeroyal machine learning guide table of contents p align center img src https user 
images githubusercontent com 45159366 131386286 e23991d5 a1aa 4ee9 9582 874dc0854c1a png br p nlp learning resources natural language processing nlp https www ibm com cloud learn natural language processing is a branch of artificial intelligence ai focused on giving computers the ability to understand text and spoken words in much the same way human beings can nlp combines computational linguistics rule based modeling of human language with statistical machine learning and deep learning models natural language processing with python s nltk package https realpython com nltk nlp python cognitive services apis for ai developers microsoft azure https azure microsoft com en us services cognitive services artificial intelligence services amazon web services aws https aws amazon com machine learning ai services google cloud natural language api https cloud google com natural language docs reference rest top natural language processing courses online udemy https www udemy com topic natural language processing introduction to natural language processing nlp udemy https www udemy com course natural language processing top natural language processing courses coursera https www coursera org courses query natural 20language 20processing natural language processing coursera https www coursera org learn language processing natural language processing in tensorflow coursera https www coursera org learn natural language processing tensorflow learn natural language processing with online courses and lessons edx https www edx org learn natural language processing build a natural language processing solution with microsoft azure pluralsight https www pluralsight com courses build natural language processing solution microsoft azure natural language processing nlp training courses nobleprog https www nobleprog com nlp training natural language processing with deep learning course stanford online https online stanford edu courses cs224n natural language processing deep learning
advanced natural language processing mit opencourseware https ocw mit edu courses electrical engineering and computer science 6 864 advanced natural language processing fall 2005 certified natural language processing expert certification iabac https iabac org artificial intelligence certification certified natural language processing expert natural language processing course intel https software intel com content www us en develop training course natural language processing html nlp tools libraries and frameworks natural language toolkit nltk https www nltk org is a leading platform for building python programs to work with human language data it provides easy to use interfaces to over 50 corpora and lexical resources https nltk org nltk data such as wordnet along with a suite of text processing libraries for classification tokenization stemming tagging parsing and semantic reasoning wrappers for industrial strength nlp libraries spacy https spacy io is a library for advanced natural language processing in python and cython it s built on the very latest research and was designed from day one to be used in real products spacy comes with pretrained pipelines and currently supports tokenization and training for 60 languages it also features neural network models for tagging parsing named entity recognition text classification and more multi task learning with pretrained transformers like bert corenlp https stanfordnlp github io corenlp is a set of natural language analysis tools written in java corenlp enables users to derive linguistic annotations for text including token and sentence boundaries parts of speech named entities numeric and time values dependency and constituency parses coreference sentiment quote attributions and relations nlpnet https github com erickrf nlpnet is a python library for natural language processing tasks based on neural networks it performs part of speech tagging semantic role labeling and dependency parsing flair https github com 
flairnlp flair is a simple framework for applying state of the art natural language processing nlp models to your text such as named entity recognition ner and part of speech tagging pos with special support for biomedical data sense disambiguation and classification with support for a rapidly growing number of languages catalyst https github com curiosity ai catalyst is a c# natural language processing library built for speed inspired by spacy s design https spacy io it brings pre trained models out of the box support for training word and document embeddings and flexible entity recognition models apache opennlp https opennlp apache org is an open source machine learning based toolkit used in the processing of natural language text it features an api for use cases like named entity recognition https en wikipedia org wiki named entity recognition sentence detection pos part of speech tagging https en wikipedia org wiki part of speech tagging tokenization https en wikipedia org wiki tokenization data security feature extraction https en wikipedia org wiki feature extraction chunking https en wikipedia org wiki chunking psychology parsing https en wikipedia org wiki parsing and coreference resolution https en wikipedia org wiki coreference microsoft cognitive toolkit cntk https docs microsoft com en us cognitive toolkit is an open source toolkit for commercial grade distributed deep learning it describes neural networks as a series of computational steps via a directed graph cntk allows the user to easily realize and combine popular model types such as feed forward dnns convolutional neural networks cnns and recurrent neural networks rnns lstms cntk implements stochastic gradient descent sgd error backpropagation learning with automatic differentiation and parallelization across multiple gpus and servers nvidia cudnn https developer nvidia com cudnn is a gpu accelerated library of primitives for deep neural networks https developer nvidia com deep learning cudnn provides
highly tuned implementations for standard routines such as forward and backward convolution pooling normalization and activation layers cudnn accelerates widely used deep learning frameworks including caffe2 https caffe2 ai chainer https chainer org keras https keras io matlab https www mathworks com solutions deep learning html mxnet https mxnet incubator apache org pytorch https pytorch org and tensorflow https www tensorflow org tensorflow https www tensorflow org is an end to end open source platform for machine learning it has a comprehensive flexible ecosystem of tools libraries and community resources that lets researchers push the state of the art in ml and developers easily build and deploy ml powered applications tensorflow macos https github com apple tensorflow macos is a mac optimized version of tensorflow and tensorflow addons for macos 11 0 accelerated using apple s ml compute framework keras https keras io is a high level neural networks api written in python developed with a focus on enabling fast experimentation it is capable of running on top of tensorflow microsoft cognitive toolkit cntk r theano or plaidml pytorch https pytorch org is an open source machine learning framework that accelerates the path from research prototyping to production deployment primarily developed by facebook s ai research lab eclipse deeplearning4j dl4j https deeplearning4j konduit ai is a set of projects intended to support all the needs of a jvm based scala kotlin clojure and groovy deep learning application this means starting with the raw data loading and preprocessing it from wherever and whatever format it is in to building and tuning a wide variety of simple and complex deep learning networks chainer https chainer org is a python based deep learning framework aiming at flexibility it provides automatic differentiation apis based on the define by run approach dynamic computational graphs as well as object oriented high level
apis to build and train neural networks it also supports cuda cudnn using cupy https github com cupy cupy for high performance training and inference anaconda https www anaconda com is a very popular data science platform for machine learning and deep learning that enables users to develop models train them and deploy them plaidml https github com plaidml plaidml is an advanced and portable tensor compiler for enabling deep learning on laptops embedded devices or other devices where the available computing hardware is not well supported or the available software stack contains unpalatable license restrictions scikit learn https scikit learn org stable index html is a python module for machine learning built on top of scipy numpy and matplotlib making it easier to apply robust and simple implementations of many popular machine learning algorithms caffe https github com bvlc caffe is a deep learning framework made with expression speed and modularity in mind it is developed by berkeley ai research bair the berkeley vision and learning center bvlc and community contributors theano https github com theano theano is a python library that allows you to define optimize and evaluate mathematical expressions involving multi dimensional arrays efficiently including tight integration with numpy apache spark https spark apache org is a unified analytics engine for large scale data processing it provides high level apis in scala java python and r and an optimized engine that supports general computation graphs for data analysis it also supports a rich set of higher level tools including spark sql for sql and dataframes mllib for machine learning graphx for graph processing and structured streaming for stream processing apache spark connector for sql server and azure sql https github com microsoft sql spark connector is a high performance connector that enables you to use transactional data in big data analytics and persists results for ad hoc queries or reporting the connector 
allows you to use any sql database on premises or in the cloud as an input data source or output data sink for spark jobs apache predictionio https predictionio apache org is an open source machine learning framework for developers data scientists and end users it supports event collection deployment of algorithms evaluation querying predictive results via rest apis it is based on scalable open source services like hadoop hbase and other dbs elasticsearch spark and implements what is called a lambda architecture apache airflow https airflow apache org is an open source workflow management platform created by the community to programmatically author schedule and monitor workflows airflow has a modular architecture and uses a message queue to orchestrate an arbitrary number of workers airflow is ready to scale to infinity open neural network exchange onnx https github com onnx is an open ecosystem that empowers ai developers to choose the right tools as their project evolves onnx provides an open source format for ai models both deep learning and traditional ml it defines an extensible computation graph model as well as definitions of built in operators and standard data types bigdl https bigdl project github io is a distributed deep learning library for apache spark with bigdl users can write their deep learning applications as standard spark programs which can directly run on top of existing spark or hadoop clusters numba https github com numba numba is an open source numpy aware optimizing compiler for python sponsored by anaconda inc it uses the llvm compiler project to generate machine code from python syntax numba can compile a large subset of numerically focused python including many numpy functions additionally numba has support for automatic parallelization of loops generation of gpu accelerated code and creation of ufuncs and c callbacks bioinformatics back to the top https github com mikeroyal machine learning guide table of contents p align center img src 
https user images githubusercontent com 45159366 126912438 49cc660e 90c5 4cbc 828b 10e807b99767 png br p bioinformatics learning resources bioinformatics https www genome gov genetics glossary bioinformatics is a field of computational science that has to do with the analysis of sequences of biological molecules this usually refers to genes dna rna or protein and is particularly useful in comparing genes and other sequences in proteins and other sequences within an organism or between organisms looking at evolutionary relationships between organisms and using the patterns that exist across dna and protein sequences to figure out what their function is european bioinformatics institute https www ebi ac uk national center for biotechnology information https www ncbi nlm nih gov online courses in bioinformatics iscb international society for computational biology https www iscb org cms addon online courses index php bioinformatics coursera https www coursera org specializations bioinformatics top bioinformatics courses udemy https www udemy com topic bioinformatics biometrics courses udemy https www udemy com course biometrics learn bioinformatics with online courses and lessons edx https www edx org learn bioinformatics bioinformatics graduate certificate harvard extension school https extension harvard edu academics programs bioinformatics graduate certificate bioinformatics and biostatistics uc san diego extension https extension ucsd edu courses and programs bioinformatics and biostatistics bioinformatics and proteomics free online course materials mit https ocw mit edu courses electrical engineering and computer science 6 092 bioinformatics and proteomics january iap 2005 introduction to biometrics course biometrics institute https www biometricsinstitute org event introduction to biometrics short course bioinformatics tools libraries and frameworks bioconductor https bioconductor org is an open source project that provides tools for the analysis and 
comprehension of high throughput genomic data bioconductor uses the r statistical programming language https www r project org about html and is open source and open development it has two releases each year and an active user community bioconductor is also available as an ami amazon machine image https docs aws amazon com awsec2 latest userguide amis html and docker images https docs docker com engine reference commandline images bioconda https bioconda github io is a channel for the conda package manager specializing in bioinformatics software it has a repository of packages containing over 7000 bioinformatics packages ready to use with conda install uniprot https www uniprot org is a freely accessible database that provides users with a comprehensive high quality and freely accessible set of protein sequences annotated with functional information bowtie 2 https bio tools bowtie2 is an ultrafast and memory efficient tool for aligning sequencing reads to long reference sequences it is particularly good at aligning reads of about 50 up to 100s or 1 000s of characters and at aligning to relatively long mammalian genomes biopython https biopython org is a set of freely available tools for biological computation written in python by an international team of developers it is a distributed collaborative effort to develop python libraries and applications which address the needs of current and future work in bioinformatics bioruby https bioruby open bio org is a toolkit that has components for sequence analysis pathway analysis protein modelling and phylogenetic analysis it supports many widely used data formats and provides easy access to databases external programs and public web services including blast kegg genbank medline and go biojava https biojava org is a toolkit that provides an api to maintain local installations of the pdb load and manipulate structures perform standard analysis such as sequence and structure alignments and visualize them in
3d biophp https biophp org is an open source project that provides a collection of open source php code with classes for dna and protein sequence analysis alignment database parsing and other bioinformatics tools avogadro https avogadro cc is an advanced molecule editor and visualizer designed for cross platform use in computational chemistry molecular modeling bioinformatics materials science and related areas it offers flexible high quality rendering and a powerful plugin architecture ascalaph designer https www biomolecular modeling com ascalaph ascalaph designer html is a program for molecular dynamics simulations that brings together under a single graphical environment its own implementation of molecular dynamics as well as classical and quantum mechanics methods from popular programs anduril https www anduril org site is a workflow platform for analyzing large data sets anduril provides facilities for analyzing high throughput data in biomedical research and the platform is fully extensible by third parties ready made tools support data visualization dna rna chip sequencing dna rna microarrays cytometry and image analysis galaxy https melbournebioinformatics github io melbioinf docs tutorials galaxy 101 galaxy 101 is an open source web based platform for accessible reproducible and transparent computational biomedical research it allows users without programming experience to easily specify parameters and run individual tools as well as larger workflows it also captures run information so that any user can repeat and understand a complete computational analysis pathvisio https pathvisio github io is a free open source pathway analysis and drawing software which allows drawing editing and analyzing biological pathways it is developed in java and can be extended with plugins orange https orangedatamining com is a powerful data mining and machine learning toolkit that performs data analysis and visualization basic local alignment search tool https blast ncbi
nlm nih gov blast cgi is a tool that finds regions of similarity between biological sequences the program compares nucleotide or protein sequences to sequence databases and calculates the statistical significance osiris https www ncbi nlm nih gov osiris is public domain free and open source str analysis software designed for clinical forensic and research use and has been validated for use as an expert system for single source samples ncbi biosystems https www ncbi nlm nih gov biosystems is a database that provides integrated access to biological systems and their component genes proteins and small molecules as well as literature describing those biosystems and other related data throughout entrez cuda development back to the top https github com mikeroyal machine learning guide table of contents p align center img src https user images githubusercontent com 45159366 94306481 e17b8f00 ff27 11ea 832f c85374acb3b1 png br p p align center img src https user images githubusercontent com 45159366 117718735 55a23480 b191 11eb 874d e690d09cd490 png br p cuda toolkit source nvidia developer cuda https developer nvidia com cuda zone cuda learning resources cuda https developer nvidia com cuda zone is a parallel computing platform and programming model developed by nvidia for general computing on graphical processing units gpus with cuda developers are able to dramatically speed up computing applications by harnessing the power of gpus in gpu accelerated applications the sequential part of the workload runs on the cpu which is optimized for single threaded performance while the compute intensive portion of the application runs on thousands of gpu cores in parallel when using cuda developers can program in popular languages such as c c fortran python and matlab cuda toolkit documentation https docs nvidia com cuda index html cuda quick start guide https docs nvidia com cuda cuda quick start guide index html cuda on wsl https docs nvidia com cuda wsl user guide index html cuda gpu support for
tensorflow https www tensorflow org install gpu nvidia deep learning cudnn documentation https docs nvidia com deeplearning cudnn api index html nvidia gpu cloud documentation https docs nvidia com ngc ngc introduction index html nvidia ngc https ngc nvidia com is a hub for gpu optimized software for deep learning machine learning and high performance computing hpc workloads nvidia ngc containers https www nvidia com en us gpu cloud containers is a registry that provides researchers data scientists and developers with simple access to a comprehensive catalog of gpu accelerated software for ai machine learning and hpc these containers take full advantage of nvidia gpus on premises and in the cloud cuda tools libraries and frameworks cuda toolkit https developer nvidia com cuda downloads is a collection of tools libraries that provides a development environment for creating high performance gpu accelerated applications the cuda toolkit allows you to develop optimize and deploy your applications on gpu accelerated embedded systems desktop workstations enterprise data centers cloud based platforms and hpc supercomputers the toolkit includes gpu accelerated libraries debugging and optimization tools a c c compiler and a runtime library to build and deploy your application on major architectures including x86 arm and power nvidia cudnn https developer nvidia com cudnn is a gpu accelerated library of primitives for deep neural networks https developer nvidia com deep learning cudnn provides highly tuned implementations for standard routines such as forward and backward convolution pooling normalization and activation layers cudnn accelerates widely used deep learning frameworks including caffe2 https caffe2 ai chainer https chainer org keras https keras io matlab https www mathworks com solutions deep learning html mxnet https mxnet incubator apache org pytorch https pytorch org and tensorflow https www tensorflow org cuda x hpc https www nvidia com en us technologies
cuda x is a collection of libraries tools compilers and apis that help developers solve the world s most challenging problems cuda x hpc includes highly tuned kernels essential for high performance computing hpc nvidia container toolkit https github com nvidia nvidia docker is a collection of tools libraries that allows users to build and run gpu accelerated docker containers the toolkit includes a container runtime library https github com nvidia libnvidia container and utilities to automatically configure containers to leverage nvidia gpus minkowski engine https nvidia github io minkowskiengine is an auto differentiation library for sparse tensors it supports all standard neural network layers such as convolution pooling unpooling and broadcasting operations for sparse tensors cutlass https github com nvidia cutlass is a collection of cuda c template abstractions for implementing high performance matrix multiplication gemm at all levels and scales within cuda it incorporates strategies for hierarchical decomposition and data movement similar to those used to implement cublas cub https github com nvidia cub is a library of cooperative primitives for cuda c kernel authors tensorman https github com pop os tensorman is a utility for easy management of tensorflow containers developed by system76 https system76 com tensorman allows tensorflow to operate in an isolated environment that is contained from the rest of the system this virtual environment can operate independent of the base system allowing you to use any version of tensorflow on any version of a linux distribution that supports the docker runtime numba https github com numba numba is an open source numpy aware optimizing compiler for python sponsored by anaconda inc it uses the llvm compiler project to generate machine code from python syntax numba can compile a large subset of numerically focused python including many numpy functions additionally numba has support for automatic parallelization of loops generation
of gpu accelerated code and creation of ufuncs and c callbacks chainer https chainer org is a python based deep learning framework aiming at flexibility it provides automatic differentiation apis based on the define by run approach dynamic computational graphs as well as object oriented high level apis to build and train neural networks it also supports cuda cudnn using cupy https github com cupy cupy for high performance training and inference cupy https cupy dev is an implementation of numpy compatible multi dimensional array on cuda cupy consists of the core multi dimensional array class cupy ndarray and many functions on it it supports a subset of the numpy ndarray interface catboost https catboost ai is a fast scalable high performance gradient boosting https en wikipedia org wiki gradient boosting on decision trees library used for ranking classification regression and other machine learning tasks for python r java and c it supports computation on cpu and gpu cudf https rapids ai is a gpu dataframe library for loading joining aggregating filtering and otherwise manipulating data cudf provides a pandas like api that will be familiar to data engineers and data scientists so they can use it to easily accelerate their workflows without going into the details of cuda programming cuml https github com rapidsai cuml is a suite of libraries that implement machine learning algorithms and mathematical primitives functions that share compatible apis with other rapids projects cuml enables data scientists researchers and software engineers to run traditional tabular ml tasks on gpus without going into the details of cuda programming in most cases cuml s python api matches the api from scikit learn arrayfire https arrayfire com is a general purpose library that simplifies the process of developing software that targets parallel and massively parallel architectures including cpus gpus and other hardware acceleration devices thrust https github com nvidia thrust is a c parallel
programming library which resembles the c standard library thrust s high level interface greatly enhances programmer productivity while enabling performance portability between gpus and multicore cpus aresdb https eng uber com aresdb is a gpu powered real time analytics storage and query engine it features low query latency high data freshness and highly efficient in memory and on disk storage management arraymancer https mratsim github io arraymancer is a tensor n dimensional array project in nim the main focus is providing a fast and ergonomic cpu cuda and opencl ndarray library on which to build a scientific computing ecosystem kintinuous https github com mp3guy kintinuous is a real time dense visual slam system capable of producing high quality globally consistent point and mesh reconstructions over hundreds of metres in real time with only a low cost commodity rgb d sensor graphvite https graphvite io is a general graph embedding engine dedicated to high speed and large scale embedding learning in various applications matlab development back to the top https github com mikeroyal machine learning guide table of contents p align center img src https user images githubusercontent com 45159366 94306473 de809e80 ff27 11ea 924b 0a6947ae38bc png br p matlab learning resources matlab https www mathworks com products matlab html is a programming language that does numerical computing such as expressing matrix and array mathematics directly matlab documentation https www mathworks com help matlab getting started with matlab https www mathworks com help matlab getting started with matlab html matlab and simulink training from matlab academy https matlabacademy mathworks com mathworks certification program https www mathworks com services training certification html matlab online courses from udemy https www udemy com topic matlab matlab online courses from coursera https www coursera org courses query matlab matlab online courses from edx https www edx org learn matlab 
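The numba compiler listed earlier compiles numerically focused Python to machine code via LLVM the first time a decorated function is called. The sketch below is a minimal, hedged illustration of that pattern, not numba's official example; the no-op fallback decorator is our own assumption so the snippet still runs where numba is not installed.

```python
# Minimal sketch of the numba JIT workflow: @njit compiles the decorated
# function with LLVM on first call. numba is treated as optional here.
try:
    from numba import njit
except ImportError:
    def njit(func):
        # fallback (our own assumption): run the plain Python function
        return func

@njit
def dot(a, b):
    # a plain loop that numba can lower to a tight machine-code loop
    total = 0.0
    for i in range(len(a)):
        total += a[i] * b[i]
    return total

print(dot([1.0, 2.0, 3.0], [4.0, 5.0, 6.0]))  # prints 32.0
```

The same NumPy-style mental model carries over to cupy, whose `cupy.ndarray` mirrors a subset of the `numpy.ndarray` interface on the GPU.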
building a matlab gui https www mathworks com discovery matlab gui html matlab style guidelines 2 0 https www mathworks com matlabcentral fileexchange 46056 matlab style guidelines 2 0 setting up git source control with matlab simulink https www mathworks com help matlab matlab prog set up git source control html pull push and fetch files with git with matlab simulink https www mathworks com help matlab matlab prog push and fetch with git html create new repository with matlab simulink https www mathworks com help matlab matlab prog add folder to source control html prmlt http prml github io is matlab code for machine learning algorithms in the prml book matlab tools libraries frameworks matlab and simulink services applications list https www mathworks com products html matlab in the cloud https www mathworks com solutions cloud html is a service that allows you to run in cloud environments from mathworks cloud https www mathworks com solutions cloud html browser to public clouds https www mathworks com solutions cloud html public cloud including aws https aws amazon com and azure https azure microsoft com matlab online https matlab mathworks com is a service that allows users to utilize matlab and simulink through a web browser such as google chrome simulink https www mathworks com products simulink html is a block diagram environment for model based design it supports simulation automatic code generation and continuous testing of embedded systems simulink online https www mathworks com products simulink online html is a service that provides access to simulink through your web browser matlab drive https www mathworks com products matlab drive html is a service that gives you the ability to store access and work with your files from anywhere matlab parallel server https www mathworks com products matlab parallel server html is a tool that lets you scale matlab programs and simulink simulations to clusters and clouds you can prototype your programs and
simulations on the desktop and then run them on clusters and clouds without recoding matlab parallel server supports batch jobs interactive parallel computations and distributed computations with large matrices matlab schemer https github com scottclowe matlab schemer is a matlab package that makes it easy to change the color scheme theme of the matlab display and gui lrslibrary https github com andrewssobral lrslibrary provides low rank and sparse tools for background modeling and subtraction in videos the library was designed for moving object detection in videos but it can also be used for other computer vision and machine learning problems image processing toolbox https www mathworks com products image html is a tool that provides a comprehensive set of reference standard algorithms and workflow apps for image processing analysis visualization and algorithm development you can perform image segmentation image enhancement noise reduction geometric transformations image registration and 3d image processing computer vision toolbox https www mathworks com products computer vision html is a tool that provides algorithms functions and apps for designing and testing computer vision 3d vision and video processing systems you can perform object detection and tracking as well as feature detection extraction and matching you can automate calibration workflows for single stereo and fisheye cameras for 3d vision the toolbox supports visual and point cloud slam stereo vision structure from motion and point cloud processing statistics and machine learning toolbox https www mathworks com products statistics html is a tool that provides functions and apps to describe analyze and model data you can use descriptive statistics visualizations and clustering for exploratory data analysis fit probability distributions to data generate random numbers for monte carlo simulations and perform hypothesis tests regression and classification algorithms let you draw inferences from data and build
predictive models either interactively using the classification and regression learner apps or programmatically using automl lidar toolbox https www mathworks com products lidar html is a tool that provides algorithms functions and apps for designing analyzing and testing lidar processing systems you can perform object detection and tracking semantic segmentation shape fitting lidar registration and obstacle detection lidar toolbox supports lidar camera cross calibration for workflows that combine computer vision and lidar processing mapping toolbox https www mathworks com products mapping html is a tool that provides algorithms and functions for transforming geographic data and creating map displays you can visualize your data in a geographic context build map displays from more than 60 map projections and transform data from a variety of sources into a consistent geographic coordinate system uav toolbox https www mathworks com products uav html is an application that provides tools and reference applications for designing simulating testing and deploying unmanned aerial vehicle uav and drone applications you can design autonomous flight algorithms uav missions and flight controllers the flight log analyzer app lets you interactively analyze 3d flight paths telemetry information and sensor readings from common flight log formats parallel computing toolbox https www mathworks com products matlab parallel server html is a tool that lets you solve computationally and data intensive problems using multicore processors gpus and computer clusters high level constructs such as parallel for loops special array types and parallelized numerical algorithms enable you to parallelize matlab applications without cuda or mpi programming the toolbox lets you use parallel enabled functions in matlab and other toolboxes you can use the toolbox with simulink to run multiple simulations of a model in parallel programs and models can run in both interactive and batch modes partial 
differential equation toolbox https www mathworks com products pde html is a tool that provides functions for solving structural mechanics heat transfer and general partial differential equations pdes using finite element analysis ros toolbox https www mathworks com products ros html is a tool that provides an interface connecting matlab and simulink with the robot operating system ros and ros 2 enabling you to create a network of ros nodes the toolbox includes matlab functions and simulink blocks to import analyze and play back ros data recorded in rosbag files you can also connect to a live ros network to access ros messages robotics toolbox https www mathworks com products robotics html provides a toolbox that brings robotics specific functionality designing simulating and testing manipulators mobile robots and humanoid robots to matlab exploiting the native capabilities of matlab linear algebra portability graphics the toolbox also supports mobile robots with functions for robot motion models bicycle path planning algorithms bug distance transform d prm kinodynamic planning lattice rrt localization ekf particle filter map building ekf and simultaneous localization and mapping ekf and a simulink model of a non holonomic vehicle the toolbox also includes a detailed simulink model for a quadrotor flying robot deep learning toolbox https www mathworks com products deep learning html is a tool that provides a framework for designing and implementing deep neural networks with algorithms pretrained models and apps you can use convolutional neural networks convnets cnns and long short term memory lstm networks to perform classification and regression on image time series and text data you can build network architectures such as generative adversarial networks gans and siamese networks using automatic differentiation custom training loops and shared weights with the deep network designer app you can design analyze and train networks graphically it can exchange models
with tensorflow and pytorch through the onnx format and import models from tensorflow keras and caffe the toolbox supports transfer learning with darknet 53 resnet 50 nasnet squeezenet and many other pretrained models reinforcement learning toolbox https www mathworks com products reinforcement learning html is a tool that provides an app functions and a simulink block for training policies using reinforcement learning algorithms including dqn ppo sac and ddpg you can use these policies to implement controllers and decision making algorithms for complex applications such as resource allocation robotics and autonomous systems deep learning hdl toolbox https www mathworks com products deep learning hdl html is a tool that provides functions and tools to prototype and implement deep learning networks on fpgas and socs it provides pre built bitstreams for running a variety of deep learning networks on supported xilinx and intel fpga and soc devices profiling and estimation tools let you customize a deep learning network by exploring design performance and resource utilization tradeoffs model predictive control toolbox https www mathworks com products model predictive control html is a tool that provides functions an app and simulink blocks for designing and simulating controllers using linear and nonlinear model predictive control mpc the toolbox lets you specify plant and disturbance models horizons constraints and weights by running closed loop simulations you can evaluate controller performance vision hdl toolbox https www mathworks com products vision hdl html is a tool that provides pixel streaming algorithms for the design and implementation of vision systems on fpgas and asics it provides a design framework that supports a diverse set of interface types frame sizes and frame rates the image processing video and computer vision algorithms in the toolbox use an architecture appropriate for hdl implementations soc blockset https www mathworks com products soc html 
is a tool that provides simulink blocks and visualization tools for modeling simulating and analyzing hardware and software architectures for asics fpgas and systems on a chip soc you can build your system architecture using memory models bus models and i o models and simulate the architecture together with the algorithms wireless hdl toolbox https www mathworks com products wireless hdl html is a tool that provides pre verified hardware ready simulink blocks and subsystems for developing 5g lte and custom ofdm based wireless communication applications it includes reference applications ip blocks and gateways between frame and sample based processing thingspeak https www mathworks com products thingspeak html is an iot analytics service that allows you to aggregate visualize and analyze live data streams in the cloud thingspeak provides instant visualizations of data posted by your devices to thingspeak with the ability to execute matlab code in thingspeak you can perform online analysis and process data as it comes in thingspeak is often used for prototyping and proof of concept iot systems that require analytics sea mat https sea mat github io sea mat is a collaborative effort to organize and distribute matlab tools for the oceanographic community gramm https github com piermorel gramm is a complete data visualization toolbox for matlab it provides an easy to use and high level interface to produce publication quality plots of complex data with varied statistical visualizations gramm is inspired by r s ggplot2 library hctsa https hctsa users gitbook io hctsa manual is a software package for running highly comparative time series analysis using matlab plotly https plot ly matlab is a graphing library for matlab yalmip https yalmip github io is a matlab toolbox for optimization modeling gnu octave https www gnu org software octave is a high level interpreted language primarily intended for numerical computations it provides capabilities for the numerical solution 
of linear and nonlinear problems and for performing other numerical experiments it also provides extensive graphics capabilities for data visualization and manipulation c c development back to the top https github com mikeroyal machine learning guide table of contents p align center img src https user images githubusercontent com 45159366 115297894 961e0d80 a111 11eb 81c3 e2bd2ac9a7cd png br p c c learning resources c https www cplusplus com doc tutorial is a cross platform language that can be used to build high performance applications developed by bjarne stroustrup as an extension to the c language c https www iso org standard 74528 html is a general purpose high level language that was originally developed by dennis m ritchie to develop the unix operating system at bell labs it supports structured programming lexical variable scope and recursion with a static type system c also provides constructs that map efficiently to typical machine instructions which makes it one of the most widely used programming languages today embedded c https en wikipedia org wiki embedded c is a set of language extensions for the c programming language by the c standards committee https isocpp org std the committee to address issues that exist between c extensions for different embedded systems https en wikipedia org wiki embedded system the extensions help enhance microprocessor features such as fixed point arithmetic multiple distinct memory banks and basic i o operations this makes embedded c the most popular embedded software language in the world c c developer tools from jetbrains https www jetbrains com cpp open source c libraries on cppreference com https en cppreference com w cpp links libs c graphics libraries https cpp libhunt com libs graphics c libraries in matlab https www mathworks com help matlab call cpp library functions html c tools and libraries articles https www cplusplus com articles tools google c style guide https google github io styleguide cppguide html
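A recurring theme across this C/C++ section and the interop tools elsewhere in the list (cython, javacpp, cppsharp) is bridging native code from a higher-level language. As a minimal, hedged illustration using only the Python standard library, `ctypes` can call a C function from a shared library directly; the library lookup below assumes a Unix-like system, so treat the names as assumptions rather than portable guarantees.

```python
import ctypes
import ctypes.util

# Locate the C math library. find_library("m") assumes a Unix-like
# system; "libm.so.6" is a glibc-specific fallback, also an assumption.
libm_name = ctypes.util.find_library("m") or "libm.so.6"
libm = ctypes.CDLL(libm_name)

# Declare the C signature so ctypes marshals values correctly:
#   double sqrt(double);
libm.sqrt.restype = ctypes.c_double
libm.sqrt.argtypes = [ctypes.c_double]

print(libm.sqrt(2.0))  # prints 1.4142135623730951
```

Declaring `restype`/`argtypes` explicitly matters: without them, ctypes defaults to `int` conversions and silently corrupts floating-point arguments and results.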
introduction c education course on google developers https developers google com edu c c style guide for fuchsia https fuchsia dev fuchsia src development languages c cpp cpp style c and c coding style guide by opentitan https docs opentitan org doc rm c cpp coding style chromium c style guide https chromium googlesource com chromium src master styleguide c c md c core guidelines https github com isocpp cppcoreguidelines blob master cppcoreguidelines md c style guide for ros http wiki ros org cppstyleguide learn c https www learncpp com learn c an interactive c tutorial https www learn c org c institute https cppinstitute org free c and c courses c online training courses on linkedin learning https www linkedin com learning topics c plus plus c tutorials on w3schools https www w3schools com cpp default asp learn c programming online courses on edx https www edx org learn c programming learn c with online courses on edx https www edx org learn c plus plus learn c on codecademy https www codecademy com learn learn c plus plus coding for everyone c and c course on coursera https www coursera org specializations coding for everyone c for c programmers on coursera https www coursera org learn c plus plus a top c courses on coursera https www coursera org courses query c 20programming c online courses on udemy https www udemy com topic c plus plus top c courses on udemy https www udemy com topic c programming basics of embedded c programming for beginners on udemy https www udemy com course embedded c programming for embedded systems c for programmers course on udacity https www udacity com course c for programmers ud210 c fundamentals course on pluralsight https www pluralsight com courses learn program cplusplus introduction to c on mit free online course materials https ocw mit edu courses electrical engineering and computer science 6 096 introduction to c january iap 2011 introduction to c for programmers harvard https online learning harvard edu course introduction 
c programmers online c courses harvard university https online learning harvard edu subject c c c tools aws sdk for c https aws amazon com sdk for cpp azure sdk for c https github com azure azure sdk for cpp azure sdk for c https github com azure azure sdk for c c client libraries for google cloud services https github com googleapis google cloud cpp visual studio https visualstudio microsoft com is an integrated development environment ide from microsoft which is a feature rich application that can be used for many aspects of software development visual studio makes it easy to edit debug build and publish your app by using microsoft software development platforms such as windows api windows forms windows presentation foundation and windows store visual studio code https code visualstudio com is a code editor redefined and optimized for building and debugging modern web and cloud applications vcpkg https github com microsoft vcpkg is a c library manager for windows linux and macos resharper c https www jetbrains com resharper cpp features is a visual studio extension for c developers developed by jetbrains appcode https www jetbrains com objc is constantly monitoring the quality of your code it warns you of errors and smells and suggests quick fixes to resolve them automatically appcode provides lots of code inspections for objective c swift c c and a number of code inspections for other supported languages all code inspections are run on the fly clion https www jetbrains com clion features is a cross platform ide for c and c developers developed by jetbrains code blocks https www codeblocks org is a free c c and fortran ide built to meet the most demanding needs of its users it is designed to be very extensible and fully configurable built around a plugin framework code blocks can be extended with plugins cppsharp https github com mono cppsharp is a tool and set of libraries which facilitates the usage of native c c code with the net ecosystem it consumes c c 
header and library files and generates the necessary glue code to surface the native api as a managed api such an api can be used to consume an existing native library in your managed code or add managed scripting support to a native codebase conan https conan io is an open source package manager that brings c and c development and dependency management into the 21st century and on par with the other development ecosystems high performance computing hpc sdk https developer nvidia com hpc is a comprehensive toolbox for gpu accelerating hpc modeling and simulation applications it includes the c c and fortran compilers libraries and analysis tools necessary for developing hpc applications on the nvidia platform thrust https github com nvidia thrust is a c parallel programming library which resembles the c standard library thrust s high level interface greatly enhances programmer productivity while enabling performance portability between gpus and multicore cpus interoperability with established technologies such as cuda tbb and openmp integrates with existing software boost https www boost org is a set of free peer reviewed portable c source libraries boost has been a participant in the annual google summer of code since 2007 in which students develop their skills by working on boost library development automake https www gnu org software automake is a tool for automatically generating makefile in files compliant with the gnu coding standards automake requires the use of gnu autoconf cmake https cmake org is an open source cross platform family of tools designed to build test and package software cmake is used to control the software compilation process using simple platform and compiler independent configuration files and generate native makefiles and workspaces that can be used in the compiler environment of your choice gdb http www gnu org software gdb is a debugger that allows you to see what is going on inside another program while it executes or what another program was doing at
the moment it crashed gcc https gcc gnu org is a compiler collection that includes front ends for c c objective c fortran ada go and d as well as libraries for these languages gsl https www gnu org software gsl is a numerical library for c and c programmers it is free software under the gnu general public license the library provides a wide range of mathematical routines such as random number generators special functions and least squares fitting there are over 1000 functions in total with an extensive test suite opengl extension wrangler library glew https www opengl org sdk libs glew is a cross platform open source c c extension loading library glew provides efficient run time mechanisms for determining which opengl extensions are supported on the target platform libtool https www gnu org software libtool is a generic library support script that hides the complexity of using shared libraries behind a consistent portable interface to use libtool add the new generic library building commands to your makefile makefile in or makefile am maven https maven apache org is a software project management and comprehension tool based on the concept of a project object model pom maven can manage a project s build reporting and documentation from a central piece of information tau tuning and analysis utilities http www cs uoregon edu research tau home php is capable of gathering performance information through instrumentation of functions methods basic blocks and statements as well as event based sampling all c language features are supported including templates and namespaces clang https clang llvm org is a production quality c objective c c and objective c compiler when targeting x86 32 x86 64 and arm other targets may have caveats but are usually easy to fix clang is used in production to build performance critical software like google chrome or firefox opencv https opencv org is a highly optimized library with focus on real time applications cross platform c python and 
java interfaces support linux macos windows ios and android libcu https nvidia github io libcudacxx is the nvidia c standard library for your entire system it provides a heterogeneous implementation of the c standard library that can be used in and between cpu and gpu code antlr another tool for language recognition https www antlr org is a powerful parser generator for reading processing executing or translating structured text or binary files it s widely used to build languages tools and frameworks from a grammar antlr generates a parser that can build parse trees and also generates a listener interface that makes it easy to respond to the recognition of phrases of interest oat https oatpp io is a light and powerful c web framework for highly scalable and resource efficient web applications it s zero dependency and easily portable javacpp https github com bytedeco javacpp is a program that provides efficient access to native c inside java not unlike the way some c c compilers interact with assembly language cython https cython org is a language that makes writing c extensions for python as easy as python itself cython is based on pyrex but supports more cutting edge functionality and optimizations such as calling c functions and declaring c types on variables and class attributes spdlog https github com gabime spdlog is a very fast header only compiled c logging library infer https fbinfer com is a static analysis tool for java c objective c and c infer is written in ocaml https ocaml org java development back to the top https github com mikeroyal machine learning guide table of contents p align center img src https user images githubusercontent com 45159366 93925952 c0b6fd80 fccb 11ea 9f90 21c4148e3c86 png br p java learning resources java https www oracle com java is a popular programming language and development platform jdk it reduces costs shortens development timeframes drives innovation and improves application services with millions of developers running
more than 51 billion java virtual machines worldwide the eclipse foundation https www eclipse org downloads is home to a worldwide community of developers the eclipse ide jakarta ee and over 375 open source projects including runtimes tools and frameworks for java and other languages getting started with java https docs oracle com javase tutorial oracle java certifications from oracle university https education oracle com java certification benefits google developers training https developers google com training google developers certification https developers google com certification java tutorial by w3schools https www w3schools com java building your first android app in java codelabs developers google com codelabs build your first android app getting started with java in visual studio code https code visualstudio com docs java java tutorial google java style guide https google github io styleguide javaguide html aosp java code style for contributors https source android com setup contribute code style chromium java style guide https chromium googlesource com chromium src master styleguide java java md get started with or tools for java https developers google com optimization introduction java getting started with java tool installer task for azure pipelines https docs microsoft com en us azure devops pipelines tasks tool java tool installer gradle user manual https docs gradle org current userguide userguide html tools java se https www oracle com java technologies javase tools jsp html contains several tools to assist in program development and debugging and in the monitoring and troubleshooting of production applications jdk development tools https docs oracle com javase 7 docs technotes tools includes the java web start tools javaws java troubleshooting profiling monitoring and management tools jcmd jconsole jmc jvisualvm and java web services tools schemagen wsgen wsimport xjc android studio https developer android com studio is the official integrated 
development environment for google s android operating system built on jetbrains intellij idea software and designed specifically for android development available on windows macos linux chrome os intellij idea https www jetbrains com idea is an ide for java but it also understands and provides intelligent coding assistance for a large variety of other languages such as kotlin sql jpql html javascript etc even if the language expression is injected into a string literal in your java code netbeans https netbeans org features java index html is an ide that provides java developers with all the tools needed to create professional desktop mobile and enterprise applications creating editing and refactoring the ide provides wizards and templates to let you create java ee java se and java me applications java design patterns https github com iluwatar java design patterns is a collection of the best formalized practices a programmer can use to solve common problems when designing an application or system elasticsearch https www elastic co products elasticsearch is a distributed restful search engine built for the cloud written in java rxjava https github com reactivex rxjava is a java vm implementation of reactive extensions http reactivex io a library for composing asynchronous and event based programs by using observable sequences it extends the observer pattern http en wikipedia org wiki observer pattern to support sequences of data events and adds operators that allow you to compose sequences together declaratively while abstracting away concerns about things like low level threading synchronization thread safety and concurrent data structures guava https github com google guava is a set of core java libraries from google that includes new collection types such as multimap and multiset immutable collections a graph library and utilities for concurrency i o hashing caching primitives strings and more it is widely used on most java projects within google and widely used by
many other companies as well okhttp https square github io okhttp is an http client for java and kotlin developed by square retrofit https square github io retrofit is a type safe http client for android and java developed by square leakcanary https square github io leakcanary is a memory leak detection library for android developed by square apache spark https spark apache org is a unified analytics engine for large scale data processing it provides high level apis in scala java python and r and an optimized engine that supports general computation graphs for data analysis it also supports a rich set of higher level tools including spark sql for sql and dataframes mllib for machine learning graphx for graph processing and structured streaming for stream processing apache flink https flink apache org is an open source stream processing framework with powerful stream and batch processing capabilities with elegant and fluent apis in java and scala fastjson https github com alibaba fastjson wiki is a java library that can be used to convert java objects into their json representation it can also be used to convert a json string to an equivalent java object libgdx https libgdx com is a cross platform java game development framework based on opengl es that works on windows linux mac os x android your webgl enabled browser and ios jenkins https www jenkins io is the leading open source automation server built with java it provides over 1700 plugins https plugins jenkins io to support automating virtually anything so that humans can actually spend their time doing things machines cannot dbeaver https dbeaver io is a free multi platform database tool for developers sql programmers database administrators and analysts supports any database which has jdbc driver which basically means any database ee version also supports non jdbc datasources mongodb cassandra redis dynamodb etc redisson https redisson pro is a redis java client with features of in memory data grid over 50 redis
based java objects and services set multimap sortedset map list queue deque semaphore lock atomiclong map reduce publish subscribe bloom filter spring cache tomcat scheduler jcache api hibernate mybatis rpc and local cache graalvm https www graalvm org is a universal virtual machine for running applications written in javascript python ruby r jvm based languages like java scala clojure kotlin and llvm based languages such as c and c gradle https gradle org is a build automation tool for multi language software development from mobile apps to microservices from small startups to big enterprises gradle helps teams build automate and deliver better software faster write in java c python or your language of choice apache groovy http www groovy lang org is a powerful optionally typed and dynamic language with static typing and static compilation capabilities for the java platform aimed at improving developer productivity thanks to a concise familiar and easy to learn syntax it integrates smoothly with any java program and immediately delivers to your application powerful features including scripting capabilities domain specific language authoring runtime and compile time meta programming and functional programming jacoco https www jacoco org jacoco is a free code coverage library for java which has been created by the eclemma team based on the lessons learned from using and integrating existing libraries for many years apache jmeter http jmeter apache org is used to test performance both on static and dynamic resources web dynamic applications it is also used to simulate a heavy load on a server group of servers network or object to test its strength or to analyze overall performance under different load types junit https junit org is a simple framework to write repeatable tests it is an instance of the xunit architecture for unit testing frameworks mockito https site mockito org is the most popular mocking framework for unit tests written in java spotbugs https spotbugs
github io is a program which uses static analysis to look for bugs in java code springboot https spring io projects spring boot is a great tool that helps you to create spring powered production grade applications and services with absolute minimum fuss it takes an opinionated view of the spring platform so that new and existing users can quickly get to the bits they need yourkit https www yourkit com is a technology leader creator of the most innovative and intelligent tools for profiling java net applications python development back to the top https github com mikeroyal machine learning guide table of contents p align center img src https user images githubusercontent com 45159366 93133273 ce490380 f68b 11ea 81d0 7f6a3debe6c0 png br p python learning resources python https www python org is an interpreted high level programming language python is used heavily in the fields of data science and machine learning python developer s guide https devguide python org is a comprehensive resource for contributing to python for both new and experienced contributors it is maintained by the same community that maintains python azure functions python developer guide https docs microsoft com en us azure azure functions functions reference python is an introduction to developing azure functions using python the content below assumes that you ve already read the azure functions developers guide https docs microsoft com en us azure azure functions functions reference checkio https checkio org is a programming learning platform and a gamified website that teaches python through solving code challenges and competing for the most elegant and creative solutions python institute https pythoninstitute org pcep certified entry level python programmer certification https pythoninstitute org pcep certification entry level pcap certified associate in python programming certification https pythoninstitute org pcap certification associate pcpp certified professional in python programming 1 
certification https pythoninstitute org pcpp certification professional pcpp certified professional in python programming 2 https pythoninstitute org pcpp certification professional mta introduction to programming using python certification https docs microsoft com en us learn certifications mta introduction to programming using python getting started with python in visual studio code https code visualstudio com docs python python tutorial google s python style guide https google github io styleguide pyguide html google s python education class https developers google com edu python real python https realpython com the python open source computer science degree by forrest knight https github com forrestknight open source cs python intro to python for data science https www datacamp com courses intro to python for data science intro to python by w3schools https www w3schools com python python intro asp codecademy s python 3 course https www codecademy com learn learn python 3 learn python with online courses and classes from edx https www edx org learn python python courses online from coursera https www coursera org courses query python python frameworks and tools python package index pypi https pypi org is a repository of software for the python programming language pypi helps you find and install software developed and shared by the python community pycharm https www jetbrains com pycharm is the best ide i ve ever used with pycharm you can access the command line connect to a database create a virtual environment and manage your version control system all in one place saving time by avoiding constantly switching between windows python tools for visual studio ptvs https microsoft github io ptvs is a free open source plugin that turns visual studio into a python ide it supports editing browsing intellisense mixed python c debugging remote linux macos debugging profiling ipython and web development with django and other frameworks pylance https github com microsoft 
pylance release is an extension that works alongside python in visual studio code to provide performant language support under the hood pylance is powered by pyright microsoft s static type checking tool pyright https github com microsoft pyright is a fast type checker meant for large python source bases it can run in a watch mode and performs fast incremental updates when files are modified django https www djangoproject com is a high level python web framework that encourages rapid development and clean pragmatic design flask https flask palletsprojects com is a micro web framework written in python it is classified as a microframework because it does not require particular tools or libraries web2py http web2py com is an open source web application framework written in python allowing web developers to program dynamic web content one web2py instance can run multiple web sites using different databases aws chalice https github com aws chalice is a framework for writing serverless apps in python it allows you to quickly create and deploy applications that use aws lambda tornado https www tornadoweb org is a python web framework and asynchronous networking library tornado uses non blocking network i o which can scale to tens of thousands of open connections httpie https github com httpie httpie is a command line http client that makes cli interaction with web services as easy as possible httpie is designed for testing debugging and generally interacting with apis http servers scrapy https scrapy org is a fast high level web crawling and web scraping framework used to crawl websites and extract structured data from their pages it can be used for a wide range of purposes from data mining to monitoring and automated testing sentry https sentry io is a service that helps you monitor and fix crashes in realtime the server is in python but it contains a full api for sending events from any language in any application pipenv https github com pypa pipenv is a tool
that aims to bring the best of all packaging worlds bundler composer npm cargo yarn etc to the python world python fire https github com google python fire is a library for automatically generating command line interfaces clis from absolutely any python object bottle https github com bottlepy bottle is a fast simple and lightweight wsgi https www wsgi org micro web framework for python it is distributed as a single file module and has no dependencies other than the python standard library https docs python org library cherrypy https cherrypy org is a minimalist python object oriented http web framework sanic https github com huge success sanic is a python 3 6 web server and web framework that s written to go fast pyramid https trypyramid com is a small and fast open source python web framework it makes real world web application development and deployment more fun and more productive turbogears https turbogears org is a hybrid web framework able to act both as a full stack framework and as a microframework falcon https falconframework org is a reliable high performance python web framework for building large scale app backends and microservices with support for mongodb pluggable applications and autogenerated admin neural network intelligence nni https github com microsoft nni is an open source automl toolkit for automating the machine learning lifecycle including feature engineering https github com microsoft nni blob master docs en us featureengineering overview md neural architecture search https github com microsoft nni blob master docs en us nas overview md model compression https github com microsoft nni blob master docs en us compressor overview md and hyperparameter tuning https github com microsoft nni blob master docs en us tuner builtintuner md dash https plotly com dash is a popular python framework for building ml data science web apps for python r julia and jupyter luigi https github com spotify luigi is a python module that helps you build complex pipelines
of batch jobs it handles dependency resolution workflow management visualization etc it also comes with hadoop support built in locust https github com locustio locust is an easy to use scriptable and scalable performance testing tool spacy https github com explosion spacy is a library for advanced natural language processing in python and cython numpy https www numpy org is the fundamental package needed for scientific computing with python pillow https python pillow org is a friendly pil python imaging library fork ipython https ipython org is a command shell for interactive computing in multiple programming languages originally developed for the python programming language that offers enhanced introspection rich media additional shell syntax tab completion and rich history graphlab create https turi com is a python library backed by a c++ engine for quickly building large scale high performance machine learning models pandas https pandas pydata org is a fast powerful and easy to use open source data structures data analysis and manipulation tool built on top of the python programming language pulp https coin or github io pulp is a linear programming modeler written in python pulp can generate lp files and call on highly optimized solvers glpk coin clp cbc cplex and gurobi to solve these linear problems matplotlib https matplotlib org is a 2d plotting library for creating static animated and interactive visualizations in python matplotlib produces publication quality figures in a variety of hardcopy formats and interactive environments across platforms scikit learn https scikit learn org stable index html is a simple and efficient tool for data mining and data analysis it is built on numpy scipy and matplotlib scala development back to the top https github com mikeroyal machine learning guide table of contents p align center img src https user images githubusercontent com 45159366 95688293 09bcec00 0bbe 11eb 8d0d d75706856673 png br p scala learning
resources scala https scala lang org is a combination of object oriented and functional programming in one concise high level language scala s static types help avoid bugs in complex applications and its jvm and javascript runtimes let you build high performance systems with easy access to huge ecosystems of libraries scala style guide https docs scala lang org style databricks scala style guide https github com databricks scala style guide data science using scala and spark on azure https docs microsoft com en us azure machine learning team data science process scala walkthrough creating a scala maven application for apache spark in hdinsight using intellij https docs microsoft com en us azure hdinsight spark apache spark create standalone application intro to spark dataframes using scala with azure databricks https docs microsoft com en us azure databricks spark latest dataframes datasets introduction to dataframes scala using scala to program aws glue etl scripts https docs aws amazon com glue latest dg glue etl scala using html using flink scala shell with amazon emr clusters https docs aws amazon com emr latest releaseguide flink scala html aws emr and spark 2 using scala from udemy https www udemy com course aws emr and spark 2 using scala using the google cloud storage connector with apache spark https cloud google com dataproc docs tutorials gcs connector spark tutorial write and run spark scala jobs on cloud dataproc for google cloud https cloud google com dataproc docs tutorials spark scala scala courses and certifications from edx https www edx org learn scala scala courses from coursera https www coursera org courses query scala top scala courses from udemy https www udemy com topic scala scala tools apache spark https spark apache org is a unified analytics engine for large scale data processing it provides high level apis in scala java python and r and an optimized engine that supports general computation graphs for data analysis it also supports a 
rich set of higher level tools including spark sql for sql and dataframes mllib for machine learning graphx for graph processing and structured streaming for stream processing apache spark connector for sql server and azure sql https github com microsoft sql spark connector is a high performance connector that enables you to use transactional data in big data analytics and persists results for ad hoc queries or reporting the connector allows you to use any sql database on premises or in the cloud as an input data source or output data sink for spark jobs azure databricks https azure microsoft com en us services databricks is a fast and collaborative apache spark based big data analytics service designed for data science and data engineering azure databricks sets up your apache spark environment in minutes autoscale and collaborate on shared projects in an interactive workspace azure databricks supports python scala r java and sql as well as data science frameworks and libraries including tensorflow pytorch and scikit learn apache predictionio https predictionio apache org is an open source machine learning framework for developers data scientists and end users it supports event collection deployment of algorithms evaluation querying predictive results via rest apis it is based on scalable open source services like hadoop hbase and other dbs elasticsearch spark and implements what is called a lambda architecture cluster manager for apache kafka cmak https github com yahoo cmak is a tool for managing apache kafka https kafka apache org clusters bigdl https bigdl project github io is a distributed deep learning library for apache spark with bigdl users can write their deep learning applications as standard spark programs which can directly run on top of existing spark or hadoop clusters eclipse deeplearning4j dl4j https deeplearning4j konduit ai is a set of projects intended to support all the needs of a jvm based scala kotlin clojure and groovy deep learning 
application this means starting with the raw data loading and preprocessing it from wherever and whatever format it is in to building and tuning a wide variety of simple and complex deep learning networks play framework https github com playframework playframework is a web framework that combines productivity and performance making it easy to build scalable web applications with java and scala dotty https github com lampepfl dotty is a research compiler that will become scala 3 awscala https github com seratch awscala is a tool that enables scala developers to easily work with amazon web services in the scala way scala js https www scala js org is a compiler that converts scala to javascript polynote https polynote org is an experimental polyglot notebook environment currently it supports scala and python with or without spark sql and vega scala native http scala native org is an optimizing ahead of time compiler and lightweight managed runtime designed specifically for scala gitbucket https gitbucket github io is a git platform powered by scala with easy installation high extensibility and github api compatibility finagle https twitter github io finagle is a fault tolerant protocol agnostic rpc system gatling https gatling io is a load test tool it officially supports http websocket server sent events and jms scalatra https scalatra org is a tiny scala high performance async web framework inspired by sinatra https www sinatrarb com r development back to the top https github com mikeroyal machine learning guide table of contents p align center img src https user images githubusercontent com 45159366 126080648 b515b7c0 e1bd 481b 92d2 c9cdd98ac30c png br p r learning resources r https www r project org is an open source software environment for statistical computing and graphics it compiles and runs on a wide variety of platforms such as windows and macos an introduction to r https cran r project org doc manuals r release r intro pdf google s r style guide https google github
io styleguide rguide html r developer s guide to azure https docs microsoft com en us azure architecture data guide technology choices r developers guide running r at scale on google compute engine https cloud google com solutions running r at scale running r on aws https aws amazon com blogs big data running r on aws rstudio server pro for aws https aws amazon com marketplace pp rstudio rstudio server pro for aws b06w2g9pry learn r by codecademy https www codecademy com learn learn r learn r programming with online courses and lessons by edx https www edx org learn r programming r language courses by coursera https www coursera org courses query r 20language learn r for data science by udacity https www udacity com course programming for data science nanodegree with r nd118 r tools rstudio https rstudio com is an integrated development environment for r and python with a console syntax highlighting editor that supports direct code execution and tools for plotting history debugging and workspace management shiny https shiny rstudio com is a newer package from rstudio that makes it incredibly easy to build interactive web applications with r rmarkdown https rmarkdown rstudio com is a package that helps you create dynamic analysis documents that combine code rendered output such as figures and prose rplugin https github com jetbrains rplugin is an r language support plugin for the intellij ide plotly https plotly r com is an r package for creating interactive web graphics via the open source javascript graphing library plotly js https github com plotly plotly js metaflow https metaflow org is a python r library that helps scientists and engineers build and manage real life data science projects metaflow was originally developed at netflix to boost productivity of data scientists who work on a wide variety of projects from classical statistics to state of the art deep learning prophet https facebook github io prophet is a procedure for forecasting time series data based on
an additive model where non linear trends are fit with yearly weekly and daily seasonality plus holiday effects it works best with time series that have strong seasonal effects and several seasons of historical data lightgbm https lightgbm readthedocs io is a gradient boosting framework that uses tree based learning algorithms used for ranking classification and many other machine learning tasks dash https plotly com dash is a python framework for building analytical web applications in python r julia and jupyter mlr https mlr mlr org com is machine learning in r ml workspace https github com ml tooling ml workspace is an all in one web based ide specialized for machine learning and data science it is simple to deploy and gets you started within minutes to productively build ml solutions on your own machines ml workspace is the ultimate tool for developers preloaded with a variety of popular data science libraries tensorflow pytorch keras and mxnet and dev tools jupyter vs code and tensorboard perfectly configured optimized and integrated catboost https catboost ai is a fast scalable high performance gradient boosting on decision trees library used for ranking classification regression and other machine learning tasks for python r java and c++ it supports computation on cpu and gpu plumber https www rplumber io is a tool that allows you to create a web api by merely decorating your existing r source code with special comments drake https docs ropensci org drake is an r focused pipeline toolkit for reproducibility and high performance computing diagrammer https visualizers co diagrammer is a package with which you can create modify analyze and visualize network graph diagrams the output can be incorporated into r markdown documents integrated with shiny web apps converted to other graph formats or exported as image files knitr https yihui org knitr is a general purpose literate programming engine in r with lightweight apis designed to give users full control of the output without
heavy coding work broom https broom tidymodels org is a tool that converts statistical analysis objects from r into tidy format julia development back to the top https github com mikeroyal machine learning guide table of contents p align center img src https user images githubusercontent com 45159366 94961900 6e839280 04aa 11eb 84c6 2fb3f83e2b90 png br p julia learning resources julia https julialang org is a high level high performance https julialang org benchmarks dynamic language for technical computing julia programs compile to efficient native code for multiple platforms https julialang org downloads support tiers via llvm juliahub https juliahub com contains over 4 000 julia packages for use by the community julia observer https www juliaobserver com julia manual https docs julialang org en v1 manual getting started julialang essentials https docs julialang org en v1 base base julia style guide https docs julialang org en v1 manual style guide julia by example https juliabyexample helpmanual io julialang gitter https gitter im julialang julia dataframes tutorial using jupyter notebooks https github com bkamins julia dataframes tutorial julia academy https juliaacademy com courses preview logged out julia meetup groups https www meetup com topics julia julia on microsoft azure https juliacomputing com media 2017 02 08 azure html julia tools juliapro https juliacomputing com products juliapro html is a free and fast way to set up julia for individual researchers engineers scientists quants traders economists students and others julia developers can build better software quicker and easier while benefiting from julia s unparalleled high performance it includes over 2600 open source packages and a curated list of 250 juliapro packages curated packages are tested documented and supported by julia computing juno https junolab org is a powerful free ide based on atom for the julia language debugger jl https github com juliadebug debugger jl is the julia debugging tool
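for readers coming from the python section above the debugging and profiling workflow that these julia tools provide has a direct analogue in python s standard library a minimal sketch with cProfile and pstats is below the slow_sum function is purely illustrative and not part of any tool listed here

```python
import cProfile
import io
import pstats

def slow_sum(n):
    # illustrative workload: sum of squares computed with an explicit loop
    total = 0
    for i in range(n):
        total += i * i
    return total

# collect timing samples only around the code under measurement
profiler = cProfile.Profile()
profiler.enable()
result = slow_sum(100_000)
profiler.disable()

# render per-function statistics, sorted by cumulative time, into a buffer
buffer = io.StringIO()
stats = pstats.Stats(profiler, stream=buffer)
stats.sort_stats("cumulative").print_stats(5)

print(buffer.getvalue())
```

the report shows call counts and cumulative time per function which is the same kind of output julia s Profile module produces per line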
profile stdlib https docs julialang org en v1 manual profile is a module that provides tools to help developers improve the performance of their code when used it takes measurements on running code and produces output that helps you understand how much time is spent on individual lines revise jl https github com timholy revise jl allows you to modify code and use the changes without restarting julia with revise you can be in the middle of a session and then update packages switch git branches and or edit the source code in the editor of your choice any changes will typically be incorporated into the very next command you issue from the repl this can save you the overhead of restarting julia loading packages and waiting for code to jit compile juliagpu https juliagpu org is a github organization created to unify the many packages for programming gpus in julia with its high level syntax and flexible compiler julia is well positioned to productively program hardware accelerators like gpus without sacrificing performance ijulia jl https github com julialang ijulia jl is the julia kernel for jupyter aws jl https github com juliacloud aws jl is a julia interface for amazon web services https aws amazon com cuda jl https juliagpu gitlab io cuda jl is the main programming interface for working with nvidia cuda gpus using julia it features a user friendly array abstraction a compiler for writing cuda kernels in julia and wrappers for various cuda libraries xla jl https github com juliatpu xla jl is a package for compiling julia to xla for tensor processing unit tpu https cloud google com tpu nanosoldier jl https github com juliaci nanosoldier jl is a package for running juliaci services on mit s nanosoldier cluster julia for vscode https www julia vscode org is a powerful extension for the julia language jump jl https jump dev is a domain specific modeling language for mathematical optimization https en wikipedia org wiki mathematical optimization embedded in
julia optim jl https github com julianlsolvers optim jl is a package for univariate and multivariate optimization in julia rcall jl https github com juliainterop rcall jl is a package that allows you to call r functions from julia javacall jl http juliainterop github io javacall jl is a package that allows you to call java functions from julia pycall jl https github com juliapy pycall jl is a package that allows you to call python functions from julia mxnet jl https github com dmlc mxnet jl is the apache mxnet julia package mxnet jl brings flexible and efficient gpu computing and state of the art deep learning to julia knet https denizyuret github io knet jl latest is the koç university http www ku edu tr en deep learning framework implemented in julia by deniz yuret https www denizyuret com and collaborators it supports gpu operation and automatic differentiation using dynamic computational graphs for models defined in plain julia distributions jl https github com juliastats distributions jl is a julia package for probability distributions and associated functions dataframes jl http juliadata github io dataframes jl stable is a tool for working with tabular data in julia flux jl https fluxml ai is an elegant approach to machine learning it s a 100% pure julia stack and provides lightweight abstractions on top of julia s native gpu and ad support irtools jl https github com fluxml irtools jl is a simple and flexible ir format expressive enough to work with both lowered and typed julia code as well as external irs cassette jl https github com jrevels cassette jl is a julia package that provides a mechanism for dynamically injecting code transformation passes into julia s just in time jit compilation cycle enabling post hoc analysis and modification of cassette unaware julia programs without requiring manual source annotation or refactoring of the target code contribute if you would like to contribute to this guide simply make a pull request https github com mikeroyal machine
learning guide pulls license back to the top https github com mikeroyal machine learning guide table of contents distributed under the creative commons attribution 4 0 international cc by 4 0 public license https creativecommons org licenses by 4 0