names
stringlengths 1
98
| readmes
stringlengths 8
608k
| topics
stringlengths 0
442
| labels
stringclasses 6
values |
---|---|---|---|
nitp-web-front | nitp website this is the repository of website of national institute of technology patna the current repo is live at https www nitp ac in | gatsby react css dom | front_end |
FrontEndWebDevGetStartedPS | front end web development get started currently the course is a bit out of date but still very relevant things that are somewhat out of date parts about browser dev tools they have changed a bit but the basics are still the same multi browser testing quite a few new options here besides what i show module systems mostly up to date but there is definitely a lot of movement in this area libraries and tools this is pretty out of date much of this has changed bower modernizr are less used now mvc frameworks section is quite out of date that changes on almost a daily basis the rest of the course is still just as applicable today as when it was published | front_end |
|
mlxtend | doi https joss theoj org papers 10 21105 joss 00638 status svg https doi org 10 21105 joss 00638 pypi version https badge fury io py mlxtend svg https badge fury io py mlxtend anaconda server badge https anaconda org conda forge mlxtend badges version svg https anaconda org conda forge mlxtend build status https ci appveyor com api projects status 7vx20e0h5dxcyla2 branch master svg true https ci appveyor com project rasbt mlxtend branch master codecov https codecov io gh rasbt mlxtend branch master graph badge svg https codecov io gh rasbt mlxtend python 3 https img shields io badge python 3 blue svg license https img shields io badge license bsd blue svg discuss https img shields io badge discuss github blue svg https github com rasbt mlxtend discussions img src docs sources img logo png alt mlxtend logo width 300px mlxtend machine learning extensions is a python library of useful tools for the day to day data science tasks br sebastian raschka 2014 2023 br links documentation https rasbt github io mlxtend https rasbt github io mlxtend pypi https pypi python org pypi mlxtend https pypi python org pypi mlxtend changelog https rasbt github io mlxtend changelog https rasbt github io mlxtend changelog contributing https rasbt github io mlxtend contributing https rasbt github io mlxtend contributing questions check out the github discussions board https github com rasbt mlxtend discussions br br installing mlxtend pypi to install mlxtend just execute bash pip install mlxtend alternatively you could download the package manually from the python package index https pypi python org pypi mlxtend https pypi python org pypi mlxtend unzip it navigate into the package and use the command bash python setup py install conda if you use conda to install mlxtend just execute bash conda install c conda forge mlxtend dev version the mlxtend version on pypi may always be one step behind you can install the latest development version from the github repository by executing bash pip install git git github com rasbt mlxtend git egg mlxtend or you can fork the github repository from https github com rasbt mlxtend and install mlxtend from your local drive via bash python setup py install br br examples python import numpy as np import matplotlib pyplot as plt import matplotlib gridspec as gridspec import itertools from sklearn linear model import logisticregression from sklearn svm import svc from sklearn ensemble import randomforestclassifier from mlxtend classifier import ensemblevoteclassifier from mlxtend data import iris data from mlxtend plotting import plot decision regions initializing classifiers clf1 logisticregression random state 0 clf2 randomforestclassifier random state 0 clf3 svc random state 0 probability true eclf ensemblevoteclassifier clfs clf1 clf2 clf3 weights 2 1 1 voting soft loading some example data x y iris data x x 0 2 plotting decision regions gs gridspec gridspec 2 2 fig plt figure figsize 10 8 for clf lab grd in zip clf1 clf2 clf3 eclf logistic regression random forest rbf kernel svm ensemble itertools product 0 1 repeat 2 clf fit x y ax plt subplot gs grd 0 grd 1 fig plot decision regions x x y y clf clf legend 2 plt title lab plt show docs sources img ensemble decision regions 2d png if you use mlxtend as part of your workflow in a scientific publication please consider citing the mlxtend repository with the following doi article raschkas 2018 mlxtend author sebastian raschka title mlxtend providing machine learning and data science utilities and extensions to python s scientific computing stack journal the journal of open source software volume 3 number 24 month apr year 2018 publisher the open journal doi 10 21105 joss 00638 url https joss theoj org papers 10 21105 joss 00638 raschka sebastian 2018 mlxtend providing machine learning and data science utilities and extensions to python s scientific computing stack j open source softw 3 24 license this project is released under a permissive new bsd open source license license bsd3 txt https github com rasbt mlxtend blob master license bsd3 txt and commercially usable there is no warranty not even for merchantability or fitness for a particular purpose in addition you may use copy modify and redistribute all artistic creative works figures and images included in this distribution under the directory according to the terms and conditions of the creative commons attribution 4 0 international license see the file license cc by txt https github com rasbt mlxtend blob master license cc by txt for details computer generated graphics such as the plots produced by matplotlib fall under the bsd license mentioned above contact the best way to ask questions is via the github discussions channel https github com rasbt mlxtend discussions in case you encounter usage bugs please don t hesitate to use the github s issue tracker https github com rasbt mlxtend issues directly | python machine-learning data-science data-mining association-rules supervised-learning unsupervised-learning | ai |
MIDI-LLM-tokenizer | midi llm tokenizer tools for converting mid files into text for training large language models the exact format of the intermediary text is not critical the primary goals are to 1 create a legible text format that can be easily used with existing llm tech and 2 encode midi data into an efficient format for training expected workflow would be 1 convert midi datasets into jsonl files midi to jsonlpy 2 tokenize build tokenizerpy jsonl files into binidx 3 train llm 4 sample llm 5 convert text into midi str to midipy for converting jsonl files to binidx format for training see https github com abel2076 json2binidx tool or https github com eleutherai gpt neox vocabulary midi files contain a lot of data and only some of it can be reasonably learned by a language model inspired by openai musenet and oore et al 2018 we have two main types of tokens wait tokens for timing 125 of them representing real time combined note velocity instrument tokens 128 notes 16 quantized velocity 16 binned instruments 32768 tokens pad start end tokens notes and quantized velocities are encoded as hex while instruments are encoded as the shortest unique string we knowingly discard the following information panning pitch bend modulation key signature time signature track names instrument names we assume the standard gm instruments simultaneous tokens e g chords are sorted by instrument note then midi event order this reduces unnecessary randomness in the data in the future instrument order could be uniformly randomized to allow constrained sampling where you provide a preexisting track and the model generates a melody scripts build tokenizer py builds a new huggingface tokenizer and vocab using vocab config json sh python build tokenizer py midi to jsonl py converts a directory or archive of mid midi files into a jsonl file of note sequences sh python midi to jsonl py path lmd full tar gz output lmd full jsonl workers 4 midi text conversion sh python midi to str py test mid sh python str to midi py start p 3c f t10 p 3e f t10 p 40 f t64 p 3c 0 p 3e 0 p 40 0 end output test mid | ai |
|
switchboard | switchboard packaging status https repology org badge tiny repos switchboard svg https repology org metapackage switchboard l10n https l10n elementary io widgets switchboard svg badge svg https l10n elementary io projects switchboard utm source widget system settings screenshot data screenshot png raw true plugs switchboard is just the container application for switchboard plugs which provide the actual settings for various hardware and software browse all plugs https github com elementary q switchboard plug org repositories building testing and installation you ll need the following dependencies libgee 0 8 dev libglib2 0 dev libgranite 7 dev libgtk 4 dev libadwaita 1 dev meson valac run meson to configure the build environment and then ninja to build meson build prefix usr cd build ninja to install use ninja install then execute with switchboard sudo ninja install switchboard making switchboard plugs documentation for libswitchboard is available on valadoc org https valadoc org switchboard 2 0 switchboard plug html | gmodule pantheon switchboard gtk gtk3 vala meson hacktoberfest | os |
Laravel-JS-Localization | laravel js localization convert you laravel messages and use them in the front end github assets banner svg laravel 5 5 https img shields io badge laravel 5 5 f4645f svg laravel 4 2 https img shields io badge laravel 4 2 f4645f svg latest stable version https poser pugx org mariuzzo laravel js localization v stable svg https packagist org packages mariuzzo laravel js localization total downloads https poser pugx org mariuzzo laravel js localization downloads svg https packagist org packages mariuzzo laravel js localization license https poser pugx org mariuzzo laravel js localization license svg https packagist org packages mariuzzo laravel js localization this package convert all your localization messages from your laravel app to javascript with a small library to interact with those messages following a very similar syntax you are familiar with features support laravel 4 2 5 0 5 1 5 2 5 3 5 4 5 5 6 x 7 x and 8 x includes lang js https github com rmariuzzo lang js a thin library highly inspired on laravel s translator https laravel com api 5 4 illuminate translation translator html class allow to specify desired lang files to be converted to js lang js api is based on laravel s translator https laravel com api 5 4 illuminate translation translator html class no need to learn a whole api table tbody tr td star webpack user try the new and shiny laravel localization loader https github com rmariuzzo laravel localization loader for webpack td tr tbody table installation shell composer require mariuzzo laravel js localization in your laravel app go to config app php and add the following service provider php mariuzzo laraveljslocalization laraveljslocalizationserviceprovider class usage the laravel js localization package provides a command that generate the javascript version of all your messages found at app lang laravel 4 or resources lang laravel 5 directory the resulting javascript file will contain all your messages plus lang js https github com rmariuzzo lang js a thin library highly inspired on laravel s translator https laravel com api 5 4 illuminate translation translator html class generating js messages shell php artisan lang js specifying a custom target shell php artisan lang js public assets dist lang dist js compressing the js file shell php artisan lang js c specifying a custom source folder shell php artisan lang js public assets dist lang dist js s themes default lang output a json file instead shell php artisan lang js json configuration first publish the default package s configuration file running shell php artisan vendor publish provider mariuzzo laraveljslocalization laraveljslocalizationserviceprovider the configuration will be published to config localization js php you may edit this file to define the messages you need in your javascript code just edit the messages array in the config file empty messages array will include all the language files in build to make only pagination php and validation php files to be included in build process php php return messages pagination validation using gulp http gulpjs com optional install gulp shell https github com sun zheng an gulp shell and then run it directly in your gulpfile js js var shell require gulp shell gulp task langjs shell task php artisan lang js c public js messages js using laravel s elixir http laravel com docs elixir optional before elixir 4 0 js elixir extend langjs function path gulp task langjs function gulp src pipe shell php artisan lang js path public js messages js return this queuetask langjs elixir 4 0 js var task elixir task elixir extend langjs function path new task langjs function gulp src pipe shell php artisan lang js path public js messages js and use it like this js elixir function mix mix langjs using laravel s mix https laravel com docs 5 4 mix with laravel 5 4 optional add webpack shell plugin next https www npmjs com package webpack shell plugin next to package json s devdependencies section add the following to webpack mix js js const webpackshellpluginnext require webpack shell plugin next add shell command plugin configured to create javascript language file mix webpackconfig plugins new webpackshellpluginnext onbuildstart php artisan lang js quiet onbuildend documentation this is a quick documentation regarding lang js https github com rmariuzzo lang js the thin javascript library included by laravel js localization the lang js https github com rmariuzzo lang js a thin library highly inspired on laravel s translator https laravel com api 5 3 illuminate translation translator html class go to lang js documentation lang js https github com rmariuzzo lang js to see all available methods getting a message js lang get messages home getting a message with replacements js lang get messages welcome name joe changing the locale js lang setlocale es checking if a message key exists js lang has messages foo support for singular and plural message based on a count js lang choice messages apples 10 calling the choice method with replacements js lang choice messages apples 10 name joe go to lang js documentation lang js https github com rmariuzzo lang js to see all available methods want to contribute 1 fork this repository and clone it 2 create a feature branch https guides github com introduction flow from develop git checkout develop git checkout b feature foo 3 push your commits and create a pull request prerequisites you will need to have installed the following softwares composer php 5 5 development setup after getting all the required softwares you may run the following commands to get everything ready 1 install php dependencies shell composer install 2 install test dependencies shell composer test install now you are good to go happy coding testing this project uses phpunit all tests are stored at tests directory to run all tests type in your terminal shell composer test div align center made with heart by rubens mariuzzo https github com rmariuzzo mit license license div | php javascript localization i18n l10n internationalization laravel laravel-package laravel-5-package laravel-4-package | front_end |
blockchain-demo | blockchain demo a web based demonstration of blockchain concepts live version https guggero github io blockchain demo this is a complete rewrite of anders brownworth s blockchain demo https github com anders94 blockchain demo with lots of additional features basically only the idea shown in his excellent demo video https www youtube com watch v 160omzbly8 remains the code is completely different that s why it s not a fork any more but a standalone repository if you are looking for the cryptographic tools that were found behind the advanced menu they have been moved to their own project https github com guggero cryptography toolkit changes in detail static html js so it can be served with github pages use angularjs for rendering the page add explanations to most pages expert mode that shows many details show implement concept of mining difficulty in expert mode show duration and speed of mining process in expert mode toggle between tx coinbase and data view send thanks created by oliver gugger https github com guggero btc tip address bc1qfgua5vhwm6myajak9p4crhwmwm2k6mczf789eh original idea by anders94 https github com anders94 btc 1k3nvcuzzvtuehw1qhkg2cm3virkh2exjp eth 0x84a90e21d9d02e30ddcea56d618aa75ba90331ff | blockchain |
|
database-fundamentals | softuni software engineering database fundamentals | server |
|
uniwa-cloud-todoapp | todo app a simple todo app project 1 of the cloud computing class at uniwa department of electrical and electronic engineering installation you can install the dependencies using pip and then run the app either using gunicorn for production or flask s development server python 3 is required sh pip install r requirements txt development python m todoapp production gunicorn todoapp app demo you can check the app at heroku https uniwa cc todoapp herokuapp com license this app was written for the purposes of the class feel free to use the code for whenever reason for the legalise check license md | cloud |
|
4T4_WMusic | 4t4 wmusic mobile application development project group members nipuna manoratne rashini liyanarachchi nisansali nikapotha hirumi samarawickrama | front_end |
|
full-moon-night | full moon night the project was originally designed to acquire more core data about the game full moon night and enable fast updates to minimize manual labor through data analysis we can expand the foundation for decision making and thus increase the win rate | server |
|
Node-RED-with-Watson-APIs-and-Building-custom-Models | node red overview node red is a visual tool for wiring the internet of things it is easy to connect devices data and api s services it can also be used for other types of applications to quickly assemble flows of services node red provides a browser based flow editor that makes it easy to wire together flows using the wide range of nodes in the palette flows can be then deployed to the runtime in a single click while node red is based on node js javascript functions can be created within the editor using a rich text editor a built in library allows you to save useful functions templates or flows for re use node red is included in the node red starter application in ibm cloud but you can also deploy it as a stand alone node js application node red can not only be used for iot applications but it is a generic event processing engine for example you can use it to listen to events from http websockets tcp twitter and more and store this data in databases without having to program much if at all you can also use it for example to implement simple rest apis you can find many other sample flows on the node red website this app in this lab will be created and run on your ibm cloud account as a first step ibm cloud node red environment will be created and setup this will then host your node red flows we will then create multiple flows to work with different watson api s before you begin before setting up your environment and in order to create the services needed for the workshop you ll need an ibm cloud account 1 sign up for an account here https ibm biz bdzgta 2 verify your account by clicking on the link in the email sent to you node red setup instuctions once you have created your ibm cloud account follow the below instructions to setup your node red environment node red setup https github com ibmdeveloperuk node red with watson apis and building custom models blob master node red 20setup md node red and ibm watson apis ibm watson natural language understanding watsonnlu md ibm watson speech to text speechtotext md ibm watson language translator languagetranslator md model asset exchange model asset exchange https developer ibm com exchanges models part 2 creating a custom model on watson studio custommodel md | node-red visual-recognition ibm-cloud | ai |
data | data a place to store data engineering learning and created databases | server |
|
mobile-policy | category management policy 16 2 improving the acquisition and management of common information technology mobile devices and services the office of management and budget omb is accepting public comment on draft guidance to improve the acquisition and management of mobile services and devices this policy is the third in a series of category management policies to drive greater performance efficiencies and savings in commonly purchased information technology goods and services the public comment period has ended thank you for your comments omb will analyze all feedback submitted during the public comment period and revise the policy as necessary the proposed guidance is now open for public comment on this page the public feedback period will be 30 days closing on april 28 2016 following the public comment period feedback received will be analyzed to help inform the development of any final policy public domain this project is in the worldwide public domain license md as stated in contributing contributing md this project is in the public domain within the united states and copyright and related rights in the work worldwide are waived through the cc0 1 0 universal public domain dedication https creativecommons org publicdomain zero 1 0 all contributions to this project will be released under the cc0 dedication by submitting a pull request you are agreeing to comply with this waiver of copyright interest privacy all comments messages pull requests and other submissions received through official white house pages including this github page may be subject to archiving requirements see the https www whitehouse gov privacy for more information developing on the site locally this site uses jekyll http jekyllrb com sass http sass lang com bourbon http bourbon io neat http neat bourbon io and requires ruby 2 x install dependencies with bundler bundle install and run the site with jekyll bundle exec jekyll serve watch if all goes well visit the site at http localhost 4000 | server |
|
front-end-interview-skills | warning pdf javascript leetcode thinking https github com cuixueshe front end interview skills blob main guide thinking md subjects https github com cuixueshe front end interview skills blob main guide subjects md plan https github com cuixueshe front end interview skills blob main guide plan md materials https github com cuixueshe front end interview skills blob main guide materials md issue resume https github com cuixueshe front end interview skills blob main guide resume md interview skills https github com cuixueshe front end interview skills blob main guide interview skills md 2 3 pr javascript materials https github com cuixueshe front end interview skills blob main guide materials md github readme md guide company technology stack md interview experience md interview process md interview skills md materials md plan md resume md subjects md thinking md pdf pdf question back others md cuixueshe md | front_end |
|
InvestigationCenterProject | el sistema grupos investigaci n ucr tabla de contenidos 1 definiciones markdown header definiciones acronimos y abreviaciones 2 introducci n markdown header introduccion 3 equipos markdown header listado de equipos y miembros de los equipos 4 descripci n del sistema markdown header descripcion general 4 1 contexto y situaci n actual markdown header contexto actual 4 2 problema que resuelve markdown header problema que resuelve 4 3 interesados del proyecto y tipos de usuarios markdown header interesados del proyecto y tipos de usuarios 4 4 soluci n propuesta markdown header solucion propuesta 4 5 an lisis del entorno markdown header analisis del entorno 4 6 visi n del producto markdown header vision del producto 4 7 relaci n con otros sistemas externos markdown header relacion con otros sistemas externos 4 8 descripci n de los sistemas markdown header descripcion de los temas modulos asignados a cada equipo 4 9 requerimientos funcionales markdown header requerimientos funcionales 4 10 mapa de ruta del producto markdown header mapa de ruta del producto 4 11 requerimientos no funcionales markdown header requerimientos no funcionales que debe cumplir toda la aplicacion web 5 decisiones t cnicas markdown header decisiones tecnicas 5 1 metodolog as utilizadas y procesos definidos markdown header metodologias utilizadas en el desarrollo del proyecto 5 2 artefactos utilizados en el desarrollo del proyecto markdown header artefactos utilizados en el desarrollo del proyecto 5 3 tecnolog as utilizadas con sus respectivas versiones markdown header tecnologias utilizadas con sus respectivas versiones 5 4 repositorio de c digo y estrategia git para el proyecto markdown header repositorio de codigo y estrategia git para el proyecto 5 5 definici n de listo markdown header definicion de listo 6 referencias bibliogr ficas markdown header referencias bibliograficas definiciones acr nimos y abreviaciones citic centro de investigaciones en tecnolog as de informaci n y comunicaci n ucr universidad de costa rica odi oficina de divulgaci n e informaci n ecci escuela de ciencias de computaci n e inform tica ddd domain driven design vpn virtual private network introducci n este documento especifica la organizaci n entre los equipos de trabajo del proyecto cada uno de los roles de cada miembro al igual que las decisiones t cnicas para el desarrollo del proyecto se estructura en 3 secciones principales listado de equipos descripci n general del proyecto y decisiones t cnicas listado de equipos y miembros de los equipos equipo panas nombre rol sebasti n montero castro scrummaster andrea alvarado acon database carlos mora look and feel dylan arias git greivin s nchez garita ambassador and documentation equipo monkey madness nombre rol nels n lvarez scrum master and database tyron fonseca look and feel rodrigo contreras git roberto m ndez documentaci n dean vargas ambassador equipo cheetos nombre rol angie sofia castillo campos scrummaster sebastian gonzalez varela documentaci n oscar navarro c spedes github steven n ez murillo look and feel esteban quesada quesada bases de datos gabriel revillat zeled n ambassador equipo pollos hermanos nombre rol frank alvarado alfaro bases de datos diana luna pacheco scrum master pablo ot rola rodr guez documentaci n christian rojas r os look and feel david s nchez l pez git elvis badilla mena ambassador descripci n general contexto actual el centro de investigaciones en tecnolog as de la informaci n y comunicaci n de la ucr tiene desea promover la investigaci n en las reas relacionadas con las tics es por esto que desea desarrollar una aplicaci n web para administrar la informaci n de los grupos de investigaci n cient fica que trabajan en esta rea y ayudar a facilitar la divulgaci n cient fica que realizan estos grupos actualmente la universidad de costa rica no cuenta con una nica aplicaci n web para los distintos centros de investigaci n que existen es por esto que este proyecto plantea ser una primera soluci n escalable que comience a utilizarse en el citic y luego pueda ser usada por los otros centros de investigaci n problema que resuelve dado la ausencia de una nica aplicaci n web centralizada que permita acceder de manera sencilla y ordena a la informaci n sobre proyectos de investigaci n publicaciones tesis noticias de los centros de investigaci n de la ucr esta aplicaci n web viene a solucionar el problema de una informaci n de dif cil acceso y desactualizada que existe actualmente en el contexto de las diferentes p ginas de los centros de investigaci n interesados del proyecto y tipos de usuarios en primera instancia la aplicaci n ser desarrollada con un enfoque en el centro de investigaciones en tecnolog as de la informaci n y comunicaci n citic de la ucr de modo que las personas interasadas son todas aquellas que se relacionan directa o indirectamente con el citic tales como visitantes personal administrativo e investigadores estudiantes publicadores entre otros dado que la aplicaci n busca expandirse a los dem s centros de investigaci n dentro de la universidad las personas interesadas son tambi n todas aquellas que se ver n beneficiadas de una aplicaci n centralizada de f cil acceso respuesta r pida e intuitiva soluci n propuesta desarrollar una aplicaci n web que permita administrar la informaci n de los grupos de investigaci n cient fica y facilitar la divulgaci n cient fica que se realiza an lisis del entorno requerimientos del usuario el sistema permitir acceder a la informaci n del centro de investigaci n el sistema muestra la informaci n de manera ordenada el sistema permite el filtrado de informaci n la informaci n presentada est actualizada al d a regulaci n este sistema se regula bajo las normas de la universidad de costa rica ser un sistema que cumpla con los lineamientos t cnicos del centro de inform tica as como los lineamientos formales establecidos por la oficina de divulgaci n e informaci n odi y la vicerector a de investigaci n competidores competidores internos distintas p ginas web existentes de los centros de investigaci n otros grupos de desarrollo dentro de la universidad competidores externos empresas desarrolladoras de software desarrolladores independientes visi n del producto para personas interesadas en proyectos de los centros de investigaci n de la ucr quienes requieren acceder a investigaciones de los diferentes centros y sus grupos de investigaci n de la ucr el sistema grupos investigaci n ucr es una aplicaci n web que permite la b squeda de proyectos de investigaci n publicaciones tesis y personas investigadoras de forma distinta a las p ginas actuales de centros de investigaci n de la ucr nuestro producto proveer un sitio web simple de usar que facilitar la administraci n y publicaci n de la investigaci n que realizan los grupos de investigaci n de la ucr relaci n con otros sistemas externos los diversos centros de investigaci n de la ucr administran de manera distinta su informaci n principalmente en el acceso a los recursos que proporcionan dado a que algunos casos tienen recursos de manera privativa como proyectos que se est n realizando tesis o publicaciones donde solo proporcionan un resumen adem s algunos de los sitios de los centros de investigaci n son lentos o tienen un look and feel distinto con dise os que en ocasiones pueden llegar a dificultar el acceso a la informaci n de inter s nuestra aplicaci n viene a permitir centralizar los proyectos de investigaci n publicaciones y tesis en un mismo sitio para que pueda ser accedido por las personas interesadas por medio de una interfaz amigable y al igual que facilitar la administraci n proporcionando informaci n estad stica y a su vez permitiendo una mejor divulgaci n de la informaci n cient fica descripci n de los temas m dulos asignados a cada equipo pollos hermanos publicaciones y estad sticas publicaciones este m dulo se encarga de gestionar las publicaciones del centro de investigaci n y de la visualizaci n de cada una de est s publicaciones estad sticas este m dulo se encarga de desplegar las estad sticas de la cantidad de publicaciones en cuatro apartados por grupo de invstigaci n por a o por rea de investigaci n y por tipo de publicaci n monkeymadness centro de investigaci n grupos de investigaci n reas de investigaci n centro de investigaci n este m dulo contiene a la persona directora personal administrativo y grupos de investigaci n asociados grupos de investigaci n este m dulo es el encargado de desplegar la informaci n de personal administrativo y personas relacionadas investigadores estudiantes colaboradores adem s de las publicaciones proyectos tesis entre otros reas de investigaci n este m dulo contiene descripciones y palabras clave que permiten la clasificaci n de la informaci n proyectos tesis entre otros que se encuentra dentro de los centros y grupos de investigaci n flaming hot cheetos proyectos de investigaci n y tesis de grado y posgrado proyectos de investigaci n este m dulo se encarga de la administraci n y visualizaci n de los proyectos de investigaci n con sus caracter sticas y diferentes relaciones tesis grado y posgrado este m dulo se encarga de la administraci n y visualizaci n de las tesis de grado y posgrado con sus caracter sticas y diferentes relaciones panas personas noticias contacto trabajar con nosotros personas este m dulo se encarga de mostrar la informaci n de contacto publicaciones asociadas y rol dentro del centro de investigaci n para cada persona as como de la administraci n de cuentas es el m dulo m s grande del equipo informaci n genereal noticias despliegue de la informaci n relacionada a las noticias actualizadas del centro de investigaci n contacto m dulo encargado de brindar informaci n pertinente al contacto con el centro de investigaci n trabaja con nosotros brinda la informaci n necesaria para conocer las labores que realiza el centro de investigaci n y c mo participar requerimientos funcionales los requerimientos funcionales se administran por medio de la herramienta de jira el cual puede ser accedido por este enlace http 10 1 4 22 8080 projects piib22021 summary para poder acceder a este proyecto de jira es necesario utilizar el vpn proporcionado por la ecci las instrucciones se pueden en este enlace https www ecci ucr ac cr colaboradores procedimientos acceso remoto por red privada virtual vpn la ecci mapa de ruta del producto si desea conocer el mapa de ruta del producto puede hacerlo accediendo a este enlace https docs google com spreadsheets d 1ga4vefm1qp1bqun9uzckp5slvnvys1ng37ylmm5xcnm edit usp sharing requerimientos no funcionales que debe cumplir toda la aplicaci n web el sistema ser desarrollado para navegadores web basados en chromium y gecko firefox eficiencia toda la funcionalidad del sistema debe ser capaz de responder en menos de 10 segundos el sistema debe ser capaz de operar de manera adecuada con hasta 1000 usuarios de manera concurrente los datos que se modifican deben ser actualizados en la base de datos para todos los usuarios en menos de 2 segundos usabilidad el sistema debe proporcionar mensajes de error que sean informativos y orientados a usuario final el sistema debe poseer interfaces gr ficas bien formadas el tiempo de aprendizaje del sistema por un usuario deber ser menor a 4 horas seguridad el sistema se debe desarrollar aplicando recomendaciones de programaci n que mantengan la seguridad de los datos los datos solo podr n ser modificados por el administrador del sistema o miembro con los permisos correspondientes a su rol organizaci n la metodolog a de desarrollo de software ser domain driven development ddd y arquitectura limpia cada semana se debera producir reportes de las reuniones diarias en los cuales cada equipo muestre los avances del proyecto decisiones t cnicas clean code la especificaci n definida por los equipos desarrolladores se encuentra en c digo limpio https docs google com document d 1hw9ravlhxjj9bz xtidhbhkgfuqr4v2q edit usp sharing ouid 105993574609311310782 rtpof true sd true arquitectura limpia para la realizaci n de este proyecto se trabaj en la arquitectura limpia de tipo ddd la decisi n se toma en conjunto con el asesor t cnico que recomienda la arquitectura pues suele ir de la mano con blazor es necesario recalcar que las capas exteriores pueden depender de las capas interiores pero no al rev s es decir la capa de dominio no puede depender de la capa de presentaci n esta arquitectura cuenta con 4 capas bien definidas dominio contiene las entidades servicios del dominio interfaces para repositorio aplicaci n casos de uso y servicios de aplicaci n services infraestructura implementaciones de las interfaces de dominio y fuentes de datos contextos presentaci n interfaz de usuario en general es quien interact a con los servicios de la aplicaci n razor metodolog as utilizadas y procesos definidos metodolog a gil scrum scrum es un marco de gesti n para el desarrollo incremental de productos vali ndose de uno o m s equipos multifuncionales autoorganizados de aproximadamente siete personas cada uno estos equipos son responsables de la creaci n y adaptaci n de los procesos mediante una estructura de roles reuniones reglas y artefactos para el trabajo en equipo que scrum posee en scrum se realizan entregas parciales y regulares del producto final priorizadas por el beneficio que aportan al receptor del proyecto el mayor beneficio de la metodolog a scrum se experimenta en el trabajo complejo que implica la creaci n de conocimiento y colaboraci n roles se definieron distintos roles para trabajar en equipos transversales ambassador encargado git encargado documentaci n encargado look and feel scrum master reuniones se decide realizar por lo menos una reuni n semanal con el objetivo de conocer los avances problemas y sidebar topics que enfrenta cada equipo en el desarrollo de sus m dulos reglas cada equipo es responsable de las reglas bajo las que se trabaja pero en conjunto todos deben cumplir con los requisitos de clean code arquitectura limpia y definici n de acordados artefactos en el siguiente punto del documento se encuentran los artefactos utilizados markdown header artefactos utilizados en el desarrollo del proyecto cada equipo cuenta con un backlog de prioridades donde se almacenan las historias de usuario de acuerdo a su importancia se trabaja mediante ciclos o iteraciones llamados sprints estos consisten de per odos de trabajo continuo de entre 2 semanas y 2 meses al final de cada per odo se procede a una revisi n del trabajo revisado con los inversores y el due o del producto artefactos utilizados en el desarrollo del proyecto se realizaron los siguientes artefactos para el dise o modelo conceptual de la aplicaci n web https miro com app board o9j ly1kqic modelo conceptual y relacional base de datos https lucid app lucidchart invitations accept inv 16199716 9634 40d5 9347 320976039096 tecnolog as utilizadas con sus respectivas versiones microsoft visual studio enterprise 2019 v16 11 2 microsoft sql management studio 2018 v15 0 18386 0 microsoft sql server blazor asp net core 5 0 repositorio de c digo y estrategia git para el proyecto el c digo del proyecto se encuentra en un repositorio de bitbucket https cristian quesadalopez bitbucket org cristian quesadalopez ecci ci0128 ii2021 g01 pi git dado que se utiliz la metolod a gil scrum se defini que se utilizar a una rama main de la cual se crea una rama por cada equipo de acuerdo a sus mod los en cada subrama de equipo se crea una subrama por cada historia de usuario definida por los equipos definici n de listo para subir al master no romper la arquitectura tiene que completar una tarea t cnica comentario en el c digo especificando la ltima tarea t cnica que modific esa parte espec fica del c digo comentario explicando algo en el c digo que se considere complicado si se hace un cambio a los modelos se debe de hacer un pull request a los otros equipos seguir principios de clean code b sico verificar que corra todas las tareas de integraci n deben tener pull requests entre los equipos involucrados para presentar en el sprint review estar en el master y actualizado en el backlog tiene que estar cumpliendo los criterios de aceptaci n tiene que haber sido revisada y aprovada por el p o referencias bibliogr ficas elmasri r and navathe s 2010 fundamentals of database systems 6th ed boston united states addison wesley cohn m 2005 agile estimating and planning 1st ed pearson india uml y patrones craig larman ingenier a de software roger pressman | server |
|
webstack | webstack my personal stack of micro libraries for web development each of these libraries is available separately on http nuget org bulky high speed lightweight bulk insert library for sqlite mysql and sqlserver container a modified version of munq that does not throw exceptions internally when resolving dates handles date periods and occurrences generally useful when billing depot a caching interface useful when moving from local to remote cache uses remote metaphors email a library for handling delivery of email in process or parallelized in a service hollywood a governor shim that lets you control who can see your web app in production linger a simple delayed job queue inspired by delayed job for performing tasks asynchronously minirack a manager for httpmodule like behavior that can be easily plugged into a web app minirack routes seo friendly canonical route manager with zero configuration money a small library to deal with calculating money and displaying it in different cultures paging a small library to handle scenarios for paging works with iqueryable too but those are old news table descriptor maps objects and properties to database tables and columns fast flexible tophat like dapper but for connection management gives any idbconnection unit of work scoping tuxedo like dapper but for common queries gives you high speed insert update delete with no sql voltron driven development use one use them all use none i don t believe in frameworks use purpose built libraries that accomplish one and only one thing | front_end |
|
BloodDonationApp | blooddonationapp mobile app development assignment team members progress suren 100 faris 75 faiqah 100 jia jun 100 siti nurfaizzah 25 done 1 firebase connection 2 login and registration 3 bottom navigation 4 personal details page 5 questionnaire page todo 1 design every page 2 other firebase features connection | front_end |
|
SQL_Challenges | sql homework employee database a mystery in two parts sql png sql png background it is a beautiful spring day and it is two weeks since you have been hired as a new data engineer at pewlett hackard your first major task is a research project on employees of the corporation from the 1980s and 1990s all that remain of the database of employees from that period are six csv files in this assignment you will design the tables to hold data in the csvs import the csvs into a sql database and answer questions about the data in other words you will perform 1 data engineering 3 data analysis note you may hear the term data modeling in place of data engineering but they are the same terms data engineering is the more modern wording instead of data modeling before you begin 1 create a new repository for this project called sql challenge do not add this homework to an existing repository 2 clone the new repository to your computer 3 inside your local git repository create a directory for the sql challenge use a folder name to correspond to the challenge employeesql 4 add your files to this folder 5 push the above changes to github instructions data modeling inspect the csvs and sketch out an erd of the tables feel free to use a tool like http www quickdatabasediagrams com http www quickdatabasediagrams com data engineering use the information you have to create a table schema for each of the six csv files remember to specify data types primary keys foreign keys and other constraints for the primary keys check to see if the column is unique otherwise create a composite key https en wikipedia org wiki compound key which takes to primary keys in order to uniquely identify a row be sure to create tables in the correct order to handle foreign keys import each csv file into the corresponding sql table note be sure to import the data in the same order that the tables were created and account for the headers when importing to avoid errors data analysis once you have a complete database do the following 1 list the following details of each employee employee number last name first name sex and salary 2 list first name last name and hire date for employees who were hired in 1986 3 list the manager of each department with the following information department number department name the manager s employee number last name first name 4 list the department of each employee with the following information employee number last name first name and department name 5 list first name last name and sex for employees whose first name is hercules and last names begin with b 6 list all employees in the sales department including their employee number last name first name and department name 7 list all employees in the sales and development departments including their employee number last name first name and department name 8 in descending order list the frequency count of employee last names i e how many employees share each last name bonus optional as you examine the data you are overcome with a creeping suspicion that the dataset is fake you surmise that your boss handed you spurious data in order to test the data engineering skills of a new employee to confirm your hunch you decide to take the following steps to generate a visualization of the data with which you will confront your boss 1 import the sql database into pandas yes you could read the csvs directly in pandas but you are after all trying to prove your technical mettle this step may require some research feel free to use the code below to get started be sure to make any necessary modifications for your username password host port and database name sql from sqlalchemy import create engine engine create engine postgresql localhost 5432 your db name connection engine connect consult sqlalchemy documentation https docs sqlalchemy org en latest core engines html postgresql for more information if using a password do not upload your password to your github repository see https www youtube com watch v 2uatpmnvh0i https www youtube com watch v 2uatpmnvh0i and https help github com en github using git ignoring files https help github com en github using git ignoring files for more information 2 create a histogram to visualize the most common salary ranges for employees 3 create a bar chart of average salary by title epilogue evidence in hand you march into your boss s office and present the visualization with a sly grin your boss thanks you for your work on your way out of the office you hear the words search your id number you look down at your badge to see that your employee id number is 499942 submission create an image file of your erd create a sql file of your table schemata create a sql file of your queries optional create a jupyter notebook of the bonus analysis create and upload a repository with the above files to github and post a link on bootcamp spot ensure your repository has regular commits i e 20 commits and a thorough readme md file rubric unit 9 rubric sql homework employee database a mystery in two parts https docs google com document d 1oksntynct0v0e vkhimj9 ig0 oxnwczajlkv0avmkq edit usp sharing references mockaroo llc 2021 realistic data generator https www mockaroo com https www mockaroo com 2021 trilogy education services llc a 2u inc brand confidential and proprietary all rights reserved | server |
|
DIG-OpenIE | dig openie openie logo res dig openie jpg apply open information extraction technology to extract useful information for avoiding illigle human trafficking behaviours this is a direct research outcome when i worked with prof pedro szekely http usc isi i2 github io szekely on dig project http usc isi i2 github io dig main idea develop capability to automatically build knowledge graphs and their ontologies using extractions from open information extraction open ie systems these extractions are typically subject predicate object triples such as my name be jessica or i be sweet female where each of the elements of the triples are text the goal is to create a knowledge graph with appropriate nodes and links that represent the information in the extractions the problem is difficult because open ie extractions are noisy and the mapping from the extractions to knowledge graphs is complex and subtle to address the problem and evaluate progress we propose to use the knowledge graphs we already built for escorts weapons and patents as gold standards we will first address the simpler problem of using open ie extractions to populate individual attributes of the nodes in a graph we will train learning algorithms using two of the knowledge graphs and test on the third one by removing one attribute definition from the ontology and removing all instances of the attribute in the graph we will test the ability of our system to recover the removed attribute using open ie extractions measuring performance using precision and recall for example if we remove the name attribute of people can the system add it back to the ontology and populate the name of every person node in the graph we will then address increasingly more difficult problems recreating the relationships among nodes recreating a class of objects including its attributes and relationships to other objects and finally recreating the whole graph this incremental research plan breaks the very hard problem of creating knowledge graphs from open ie extractions into manageable sub problems each representing a valuable capability that we plan to deliver at the end of each stage of the research during the next year we plan to address the problem or recreating attributes and relationships and in the last year we plan to address the problem of recreating classes of nodes and the full graph delivery digoie annotation https github com zwein27 digoie annotation a python program to annotate person name or phone number based on open information extraction technology digoie extraction https github com zwein27 digoie extraction a python project to extract useful information for identifying person s description behavior and relationship my progress task one run reverb on 7 documents from different websites list extractions from reverb score extractions as useful or useless and make a note of information for extractions ran reverb on 5 documents from each web site camera reviews and 2 forum sites in the weapons domain for each document listed the extractions and score them as useful or useless a useful extraction is one we could use to build a knowledge graph use your judgment made a note of information that should have been extracted but wasn t task two write a program to train and run classifers based on the human trafficking datasets from isi for annotating person s name fetched data by elastic search from dig database run it on reverb and get extractions implement feature extraction and build feature vectors to generate training and testing datasets for machine learning process wrote code for automatically labeling based on given names applied scikit learn library ot train and test classifers such as decision tree random forest ada boost svm and naive bayes improved the implementation for feature extraction by which the accuracy for decision tree and random forest are at around 97 relatively task three evaluate the performance of classifers on random datasets that contain person names or that contain phone numbers tested the classifers on random datasets that contain person names and get accuracy at around 90 refined implementation for training process by adding a name dictionary that contains enough names instead of the names from original datasets updated the feature set from the feature extractions by removing real person names and adding reverb accuracy improved the precision and recall for random dataset that contains names at around 93 and 83 relatively tested the classifers on random datasets that contain phone numbers and found that this solution can not work well for annotating phone numbers generated datasets with bigram and trigram tuples from original one and trained and ran classifers based on new datasets classifers work better on original one than datasets in bigram and trigram conducted a research on random datasets and low extractions for reverb and low precision and recall for these sentences that contain phone number by which found that this solution cannot work well for annotating phone numbers task four write a program to extract useful information and build knowledge graph for the dataset from an escorts forum that contain reviews run and analyze the dataset from the forum that contains customer reviews try to find useful information from the reviews information like people s description actions and relationships train and run classifiers to flag these informations write a program to build knowledge graph for specifc description and behaviour | server |
|
Senior-Project | while loop simulator see google drive folder https drive google com open id 0b8lqhjvjvhdvwetrvzdqttgtu00 here is a video demonstration https www youtube com watch v 7uughjmyrlq | os |
|
AnomalyDetection | anomalydetection anomaly detection in computer vision paper list in records paper records readme md the difficulty and the potential solution solution records difficulty md some resources about anomaly detection resources resources md projects projects md | ai |
|
exonum-client | light client for exonum blockchain build status travis image travis url npm version npmjs image npmjs url coverage status coveralls image coveralls url js standard style codestyle image codestyle url travis image https img shields io travis exonum exonum client master svg travis url https travis ci com exonum exonum client npmjs image https img shields io npm v exonum client svg npmjs url https www npmjs com package exonum client coveralls image https coveralls io repos github exonum exonum client badge svg branch master coveralls url https coveralls io github exonum exonum client branch master codestyle image https img shields io badge code 20style standard brightgreen svg codestyle url http standardjs com a javascript library to work with exonum blockchain from browser and node js used to sign transactions before sending to blockchain and verify blockchain responses using cryptographic proofs contains numerous helper functions find out more information about the architecture and tasks docs clients of light clients in exonum if you are using exonum in your project and want to be listed on our website github list write us a line to contact exonum com library compatibility with exonum core javascript light client exonum core 0 18 4 1 0 0 18 3 1 0 0 rc 1 0 17 1 0 12 0 16 9 0 11 0 16 9 0 10 0 13 0 0 9 0 10 2 0 8 0 9 0 0 7 0 6 1 0 6 0 6 1 0 5 0 3 0 0 4 0 0 3 0 0 3 0 0 2 0 0 2 0 0 1 1 0 1 getting started getting started data types data types hash hash signature signature sign data sign data verify signature verify signature transactions transactions define transaction define transaction sign transaction sign transaction send transaction send transaction send multiple transactions send multiple transactions cryptographic proofs cryptographic proofs merkle tree proof merkle tree proof map proof map proof integrity checks integrity checks verify block verify block verify table verify table built in structures built in structures helpers helpers generate key pair generate key pair get random number get random number converters converters hexadecimal to uint8array hexadecimal to uint8array hexadecimal to string hexadecimal to string uint8array to hexadecimal uint8array to hexadecimal binary string to uint8array binary string to uint8array binary string to hexadecimal binary string to hexadecimal string to uint8array string to uint8array contributing contributing coding standards coding standards test coverage test coverage changelog changelog license license getting started there are several options to include light client library in the application the preferred way is to install exonum client as a package npmjs from npm registry sh npm install exonum client otherwise you can download the source code from github and compile it before use in browser include in browser html script src node modules exonum client dist exonum client min js script usage in node js javascript let exonum require exonum client data types exonum uses protobufjs protobufjs library to serialize structured data into protobuf protobuf format each transaction is signed sign data before sending into blockchain before the transaction is signed it is converted into byte array under the hood the data received from the blockchain should be converted into byte array under the hood before it will be possible to verify proof of its existence cryptographic proofs using cryptographic algorithm developer can both define data structures on the fly or use precompiled stubs with data structures to define protobuf structures use protobufjs protobufjs library example javascript const messageschema new type custommessage add new field balance 1 uint32 add new field name 2 string const message exonum newtype messageschema exonum newtype function requires a single argument of protobuf type type hash exonum uses cryptographic hashes docs glossary hash of certain data for transactions transactions and proofs cryptographic proofs different signatures of the hash function are possible javascript exonum hash data type type hash data argument description type data data to be processed using a hash function object type definition of the data type custom data type define data type or transaction define transaction an example of hash calculation javascript define a data structure const message new type user add new field balance 1 uint32 add new field name 2 string define a data type const user exonum newtype message data to hash const data balance 100 name john doe get a hash const hash user hash data it is also possible to get a hash from byte array javascript exonum hash buffer argument description type buffer byte array array or uint8array an example of byte array hash calculation javascript const arr 8 100 18 8 74 111 104 110 32 68 111 101 const hash exonum hash arr signature the procedure for signing data sign data using signing key pair and verifying of obtained signature verify signature is commonly used in the process of data exchange between the client and the service built in exonum keypair generate key pair helper function can be used to generate a new random signing key pair sign data the signature can be obtained using the secret key of the signing pair there are three possible signatures of the sign function javascript exonum sign secretkey data type type sign secretkey data exonum sign secretkey buffer argument description type secretkey secret key as hexadecimal string string data data to be signed object type definition of the data type custom data type define data type buffer byte array array or uint8array the sign function returns value as hexadecimal string verify signature the signature can be verified using the author s public key there are two possible signatures of the verifysignature function javascript exonum verifysignature signature publickey data type type verifysignature signature publickey data argument description type signature signature as hexadecimal string string publickey author s public key as hexadecimal string string data data that has been signed object type definition of the data type custom data type define data type the verifysignature function returns value of boolean type an example of signature creation and verification javascript define a data structure const message new type user add new field balance 1 uint32 add new field name 2 string const user exonum newtype message define a signing key pair const keypair exonum keypair data that has been hashed const data balance 100 name john doe signature obtained upon signing using secret key const signature exonum sign keypair secretkey data user verify the signature const result exonum verifysignature signature keypair publickey data user transactions transaction in exonum is an operation to change the data stored in blockchain transaction processing rules is a part of business logic implemented in a service docs architecture services sending data to the blockchain from a light client consist of 3 steps 1 describe the fields of transaction using custom data types define data type 2 sign sign data data of transaction using signing key pair 3 send transaction send single transaction to the blockchain read more about transactions docs architecture transactions in exonum or see the example of their usage examples transactions js define transaction an example of a transaction definition javascript const transaction new type custommessage add new field to 2 string add new field amount 3 uint32 const sendfunds new exonum transaction schema transaction service id 130 method id 0 exonum transaction constructor requires a single argument of object type with the next structure property description type schema protobuf data structure object service id service id docs architecture serialization service id number method id method id docs architecture serialization message id number schema structure is identical to that of custom data type define data type sign transaction an example of a transaction signing javascript signing key pair const keypair exonum keypair transaction data to be signed const data from john to adam amount 50 create a signed transaction const signed sendfunds create data keypair send transaction to submit transaction to the blockchain send function can be used javascript exonum send explorerbasepath transaction attempts timeout property description type explorerbasepath api address of transaction explorer on a blockchain node string transaction signed transaction bytes string uint8array or array like attempts number of attempts to check transaction status pass 0 in case you do not need to verify if the transaction is accepted to the block optional default value is 10 number timeout timeout between attempts to check transaction status optional default value is 500 number the send function returns a promise with the transaction hash the promise resolves when the transaction is committed accepted to a block an example of a transaction sending javascript define transaction explorer address const explorerbasepath http 127 0 0 1 8200 api explorer v1 transactions const transactionhash await exonum send explorerbasepath signed serialize send multiple transactions to submit multiple transactions to the blockchain sendqueue function can be used transactions will be sent in the order specified by the caller each transaction from the queue will be sent to the blockchain only after the previous transaction is committed javascript exonum sendqueue explorerbasepath transactions attempts timeout property description type explorerbasepath api address of transaction explorer on a blockchain node string transactions list of transactions array attempts number of attempts to check each transaction status pass 0 in case you do not need to verify if the transactions are accepted to the block optional default value is 10 number timeout timeout between attempts to check each transaction status optional default value is 500 number the sendqueue function returns a promise with an array of transaction hashes the promise resolves when all transactions are committed cryptographic proofs a cryptographic proof is a format in which a exonum node can provide sensitive data from a blockchain these proofs are based on merkle trees docs glossary merkle tree and their variants light client library validates the cryptographic proof and can prove the integrity and reliability of the received data read more about design of cryptographic proofs docs advanced merkelized list in exonum merkle tree proof javascript const proof new exonum listproof json valuetype console log proof entries the listproof class is used to validate proofs for merkelized lists argument description type json the json presentation of the proof obtained from a full node object valuetype data type for values in the merkelized list custom data type define data type the returned object has the following fields field description type merkleroot hexadecimal hash of the root of the underlying merkelized list string entries elements that are proven to exist in the list together with their indexes array index number value v length list length number see an example of using a listproof examples list proof js map proof javascript const proof new exonum mapproof json keytype valuetype console log proof entries the mapproof class is used to validate proofs for merkelized maps argument description type json the json presentation of the proof obtained from a full node object keytype data type for keys in the merkelized map custom define data type or built in data type valuetype data type for values in the merkelized map custom data type define data type keys in a map proof can either be hashed which is the default option or raw to obtain a raw version for keytype use mapproof rawkey keytype the key type is determined by the service developer when the service schema is created raw keys minimize the amount of hashing but require that the underlying type has fixed width binary serialization the returned object has the following fields field description type merkleroot hexadecimal hash of the root of the underlying merkelized map string missingkeys set of keys which the proof asserts as missing from the map set keytype entries map of key value pairs that the are proven to exist in the map map keytype valuetype see an example of using a mapproof examples map proof js integrity checks verify block javascript exonum verifyblock data validators each new block in exonum blockchain is signed by validators docs glossary validator to prove the integrity and reliability of the block it is necessary to verify their signatures the signature of each validator are stored in the precommits the verifyblock function throws an error if a block is invalid argument description type data structure with block and precommits object validators an array of validators public keys as a hexadecimal strings array verify table javascript exonum verifytable proof statehash fulltablename verify table existence in the root tree returns root hash for the table as hexadecimal string argument description type proof the json presentation of the proof obtained from a full node object statehash hash of current blockchain state stored in each block string fulltablename name of the table such as token wallets string built in structures the library exports protobuf declarations from the core crate consult protobuf files included into the library proto for more details helpers generate key pair javascript const pair exonum keypair javascript publickey 32 byte public key secretkey 64 byte secret key exonum keypair function generates a new random ed25519 docs glossary digital signature signing key pair using the tweetnacl tweetnacl key pair cryptographic library get random number javascript const rand exonum randomuint64 exonum randomuint64 function generates a new random uint64 number of cryptographic quality using the tweetnacl tweetnacl random bytes cryptographic library converters hexadecimal to uint8array javascript const hex 674718178bd97d3ac5953d0d8e5649ea373c4d98b3b61befd5699800eaa8513b exonum hexadecimaltouint8array hex hexadecimal to string javascript const hex 674718178bd97d3ac5953d0d8e5649ea373c4d98b3b61befd5699800eaa8513b exonum hexadecimaltobinarystring hex uint8array to hexadecimal javascript const arr new uint8array 103 71 24 23 139 217 125 58 197 149 61 exonum uint8arraytohexadecimal arr uint8array to binary string javascript const arr new uint8array 103 71 24 23 139 217 125 58 197 149 61 exonum uint8arraytobinarystring arr binary string to uint8array javascript const str 0110011101000111000110000001011110001011110110010111110100111010 exonum binarystringtouint8array str binary string to hexadecimal javascript const str 0110011101000111000110000001011110001011110110010111110100111010 exonum binarystringtohexadecimal str string to uint8array javascript const str hello world exonum stringtouint8array str contributing the contributing to the exonum client is based on the same principles and rules as the contributing to exonum core contributing coding standards the coding standards are described in the eslintrc eslintrc json file to help developers define and maintain consistent coding styles between different editors and ides we used editorconfig editorconfig configuration file test coverage all functions must include relevant unit tests this applies to both of adding new features and fixing existed bugs changelog detailed changes for each release are documented in the changelog changelog md file other languages support light clients for java lc java and python lc python license exonum client is licensed under the apache license version 2 0 see license license for details docs clients https exonum com doc version latest architecture clients docs architecture services https exonum com doc version latest architecture services docs architecture serialization https exonum com doc version latest architecture serialization docs architecture serialization network id https exonum com doc version latest architecture serialization etwork id docs architecture serialization protocol version https exonum com doc version latest architecture serialization protocol version docs architecture serialization service id https exonum com doc version latest architecture serialization service id docs architecture serialization message id https exonum com doc version latest architecture serialization message id docs architecture transactions https exonum com doc version latest architecture transactions docs advanced merkelized list https exonum com doc version latest advanced merkelized list docs glossary digital signature https exonum com doc version latest glossary digital signature docs glossary hash https exonum com doc version latest glossary hash docs glossary blockchain state https exonum com doc version latest glossary blockchain state docs glossary merkle tree https exonum com doc version latest glossary merkle tree docs glossary validator https exonum com doc version latest glossary validator npmjs https www npmjs com package exonum client gitter https gitter im exonum exonum twitter https twitter com exonumplatform newsletter https exonum com newsletter contributing https exonum com doc version latest contributing is safe integer https developer mozilla org en us docs web javascript reference global objects number issafeinteger vector structure https doc rust lang org std vec struct vec html tweetnacl key pair https github com dchest tweetnacl js naclsignkeypair tweetnacl random bytes https github com dchest tweetnacl js random bytes generation protobuf https developers google com protocol buffers protobufjs https www npmjs com package protobufjs lc java https github com exonum exonum java binding tree master exonum light client lc python https github com exonum exonum python client | exonum blockchain cryptography ed25519 sha256 merkle-tree | blockchain |
xaya | xaya core https xaya io formerly chimaera what is xaya xaya core is a decentralized open source information registration and transfer system based on the namecoin cryptocurrency it is primarily aimed at decentralised blockchain gaming this is the first layer of the xaya platform what does it do securely record and transfer arbitrary names keys transact chi coins the digital currency chi store data allow creation of tokens and more what can xaya be used for along with some of namecoin s uses cases it is designed especially with gaming and complex decentralised applications in mind allow creation of game accounts create virtual game currencies asset storage ownership and management particularly for in game items identification and authentication systems dapps payment gateway decentralised autonomous worlds and much more more information xaya core includes the base elements required for experienced users miners and exchanges xayad xaya qt xaya cli and the usual you may check https github com xaya xaya tree master doc xaya for more xaya specs and additional information several tutorials and more information are here https github com xaya xaya tutorials wiki for compiling on ubuntu you can use this guide https forum xaya io topic 59 guide building xaya on ubuntu you can also follow any bitcoin guide just replace the repo with xaya s https github com bitcoin bitcoin blob master doc build windows md https github com bitcoin bitcoin blob master doc build unix md https github com bitcoin bitcoin blob master doc build osx md standard users should use the xaya electron wallet which is a front end to xayad here https github com xaya xaya electron for issues with xaya core you can create an issue in github or ask on the forum https forum xaya io | blockchain |
|
iotex-core | iotex core official golang implementation of the iotex protocol join the forum https img shields io badge discuss iotex 20community blue https community iotex io c research development protocol go version https img shields io badge go 1 18 5 blue svg https github com moovweb gvm go report card https goreportcard com badge github com iotexproject iotex core https goreportcard com report github com iotexproject iotex core coverage https codecov io gh iotexproject iotex core branch master graph badge svg https codecov io gh iotexproject iotex core godoc http img shields io badge go documentation blue svg style flat square https godoc org github com iotexproject iotex core releases https img shields io github release iotexproject iotex core all svg style flat square https github com iotexproject iotex core releases license https img shields io badge license apache 202 0 blue svg license a href https iotex io img src logo iotex png height 200px a welcome to the official go implementation of iotex protocol iotex is building the next generation of the decentralized blockchain protocol for powering real world information marketplace in a decentralized yet scalable way refer to iotex whitepaper https iotex io research for details a href https iotex io devdiscord target blank img src https github com iotexproject halogrants blob 880eea4af074b082a75608c7376bd7a8eaa1ac21 img btn discord svg height 36px a new to iotex please visit https iotex io official website or iotex onboard pack https onboard iotex io to learn more about iotex network run a delegate please visit iotex delegate manual https github com iotexproject iotex bootstrap for detailed setup process building the source code minimum requirements components version description golang https golang org ge 1 18 5 go programming language protoc https developers google com protocol buffers ge 3 6 0 protocol buffers required only when you rebuild protobuf messages compile download the code to your desired local location doesn t have to be under gopath src git clone git github com iotexproject iotex core git cd iotex core if you put the project code under your gopath src you will need to set up an environment variable export go111module on set go111module on for windows build the project for general purpose server ioctl by make build the project for broader purpose server ioctl injector by make all if the dependency needs to be updated run go get u go mod tidy if you want to learn more advanced usage about go mod you can find out here https github com golang go wiki modules run unit tests only by make test build the docker image by make docker run iotex core start or resume a standalone server to operate on a blockchain by make run restart the server from a clean state by make reboot if make run fails due to corrupted or missing state database while block database is in normal condition e g failing to get factory s height from underlying db please try to recover state database by make recover then make run again use cli users could interact with iotex blockchain by ioctl command refer to cli document https docs iotex io developer ioctl install html for more details contact mailing list iotex dev iotex dev iotex io dev forum forum https community iotex io c research development protocol bugs issues https github com iotexproject iotex core issues contribution we are glad to have contributors out of the core team contributions including but not limited to style bug fixes implementation of features proposals of schemes algorithms and thorough documentation are welcomed please refer to our contribution guideline contributing md for more information development guide documentation is here https github com iotexproject iotex core wiki developers 27 guide for any major protocol level changes we use iip https github com iotexproject iips to track the proposal decision and etc contributors thank you for considering contributing to the iotex framework a href https github com iotexproject iotex core graphs contributors img src https contrib rocks image repo iotexproject iotex core a license this project is licensed under the apache license 2 0 license | blockchain cryptography distributed-systems crypto internet-of-things internet-of-everything machinefi depin | server |
STM32F4-HAL-FreeRTOS | companion sources for series of blog posts on implementing freertos on stm32f429 discovery board with stm hal list of topics and links to individual pages is available on the main project page https blog shirtec com p blog page html part i setup blinky https blog shirtec com 2018 05 stm32 hal freertos part i setup blinky html deals with setting up development environment on linux as well as windows to get basic blinker task off the ground part ii uart https blog shirtec com 2018 05 stm32 hal freertos part ii uart html is about the simplest ish way of implementing feedback from the devboard via uart in blocking mode part iii spi in blocking mode https blog shirtec com 2018 05 stm32 hal freertos part iii spi blocking html connects to the on board l3gd20 gyroscope and fetches its who am i register in blocking mode part iv ide eclipse setup https blog shirtec com 2018 05 stm32 hal freertos part iv ide eclipse html briefly shows how to import makefile project into eclipse and how to get it to recognize all the symbols used so that actual errors don t get swamped out by eclipse bitching about symbols part v spi with dma https blog shirtec com 2018 06 stm32 hal freertos part v spi with dma html shows how to set up dma communications as well as use of queue to pass a custom data structure from interrupts and code to a separate processing task | os |
|
paho.mqtt.android | eclipse paho android service build status https travis ci org eclipse paho mqtt android svg branch master https travis ci org eclipse paho mqtt android the paho android service is an mqtt client library written in java for developing applications on android features mqtt 3 1 heavy check mark automatic reconnect heavy check mark mqtt 3 1 1 heavy check mark offline buffering heavy check mark lwt heavy check mark websocket support heavy check mark ssl tls heavy check mark standard tcp support heavy check mark message persistence heavy check mark to get started download android studio http developer android com tools studio index html you will also need to download the android sdk https developer android com sdk installing adding packages html currently you will need the sdk for 24 project description the paho project has been created to provide reliable open source implementations of open and standard messaging protocols aimed at new existing and emerging applications for machine to machine m2m and internet of things iot paho reflects the inherent physical and cost constraints of device connectivity its objectives include effective levels of decoupling between devices and applications designed to keep markets open and encourage the rapid growth of scalable web and enterprise middleware and applications links project website https www eclipse org paho https www eclipse org paho eclipse project information https projects eclipse org projects iot paho https projects eclipse org projects iot paho paho android client page https www eclipse org paho clients android https www eclipse org paho clients android github https github com eclipse paho mqtt android https github com eclipse paho mqtt android twitter eclipsepaho https twitter com eclipsepaho issues https github com eclipse paho mqtt android issues https github com eclipse paho mqtt android issues mailing list https dev eclipse org mailman listinfo paho dev https dev eclipse org mailman listinfo paho dev using the paho android client downloading maven eclipse hosts a nexus repository for those who want to use maven to manage their dependencies add the repository definition and the dependency definition shown below to your pom xml replace repourl with either https repo eclipse org content repositories paho releases for the official releases or https repo eclipse org content repositories paho snapshots for the nightly snapshots replace version with the level required the latest release version is 1 1 1 and the current snapshot version is 1 1 2 snapshot project repositories repository id eclipse paho repo id url repourl url repository repositories dependencies dependency groupid org eclipse paho groupid artifactid org eclipse paho android service artifactid version version version dependency dependencies project gradle if you are using android studio and or gradle to manage your application dependencies and build then you can use the same repository to get the paho android service add the eclipse maven repository to your build gradle file and then add the paho dependency to the dependencies section repositories maven url https repo eclipse org content repositories paho snapshots dependencies compile org eclipse paho org eclipse paho client mqttv3 1 1 0 compile org eclipse paho org eclipse paho android service 1 1 1 note currently you have to include the org eclipse paho org eclipse paho client mqttv3 dependency as well we are attempting to get the build to produce an android aar file that contains both the android service as well as it s dependencies however this is still experimental if you wish to try it remove the org eclipse paho org eclipse paho client mqttv3 dependency and append aar to the end of the android service dependency e g org eclipse paho org eclipse paho android service 1 1 1 aar if you find that there is functionality missing or bugs in the release version you may want to try using the snapshot version to see if this helps before raising a feature request or an issue building from source open a terminal and navigate to this directory org eclipse paho android service run the command gradlew clean assemble exportjar or on windows gradlew bat clean assemble exportjar running the sample app open the this current directory in android studio org eclipse paho android service in the toolbar along the top there should be a dropdown menu make sure that it contains org eclipse android sample then click the green run triangle it should now build and launch an virtual android device to run the app if you have an android device with developer mode turned on plugged in you will have the oppertunity to run it directly on that if you have any problems check out the android developer documentation for help https developer android com | eclipseiot mqtt iot internet-of-things | server |
SQL_corporation_employee_research | sql corporation employee research it is a beautiful spring day and it is two weeks since you have been hired as a new data engineer at pewlett hackard your first major task is a research project on employees of the corporation from the 1980s and 1990s all that remain of the database of employees from that period are six csv files in this assignment you will design the tables to hold data in the csvs import the csvs into a sql database and answer questions about the data in other words you will perform data engineering data analysis data modeling inspected the csvs and sketched a erd of the tables data engineering created a table schema for each of the six csvs specifying data types primary keys foreign keys and other constraints then imported each csv file into their corresponding sql table bonus analysis as you examine the data you are overcome with a creeping suspicion that the dataset is fake you surmise that your boss handed you spurious data in order to test the data engineering skills of a new employee to confirm your hunch you decide to take the following steps to generate a visualization of the data with which you will confront your boss imported the sql database into pandas created a histogram to visualize the most common salary ranges for employees created a bar chart of average salary by title | server |
|
EnhancedFreeRTOS | enhancedfreertos this is a visual studio project the windows simulator and the kernel of freertos this is a modified version of freertos kernel with an enhanced scheduler an adaptive scheduler for the freertos based on ant colony optimisation the project aims to introduce ant colony optimisation into the scheduler of freertos such that during overloaded situation the scheduling strategy can be switched ant colony optimisation is an evolutionary algorithm developed after studying the colony of ants which gives us an algorithm that can be used to compute shortest distance path with relative ease the optimisation will be used in the adaptive scheduler which will select the default strategy or the ant colony optimisation based on the loading factor of the system along with this has been created a dummy test environment to analyse the performance of each scheduler problem with traditional scheduler suppose we have n task with high priority and one task with priority value one less than that of the n tasks because freertos uses priority based preemptive scheduling only the high priority tasks will get executed even though the difference in priority is not that much this will lead to starvation for low priority task this is a major drawback of freertos scheduler other than above if we have a task with high priority and takes a lot of time for completion and we have n other tasks which have less time of execution the default scheduler will schedule only the high priority one this will lead to the increase in the average completion time of all the tasks because the low priority tasks have to wait longer this is also another major drawback of the scheduler adaptive scheduler adaptive scheduling basically means the scheduling algorithm will adapt to the changes in system depending upon the load we will either use traditional or aco algorithm the scheduling algorithm is required to execute when a new task arrives or presently running task completes 1 construct tour of different ants and produce the task execution sequence 2 analyze the task execution sequences generated for available number of processors one in our case 3 update the value of pheromone 4 decide probability of each task and select the task with maximum priority for execu tion project set up all additional code is hosted within the macro definition use aco example c if use aco 1 do stuff endif tools and requirements microsoft visual c visual studio compatible with 2013 we used 2017 while developing it tuning the algorithm constants are defined in file freertosconfig h c define aco debug 1 define aco paths 2 define aco pheromone init value 1 define acohvalue 12 define acoalpha 1 define acobeta 1 define acopheromone const 0 2 define acopheromone evaporation const 0 4 define performance coeffecient priority 1 define performance coeffecient wait time 1 define performance coeffecient rank 1 define priority deadline multiplier 20 define minimum deadline 200 define deadline constant configmax priorities priority deadline multiplier minimum deadline vary the values to get different results | freertos adaptive-scheduling scheduling operating-system kernel optimisation os | os |
IoTGoat | official repo has been moved to https github com owasp iotgoat p align center img src images vertical blue logo png alt iotgoat width 250 height 350 p description the iotgoat project is a deliberately insecure firmware based on openwrt https openwrt org and maintained by owasp http owasp org to educate users how to test for the most common vulnerabilities found in iot devices the vulnerability challenges are based on the owasp iot top 10 noted below as well as easter eggs from project contributors for a list of vulnerability challenges see the iotgoat challenges wiki https github com scriptingxss iotgoat wiki iotgoat challenges page owasp iot top 10 2018 https www owasp org images 1 1c owasp iot top 10 2018 final pdf description i1 weak guessable or hardcoded passwords use of easily bruteforced publicly available or unchangeable credentials including backdoors in firmware or client software that grants unauthorized access to deployed systems i2 insecure network services unneeded or insecure network services running on the device itself especially those exposed to the internet that compromise the confidentiality integrity authenticity or availability of information or allow unauthorized remote control i3 insecure ecosystem interfaces insecure web backend api cloud or mobile interfaces in the ecosystem outside of the device that allows compromise of the device or its related components common issues include a lack of authentication authorization lacking or weak encryption and a lack of input and output filtering i4 lack of secure update mechanism lack of ability to securely update the device this includes lack of firmware validation on device lack of secure delivery un encrypted in transit lack of anti rollback mechanisms and lack of notifications of security changes due to updates i5 use of insecure or outdated components use of deprecated or insecure software components libraries that could allow the device to be compromised this includes insecure customization of operating system platforms and the use of third party software or hardware components from a compromised supply chain i6 insufficient privacy protection user s personal information stored on the device or in the ecosystem that is used insecurely improperly or without permission i7 insecure data transfer and storage lack of encryption or access control of sensitive data anywhere within the ecosystem including at rest in transit or during processing i8 lack of device management lack of security support on devices deployed in production including asset management update management secure decommissioning systems monitoring and response capabilities i9 insecure default settings devices or systems shipped with insecure default settings or lack the ability to make the system more secure by restricting operators from modifying configurations i10 lack of physical hardening lack of physical hardening measures allowing potential attackers to gain sensitive information that can help in a future remote attack or take local control of the device getting started several methods exist to get started with hacking iotgoat 1 for those looking to extract the filesystem analyze configurations and binaries statically download the latest precompiled firmware release from https github com scriptingxss iotgoat releases refer to owasp s firmware security testing methodology https github com scriptingxss owasp fstm to help with identifying vulnerabilities 2 for dynamic web testing and binary runtime analysis the quickest way to get started is downloading the latest iotgoat x86 vmdk vmware https github com scriptingxss iotgoat releases and create a custom virtual machine using the iotgoat disk image refer to owasp s web security testing guide https github com owasp wstg tree master document and asvs https github com owasp asvs projects for additional guidance on identifying web application vulnerabilities 3 emulate firmware with opensource tools e g firmadyne https github com firmadyne firmadyne and fat https github com attify firmware analysis toolkit that leverage qemu to virtualize iotgoat locally 4 use the iotgoat raspberry pi2 sysupgrade img firmware to flash on a raspberry pi 2 brcm2708 brcm2709 building from source openwrt can build many different cpu platforms and boards building from source gives users the flexibility to flash iotgoat on supported openwrt hardware ensure 10 15gb disk space is available with at least 4gb of ram and a supported linux distribution such as ubuntu 18 04 https openwrt org docs guide developer build system install buildsystem use the following steps to get started with building custom firmware do everything as a normal user don t use root user or sudo when building https openwrt org docs guide developer build system use buildsystem git clone https github com scriptingxss iotgoat git cd iotgoat openwrt openwrt 18 06 2 scripts feeds update a scripts feeds install a make menuconfig select your preferred configuration for the toolchain target system firmware packages make build your firmware with make this will download all sources build the cross compile toolchain and then cross compile the linux kernel all chosen applications for your target system the first build will take some time to complete and will vary based on the provided internet connection for downloading the toolchain once a successful build is complete the compiled firmware will be placed in the following directory iotgoat openwrt openwrt 18 06 2 bin targets depending on the target selected in menuconfig for example iotgoat raspberry pi 2 firmware will be located in the following directory iotgoat openwrt openwrt 18 06 2 bin targets brcm2708 bcm2709 iotgoat build configuration files are made availble for x86 config x86 and raspberry pi 2 config rpi platforms project leaders aaron guzman scriptingxss fotios chantzis paulino calderon contributors parag mhatre paraaagggg abhinav mohanty cyanide284 jason andress jandress 0x48piraj license the mit license mit license md | server |
|
Time-and-Attendance-Management-System | to run this project first download this repository go to cmd then go to the project directory and type npm install to install the node packages now download the php rest api for tms repository from my git and copy the php slim api and api folders to htdocs folder of the xampp server now again go to project directory in cmd and type npm start to start the project this project was bootstrapped with create react app https github com facebookincubator create react app below you will find some information on how to perform common tasks br you can find the most recent version of this guide here https github com facebookincubator create react app blob master packages react scripts template readme md table of contents updating to new releases updating to new releases sending feedback sending feedback folder structure folder structure available scripts available scripts npm start npm start npm test npm test npm run build npm run build npm run eject npm run eject supported browsers supported browsers supported language features and polyfills supported language features and polyfills syntax highlighting in the editor syntax highlighting in the editor displaying lint output in the editor displaying lint output in the editor debugging in the editor debugging in the editor formatting code automatically formatting code automatically changing the page title changing the page title installing a dependency installing a dependency importing a component importing a component code splitting code splitting adding a stylesheet adding a stylesheet post processing css post processing css adding a css preprocessor sass less etc adding a css preprocessor sass less etc adding images fonts and files adding images fonts and files using the public folder using the public folder changing the html changing the html adding assets outside of the module system adding assets outside of the module system when to use the public folder when to use the public folder using global variables using global variables adding bootstrap adding bootstrap using a custom theme using a custom theme adding flow adding flow adding a router adding a router adding custom environment variables adding custom environment variables referencing environment variables in the html referencing environment variables in the html adding temporary environment variables in your shell adding temporary environment variables in your shell adding development environment variables in env adding development environment variables in env can i use decorators can i use decorators fetching data with ajax requests fetching data with ajax requests integrating with an api backend integrating with an api backend node node ruby on rails ruby on rails proxying api requests in development proxying api requests in development invalid host header errors after configuring proxy invalid host header errors after configuring proxy configuring the proxy manually configuring the proxy manually configuring a websocket proxy configuring a websocket proxy using https in development using https in development generating dynamic meta tags on the server generating dynamic meta tags on the server pre rendering into static html files pre rendering into static html files injecting data from the server into the page injecting data from the server into the page running tests running tests filename conventions filename conventions command line interface command line interface version control integration version control integration writing tests writing tests testing components testing components using third party assertion libraries using third party assertion libraries initializing test environment initializing test environment focusing and excluding tests focusing and excluding tests coverage reporting coverage reporting continuous integration continuous integration disabling jsdom disabling jsdom snapshot testing snapshot testing editor integration editor integration debugging tests debugging tests debugging tests in chrome debugging tests in chrome debugging tests in visual studio code debugging tests in visual studio code developing components in isolation developing components in isolation getting started with storybook getting started with storybook getting started with styleguidist getting started with styleguidist publishing components to npm publishing components to npm making a progressive web app making a progressive web app opting out of caching opting out of caching offline first considerations offline first considerations progressive web app metadata progressive web app metadata analyzing the bundle size analyzing the bundle size deployment deployment static server static server other solutions other solutions serving apps with client side routing serving apps with client side routing building for relative paths building for relative paths azure azure firebase firebase github pages github pages heroku heroku netlify netlify now now s3 and cloudfront s3 and cloudfront surge surge advanced configuration advanced configuration troubleshooting troubleshooting npm start doesn t detect changes npm start doesnt detect changes npm test hangs on macos sierra npm test hangs on macos sierra npm run build exits too early npm run build exits too early npm run build fails on heroku npm run build fails on heroku npm run build fails to minify npm run build fails to minify moment js locales are missing momentjs locales are missing alternatives to ejecting alternatives to ejecting something missing something missing updating to new releases create react app is divided into two packages create react app is a global command line utility that you use to create new projects react scripts is a development dependency in the generated projects including this one you almost never need to update create react app itself it delegates all the setup to react scripts when you run create react app it always creates the project with the latest version of react scripts so you ll get all the new features and improvements in newly created apps automatically to update an existing project to a new version of react scripts open the changelog https github com facebookincubator create react app blob master changelog md find the version you re currently on check package json in this folder if you re not sure and apply the migration instructions for the newer versions in most cases bumping the react scripts version in package json and running npm install in this folder should be enough but it s good to consult the changelog https github com facebookincubator create react app blob master changelog md for potential breaking changes we commit to keeping the breaking changes minimal so you can upgrade react scripts painlessly sending feedback we are always open to your feedback https github com facebookincubator create react app issues folder structure after creation your project should look like this my app readme md node modules package json public index html favicon ico src app css app js app test js index css index js logo svg for the project to build these files must exist with exact filenames public index html is the page template src index js is the javascript entry point you can delete or rename the other files you may create subdirectories inside src for faster rebuilds only files inside src are processed by webpack br you need to put any js and css files inside src otherwise webpack won t see them only files inside public can be used from public index html br read instructions below for using assets from javascript and html you can however create more top level directories br they will not be included in the production build so you can use them for things like documentation available scripts in the project directory you can run npm start runs the app in the development mode br open http localhost 3000 http localhost 3000 to view it in the browser the page will reload if you make edits br you will also see any lint errors in the console npm test launches the test runner in the interactive watch mode br see the section about running tests running tests for more information npm run build builds the app for production to the build folder br it correctly bundles react in production mode and optimizes the build for the best performance the build is minified and the filenames include the hashes br your app is ready to be deployed see the section about deployment deployment for more information npm run eject note this is a one way operation once you eject you can t go back if you aren t satisfied with the build tool and configuration choices you can eject at any time this command will remove the single build dependency from your project instead it will copy all the configuration files and the transitive dependencies webpack babel eslint etc right into your project so you have full control over them all of the commands except eject will still work but they will point to the copied scripts so you can tweak them at this point you re on your own you don t have to ever use eject the curated feature set is suitable for small and middle deployments and you shouldn t feel obligated to use this feature however we understand that this tool wouldn t be useful if you couldn t customize it when you are ready for it supported browsers by default the generated project uses the latest version of react you can refer to the react documentation https reactjs org docs react dom html browser support for more information about supported browsers supported language features and polyfills this project supports a superset of the latest javascript standard br in addition to es6 https github com lukehoban es6features syntax features it also supports exponentiation operator https github com rwaldron exponentiation operator es2016 async await https github com tc39 ecmascript asyncawait es2017 object rest spread properties https github com sebmarkbage ecmascript rest spread stage 3 proposal dynamic import https github com tc39 proposal dynamic import stage 3 proposal class fields and static properties https github com tc39 proposal class public fields part of stage 3 proposal jsx https facebook github io react docs introducing jsx html and flow https flowtype org syntax learn more about different proposal stages https babeljs io docs plugins presets stage x experimental presets while we recommend using experimental proposals with some caution facebook heavily uses these features in the product code so we intend to provide codemods https medium com cpojer effective javascript codemods 5a6686bb46fb if any of these proposals change in the future note that the project only includes a few es6 polyfills https en wikipedia org wiki polyfill object assign https developer mozilla org en docs web javascript reference global objects object assign via object assign https github com sindresorhus object assign promise https developer mozilla org en us docs web javascript reference global objects promise via promise https github com then promise fetch https developer mozilla org en docs web api fetch api via whatwg fetch https github com github fetch if you use any other es6 features that need runtime support such as array from or symbol make sure you are including the appropriate polyfills manually or that the browsers you are targeting already support them also note that using some newer syntax features like for of or nonarrayvalue causes babel to emit code that depends on es6 runtime features and might not work without a polyfill when in doubt use babel repl https babeljs io repl to see what any specific syntax compiles down to syntax highlighting in the editor to configure the syntax highlighting in your favorite text editor head to the relevant babel documentation page https babeljs io docs editors and follow the instructions some of the most popular editors are covered displaying lint output in the editor note this feature is available with react scripts 0 2 0 and higher br it also only works with npm 3 or higher some editors including sublime text atom and visual studio code provide plugins for eslint they are not required for linting you should see the linter output right in your terminal as well as the browser console however if you prefer the lint results to appear right in your editor there are some extra steps you can do you would need to install an eslint plugin for your editor first then add a file called eslintrc to the project root js extends react app now your editor should report the linting warnings note that even if you edit your eslintrc file further these changes will only affect the editor integration they won t affect the terminal and in browser lint output this is because create react app intentionally provides a minimal set of rules that find common mistakes if you want to enforce a coding style for your project consider using prettier https github com jlongster prettier instead of eslint style rules debugging in the editor this feature is currently only supported by visual studio code https code visualstudio com and webstorm https www jetbrains com webstorm visual studio code and webstorm support debugging out of the box with create react app this enables you as a developer to write and debug your react code without leaving the editor and most importantly it enables you to have a continuous development workflow where context switching is minimal as you don t have to switch between tools visual studio code you would need to have the latest version of vs code https code visualstudio com and vs code chrome debugger extension https marketplace visualstudio com items itemname msjsdiag debugger for chrome installed then add the block below to your launch json file and put it inside the vscode folder in your app s root directory json version 0 2 0 configurations name chrome type chrome request launch url http localhost 3000 webroot workspaceroot src sourcemappathoverrides webpack src webroot note the url may be different if you ve made adjustments via the host or port environment variables advanced configuration start your app by running npm start and start debugging in vs code by pressing f5 or by clicking the green debug icon you can now write code set breakpoints make changes to the code and debug your newly modified code all from your editor having problems with vs code debugging please see their troubleshooting guide https github com microsoft vscode chrome debug blob master readme md troubleshooting webstorm you would need to have webstorm https www jetbrains com webstorm and jetbrains ide support https chrome google com webstore detail jetbrains ide support hmhgeddbohgjknpmjagkdomcpobmllji chrome extension installed in the webstorm menu run select edit configurations then click and select javascript debug paste http localhost 3000 into the url field and save the configuration note the url may be different if you ve made adjustments via the host or port environment variables advanced configuration start your app by running npm start then press d on macos or f9 on windows and linux or click the green debug icon to start debugging in webstorm the same way you can debug your application in intellij idea ultimate phpstorm pycharm pro and rubymine formatting code automatically prettier is an opinionated code formatter with support for javascript css and json with prettier you can format the code you write automatically to ensure a code style within your project see the prettier s github page https github com prettier prettier for more information and look at this page to see it in action https prettier github io prettier to format our code whenever we make a commit in git we need to install the following dependencies sh npm install save husky lint staged prettier alternatively you may use yarn sh yarn add husky lint staged prettier husky makes it easy to use githooks as if they are npm scripts lint staged allows us to run scripts on staged files in git see this blog post about lint staged to learn more about it https medium com okonetchnikov make linting great again f3890e1ad6b8 prettier is the javascript formatter we will run before commits now we can make sure every file is formatted correctly by adding a few lines to the package json in the project root add the following line to scripts section diff scripts precommit lint staged start react scripts start build react scripts build next we add a lint staged field to the package json for example diff dependencies lint staged src js jsx json css prettier single quote write git add scripts now whenever you make a commit prettier will format the changed files automatically you can also run node modules bin prettier single quote write src js jsx json css to format your entire project for the first time next you might want to integrate prettier in your favorite editor read the section on editor integration https prettier io docs en editors html on the prettier github page changing the page title you can find the source html file in the public folder of the generated project you may edit the title tag in it to change the title from react app to anything else note that normally you wouldn t edit files in the public folder very often for example adding a stylesheet adding a stylesheet is done without touching the html if you need to dynamically update the page title based on the content you can use the browser document title https developer mozilla org en us docs web api document title api for more complex scenarios when you want to change the title from react components you can use react helmet https github com nfl react helmet a third party library if you use a custom server for your app in production and want to modify the title before it gets sent to the browser you can follow advice in this section generating dynamic meta tags on the server alternatively you can pre build each page as a static html file which then loads the javascript bundle which is covered here pre rendering into static html files installing a dependency the generated project includes react and reactdom as dependencies it also includes a set of scripts used by create react app as a development dependency you may install other dependencies for example react router with npm sh npm install save react router alternatively you may use yarn sh yarn add react router this works for any library not just react router importing a component this project setup supports es6 modules thanks to babel br while you can still use require and module exports we encourage you to use import and export http exploringjs com es6 ch modules html instead for example button js js import react component from react class button extends component render export default button don t forget to use export default dangerbutton js js import react component from react import button from button import a component from another file class dangerbutton extends component render return button color red export default dangerbutton be aware of the difference between default and named exports http stackoverflow com questions 36795819 react native es 6 when should i use curly braces for import 36796281 36796281 it is a common source of mistakes we suggest that you stick to using default imports and exports when a module only exports a single thing for example a component that s what you get when you use export default button and import button from button named exports are useful for utility modules that export several functions a module may have at most one default export and as many named exports as you like learn more about es6 modules when to use the curly braces http stackoverflow com questions 36795819 react native es 6 when should i use curly braces for import 36796281 36796281 exploring es6 modules http exploringjs com es6 ch modules html understanding es6 modules https leanpub com understandinges6 read leanpub auto encapsulating code with modules code splitting instead of downloading the entire app before users can use it code splitting allows you to split your code into small chunks which you can then load on demand this project setup supports code splitting via dynamic import http 2ality com 2017 01 import operator html loading code on demand its proposal https github com tc39 proposal dynamic import is in stage 3 the import function like form takes the module name as an argument and returns a promise https developer mozilla org en us docs web javascript reference global objects promise which always resolves to the namespace object of the module here is an example modulea js js const modulea hello export modulea app js js import react component from react class app extends component handleclick import modulea then modulea use modulea catch err handle failure render return div button onclick this handleclick load button div export default app this will make modulea js and all its unique dependencies as a separate chunk that only loads after the user clicks the load button you can also use it with async await syntax if you prefer it with react router if you are using react router check out this tutorial http serverless stack com chapters code splitting in create react app html on how to use code splitting with it you can find the companion github repository here https github com anomalyinnovations serverless stack demo client tree code splitting in create react app also check out the code splitting https reactjs org docs code splitting html section in react documentation adding a stylesheet this project setup uses webpack https webpack js org for handling all assets webpack offers a custom way of extending the concept of import beyond javascript to express that a javascript file depends on a css file you need to import the css from the javascript file button css css button padding 20px button js js import react component from react import button css tell webpack that button js uses these styles class button extends component render you can use them as regular css styles return div classname button this is not required for react but many people find this feature convenient you can read about the benefits of this approach here https medium com seek ui engineering block element modifying your javascript components d7f99fcab52b however you should be aware that this makes your code less portable to other build tools and environments than webpack in development expressing dependencies this way allows your styles to be reloaded on the fly as you edit them in production all css files will be concatenated into a single minified css file in the build output if you are concerned about using webpack specific semantics you can put all your css right into src index css it would still be imported from src index js but you could always remove that import if you later migrate to a different build tool post processing css this project setup minifies your css and adds vendor prefixes to it automatically through autoprefixer https github com postcss autoprefixer so you don t need to worry about it for example this css app display flex flex direction row align items center becomes this css app display webkit box display ms flexbox display flex webkit box orient horizontal webkit box direction normal ms flex direction row flex direction row webkit box align center ms flex align center align items center if you need to disable autoprefixing for some reason follow this section https github com postcss autoprefixer disabling adding a css preprocessor sass less etc generally we recommend that you don t reuse the same css classes across different components for example instead of using a button css class in acceptbutton and rejectbutton components we recommend creating a button component with its own button styles that both acceptbutton and rejectbutton can render but not inherit https facebook github io react docs composition vs inheritance html following this rule often makes css preprocessors less useful as features like mixins and nesting are replaced by component composition you can however integrate a css preprocessor if you find it valuable in this walkthrough we will be using sass but you can also use less or another alternative first let s install the command line interface for sass sh npm install save node sass chokidar alternatively you may use yarn sh yarn add node sass chokidar then in package json add the following lines to scripts diff scripts build css node sass chokidar src o src watch css npm run build css node sass chokidar src o src watch recursive start react scripts start build react scripts build test react scripts test env jsdom note to use a different preprocessor replace build css and watch css commands according to your preprocessor s documentation now you can rename src app css to src app scss and run npm run watch css the watcher will find every sass file in src subdirectories and create a corresponding css file next to it in our case overwriting src app css since src app js still imports src app css the styles become a part of your application you can now edit src app scss and src app css will be regenerated to share variables between sass files you can use sass imports for example src app scss and other component style files could include import shared scss with variable definitions to enable importing files without using relative paths you can add the include path option to the command in package json build css node sass chokidar include path src include path node modules src o src watch css npm run build css node sass chokidar include path src include path node modules src o src watch recursive this will allow you to do imports like scss import styles colors scss assuming a styles directory under src import nprogress nprogress importing a css file from the nprogress node module at this point you might want to remove all css files from the source control and add src css to your gitignore file it is generally a good practice to keep the build products outside of the source control as a final step you may find it convenient to run watch css automatically with npm start and run build css as a part of npm run build you can use the operator to execute two scripts sequentially however there is no cross platform way to run two scripts in parallel so we will install a package for this sh npm install save npm run all alternatively you may use yarn sh yarn add npm run all then we can change start and build scripts to include the css preprocessor commands diff scripts build css node sass chokidar src o src watch css npm run build css node sass chokidar src o src watch recursive start react scripts start build react scripts build start js react scripts start start npm run all p watch css start js build js react scripts build build npm run all build css build js test react scripts test env jsdom eject react scripts eject now running npm start and npm run build also builds sass files why node sass chokidar node sass has been reported as having the following issues node sass watch has been reported to have performance issues in certain conditions when used in a virtual machine or with docker infinite styles compiling 1939 https github com facebookincubator create react app issues 1939 node sass has been reported as having issues with detecting new files in a directory 1891 https github com sass node sass issues 1891 node sass chokidar is used here as it addresses these issues adding images fonts and files with webpack using static assets like images and fonts works similarly to css you can import a file right in a javascript module this tells webpack to include that file in the bundle unlike css imports importing a file gives you a string value this value is the final path you can reference in your code e g as the src attribute of an image or the href of a link to a pdf to reduce the number of requests to the server importing images that are less than 10 000 bytes returns a data uri https developer mozilla org en us docs web http basics of http data uris instead of a path this applies to the following file extensions bmp gif jpg jpeg and png svg files are excluded due to 1153 https github com facebookincubator create react app issues 1153 here is an example js import react from react import logo from logo png tell webpack this js file uses this image console log logo logo 84287d09 png function header import result is the url of your image return img src logo alt logo export default header this ensures that when the project is built webpack will correctly move the images into the build folder and provide us with correct paths this works in css too css logo background image url logo png webpack finds all relative module references in css they start with and replaces them with the final paths from the compiled bundle if you make a typo or accidentally delete an important file you will see a compilation error just like when you import a non existent javascript module the final filenames in the compiled bundle are generated by webpack from content hashes if the file content changes in the future webpack will give it a different name in production so you don t need to worry about long term caching of assets please be advised that this is also a custom feature of webpack it is not required for react but many people enjoy it and react native uses a similar mechanism for images br an alternative way of handling static assets is described in the next section using the public folder note this feature is available with react scripts 0 5 0 and higher changing the html the public folder contains the html file so you can tweak it for example to set the page title changing the page title the script tag with the compiled code will be added to it automatically during the build process adding assets outside of the module system you can also add other assets to the public folder note that we normally encourage you to import assets in javascript files instead for example see the sections on adding a stylesheet adding a stylesheet and adding images and fonts adding images fonts and files this mechanism provides a number of benefits scripts and stylesheets get minified and bundled together to avoid extra network requests missing files cause compilation errors instead of 404 errors for your users result filenames include content hashes so you don t need to worry about browsers caching their old versions however there is an escape hatch that you can use to add an asset outside of the module system if you put a file into the public folder it will not be processed by webpack instead it will be copied into the build folder untouched to reference assets in the public folder you need to use a special variable called public url inside index html you can use it like this html link rel shortcut icon href public url favicon ico only files inside the public folder will be accessible by public url prefix if you need to use a file from src or node modules you ll have to copy it there to explicitly specify your intention to make this file a part of the build when you run npm run build create react app will substitute public url with a correct absolute path so your project works even if you use client side routing or host it at a non root url in javascript code you can use process env public url for similar purposes js render note this is an escape hatch and should be used sparingly normally we recommend using import for getting asset urls as described in adding images and fonts above this section return img src process env public url img logo png keep in mind the downsides of this approach none of the files in public folder get post processed or minified missing files will not be called at compilation time and will cause 404 errors for your users result filenames won t include content hashes so you ll need to add query arguments or rename them every time they change when to use the public folder normally we recommend importing stylesheets adding a stylesheet images and fonts adding images fonts and files from javascript the public folder is useful as a workaround for a number of less common cases you need a file with a specific name in the build output such as manifest webmanifest https developer mozilla org en us docs web manifest you have thousands of images and need to dynamically reference their paths you want to include a small script like pace js http github hubspot com pace docs welcome outside of the bundled code some library may be incompatible with webpack and you have no other option but to include it as a script tag note that if you add a script that declares global variables you also need to read the next section on using them using global variables when you include a script in the html file that defines global variables and try to use one of these variables in the code the linter will complain because it cannot see the definition of the variable you can avoid this by reading the global variable explicitly from the window object for example js const window this makes it obvious you are using a global variable intentionally rather than because of a typo alternatively you can force the linter to ignore any line by adding eslint disable line after it adding bootstrap you don t have to use react bootstrap https react bootstrap github io together with react but it is a popular library for integrating bootstrap with react apps if you need it you can integrate it with create react app by following these steps install react bootstrap and bootstrap from npm react bootstrap does not include bootstrap css so this needs to be installed as well sh npm install save react bootstrap bootstrap 3 alternatively you may use yarn sh yarn add react bootstrap bootstrap 3 import bootstrap css and optionally bootstrap theme css in the beginning of your src index js file js import bootstrap dist css bootstrap css import bootstrap dist css bootstrap theme css put any other imports below so that css from your components takes precedence over default styles import required react bootstrap components within src app js file or your custom component files js import navbar jumbotron button from react bootstrap now you are ready to use the imported react bootstrap components within your component hierarchy defined in the render method here is an example app js https gist githubusercontent com gaearon 85d8c067f6af1e56277c82d19fd4da7b raw 6158dd991b67284e9fc8d70b9d973efe87659d72 app js redone using react bootstrap using a custom theme sometimes you might need to tweak the visual styles of bootstrap or equivalent package br we suggest the following approach create a new package that depends on the package you wish to customize e g bootstrap add the necessary build steps to tweak the theme and publish your package on npm install your own theme npm package as a dependency of your app here is an example of adding a customized bootstrap https medium com tacomanator customizing create react app aa9ffb88165 that follows these steps adding flow flow is a static type checker that helps you write code with fewer bugs check out this introduction to using static types in javascript https medium com preethikasireddy why use static types in javascript part 1 8382da1e0adb if you are new to this concept recent versions of flow http flowtype org work with create react app projects out of the box to add flow to a create react app project follow these steps 1 run npm install save flow bin or yarn add flow bin 2 add flow flow to the scripts section of your package json 3 run npm run flow init or yarn flow init to create a flowconfig file https flowtype org docs advanced configuration html in the root directory 4 add flow to any files you want to type check for example to src app js now you can run npm run flow or yarn flow to check the files for type errors you can optionally use an ide like nuclide https nuclide io docs languages flow for a better integrated experience in the future we plan to integrate it into create react app even more closely to learn more about flow check out its documentation https flowtype org adding a router create react app doesn t prescribe a specific routing solution but react router https reacttraining com react router is the most popular one to add it run sh npm install save react router dom alternatively you may use yarn sh yarn add react router dom to try it delete all the code in src app js and replace it with any of the examples on its website the basic example https reacttraining com react router web example basic is a good place to get started note that you may need to configure your production server to support client side routing serving apps with client side routing before deploying your app adding custom environment variables note this feature is available with react scripts 0 2 3 and higher your project can consume variables declared in your environment as if they were declared locally in your js files by default you will have node env defined for you and any other environment variables starting with react app the environment variables are embedded during the build time since create react app produces a static html css js bundle it can t possibly read them at runtime to read them at runtime you would need to load html into memory on the server and replace placeholders in runtime just like described here injecting data from the server into the page alternatively you can rebuild the app on the server anytime you change them note you must create custom environment variables beginning with react app any other variables except node env will be ignored to avoid accidentally exposing a private key on the machine that could have the same name https github com facebookincubator create react app issues 865 issuecomment 252199527 changing any environment variables will require you to restart the development server if it is running these environment variables will be defined for you on process env for example having an environment variable named react app secret code will be exposed in your js as process env react app secret code there is also a special built in environment variable called node env you can read it from process env node env when you run npm start it is always equal to development when you run npm test it is always equal to test and when you run npm run build to make a production bundle it is always equal to production you cannot override node env manually this prevents developers from accidentally deploying a slow development build to production these environment variables can be useful for displaying information conditionally based on where the project is deployed or consuming sensitive data that lives outside of version control first you need to have environment variables defined for example let s say you wanted to consume a secret defined in the environment inside a form jsx render return div small you are running this application in b process env node env b mode small form input type hidden defaultvalue process env react app secret code form div during the build process env react app secret code will be replaced with the current value of the react app secret code environment variable remember that the node env variable will be set for you automatically when you load the app in the browser and inspect the input you will see its value set to abcdef and the bold text will show the environment provided when using npm start html div small you are running this application in b development b mode small form input type hidden value abcdef form div the above form is looking for a variable called react app secret code from the environment in order to consume this value we need to have it defined in the environment this can be done using two ways either in your shell or in a env file both of these ways are described in the next few sections having access to the node env is also useful for performing actions conditionally js if process env node env production analytics disable when you compile the app with npm run build the minification step will strip out this condition and the resulting bundle will be smaller referencing environment variables in the html note this feature is available with react scripts 0 9 0 and higher you can also access the environment variables starting with react app in the public index html for example html title react app website name title note that the caveats from the above section apply apart from a few built in variables node env and public url variable names must start with react app to work the environment variables are injected at build time if you need to inject them at runtime follow this approach instead generating dynamic meta tags on the server adding temporary environment variables in your shell defining environment variables can vary between oses it s also important to know that this manner is temporary for the life of the shell session windows cmd exe cmd set react app secret code abcdef npm start note quotes around the variable assignment are required to avoid a trailing whitespace windows powershell powershell env react app secret code abcdef and npm start linux macos bash bash react app secret code abcdef npm start adding development environment variables in env note this feature is available with react scripts 0 5 0 and higher to define permanent environment variables create a file called env in the root of your project react app secret code abcdef note you must create custom environment variables beginning with react app any other variables except node env will be ignored to avoid accidentally exposing a private key on the machine that could have the same name https github com facebookincubator create react app issues 865 issuecomment 252199527 changing any environment variables will require you to restart the development server if it is running env files should be checked into source control with the exclusion of env local what other env files can be used note this feature is available with react scripts 1 0 0 and higher env default env local local overrides this file is loaded for all environments except test env development env test env production environment specific settings env development local env test local env production local local overrides of environment specific settings files on the left have more priority than files on the right npm start env development local env development env local env npm run build env production local env production env local env npm test env test local env test env note env local is missing these variables will act as the defaults if the machine does not explicitly set them br please refer to the dotenv documentation https github com motdotla dotenv for more details note if you are defining environment variables for development your ci and or hosting platform will most likely need these defined as well consult their documentation how to do this for example see the documentation for travis ci https docs travis ci com user environment variables or heroku https devcenter heroku com articles config vars expanding environment variables in env note this feature is available with react scripts 1 1 0 and higher expand variables already on your machine for use in your env file using dotenv expand https github com motdotla dotenv expand for example to get the environment variable npm package version react app version npm package version also works react app version npm package version or expand variables local to the current env file domain www example com react app foo domain foo react app bar domain bar can i use decorators many popular libraries use decorators https medium com google developers exploring es7 decorators 76ecb65fb841 in their documentation br create react app doesn t support decorator syntax at the moment because it is an experimental proposal and is subject to change the current specification version is not officially supported by babel if the specification changes we won t be able to write a codemod because we don t use them internally at facebook however in many cases you can rewrite decorator based code without decorators just as fine br please refer to these two threads for reference 214 https github com facebookincubator create react app issues 214 411 https github com facebookincubator create react app issues 411 create react app will add decorator support when the specification advances to a stable stage fetching data with ajax requests react doesn t prescribe a specific approach to data fetching but people commonly use either a library like axios https github com axios axios or the fetch api https developer mozilla org en us docs web api fetch api provided by the browser conveniently create react app includes a polyfill for fetch so you can use it without worrying about the browser support the global fetch function allows to easily makes ajax requests it takes in a url as an input and returns a promise that resolves to a response object you can find more information about fetch here https developer mozilla org en us docs web api fetch api using fetch this project also includes a promise polyfill https github com then promise which provides a full implementation of promises a a promise represents the eventual result of an asynchronous operation you can find more information about promises here https www promisejs org and here https developer mozilla org en us docs web javascript reference global objects promise both axios and fetch use promises under the hood you can also use the async await https davidwalsh name async await syntax to reduce the callback nesting you can learn more about making ajax requests from react components in the faq entry on the react website https reactjs org docs faq ajax html integrating with an api backend these tutorials will help you to integrate your app with an api backend running on another port using fetch to access it node check out this tutorial https www fullstackreact com articles using create react app with a server you can find the companion github repository here https github com fullstackreact food lookup demo ruby on rails check out this tutorial https www fullstackreact com articles how to get create react app to work with your rails api you can find the companion github repository here https github com fullstackreact food lookup demo rails proxying api requests in development note this feature is available with react scripts 0 2 3 and higher people often serve the front end react app from the same host and port as their backend implementation br for example a production setup might look like this after the app is deployed static server returns index html with react app todos static server returns index html with react app api todos server handles any api requests using the backend implementation such setup is not required however if you do have a setup like this it is convenient to write requests like fetch api todos without worrying about redirecting them to another host or port during development to tell the development server to proxy any unknown requests to your api server in development add a proxy field to your package json for example js proxy http localhost 4000 this way when you fetch api todos in development the development server will recognize that it s not a static asset and will proxy your request to http localhost 4000 api todos as a fallback the development server will only attempt to send requests without text html in its accept header to the proxy conveniently this avoids cors issues http stackoverflow com questions 21854516 understanding ajax cors and security considerations and error messages like this in development fetch api cannot load http localhost 4000 api todos no access control allow origin header is present on the requested resource origin http localhost 3000 is therefore not allowed access if an opaque response serves your needs set the request s mode to no cors to fetch the resource with cors disabled keep in mind that proxy only has effect in development with npm start and it is up to you to ensure that urls like api todos point to the right thing in production you don t have to use the api prefix any unrecognized request without a text html accept header will be redirected to the specified proxy the proxy option supports http https and websocket connections br if the proxy option is not flexible enough for you alternatively you can configure the proxy yourself configuring the proxy manually enable cors on your server here s how to do it for express http enable cors org server expressjs html use environment variables adding custom environment variables to inject the right server host and port into your app invalid host header errors after configuring proxy when you enable the proxy option you opt into a more strict set of host checks this is necessary because leaving the backend open to remote hosts makes your computer vulnerable to dns rebinding attacks the issue is explained in this article https medium com webpack webpack dev server middleware security issues 1489d950874a and this issue https github com webpack webpack dev server issues 887 this shouldn t affect you when developing on localhost but if you develop remotely like described here https github com facebookincubator create react app issues 2271 you will see this error in the browser after enabling the proxy option invalid host header to work around it you can specify your public development host in a file called env development in the root of your project host mypublicdevhost com if you restart the development server now and load the app from the specified host it should work if you are still having issues or if you re using a more exotic environment like a cloud editor you can bypass the host check completely by adding a line to env development local note that this is dangerous and exposes your machine to remote code execution from malicious websites note this is dangerous it exposes your machine to attacks from the websites you visit dangerously disable host check true we don t recommend this approach configuring the proxy manually note this feature is available with react scripts 1 0 0 and higher if the proxy option is not flexible enough for you you can specify an object in the following form in package json br you may also specify any configuration value http proxy middleware https github com chimurai http proxy middleware options or http proxy https github com nodejitsu node http proxy options supports js proxy api target url ws true all requests matching this path will be proxies no exceptions this includes requests for text html which the standard proxy option does not proxy if you need to specify multiple proxies you may do so by specifying additional entries matches are regular expressions so that you can use a regexp to match multiple paths js proxy matches any request starting with api api target url 1 ws true matches any request starting with foo foo target url 2 ssl true pathrewrite foo foo beta matches bar abc html but not bar sub def html bar html target url 3 matches baz abc html and baz sub def html baz html target url 4 configuring a websocket proxy when setting up a websocket proxy there are a some extra considerations to be aware of if you re using a websocket engine like socket io https socket io you must have a socket io server running that you can use as the proxy target socket io will not work with a standard websocket server specifically don t expect socket io to work with the websocket org echo test http websocket org echo html there s some good documentation available for setting up a socket io server https socket io docs standard websockets will work with a standard websocket server as well as the websocket org echo test you can use libraries like ws https github com websockets ws for the server with native websockets in the browser https developer mozilla org en us docs web api websocket either way you can proxy websocket requests manually in package json js proxy socket your compatible websocket server target ws socket url tell http proxy middleware that this is a websocket proxy also allows you to proxy websocket requests without an additional http request https github com chimurai http proxy middleware external websocket upgrade ws true using https in development note this feature is available with react scripts 0 4 0 and higher you may require the dev server to serve pages over https one particular case where this could be useful is when using the proxy feature proxying api requests in development to proxy requests to an api server when that api server is itself serving https to do this set the https environment variable to true then start the dev server as usual with npm start windows cmd exe cmd set https true npm start windows powershell powershell env https true and npm start note the lack of whitespace is intentional linux macos bash bash https true npm start note that the server will use a self signed certificate so your web browser will almost definitely display a warning upon accessing the page generating dynamic meta tags on the server since create react app doesn t support server rendering you might be wondering how to make meta tags dynamic and reflect the current url to solve this we recommend to add placeholders into the html like this html doctype html html lang en head meta property og title content og title meta property og budget content og budget then on the server regardless of the backend you use you can read index html into memory and replace og title og budget and any other placeholders with values depending on the current url just make sure to sanitize and escape the interpolated values so that they are safe to embed into html if you use a node server you can even share the route matching logic between the client and the server however duplicating it also works fine in simple cases pre rendering into static html files if you re hosting your build with a static hosting provider you can use react snapshot https www npmjs com package react snapshot or react snap https github com stereobooster react snap to generate html pages for each route or relative link in your application these pages will then seamlessly become active or hydrated when the javascript bundle has loaded there are also opportunities to use this outside of static hosting to take the pressure off the server when generating and caching routes the primary benefit of pre rendering is that you get the core content of each page with the html payload regardless of whether or not your javascript bundle successfully downloads it also increases the likelihood that each route of your application will be picked up by search engines you can read more about zero configuration pre rendering also called snapshotting here https medium com superhighfives an almost static stack 6df0a2791319 injecting data from the server into the page similarly to the previous section you can leave some placeholders in the html that inject global variables for example js doctype html html lang en head script window server data server data script then on the server you can replace server data with a json of real data right before sending the response the client code can then read window server data to use it make sure to sanitize the json before sending it to the client https medium com node security the most common xss vulnerability in react js applications 2bdffbcc1fa0 as it makes your app vulnerable to xss attacks running tests note this feature is available with react scripts 0 3 0 and higher br read the migration guide to learn how to enable it in older projects https github com facebookincubator create react app blob master changelog md migrating from 023 to 030 create react app uses jest https facebook github io jest as its test runner to prepare for this integration we did a major revamp https facebook github io jest blog 2016 09 01 jest 15 html of jest so if you heard bad things about it years ago give it another try jest is a node based runner this means that the tests always run in a node environment and not in a real browser this lets us enable fast iteration speed and prevent flakiness while jest provides browser globals such as window thanks to jsdom https github com tmpvar jsdom they are only approximations of the real browser behavior jest is intended to be used for unit tests of your logic and your components rather than the dom quirks we recommend that you use a separate tool for browser end to end tests if you need them they are beyond the scope of create react app filename conventions jest will look for test files with any of the following popular naming conventions files with js suffix in tests folders files with test js suffix files with spec js suffix the test js spec js files or the tests folders can be located at any depth under the src top level folder we recommend to put the test files or tests folders next to the code they are testing so that relative imports appear shorter for example if app test js and app js are in the same folder the test just needs to import app from app instead of a long relative path colocation also helps find tests more quickly in larger projects command line interface when you run npm test jest will launch in the watch mode every time you save a file it will re run the tests just like npm start recompiles the code the watcher includes an interactive command line interface with the ability to run all tests or focus on a search pattern it is designed this way so that you can keep it open and enjoy fast re runs you can learn the commands from the watch usage note that the watcher prints after every run jest watch mode http facebook github io jest img blog 15 watch gif version control integration by default when you run npm test jest will only run the tests related to files changed since the last commit this is an optimization designed to make your tests run fast regardless of how many tests you have however it assumes that you don t often commit the code that doesn t pass the tests jest will always explicitly mention that it only ran tests related to the files changed since the last commit you can also press a in the watch mode to force jest to run all tests jest will always run all tests on a continuous integration continuous integration server or if the project is not inside a git or mercurial repository writing tests to create tests add it or test blocks with the name of the test and its code you may optionally wrap them in describe blocks for logical grouping but this is neither required nor recommended jest provides a built in expect global function for making assertions a basic test could look like this js import sum from sum it sums numbers expect sum 1 2 toequal 3 expect sum 2 2 toequal 4 all expect matchers supported by jest are extensively documented here https facebook github io jest docs en expect html content br you can also use jest fn and expect fn tobecalled https facebook github io jest docs en expect html tohavebeencalled to create spies or mock functions testing components there is a broad spectrum of component testing techniques they range from a smoke test verifying that a component renders without throwing to shallow rendering and testing some of the output to full rendering and testing component lifecycle and state changes different projects choose different testing tradeoffs based on how often components change and how much logic they contain if you haven t decided on a testing strategy yet we recommend that you start with creating simple smoke tests for your components js import react from react import reactdom from react dom import app from app it renders without crashing const div document createelement div reactdom render app div this test mounts a component and makes sure that it didn t throw during rendering tests like this provide a lot of value with very little effort so they are great as a starting point and this is the test you will find in src app test js when you encounter bugs caused by changing components you will gain a deeper insight into which parts of them are worth testing in your application this might be a good time to introduce more specific tests asserting specific expected output or behavior if you d like to test components in isolation from the child components they render we recommend using shallow rendering api http airbnb io enzyme docs api shallow html from enzyme http airbnb io enzyme to install it run sh npm install save enzyme enzyme adapter react 16 react test renderer alternatively you may use yarn sh yarn add enzyme enzyme adapter react 16 react test renderer as of enzyme 3 you will need to install enzyme along with an adapter corresponding to the version of react you are using the examples above use the adapter for react 16 the adapter will also need to be configured in your global setup file initializing test environment src setuptests js js import configure from enzyme import adapter from enzyme adapter react 16 configure adapter new adapter note keep in mind that if you decide to eject before creating src setuptests js the resulting package json file won t contain any reference to it read here initializing test environment to learn how to add this after ejecting now you can write a smoke test with it js import react from react import shallow from enzyme import app from app it renders without crashing shallow app unlike the previous smoke test using reactdom render this test only renders app and doesn t go deeper for example even if app itself renders a button that throws this test will pass shallow rendering is great for isolated unit tests but you may still want to create some full rendering tests to ensure the components integrate correctly enzyme supports full rendering with mount http airbnb io enzyme docs api mount html and you can also use it for testing state changes and component lifecycle you can read the enzyme documentation http airbnb io enzyme for more testing techniques enzyme documentation uses chai and sinon for assertions but you don t have to use them because jest provides built in expect and jest fn for spies here is an example from enzyme documentation that asserts specific output rewritten to use jest matchers js import react from react import shallow from enzyme import app from app it renders welcome message const wrapper shallow app const welcome h2 welcome to react h2 expect wrapper contains welcome to equal true expect wrapper contains welcome toequal true all jest matchers are extensively documented here http facebook github io jest docs en expect html br nevertheless you can use a third party assertion library like chai http chaijs com if you want to as described below additionally you might find jest enzyme https github com blainekasten enzyme matchers helpful to simplify your tests with readable matchers the above contains code can be written more simply with jest enzyme js expect wrapper tocontainreact welcome to enable this install jest enzyme sh npm install save jest enzyme alternatively you may use yarn sh yarn add jest enzyme import it in src setuptests js initializing test environment to make its matchers available in every test js import jest enzyme using third party assertion libraries we recommend that you use expect for assertions and jest fn for spies if you are having issues with them please file those against jest https github com facebook jest issues new and we ll fix them we intend to keep making them better for react supporting for example pretty printing react elements as jsx https github com facebook jest pull 1566 however if you are used to other libraries such as chai http chaijs com and sinon http sinonjs org or if you have existing code using them that you d like to port over you can import them normally like this js import sinon from sinon import expect from chai and then use them in your tests like you normally do initializing test environment note this feature is available with react scripts 0 4 0 and higher if your app uses a browser api that you need to mock in your tests or if you just need a global setup before running your tests add a src setuptests js to your project it will be automatically executed before running your tests for example src setuptests js js const localstoragemock getitem jest fn setitem jest fn clear jest fn global localstorage localstoragemock note keep in mind that if you decide to eject before creating src setuptests js the resulting package json file won t contain any reference to it so you should manually create the property setuptestframeworkscriptfile in the configuration for jest something like the following js jest setuptestframeworkscriptfile rootdir src setuptests js focusing and excluding tests you can replace it with xit to temporarily exclude a test from being executed br similarly fit lets you focus on a specific test without running any other tests coverage reporting jest has an integrated coverage reporter that works well with es6 and requires no configuration br run npm test coverage note extra in the middle to include a coverage report like this coverage report http i imgur com 5bfhnts png note that tests run much slower with coverage so it is recommended to run it separately from your normal workflow configuration the default jest coverage configuration can be overriden by adding any of the following supported keys to a jest config in your package json supported overrides collectcoveragefrom https facebook github io jest docs en configuration html collectcoveragefrom array coveragereporters https facebook github io jest docs en configuration html coveragereporters array string coveragethreshold https facebook github io jest docs en configuration html coveragethreshold object snapshotserializers https facebook github io jest docs en configuration html snapshotserializers array string example package json json name your package jest collectcoveragefrom src js jsx rootdir node modules rootdir path to dir coveragethreshold global branches 90 functions 90 lines 90 statements 90 coveragereporters text snapshotserializers my serializer module continuous integration by default npm test runs the watcher with interactive cli however you can force it to run tests once and finish the process by setting an environment variable called ci when creating a build of your application with npm run build linter warnings are not checked by default like npm test you can force the build to perform a linter warning check by setting the environment variable ci if any warnings are encountered then the build fails popular ci servers already set the environment variable ci by default but you can do this yourself too on ci servers travis ci 1 following the travis getting started https docs travis ci com user getting started guide for syncing your github repository with travis you may need to initialize some settings manually in your profile https travis ci org profile page 1 add a travis yml file to your git repository language node js node js 6 cache directories node modules script npm run build npm test 1 trigger your first build with a git push 1 customize your travis ci build https docs travis ci com user customizing the build if needed circleci follow this article https medium com knowbody circleci and zeits now sh c9b7eebcd3c1 to set up circleci with a create react app project on your own environment windows cmd exe cmd set ci true npm test cmd set ci true npm run build note the lack of whitespace is intentional windows powershell powershell env ci true and npm test powershell env ci true and npm run build linux macos bash bash ci true npm test bash ci true npm run build the test command will force jest to run tests once instead of launching the watcher if you find yourself doing this often in development please file an issue https github com facebookincubator create react app issues new to tell us about your use case because we want to make watcher the best experience and are open to changing how it works to accommodate more workflows the build command will check for linter warnings and fail if any are found disabling jsdom by default the package json of the generated project looks like this js scripts start react scripts start build react scripts build test react scripts test env jsdom if you know that none of your tests depend on jsdom https github com tmpvar jsdom you can safely remove env jsdom and your tests will run faster diff scripts start react scripts start build react scripts build test react scripts test env jsdom test react scripts test to help you make up your mind here is a list of apis that need jsdom any browser globals like window and document reactdom render https facebook github io react docs top level api html reactdom render testutils renderintodocument https facebook github io react docs test utils html renderintodocument a shortcut https github com facebook react blob 34761cf9a252964abfaab6faf74d473ad95d1f21 src test reacttestutils js l83 l91 for the above mount http airbnb io enzyme docs api mount html in enzyme http airbnb io enzyme index html in contrast jsdom is not needed for the following apis testutils createrenderer https facebook github io react docs test utils html shallow rendering shallow rendering shallow http airbnb io enzyme docs api shallow html in enzyme http airbnb io enzyme index html finally jsdom is also not needed for snapshot testing http facebook github io jest blog 2016 07 27 jest 14 html snapshot testing snapshot testing is a feature of jest that automatically generates text snapshots of your components and saves them on the disk so if the ui output changes you get notified without manually writing any assertions on the component output read more about snapshot testing http facebook github io jest blog 2016 07 27 jest 14 html editor integration if you use visual studio code https code visualstudio com there is a jest extension https github com orta vscode jest which works with create react app out of the box this provides a lot of ide like features while using a text editor showing the status of a test run with potential fail messages inline starting and stopping the watcher automatically and offering one click snapshot updates vs code jest preview https cloud githubusercontent com assets 49038 20795349 a032308a b7c8 11e6 9b34 7eeac781003f png debugging tests there are various ways to setup a debugger for your jest tests we cover debugging in chrome and visual studio code https code visualstudio com note debugging tests requires node 8 or higher debugging tests in chrome add the following to the scripts section in your project s package json json scripts test debug react scripts inspect brk test runinband env jsdom place debugger statements in any test and run bash npm run test debug this will start running your jest tests but pause before executing to allow a debugger to attach to the process open the following in chrome about inspect after opening that link the chrome developer tools will be displayed select inspect on your process and a breakpoint will be set at the first line of the react script this is done simply to give you time to open the developer tools and to prevent jest from executing before you have time to do so click the button that looks like a play button in the upper right hand side of the screen to continue execution when jest executes the test that contains the debugger statement execution will pause and you can examine the current scope and call stack note the runinband cli option makes sure jest runs test in the same process rather than spawning processes for individual tests normally jest parallelizes test runs across processes but it is hard to debug many processes at the same time debugging tests in visual studio code debugging jest tests is supported out of the box for visual studio code https code visualstudio com use the following launch json https code visualstudio com docs editor debugging launch configurations configuration file version 0 2 0 configurations name debug cra tests type node request launch runtimeexecutable workspaceroot node modules bin react scripts args test runinband no cache env jsdom cwd workspaceroot protocol inspector console integratedterminal internalconsoleoptions neveropen developing components in isolation usually in an app you have a lot of ui components and each of them has many different states for an example a simple button component could have following states in a regular state with a text label in the disabled mode in a loading state usually it s hard to see these states without running a sample app or some examples create react app doesn t include any tools for this by default but you can easily add storybook for react https storybook js org source https github com storybooks storybook or react styleguidist https react styleguidist js org source https github com styleguidist react styleguidist to your project these are third party tools that let you develop components and see all their states in isolation from your app storybook for react demo http i imgur com 7ciawpb gif you can also deploy your storybook or style guide as a static app this way everyone in your team can view and review different states of ui components without starting a backend server or creating an account in your app getting started with storybook storybook is a development environment for react ui components it allows you to browse a component library view the different states of each component and interactively develop and test components first install the following npm package globally sh npm install g storybook cli then run the following command inside your app s directory sh getstorybook after that follow the instructions on the screen learn more about react storybook screencast getting started with react storybook https egghead io lessons react getting started with react storybook github repo https github com storybooks storybook documentation https storybook js org basics introduction snapshot testing ui https github com storybooks storybook tree master addons storyshots with storybook addon storyshot getting started with styleguidist styleguidist combines a style guide where all your components are presented on a single page with their props documentation and usage examples with an environment for developing components in isolation similar to storybook in styleguidist you write examples in markdown where each code snippet is rendered as a live editable playground first install styleguidist sh npm install save react styleguidist alternatively you may use yarn sh yarn add react styleguidist then add these scripts to your package json diff scripts styleguide styleguidist server styleguide build styleguidist build start react scripts start then run the following command inside your app s directory sh npm run styleguide after that follow the instructions on the screen learn more about react styleguidist github repo https github com styleguidist react styleguidist documentation https react styleguidist js org docs getting started html publishing components to npm create react app doesn t provide any built in functionality to publish a component to npm if you re ready to extract a component from your project so other people can use it we recommend moving it to a separate directory outside of your project and then using a tool like nwb https github com insin nwb react components and libraries to prepare it for publishing making a progressive web app by default the production build is a fully functional offline first progressive web app https developers google com web progressive web apps progressive web apps are faster and more reliable than traditional web pages and provide an engaging mobile experience all static site assets are cached so that your page loads fast on subsequent visits regardless of network connectivity such as 2g or 3g updates are downloaded in the background your app will work regardless of network state even if offline this means your users will be able to use your app at 10 000 feet and on the subway on mobile devices your app can be added directly to the user s home screen app icon and all you can also re engage users using web push notifications this eliminates the need for the app store the sw precache webpack plugin https github com goldhand sw precache webpack plugin is integrated into production configuration and it will take care of generating a service worker file that will automatically precache all of your local assets and keep them up to date as you deploy updates the service worker will use a cache first strategy https developers google com web fundamentals instant and offline offline cookbook cache falling back to network for handling all requests for local assets including the initial html ensuring that your web app is reliably fast even on a slow or unreliable network opting out of caching if you would prefer not to enable service workers prior to your initial production deployment then remove the call to registerserviceworker from src index js src index js if you had previously enabled service workers in your production deployment and have decided that you would like to disable them for all your existing users you can swap out the call to registerserviceworker in src index js src index js first by modifying the service worker import javascript import unregister from registerserviceworker and then call unregister instead after the user visits a page that has unregister the service worker will be uninstalled note that depending on how service worker js is served it may take up to 24 hours for the cache to be invalidated offline first considerations 1 service workers require https https developers google com web fundamentals getting started primers service workers you need https although to facilitate local testing that policy does not apply to localhost http stackoverflow com questions 34160509 options for testing service workers via http 34161385 34161385 if your production web server does not support https then the service worker registration will fail but the rest of your web app will remain functional 1 service workers are not currently supported https jakearchibald github io isserviceworkerready in all web browsers service worker registration won t be attempted src registerserviceworker js on browsers that lack support 1 the service worker is only enabled in the production environment deployment e g the output of npm run build it s recommended that you do not enable an offline first service worker in a development environment as it can lead to frustration when previously cached assets are used and do not include the latest changes you ve made locally 1 if you need to test your offline first service worker locally build the application using npm run build and run a simple http server from your build directory after running the build script create react app will give instructions for one way to test your production build locally and the deployment instructions deployment have instructions for using other methods be sure to always use an incognito window to avoid complications with your browser cache 1 if possible configure your production environment to serve the generated service worker js with http caching disabled http stackoverflow com questions 38843970 service worker javascript update frequency every 24 hours if that s not possible github pages github pages for instance does not allow you to change the default 10 minute http cache lifetime then be aware that if you visit your production site and then revisit again before service worker js has expired from your http cache you ll continue to get the previously cached assets from the service worker if you have an immediate need to view your updated production deployment performing a shift refresh will temporarily disable the service worker and retrieve all assets from the network 1 users aren t always familiar with offline first web apps it can be useful to let the user know https developers google com web fundamentals instant and offline offline ux inform the user when the app is ready for offline consumption when the service worker has finished populating your caches showing a this web app works offline message and also let them know when the service worker has fetched the latest updates that will be available the next time they load the page showing a new content is available please refresh message showing this messages is currently left as an exercise to the developer but as a starting point you can make use of the logic included in src registerserviceworker js src registerserviceworker js which demonstrates which service worker lifecycle events to listen for to detect each scenario and which as a default just logs appropriate messages to the javascript console 1 by default the generated service worker file will not intercept or cache any cross origin traffic like http api requests integrating with an api backend images or embeds loaded from a different domain if you would like to use a runtime caching strategy for those requests you can eject npm run eject and then configure the runtimecaching https github com googlechrome sw precache runtimecaching arrayobject option in the swprecachewebpackplugin section of webpack config prod js config webpack config prod js progressive web app metadata the default configuration includes a web app manifest located at public manifest json public manifest json that you can customize with details specific to your web application when a user adds a web app to their homescreen using chrome or firefox on android the metadata in manifest json public manifest json determines what icons names and branding colors to use when the web app is displayed the web app manifest guide https developers google com web fundamentals engage and retain web app manifest provides more context about what each field means and how your customizations will affect your users experience analyzing the bundle size source map explorer https www npmjs com package source map explorer analyzes javascript bundles using the source maps this helps you understand where code bloat is coming from to add source map explorer to a create react app project follow these steps sh npm install save source map explorer alternatively you may use yarn sh yarn add source map explorer then in package json add the following line to scripts diff scripts analyze source map explorer build static js main start react scripts start build react scripts build test react scripts test env jsdom then to analyze the bundle run the production build then run the analyze script npm run build npm run analyze deployment npm run build creates a build directory with a production build of your app set up your favorite http server so that a visitor to your site is served index html and requests to static paths like static js main hash js are served with the contents of the static js main hash js file static server for environments using node https nodejs org the easiest way to handle this would be to install serve https github com zeit serve and let it handle the rest sh npm install g serve serve s build the last command shown above will serve your static site on the port 5000 like many of serve https github com zeit serve s internal settings the port can be adjusted using the p or port flags run this command to get a full list of the options available sh serve h other solutions you don t necessarily need a static server in order to run a create react app project in production it works just as fine integrated into an existing dynamic one here s a programmatic example using node https nodejs org and express http expressjs com javascript const express require express const path require path const app express app use express static path join dirname build app get function req res res sendfile path join dirname build index html app listen 9000 the choice of your server software isn t important either since create react app is completely platform agnostic there s no need to explicitly use node the build folder with static assets is the only output produced by create react app however this is not quite enough if you use client side routing read the next section if you want to support urls like todos 42 in your single page app serving apps with client side routing if you use routers that use the html5 pushstate history api https developer mozilla org en us docs web api history api adding and modifying history entries under the hood for example react router https github com reacttraining react router with browserhistory many static file servers will fail for example if you used react router with a route for todos 42 the development server will respond to localhost 3000 todos 42 properly but an express serving a production build as above will not this is because when there is a fresh page load for a todos 42 the server looks for the file build todos 42 and does not find it the server needs to be configured to respond to a request to todos 42 by serving index html for example we can amend our express example above to serve index html for any unknown paths diff app use express static path join dirname build app get function req res app get function req res res sendfile path join dirname build index html if you re using apache http server https httpd apache org you need to create a htaccess file in the public folder that looks like this options multiviews rewriteengine on rewritecond request filename f rewriterule index html qsa l it will get copied to the build folder when you run npm run build if you re using apache tomcat http tomcat apache org you need to follow this stack overflow answer https stackoverflow com a 41249464 4878474 now requests to todos 42 will be handled correctly both in development and in production on a production build and in a browser that supports service workers https developers google com web fundamentals getting started primers service workers the service worker will automatically handle all navigation requests like for todos 42 by serving the cached copy of your index html this service worker navigation routing can be configured or disabled by eject ing npm run eject and then modifying the navigatefallback https github com googlechrome sw precache navigatefallback string and navigatefallbackwhitelist https github com googlechrome sw precache navigatefallbackwhitelist arrayregexp options of the swpreacheplugin configuration config webpack config prod js when users install your app to the homescreen of their device the default configuration will make a shortcut to index html this may not work for client side routers which expect the app to be served from edit the web app manifest at public manifest json public manifest json and change start url to match the required url scheme for example js start url building for relative paths by default create react app produces a build assuming your app is hosted at the server root br to override this specify the homepage in your package json for example js homepage http mywebsite com relativepath this will let create react app correctly infer the root path to use in the generated html file note if you are using react router 4 you can root link s using the basename prop on any router br more information here https reacttraining com react router web api browserrouter basename string br br for example js browserrouter basename calendar link to today renders a href calendar today serving the same build from different paths note this feature is available with react scripts 0 9 0 and higher if you are not using the html5 pushstate history api or not using client side routing at all it is unnecessary to specify the url from which your app will be served instead you can put this in your package json js homepage this will make sure that all the asset paths are relative to index html you will then be able to move your app from http mywebsite com to http mywebsite com relativepath or even http mywebsite com relative path without having to rebuild it azure https azure microsoft com see this https medium com to pe deploying create react app on microsoft azure c0f6686a4321 blog post on how to deploy your react app to microsoft azure see this https medium com strid host create react app on azure 986bc40d5bf2 pycfnafbg blog post or this https github com ulrikaugustsson azure appservice static repo for a way to use automatic deployment to azure app service firebase https firebase google com install the firebase cli if you haven t already by running npm install g firebase tools sign up for a firebase account https console firebase google com and create a new project run firebase login and login with your previous created firebase account then run the firebase init command from your project s root you need to choose the hosting configure and deploy firebase hosting sites and choose the firebase project you created in the previous step you will need to agree with database rules json being created choose build as the public directory and also agree to configure as a single page app by replying with y sh project setup first let s associate this project directory with a firebase project you can create multiple project aliases by running firebase use add but for now we ll just set up a default project what firebase project do you want to associate as default example app example app fd690 database setup firebase realtime database rules allow you to define how your data should be structured and when your data can be read from and written to what file should be used for database rules database rules json database rules for example app fd690 have been downloaded to database rules json future modifications to database rules json will update database rules when you run firebase deploy hosting setup your public directory is the folder relative to your project directory that will contain hosting assets to uploaded with firebase deploy if you have a build process for your assets use your build s output directory what do you want to use as your public directory build configure as a single page app rewrite all urls to index html yes wrote build index html i writing configuration info to firebase json i writing project information to firebaserc firebase initialization complete important you need to set proper http caching headers for service worker js file in firebase json file or you will not be able to see changes after first deployment issue 2440 https github com facebookincubator create react app issues 2440 it should be added inside hosting key like next hosting headers source service worker js headers key cache control value no cache now after you create a production build with npm run build you can deploy it by running firebase deploy sh deploying to example app fd690 i deploying database hosting database rules ready to deploy i hosting preparing build directory for upload uploading 75 hosting build folder uploaded successfully hosting 8 files uploaded successfully i starting release process may take several minutes deploy complete project console https console firebase google com project example app fd690 overview hosting url https example app fd690 firebaseapp com for more information see add firebase to your javascript project https firebase google com docs web setup github pages https pages github com note this feature is available with react scripts 0 2 0 and higher step 1 add homepage to package json the step below is important br if you skip it your app will not deploy correctly open your package json and add a homepage field for your project json homepage https myusername github io my app or for a github user page json homepage https myusername github io create react app uses the homepage field to determine the root url in the built html file step 2 install gh pages and add deploy to scripts in package json now whenever you run npm run build you will see a cheat sheet with instructions on how to deploy to github pages to publish it at https myusername github io my app https myusername github io my app run sh npm install save gh pages alternatively you may use yarn sh yarn add gh pages add the following scripts in your package json diff scripts predeploy npm run build deploy gh pages d build start react scripts start build react scripts build the predeploy script will run automatically before deploy is run if you are deploying to a github user page instead of a project page you ll need to make two additional modifications 1 first change your repository s source branch to be any branch other than master 1 additionally tweak your package json scripts to push deployments to master diff scripts predeploy npm run build deploy gh pages d build deploy gh pages b master d build step 3 deploy the site by running npm run deploy then run sh npm run deploy step 4 ensure your project s settings use gh pages finally make sure github pages option in your github project settings is set to use the gh pages branch img src http i imgur com hujer9l png width 500 alt gh pages branch setting step 5 optionally configure the domain you can configure a custom domain with github pages by adding a cname file to the public folder notes on client side routing github pages doesn t support routers that use the html5 pushstate history api under the hood for example react router using browserhistory this is because when there is a fresh page load for a url like http user github io todomvc todos 42 where todos 42 is a frontend route the github pages server returns 404 because it knows nothing of todos 42 if you want to add a router to a project hosted on github pages here are a couple of solutions you could switch from using html5 history api to routing with hashes if you use react router you can switch to hashhistory for this effect but the url will be longer and more verbose for example http user github io todomvc todos 42 k yknaj read more https reacttraining com react router web api router about different history implementations in react router alternatively you can use a trick to teach github pages to handle 404 by redirecting to your index html page with a special redirect parameter you would need to add a 404 html file with the redirection code to the build folder before deploying your project and you ll need to add code handling the redirect parameter to index html you can find a detailed explanation of this technique in this guide https github com rafrex spa github pages heroku https www heroku com use the heroku buildpack for create react app https github com mars create react app buildpack br you can find instructions in deploying react with zero configuration https blog heroku com deploying react with zero configuration resolving heroku deployment errors sometimes npm run build works locally but fails during deploy via heroku following are the most common cases module not found error cannot resolve file or directory if you get something like this remote failed to create a production build reason remote module not found error cannot resolve file or directory mydirectory in tmp build 1234 src it means you need to ensure that the lettercase of the file or directory you import matches the one you see on your filesystem or on github this is important because linux the operating system used by heroku is case sensitive so mydirectory and mydirectory are two distinct directories and thus even though the project builds locally the difference in case breaks the import statements on heroku remotes could not find a required file if you exclude or ignore necessary files from the package you will see a error similar this one remote could not find a required file remote name index html remote searched in tmp build a2875fc163b209225122d68916f1d4df public remote remote npm err linux 3 13 0 105 generic remote npm err argv tmp build a2875fc163b209225122d68916f1d4df heroku node bin node tmp build a2875fc163b209225122d68916f1d4df heroku node bin npm run build in this case ensure that the file is there with the proper lettercase and that s not ignored on your local gitignore or gitignore global netlify https www netlify com to do a manual deploy to netlify s cdn sh npm install netlify cli g netlify deploy choose build as the path to deploy to setup continuous delivery with this setup netlify will build and deploy when you push to git or open a pull request 1 start a new netlify project https app netlify com signup 2 pick your git hosting service and select your repository 3 set yarn build as the build command and build as the publish directory 4 click deploy site support for client side routing to support pushstate make sure to create a public redirects file with the following rewrite rules index html 200 when you build the project create react app will place the public folder contents into the build output now https zeit co now now offers a zero configuration single command deployment you can use now to deploy your app for free 1 install the now command line tool either via the recommended desktop tool https zeit co download or via node with npm install g now 2 build your app by running npm run build 3 move into the build directory by running cd build 4 run now name your project name from within the build directory you will see a now sh url in your output like this ready https your project name tpspyhtdtk now sh copied to clipboard paste that url into your browser when the build is complete and you will see your deployed app details are available in this article https zeit co blog unlimited static s3 https aws amazon com s3 and cloudfront https aws amazon com cloudfront see this blog post https medium com omgwtfmarc deploying create react app to s3 or cloudfront 48dae4ce0af on how to deploy your react app to amazon web services s3 and cloudfront surge https surge sh install the surge cli if you haven t already by running npm install g surge run the surge command and log in you or create a new account when asked about the project path make sure to specify the build folder for example sh project path path to project build note that in order to support routers that use html5 pushstate api you may want to rename the index html in your build folder to 200 html before deploying to surge this ensures that every url falls back to that file https surge sh help adding a 200 page for client side routing advanced configuration you can adjust various development and production settings by setting environment variables in your shell or with env adding development environment variables in env variable development production usage browser white check mark x by default create react app will open the default system browser favoring chrome on macos specify a browser https github com sindresorhus opn app to override this behavior or set it to none to disable it completely if you need to customize the way the browser is launched you can specify a node script instead any arguments passed to npm start will also be passed to this script and the url where your app is served will be the last argument your script s file name must have the js extension host white check mark x by default the development web server binds to localhost you may use this variable to specify a different host port white check mark x by default the development web server will attempt to listen on port 3000 or prompt you to attempt the next available port you may use this variable to specify a different port https white check mark x when set to true create react app will run the development server in https mode public url x white check mark create react app assumes your application is hosted at the serving web server s root or a subpath as specified in package json homepage building for relative paths normally create react app ignores the hostname you may use this variable to force assets to be referenced verbatim to the url you provide hostname included this may be particularly useful when using a cdn to host your application ci large orange diamond white check mark when set to true create react app treats warnings as failures in the build it also makes the test runner non watching most cis set this flag by default react editor white check mark x when an app crashes in development you will see an error overlay with clickable stack trace when you click on it create react app will try to determine the editor you are using based on currently running processes and open the relevant source file you can send a pull request to detect your editor of choice https github com facebookincubator create react app issues 2636 setting this environment variable overrides the automatic detection if you do it make sure your systems path https en wikipedia org wiki path variable environment variable points to your editor s bin folder you can also set it to none to disable it completely chokidar usepolling white check mark x when set to true the watcher runs in polling mode as necessary inside a vm use this option if npm start isn t detecting changes generate sourcemap x white check mark when set to false source maps are not generated for a production build this solves oom issues on some smaller machines node path white check mark white check mark same as node path in node js https nodejs org api modules html modules loading from the global folders but only relative folders are allowed can be handy for emulating a monorepo setup by setting node path src troubleshooting npm start doesn t detect changes when you save a file while npm start is running the browser should refresh with the updated code br if this doesn t happen try one of the following workarounds if your project is in a dropbox folder try moving it out if the watcher doesn t see a file called index js and you re referencing it by the folder name you need to restart the watcher https github com facebookincubator create react app issues 1164 due to a webpack bug some editors like vim and intellij have a safe write feature that currently breaks the watcher you will need to disable it follow the instructions in adjusting your text editor https webpack js org guides development adjusting your text editor if your project path contains parentheses try moving the project to a path without them this is caused by a webpack watcher bug https github com webpack watchpack issues 42 on linux and macos you might need to tweak system settings https github com webpack docs wiki troubleshooting not enough watchers to allow more watchers if the project runs inside a virtual machine such as a vagrant provisioned virtualbox create an env file in your project directory if it doesn t exist and add chokidar usepolling true to it this ensures that the next time you run npm start the watcher uses the polling mode as necessary inside a vm if none of these solutions help please leave a comment in this thread https github com facebookincubator create react app issues 659 npm test hangs on macos sierra if you run npm test and the console gets stuck after printing react scripts test env jsdom to the console there might be a problem with your watchman https facebook github io watchman installation as described in facebookincubator create react app 713 https github com facebookincubator create react app issues 713 we recommend deleting node modules in your project and running npm install or yarn if you use it first if it doesn t help you can try one of the numerous workarounds mentioned in these issues facebook jest 1767 https github com facebook jest issues 1767 facebook watchman 358 https github com facebook watchman issues 358 ember cli ember cli 6259 https github com ember cli ember cli issues 6259 it is reported that installing watchman 4 7 0 or newer fixes the issue if you use homebrew http brew sh you can run these commands to update it watchman shutdown server brew update brew reinstall watchman you can find other installation methods https facebook github io watchman docs install html build install on the watchman documentation page if this still doesn t help try running launchctl unload f library launchagents com github facebook watchman plist there are also reports that uninstalling watchman fixes the issue so if nothing else helps remove it from your system and try again npm run build exits too early it is reported that npm run build can fail on machines with limited memory and no swap space which is common in cloud environments even with small projects this command can increase ram usage in your system by hundreds of megabytes so if you have less than 1 gb of available memory your build is likely to fail with the following message the build failed because the process exited too early this probably means the system ran out of memory or someone called kill 9 on the process if you are completely sure that you didn t terminate the process consider adding some swap space https www digitalocean com community tutorials how to add swap on ubuntu 14 04 to the machine you re building on or build the project locally npm run build fails on heroku this may be a problem with case sensitive filenames please refer to this section resolving heroku deployment errors moment js locales are missing if you use a moment js https momentjs com you might notice that only the english locale is available by default this is because the locale files are large and you probably only need a subset of all the locales provided by moment js https momentjs com multiple locale support to add a specific moment js locale to your bundle you need to import it explicitly br for example js import moment from moment import moment locale fr if import multiple locales this way you can later switch between them by calling moment locale with the locale name js import moment from moment import moment locale fr import moment locale es moment locale fr this will only work for locales that have been explicitly imported before npm run build fails to minify some third party packages don t compile their code to es5 before publishing to npm this often causes problems in the ecosystem because neither browsers except for most modern versions nor some tools currently support all es6 features we recommend to publish code on npm as es5 at least for a few more years br to resolve this 1 open an issue on the dependency s issue tracker and ask that the package be published pre compiled note create react app can consume both commonjs and es modules for node js compatibility it is recommended that the main entry point is commonjs however they can optionally provide an es module entry point with the module field in package json note that even if a library provides an es modules version it should still precompile other es6 features to es5 if it intends to support older browsers 2 fork the package and publish a corrected version yourself 3 if the dependency is small enough copy it to your src folder and treat it as application code in the future we might start automatically compiling incompatible third party modules but it is not currently supported this approach would also slow down the production builds alternatives to ejecting ejecting npm run eject lets you customize anything but from that point on you have to maintain the configuration and scripts yourself this can be daunting if you have many similar projects in such cases instead of ejecting we recommend to fork react scripts and any other packages you need this article https auth0 com blog how to configure create react app dives into how to do it in depth you can find more discussion in this issue https github com facebookincubator create react app issues 682 something missing if you have ideas for more how to recipes that should be on this page let us know https github com facebookincubator create react app issues or contribute some https github com facebookincubator create react app edit master packages react scripts template readme md | react application time and attendance management system | server |
CV-Star-Sensor | cv star sensor welcome to you if you re coming from instructables msc project repo for computer vision star identification and satellite orientation project currently active please see a short explanatory video on youtube https www youtube com watch v ayilzsxmrgo additionally this tutorial https pythonprogramming net haar cascade object detection python opencv tutorial is a really useful beginner s guide to opencv classifier training contents so far stellarium scripts used to capture thousands of images from stellarium in order to be processed into negative image datasets for machine learning training zipped folders containing negative image datasets as well as bg txt files and python programs used to create these python programs used to create the positive images used for cascade training image files of the fiducial markers applied to starfields to identify the patterns of bright stars that the machine learning relies upon for the identification a sample set of 31 trained cascades for the northern celestial hemisphere python programs used to test the trained cascades against a supplied starfield image what next 19 08 19 i have finished working on this project as part of my university course i hope to be able to spend further time on it as a hobby in order to keep developing the system there are lots of improvements and additions i would like to have time to make i hope that this repository may be of use to someone and if you have questions please contact me i will continue to monitor and work on this project the best source of reference here is my msc thesis itself which can be found above 25 05 20 i ve been putting more thought into the potential improvement and applications of this project i hope that the instructables writeup will help other people find this repo and hopefully we can work together to develop this further | ai |
|
braccio_camai | braccio camai description low cost robot arm powered by computer vision and artificial intelligence perfect for experimenting and diy projects using an arduino braccio robot arm raspberry pi 3 and google coral usb accelerator it allows you to actively track and follow more that 90 different type of objects autonomously it is fully integrated in ros robot operating system using moveit or joint interface and uses state of the art algorithms for object detection or classification powered by tensorflow demo braccio camai https img youtube com vi eyv84zgbcvc 0 jpg https www youtube com watch v eyv84zgbcvc braccio camai low cost and open source robot arm powered by ros computer vision and ai open source code tutorials videos and plug and play raspberry pi images are available more information and documentation on https braccio camai readthedocs io questions issues and contributions are more than welcome author francisco j garcia r 2019 license this project is licensed under mit see license file for more details this project was based on great previous developments ros braccio urdf https github com grassjelly ros braccio urdf and ros braccio moveit https github com zakizadeh ros braccio moveit | ros google-coral robot-arm arduino raspberry-pi-3 object-detection tensorflow | ai |
Cloud_computing_DE | cloud computing de contains cloud computing data engineering labs | cloud |
|
frontend-assignment | intricately front end challenge requirements and design mocks are available here https docs google com document d 1b3ofnk0nc2dawtpj1flky9ekzqjjri ejdxqv8nmpiw edit heading h rmoqo627p0vn prerequisites node https nodejs org en 8 4 0 and npm https www npmjs com get npm 5 4 1 all commands described in this document should be executed from the project s root directory getting started clone the repo install all project s dependencies with npm install and run npm start when it s done building the app is accessible from localhost 8080 | front_end |
|
0xDeCA10B | sharing updatable models sum on blockchain formerly decentralized collaborative ai on blockchain img src assets logo gif raw true width 500 alt animated logo for the project a neural network appears on a block the nodes change color until finally converging the block slides away on a chain and the process restarts on the next blank block put horizontally since build status badges are normally horizontal demo demo folder simulation simulation folder security demo test https github com microsoft 0xdeca10b actions workflows demo test yml badge svg branch main https github com microsoft 0xdeca10b actions workflows demo test yml simulation test https github com microsoft 0xdeca10b actions workflows simulation test yml badge svg branch main https github com microsoft 0xdeca10b actions workflows simulation test yml build status https dev azure com maluuba 0xdeca10b apis build status security 20checks branchname main https dev azure com maluuba 0xdeca10b build latest definitionid 118 branchname main sharing updatable models sum on blockchain is a framework to host and train publicly available machine learning models ideally using a model to get a prediction is free adding data consists of validation by three steps as described below img src assets architecture flow png raw true width 500 alt picture of a someone sending data to the adddata method in collaborativetrainer which sends data to the 3 main components as further described next 1 the incentivemechanism validates the request to add data for instance in some cases a stake or deposit is required in some cases the incentive mechanism can also be triggered later to provide users with payments or virtual karma points 2 the datahandler stores data and meta data on the blockchain this ensures that it is accessible for all future uses not limited to this smart contract 3 the machine learning model is updated according to predefined training algorithms in addition to adding data anyone can query the model for predictions for free the basics of the framework can be found in our blog post blog1 a demo of one incentive mechanism can be found here demo more details can be found in the initial paper overview paper describing the framework accepted to blockchain 2019 the ieee international conference on blockchain this repository contains demos demo folder showcasing some proof of concept systems using the ethereum blockchain there is a locally deployable test blockchain and demo dashboard to interact with smart contracts written in solidity simulation tools simulation folder written in python to quickly see how models and incentive mechanisms would work when deployed img src assets aka ms 0xdeca10b qr png raw true width 250 alt picture of a qr code with aka ms 0xdeca10b written in the middle faq concerns aren t smart contracts just for simple code there are many options we can restrict the framework to simple models perceptron naive bayes nearest centroid etc we can also combine off chain computation with on chain computation in a few ways such as encoding off chain to a higher dimensional representation and just have the final layers of the model fine tuned on chain using secure multiparty computation or using external apis or as they are called the blockchain space oracles to train and run the model we can also use algorithms that do not require all models parameters to be updated e g perceptron we hope to inspire more research in efficient ways to update more complex models some of those proposals are not in the true spirit of this system which is to share models completely publicly but for some applications they may be suitable at least the data would be shared so others can still use it to train their own models will transaction fees be too high fees in ethereum are low enough for simple models a few cents as of july 2019 simple machine learning models are good for many applications as described the previous answer there are ways to keep transactions simple fees are decreasing ethereum is switching to proof of stake other blockchains may have lower or possibly no fees what about storing models off chain storing the model parameters off chain e g using ipfs is an option but many of the popular solutions do not have robust mirroring to ensure that the model will still be available if a node goes down one of the major goals of this project is to share models and improve their availability the easiest way to do that now is to have the model stored and trained in a smart contract we re happy to make improvements if you do know of a solution that would be cheaper and more robust than storing models on a blockchain like ethereum then let us know by filing an issue what if i just spam bad data this depends on the incentive mechanism im chosen but essentially you will lose a lot of money others will notice the model is performing badly or does not work as expected and then stop contributing to it depending on the im such as in deposit refund and take self assessment others that already submitted good data will gladly take your deposits without submitting any more data furthermore people can easily automatically correct your data using techniques from unsupervised learning such as clustering they can then use the data offline for their own private model or even deploy a new collection system using that model what if no one gives bad data then no one can profit that s great this system will work as a source for quality data and models people will contribute data to help improve the machine learning models they use in their daily life profit depends on the incentive mechanism im yes in deposit refund and take self assessment the contributors will not profit and should be able to claim back their own deposits in the prediction market based mechanism contributors can still get rewarded by the original provider of the bounty and test set learn more papers more details can be found in our initial paper decentralized collaborative ai on blockchain overview paper which describes the framework accepted to blockchain 2019 the ieee international conference on blockchain an analysis of several machine learning models with the self assessment incentive mechanism can be found in our second paper analysis of models for decentralized and collaborative ai on blockchain self assessment analysis paper which was accepted to the 2020 international conference on blockchain http blockchain1000 org 2020 contributing this project welcomes contributions and suggestions most contributions require you to agree to a contributor license agreement cla declaring that you have the right to and actually do grant us the rights to use your contribution for details visit https cla microsoft com when you submit a pull request a cla bot will automatically determine whether you need to provide a cla and decorate the pr appropriately e g label comment simply follow the instructions provided by the bot you will only need to do this once across all repos using our cla this project has adopted the microsoft open source code of conduct https opensource microsoft com codeofconduct for more information see the code of conduct faq https opensource microsoft com codeofconduct faq or contact opencode microsoft com mailto opencode microsoft com with any additional questions or comments demo folder demo simulation folder simulation demo https aka ms 0xdeca10b demo blog1 https aka ms 0xdeca10b blog1 overview paper https aka ms 0xdeca10b paper self assessment analysis paper https arxiv org abs 2009 06756 | blockchain ml ai economics machine-learning artificial-intelligence ethereum truffle prediction-mar prediction-market python node react smart-contracts | blockchain |
applied-ml | applied machine learning rstudio conf 2020 spiral calendar january 27 and 28 2020 br alarm clock 09 00 17 00 br hotel continental ballroom rooms 4 ballroom level br writing hand rstd io conf http rstd io conf br ledger part 1 https rstudio conf 2020 github io applied ml part 1 html 2 https rstudio conf 2020 github io applied ml part 2 html 3 https rstudio conf 2020 github io applied ml part 3 html 4 https rstudio conf 2020 github io applied ml part 4 html 5 https rstudio conf 2020 github io applied ml part 5 html 6 https rstudio conf 2020 github io applied ml part 6 html gitter https badges gitter im conf2020 applied ml community svg https gitter im conf2020 applied ml community utm source badge utm medium badge utm campaign pr badge img src https github com rstudio conf 2020 applied ml raw master images rotate gif width 400 align middle class center overview machine learning is the study and application of algorithms that learn from and make predictions on data from search results to self driving cars it has manifested itself in all areas of our lives and is one of the most exciting and fast growing fields of research in the world of data science this two day course will provide an overview of using r for supervised learning the session will step through the process of building visualizing testing and comparing models that are focused on prediction the goal of the course is to provide a thorough workflow in r that can be used with many different regression or classification techniques case studies on real data will be used to illustrate the functionality and several different predictive models are illustrated the course focuses on both low and high level approaches to modeling using the tidyverse and uses several types of models for illustration learning objectives attendees will be able to use the tidymodels packages to create tune fit visualize and assess models created for the purpose of prediction is this course for me this course requires basic familiarity with r and the tidyverse prework if you want to read up a bit about predictive modeling before the workshop check out chapter 1 https bookdown org max fes intro intro html and chapter 3 https bookdown org max fes review predictive modeling process html of feature engineering and selection https bookdown org max fes we will have rstudio server pro instances with all of the packages installed as well as the above github repository available if you would like to run r locally the installation instructions are r install packages c ameshousing c50 devtools discrim earth ggthemes glmnet see important note below klar lubridate modeldata party proc rpart stringr textfeatures tidymodels repos http cran rstudio com devtools install github c tidymodels tidymodels tidymodels tune tidymodels textrecipes koalaverse vip gadenbuie countdown important note a new version of glmnet was released on 2019 11 09 although it states that it depends on r 3 5 0 it may not install on r versions 3 6 0 we will be on site at least 30min before the workshop commences in case you need any help getting packages installed prior to this you can email max rstudio com with questions note we don t provide the rmd files for the slides mostly because they are complex and we don t support them however we do get requests for people who would like to use them as a template so we provide part 1 rmd if you want to use this format for your presentations schedule time activity 09 00 10 30 session 1 10 30 11 00 coffee break 11 00 12 30 session 2 12 30 13 30 lunch break 13 30 15 00 session 3 15 00 15 30 coffee break 15 30 17 00 session 4 instructors max kuhn and davis vaughan https i creativecommons org l by 4 0 88x31 png this work is licensed under a creative commons attribution 4 0 international license https creativecommons org licenses by 4 0 | tidymodels machine-learning regression classification text-analysis | ai |
Embedded-System-Design-Laboratory-Homework | embedded system stm32f7 disco lcd projects lab01 uart simply display text on lcd using the bsp board support library in lab01 uart drivers bsp change the text on user button pressed print something on uart and blink led at different frequency no touch screen envloed boring lcd displays and led blinking doesn t deserve a gif lab02 pong game a really simple pong game with only 4 possible angles speed adjustable img src docs pics lab02 preview gif alt drawing width 600 lab03 minesweeper view branch https github com qqq89513 embedded system design laboratory homework commits lab03 a minesweeper with custom mine counts and timer showing how much time you ve played gui is implemented with touchgfx img src docs pics lab03 preview gif alt drawing width 600 hw01 calc view branch https github com qqq89513 embedded system design laboratory homework commits hw01 this is a simple calculator with a user friendly gui on lcd and the input is acquired from the touch screen gui is implemented with touchgfx img src docs pics hw01 preview gif alt drawing width 600 | homework stm32 embeded-systems | os |
API_Design_Assignment | api design assignment part 2 michael dratch cs 680 8 20 23 summary an api design for a distributed polling application using the hateoas constraints | cloud |
|
holoclean | master build status https travis ci org holoclean holoclean svg branch master https travis ci org holoclean holoclean dev build status https travis ci org holoclean holoclean svg branch dev https travis ci org holoclean holoclean holoclean a machine learning system for data enrichment holoclean http www holoclean io is built on top of pytorch and postgresql holoclean is a statistical inference engine to impute clean and enrich data as a weakly supervised machine learning system holoclean leverages available quality rules value correlations reference data and multiple other signals to build a probabilistic model that accurately captures the data generation process and uses the model in a variety of data curation tasks holoclean allows data practitioners and scientists to save the enormous time they spend in building piecemeal cleaning solutions and instead effectively communicate their domain knowledge in a declarative way to enable accurate analytics predictions and insights form noisy incomplete and erroneous data installation holoclean was tested on python versions 2 7 3 6 and 3 7 it requires postgresql version 9 4 or higher 1 install and configure postgresql we describe how to install postgresql and configure it for holoclean creating a database a user and setting the required permissions option 1 native installation of postgresql a native installation of postgresql runs faster than docker containers we explain how to install postgresql then how to configure it for holoclean use a installing postgresql on ubuntu install postgresql by running apt get install postgresql postgresql contrib for macos you can find the installation instructions on https www postgresql org download macosx https www postgresql org download macosx b setting up postgresql for holoclean by default holoclean needs a database holo and a user holocleanuser with permissions on it 1 start the postgresql psql console from the terminal using psql user username you can omit user username to use current user 2 create a database holo and user holocleanuser sql create database holo create user holocleanuser alter user holocleanuser with password abcd1234 grant all privileges on database holo to holocleanuser c holo alter schema public owner to holocleanuser you can connect to the holo database from the postgresql psql console by running psql u holocleanuser w holo holoclean currently populates the database holo with auxiliary and meta tables to clear the database simply connect as a root user or as holocleanuser and run sql drop database holo create database holo option 2 using docker if you are familiar with docker an easy way to start using holoclean is to start a postgresql docker container to start a postgresql docker container run the following command bash docker run name pghc e postgres db holo e postgres user holocleanuser e postgres password abcd1234 p 5432 5432 d postgres 11 which starts a backend server and creates a database with the required permissions you can then use docker start pghc and docker stop pghc to start stop the container note the port number which may conflict with existing postgresql servers read more about this docker image here https hub docker com postgres 2 setting up holoclean holoclean runs on python 2 7 or 3 6 we recommend running it from within a virtual environment creating a virtual environment for holoclean option 1 conda virtual environment first download anaconda not miniconda from this link https www anaconda com download follow the steps for your os and framework second create a conda environment python 2 7 or 3 6 for example to create a python 3 6 conda environment run bash conda create n hc36 python 3 6 upon starting restarting your terminal session you will need to activate your conda environment by running bash conda activate hc36 option 2 set up a virtual environment using pip and virtualenv if you are familiar with virtualenv you can use it to create a virtual environment for python 3 6 create a new environment with your preferred virtualenv wrapper for example virtualenvwrapper https virtualenvwrapper readthedocs io en latest bourne shells virtualfish https virtualfish readthedocs io en latest fish shell either follow instructions here https virtualenv pypa io en stable installation or install via pip bash pip install virtualenv then create a virtualenv environment by creating a new directory for a python 3 6 virtualenv environment bash mkdir p hc36 virtualenv python python3 6 hc36 where python3 6 is a valid reference to a python 3 6 executable activate the environment bash source hc36 bin activate install the required python packages note make sure that the environment is activated throughout the installation process when you are done deactivate it using conda deactivate source deactivate or deactivate depending on your version in the project root directory run the following to install the required packages note that this commands installs the packages within the activated virtual environment bash pip install r requirements txt note for macos users you may need to install xcode developer tools using xcode select install running holoclean see the code in examples holoclean repair example py for a documented usage of holoclean in order to run the example script run the following bash cd examples start example sh notice that the script sets up the python path environment to run holoclean | machine-learning inference-engine pytorch data-science data-enrichment | ai |
Signature_Stay | signature stay a website based on php in backend with html css and javascript for frontend and mysql as database used xamp for basic setup complete software engineering based project description the resort property management system is a system with all the features and functions required for effectively and efficiently managing the chain of resorts this system will have a user friendly and attractive interface and will be presented in a simple and easy way this system will give better options for the problem of handling large scale physical file systems for the manual errors occurring in calculations this system will ease the job of employees make the transactions error free and will be able to handle customers in a more convenient and quick manner 2 2 product functions 1 make reservations 2 search rooms available room details 3 manage restaurant spa and other activities 4 add payment 5 issue bills 6 registration login and logout 7 manage guest add update guest 8 manage room details add update delete 9 manage staff add update delete view 10 manage inventory add edit delete 11 set rates 12 retrieve reports staff payment income 13 manage users add update delete 14 taking backups 15 e mail notifications 16 guest experience management 2 4 operating environment 1 operating system windows 7 ubuntu mac 2 front end html css javascript bootstrap 3 backend mysql 4 server xampp 5 browser any browser 6 monitor with minimum resolution of 1024x768 keyboard and mouse 7 a laser printer will need to be used to print these reports and notes | server |
|
FreeRTOS-Cellular-Interface-Community-Supported-Ports | freertos cellular interface community supported ports this repository contains the cellular module ports supported by freertos community members we welcome your contribution to expand the catalog of modems how to contribute a cellular module port 1 fork the freertos freertos cellular interface repository 2 write code for a new cellular module for details see porting the cellular interface library to another modem https www freertos org cellular porting guide html 3 create a new folder for the new cellular module in the root folder 4 create a readme md file and place it at the top level of the new module port folder ensure the readme md describes the following 1 hardware information give the reader a cellular module introduction and the steps to setup the hardware environment 2 objectives tell the reader a high level summary of what steps they need to take and how to use this port 3 relevant information tell the reader what they should read or take care if they would like to dive deeper 5 optional if you are contributing a new cellular module with supporting demos create a new demo in the cellular demo repository and update the readme md in the root folder 6 raise a pull request pr maintainers and other contirbutors will review your pr please be proactive in the conversation and make the requested changes as soon as possible the code will be merged after the pr is approved license the freertos cellular interface library is distributed under mit open source license the code in this repository is licensed under the mit license see the license https github com freertos freertos kernel community supported ports blob main license file | os |
|
complaint-content-classification-nlp | overview the consumer financial protection bureau cfpb is a federal u s agency that acts as a mediator when disputes arise between financial institutions and consumers via a web form consumers can send the agency a narrative of their dispute this project developed natural language processing nlp machine learning models to process the narratives text and categorize the complaints into one of five classes business case an nlp model would make the classification of complaints and their routing to the appropriate teams more efficient than manually tagged complaints about the data a data file was downloaded directly from the cfpb website https www consumerfinance gov data research consumer complaints for training and testing the model it included one year s worth of data march 2020 to march 2021 later in the project i used an api to download up to the minute data to verify the model s performance each submission was tagged with one of nine financial product classes because of similarities between certain classes as well some class imbalances i consolidated them into five classes 1 credit reporting 2 debt collection 3 mortgages and loans includes car loans payday loans student loans etc 4 credit cards 5 retail banking includes checking savings accounts as well as money transfers venmo etc after data cleaning the dataset consisted of around 162 400 consumer submissions containing narratives the dataset was still imbalanced with 56 in the credit reporting class and the remainder roughly equally distributed between 8 and 14 among the remaining classes class imbalances https github com halpert3 flatiron capstone project blob main notebooks exported images class imbalances png process exploratory data analysis instantiated various vectorizing techniques tf idf and countvectorizer with both unigrams and bigrams to explore word frequency lemmatized the corpora since words such as payment and payments appeared as separate entries using the lemmatized words made pie charts of the top word per class how it compared to other classes for example card was the top word in the credit card class appearing in 67 6 of associated narratives that word however appeared only in 1 7 of mortgages and loan narratives i assumed word frequency imbalances like this one would be useful for the model to categorize the narratives credit card pie https github com halpert3 flatiron capstone project blob main notebooks exported images credit 20card 20pie png data preparation created a dataframe with two columns a product class and narrative per line the narrative string consisted of space separated lemmatized words with stopwords such as the and if removed baseline modeling prep replaced class names with numbers performed train test split vectorized data with tf idf created function to score baseline models ran six baseline models with mostly default parameters multinomial nb random forest decision tree knn gradient boosting xgboost when scoring the models i relied mostly on the macro recall scores since the recall metric accounts for false negatives and doesn t favor an imbalanced class i also took into account the difference between the recall scores of training and test sets as closer they were the better the model was at not overfitting the best baseline models were multinomial nb decision tree and gradient boosting model refinement i use grid search and implemented different ways of vectorizing tf idf and countvectorizer with different maximum features and various parameters for the three modeling techniques multinomial nb did the best with a recall of 86 much improved from the baseline of 58 i experimented with using smote to correct for class imbalances but the model actually did better without it with gradient boosting i found parameters that yielded a recall score not far behind multinomial nb at 83 both models had only a small problem with overfitting a 3 discrepancy between training and test sets downloaded new data with an api as mentioned the cfpb also allows the downloading of data via an api i developed functions to immediately process up to 1 000 lines of downloaded data into a useable form consolidating classes lemmatizing words removing stopwords etc and then to vectorize and run the data through the multinomial nb model i achieved classification results similar to the original testing data for the model post modeling eda even though i considered multinomial nb to be my winning model i used the not far behind gradient boosting model to check for feature importances since it s an intrinsic capability of gradient boosting this chart shows the ten most important features words for classifying texts and how prevalent they were in each class importance by class https github com halpert3 flatiron capstone project blob main notebooks exported images importance 20by 20class png clearly some features were far more prevalent in particular classes such as card in credit cards experian in credit reporting and bank in retail banking other words such as account and credit had more mixed frequencies across the classes and i assume the model used these features in conjunction with other features to classify narratives next steps improve business case since consumers classified their own complaints ask cfpb employees to double check narratives classes particularly those that the model misclassified i d seek to understand how the cfpb internally routes and processes consumer complaints and develop further modeling capabilities for sub product issue and sub issue refine models use more than one year s worth of data and further refine parameters create latent dirichlet allocation lda model to develop new classification categories and learn if they might be useful to cfpb | natural-language-processing lemmatization vectorization machine-learning naive-bayes gradient-boosting | ai |
pinoco | build status https travis ci org tanakahisateru pinoco svg branch master https travis ci org tanakahisateru pinoco pinoco is a web development framework using php and mainly phptal has various usages even in cases oop based frameworks don t fit smaller code base and lightweight footprint explicit and strict view logic isolation each stored into isolated folders designer friendly content management workflow similar to that of a static site consistency of file paths and uri seamless design and preview under a local file system layout macro using tal like dreamweaver library but scm friendly unlike dreamweaver extreme transparency of design files and source code easy to introduce to plain php users procedural programmers easy to apply php to a static site high flexibility and less restriction freedom would be more powerful than convention or configuration small single purpose features lower training costs no restrictions upon using your own libraries no strict need of object oriented programing oop optional pinoco replaces the need for a web framework because it works automatically between the request and response structurally you may think that a framework should exist as a full stack and force development into formal style but pinoco is different pinoco has no database support and no scaffolding tools so you can assume it as only an environment but this environment will be a good fit for many web sites you can start doing pinoco application development on a static site which has been built with html only then you can manage your content in the same way as you would static site the name of pinoco comes from pinocchio he was a wooden puppet but the wood he was made of was enchanted by magic he could then move by his own free will just like in this story pinoco will let the file tree in your static site act autonomously though there is no relation to the character pinoko in black jack if the web creators were to black jack pinoco pinoko would be great but at best a support character pinoco also stands for php is not for object coders only i hope php aiming to oop world is also to be more easy to use for designers or script hackers htdocs hello html html p hello span tal content this message default world span p app hooks hello html php php php this message pinoco | front_end |
|
cleanlook | cleanlook an example ios application constructed utilizing the perfect quick engineering clean quick is uncle sway s spotless engineering applied to ios and macintosh projects for more information click a href https maccrackerz com rewasd crack here a | os |
|
Altschool-Cloud-Exercises | altschool cloud exercises this is where all my exercises from altschool cloud engineering is stored | cloud |
|
pycaret | div align center img src docs images logo png alt drawing width 200 an open source low code machine learning library in python pycaret 3 0 is now available pip install upgrade pycaret br p align center h3 a href https pycaret gitbook io docs a a href https pycaret gitbook io docs get started tutorials tutorials a a href https pycaret gitbook io docs learn pycaret official blog blog a a href https www linkedin com company pycaret linkedin a a href https www youtube com channel ucxa1ytyj9beeo50lxyi b3g youtube a a href https join slack com t pycaret shared invite zt row9phbm bojdevpyngf7 nxnbp307w slack a h3 p overview ci cd pytest on push https github com pycaret pycaret workflows pytest 20on 20push badge svg documentation status https readthedocs org projects pip badge version stable http pip pypa io en stable badge stable code pypi https img shields io pypi v pycaret color orange https pypi org project pycaret python versions https img shields io badge python 3 8 20 7c 203 9 20 7c 203 10 blue https badge fury io py pycaret black https img shields io badge code 20style black 000000 svg https github com psf black downloads downloads https static pepy tech personalized badge pycaret period week units international system left color grey right color blue left text weekly 20 pypi https pepy tech project pycaret downloads https static pepy tech personalized badge pycaret period month units international system left color grey right color blue left text monthly 20 pypi https pepy tech project pycaret downloads https static pepy tech personalized badge pycaret period total units international system left color grey right color blue left text cumulative 20 pypi https pepy tech project pycaret license license https img shields io pypi l ansicolortags svg https img shields io pypi l ansicolortags svg community slack https img shields io badge slack chat green svg logo slack https join slack com t pycaret shared invite zt 20gl4zb8k l zqdyi9ltrv4dwxyple7a alt text docs images quick start gif div align left welcome to pycaret pycaret is an open source low code machine learning library in python that automates machine learning workflows it is an end to end machine learning and model management tool that speeds up the experiment cycle exponentially and makes you more productive in comparison with the other open source machine learning libraries pycaret is an alternate low code library that can be used to replace hundreds of lines of code with few lines only this makes experiments exponentially fast and efficient pycaret is essentially a python wrapper around several machine learning libraries and frameworks such as scikit learn xgboost lightgbm catboost optuna hyperopt ray and few more the design and simplicity of pycaret are inspired by the emerging role of citizen data scientists a term first used by gartner citizen data scientists are power users who can perform both simple and moderately sophisticated analytical tasks that would previously have required more technical expertise pycaret was inspired by the caret library in r programming language installation option 1 install via pypi pycaret is tested and supported on 64 bit systems with python 3 7 3 8 3 9 and 3 10 ubuntu 16 04 or later windows 7 or later you can install pycaret with python s pip package manager python install pycaret pip install pycaret pycaret s default installation will not install all the optional dependencies automatically depending on the use case you may be interested in one or more extras python install analysis extras pip install pycaret analysis models extras pip install pycaret models install tuner extras pip install pycaret tuner install mlops extras pip install pycaret mlops install parallel extras pip install pycaret parallel install test extras pip install pycaret test install multiple extras together pip install pycaret analysis models check out all optional dependencies https github com pycaret pycaret blob master requirements optional txt if you want to install everything including all the optional dependencies python install full version pip install pycaret full option 2 build from source install the development version of the library directly from the source the api may be unstable it is not recommended for production use python pip install git https github com pycaret pycaret git master upgrade option 3 docker docker creates virtual environments with containers that keep a pycaret installation separate from the rest of the system pycaret docker comes pre installed with a jupyter notebook it can share resources with its host machine access directories use the gpu connect to the internet etc the pycaret docker images are always tested for the latest major releases python default version docker run p 8888 8888 pycaret slim full version docker run p 8888 8888 pycaret full quickstart 1 functional api python classification functional api example loading sample dataset from pycaret datasets import get data data get data juice init setup from pycaret classification import s setup data target purchase session id 123 model training and selection best compare models evaluate trained model evaluate model best predict on hold out test set pred holdout predict model best predict on new data new data data copy drop purchase axis 1 predictions predict model best data new data save model save model best best pipeline 2 oop api python classification oop api example loading sample dataset from pycaret datasets import get data data get data juice init setup from pycaret classification import classificationexperiment s classificationexperiment s setup data target purchase session id 123 model training and selection best s compare models evaluate trained model s evaluate model best predict on hold out test set pred holdout s predict model best predict on new data new data data copy drop purchase axis 1 predictions s predict model best data new data save model s save model best best pipeline modules div align center classification functional api oop api docs images classification functional png docs images classification oop png regression functional api oop api docs images regression functional png docs images regression oop png time series functional api oop api docs images time series functional png docs images time series oop png clustering functional api oop api docs images clustering functional png docs images clustering oop png anomaly detection functional api oop api docs images anomaly functional png docs images anomaly oop png div align left who should use pycaret pycaret is an open source library that anybody can use in our view the ideal target audience of pycaret is br experienced data scientists who want to increase productivity citizen data scientists who prefer a low code machine learning solution data science professionals who want to build rapid prototypes data science and machine learning students and enthusiasts training on gpus to train models on the gpu simply pass use gpu true in the setup function there is no change in the use of the api however in some cases additional libraries have to be installed the following models can be trained on gpus extreme gradient boosting catboost light gradient boosting machine requires gpu installation https lightgbm readthedocs io en latest gpu tutorial html logistic regression ridge classifier random forest k neighbors classifier k neighbors regressor support vector machine linear regression ridge regression lasso regression requires cuml 0 15 https github com rapidsai cuml pycaret intel sklearnex support you can apply intel optimizations https github com intel scikit learn intelex for machine learning algorithms and speed up your workflow to train models with intel optimizations use sklearnex engine there is no change in the use of the api however installation of intel sklearnex is required python pip install scikit learn intelex contributors a href https github com pycaret pycaret graphs contributors img src https contrib rocks image repo pycaret pycaret width 600 a license pycaret is completely free and open source and licensed under the mit https github com pycaret pycaret blob master license license more information important links description star tutorials tutorials developed and maintained by core developers clipboard example notebooks example notebooks created by community orange book blog official blog by creator of pycaret books documentation api docs tv videos video resources cheat sheet community cheat sheet loudspeaker discussions community discussion board on github hammer and wrench release notes release notes tutorials https pycaret gitbook io docs get started tutorials example notebooks https github com pycaret examples blog https pycaret gitbook io docs learn pycaret official blog documentation https pycaret gitbook io docs videos https pycaret gitbook io docs learn pycaret videos cheat sheet https pycaret gitbook io docs learn pycaret cheat sheet discussions https github com pycaret pycaret discussions release notes https github com pycaret pycaret releases | data-science citizen-data-scientists python machine-learning pycaret ml gpu time-series regression classification anomaly-detection clustering | ai |
phosphorusfive | phosphorus five a rad web app framework phosphorus five is a net based rad web application development framework for creating rich and secure ajax web apps it allows you to orchestrate your apps together almost as if they were made out of lego bricks p align center a href https www youtube com watch v blll2wx0yfo img alt a one minute introduction video about phosphorus five title a one minute introduction video about phosphorus five src https phosphorusfive files wordpress com 2018 03 screenshot youtube infomercial png a p out of the box phosphorus five contains the following components hyper ide a web based ide with support for 100 programming languages camphora five a crud app generator allowing you to create rich crud apps in seconds hypereval a hyperlambda web based powershell executor and a snippets database plus more installation you can install phosphorus five on a production ubuntu linux server with an automated script taking care of all dependencies or you can download its code version and play around with it locally on your windows mac or linux machine if you choose the latter you will have to make sure you have mysql server installed somehow https dev mysql com downloads mysql on your computer in addition you need visual studio xamarin https www visualstudio com vs community or mono develop https www monodevelop com to use the source code version download and install phosphorus five here https github com polterguy phosphorusfive releases both binary release and source code notice source code if you download the source version make sure you edit the core p5 webapp web config file such that it contains the correct connection string for your mysql installation this normally implies simply adding your password to the existing connection string phosphorus five will run without a valid mysql database connection string however some of its apps will not function at all or at their peak performance notice binaries the automatic linux script has only been tested on ubuntu server version 16 04 4 but might also work on other versions this script will also sigificantly increase the security of your box in addition to patching your box updating it and making sure it s using the latest stable versions of all software it installs such as for instance mono version 5 10 the script expects a vanilla linux ubuntu server and will remove any existing websites you have configured for your apache folder notice source code version on windows if you use the source code version on windows in visual studio make sure you turn off browser sync in visual studio performance on average you can expect a phosphorus five web app to perform at least 10x as fast compared to literally anything else out there below is a youtube video demonstrating the performance of hyper ide versus visual studio community edition on a macbook air to edit a simple css file in visual studio requires at least 3 times as much time due to visual studio being slow in the video below i start hyper ide up after having started visual studio and i am done with my work in hyper ide before visual studio have even loaded p align center a href https www youtube com watch v c97tkg6dgoy img alt a one minute performance demonstration of hyper ide versus visual studio title a one minute performance demonstration of hyper ide versus visual studio src https phosphorusfive files wordpress com 2018 04 how fast is hyper ide compared to visual studio png a p productivity phosphorus five is created around the axiom that you should become at least 10x more productive for some tasks your productivity will soar to extreme heights such as i demonstrate in the video below where i create a rich database crud app in 2 minutes using the integrated camphora five crud app generator p align center a href https www youtube com watch v kms tltf og img alt in this video i am creating an address book type of web app in 5 seconds title in this video i am creating an address book type of web app in 5 seconds src https phosphorusfive files wordpress com 2018 04 camphora five address book youtube video png a p msdn magazine articles about phosphorus five 1 active events one design pattern instead of a dozen https msdn microsoft com en us magazine mt795187 2 make c more dynamic with hyperlambda https msdn microsoft com en us magazine mt809119 3 could managed ajax put your web apps in the fast lane https msdn microsoft com en us magazine mt826343 dzone articles about phosphorus five 1 creating an operating system with 5 lines of code https dzone com articles creating an operating system with 5 lines of code 2 creating a lambda web service https dzone com articles creating a lambda web service 3 creating an ajax mysql datagrid with 7 lines of code https dzone com articles creating an ajax mysql datagrid with 7 lines of co 4 3 days coding challenge creating mysql admin for asp net https dzone com articles 3 days coding challenge creating mysql admin for a 5 creating documentation for your software in zero seconds https dzone com articles creating documentation for your software in zero s license phosphorus five is free and open source software and distributed under the terms of the gnu public license however proprietary enabling licenses are available for a fee faq faq faq md download download and install phosphorus five here https github com polterguy phosphorusfive releases | web phosphorus-five hyperlambda web-os operating-system application-framework asp-net csharp dotnet | front_end |
Stacker | img src http f cl ly items 0z2m1e2n1b2j2a0s3147 stacker png alt drawing width 400px stacker is an ios view controller to kickstart development of hybrid native web ios apps stacker was built to keep your navigation native while the rest of your app is driven by webviews using stacker s special urls http www lokimeyburg com stacker docs url structure built for iphones running ios 7 ios 8 ipad support coming soon features build url driven ios apps http www lokimeyburg com stacker docs url structure send messages between obj c and javascript custom navigation button handlers view external websites in a separate modal view theming options pull to refresh on all pages error pages app version and device information sent in http headers getting started documentation view the official stacker documentation http www lokimeyburg com stacker docs getting started to get started creating a stackercontroller if you re using cocoapods http cocoapods org in your podfile add pod lmstacker git https github com lokimeyburg stacker git and then run pod install now in your xcode project import stacker import lmstackercontroller h create a controller and point it to your web app s url lmstackercontroller mycontroller lmstackercontroller alloc initwithurl http localhost 3000 x page title home how to update the documentation the documentation site http www lokimeyburg com stacker is being created with github pages https pages github com and so it can be found by checking out the gh pages https github com lokimeyburg stacker tree gh pages branch the actual documentation is in the docs folder if you find a spelling mistake or any errors in the documentation please submit a pull request to the branch | front_end |
|
blockchain-programming-golang | blockchain programming with go living document a self paced study guide for learning how to program blockchains in go prerequisites this blockchain programming course assumes that you re already familiar with bitcoin https bitcoin org and know some go https golang org bitcoin white paper bitcoin a peer to peer electonic cash system https bitcoin org bitcoin pdf https bitcoin org bitcoin pdf bitcoin for developers the defacto source of truth documentation https bitcoin org en developer guide block chain https bitcoin org en developer guide block chain bitcoin lightning engineering main site http lightning network how it works http lightning network how it works developers documentation https lightning engineering https lightning engineering exposure to the go golang programming language to the minimum you have attempted and finished a tour of go https tour golang org welcome 1 ideally you also have been exposed to effective go https golang org doc effective go html free resources code your own blockchain in less than 200 lines of go coral health https medium com mycoralhealth code your own blockchain in less than 200 lines of go e296282bcffc building a blockchain in golang youtube series https youtu be mylht9bb6oe bitcoin development with go https gobitcoinbook org https gobitcoinbook org cryptocurrency cabal http bitcoin class org http bitcoin class org any videos by todd mcleod https www youtube com user toddmcleod https www youtube com user toddmcleod gophercises https gophercises com https gophercises com building a blockchain in go blog series https jeiwan cc posts building blockchain in go part 1 https jeiwan cc posts building blockchain in go part 1 create and sign bitcoin transactions with golang https www thepolyglotdeveloper com 2018 03 create sign bitcoin transactions golang https www thepolyglotdeveloper com 2018 03 create sign bitcoin transactions golang create a bitcoin hardware wallet with golang and a raspberry pi zero https www thepolyglotdeveloper com 2018 03 create bitcoin hardware wallet golang raspberry pi zero https www thepolyglotdeveloper com 2018 03 create bitcoin hardware wallet golang raspberry pi zero building a crypto exchange mostly in go https around25 com blog building a trading engine for a crypto exchange https around25 com blog building a trading engine for a crypto exchange paid resources greater commons greater commons https greatercommons com learn golang applied go https appliedgo com https appliedgo com calhoun io https www calhoun io courses https www calhoun io courses stephen grider udemy course https www udemy com go the complete developers guide https www udemy com go the complete developers guide find a job as a go golang developer golang crypto https golangcrypto com https golangcrypto com crypto jobs list https cryptojobslist com https cryptojobslist com we love golang https www welovegolang com https www welovegolang com not go golang specific but still great programming blockchain https programmingblockchain com https programmingblockchain com bitcoin dev network https bitcoindev network https bitcoindev network justin moon on youtube https www youtube com channel uclp4oswuhyszz3zrvbiodjg videos https www youtube com channel uclp4oswuhyszz3zrvbiodjg videos james lopp s bitcoin resources list https lopp net bitcoin html 9https lopp net bitcoin html ren pickhardt https www youtube com user renepickhardt https www youtube com user renepickhardts attend the programmable money workshops in san francisco ca programmable money https www meetup com programmable money https www meetup com programmable money copyright 2019 fod diop mit license | blockchain |
|
backpack | backpack design system backpack is a collection of design resources reusable components and guidelines for creating skyscanner s products npm version https badge fury io js skyscanner 2fbackpack web svg https badge fury io js skyscanner 2fbackpack web build status https github com skyscanner backpack workflows backpack 20ci badge svg https github com skyscanner backpack actions quick links documentation https www skyscanner design changelog https github com skyscanner backpack releases usage installation sh npm install save skyscanner backpack web contributing to contribute please see contributing md contributing md list of packages bpk animate height packages bpk animate height bpk component accordion packages bpk component accordion bpk component autosuggest packages bpk component autosuggest bpk component badge packages bpk component badge bpk component banner alert packages bpk component banner alert bpk component barchart packages bpk component barchart bpk component blockquote packages bpk component blockquote bpk component breadcrumb packages bpk component breadcrumb bpk component breakpoint packages bpk component breakpoint bpk component button packages bpk component button bpk component calendar packages bpk component calendar bpk component card packages bpk component card bpk component checkbox packages bpk component checkbox bpk component chip packages bpk component chip bpk component close button packages bpk component close button bpk component code packages bpk component code bpk component datatable packages bpk component datatable bpk component datepicker packages bpk component datepicker bpk component description list packages bpk component description list bpk component dialog packages bpk component dialog bpk component drawer packages bpk component drawer bpk component fieldset packages bpk component fieldset bpk component form validation packages bpk component form validation bpk component graphic promotion packages bpk component graphic promotion bpk component grid toggle packages bpk component grid toggle bpk component horizontal nav packages bpk component horizontal nav bpk component icon packages bpk component icon bpk component image packages bpk component image bpk component infinite scroll packages bpk component infinite scroll bpk component input packages bpk component input bpk component label packages bpk component label bpk component link packages bpk component link bpk component list packages bpk component list bpk component loading button packages bpk component loading button bpk component mobile scroll container packages bpk component mobile scroll container bpk component modal packages bpk component modal bpk component navigation bar packages bpk component navigation bar bpk component nudger packages bpk component nudger bpk component page indicator packages bpk component page indicator bpk component pagination packages bpk component pagination bpk component panel packages bpk component panel bpk component phone input packages bpk component phone input bpk component popover packages bpk component popover bpk component progress packages bpk component progress bpk component radio packages bpk component radio bpk component rtl toggle packages bpk component rtl toggle bpk component card button packages bpk component card button bpk component section header packages bpk component section header bpk component section list packages bpk component section list bpk component select packages bpk component select bpk component slider packages bpk component slider bpk component spinner packages bpk component spinner bpk component star rating packages bpk component star rating bpk component switch packages bpk component switch bpk component table packages bpk component table bpk component text packages bpk component text bpk component textarea packages bpk component textarea bpk theming packages bpk theming bpk component ticket packages bpk component ticket bpk component tooltip packages bpk component tooltip bpk react utils packages bpk react utils bpk mixins packages bpk mixins bpk stylesheets packages bpk stylesheets list of external packages these components are part of backpack and are utilised by the components but live in the foundations repository these are installed separately and installation information can be found in the backpack foundations repo https github com skyscanner backpack foundations component version skyscanner bpk svgs https github com skyscanner backpack foundations tree main packages bpk svgs npm version https badge fury io js 40skyscanner 2fbpk svgs svg https badge fury io js 40skyscanner 2fbpk svgs skyscanner bpk foundations web https github com skyscanner backpack foundations tree main packages bpk foundations web npm version https badge fury io js 40skyscanner 2fbpk foundations web svg https badge fury io js 40skyscanner 2fbpk foundations web | design-system react nodejs component-library backpack skyscanner | os |
nypl-design-system | reservoir design system build status https github com nypl nypl design system actions workflows ci yml badge svg branch development npm version https badge fury io js 40nypl 2fdesign system react components svg https badge fury io js 40nypl 2fdesign system react components the reservoir design system ds is nypl s open source extensible react library for products and experiences with the accessibility as its foundation it ships functional stateless components with consistent nypl styling you can learn more about the project and its goals on the project s wiki https github com nypl nypl design system wiki storybook documentation v2 production deployed to github pages https nypl github io nypl design system reservoir v2 path docs welcome docs development qa deployed to vercel https nypl design system vercel app path docs welcome docs v1 production deployed to vercel https nypl design system git reservoir v173 nypl vercel app table of contents 1 using the design system in your product using the design system in your product 2 using chakra ui components using chakra ui components 3 storybook storybook 4 accessibility accessibility 5 contributing quickstart contributing quickstart 6 local app development local app development 7 typescript usage typescript usage 8 unit testing unit testing 9 cdn cdn using the design system in your product the reservoir design system package is distributed on npm 1 install the package in your app s directory sh npm install nypl design system react components 2 import the dsprovider component in order to use ds components in a consuming application there is a necessary step that must be done for component styles to properly render consuming applications need to wrap all the ds components with a simple provider component fortunately this only needs to be done once at the top level of the consuming application once the following is completed ds components that internally use chakra ui will render styles properly jsx your main application file import dsprovider from nypl design system react components const applicationcontainer props return dsprovider div classname my app children div dsprovider 3 import the minified styles nypl design system react components dist styles css file in your app this file contains normalized reset css rules system fonts the react datepicker s styles breakpoint css variables and overriding styles for a few components importing this file varies on the stack of the application for nextjs apps the file can be imported in the app tsx file tsx app tsx import nypl design system react components dist styles css import dsprovider templateappcontainer from nypl design system react components otherwise it can be imported in the app s main scss file scss main scss for example import nypl design system react components dist styles css body note using tilde to import scss css is no longer a best practice for apps using recent versions of webpack or parcel scss no longer a best practice import nypl design system react components dist styles css for apps using parcel prepend the string import with npm such as scss import npm nypl design system react components dist styles css 4 use ds components consult storybook for the list of available components and props that they require here s an example with the link component jsx import link from nypl design system react components function newcomponent props return link href https www hathitrust org hathitrust link sometimes you may have conflicts perhaps with react router in that case you can alias your imports jsx import as ds from nypl design system react components import link from react router function newcomponent props return ds link link to license public domain link ds link using chakra ui components the chakra ui component library has been integrated into the reservoir design system we are still progressing towards using chakra components and patterns to build ds components and therefore documentation and features are expected to change while the implementation details of ds components will use chakra the ds package itself will export some chakra components the list of re exported chakra components can be found in the main index ts src index ts file they include box center circle grid griditem hstack square stack vstack find more information about the design system s internal use of chakra to create and refactor components in the storybook documentation page the following two links have the same information but in different formats for your reading preference mdx format src docs chakra stories mdx storybook page https nypl github io nypl design system reservoir v2 path docs chakra ui docs chakra was integrated into the design system in version 0 25 0 for those looking to update to a version greater than or equal 0 25 0 check out our chakra migration guide chakra migration guide md storybook the reservoir design system leverages storybook to document all the react components and style guidelines the storybook documentation for version 2 x can be found on github pages https nypl github io nypl design system reservoir v2 path docs welcome docs for your convenience the reservoir design system components have been organized into logical categories based on both form and function please refer to the components section in the storybook sidebar documentation instances there are currently three main instances of the reservoir design system storybook documentation there are also preview sites that are used to quickly and easily view pull request changes production the production storybook documentation for ds version 2 x is deployed to github pages https nypl github io nypl design system reservoir v2 path docs welcome docs this is the main instance we use to share the latest stable release of the reservoir design system this documentation site is deployed through github actions github workflows gh pages yml only on merges to the release and gh pages branches as of july 2021 the github pages production site gets deployed every two weeks on the same schedule as npm releases development the development storybook documentation is deployed to vercel https nypl design system vercel app path docs welcome docs this development site has all the working updates that get merged to the development branch this means that this site is constantly being updated as pull requests are being merged in this site is used to see the lastest changes during a working sprint before a production release is made version 1 x the storybook documentation for ds version 1 x is deployed to vercel https nypl design system git reservoir v173 nypl vercel app if you are using a ds version less than 2 0 this is the storybook documentation you should be referencing while the ds team will continue to support version 1 x we will not be adding new features or components to this version we highly recommend updating to version 2 x for design update and bug fixes preview sites preview storybook documentation sites are deployed to vercel on every commit push to every branch in this repository they follow a pattern such as nypl design system hash nypl vercel app where hash is a random hash created by vercel this means that the url varies and that those instances are eventually shut off they are not meant to be used as long term sites but rather for reviewing working changes within the team react component documentation when actively developing components or fixing bugs make sure that the related stories are created or updated this means updating the respective component name stories mdx file for information on how to write stories check out the anatomy of a story https github com nypl nypl design system wiki anatomy of a story wiki page for stand alone document pages in storybook you need to 1 create the page name stories mdx file in src docs 2 add the file reference to the storybook main cjs file in the stories array react component versions to help consuming application developers understand which version of the ds is required for a specific component each component story page includes the following when a component was added to the ds minimum version of the ds required for the latest version of a component example component version table component version ds version added 0 20 1 latest 0 23 2 static build make sure not to commit the directory created from the following process there should be no need to run the static storybook instance while actively developing it s used exclusively for building out the gh pages environment and deploying it to github pages https nypl github io nypl design system reservoir v2 path docs welcome docs in the event that you do run the static storybook npm script run sh npm run build storybook v2 you can then view reservoir v2 index html in your browser make sure not to commit this directory accessibility development and storybook the reservoir design system is built with accessibility in mind by using chakra ui as our foundational base the custom ds components built with chakra have accessibility concerns already implemented on top of built in accessible elements ds components internally work to link labels with input elements to add correct aria attributes to visually hide text but still associate it with the correct element for titles and descriptions and much more we make use of eslint plugin jsx a11y for finding accessibility errors through linting and through ide environments jest axe for running axe core https github com dequelabs axe core on every component s unit test file this is part of the automated tests that run in github actions through the npm test command storybook addon a11y for real time accessibility testing in the browser through storybook every component has a tab that displays violations passes and incomplete checks performed by axe core if applicable ds components have section s on accessibility in their storybook documentation for example in the slider s storybook file src components slider slider stories mdx there are two accessibility sections for each of the two slider types single and range this gives an explanation on additional changes we made to make the combination of elements in the slider component accessible product requirements the reservoir design system provides accessible stories but real data can necessitate additional accessibility requirements beyond what we re committed to in our generic extensible components to ensure your products final result is accessible please adhere to the accessibility requirements put together by nypl s accessibility coordinator on metronome http themetronome co nypl s metronome instance is currently password protected for access to metronome please contact nypl s ux team or design system team contributing quickstart follow these steps to setup a local installation of the project 1 clone the repo sh git clone https github com nypl nypl design system git 2 install all the dependencies sh npm install 3 run the storybook instance and view it at http localhost 6006 sh npm run storybook you can now edit styles or templates in the src directory and they will automatically re build and update in the storybook instance adding new stories or changing story names will require a page refresh information about active maintainers how we run reviews and more can be found in our wiki page for contributing to the design system https github com nypl nypl design system wiki contributing to the react library follow the contribution document github contributing md to follow git branching convetions component creation and update guidelines testing methodoly and documentation guidelines node version we recommend using node version 16 x the github actions for linting automated testing deploying to github pages and releasing to npm are all running on node 16 x if you are using nvm the local nvmrc file using 16 x can be use to set your local node version with the nvm use command make sure your machine has node version 16 x installed through nvm already git branch workflow there are currently two main branches for the ds development is the main and default branch for the ds all new feature and bug fix pull requests should be made against this branch release is the branch used to deploy the static storybook instance to github pages the ds production storybook instance when a new version of the ds is ready for release the development branch is merged into the release branch through a pull request once merged github actions will run to deploy the static storybook as well as publish the new version to npm here is a pull request https github com nypl nypl design system pull 1249 that follows the convention outlined in how to run a release https github com nypl nypl design system wiki how to run a release when working on a new feature or a bug fix 1 create a new branch off of development with the following naming convention ticket number your feature or bug name for example if the jira ticket is dsd 1234 and the feature is add more animal crossing examples then potential branch names can be dsd 1234 add more animal crossing examples dsd 1234 more ac examples or dsd 1234 animal crossing examples the ticket number in the branch name is usually more helpful than the text that follows 2 create a pull request that points to the development branch 3 if your pull request is approved and should be merged merge it this is indicated with the ship it github label sometimes some features must wait and the do not merge label is added to the pull request release candidates for new big feature updates we typically want to qa it in the turbine test app before the real stable release is made in this case we create release candidate npm packages this can be based off the feature branch or the developement branch once the feature is merged in at the moment this is a manual process for this example we will use version 1 5 0 as the new version that will be released 1 whether on the feature branch or the development branch the version in the package json file must be updated to include the rc suffix for example 1 5 0 becomes 1 5 0 rc this is to indicate that this is a release candidate version 2 delete the package lock json file and the node modules directory 3 run npm install to install all the dependencies and create a new package lock json file with the updated version 4 run npm publish to publish the new release candidate version to npm make sure you have an npm account are logged in to npm on your machine and have the correct permissions to publish to the nypl design system react components package what happens if qa finds a bug in the release candidate version in the turbine test app 1 update or fix the bug in a new branch 2 once approved merge the pull request into the feature branch or the development branch 3 follow the same steps above to create a new release candidate version but this time the rc suffix should be incremented for example 1 5 0 rc becomes 1 5 0 rc1 4 qa the new release candidate version in the turbine test app the release candidate version passed qa and is ready for production what do we do now 1 celebrate 2 make sure all the new changes are merged into the development branch 3 remove the rc suffix from the version in the package json file 4 delete the package lock json file and the node modules directory 5 run npm install to install all the dependencies and create a new package lock json file with the updated version 6 push the changes to github and create a new pull request from development that points to the release branch 7 once approved and merged a github action will run that will automatically deploy the static storybook to github pages and publish the new version to npm local app development sometimes you may want to test out a new feature or bug fix in a local app rather than publishing a release candidate version to npm while this is possible it is not always straightforward please note that the following instructions depend on the node version for both the design system and the local app if the node versions are different the instructions may not work this is a limitation of npm developing with npm install to develop with a local version of the design system 1 in the root of the consuming application directory run sh npm install no save path to design system developing with npm link to develop with a local version of the design system 1 in the design system directory run sh npm link 2 go to the consuming application directory and run sh npm link nypl design system react components 3 go back to the design system directory and run the following command it allows the local design system to be rebuilt and exported automatically sh npm start error troubleshooting it s possible when running npm link that you ll get an invalid hook issue if this occurs it s most likely caused by having two versions of react when trying to run the application while the nypl ds package is linked this duplicate react https reactjs org warnings invalid hook call warning html duplicate react issue is covered by the react team to be more specific you should run the following in your local ds directory where path to application is the local directory of the consuming application sh npm link path to application node modules react now you should be able to run npm start in the ds directory and npm run dev or whatever your application uses in the application directory and not get an invalid hook error npm unlink to unlink the ds codebase 1 run npm unlink in the design system directory 2 run npm unlink no save nypl design system react components in the consuming application typescript usage the reservoir design system is built with typescript check out the design system s typescript documentation typescript md for more information on why we chose to build react components in typescript and the benefits and the gotchas we encountered unit testing the reservoir design system runs unit tests with jest and react testing library to run all tests once sh npm test if you re actively writing or updating tests you can run the tests in watch mode this will wait for any changes and run when a file is saved sh npm run test watch if you want to run tests on only one specific file run sh npm test src path to file for example to test the link component run sh npm test src components link link test tsx snapshot testing the nypl ds implements snapshot testing with react test renderer and jest using react testing library to test our components works well to make sure that what the user sees is what the component should be rendering there are also some behavioral tests that test user interactions if however a component s dom or scss styling was unintentionally updated we can catch those bugs through snapshot testing the react test renderer package will create a directory and a file to hold snap files after a unit test is created using the notification component as an example this is the basic layout for a snapshot test tsx import renderer from react test renderer it renders the ui snapshot correctly const tree renderer create notification id notificationid notificationheading notification heading notificationcontent notification content tojson expect tree tomatchsnapshot if this is a new test and we run npm test a snapshots notification test tsx snap src components notification snapshots notification test tsx snap file is created this holds the dom structure as well as any inline css that was added tsx exports notification snapshot renders the ui snapshot correctly 1 aside classname notification notification standard id notificationid removed for brevity aside now if we unintentionally update the notification tsx component to render a div instead of an aside element this test will fail if you want to update any existing snapshots re run the test script as sh npm test updatesnapshot each snapshot file also includes a link to its jest snapshot documentation https jestjs io docs snapshot testing which is recommended to read storybook jest addon through the storybook addon jest https www npmjs com package storybook addon jest plugin we can see a component s suite of unit tests right storybook in the addons panel a test tab will display all the tests for the current component and whether they pass or fail after writing new tests run npm run test generate output to create a new json file that is used by storybook this json file contains all the test suites for all the components and storybook picks this up and automatically combines a component with its relevant unit tests make sure to commit this file although new builds on github pages will recreate this file for the production storybook instance cdn you can also use the design system styles in your project through the unpkg cdn but not that this is not recommended for production use html link href https unpkg com nypl design system react components dist styles css script src https unpkg com nypl design system react components dist design system react components umd cjs script src https unpkg com nypl design system react components dist design system react components js if you need to reference a particular version you can do do by including the version number in the url for version 1 6 0 html link href https unpkg com nypl design system react components 1 5 1 dist styles css script src https unpkg com nypl design system react components 1 5 1 dist design system react components cjs production min js script src https unpkg com nypl design system react components 1 5 1 dist design system react components esm js for version 1 6 0 html link href https unpkg com nypl design system react components 1 6 0 dist styles css script src https unpkg com nypl design system react components 1 6 0 dist design system react components umd cjs script src https unpkg com nypl design system react components 1 6 0 dist design system react components js you can check out a working codepen with unpkg here https codepen io edwinguzman pen exmxgkx | design-system design-patterns | os |
ros2-tensorflow | ros2 tensorflow use tensorflow to load pretrained neural networks and perform inference through ros2 interfaces img src data detection png alt rviz2 detection output width 50 height 50 the output can be directly visualized through rviz requirements in order to build the ros2 tensorflow package the following dependencies are needed required dependencies ros2 foxy https docs ros org en foxy installation html rosdep dependencies opencv python https pypi org project opencv python tensorflow https www tensorflow org install vision msgs https github com kukanani vision msgs optional dependencies tensorflow object detection models https github com tensorflow models blob master research object detection g3doc installation md for object detection tasks tensorflow slim https github com google research tf slim for object segmentation tasks the provided dockerfile contains an ubuntu 18 04 environment with all the dependencies and this package already installed to use the dockerfile git clone https github com alsora ros2 tensorflow git cd ros2 tensorflow docker bash build sh bash run sh build this section describes how to build the ros2 tensorflow package and the required depenencies in case you are not using the provided dockerfile get the source code and create the ros 2 workspace git clone https github com alsora ros2 tensorflow git home ros2 tensorflow mkdir p home tf ws src cd home tf ws ln s home ros2 tensorflow ros2 tensorflow src install required dependencies using rosdep rosdep install from paths src ignore src rosdistro foxy y install the tensorflow object detection models optional make sure to specify the correct python version according to your system sudo apt get install y protobuf compiler python lxml python tk pip install user cython contextlib2 jupyter matplotlib pillow git clone https github com tensorflow models git usr local lib python3 8 dist packages tensorflow models cd usr local lib python3 8 dist packages tensorflow models research protoc object detection protos proto python out echo export pythonpath pythonpath usr local lib python3 8 dist packages tensorflow models research home bashrc install tensorflow slim optional pip install tf slim build and install the ros2 tensorflow package colcon build source install local setup sh usage the basic usage consists in creating a ros 2 node which loads a tensorflow model and another ros 2 node that acts as a client and receives the result of the inference it is possible to specify which model a node should load note that if the model is specified via url as it is by default the first time the node is executed a network connection will be required in order to download the model object detection task test the object detection server by running in separate terminals ros2 run tf detection py server ros2 run tf detection py client test setup a real object detection pipeline using a stream of images coming from a ros 2 camera node rviz2 ros2 run tf detection py server ros2 run image tools cam2image ros args p frequency 2 0 image classification task test the image classification server by running in separate terminals ros2 run tf classification py server ros2 run tf classification py client test loading different models the repository contains convenient apis for loading tensorflow models into the ros 2 nodes models are defined using the modeldescriptor class which contains all the information required for loading a model and performing inference on it it can either contain a path where the model can be found on the machine or an url where the model can be downloaded the first time different model formats are also supported such as frozen models and saved models some known supported models are already present as examples see classification models ros2 tensorflow tf classification py tf classification py models py and detection models ros2 tensorflow tf detection py tf detection py models py the tensorflow models repository https github com tensorflow models contains many pretrained models that can be used for example you can get additional tensorflow model for object detection from the detection model zoo https github com tensorflow models blob master research object detection g3doc detection model zoo md coco trained models | ros2 tensorflow ros2-tensorflow image-detection image-classification computer-vision | ai |
Mobile-Dev | mobile dev workplace for mobile development learner to make our project | front_end |
|
chameleon-llm | lizard chameleon plug and play compositional reasoning with gpt 4 science problems https img shields io badge task science problems blue science problems https img shields io badge task mathqa blue science problems https img shields io badge task tableqa blue chain of thought https img shields io badge model tool use green gpt 4 https img shields io badge model gpt 4 green llms https img shields io badge model llms green code for the paper chameleon plug and play compositional reasoning with large language models https arxiv org abs 2304 09842 bell if you have any questions or suggestions please don t hesitate to let us know you can directly email pan lu https lupantech github io using the email address lupantech gmail com comment on the twitter https twitter com lupantech status 1648879085115052033 or post an issue on this repository project page https chameleon llm github io paper https arxiv org abs 2304 09842 twitter https twitter com lupantech status 1648879085115052033 linkedin https www linkedin com feed update urn li activity 7056703894063644672 youtube https www youtube com watch v ewfixik4vjs ab channel worldofai slides https lupantech github io docs chameleon llm pan lu google brain 2023 05 05 pdf p align center img src https raw githubusercontent com chameleon llm chameleon llm github io main images logo png width 10 br tentative logo for b chameleon b p news 2023 05 06 thrilled to see that our chameleon paper has been ranked 1 out of 1 682 ai papers by alphasignal https alphasignalai beehiiv com p weeks top 5 ai papers utm source alphasignalai beehiiv com utm medium newsletter utm campaign this week s top 5 ai papers 2023 05 05 we are excited to share that pan lu was invited to deliver a talk to the reasoning team at google brain view the presentation slides here slides https lupantech github io docs chameleon llm pan lu google brain 2023 05 05 pdf 2023 04 24 our work has been featured in a marktechpost https www marktechpost com 2023 04 24 meet chameleon a plug and play compositional reasoning framework that harnesses the capabilities of large language models article 2023 04 23 our research has been recognized as one of the top ml papers of the week by dair ai https www linkedin com pulse top ml papers week dair ai 8e trackingid w6d1ow8fxkstjgdfuwgynq 3d 3d 2023 04 22 thrilled to announce that our work has been featured on worldofai https www youtube com watch v ewfixik4vjs ab channel worldofai s youtube channel https www youtube com watch v ewfixik4vjs ab channel worldofai 2023 04 21 our work is the trending project on https trends vercel app link https raw githubusercontent com lupantech chameleon llm main assets trend png 2023 04 20 huge thanks to john nay https twitter com johnjnay status 1649036276627132418 for sharing our work on twitter https twitter com johnjnay status 1649036276627132418 2023 04 19 our research is now listed on papers with code https paperswithcode com paper chameleon plug and play compositional 2023 04 19 we appreciate aran komatsuzaki https twitter com arankomatsuzaki status 1648848332977221632 for featuring our work on twitter https twitter com arankomatsuzaki status 1648848332977221632 in a timely manner 2023 04 19 special thanks to akhaliq https twitter com akhaliq status 1648851856930533378 for promptly sharing our work on twitter https twitter com akhaliq status 1648851856930533378 2023 04 19 visit our project s homepage at chameleon llm https chameleon llm github io 2023 04 19 our paper is now accessible at https arxiv org abs 2304 09842 lizard about chameleon chameleon is a plug and play compositional reasoning framework that augments llms with various types of tools chameleon synthesizes programs to compose various tools including llm models off the shelf vision models web search engines python functions and rule based modules tailored to user interests built on top of an llm as a natural language planner chameleon infers the appropriate sequence of tools to compose and execute in order to generate a final response showcase scienceqa assets showcase scienceqa png we showcase the adaptability and effectiveness of chameleon on two tasks scienceqa https scienceqa github io and tabmwp https promptpg github io notably chameleon with gpt 4 achieves an 86 54 accuracy on scienceqa significantly improving upon the best published few shot model by 11 37 using gpt 4 as the underlying llm chameleon achieves a 17 0 increase over the state of the art model leading to a 98 78 overall accuracy on tabmwp further studies suggest that using gpt 4 as a planner exhibits more consistent and rational tool selection and is able to infer potential constraints given the instructions compared to other llms like chatgpt for more details you can find our project page here https chameleon llm github io and our paper here https arxiv org pdf 2304 09842 pdf tv youtube video we would like to express our immense gratitude to worldofai https www youtube com intheworldofai for featuring and introducing our work on youtube https www youtube com watch v ewfixik4vjs ab channel worldofai youtube video https img youtube com vi ewfixik4vjs 0 jpg https www youtube com watch v ewfixik4vjs star star history star history chart https api star history com svg repos lupantech chameleon llm type date https star history com lupantech chameleon llm date requirements openai api key https platform openai com account api keys bing search api https www microsoft com en us bing apis bing web search api if you want to enable the bing search module but the module is optional install all required python dependencies generated by pipreqs python 3 8 10 huggingface hub numpy 1 23 2 openai 0 23 0 pandas 1 4 3 transformers 4 21 1 requests 2 28 1 install all required python dependencies you can skip this step if you have set up the dependencies before and the verisons are not strictly required pip install r requirements txt configuration openai api key obtain your openai api key from https platform openai com account api keys to use openai api key for chameleon you need to have billing set up aka paid account you can set up paid account at https platform openai com account billing overview bing search api key optional obtain your bing search api key from https www microsoft com en us bing apis bing web search api the bing search api key is optional failure to set up this key will lead to a slight performance drop on the scienceqa task hammer and wrench module inventory different tools in chameleon different types of tools in our module inventory tools assets tools png tool subset tools used on scienceqa and tabmwp respectively the reusable tools in two tasks are highlighted in green tools task assets tools task png run chameleon on scienceqa science question answering scienceqa https scienceqa github io is a multi modal question answering benchmark covering a wide range of scientific topics over diverse contexts the scienceqa dataset is provided in data scienceqa https github com lupantech chameleon llm tree main data scienceqa for more details you can explore the datatset and check out the explore https scienceqa github io explore html page and visualize https scienceqa github io visualize html page for the current version the results for the image captioner and text detector are off the shelf and stored in data scienceqa captions json and data scienceqa ocrs json respectively the live calling these two modules are coming soon to run chameleon gpt 4 sh cd run scienceqa python run py model chameleon label chameleon gpt4 policy engine gpt 4 kr engine gpt 4 qg engine gpt 4 sg engine gpt 4 test split test test number 1 it will generate the predictions and save the results at results scienceqa chameleon gpt4 test json results scienceqa chameleon gpt4 test cache jsonl and results scienceqa chameleon gpt4 test cache json we can get the accuracy metrics on average and across different question classes by running sh python evaluate py data file data scienceqa problems json result root results scienceqa result files chameleon chatgpt test cache jsonl to run chameleon chatgpt sh python run py model chameleon label chameleon gpt4 policy engine gpt 3 5 turbo kr engine gpt 3 5 turbo qg engine gpt 3 5 turbo sg engine gpt 3 5 turbo test split test test number 1 our chameleon is a generalized form of the cot chain of thought https arxiv org abs 2201 11903 method where the generated program is a sequence of solution generator and answer generator by passing model as cot modules is set as solution generator answer generator to run cot chain of thought prompted gpt 4 sh python run py model cot label cot gpt4 sg engine gpt 4 test split test test number 1 to run cot chain of thought prompted chatgpt sh python run py model cot label cot chatgpt sg engine gpt 4 test split test test number 1 run chameleon on tabmwp the tabmwp dataset contains 38 431 tabular math word problems each question in tabmwp is aligned with a tabular context which is presented as an image semi structured text and a structured table the tabmwp dataset is provided in data tabmwp https github com lupantech promptpg blob main data tabmwp for more details you can explore the datatset and check out the explore https promptpg github io explore html page and visualize https promptpg github io visualize html page to run chameleon gpt 4 sh cd run tabmwp python run py model chameleon label chameleon gpt4 test split test policy engine gpt 4 rl engine gpt 4 cl engine gpt 4 tv engine gpt 4 kr engine gpt 4 sg engine gpt 4 pg engine gpt 4 test number 1 rl cell threshold 18 cl cell threshold 18 it will generate the predictions and save the results at results tabmwp chameleon gpt4 test json results tabmwp chameleon gpt4 test cache jsonl and results tabmwp chameleon gpt4 test cache json we can get the accuracy metrics on average and across different question classes by running sh python evaluate py data file data tabmwp problems test json result root results tabmwp result files chameleon chatgpt test cache jsonl to run chameleon chatgpt sh python run py model chameleon label chameleon chatgpt test split test policy engine gpt 3 5 turbo rl engine gpt 3 5 turbo cl engine gpt 3 5 turbo tv engine gpt 3 5 turbo kr engine gpt 3 5 turbo sg engine gpt 3 5 turbo pg engine gpt 3 5 turbo test number 1 rl cell threshold 18 cl cell threshold 18 to run cot chain of thought prompted gpt 4 sh python run py model cot label cot gpt4 test split test sg engine gpt 4 test number 1 to run cot chain of thought prompted chatgpt sh python run py model cot label cot chatgpt test split test sg engine gpt 3 5 turbo test number 1 our chameleon is a generalized form of the pot program of thought https arxiv org abs 2211 12588 method where the generated program is a sequence of program generator program executor and answer generator by passing model as pot modules is set as program generator program executor answer generator to run pot program of thought prompted gpt 4 sh python run py model pot label pot gpt4 test split test pg engine gpt 4 test number 1 to run pot program of thought prompted chatgpt sh python run py model pot label pot chatgpt test split test pg engine gpt 3 5 turbo test number 1 more examples more examples on scienceqa dataset showcase scienceqa more assets showcase scienceqa more png chameleon gpt 4 is able to adapt to different input queries by generating programs that compose various tools and executing them sequentially to obtain the correct answers for instance the query above asks which animal s skin is adapted for survival in cold places which involves scientific terminology related to animal survival consequently the planner decides to rely on the bing search engine for domain specific knowledge benefiting from the numerous online resources available more examples on tabmwp showcase tabmwp long assets showcase tabmwp long png the adaptability and versatility of our chameleon for various queries are also observed on tabmwp as illustrated in the examples in the figure above the first example involves mathematical reasoning on a tax form chameleon 1 calls the knowledge retrieval model to recall basic knowledge that assists in understanding such domain specific tables 2 describes the table in a more readable natural language format and 3 finally relies on program aided tools to perform precise computations in the second example the system generates python code that closely aligns with the background knowledge provided by the knowledge retrieval model the third example requires the system to locate the cell in a large tabular context given the input query chameleon calls the row lookup model to help accurately locate the relevant rows and generate the language solution via an llm model instead of relying on program based tools chart with upwards trend how good is chameleon significant improvements are observed for chameleon over both fine tuned models and few shot prompted gpt 4 chatgpt results assets results png to visualize the predictions made by chameleon simply execute the jupyter notebook corresponding to your specific task notebooks results viewer task ipynb this will provide an interactive and user friendly way to explore the results generated by the model alternatively explore our project page https chameleon llm github io for more information and options slot machine what plans are chameleon learning tool use tools called in the generated programs from chameleon chatgpt and chameleon gpt 4 on scienceqa tool call scienceqa assets tool call scienceqa png tools called in the generated programs from chameleon chatgpt and chameleon gpt 4 on tabmwp tool call tabmwp assets tool call tabmwp png transition graph execute notebooks transition task model engine ipynb to visualize the module transition graph for programs generated on the test set transitions between modules in programs generated by chameleon gpt 4 on scienceqa start is the start symbol end is a terminal symbol and the others are non terminal symbols img src assets transition scienceqa gpt4 png width 45 height 45 transitions between modules in programs generated by chameleon gpt 4 on tabmwpqa start is the start symbol end is a terminal symbol and the others are non terminal symbols img src assets transition tabmwp gpt4 png width 55 height 55 smile cat want to develop a new task construct the module inventory create prompts for llm based models within the demos directory define the input execution and output for each module in model py develop the llm planner provide a comprehensive description of the module inventory and include a few examples that demonstrate how to map queries to the target program implement the data loader and evaluation method define the data loader within model py to modify the evaluation method update the corresponding section in main py enjoy the process with the groundwork in place it s time to have fun and dive into the task at hand coffee stay connected fantastic i m always open to engaging discussions collaborations or even just sharing a virtual coffee to get in touch visit pan lu https lupantech github io s homepage for contact information white check mark cite if you find chameleon useful for your your research and applications please kindly cite using this bibtex latex article lu2023chameleon title chameleon plug and play compositional reasoning with large language models author lu pan and peng baolin and cheng hao and galley michel and chang kai wei and wu ying nian and zhu song chun and gao jianfeng journal arxiv preprint arxiv 2304 09842 year 2023 | python ai chatgpt gpt-4 llm openai tool | ai |
ynuoms.github.io | gitpod ready to code https img shields io badge gitpod ready to code blue logo gitpod https gitpod io https github com ynuoms ynuoms github io | server |
|
translationese | translationese build status https travis ci org lutzky translationese png branch master https travis ci org lutzky translationese natural language processing project under the supervision of prof shuly winter documentation is available here https translationese readthedocs org | ai |
|
EasyLM | easylm large language models llms made easy easylm is a one stop solution for pre training finetuning evaluating and serving llms in jax flax easylm can scale up llm training to hundreds of tpu gpu accelerators by leveraging jax s pjit functionality building on top of hugginface s transformers https huggingface co docs transformers main en index and datasets https huggingface co docs datasets index this repo provides an easy to use and easy to customize codebase for training large language models without the complexity in many other frameworks easylm is built with jax flax by leveraging jax s pjit utility easylm is able to train large models that don t fit on a single accelerator by sharding the model weights and training data across multiple accelerators currently easylm supports multiple tpu gpu training in a single host as well as multi host training on google cloud tpu pods currently the following models are supported llama https arxiv org abs 2302 13971 gpt j https huggingface co eleutherai gpt j 6b roberta https huggingface co docs transformers model doc roberta discord server we are running an unofficial discord community unaffiliated with google for discussion related to training llms in jax follow this link to join the discord server https discord gg rf4drg3bhp we have dedicated channels for several jax based llm frameworks include easylm jaxseq https github com sea snell jaxseq alpa https github com alpa projects alpa and levanter https github com stanford crfm levanter models trained with easylm openllama openllama is our permissively licensed reproduction of llama which can be used for commercial purposes check out the project main page here https github com openlm research open llama the openllama can serve as drop in replacement for the llama weights in easylm please refer to the llama documentation docs llama md for more details koala koala is our new chatbot fine tuned on top of llama if you are interested in our koala chatbot you can check out the blogpost https bair berkeley edu blog 2023 04 03 koala and documentation for running it locally docs koala md installation the installation method differs between gpu hosts and cloud tpu hosts the first step is to pull from github shell git clone https github com young geng easylm git cd easylm export pythonpath pwd pythonpath installing on gpu host the gpu environment can be installed via anaconda https www anaconda com products distribution shell conda env create f scripts gpu environment yml conda activate easylm installing on cloud tpu host the tpu host vm comes with python and pip pre installed simply run the following script to set up the tpu host shell scripts tpu vm setup sh documentations docs readme md the easylm documentations can be found in the docs docs directory reference if you found easylm useful in your research or applications please cite using the following bibtex software geng2023easylm author geng xinyang title easylm a simple and scalable training framework for large language models month march year 2023 url https github com young geng easylm credits the llama implementation is from jax llama https github com sea snell jax llama the jax flax gpt j and roberta implementation are from transformers https huggingface co docs transformers main en index most of the jax utilities are from mlxu https github com young geng mlxu the codebase is heavily inspired by jaxseq https github com sea snell jaxseq | deep-learning flax jax language-model natural-language-processing transformer large-language-models chatbot llama | ai |
Deep_Learning_For_Computer_Vision_With_Python | deep learning for computer vision with python pb 13 imagenet https lonepatient top 2018 07 01 deep learning for computer vision with python pb 13 html pb 12 resnet https lonepatient top 2018 06 25 deep learning for computer vision with python pb 12 html pb 11 googlenet https lonepatient top 2018 06 19 deep learning for computer vision with python pb 11 html pb 10 kaggle https lonepatient top 2018 04 19 deep learning for computer vision with python pb 10 html pb 09 hdf5 https lonepatient top 2018 04 09 deep learning for computer vision with python pb 09 html pb 08 https lonepatient top 2018 04 02 deep learning for computer vision with python pb 08 html pb 07 https lonepatient top 2018 03 25 deep learning for computer vision with python pb 07 html pb 06 https lonepatient top 2018 03 16 deep learning for computer vision with python pb 06 html pb 05 https lonepatient top 2018 03 09 deep learning for computer vision with python pb 05 html pb 04 rank n https lonepatient top 2018 03 02 deep learning for computer vision with python pb 04 html pb 03 https lonepatient top 2018 02 25 deep learning for computer vision with python pb 03 html pb 02 https lonepatient top 2018 02 18 deep learning for computer vision with python pb 02 html | deep-learning python computer-vision keras cnn cv | ai |
flink-ml | flink ml is a library which provides machine learning ml apis and infrastructures that simplify the building of ml pipelines users can implement ml algorithms with the standard ml apis and further use these infrastructures to build ml pipelines for both training and inference jobs flink ml is developed under the umbrella of apache flink https flink apache org a name start a getting started you can follow the python quick start https nightlies apache org flink flink ml docs master docs try flink ml python quick start and the java quick start https nightlies apache org flink flink ml docs master docs try flink ml java quick start to get hands on experience with flink ml python and java apis respectively a name build a building the project run the mvn clean package command then you will find a jar file that contains your application plus any libraries that you may have added as dependencies to the application target artifact id version jar a name benchmark a benchmark flink ml provides functionalities to benchmark its machine learning algorithms for detailed information please check the benchmark getting started flink ml benchmark readme md a name documentation a documentation the documentation of flink ml is located on the website https nightlies apache org flink flink ml docs master or in the docs directory of the source code a name contributing a contributing you can learn more about how to contribute in the apache flink website https flink apache org contributing how to contribute html for code contributions please read carefully the contributing code https flink apache org contributing contribute code html section for an overview of ongoing community work a name license a license the code in this repository is licensed under the apache software license 2 license todo add a guideline for developers to install flink ml and run tests | big-data flink java machine-learning ml python | ai |
NLP_EDHEC | natural language processing here you will find the teaching materials for the natural language processing course at edhec business school 2022 what is the course about the course is designed as an introduction to the basics of natural language processing for analyzing unstructured user generated content it is for beginners to the topic and nlp in general but it will be helpful to have basic knowledge of python and a familarity with data science techniques topics covered include text preprocessing in python collecting your own data from twitter https twitter com and reddit https www reddit com content analysis text embeddings and supervised learning with text data what materials are available here the sildes will be posted on the course blackboard page they mostly serve as a high level introduction to the examples and exercies in colab notebooks which are linked to from the slides themselves copies of the colab notebooks can also be found in the folder called colab https github com gordeli nlp edhec tree master colab in this repository can i work through the material on my own if you didn t attend the class you can certainly work through the materials on your own the colab notebooks are designed to be readable and doable for individuals working at their own pace the slides posted on blackboard will guide you through the content the notebooks are intendend to be worked through in order each one will have examples to view and 1 or 2 practice exercises to complete aknowledgements i would like to aknowledge steve wilson at oakland university https steverw com for making his ds3 workshop materials publically available with an mit license | ai |
|
facebook-messenger-bot | facebook messenger bot a sample chatbot to show how to natural language processing to a chatbot with the help of dialogflow https dialogflow com tutorial how to create a realtime facebook messenger chatbot with node js and dialogflow https blog pusher com facebook chatbot dialogflow getting started 1 clone this repository and cd into it 2 execute npm install to download dependencies 3 see tutorial https blog pusher com facebook chatbot dialogflow for notes on how to get the required keys from dialogflow and facebook pre requisites node js https nodejs org en and npm built with dialogflow https dialogflow com for natural language processing licence mit https opensource org licenses mit | ai |
|
xamarin-embedding-starter-kit | xamarin embedding starter kit net embedding https docs microsoft com en us xamarin tools dotnet embedding starter kit for mobile development requirements android java xamarin android 7 5 or later android studio 3 x with java 1 8 ios obj c macos 10 12 sierra or later xcode 8 3 2 or later mono 5 0 android ndk v15 embeddinator 4000 issues 574 https github com mono embeddinator 4000 issues 574 features shared code navigation between xamarin and native tests structure main folders of starter kit e4k sources of e4k as submodule dotnet shared net code xamarin libraries for android and ios android sample android project java ios sample ios project obj c getting started 1 clone the repository with submodules 2 build e4k from sources see below build e4k from sources 3 open dotnet dotnet sln via visual studio for mac 4 write your code compile in release 5 execute scripts for embedding see below embedding 6 open android ios native projects for the check build e4k from sources build e4k sh build e4k from sources default all android parameter for build only for android build e4k sh android ios parameter for build only for ios build e4k sh ios embedding before embedding to the native libraries need to build projects with visual studio with release configuration build android sh embedding dotnet androidlibrary to aar and copy to android native project folder build ios sh embedding dotnet ioslibrary to framework and copy to ios native project folder output folders output android e4k generated output for android library output ios e4k generated output for ios library troubleshoots most problems related to e4k and dependencies see here https github com mono embeddinator 4000 issues additional info framework size https github com mono embeddinator 4000 issues 601 issuecomment 370909815 limitations with arrays on android https github com mono embeddinator 4000 pull 508 nbsp copy 2018 yauheni pakala mit | android-library ios-library dotnet xamarin xamarin-android xamarin-ios embedding embedding-dotnet e4k | front_end |
stanford-cs193p | awards ranking dev global top 200 certificate https leetcode com sergeyleschev a href https leetcode com sergeyleschev img src https github com sergeyleschev sergeyleschev blob main leetcode ranking png raw true alt drawing width 410 a a href https leetcode com sergeyleschev img src https github com sergeyleschev sergeyleschev blob main leetcode medals png raw true alt drawing width 280 a languages swift stanford cs193p stanford cs193p swift 5 uikit xcode 11 ios 12 c s leschev additional stanford cs193p swiftui xcode 12 ios 14 https github com sergeyleschev stanford cs193p swiftui c s leschev | swift ios apple xcode stanford-university stanford cs193p appdevelopment uikit sergeyleschev algorithm algorithms appdeveloper appstore ios-app macos ios-swift appstoreconnect package-manager pods | os |
defects4py | defects4py a database of real faults and an experimental infrastructure to enable controlled experiments in software engineering research for python | server |
|
aws-iot-device-sdk-cpp | new version available a new aws iot device sdk is now available https github com awslabs aws iot device sdk cpp v2 it is a complete rework built to improve reliability performance and security we invite your feedback this sdk will no longer receive feature updates but will receive security updates aws iot c device sdk overview overview features features design goals design collection of metrics metrics getting started getstarted installation installation porting to different platforms porting quick links quicklinks sample apis sampleapis license license support support a name overview a overview this document provides information about the aws iot device sdk for c a name features a features the device sdk simplifies access to the pub sub functionality of the aws iot broker via mqtt and provides apis to interact with thing shadows the sdk has been tested to work with the aws iot platform to ensure best interoperability of a device with the aws iot platform mqtt connection the device sdk provides functionality to create and maintain a mqtt connection it expects to be provided with a network connection class that connects and authenticates to aws iot using either direct tls or websocket over tls this connection is used for any further publish operations it also allows for subscribing to mqtt topics which will call a configurable callback function when these messages are received on these topics thing shadow this sdk implements the specific protocol for thing shadows to retrieve update and delete thing shadows adhering to the protocol that is implemented to ensure correct versioning and support for client tokens it abstracts the necessary mqtt topic subscriptions by automatically subscribing to and unsubscribing from the reserved topics as needed for each api call inbound state change requests are automatically signalled via a configurable callback jobs this sdk also implements the jobs protocol to interact with the aws iot jobs service the iot job service manages deployment of iot fleet wide tasks such as device software firmware deployments and updates rotation of security certificates device reboots and custom device specific management tasks for additional information please see the jobs developer guide https docs aws amazon com iot latest developerguide iot jobs html a name design a design goals of this sdk the c sdk was specifically designed for devices that are not resource constrained and required advanced features such as message queueing multi threading support and the latest language features primary aspects are designed around the c 11 standard platform neutral as long as the included cmake can find a c 11 compatible compiler and threading library network layer abstracted from the sdk can use any tls library and initialization method support for multiple platforms and compilers tested on linux windows with vs2015 and mac os flexibility in picking and choosing functionality can create clients which only perform a subset of mqtt operations support for rapidjson allowing use of complex shadow document structures a name metrics a collection of metrics beginning with release v1 2 0 of the sdk aws collects usage metrics indicating which language and version of the sdk is being used this allows us to prioritize our resources towards addressing issues faster in sdks that see the most and is an important data point however we do understand that not all customers would want to report this data by default in that case the sending of usage metrics can be easily disabled by the user by using the overloaded connect action which takes in a boolean for enabling or disabling the sdk metrics p iot client connect configcommon mqtt command timeout configcommon is clean session mqtt version mqtt 3 1 1 configcommon keep alive timeout secs std move client id nullptr nullptr nullptr false false for disabling metrics a name getstarted a how to get started ensure you understand the aws iot platform and create the necessary certificates and policies for more information on the aws iot platform please visit the aws iot developer guide http docs aws amazon com iot latest developerguide iot security identity html a name installation a installation this section explains the individual steps to retrieve the necessary files and be able to build your first application using the aws iot c sdk the sdk uses cmake to generate the necessary makefile cmake version 3 2 and above is required prerequisites make sure to have latest cmake installed minimum required version is 3 2 compiler should support c 11 features we have tested this sdk with gcc 5 clang 3 8 and on visual studio 2015 openssl has version 1 1 0 and libssl dev has version 1 1 0 you can find basic information on how to set up the above on some popular platforms in platform md https github com aws aws iot device sdk cpp blob master platform md build targets the sdk itself builds as a library by default all the samples tests link to the library the library target is aws iot sdk cpp unit tests aws iot unit tests integration tests aws iot integration tests sample pub sub sample sample shadow delta sample this following sample targets are generated only if openssl is being used sample discovery sample sample robot arm sample sample switch sample steps clone the sdk from the github repository change to the repository folder create a folder called build to hold the build files and change to this folder in source builds are not allowed run cmake to build the sdk with the cli the command will download required third party libraries automatically and generate a makefile type make target name to build the desired target it will create a folder called bin that will have the build output a name porting a porting to different platforms the sdk has been written to adhere to c 11 standard without any additional compiler specific features enabled it should compile on any platform that has a modern c 11 enabled compiler without issue the platform should be able to provide a c 11 compatible threading implementation eg pthread on linux tls libraries can be added by simply implementing a derived class of networkconnection and providing an instance to the client we provide the following reference implementations for the network layer openssl mqtt over tls using openssl v1 1 0 tested on windows vs 2015 and linux the provided implementation requires openssl to be pre installed on the device use the mqtt port setting from the config file while setting up the network instance mbedtls mqtt over tls using mbedtls tested on linux the provided implementation will download mbedtls v2 3 0 from the github repo and build and link to the libraries please be warned that the default configuration of mbedtls limits packet sizes to 16k use the mqtt port setting from the config file while setting up the network instance websocket mqtt over websocket tested on both windows vs 2015 and linux uses openssl 1 1 0 as the underlying tls layer the provided implementation requires openssl to be pre installed on the device please be aware that while the provided reference implementation allows initialization of credentials from any source the recommended way to do so is to use the aws cli to generate credential files and read the generated files use the https port setting from the config file while setting up the network instance cross compiling the sdk for other platforms the included toolchainfile cmake file can be used to cross compile the sdk for other platforms procedure for testing cross compiling if using openssl 1 build download toolchain for specific platform 2 modify the toolchainfile cmake with location and target of toolchain specify toolchain directory set toolchain dir home toolchain dir here bin specify cross compilation target set target cross target here 3 cross compile openssl using the same toolchain 4 modify network cmakelists txt in and change openssl library location to cross compiled openssl 5 cd build cmake dcmake toolchain file toolchainfile cmake make 6 scp the application binary certs and config for the application into the platform you re testing 7 run application for mbedtls you don t need to cross compile mbedtls as it gets compiled when you run make with the same compiler as pointed to by the toolchain file also included is a simple example toolchain which is used for setting the default compiler as clang instead of g as an example to show how the toolchain file can be modified a name quicklinks a quick links sdk documentation http aws iot device sdk cpp docs s3 website us east 1 amazonaws com v1 4 0 index html api documentation for the sdk platform guide platform md this file lists the steps needed to set up the pre requisites on some popular platforms developers guide devguide md provides a guide on how the sdk can be included in custom code greengrass discovery support guide greengrassdiscovery md provides information on support for aws greengrass discovery service network layer implementation guide network readme md detailed description about the network layer and how to implement a custom wrapper class sample guide samples readme md details about the included samples test information tests readme md details about the included unit and integration tests mqtt 3 1 1 spec http docs oasis open org mqtt mqtt v3 1 1 csprd02 mqtt v3 1 1 csprd02 html link to the mqtt v3 1 1 spec that this sdk implements a name sampleapis a sample apis sync creating a basic mqtt client requires a networkconnection instance and mqtt command timeout in milliseconds for any internal blocking operations std shared ptr networkconnection p network connection create instance std shared ptr mqttclient p client mqttclient create p network connection std chrono milliseconds 30000 connecting to the aws iot mqtt platform rc p client connect std chrono milliseconds 30000 false mqtt version mqtt 3 1 1 std chrono seconds 60 utf8string create client id nullptr nullptr nullptr subscribe to a topic util string p topic name str topic std unique ptr utf8string p topic name utf8string create p topic name str mqtt subscription applicationcallbackhandlerptr p sub handler std bind handler this std placeholders 1 std placeholders 2 std placeholders 3 std shared ptr mqtt subscription p subscription mqtt subscription create std move p topic name mqtt qos qos0 p sub handler nullptr util vector std shared ptr mqtt subscription topic vector topic vector push back p subscription rc p client subscribe topic vector std chrono milliseconds 30000 publish to a topic util string p topic name str topic std unique ptr utf8string p topic name utf8string create p topic name str rc p client publish std move p topic name false false mqtt qos qos1 payload std chrono milliseconds 30000 unsubscribe from a topic util string p topic name str topic std unique ptr utf8string p topic name utf8string create p topic name str util vector std unique ptr utf8string topic vector topic vector push back std move p topic name rc p client subscribe topic vector std chrono milliseconds 30000 async connect is a sync only api in this version of the sdk subscribe to a topic uint16 t packet id out util string p topic name str topic std unique ptr utf8string p topic name utf8string create p topic name str mqtt subscription applicationcallbackhandlerptr p sub handler std bind handler this std placeholders 1 std placeholders 2 std placeholders 3 std shared ptr mqtt subscription p subscription mqtt subscription create std move p topic name mqtt qos qos0 p sub handler nullptr util vector std shared ptr mqtt subscription topic vector topic vector push back p subscription rc p client subscribeasync topic vector nullptr packet id out publish to a topic uint16 t packet id out util string p topic name str topic std unique ptr utf8string p topic name utf8string create p topic name str rc p client publishasync std move p topic name false false mqtt qos qos1 payload packet id out unsubscribe from a topic uint16 t packet id out util string p topic name str topic std unique ptr utf8string p topic name utf8string create p topic name str util vector std unique ptr utf8string topic vector topic vector push back std move p topic name rc p client subscribe topic vector packet id out logging to enable logging create an instance of the consolelogsystem in the main of your application as shown below std shared ptr awsiotsdk util logging consolelogsystem p log system std make shared awsiotsdk util logging consolelogsystem awsiotsdk util logging loglevel info awsiotsdk util logging initializeawslogging p log system create a log tag for your application to distinguish it from the sdk logs define log tag application application you can now add logging to any part of your application using aws log error or aws log info as shown below aws log error log tag application failed to perform action s responsehelper tostring rc c str a name license a license this sdk is distributed under the apache license version 2 0 http www apache org licenses license 2 0 see license and notice txt for more information a name support a support if you have any technical questions about aws iot c sdk use the aws iot forum https forums aws amazon com forum jspa forumid 210 for any other questions on aws iot contact aws support https aws amazon com contact us a list of known issues is maintained in knownissues md knownissues md note customers have reported deadlocks https github com aws aws iot device sdk cpp issues 14 while using the aws iot device sdk for c if you are affected a fix is available in the locking fixes https github com aws aws iot device sdk cpp tree locking fixes branch this issue is also resolved in the new aws iot device sdk for c which is currently in developer preview https github com awslabs aws iot device sdk cpp v2 | server |
|
Gym-Management-App | project logo br p align center img src screenshots logo png alt logo width 80 height 80 h3 align center my gym manager h3 p align center a flutter language based gym management app p table of contents details open open summary table of contents summary ol li a href about this project about this project a ul li a href built with built with a li ul li li a href getting started getting started a ul li a href prerequisites prerequisites a li li a href installation installation a li ul li li a href usage usage a li li a href contributing contributing a li li a href license license a li li a href contact contact a li li a href acknowledgements acknowledgements a li ol details about the project about this project p align center img src screenshots 1 png alt screenshot1 width 200 height 400 p p align center this is a flutter based android application development project done onbehalf of the mobile application development module first of all please bare following things in your mind only the android development is considered no ios platform developed this app will demonstrate basic functionalities of what i have learned first time flutter learner this is not a fully completed app most of the functions are yet be learned and applied but you will be able to do some basic functionalities in the app smile now let s see what is the purpose of the app this is more like a day to day tracking app where gym owner can keep record of members trainers equipments and financial records it is not like a complete complex database driven app this is more like a entry level app which provide basic crud activites for the app functions a list of commonly used resources that i find helpful are listed in the acknowledgements p built with these are the main languages and services i used for this project flutter https flutter dev firebase https firebase google com getting started getting started following instruction will give you an idea about how you can setup this project locally prerequisites you need to have following software and languages to start this br latest android studio and or visual studio code i prefer to use vs code along with android studio for coding which is much more simple and elegent when coding flutter sdk installation 1 install both android studio vs code or just android studio 2 install flutter if you have installed all the things correctly then you can open up one of the editors and start using this project usage examples usage following screenshots of the app will give you an idea about how this app works please use 6 or more characters as the password when using the app br br img src screenshots 2 png alt screenshot2 width 200 height 400 img src screenshots 3 png alt screenshot3 width 200 height 400 img src screenshots 4 png alt screenshot4 width 200 height 400 img src screenshots 5 png alt screenshot5 width 200 height 400 img src screenshots 6 png alt screenshot6 width 200 height 400 img src screenshots 7 png alt screenshot7 width 200 height 400 img src screenshots 8 png alt screenshot8 width 200 height 400 img src screenshots 10 png alt screenshot10 width 200 height 400 img src screenshots 11 png alt screenshot11 width 200 height 400 img src screenshots 12 png alt screenshot12 width 200 height 400 img src screenshots 13 png alt screenshot13 width 200 height 400 img src screenshots 14 png alt screenshot14 width 200 height 400 contributing contributing contributions are what make the open source community such an amazing place to be learn inspire and create any contributions you make are greatly appreciated 1 fork the project 2 create your feature branch or use master branch 3 commit your changes 4 push to the branch 5 open a pull request license license distributed under the mit license acknowledgements acknowledgements udemy flutter course for begineers recommended https www udemy com course flutter bootcamp with dart marcus ng https www youtube com channel uc6dy0rq6zdnquhq1eeergua the flutter way https www youtube com channel ucjm7i4g4z7zgcja hkhlcvw images https www iconfinder com families microworld stackoverflow https stackoverflow com | firebase android-app firebase-realtime-database flutter flutter-apps flutter-examples gym gym-management | front_end |
Rtos_cortex | 3 cortex 3 4 7 253 1 activ 1 100 os pass 2 svc handler os delay ms delay 1 tim6 system us os freeze global flag os wake global flag hold 3 service os malloc os free os task new os task del main os run systeminit main irqsize estack startup stm32fxxxx s equ irqsize 600 equ main stask estack irqsize 200 0xfffffff8 equ irq stack estack g pfnvectors word main stask word reset handler word nmi handler word hardfault handler word memmanage handler word busfault handler word usagefault handler word irq stack word irqsize word ebss word 0 word svc handler word debugmon handler word 0 word pendsv handler word systick handler g pfnvectors cmsis os enableirq irqn type irqn uint8 t priority 1 14 cmsis 0 255 4 os disableirq irqn type irqn os resource ask os resource free m4 m7 os ranlom range https github com avi crak rtos cortex https tortoisegit org | stm32 rtos print qspi | os |
mint | mint src assets mint logo svg ci https github com mint lang mint actions workflows ci yml badge svg https github com mint lang mint actions workflows ci yml discord https img shields io discord 698214718241767445 https discord gg nxfujs2 backers on open collective https opencollective com mint backers badge svg backers sponsors on open collective https opencollective com mint sponsors badge svg sponsors a refreshing programming language for the front end web aiming to solve the most common issues of single page applications spas at a language level reusable components styling routing global and local state handling synchronous and asynchronous computations that might fail while focusing on developer happiness fast compilation readability project status the project is in development we are still tweaking the language and standard library there are some bigger applications which can be used as examples learning material mint realworld https github com mint lang mint realworld 3300 loc mint ui https github com mint lang mint ui 9500 loc mint ui website https github com mint lang mint ui 27256 loc installing follow these instructions https www mint lang com install documentation tutorial https tutorial mint lang com learning guide https www mint lang com guide api docs https www mint lang com api community questions or suggestions ask on discord https discord gg kvkr5uzkhy also visit awesome mint https github com egajda awesome mint to see more guides tutorials and examples contributing read the general contributing guide https github com mint lang mint blob master contributing md and then 1 fork it https github com mint lang mint fork 2 create your feature branch git checkout b my new feature 3 commit your changes git commit am add some feature 4 push to the branch git push origin my new feature 5 create a new pull request ways you can contribute use the language this is the most helpful thing at this stage because we can discover bugs and missing features this way documentation and website the documentation always needs some work if you discover that something is not documented or can be improved you can create a pr for it in the website repository https github com mint lang mint website rails language features if you have any idea about a new language feature create a detailed issue about it with examples and description why is it needed what problems does it solve code review the compiler can use a thorough code review also code reviews for prs are welcome standard library the standard library is incomplete and needs a lot of work create modules for not yet implemented web apis or a separate package a lot of modules like string dom etc are missing a lot of features you can add new functions here with tests write a package if you have a feature you use and can be moved into a package it can be good for other developers marketing write blog posts and such to help others become aware of the language compiler there are a few issues that could be fixed and features that can be implemented in the compiler questions proposals let s discuss in the github discussions https github com mint lang mint discussions otherwise please create at new issue https github com mint lang mint issues new contributors this project exists thanks to all the people who contribute a href https github com mint lang mint graphs contributors img src https opencollective com mint contributors svg width 890 button false a backers thank you to all our backers become a backer https opencollective com mint backer a href https opencollective com mint backers target blank img src https opencollective com mint backers svg width 890 a sponsors support this project by becoming a sponsor your logo will show up here with a link to your website become a sponsor https opencollective com mint sponsor a href https opencollective com mint sponsor 0 website target blank img src https opencollective com mint sponsor 0 avatar svg a a href https opencollective com mint sponsor 1 website target blank img src https opencollective com mint sponsor 1 avatar svg a a href https opencollective com mint sponsor 2 website target blank img src https opencollective com mint sponsor 2 avatar svg a a href https opencollective com mint sponsor 3 website target blank img src https opencollective com mint sponsor 3 avatar svg a a href https opencollective com mint sponsor 4 website target blank img src https opencollective com mint sponsor 4 avatar svg a a href https opencollective com mint sponsor 5 website target blank img src https opencollective com mint sponsor 5 avatar svg a a href https opencollective com mint sponsor 6 website target blank img src https opencollective com mint sponsor 6 avatar svg a a href https opencollective com mint sponsor 7 website target blank img src https opencollective com mint sponsor 7 avatar svg a a href https opencollective com mint sponsor 8 website target blank img src https opencollective com mint sponsor 8 avatar svg a a href https opencollective com mint sponsor 9 website target blank img src https opencollective com mint sponsor 9 avatar svg a | mint-lang language compiler programming-language compile-to-js | front_end |
rtosim | rtosim rtosim is a set of efficient and extensible c libraries to connect opensim with different devices rtosim can use data provided by motion capture systems to solve opensim inverse kinematics and inverse dynamics on a frame by frame basis multiple threads operate concurrently to remove idle times due to communications with input and output devices and the data flow is automatically managed by rtosim in order to preserve data integrity and avoid race conditions read more about rtosim at the rtosim project page https simtk org home rtosim dependencies rtosim depends on the following cross platform building cmake http www cmake org 2 8 8 or later compiler visual studio http www visualstudio com 2013 or later windows only gcc http gcc gnu org 4 8 1 or later typically on linux clang http clang llvm org 3 4 or later typically on mac possibly through xcode required external libraries simbody https github com simbody simbody versions 3 7 or higher opensim https github com opensim org opensim core versions 4 1 or higher concurrency https github com realtimebiomechanics concurrency filter https github com realtimebiomechanics filter optional external libraries vicon datastream sdk http www vicon com downloads version 1 5 the latest version of rtosim works with opensim 4 3 the superbuild of opensim will install simbody and other requirements automatically for earlier versions of opensim you can use earlier editions of rtosim install rtosim works on windows mac and linux build 1 get and compile simbody https github com simbody simbody tree simbody 3 5 3 important if you want to use multiple threads to solve the inverse kinematics it is necessary to first patch simbody patch simbody 2 get and compile opensim https github com opensim org opensim core tree v3 2 0 opensim 3 get and compile concurrency https github com realtimebiomechanics concurrency 4 get and compile filter https github com realtimebiomechanics filter 5 if you want to enable the real time stream from vicon nexus you need to download and install vicon datastream sdk http www vicon com downloads add the environmental variable vicondatastream install dir that points to the installation directory of the vicon datastream sdk must contain the file client h as example c program files x86 vicon datastream sdk win32 cpp 6 build your project using cmake 7 explore how to use rtosim apis in your project using the provided examples examples patch simbody if you want to use multiple threads to solve the opensim inverse kinematics using ipopt as optimisation algorithm which is the default algorithm for constrained optimisations in opensim you need to patch simbody first to do this get the files iplapacksolverinterface cpp https github com cpizzolato simbody blob fix ipopt issue175 simtkmath optimizers src ipopt iplapacksolverinterface cpp and iplapacksolverinterface hpp https github com cpizzolato simbody blob fix ipopt issue175 simtkmath optimizers src ipopt iplapacksolverinterface hpp and use them to replace the corresponding files in your simbody distribution in the directory simtkmath optimizers src ipopt then compile and install simbody test data test data is currently unavailable under maintenance get the test data https drive google com open id 0bzmak5l0qv2puxk4sw9qcv9jsvu 1 using text files you may need to adapt the commands to your directory structure rtosim ik from file solves the inverse kinematics from marker trajectory trc files example of use rtosim ik from file model rtosim testdata models 2392 scaled clusters osim trc rtosim testdata unfilteredrawdata walking walking trc task set rtosim testdata setup walking ik taskset xml v rtosim id from file solves the inverse dynamics from a motion mot file and from ground reaction forces mot this example functions similarly to the opensim inverse dynamics tool the joint angles are filtered not in real time by rtosim id from file while the ground reaction forces have to be pre filtered example of use rtosim id from file model rtosim testdata models 2392 scaled clusters osim mot rtosim testdata filtereddata8hz walking walking mot ext loads rtosim testdata setup walking externalloads xml rtosim ik id from file solves inverse kinematics and inverse dynamics from raw marker trajectories and raw ground reaction forces it works in the same way as rtosim ik id from nexus but uses files as input example of use rtosim ik id from file model rtosim testdata models 2392 scaled clusters osim trc rtosim testdata unfilteredrawdata walking walking trc task set rtosim testdata setup walking ik taskset xml ext loads rtosim testdata setup walking externalloads xml v 2 using vicon nexus you need to have rtosim vicon nexus tested with version 2 2 3 vicon virtual system version 1 3 2 and vicon datastream sdk version 1 5 installed on your system open vicon nexus and navigate to the directory viconrawdata provided in the test data the subject tab of vicon nexus should automatically populate press the go live button to enter in live mode this is the mode used to recorda data during an acquisition open vicon virtual system and load the files walking x1d and walking x2d press the stream button in vicon virtual system to start streaming the raw data to vicon nexus markers should appear in vicon nexus markers should be correctly autolabelled in real time be sure you can see the ground reaction forces during the stance phases if not check that the force plates correctly appear unser the menu device also under the source menu check that the force plates which names are 1 right and 2 left are connected to 2 mx giganet slot 1 execute the example file rtosim ik id from nexus you need to adapt the command to your directory structure rtosim ik id from nexus model rtosim testdata models 2392 scaled clusters osim task set rtosim testdata setup walking ik taskset xml ext loads rtosim testdata setup walking externalloads nexus xml v to close the execution of rtosim press any key followed by enter for the available options execute rtosim ik id from nexus h adapt the software for your gait laboratory the representation of force plates moments in the version of vicon nexus used to test rtosim is not what you may expect to have the correct representation of joint moments in the global reference system of the gait laboratory is necessary to perform a further transformation this requires to know the position of the force plates you can get this by opening vicon nexus clicking on your force plate show advanced position currently these values are harcoded in rtosim for the griffith university gait laboratory however what you have to do to adapt it to your lab is to modify the function datafromnexus getforceplateposition const with your force plates position important the position of the force plates must be consistent with the rotation used in the function datafromnexus setaxismapping vds client client const you need to provide the position of the force plate already rotated licensing please see the file called license txt copyright c 2010 2016 c pizzolato m reggiani licensed under the apache license version 2 0 the license you may not use this file except in compliance with the license you may obtain a copy of the license at http www apache org licenses license 2 0 unless required by applicable law or agreed to in writing software distributed under the license is distributed on an as is basis without warranties or conditions of any kind either express or implied see the license for the specific language governing permissions and limitations under the license contacts claudio pizzolato c pizzolato griffith edu au publications if you are using rtosim or part of it please cite c pizzolato m reggiani l modenese d g lloyd 2016 real time inverse kinematics and inverse dynamics for lower limb applications using opensim computer methods in biomechanics and biomedical engineering doi 10 1080 10255842 2016 1240789 to link to this article http dx doi org 10 1080 10255842 2016 1240789 acknowledgments australian national health and medical research council 628850 royal society of nz marsden fund 12 uoa 1221 us national institutes of health grant r01eb009351 commission of the european union grant ifp7 ict 2013 10 611695 | os |
|
greylog-stack | greylog stack implementation of cloud engineering challenge for greylog structure this git repository contains implementation for the greylog chanllege the terraform modules created are in this location modules networking modules networking compute modules compute database modules database the terraform workspace created is in this location workspace sandbox workspace sandbox documentation to view documentation on each module please visit the readme md file located in each module to view documentation on how to run the terraform scripts please visit the sandbox workspace sandbox workspace and read the readme md architectural diragram architectural diagram greylog jpg note to make it easy for anyone else to run this terraform configuration without doing too much configurations i did not configure a remote backend which should be used when collaboration with other engineers create a separate git repo for my modules and tag them so as not to break changes in the future | cloud |
|
crypto-ecosystems | crypto ecosystems mit license with attribution https github com electric capital crypto ecosystems blob master license crypto ecosystems is a taxonomy for sharing data around open source blockchain web3 cryptocurrency and decentralized ecosystems and tying them to github organizations and code repositories all of the ecosystems are specified in toml https github com toml lang toml configuration files this repository is not complete and hopefully it never is as there are new ecosystems and repositories created everyday how to contribute there s a couple of ways you can help grow this initiative option 1 opening a pull request you can make any toml file for an ecosystem under the data ecosystems directory or edit an existing one to help improve data around an ecosystem you can fork this repository and open a pr from the forked repo to this repo if you are not sure how to do that you can follow the tutorial in this video https www loom com share f23aab8c675940a9998b228ea1e179b7 data format an example configuration file for the bitcoin ecosystem looks like this toml ecosystem level information title bitcoin sub ecosystems these are the titles of other ecosystems in different toml files in the data ecosystems directory sub ecosystems lightning rsk smart bitcoin zeronet github organizations this is a list of links to associated github organizations github organizations https github com bitcoin https github com bitcoin core https github com bitcoinj https github com btcsuite https github com libbitcoin https github com rust bitcoin repositories these are structs including a url and tags for a git repository these urls do not necessarily have to be on github repo url https github com bitcoin bitcoin tags protocol repo url https github com bitcoinbook bitcoinbook tags documentation repo url https github com bitcoin wallet bitcoin wallet tags wallet by specifying the data as evolving config files in git we benefit from a long term auditable database that is both human and machine readable option 2 complete the ecosystem submission form if you are not a developer or you find making a commit too difficult you can use this airtable based alternative below you can visit the form here https airtable com shrn4vzmlblm3dap8 fill it submit it and we ll take care of the rest how to give attribution for usage of the electric capital crypto ecosystems to use the electric capital crypto ecosystems map you will need an attribution attribution needs to have 3 components 1 source electric capital crypto ecosystems mapping 2 link https github com electric capital crypto ecosystems 3 logo link to logo https drive google com file d 1dax6wmcbtia7kap5aauwyg6t zew9z22 view usp sharing optional everyone in the crypto ecosystem benefits from additions to this repository it is a help to everyone to include an ask to contribute next to your attribution sample request language if you re working in open source crypto submit your repository here to be counted ins sample attribution ins data source electric capital crypto ecosystems mapping https github com electric capital crypto ecosystems if you re working in open source crypto submit your repository here https github com electric capital crypto ecosystems to be counted | crypto-ecosystems taxonomy blockchain cryptocurrency decentralization | blockchain |
nextcloud-snap | snappy nextcloud nextcloud server packaged as a snap it consists of nextcloud 27 apache 2 4 php 8 1 mysql 8 redis 6 how to install get it from the snap store https snapcraft io static images badges en snap store white svg https snapcraft io nextcloud there are a number of releases available 1 by default you ll get the newest stable one but you may be interested in others how to use upon visiting the nextcloud installation for the first time you ll be prompted for an admin username and password after you provide that information you ll be logged in and able to create users install apps and upload files note that this snap includes a service that runs cron php every 15 minutes which will automatically change the cron admin setting to cron for you removable media also note that the interface providing the ability to access removable media is not automatically connected upon install so if you d like to use external storage or otherwise use a device in media or mnt for data you need to give the snap permission to access removable media by connecting that interface sudo snap connect nextcloud removable media system monitoring the system application requires a bit more access to the system than the snap uses by default e g the ability to monitor network hardware etc if you d like to utilize those features you ll need to connect the interface that allows that kind of access sudo snap connect nextcloud network observe configuration beyond the typical nextcloud configuration either by using nextcloud occ or editing var snap nextcloud current nextcloud config config php the snap exposes extra configuration options via the snap set command http https port configuration by default the snap will listen on port 80 if you enable https it will listen on both 80 and 443 and http traffic will be redirected to https but perhaps you re putting the snap behind a proxy of some kind in which case you probably want to change those ports if you d like to change the http port say to port 81 run sudo snap set nextcloud ports http 81 to change the https port say to port 444 run sudo snap set nextcloud ports https 444 note that assuming https is enabled this will cause http traffic to be redirected to port 444 you can specify both of these simultaneously as well sudo snap set nextcloud ports http 81 ports https 444 note let s encrypt will expect that nextcloud is exposed on ports 80 and 443 if you change ports and don t put nextcloud behind a proxy such that ports 80 and 443 are sent to nextcloud for that domain name let s encrypt will be unable to verify ownership of your domain and will not grant certificates also note nextcloud s automatic hostname detection can fail when behind a proxy you might notice it redirecting incorrectly if this happens override the automatic detection including the port if necessary e g sudo nextcloud occ config system set overwritehost value example com 81 php memory limit configuration by default php will use 128m as the memory limit if you notice images not getting previews generated or errors about memory exhaustion in your nextcloud log you may need to set this to a higher value if you d like to set the memory limit to a higher value say 512m run sudo snap set nextcloud php memory limit 512m to set it to be unlimited not recommended use 1 sudo snap set nextcloud php memory limit 1 cronjob interval configuration by default the cronjob interval is 15 minutes to adjust it say 10 minutes simply run sudo snap set nextcloud nextcloud cron interval 10m if you want to disable the cronjob completely run sudo snap set nextcloud nextcloud cron interval 1 to reenable it again simply set the nextcloud cron interval snap variable to a value that isn t 1 http compression configuration by default the snap does not enable http compression to enable it run sudo snap set nextcloud http compression true to disable it run sudo snap set nextcloud http compression false debug mode by default the snap installs itself in production mode which prevents apache and php from providing any detailed version or library information in the http headers and error pages debug mode can be enabled with sudo snap set nextcloud mode debug debug and production are the only valid modes included cli utilities there are a few cli utilities included nextcloud occ nextcloud s occ configuration tool you can always edit the config file directly var snap nextcloud current nextcloud config config php but the configuration tool provides a cli interface for it see nextcloud occ h for more information note that it requires sudo nextcloud mysql client mysql client preconfigured to communicate with nextcloud mysql server this may be useful in case you need to migrate nextcloud installations note that it requires sudo nextcloud mysqldump dump nextcloud database to stdout you should probaby redirect its output to a file note that it requires sudo nextcloud enable https enable https via self signed certificates let s encrypt or custom certificates http will redirect to https non custom certificates will automatically be kept up to date see nextcloud enable https h for more information note that it requires sudo nextcloud disable https disable https does not remove certificates note that it requires sudo nextcloud manual install manually install nextcloud instead of visiting it in your browser this allows you to create the admin user via the cli note that it requires sudo nextcloud export export data suitable for migrating servers by default this includes the nextcloud database configuration and data see nextcloud export h for more information note that it requires sudo nextcloud import import data exported from another nextcloud snap instance via nextcloud export by default this imports the database config and data see nextcloud import h for more information note that it requires sudo where is my stuff snap data var snap nextcloud current by default logs apache php mysql redis and nextcloud logs keys and certificates mysql database redis database nextcloud config any nextcloud apps installed by the user snap common var snap nextcloud common by default nextcloud data hacking if you change something in the snap build it install it and you can run a suite of acceptance tests against it the tests are written in ruby using capybara and rspec to run the tests you first need to install a few dependencies sudo apt install gcc g make qt5 default libqt5webkit5 dev ruby dev zlib1g dev sudo gem install bundle cd tests bundle install additionally if you do not have x configured install the following for a fake x server sudo apt install xvfb make sure the snap has a user called admin with password admin used for login tests sudo nextcloud manual install admin admin and finally run the tests cd tests rake test 1 https github com nextcloud nextcloud snap wiki release strategy | nextcloud snap file-sharing iot hacktoberfest | server |
mmcv | div align center img src https raw githubusercontent com open mmlab mmcv main docs en mmcv logo png width 300 div nbsp div div align center b font size 5 openmmlab website font b sup a href https openmmlab com i font size 4 hot font i a sup nbsp nbsp nbsp nbsp b font size 5 openmmlab platform font b sup a href https platform openmmlab com i font size 4 try it out font i a sup div div nbsp div platform https img shields io badge platform linux 7cwindows 7cmacos blue https mmcv readthedocs io en latest get started installation html pypi python version https img shields io pypi pyversions mmcv https pypi org project mmcv pytorch https img shields io badge pytorch 1 8 2 0 orange https pytorch org get started previous versions cuda https img shields io badge cuda 10 1 11 8 green https developer nvidia com cuda downloads pypi https img shields io pypi v mmcv https pypi org project mmcv badge https github com open mmlab mmcv workflows build badge svg https github com open mmlab mmcv actions codecov https codecov io gh open mmlab mmcv branch master graph badge svg https codecov io gh open mmlab mmcv license https img shields io github license open mmlab mmcv svg https github com open mmlab mmcv blob master license documentation https mmcv readthedocs io en latest installation https mmcv readthedocs io en latest get started installation html reporting issues https github com open mmlab mmcv issues new choose div div align center english readme zh cn md div highlights the openmmlab team released a new generation of training engine mmengine https github com open mmlab mmengine at the world artificial intelligence conference on september 1 2022 it is a foundational library for training deep learning models compared with mmcv it provides a universal and powerful runner an open architecture with a more unified interface and a more customizable training process mmcv v2 0 0 official version was released on april 6 2023 in version 2 x it removed components related to the training process and added a data transformation module also starting from 2 x it renamed the package names mmcv to mmcv lite and mmcv full to mmcv for details see compatibility documentation docs en compatibility md mmcv will maintain both 1 x https github com open mmlab mmcv tree 1 x corresponding to the original master https github com open mmlab mmcv tree master branch and 2 x corresponding to the main branch now the default branch versions simultaneously for details see branch maintenance plan readme md branch maintenance plan introduction mmcv is a foundational library for computer vision research and it provides the following functionalities image video processing https mmcv readthedocs io en latest understand mmcv data process html image and annotation visualization https mmcv readthedocs io en latest understand mmcv visualization html image transformation https mmcv readthedocs io en latest understand mmcv data transform html various cnn architectures https mmcv readthedocs io en latest understand mmcv cnn html high quality implementation of common cpu and cuda ops https mmcv readthedocs io en latest understand mmcv ops html it supports the following systems linux windows macos see the documentation http mmcv readthedocs io en latest for more features and usage note mmcv requires python 3 7 installation there are two versions of mmcv mmcv comprehensive with full features and various cuda ops out of the box it takes longer time to build mmcv lite lite without cuda ops but all other features similar to mmcv 1 0 0 it is useful when you do not need those cuda ops note do not install both versions in the same environment otherwise you may encounter errors like modulenotfound you need to uninstall one before installing the other installing the full version is highly recommended if cuda is available install mmcv before installing mmcv make sure that pytorch has been successfully installed following the pytorch official installation guide https github com pytorch pytorch installation for apple silicon users please use pytorch 1 13 the command to install mmcv bash pip install u openmim mim install mmcv if you need to specify the version of mmcv you can use the following command bash mim install mmcv 2 0 0 if you find that the above installation command does not use a pre built package ending with whl but a source package ending with tar gz you may not have a pre build package corresponding to the pytorch or cuda or mmcv version in which case you can build mmcv from source https mmcv readthedocs io en latest get started build html details summary installation log using pre built packages summary looking in links https download openmmlab com mmcv dist cu102 torch1 8 0 index html br collecting mmcv br b downloading https download openmmlab com mmcv dist cu102 torch1 8 0 mmcv 2 0 0 cp38 cp38 manylinux1 x86 64 whl b details details summary installation log using source packages summary looking in links https download openmmlab com mmcv dist cu102 torch1 8 0 index html br collecting mmcv 2 0 0 br b downloading mmcv 2 0 0 tar gz b details for more installation methods please refer to the installation documentation https mmcv readthedocs io en latest get started installation html install mmcv lite if you need to use pytorch related modules make sure pytorch has been successfully installed in your environment by referring to the pytorch official installation guide https github com pytorch pytorch installation bash pip install u openmim mim install mmcv lite faq if you face some installation issues cuda related issues or runtimeerrors you may first refer to this frequently asked questions https mmcv readthedocs io en latest faq html if you face installation problems or runtime issues you may first refer to this frequently asked questions https mmcv readthedocs io en latest faq html to see if there is a solution if the problem is still not solved feel free to open an issue https github com open mmlab mmcv issues citation if you find this project useful in your research please consider cite latex misc mmcv title mmcv openmmlab computer vision foundation author mmcv contributors howpublished url https github com open mmlab mmcv year 2018 contributing we appreciate all contributions to improve mmcv please refer to contributing md contributing md for the contributing guideline license mmcv is released under the apache 2 0 license while some specific operations in this library are with other licenses please refer to licenses md licenses md for the careful check if you are using our code for commercial matters branch maintenance plan mmcv currently has four branches namely main 1 x master and 2 x where 2 x is an alias for the main branch and master is an alias for the 1 x branch the 2 x and master branches will be deleted in the future mmcv s branches go through the following three stages phase time branch description rc period 2022 9 1 2023 4 5 release candidate code 2 x version will be released on 2 x branch default master branch is still 1 x version master and 2 x branches iterate normally compatibility period 2023 4 6 2023 12 31 the 2 x branch has been renamed to the main branch and set as the default branch and 1 x branch will correspond to 1 x version we still maintain the old version 1 x respond to user needs but try not to introduce changes that break compatibility main branch iterates normally maintenance period from 2024 1 1 default main branch corresponds to 2 x version and 1 x branch is 1 x version 1 x branch is in maintenance phase no more new feature support main branch is iterating normally projects in openmmlab mmengine https github com open mmlab mmengine openmmlab foundational library for training deep learning models mmcv https github com open mmlab mmcv openmmlab foundational library for computer vision mim https github com open mmlab mim mim installs openmmlab packages mmclassification https github com open mmlab mmclassification openmmlab image classification toolbox and benchmark mmdetection https github com open mmlab mmdetection openmmlab detection toolbox and benchmark mmdetection3d https github com open mmlab mmdetection3d openmmlab s next generation platform for general 3d object detection mmrotate https github com open mmlab mmrotate openmmlab rotated object detection toolbox and benchmark mmyolo https github com open mmlab mmyolo openmmlab yolo series toolbox and benchmark mmsegmentation https github com open mmlab mmsegmentation openmmlab semantic segmentation toolbox and benchmark mmocr https github com open mmlab mmocr openmmlab text detection recognition and understanding toolbox mmpose https github com open mmlab mmpose openmmlab pose estimation toolbox and benchmark mmhuman3d https github com open mmlab mmhuman3d openmmlab 3d human parametric model toolbox and benchmark mmselfsup https github com open mmlab mmselfsup openmmlab self supervised learning toolbox and benchmark mmrazor https github com open mmlab mmrazor openmmlab model compression toolbox and benchmark mmfewshot https github com open mmlab mmfewshot openmmlab fewshot learning toolbox and benchmark mmaction2 https github com open mmlab mmaction2 openmmlab s next generation action understanding toolbox and benchmark mmtracking https github com open mmlab mmtracking openmmlab video perception toolbox and benchmark mmflow https github com open mmlab mmflow openmmlab optical flow toolbox and benchmark mmediting https github com open mmlab mmediting openmmlab image and video editing toolbox mmgeneration https github com open mmlab mmgeneration openmmlab image and video generative models toolbox mmdeploy https github com open mmlab mmdeploy openmmlab model deployment framework | ai |
|
Information-Gathering | information gathering project idea briefing the primary benefit of ethical hacking is to prevent data from being stolen and misused by malicious attackers as well as discovering vulnerabilities scanning network in this cyber security domain there are various types of tools are available but day by days security issues and our technology are updated we are creating some tools like port scanner network scanner mac address changer and some others tools a hacking tool is a program designed to assist a hacker with hacking these tool use by any company or any individuals person for collect information scan port fuzzing and some more thing project working i use command line interface for project this project for beginner and help those people they gather information about email header analysis instagram information instagram username check ip information phone number information port scan web scraping give fake information this tool shows your hostname and public ip first and check your internet speed then user give input and according to option this tool work alt text project png follow us on my social networks organization 1 linkedin https in linkedin com company viehgroup 2 twitter https twitter com viehgroup 3 github https github com viehgroup 4 instagram https www instagram com viehgroup developer 1 linkedin https in linkedin com in chetanbansal11 2 twitter https twitter com cbkali7838 3 github https github com cb kali 4 instagram https www instagram com i m cbkali | server |
|
weekly-nerd-2122 | weekly nerd cmda minor web 2021 2022 tijdens de minor worden een aantal sessies georganiseerd om kennis te maken met het vakgebied workshops praatjes en lezingen van bedrijven en designers over het vakgebied nerd alert leerdoelen kennismaken met het beroepenveld ori nteren op het vakgebied vakgerelateerde artikelen leren schrijven programma werkwijze wekelijks worden presentaties gegeven door bedrijven uit het vakgebied zo krijg je een goed beeld van het vakgebied en krijg je contacten in het werkveld dat kan handig zijn voor als je een afstudeerstage zoekt of een afstudeeropdracht maak van iedere presentatie sketch notes aantekeningen fork de weekly nerd repo en verzamel je aantekeningen in de wiki of maak zelf een blog er wordt van je verwacht dat je alle weekly nerds bijwoont als je een weekly nerd mist dan krijg je een vervangende opdracht tip schrijf ook altijd een link lijstje met interessante onderwerpen die aan bod zijn gekomen drie keer schrijf je een uitgebreid artikel over een relevant onderwerp bijvoorbeeld een eigen onderzoek naar een techniek of een technische analyse van een website die in een weekly nerd is behandeld lees hier meer over het schrijven van artikelen weekly nerd artikelen en lees een paar voorbeelden van vorige jaren programma datum spreker 9 februari cyd stumpel creative developer 16 februari alvaro montoro 2 maart fenna de wilde 9 maart x 16 maart rik schennink 23 maart krijn hoetmer 30 maart vitaly friedman 6 april x 13 april chanel 20 april l onie watson 27 april rian rietveld 11 mei x 18 mei x 25 mei weekly mingle 1 juni weekly mingle 8 juni weekly mingle 15 juni weekly mingle weekly nerd 1 vrijdag 5 maart bedrijven 14 00 kickoff 14 05 dept https www deptagency com nl nl met raymond korrel https www linkedin com in raymond korrel frontend developer ilayda k kosmano lu https www linkedin com in ilaydadept interaction designer 15 00 label a https labela nl met gavin ligthart https www linkedin com in gavinligthart frontend developer vraag 1 in de post wat is een goede frontend developer https css tricks com what makes a good front end developer op css tricks staat een lijst interessante mensen die beschrijven wat een frontende developer is welke skills denk jij dat een goede frontender moet hebben en wat voor frontender ben jij eigenlijk lees ook de the great divide https css tricks com the great divide van chris coyer om deze vraag te beantwoorden weekly nerd 2 donderdag 1 april bedrijven 14 00 intro 14 01 build in amsterdam https www buildinamsterdam com cases met fenna de wilde https www linkedin com in fenna de wilde frontend developer 15 00 triple https www wearetriple com met chanel mepschen https www linkedin com in chanel mepschen 1223a9b2 shyanta vleugel https www linkedin com in shyantav frontend developers vraag 2 je hebt geleerd hoe je toegankelijke websites kan maken een belangrijk uitgangspunt voor een digital designer is er voor zorgen dat een website door iedereen te gebruiken is toch zijn veel ontwikkelbedrijven zelf niet inclusive ook de tech industrie bestaat voor een groot deel uit dezelfde type personen en testen vervolgens hun websites bij weer dezelfde types met stereotypering vooroordelen en biased uitkomsten tot gevolg herken je dit als probleem moet dit veranderen in wat voor team zie jij jezelf graag werken lees het artikel on racism and sexism in branding user interface and tech https uxdesign cc on racism and sexism in branding user interface and tech 337f5ceb7ed5 en het project working towards a more inclusive design scene in the netherlands https inclusief design en gebruik dit voor het beantwoorden van de vraag img width 1145 alt adapting to reality src https user images githubusercontent com 1391509 113145133 58267b80 922e 11eb 82e4 f7c8867b90ce png weekly nerd 3 vrijdag 30 april bedrijven 15 00 intro 15 01 mirabeau https www mirabeau nl met dave bitter https www davebitter com frontend developer alexander munz visual designer vraag 3 je bent nu 3 maanden 24 7 code aan het klopppppen hopelijk heb je super veel geleerd ben je regelmatig uitgedaagd weet je nog beter waar je grenzen liggen en hoe je je verder kan en wil onwikkelen als frontender of juist niet in de verschillende vakken die je hebt gevolgd zijn technieken en werkwijzen aan bod gekomen die een echte frontender ook doet prototypen experimenteren ingewikkelde code simpele code onderzoeken testen lezen documenteren en heel veel html css en js op de client en op de server welke onderwerpen hebben de meeste indruk op je gemaakt een gastspreker of een test een inzicht tijdens een discord sessie met een van de student assistenten schrijf per vak wat je hebt geleerd en wat je meeneemt als frontender weekly nerd artikelen drie keer schrijf je een uitgebreid artikel over een relevant web design en development onderwerp bijvoorbeeld een eigen onderzoek naar een techniek of een technische analyse van een website die in een weekly nerd is behandeld zorg voor juiste verwijzingen bronvermelding en goede leesbare teksten engels wordt aangeraden je blog met de verslagen en artikelen moet voor de laatste week van de meesterproef ingeleverd zijn deadline zondag 19 juni 2022 voor 23 59 tip schrijf elke lesblok een artikel en lever dit in dan hoef je niet alle artikelen aan het eind van de minor te schrijven en krijg je tussentijds feedback op je niveau en schrijfstijl voorbeelden van artikelen van voorgaande jaren https medium com vincentkempers functional light programming helped me a lot 99856a9ac0ff https codepen io servinnissen post plan then code https github com jamerrone weekly nerd blog blob master articles article 3 md https github com muise001 weekly nerd bruce lawson w3c over webstandards https medium com vincentkempers my experience at nlhtml5 x cssday df855997a191 | web-design web-development html-css-javascript | front_end |
Embeded-System-Design | embeded system design lab4 lab4 work of hdu itmo embeded system design name zhou guancheng student id 192050193 my variant variant3 1 create three tasks 2 create two queues the size of queues is 5 integer numbers the first queue should be used to transmit data from task 1 to task 2 the second queue should be used to transmit data from task 1 to task 3 3 task 1 should increment the local integer variable counter once per second task 1 sends the counter value to task 2 once per second and to task 3 once per two seconds the incrementing of counter variable should be paused if the corresponding queue is full and resumed if the corresponding queue is not full 4 task 2 and task 3 should toggle led ld3 once per 600 ms the number of toggling is equal counter variable received using queue from task 1 the led is shared resource between task 2 and task 3 task 2 or task 3 should work with the shared led using a mutex steps 1 step three task for our project image 20200622103047569 https tva1 sinaimg cn large 007s8zilgy1gg0v0h0yj2j30hg04b779 jpg 2 programme the code we need first of all we need to add the dependency c include queue h include freertos h include semphr h then we should define the queue and the mutex variable we need as a global variable c xqueuehandle q1 xqueuehandle q2 semaphorehandle t xsemaphore null create a queue and a mutex in the main function c q1 xqueuecreate 8 sizeof unsigned int q2 xqueuecreate 8 sizeof unsigned int create mutex xsemaphore xsemaphorecreatemutex in the default task we send the message to q1 and q2 every queue will be reset if it s full before the default task send messages to it c void startdefaulttask void argument int counter 1 user code begin 5 infinite loop for int i 0 i 1000 i if uxqueuespacesavailable q1 0 xqueuereset q1 if uxqueuespacesavailable q2 0 xqueuereset q2 osdelay 1000 counter the maximum number of counter is 3 counter counter 4 send message to q1 per second xqueuesend q1 void counter portmax delay if counter 2 0 send message to q2 once per two seconds xqueuesend q2 void counter portmax delay user code end 5 in the second task we receive the message and control the led when starttask02 receive the message when led4 is occupied we need to use mutex to occupy resources and then when the task is completed we release mutex resources c void starttask02 void argument for user code begin starttask02 int rec1 if uxqueuemessageswaiting q1 1 xqueuereceive q1 rec1 portmax delay if xsemaphore null if xsemaphoretake xsemaphore portmax delay pdtrue for int i 0 i rec1 i hal gpio togglepin gpiob gpio pin 14 osdelay 600 xsemaphoregive xsemaphore in the third task we receive the message and control the led when starttask03 receive the message when led4 is occupied we need to use mutex to occupy resources and then when the task is completed we release mutex resources c void starttask03 void argument user code begin starttask03 for int rec2 if uxqueuemessageswaiting q2 1 xqueuereceive q2 rec2 portmax delay if xsemaphore null if xsemaphoretake xsemaphore portmax delay pdtrue for int i 0 i rec2 i hal gpio togglepin gpiob gpio pin 14 osdelay 600 xsemaphoregive xsemaphore user code end starttask03 then we complie and run the code and get the correct result summary in this experiment i learned how to use the queue api and mutex and realized the communication between thread tasks and then control the led | os |
|
DL4CV2 | dl4cv2 code examples from the book deep learning for computer vision practitioner bundle by a rosebrock these are not my property i keep them as self study notes some files have more comments and other files have some bugs fixed export variable pythonpath to run the scripts correctly in a posix system linux or osx export pythonpath current path of this project the path where this file is to remove tensorflow warning about cpu enhancement support export tf cpp min log level 2 | ai |
|
Embedded-Systems-Practical | your emsys logbook in this repository you can commit your lab assignments for assessment in each of the lab subfolders you can include arduino sketches if relevant for the assignment screenshots which can be displayed in the readme md file bits of text or measurement data again if relevant for the assignment you will find a subdirectory for each of the labs lab1 lab1 lab2 lab2 lab3 lab3 lab4 lab4 lab5 lab5 lab6 lab6 glhf | os |
|
Money-Exchange-Wallet | money exchange wallet money exchange wallet is a frontend development of mobile application using flutter pages splash welcome location access login signup home profile cards collection send money withdraw money exchange money deposit money scan to pay built with flutter https flutter dev dart https dart dev dependencies used flutter vector icons https pub dev packages flutter vector icons ionicons https ionic io ionicons qr code scanner https pub dev packages qr code scanner line awesome flutter https pub dev packages line awesome flutter flutter screenutil https pub dev packages flutter screenutil cupertino icons https pub dev packages cupertino icons bubbled navigation bar https pub dev packages bubbled navigation bar google fonts https pub dev packages google fonts font awesome flutter https pub dev packages font awesome flutter rflutter alert https pub dev packages rflutter alert softwares used figma https www figma com virtual studio code https code visualstudio com authors dibas np https github com dibas np license gnu general public license v3 0 https choosealicense com licenses gpl 3 0 screenshots p align center img src https user images githubusercontent com 81339759 119874055 f12ef580 bf44 11eb 95a6 be51bb235f76 png height 800em img src https user images githubusercontent com 81339759 119874079 f724d680 bf44 11eb 997b 0161cdcea48a png height 800em img src https user images githubusercontent com 81339759 119874091 fa1fc700 bf44 11eb 90f8 3702423ad8e1 png height 800em img src https user images githubusercontent com 81339759 119874200 158ad200 bf45 11eb 9e6d f5b4419714ec png height 800em img src https user images githubusercontent com 81339759 119874223 1b80b300 bf45 11eb 9200 a27caedfb038 png height 800em img src https user images githubusercontent com 81339759 119874243 1f143a00 bf45 11eb 862f 7f9115b8957d png height 800em img src https user images githubusercontent com 81339759 119874266 24718480 bf45 11eb 9034 9e5d4887a83e png height 800em img src https user images githubusercontent com 81339759 119874285 289da200 bf45 11eb 9d4d bad02b5e7101 png height 800em img src https user images githubusercontent com 81339759 119874307 2dfaec80 bf45 11eb 9506 d38a49472010 png height 800em img src https user images githubusercontent com 81339759 119874337 34896400 bf45 11eb 839d 3be875f6e772 png height 800em img src https user images githubusercontent com 81339759 119874347 381ceb00 bf45 11eb 9c24 251410a6a717 png height 800em img src https user images githubusercontent com 81339759 119874363 3c490880 bf45 11eb 86de dc7b725b0a05 png height 800em img src https user images githubusercontent com 81339759 119874394 4539da00 bf45 11eb 9460 1fa9547afb58 png height 800em p | flutter dart android-application flutter-app flutter-ui frontend-app | front_end |
UA-CloudViewer | ua cloud viewer ua cloud viewer is a tool used in industrial iot scenarios to bridge the gap from ot to it opc ua is the standard interface for vendor neutral operational technology ot interoperability in factories plants and renewable energy farms with best in class data information modeling functionality the file format for these information models is called nodesets https reference opcfoundation org v104 core docs part6 f 1 as such it defines the industrial digital twin also known as the smart manufacturing profile the opc foundation and cesmii have worked hard over the last year to make these information models smart manufacturing profiles available online leveraging the new ua cloud library https github com opcfoundation ua cloudlibrary the ua cloud viewer can upload and later download these opc ua information models to the ua cloud library the plattform industrie 4 0 in europe has defined the industrial digital twin slightly broader not only defining the ot digital twin but the entire digital asset product along its value chain i e from design to manufacturing to operation to recycling they call this industrial digital twin the asset administration shell aas the ua cloud viewer can package ot digital twins into asset administration shells leveraging the aas exchange format aasx based on open office xml microsoft has defined the it digital twin using the digital twin definition language dtdl https docs microsoft com en us azure digital twins concepts models it also runs a cloud service leveraging these dtdl based digital twins for analytics called the microsoft azure digital twins adt service the ua cloud viewer can map ot digital twins to dtdl definitions and then upload them to adt instances additional features of the ua cloud viewer include the ability to run in a docker container for easy deployment and maintenance and comes with a web user interface several opc ua nodeset files can be loaded at once and then browsed the tool is very useful for looking at the standardized nodeset files defined in the opc ua companion specifications by the german machine builders association vdma and the german machine tool builders association vdw usage docker containers are automatically built simply run the app via docker run p 80 80 ghcr io digitaltwinconsortium ua cloudviewer main and then point your browser to http localhost if you don t have docker yet you can download it for free from here https www docker com products docker desktop loading nodeset files start docs start png open your opc ua nodeset file please note referenced nodeset files need to be uploaded too it will tell you which ones or you can set a checkbox to automatically download referenced nodeset files from the ua cloud library you can also open opc ua node nodeset set files directly from the ua cloud library browsing nodeset files browsing docs sample png you can browse and interact with the model currently read and write of a node is possible uploading nodeset files to the ua cloud library on the ua cloud library tab fill in your username and password you used when registering with the ua cloud library as well as metadata describing your nodeset file browsing docs cloudlib png build status docker build https github com digitaltwinconsortium uanodesetwebviewer actions workflows docker build yml badge svg https github com digitaltwinconsortium uanodesetwebviewer actions workflows docker build yml | server |
|
BookSourceCode | booksourcecode this repository contains all of the source code for the pro sharepoint 2013 branding and responsive web development book | front_end |
|
Contrastive-Learning-NLP-Papers | p align center h2 align center contrastive learning for natural language processing h2 p current nlp models heavily rely on effective representation learning algorithms contrastive learning is one such technique to learn an embedding space such that similar data sample pairs have close representations while dissimilar samples stay far apart from each other it can be used in supervised or unsupervised settings using different loss functions to produce task specific or general purpose representations while it has originally enabled the success for vision tasks recent years have seen a growing number of publications in contrastive nlp this first line of works not only delivers promising performance improvements in various nlp tasks but also provides desired characteristics such as task agnostic sentence representation faithful text generation data efficient learning in zero shot and few shot settings interpretability and explainability tutorial and survey 1 tutorial and survey talk presentation and blog 2 talk presentation and blog foundation of contrastive learning 3 foundation of contrastive learning contrastive learning objective contrastive learning objective sampling strategy for contrastive learning sampling strategy for contrastive learning most notable applications of contrastive learning most notable applications of contrastive learning analysis of contrastive learning analysis of contrastive learning graph contrastive learning graph contrastive learning contrastive learning for nlp 4 contrastive learning for nlp contrastive data augmentation for nlp contrastive data augmentation for nlp text classification text classification sentence embeddings and phrase embeddings sentence embeddings and phrase embeddings information extraction information extraction sequence labeling sequence labeling machine translation machine translation question answering question answering summarization summarization text generation text generation data efficient learning data efficient learning contrastive pretraining contrastive pretraining interpretability and explainability interpretability and explainability commonsense knowledge and reasoning commonsense knowledge and reasoning vision and language vision and language others others 1 tutorial and survey contrastive data and learning for natural language processing rui zhang yangfeng ji yue zhang rebecca j passonneau naacl 2022 tutorial website https contrastive nlp tutorial github io slides https contrastive nlp tutorial github io files contrastive nlp tutorial pdf video https youtu be iqzjybik4go a primer on contrastive pretraining in language processing methods lessons learned and perspectives nils rethmeier isabelle augenstein pdf https arxiv org abs 2102 12982 a survey on contrastive self supervised learning ashish jaiswal ashwin ramesh babu mohammad zaki zadeh debapriya banerjee fillia makedon pdf https www mdpi com 2227 7080 9 1 2 htm self supervised learning self prediction and contrastive learning lilian weng jong wook kim neurips 2021 tutorial website https neurips cc virtual 2021 tutorial 21895 slides https neurips cc media neurips 2021 slides 21895 pdf 2 talk presentation and blog contrastive representation learning in text danqi chen slide https cds nyu edu wp content uploads 2021 11 tad slides danqi chen compressed pdf contrastive pairs are better than independent samples for both learning and evaluation matt gardner video https drive google com file d 1dwmdeuzy9m0z5a1gzqm4i78zeqp8gyhm view contrastive representation learning lilian weng blog https lilianweng github io posts 2021 05 31 contrastive understanding contrastive learning ekin tiu blog https towardsdatascience com understanding contrastive learning d5b19fd96607 contrastive self supervised learning ankesh anand blog https ankeshanand com blog 2020 01 26 contrative self supervised learning html the beginner s guide to contrastive learning rohit kundu blog https www v7labs com blog contrastive learning guide triplet loss and online triplet mining in tensorflow olivier moindrot blog https omoindrot github io triplet loss understanding ranking loss contrastive loss margin loss triplet loss hinge loss and all those confusing names ra l g mez blog https gombru github io 2019 04 03 ranking loss contrastive learning in 3 minutes ta ying cheng blog https towardsdatascience com contrastive learning in 3 minutes 89d9a7db5a28 demystifying noise contrastive estimation jack morris blog https jxmo io posts nce phrase retrieval and beyond jinhyuk lee blog https princeton nlp github io phrase retrieval and beyond advances in understanding improving and applying contrastive learning dan fu blog https hazyresearch stanford edu blog 2022 04 19 contrastive 1 improving transfer and robustness in supervised contrastive learning mayee chen blog https hazyresearch stanford edu blog 2022 04 19 contrastive 2 tabi type aware bi encoders for open domain entity retrieval megan leszczynski blog https hazyresearch stanford edu blog 2022 04 19 contrastive 3 3 foundation of contrastive learning contrastive learning objective 1 learning a similarity metric discriminatively with application to face verification sumit chopra raia hadsell yann lecun cvpr 2005 pdf https ieeexplore ieee org abstract document 1467314 1 facenet a unified embedding for face recognition and clustering florian schroff dmitry kalenichenko and james philbin cvpr 2015 pdf https arxiv org abs 1503 03832 1 deep metric learning via lifted structured feature embedding hyun oh song yu xiang stefanie jegelka silvio savarese cvpr 2016 pdf https arxiv org abs 1511 06452 1 improved deep metric learning with multi class n pair loss objective kihyuk sohn neurips 2016 pdf https papers nips cc paper 2016 file 6b180037abbebea991d8b1232f8a8ca9 paper pdf 1 noise contrastive estimation a new estimation principle for unnormalized statistical models michael gutmann and aapo hyv rinen aistats 2010 pdf https proceedings mlr press v9 gutmann10a gutmann10a pdf 1 representation learning with contrastive predictive coding aaron van den oord yazhe li oriol vinyals arxiv pdf https arxiv org abs 1807 03748 1 learning a nonlinear embedding by preserving class neighbourhood structure ruslan salakhutdinov geoff hinton aistats 2007 pdf http proceedings mlr press v2 salakhutdinov07a salakhutdinov07a pdf 1 analyzing and improving representations with the soft nearest neighbor loss nicholas frosst nicolas papernot geoffrey hinton icml 2019 pdf http proceedings mlr press v97 frosst19a frosst19a pdf sampling strategy for contrastive learning 1 learning deep representations by mutual information estimation and maximization r devon hjelm alex fedorov samuel lavoie marchildon karan grewal phil bachman adam trischler yoshua bengio iclr 2019 pdf https arxiv org abs 1808 06670 code https github com rdevon dim 1 debiased contrastive learning ching yao chuang joshua robinson lin yen chen antonio torralba stefanie jegelka neurips 2020 pdf https arxiv org abs 2007 00224 1 contrastive learning with hard negative samples joshua robinson ching yao chuang suvrit sra stefanie jegelka iclr 2021 pdf https arxiv org abs 2010 04592 1 supervised contrastive learning prannay khosla piotr teterwak chen wang aaron sarna yonglong tian phillip isola aaron maschinot ce liu dilip krishnan neurips 2020 pdf https arxiv org abs 2004 11362 1 adversarial self supervised contrastive learning minseon kim jihoon tack sung ju hwang neurips 2020 pdf https arxiv org abs 2006 07589 code https github com kim minseon rocl 1 decoupled contrastive learning chun hsiao yeh cheng yao hong yen chi hsu tyng luh liu yubei chen yann lecun arxiv pdf https arxiv org abs 2110 06848 code https github com kim minseon rocl 1 momentum contrast for unsupervised visual representation learning kaiming he haoqi fan yuxin wu saining xie ross girshick cvpr 2020 pdf https arxiv org abs 1911 05722 code https github com facebookresearch moco 1 unsupervised learning of visual features by contrasting cluster assignments mathilde caron ishan misra julien mairal priya goyal piotr bojanowski armand joulin neurips 2020 pdf https arxiv org abs 2006 09882 code https github com facebookresearch swav 1 contrastive multiview coding yonglong tian dilip krishnan phillip isola arxiv 2019 pdf https arxiv org abs 1906 05849 code http github com hobbitlong cmc 1 prototypical contrastive learning of unsupervised representations junnan li pan zhou caiming xiong steven c h hoi iclr 2021 pdf https arxiv org abs 1906 05849 code https github com salesforce pcl most notable applications of contrastive learning 1 efficient estimation of word representations in vector space tomas mikolov kai chen greg corrado jeffrey dean arxiv pdf https arxiv org abs 1301 3781 1 a simple framework for contrastive learning of visual representations ting chen simon kornblith mohammad norouzi geoffrey hinton icml 2020 pdf https arxiv org abs 2002 05709 code https github com google research simclr 1 learning transferable visual models from natural language supervision alec radford jong wook kim chris hallacy aditya ramesh gabriel goh sandhini agarwal girish sastry amanda askell pamela mishkin jack clark gretchen krueger ilya sutskever arxiv pdf https arxiv org abs 2103 00020 code https github com openai clip analysis of contrastive learning 1 a theoretical analysis of contrastive unsupervised representation learning sanjeev arora hrishikesh khandeparkar mikhail khodak orestis plevrakis nikunj saunshi icml 2019 pdf https arxiv org abs 1902 09229 1 understanding contrastive representation learning through alignment and uniformity on the hypersphere tongzhou wang phillip isola icml 2020 pdf https arxiv org abs 2005 10242 code https github com ssnl align uniform 1 what makes for good views for contrastive learning yonglong tian chen sun ben poole dilip krishnan cordelia schmid phillip isola neurips 2020 pdf https arxiv org abs 2005 10243 code https hobbitlong github io infomin 1 demystifying contrastive self supervised learning invariances augmentations and dataset biases senthil purushwalkam abhinav gupta neurips 2020 pdf https arxiv org abs 2007 13916 1 what should not be contrastive in contrastive learning tete xiao xiaolong wang alexei a efros trevor darrell iclr 2021 pdf https arxiv org abs 2008 05659 1 dissecting supervised contrastive learning florian graf christoph d hofer marc niethammer roland kwitt icml 2021 pdf https arxiv org abs 2102 08817 code https github com plus rkwitt py supcon vs ce 1 a broad study on the transferability of visual representations with contrastive learning ashraful islam chun fu chen rameswar panda leonid karlinsky richard radke rogerio feris iccv 2021 pdf https arxiv org abs 2103 13517 1 poisoning and backdooring contrastive learning nicholas carlini andreas terzis iclr 2022 pdf https arxiv org abs 2106 09667 1 understanding dimensional collapse in contrastive self supervised learning li jing pascal vincent yann lecun yuandong tian iclr 2022 pdf https openreview net forum id yevsq05den7 1 provable guarantees for self supervised deep learning with spectral contrastive loss jeff z haochen colin wei adrien gaidon tengyu ma neurips 2021 pdf https arxiv org abs 2106 04156 1 beyond separability analyzing the linear transferability of contrastive representations to related subpopulations jeff z haochen colin wei ananya kumar tengyu ma arxiv 2022 pdf https arxiv org abs 2204 02683 1 connect not collapse explaining contrastive learning for unsupervised domain adaptation kendrick shen robbie jones ananya kumar sang michael xie jeff z haochen tengyu ma percy liang arxiv 2022 pdf https arxiv org abs 2204 00570 1 perfectly balanced improving transfer and robustness of supervised contrastive learning mayee f chen daniel y fu avanika narayan michael zhang zhao song kayvon fatahalian christopher r arxiv pdf https arxiv org abs 2204 07596 1 intriguing properties of contrastive losses ting chen calvin luo lala li neurips 2021 pdf https proceedings neurips cc paper 2021 file 628f16b29939d1b060af49f66ae0f7f8 paper pdf code https contrastive learning github io intriguing 1 rethinking infonce how many negative samples do you need chuhan wu fangzhao wu yongfeng huang arxiv pdf https arxiv org abs 2105 13003 graph contrastive learning 1 graph contrastive learning with augmentations yuning you tianlong chen yongduo sui ting chen zhangyang wang yang shen neurips 2020 pdf https arxiv org abs 2010 13902 code https github com shen lab graphcl 1 contrastive multi view representation learning on graphs kaveh hassani amir hosein khasahmadi icml 2020 pdf https arxiv org abs 2006 05582 1 deep graph contrastive representation learning yanqiao zhu yichen xu feng yu qiang liu shu wu liang wang icml workshop on graph representation learning and beyond pdf https arxiv org abs 2006 04131 code https github com cripac dig grace 4 contrastive learning for nlp contrastive data augmentation for nlp 1 learning the difference that makes a difference with counterfactually augmented data divyansh kaushik eduard hovy zachary c lipton iclr 2020 pdf https arxiv org abs 1909 12434 code https github com acmi lab counterfactually augmented data 1 nl augmenter a framework for task sensitive natural language augmentation kaustubh d dhole varun gangal sebastian gehrmann aadesh gupta zhenhao li saad mahamood abinaya mahendiran simon mille ashish srivastava samson tan tongshuang wu jascha sohl dickstein jinho d choi eduard hovy ondrej dusek sebastian ruder sajant anand nagender aneja rabin banjade lisa barthe hanna behnke ian berlot attwell connor boyle caroline brun marco antonio sobrevilla cabezudo samuel cahyawijaya emile chapuis wanxiang che mukund choudhary christian clauss pierre colombo filip cornell gautier dagan mayukh das tanay dixit thomas dopierre paul alexis dray suchitra dubey tatiana ekeinhor marco di giovanni rishabh gupta rishabh gupta louanes hamla sang han fabrice harel canada antoine honore ishan jindal przemyslaw k joniak denis kleyko venelin kovatchev kalpesh krishna ashutosh kumar stefan langer seungjae ryan lee corey james levinson hualou liang kaizhao liang zhexiong liu andrey lukyanenko vukosi marivate gerard de melo simon meoni maxime meyer afnan mir nafise sadat moosavi niklas muennighoff timothy sum hon mun kenton murray marcin namysl maria obedkova priti oli nivranshu pasricha jan pfister richard plant vinay prabhu vasile pais libo qin shahab raji pawan kumar rajpoot vikas raunak roy rinberg nicolas roberts juan diego rodriguez claude roux vasconcellos p h s ananya b sai robin m schmidt thomas scialom tshephisho sefara saqib n shamsi xudong shen haoyue shi yiwen shi anna shvets nick siegel damien sileo jamie simon chandan singh roman sitelew priyank soni taylor sorensen william soto aman srivastava kv aditya srivatsa tony sun mukund varma t a tabassum fiona anting tan ryan teehan mo tiwari marie tolkiehn athena wang zijian wang gloria wang zijie j wang fuxuan wei bryan wilie genta indra winata xinyi wu witold wydma ski tianbao xie usama yaseen m yee jing zhang yue zhang arxiv pdf https arxiv org abs 2112 02721 code https github com gem benchmark nl augmenter 1 a simple but tough to beat data augmentation approach for natural language understanding and generation dinghan shen mingzhi zheng yelong shen yanru qu weizhu chen arxiv pdf https arxiv org abs 2009 13818 code https github com dinghanshen cutoff 1 efficient contrastive learning via novel data augmentation and curriculum learning seonghyeon ye jiseon kim alice oh emnlp 2021 pdf https arxiv org abs 2109 05941 code https github com vano1205 efficientcl 1 coda contrast enhanced and diversity promoting data augmentation for natural language understanding yanru qu dinghan shen yelong shen sandra sajeev jiawei han weizhu chen iclr 2021 pdf https arxiv org abs 2010 08670 text classification 1 cert contrastive self supervised learning for language understanding hongchao fang sicheng wang meng zhou jiayuan ding pengtao xie arxiv pdf https arxiv org abs 2005 12766 code https github com ucsd ai4h cert 1 self supervised contrastive learning for efficient user satisfaction prediction in conversational agents mohammad kachuee hao yuan young bum kim sungjin lee naacl 2021 pdf https arxiv org abs 2010 11230 1 not all negatives are equal label aware contrastive loss for fine grained text classification varsha suresh desmond c ong emnlp 2021 pdf https arxiv org abs 2109 05427 1 constructing contrastive samples via summarization for text classification with limited annotations yangkai du tengfei ma lingfei wu fangli xu xuhong zhang bo long shouling ji findings of emnlp 2021 pdf https arxiv org abs 2104 05094 1 semantic re tuning via contrastive tension fredrik carlsson amaru cuba gyllensten evangelia gogoulou erik ylip hellqvist magnus sahlgren iclr 2021 pdf https openreview net forum id ov smnau pf code https github com freddefrallan contrastive tension 1 approximate nearest neighbor negative contrastive learning for dense text retrieval lee xiong chenyan xiong ye li kwok fung tang jialin liu paul bennett junaid ahmed arnold overwijk iclr 2021 pdf https arxiv org abs 2007 00808 1 improving gradient based adversarial training for text classification by contrastive learning and auto encoder yao qiu jinchao zhang jie zhou findings of acl 2021 pdf https arxiv org abs 2109 06536 1 contrastive document representation learning with graph attention networks peng xu xinchi chen xiaofei ma zhiheng huang bing xiang findings of emnlp 2021 pdf https arxiv org abs 2110 10778 1 attention based contrastive learning for winograd schemas tassilo klein moin nabi findings of emnlp 2021 pdf https arxiv org abs 2109 05108 code https github com sap samples emnlp2021 attention contrastive learning 1 cline contrastive learning with semantic negative examples for natural language understanding dong wang ning ding piji li hai tao zheng acl 2021 pdf https arxiv org abs 2107 00440 code https github com kandorm cline 1 contrastive learning enhanced nearest neighbor mechanism for multi label text classification xi ao su ran wang xinyu dai acl 2022 pdf https aclanthology org 2022 acl short 75 1 incorporating hierarchy into text encoder a contrastive learning approach for hierarchical text classification zihan wang peiyi wang lianzhe huang xin sun houfeng wang acl 2022 pdf https aclanthology org 2022 acl long 491 1 label anchored contrastive learning for language understanding zhenyu zhang yuming zhao meng chen xiaodong he naacl 2022 pdf https arxiv org abs 2205 10227 1 batch softmax contrastive loss for pairwise sentence scoring tasks anton chernyavskiy dmitry ilvovsky pavel kalinin preslav nakov naacl 2022 pdf https arxiv org abs 2110 15725 1 conditional supervised contrastive learning for fair text classification jianfeng chi william shand yaodong yu kai wei chang han zhao yuan tian emnlp findings 2022 pdf https arxiv org abs 2205 11485 sentence embeddings and phrase embeddings 1 towards universal paraphrastic sentence embeddings john wieting mohit bansal kevin gimpel karen livescu iclr 2016 pdf https arxiv org abs 1511 08198 code https github com jwieting iclr2016 1 an efficient framework for learning sentence representations lajanugen logeswaran honglak lee iclr 2018 pdf https arxiv org abs 1803 02893 code https github com lajanugen s2v 1 simcse simple contrastive learning of sentence embeddings tianyu gao xingcheng yao danqi chen emnlp 2021 pdf https arxiv org abs 2104 08821 code https github com princeton nlp simcse 1 fast effective and self supervised transforming masked language models into universal lexical and sentence encoders fangyu liu ivan vuli anna korhonen nigel collier emnlp 2021 pdf https arxiv org abs 2104 08027 code https github com cambridgeltl mirror bert 1 learning dense representations of phrases at scale jinhyuk lee mujeen sung jaewoo kang danqi chen acl 2021 pdf https arxiv org abs 2012 12624 code https github com princeton nlp densephrases 1 phrase retrieval learns passage retrieval too jinhyuk lee alexander wettig danqi chen emnlp 2021 pdf https arxiv org abs 2109 08133 code https github com princeton nlp densephrases 1 self guided contrastive learning for bert sentence representations taeuk kim kang min yoo sang goo lee acl 2021 pdf https arxiv org abs 2106 07345 1 pairwise supervised contrastive learning of sentence representations dejiao zhang shang wen li wei xiao henghui zhu ramesh nallapati andrew o arnold bing xiang emnlp 2021 pdf https arxiv org abs 2109 05424 code https github com amazon research sentence representations 1 supcl seq supervised contrastive learning for downstream optimized sequence representations hooman sedghamiz shivam raval enrico santus tuka alhanai mohammad ghassemi findings of emnlp 2021 pdf https arxiv org abs 2109 07424 code https github com hooman650 supcl seq 1 sentence bert sentence embeddings using siamese bert networks nils reimers iryna gurevych emnlp 2019 pdf https arxiv org abs 1908 10084 code https github com ukplab sentence transformers 1 an unsupervised sentence embedding method by mutual information maximization yan zhang ruidan he zuozhu liu kwan hui lim lidong bing emnlp 2020 pdf https arxiv org abs 2009 12061 code https github com yanzhangnlp is bert 1 declutr deep contrastive learning for unsupervised textual representations john giorgi osvald nitski bo wang gary bader acl 2021 pdf https arxiv org abs 2006 03659 code https github com johngiorgi declutr 1 consert a contrastive framework for self supervised sentence representation transfer yuanmeng yan rumei li sirui wang fuzheng zhang wei wu weiran xu acl 2021 pdf https arxiv org abs 2105 11741 code https github com yym6472 consert 1 dialoguecse dialogue based contrastive learning of sentence embeddings che liu rui wang jinghua liu jian sun fei huang luo si emnlp 2021 pdf https arxiv org abs 2109 12599 code https github com wangruicn dialoguecse 1 pretraining with contrastive sentence objectives improves discourse performance of language models dan iter kelvin guu larry lansing dan jurafsky acl 2020 pdf https arxiv org abs 2005 10389 code https github com google research language tree master language conpono 1 contextualized and generalized sentence representations by contrastive self supervised learning a case study on discourse relation analysis hirokazu kiyomaru sadao kurohashi naacl 2021 pdf https aclanthology org 2021 naacl main 442 pdf 1 diffcse difference based contrastive learning for sentence embeddings yung sung chuang rumen dangovski hongyin luo yang zhang shiyu chang marin solja i shang wen li wen tau yih yoon kim james glass naacl 2022 pdf https arxiv org abs 2204 10298 code https github com voidism diffcse 1 exploring the impact of negative samples of contrastive learning a case study of sentence embedding rui cao yihao wang yuxin liang ling gao jie zheng jie ren zheng wang findings of acl 2022 pdf https aclanthology org 2022 findings acl 248 1 syntax guided contrastive learning for pre trained language model shuai zhang wang lijie xinyan xiao hua wu findings of acl 2022 pdf https aclanthology org 2022 findings acl 191 1 virtual augmentation supported contrastive learning of sentence representations dejiao zhang wei xiao henghui zhu xiaofei ma andrew arnold findings of acl 2022 pdf https aclanthology org 2022 findings acl 70 1 a sentence is worth 128 pseudo tokens a semantic aware contrastive learning framework for sentence embeddings haochen tan wei shao han wu ke yang linqi song findings of acl 2022 pdf https aclanthology org 2022 findings acl 22 1 scd self contrastive decorrelation of sentence embeddings tassilo klein moin nabi acl 2022 pdf https aclanthology org 2022 acl short 44 1 a contrastive framework for learning sentence representations from pairwise and triple wise perspective in angular space yuhao zhang hongji zhu yongliang wang nan xu xiaobo li binqiang zhao acl 2022 pdf https aclanthology org 2022 acl long 336 1 debiased contrastive learning of unsupervised sentence representations kun zhou beichen zhang xin zhao ji rong wen acl 2022 pdf https aclanthology org 2022 acl long 423 1 uctopic unsupervised contrastive learning for phrase representations and topic mining jiacheng li jingbo shang julian mcauley acl 2022 pdf https aclanthology org 2022 acl long 426 1 ease entity aware contrastive learning of sentence embedding sosuke nishikawa ryokan ri ikuya yamada yoshimasa tsuruoka isao echizen naacl 2022 pdf https arxiv org abs 2205 04260 1 mcse multimodal contrastive learning of sentence embeddings miaoran zhang marius mosbach david ifeoluwa adelani michael a hedderich dietrich klakow naacl 2022 pdf information extraction 1 erica improving entity and relation understanding for pre trained language models via contrastive learning yujia qin yankai lin ryuichi takanobu zhiyuan liu peng li heng ji minlie huang maosong sun jie zhou acl 2021 pdf https arxiv org abs 2012 15022 code https github com thunlp erica 1 cil contrastive instance learning framework for distantly supervised relation extraction tao chen haizhou shi siliang tang zhigang chen fei wu yueting zhuang acl 2021 pdf https arxiv org abs 2106 10855 1 cleve contrastive pre training for event extraction ziqi wang xiaozhi wang xu han yankai lin lei hou zhiyuan liu peng li juanzi li jie zhou acl 2021 pdf https arxiv org abs 2105 14485 code https github com thu keg cleve 1 container few shot named entity recognition via contrastive learning sarkar snigdha sarathi das arzoo katiyar rebecca j passonneau rui zhang acl 2022 pdf https arxiv org abs 2109 07589 code https github com psunlpgroup container 1 tabi type aware bi encoders for open domain entity retrieval megan leszczynski daniel y fu mayee f chen christopher r findings of acl 2022 pdf https arxiv org abs 2204 08173 1 cross lingual contrastive learning for fine grained entity typing for low resource languages xu han yuqi luo weize chen zhiyuan liu maosong sun zhou botong hao fei suncong zheng acl 2022 pdf https aclanthology org 2022 acl long 159 code https github com thunlp crosset 1 hiclre a hierarchical contrastive learning framework for distantly supervised relation extraction dongyang li taolin zhang nan hu chengyu wang xiaofeng he findings of acl 2022 pdf https aclanthology org 2022 findings acl 202 1 hiure hierarchical exemplar contrastive learning for unsupervised relation extraction shuliang liu xuming hu chenwei zhang shu ang li lijie wen philip s yu naacl 2022 pdf https arxiv org abs 2205 02225 1 label refinement via contrastive learning for distantly supervised named entity recognition huaiyuan ying shengxuan luo tiantian dang sheng yu findings of naacl 2022 pdf sequence labeling 1 contrastive estimation training log linear models on unlabeled data noah a smith jason eisner acl 2005 pdf https aclanthology org p05 1044 pdf machine translation 1 contrastive learning for many to many multilingual neural machine translation xiao pan mingxuan wang liwei wu lei li acl 2021 pdf https arxiv org abs 2105 09501 code https github com panxiao1994 mrasp2 1 contrastive conditioning for assessing disambiguation in mt a case study of distilled bia jannis vamvas rico sennrich emnlp 2021 pdf https aclanthology org 2021 emnlp main 803 pdf code https github com zurichnlp contrastive conditioning 1 as little as possible as much as necessary detecting over and undertranslations with contrastive conditioning jannis vamvas rico sennrich acl 2022 pdf https aclanthology org 2022 acl short 53 1 improving word translation via two stage contrastive learning yaoyiran li fangyu liu nigel collier anna korhonen ivan vuli acl 2022 pdf https aclanthology org 2022 acl long 299 1 when do contrastive word alignments improve many to many neural machine translation zhuoyuan mao chenhui chu raj dabre haiyue song zhen wan sadao kurohashi findings of naacl 2022 pdf https arxiv org abs 2204 12165 1 cocoa mt a dataset and benchmark for contrastive controlled mt with application to formality maria nadejde anna currey benjamin hsu xing niu georgiana dinu marcello federico findings of naacl 2022 pdf https arxiv org abs 2205 04022 question answering 1 dense passage retrieval for open domain question answering vladimir karpukhin barlas o uz sewon min patrick lewis ledell wu sergey edunov danqi chen wen tau yih emnlp 2020 pdf https arxiv org abs 2004 04906 code https github com facebookresearch dpr 1 self supervised contrastive cross modality representation learning for spoken question answering chenyu you nuo chen yuexian zou findings of emnlp 2021 pdf https arxiv org abs 2109 03381 1 xmoco cross momentum contrastive learning for open domain question answering nan yang furu wei binxing jiao daxin jiang linjun yang acl 2021 pdf https aclanthology org 2021 acl long 477 pdf 1 contrastive domain adaptation for question answering using limited text corpora zhenrui yue bernhard kratzwald stefan feuerriegel emnlp 2021 pdf https arxiv org abs 2108 13854 code https github com yueeeeeeee caqa 1 to answer or not to answer improving machine reading comprehension model with span based contrastive learning yunjie ji liangyu chen chenxiao dou baochang ma xiangang li findings of naacl 2022 pdf 1 seeing the wood for the trees a contrastive regularization method for the low resource knowledge base question answering junping liu shijie mei xinrong hu xun yao jack yang yi guo findings of naacl 2022 pdf summarization 1 confit toward faithful dialogue summarization with linguistically informed contrastive fine tuning xiangru tang arjun nair borui wang bingyao wang jai amit desai aaron wade haoran li asli celikyilmaz yashar mehdad dragomir radev naacl 2022 pdf https arxiv org abs 2112 08713 1 cliff contrastive learning for improving faithfulness and factuality in abstractive summarization shuyang cao lu wang emnlp 2021 pdf https arxiv org abs 2109 09209 code https shuyangcao github io projects cliff summ 1 contrastive attention mechanism for abstractive sentence summarization xiangyu duan hongfei yu mingming yin min zhang weihua luo yue zhang emnlp 2019 pdf https aclanthology org d19 1301 pdf code https github com travel go abstractive text summarization 1 simcls a simple framework for contrastive learning of abstractive summarization yixin liu pengfei liu acl 2021 pdf https arxiv org abs 2106 01890 code https github com yixinl7 simcls 1 unsupervised reference free summary quality evaluation via contrastive learning hanlu wu tengfei ma lingfei wu tariro manyumwa shouling ji emnlp 2020 pdf https arxiv org abs 2010 01781 code https github com whl97 ls score 1 contrastive aligned joint learning for multilingual summarization danqing wang jiaze chen hao zhou xipeng qiu lei li findings of acl 2021 pdf https aclanthology org 2021 findings acl 242 pdf code https github com dqwang122 calms 1 topic aware contrastive learning for abstractive dialogue summarization junpeng liu yanyan zou hainan zhang hongshen chen zhuoye ding caixia yuan xiaojie wang findings of emnlp 2021 pdf https arxiv org abs 2109 04994 1 graph enhanced contrastive learning for radiology findings summarization jinpeng hu zhuo li zhihong chen zhen li xiang wan tsung hui chang acl 2022 pdf https aclanthology org 2022 acl long 320 text generation 1 controllable natural language generation with contrastive prefixes jing qian li dong yelong shen furu wei weizhu chen findings of acl 2022 pdf https arxiv org abs 2202 13257 code https github com yxuansu simctg 1 a contrastive framework for neural text generation yixuan su tian lan yan wang dani yogatama lingpeng kong nigel collier neurips 2022 pdf https arxiv org abs 2202 06417 code https github com yxuansu simctg 1 counter contrastive learning for language gans yekun chai haidong zhang qiyue yin junge zhang findings of emnlp 2021 pdf https aclanthology org 2021 findings emnlp 415 pdf 1 contrastive learning with adversarial perturbations for conditional text generation seanie lee dong bok lee sung ju hwang iclr 2021 pdf https arxiv org abs 2012 07280 code https github com seanie12 claps 1 logic consistency text generation from semantic parses chang shu yusen zhang xiangyu dong peng shi tao yu rui zhang findings of acl 2021 pdf https aclanthology org 2021 findings acl 388 pdf code https github com ciaranshu relogic 1 contrastive representation learning for exemplar guided paraphrase generation haoran yang wai lam piji li findings of emnlp 2021 pdf https arxiv org abs 2109 01484 code https github com lhryang crl egpg 1 grammatical error correction with contrastive learning in low error density domains hannan cao wenmian yang hwee tou ng findings of emnlp 2021 pdf https aclanthology org 2021 findings emnlp 419 code https github com nusnlp geccl 1 group wise contrastive learning for neural dialogue generation hengyi cai hongshen chen yonghao song zhuoye ding yongjun bao weipeng yan xiaofang zhao findings of emnlp 2020 pdf https arxiv org abs 2009 07543 code https github com hengyicai contrastivelearning4dialogue 1 contrastive attention for automatic chest x ray report generation fenglin liu changchang yin xian wu shen ge yuexian zou ping zhang xu sun findings of acl 2021 pdf https arxiv org abs 2106 06965 1 weakly supervised contrastive learning for chest x ray report generation an yan zexue he xing lu jiang du eric chang amilcare gentili julian mcauley chun nan hsu findings of emnlp 2021 pdf https arxiv org abs 2109 12242 1 learning with contrastive examples for data to text generation yui uehara tatsuya ishigaki kasumi aoki hiroshi noji keiichi goshima ichiro kobayashi hiroya takamura yusuke miyao coling 2020 pdf https aclanthology org 2020 coling main 213 pdf code https github com aistairc contrastive data2text 1 a simple contrastive learning objective for alleviating neural text degeneration shaojie jiang ruqing zhang svitlana vakulenko maarten de rijke arxiv pdf https arxiv org abs 2205 02517 code https github com shaojiejiang ct loss 1 keywords and instances a hierarchical contrastive learning framework unifying hybrid granularities for text generation mingzhe li xiexiong lin xiuying chen jinxiong chang qishen zhang feng wang taifeng wang zhongyi liu wei chu dongyan zhao rui yan acl 2022 pdf https aclanthology org 2022 acl long 304 data efficient learning 1 an explicit joint and supervised contrastive learning framework for few shot intent classification and slot filling han liu feng zhang xiaotong zhang siyang zhao xianchao zhang findings of emnlp 2021 pdf https arxiv org abs 2110 13691 1 few shot intent detection via contrastive pre training and fine tuning jianguo zhang trung bui seunghyun yoon xiang chen zhiwei liu congying xia quan hung tran walter chang philip yu emnlp 2021 pdf https arxiv org abs 2109 06349 code https github com jianguoz few shot intent detection 1 bridge to target domain by prototypical contrastive learning and label confusion re explore zero shot learning for slot filling liwen wang xuefeng li jiachi liu keqing he yuanmeng yan weiran xu emnlp 2021 pdf https arxiv org abs 2110 03572 code https github com w lw pclc 1 active learning by acquiring contrastive examples katerina margatina giorgos vernikos lo c barrault nikolaos aletras emnlp 2021 pdf https arxiv org abs 2109 03764 code https github com mourga contrastive active learning 1 bi granularity contrastive learning for post training in few shot scene ruikun luo guanhuan huang xiaojun quan findings of acl 2021 pdf https arxiv org abs 2106 02327 1 contrastive learning for prompt based few shot language learners yiren jian chongyang gao soroush vosoughi naacl 2022 pdf https arxiv org abs 2205 01308 1 zero shot event detection based on ordered contrastive learning and prompt based prediction senhui zhang tao ji wendi ji xiaoling wang findings of naacl 2022 pdf 1 rcl relation contrastive learning for zero shot relation extraction shusen wang bosen zhang yajing xu yanan wu bo xiao findings of naacl 2022 pdf contrastive pretraining 1 coco lm correcting and contrasting text sequences for language model pretraining yu meng chenyan xiong payal bajaj saurabh tiwary paul bennett jiawei han xia song neurips 2021 pdf https arxiv org abs 2102 08473 code https github com microsoft coco lm 1 tacl improving bert pre training with token aware contrastive learning yixuan su fangyu liu zaiqiao meng tian lan lei shu ehsan shareghi nigel collier findings of naacl 2022 pdf https arxiv org abs 2111 04198 code https github com yxuansu tacl 1 clear contrastive learning for sentence representation zhuofeng wu sinong wang jiatao gu madian khabsa fei sun hao ma arxiv pdf https arxiv org abs 2012 15466 1 supervised contrastive learning for pre trained language model fine tuning beliz gunel jingfei du alexis conneau ves stoyanov iclr 2021 pdf https arxiv org abs 2011 01403 1 pre training transformers as energy based cloze models kevin clark minh thang luong quoc v le christopher d manning emnlp 2020 pdf https arxiv org abs 2012 08561 code https github com google research electra 1 fine tuning pre trained language model with weak supervision a contrastive regularized self training approach yue yu simiao zuo haoming jiang wendi ren tuo zhao chao zhang naacl 2021 pdf https arxiv org abs 2010 07835 code https github com yueyu1030 cosine 1 data efficient pretraining via contrastive self supervision nils rethmeier isabelle augenstein arxiv pdf https arxiv org abs 2010 01061 1 multi granularity contrasting for cross lingual pre training shicheng li pengcheng yang fuli luo jun xie findings of acl 2021 pdf https aclanthology org 2021 findings acl 149 pdf 1 infoxlm an information theoretic framework for cross lingual language model pre training zewen chi li dong furu wei nan yang saksham singhal wenhui wang xia song xian ling mao heyan huang ming zhou naacl 2021 pdf https arxiv org abs 2007 07834 code https aka ms infoxlm interpretability and explainability 1 evaluating models local decision boundaries via contrast sets matt gardner yoav artzi victoria basmova jonathan berant ben bogin sihao chen pradeep dasigi dheeru dua yanai elazar ananth gottumukkala nitish gupta hanna hajishirzi gabriel ilharco daniel khashabi kevin lin jiangming liu nelson f liu phoebe mulcaire qiang ning sameer singh noah a smith sanjay subramanian reut tsarfaty eric wallace ally zhang ben zhou arxiv pdf https arxiv org abs 2004 02709 1 alice active learning with contrastive natural language explanations weixin liang james zou zhou yu emnlp 2020 pdf https arxiv org abs 2009 10259 1 explaining nlp models via minimal contrastive editing mice alexis ross ana marasovi matthew e peters findings of acl 2021 pdf https arxiv org abs 2012 13985 code https github com allenai mice 1 kace generating knowledge aware contrastive explanations for natural language inference qianglong chen feng ji xiangji zeng feng lin li ji zhang haiqing chen yin zhang acl 2021 pdf https aclanthology org 2021 acl long 196 pdf 1 contrastive explanations for model interpretability alon jacovi swabha swayamdipta shauli ravfogel yanai elazar yejin choi yoav goldberg emnlp 2021 pdf https arxiv org abs 2103 01378 code https github com allenai contrastive explanations 1 explanation graph generation via pre trained language models an empirical study with contrastive learning swarnadeep saha prateek yadav mohit bansal acl 2022 pdf https arxiv org abs 2204 04813 code https github com swarnahub explagraphgen 1 toward interpretable semantic textual similarity via optimal transport based contrastive sentence learning seonghyeon lee dongha lee seongbo jang hwanjo yu acl 2022 pdf https aclanthology org 2022 acl long 412 commonsense knowledge and reasoning 1 contrastive self supervised learning for commonsense reasoning tassilo klein moin nabi acl 2020 pdf https arxiv org abs 2005 00669 code https github com sap samples acl2020 commonsense 1 prompting contrastive explanations for commonsense reasoning tasks bhargavi paranjape julian michael marjan ghazvininejad luke zettlemoyer hannaneh hajishirzi findings of acl 2021 pdf https arxiv org abs 2106 06823 1 kfcnet knowledge filtering and contrastive learning network for generative commonsense reasoning haonan li yeyun gong jian jiao ruofei zhang timothy baldwin nan duan findings of emnlp 2021 pdf https arxiv org abs 2109 06704 1 learning from missing relations contrastive learning with commonsense knowledge graphs for commonsense inference yong ho jung jun hyung park joon young choi mingyu lee junho kim kang min kim sangkeun lee findings of acl 2022 pdf https aclanthology org 2022 findings acl 119 vision and language 1 language models can see plugging visual controls in text generation yixuan su tian lan yahui liu fangyu liu dani yogatama yan wang lingpeng kong nigel collier arxiv pdf https arxiv org abs 2205 02655 code https github com yxuansu magic 1 counterfactual contrastive learning for weakly supervised vision language grounding zhu zhang zhou zhao zhijie lin jieming zhu xiuqiang he neurips 2020 pdf https papers nips cc paper 2020 file d27b95cac4c27feb850aaa4070cc4675 paper pdf 1 unimo towards unified modal understanding and generation via cross modal contrastive learning wei li can gao guocheng niu xinyan xiao hao liu jiachen liu hua wu haifeng wang acl 2021 pdf https arxiv org abs 2012 15409 code https github com paddlepaddle research tree master nlp unimo 1 sort ing vqa models contrastive gradient learning for improved consistency sameer dharur purva tendulkar dhruv batra devi parikh ramprasaath r selvaraju neurips 2020 workshop pdf https arxiv org abs 2010 10038 code https github com sameerdharur sorting vqa 1 contrastive learning for weakly supervised phrase grounding tanmay gupta arash vahdat gal chechik xiaodong yang jan kautz derek hoiem eccv 2020 pdf https arxiv org abs 2006 09920 code http tanmaygupta info info ground 1 unsupervised natural language inference via decoupled multimodal contrastive learning wanyun cui guangyu zheng wei wang emnlp 2020 pdf https arxiv org abs 2010 08200 1 videoclip contrastive pre training for zero shot video text understanding hu xu gargi ghosh po yao huang dmytro okhonko armen aghajanyan florian metze luke zettlemoyer christoph feichtenhofer emnlp 2021 pdf https arxiv org abs 2109 14084 code https github com pytorch fairseq tree main examples mmpt 1 scaling up visual and vision language representation learning with noisy text supervision chao jia yinfei yang ye xia yi ting chen zarana parekh hieu pham quoc v le yunhsuan sung zhen li tom duerig icml 2021 pdf https arxiv org abs 2102 05918 1 umic an unreferenced metric for image captioning via contrastive learning hwanhee lee seunghyun yoon franck dernoncourt trung bui kyomin jung acl 2021 pdf https arxiv org abs 2106 14019 code https github com hwanheelee1993 umic 1 blip bootstrapping language image pre training for unified vision language understanding and generation junnan li dongxu li caiming xiong steven hoi arxiv pdf https arxiv org abs 2201 12086 code https github com salesforce blip 1 cyclip cyclic contrastive language image pretraining shashank goel hritik bansal sumit bhatia ryan a rossi vishwa vinay aditya grover arxiv pdf https arxiv org abs 2205 14459 code https github com goel shashank cyclip 1 learning video representations using contrastive bidirectional transformer chen sun fabien baradel kevin murphy cordelia schmid arxiv pdf https arxiv org abs 1906 05743 others 1 towards unsupervised dense information retrieval with contrastive learning gautier izacard mathilde caron lucas hosseini sebastian riedel piotr bojanowski armand joulin edouard grave arxiv pdf https arxiv org abs 2112 09118 1 text and code embeddings by contrastive pre training arvind neelakantan tao xu raul puri alec radford jesse michael han jerry tworek qiming yuan nikolas tezak jong wook kim chris hallacy johannes heidecke pranav shyam boris power tyna eloundou nekoul girish sastry gretchen krueger david schnurr felipe petroski such kenny hsu madeleine thompson tabarak khan toki sherbakov joanne jang peter welinder lilian weng arxiv pdf https arxiv org abs 2201 10005 code https openai com blog introducing text and code embeddings 1 multi level contrastive learning for cross lingual alignment beiduo chen wu guo bin gu quan liu yongchao wang icassp 2022 pdf https arxiv org abs 2202 13083 code https github com salesforce blip 1 understanding hard negatives in noise contrastive estimation wenzheng zhang karl stratos naacl 2021 pdf https arxiv org abs 2104 06245 code https github com wenzhengzhang hard nce el 1 scaling deep contrastive learning batch size under memory limited setup luyu gao yunyi zhang jiawei han jamie callan repl4nlp 2021 pdf https arxiv org abs 2101 06983 code https github com luyug gradcache 1 contrastive distillation on intermediate representations for language model compression siqi sun zhe gan yu cheng yuwei fang shuohang wang jingjing liu emnlp 2020 pdf https arxiv org abs 2009 14167 code https github com intersun codir 1 fairfil contrastive neural debiasing method for pretrained text encoders pengyu cheng weituo hao siyang yuan shijing si lawrence carin iclr 2021 pdf https arxiv org abs 2103 06413 1 get your vitamin c robust fact verification with contrastive evidence tal schuster adam fisch regina barzilay naacl 2021 pdf https arxiv org abs 2103 08541 code https github com talschuster vitaminc 1 supporting clustering with contrastive learning dejiao zhang feng nan xiaokai wei shangwen li henghui zhu kathleen mckeown ramesh nallapati andrew arnold bing xiang naacl 2021 pdf https arxiv org abs 2103 12953 code https github com amazon research sccl 1 modeling discriminative representations for out of domain detection with supervised contrastive learning zhiyuan zeng keqing he yuanmeng yan zijun liu yanan wu hong xu huixing jiang weiran xu acl 2021 pdf https arxiv org abs 2105 14289 code https github com parzival27 supervised contrastive learning for out of domain detection 1 contrastive out of distribution detection for pretrained transformers wenxuan zhou fangyu liu muhao chen emnlp 2021 pdf https arxiv org abs 2104 08812 code https github com wzhouad contra ood 1 contrastive fine tuning improves robustness for neural rankers xiaofei ma cicero nogueira dos santos andrew o arnold findings of acl 2021 pdf https arxiv org abs 2105 12932 1 contrastive code representation learning paras jain ajay jain tianjun zhang pieter abbeel joseph e gonzalez ion stoica emnlp 2021 pdf https arxiv org abs 2007 04973 code https github com parasj contracode 1 knowledge representation learning with contrastive completion coding bo ouyang wenbing huang runfa chen zhixing tan yang liu maosong sun jihong zhu findings of emnlp 2021 pdf https aclanthology org 2021 findings emnlp 263 pdf 1 adversarial training with contrastive learning in nlp daniela n rim dongnyeong heo heeyoul choi arxiv pdf https arxiv org abs 2109 09075 1 simple contrastive representation adversarial learning for nlp tasks deshui miao jiaqi zhang wenbo xie jian song xin li lijuan jia ning guo arxiv pdf https arxiv org abs 2111 13301 1 learning to retrieve prompts for in context learning ohad rubin jonathan herzig jonathan berant arxiv pdf https arxiv org abs 2112 08633 1 relic retrieving evidence for literary claims katherine thai yapei chang kalpesh krishna mohit iyyer acl 2022 pdf https arxiv org abs 2203 10053 code https relic cs umass edu 1 multi level contrastive learning for cross lingual alignment beiduo chen wu guo bin gu quan liu yongchao wang icassp 2022 pdf https arxiv org abs 2202 13083 1 multi scale self contrastive learning with hard negative mining for weakly supervised query based video grounding shentong mo daizong liu wei hu arxiv pdf https arxiv org abs 2203 03838 1 contrastive demonstration tuning for pre trained language models xiaozhuan liang ningyu zhang siyuan cheng zhen bi zhenru zhang chuanqi tan songfang huang fei huang huajun chen arxiv pdf https arxiv org abs 2204 04392 code https github com zjunlp promptkg tree main research demo tuning 1 gl clef a global local contrastive learning framework for cross lingual spoken language understanding libo qin qiguang chen tianbao xie qixin li jian guang lou wanxiang che min yen kan acl 2022 pdf https arxiv org abs 2204 08325 code https github com lightchen233 gl clef 1 zero shot stance detection via contrastive learning bin liang zixiao chen lin gui yulan he min yang and ruifeng xu www 2022 pdf https dl acm org doi 10 1145 3485447 3511994 code https github com hitsz hlt pt hcl 1 multi level contrastive learning for cross lingual spoken language understanding shining liang linjun shou jian pei ming gong wanli zuo xianglin zuo daxin jiang arxiv pdf https arxiv org abs 2205 03656 1 merit meta path guided contrastive learning for logical reasoning fangkai jiao yangyang guo xuemeng song liqiang nie findings of acl 2022 pdf https aclanthology org 2022 findings acl 276 1 the past mistake is the future wisdom error driven contrastive probability optimization for chinese spell checking yinghui li qingyu zhou yangning li zhongli li ruiyang liu rongyi sun zizhen wang chao li yunbo cao hai tao zheng findings of acl 2022 pdf https aclanthology org 2022 findings acl 252 1 mitigating contradictions in dialogue based on contrastive learning weizhao li junsheng kong ben liao yi cai findings of acl 2022 pdf https aclanthology org 2022 findings acl 219 1 seeking patterns not just memorizing procedures contrastive learning for solving math word problems zhongli li wenxuan zhang chao yan qingyu zhou chao li hongzhi liu yunbo cao findings of acl 2022 pdf https aclanthology org 2022 findings acl 195 1 mitigating the inconsistency between word saliency and model confidence with pathological contrastive training pengwei zhan yang wu shaolei zhou yunjian zhang liming wang findings of acl 2022 pdf https aclanthology org 2022 findings acl 175 1 disentangled knowledge transfer for ood intent discovery with unified contrastive learning yutao mou keqing he yanan wu zhiyuan zeng hong xu huixing jiang wei wu weiran xu acl 2022 pdf https aclanthology org 2022 acl short 6 1 jointcl a joint contrastive learning framework for zero shot stance detection bin liang qinglin zhu xiang li min yang lin gui yulan he ruifeng xu acl 2022 pdf https aclanthology org 2022 acl long 7 1 new intent discovery with pre training and contrastive learning yuwei zhang haode zhang li ming zhan xiao ming wu albert lam acl 2022 pdf https aclanthology org 2022 acl long 21 1 rocbert robust chinese bert with multimodal contrastive pretraining hui su weiwei shi xiaoyu shen zhou xiao tuo ji jiarui fang jie zhou acl 2022 pdf https aclanthology org 2022 acl long 65 1 sentence aware contrastive learning for open domain passage retrieval wu hong zhuosheng zhang jinyuan wang hai zhao acl 2022 pdf https aclanthology org 2022 acl long 76 1 improving event representation via simultaneous weakly supervised contrastive learning and clustering jun gao wei wang changlong yu huan zhao wilfred ng ruifeng xu acl 2022 pdf https aclanthology org 2022 acl long 216 1 contrastive visual semantic pretraining magnifies the semantics of natural language representations robert wolfe aylin caliskan acl 2022 pdf https aclanthology org 2022 acl long 217 1 multilingual molecular representation learning via contrastive pre training zhihui guo pramod sharma andy martinez liang du robin abraham acl 2022 pdf https aclanthology org 2022 acl long 242 1 simkgc simple contrastive knowledge graph completion with pre trained language models liang wang wei zhao zhuoyu wei jingming liu acl 2022 pdf https aclanthology org 2022 acl long 295 1 rewire then probe a contrastive recipe for probing biomedical knowledge of pre trained language models zaiqiao meng fangyu liu ehsan shareghi yixuan su charlotte collins nigel collier acl 2022 pdf https aclanthology org 2022 acl long 329 1 knn contrastive learning for out of domain intent classification yunhua zhou peiju liu xipeng qiu acl 2022 pdf https aclanthology org 2022 acl long 352 1 cross modal contrastive learning for speech translation rong ye mingxuan wang lei li naacl 2022 pdf https arxiv org abs 2205 02444 1 revisit overconfidence for ood detection reassigned contrastive learning with adaptive class dependent threshold yanan wu keqing he yuanmeng yan qixiang gao zhiyuan zeng fujia zheng lulu zhao huixing jiang wei wu weiran xu naacl 2022 pdf 1 contrastive representation learning for cross document coreference resolution of events and entities benjamin hsu graham horwood naacl 2022 pdf https arxiv org abs 2205 11438 1 domain confused contrastive learning for unsupervised domain adaptation quanyu long tianze luo wenya wang sinno pan naacl 2022 pdf 1 intent detection and discovery from user logs via deep semi supervised contrastive clustering rajat kumar mayur patidar vaibhav varshney lovekesh vig gautam shroff naacl 2022 pdf 1 detect rumors in microblog posts for low resource domains via adversarial contrastive learning hongzhan lin jing ma liangliang chen zhiwei yang mingfei cheng guang chen findings of naacl 2022 pdf https arxiv org abs 2204 08143 1 clmlf a contrastive learning and multi layer fusion method for multimodal sentiment detection zhen li bing xu conghui zhu tiejun zhao findings of naacl 2022 pdf https arxiv org abs 2204 05515 1 prompt augmented generative replay via supervised contrastive learning for lifelong intent detection vaibhav varshney mayur patidar rajat kumar lovekesh vig gautam shroff findings of naacl 2022 pdf 1 code mvp learning to represent source code from multiple views with contrastive pre training xin wang yasheng wang yao wan jiawei wang pingyi zhou li li hao wu jin liu findings of naacl 2022 pdf https arxiv org abs 2205 02029 1 self supervised contrastive learning with adversarial perturbations for defending word substitution based attacks zhao meng yihan dong mrinmaya sachan roger wattenhofer findings of naacl 2022 pdf https arxiv org abs 2107 07610 contributor please contact rui zhang https ryanzhumich github io if you want to add any references | contrastive-learning natural-language-processing machine-learning metric-learning representation-learning | ai |
My-Financial-Manager | mfmanager this is a personal project to learn about android mobile application development this application helps the user keep track of all the expenditures and income by recording them project title mfmanager my financial manager project duration mar 1st 2018 apr 8th 2018 framework and language android studio java sqlite usage a user can ul li open accounts li li change balance of each account li li transfer between accounts li li record expense and income li li set monthly budget categories li li enter credit card information and record credit transactions li li set monthly earning plans li ul functions ul li displays net earnings for last 7 days li li displays monthly net earnings for the past 6 months li li displays the percentage of this earnings of this month li li displays the percentage of expense in each budget category li ul costumization ul li user can selects currency li li set the default cash balance li li make balance adjustment lists visible invisible li ul sources ul li lynda com li li this code uses mpandroidchart copyright 2018 philipp jahoda which is licensed under the apache 2 0 license and can be obtained here a href https github com philjay mpandroidchart title mpandroidchart github page mpandroidchart a by philjay li ul license copyright 2018 hongjo lim licensed under the apache license version 2 0 the license you may not use this file except in compliance with the license you may obtain a copy of the license at http www apache org licenses license 2 0 unless required by applicable law or agreed to in writing software distributed under the license is distributed on an as is basis without warranties or conditions of any kind either express or implied see the license for the specific language governing permissions and limitations under the license how to get started with the app a href https github com czebahi my financial manager blob master mfmanager 20quick 20start 20guide pdf quick start guide a follow the steps to download and get started on mfmanager screenshots login activity login activity https user images githubusercontent com 35909587 41353323 f126fcbc 6ee9 11e8 8b99 38b41396508f png setting default values the user can set default balance of cash and choose currency select default cash https user images githubusercontent com 35909587 41353056 1f765550 6ee9 11e8 86c5 9c98f9df4b3d png choose currency https user images githubusercontent com 35909587 41353349 05378438 6eea 11e8 9c39 90b7b987303b png main activity navigation drawer menu navigation drawer menu https user images githubusercontent com 35909587 41353053 1f52a088 6ee9 11e8 9ff2 84ad040304c6 png net earnings for the past 7 days 6 months main activity https user images githubusercontent com 35909587 41351477 78a2113c 6ee4 11e8 8de3 cba774759b58 png assets pie chart budget summary for this month assets piechart https user images githubusercontent com 35909587 41353480 58d8f450 6eea 11e8 85d0 299a5c9fd8da png budget summary https user images githubusercontent com 35909587 41353473 566e7ce4 6eea 11e8 93f2 0880f023a006 png the percentage of earnings earning progressbar2 https user images githubusercontent com 35909587 41352351 41aee3a0 6ee7 11e8 88c5 3b0db23e6483 png showing expense activity displays all the records of expense showing expense activity https user images githubusercontent com 35909587 41353061 1fcb0c8a 6ee9 11e8 8710 cad4fe1963cd png adding expense activity the user can add expense adding expense activity https user images githubusercontent com 35909587 41352375 524748b0 6ee7 11e8 9a47 35b0650c0205 png showing income activity displays all the records of income showing income acgtivity https user images githubusercontent com 35909587 41353062 1fd838ce 6ee9 11e8 82fb bf9b86dbb07f png adding income activity the user can add income salary https user images githubusercontent com 35909587 41353054 1f661208 6ee9 11e8 947b 27c2206a0f5a png showing accounts activity displays all the accounts showing accounts activity https user images githubusercontent com 35909587 41352499 a0958bda 6ee7 11e8 9e9b 9518679298fd png setting accounts activity the user can set an account setting accounts activity https user images githubusercontent com 35909587 41353057 1f864e4c 6ee9 11e8 9596 b28037dcc111 png budget activity the user can set expense categories budget activity https user images githubusercontent com 35909587 41353477 57bf34f8 6eea 11e8 8ba8 5acf87354576 png earning plans activity the user can set income categories earning plan activity https user images githubusercontent com 35909587 41353322 efee9bb6 6ee9 11e8 8cae 05c0721f5567 png transfer activity the user can transfer balance between accounts including cash transfer activity https user images githubusercontent com 35909587 41353063 1fee573a 6ee9 11e8 8df2 e8436a09c445 png showing credit activity the user can view the list of credit cards showing credit activity https user images githubusercontent com 35909587 41353059 1fafc952 6ee9 11e8 8dfd a91bf129b237 png showing credit detail activity the user can show the detail of credit cards showing credit detail activity https user images githubusercontent com 35909587 41353060 1fbfd068 6ee9 11e8 9ec1 4ebc96e04331 png | front_end |
|
mobile | mobile development folder independent access mobi examples of mobile end page development mobilehack here to collect a number of mobile terminal encountered on a variety of pit and relative solutions https github com rubylouvre mobilehack trip mobile terminal experience https github com doyoe trip mobile tech mobile client development usezepto how to use zepto mobile layout example mobile layout exmaple infinitescrollpage infinite drop distributed components can be customized to automatically load pages and flexible configuration manual loading album 3d album management game h5 mobile game gallery zepto js mobile gallery fullpage three fullpage examples ebx examples of a mobile terminal project | front_end |
|
frontexpress | frontexpress http fontmeme com embed php text frontexpress name atype 201 20light ttf size 90 style color 6f6f75 https frontexpressjs com an express js style router for the front end code the front end like the back end same language same framework frontexpress demo https github com camelaissani frontexpress demo build status https travis ci org camelaissani frontexpress svg branch master https travis ci org camelaissani frontexpress code climate https codeclimate com github camelaissani frontexpress badges gpa svg https codeclimate com github camelaissani frontexpress coverage status https coveralls io repos github camelaissani frontexpress badge svg branch master https coveralls io github camelaissani frontexpress branch master dependencies https img shields io gemnasium mathiasbynens he svg size shield https img shields io badge size 3 55kb brightgreen svg npm https img shields io npm dm frontexpress svg https www npmjs com package frontexpress js import frontexpress from frontexpress front end application const app frontexpress handles http 401 app use req res next if res status 401 window alert you are not authenticated please sign in else next app get req res document queryselector content innerhtml hello world app post login user req res document queryselector content innerhtml welcome req params user start listening front end requests emitted received app listen features you already know expressjs http expressjs com then you know frontexpress simple minimal core extendable through plugins lighweight framework build your front end application by handling routes ideal for single page application manage ajax requests and browser history installation from npm repository bash npm install frontexpress from bower repository bash bower install frontexpress from cdn on jsdelivr https cdn jsdelivr net npm frontexpress latest frontexpress min js documentation website and documentation https frontexpressjs com tests clone the repository bash git clone git github com camelaissani frontexpress git cd frontexpress install the dependencies and run the test suite bash npm install npm test license mit license | front-end browser router history navigation javascript url-parsing url expressjs middleware spa-application spa | front_end |
nni | div align center img src docs img nni logo png width 600 div br mit licensed https img shields io badge license mit brightgreen svg license issues https img shields io github issues raw microsoft nni svg https github com microsoft nni issues q is 3aissue is 3aopen bugs https img shields io github issues microsoft nni bug svg https github com microsoft nni issues q is 3aissue is 3aopen label 3abug pull requests https img shields io github issues pr raw microsoft nni svg https github com microsoft nni pulls q is 3apr is 3aopen version https img shields io github release microsoft nni svg https github com microsoft nni releases documentation status https readthedocs org projects nni badge version stable https nni readthedocs io en stable badge stable https img shields io github contributors anon microsoft nni https github com microsoft nni graphs contributors img src docs img readme banner png width 100 https nni readthedocs io en stable nni automates feature engineering neural architecture search hyperparameter tuning and model compression for deep learning find the latest features api examples and tutorials in our official documentation https nni readthedocs io https nni readthedocs io zh stable what s new nbsp a href nni released reminder img width 48 src docs img release icon png a new release v3 0 preview is available https github com microsoft nni releases tag v3 0rc1 released on may 5 2022 new demo available youtube entry https www youtube com channel uckcafm6861b2mnyhpbzhavw bilibili https space bilibili com 1649051673 last updated on june 22 2022 new research paper sparta deep learning model sparsity via tensor with sparsity attribute https www usenix org system files osdi22 zheng ningxin pdf published in osdi 2022 new research paper privacy preserving online automl for domain specific face detection https openaccess thecvf com content cvpr2022 papers yan privacy preserving online automl for domain specific face detection cvpr 2022 paper pdf published in cvpr 2022 newly upgraded documentation doc upgraded https nni readthedocs io en stable installation see the nni installation guide https nni readthedocs io en stable installation html to install from pip or build from source to install the current release pip install nni to update nni to the latest version add upgrade flag to the above commands nni capabilities in a glance img src docs img overview svg width 100 table tbody tr align center valign bottom td td td b hyperparameter tuning b img src docs img bar png td td b neural architecture search b img src docs img bar png td td b model compression b img src docs img bar png td tr tr valign top td align center valign middle b algorithms b td td ul li b exhaustive search b li ul li a href https nni readthedocs io en latest reference hpo html nni algorithms hpo gridsearch tuner gridsearchtuner grid search a li li a href https nni readthedocs io en latest reference hpo html nni algorithms hpo random tuner randomtuner random a li ul li b heuristic search b li ul li a href https nni readthedocs io en latest reference hpo html nni algorithms hpo hyperopt tuner hyperopttuner anneal a li li a href https nni readthedocs io en latest reference hpo html nni algorithms hpo evolution tuner evolutiontuner evolution a li li a href https nni readthedocs io en latest reference hpo html nni algorithms hpo hyperband advisor hyperband hyperband a li li a href https nni readthedocs io en latest reference hpo html nni algorithms hpo pbt tuner pbttuner pbt a li ul li b bayesian optimization b li ul li a href https nni readthedocs io en latest reference hpo html nni algorithms hpo bohb advisor bohb bohb a li li a href https nni readthedocs io en latest reference hpo html nni algorithms hpo dngo tuner dngotuner dngo a li li a href https nni readthedocs io en latest reference hpo html nni algorithms hpo gp tuner gptuner gp a li li a href https nni readthedocs io en latest reference hpo html nni algorithms hpo metis tuner metistuner metis a li li a href https nni readthedocs io en latest reference hpo html nni algorithms hpo smac tuner smactuner smac a li li a href https nni readthedocs io en latest reference hpo html nni algorithms hpo tpe tuner tpetuner tpe a li ul ul td td ul li b multi trial b li ul li a href https nni readthedocs io en latest nas exploration strategy html grid search strategy grid search a li li a href https nni readthedocs io en latest nas exploration strategy html policy based rl strategy policy based rl a li li a href https nni readthedocs io en latest nas exploration strategy html random strategy random a li li a href https nni readthedocs io en latest nas exploration strategy html regularized evolution strategy regularized evolution a li li a href https nni readthedocs io en latest nas exploration strategy html tpe strategy tpe a li ul li b one shot b li ul li a href https nni readthedocs io en latest nas exploration strategy html darts strategy darts a li li a href https nni readthedocs io en latest nas exploration strategy html enas strategy enas a li li a href https nni readthedocs io en latest nas exploration strategy html fbnet strategy fbnet a li li a href https nni readthedocs io en latest nas exploration strategy html proxylessnas strategy proxylessnas a li li a href https nni readthedocs io en latest nas exploration strategy html spos strategy spos a li ul ul td td ul li b pruning b li ul li a href https nni readthedocs io en latest compression pruner html level pruner level a li li a href https nni readthedocs io en latest compression pruner html l1 norm pruner l1 norm a li li a href https nni readthedocs io en latest compression pruner html taylor fo weight pruner taylor fo weight a li li a href https nni readthedocs io en latest compression pruner html movement pruner movement a li li a href https nni readthedocs io en latest compression pruner html agp pruner agp a li li a href https nni readthedocs io en latest compression pruner html auto compress pruner auto compress a li li a href https nni readthedocs io en latest compression pruner html more a li ul li b quantization b li ul li a href https nni readthedocs io en latest compression quantizer html naive quantizer naive a li li a href https nni readthedocs io en latest compression quantizer html qat quantizer qat a li li a href https nni readthedocs io en latest compression quantizer html lsq quantizer lsq a li li a href https nni readthedocs io en latest compression quantizer html observer quantizer observer a li li a href https nni readthedocs io en latest compression quantizer html dorefa quantizer dorefa a li li a href https nni readthedocs io en latest compression quantizer html bnn quantizer bnn a li ul ul td tr align center valign bottom td td td b supported frameworks b img src docs img bar png td td b training services b img src docs img bar png td td b tutorials b img src docs img bar png td tr tr valign top td align center valign middle b supports b td td ul li pytorch li li tensorflow li li scikit learn li li xgboost li li lightgbm li li mxnet li li caffe2 li li more li ul td td ul li a href https nni readthedocs io en latest experiment local html local machine a li li a href https nni readthedocs io en latest experiment remote html remote ssh servers a li li a href https nni readthedocs io en latest experiment aml html azure machine learning aml a li li b kubernetes based b li ul li a href https nni readthedocs io en latest experiment openpai html openapi a li li a href https nni readthedocs io en latest experiment kubeflow html kubeflow a li li a href https nni readthedocs io en latest experiment frameworkcontroller html frameworkcontroller a li li a href https nni readthedocs io en latest experiment adaptdl html adaptdl a li li a href https nni readthedocs io en latest experiment paidlc html pai dlc a li ul li a href https nni readthedocs io en latest experiment hybrid html hybrid training services a li ul td td ul li b hpo b li ul li a href https nni readthedocs io en latest tutorials hpo quickstart pytorch main html pytorch a li li a href https nni readthedocs io en latest tutorials hpo quickstart tensorflow main html tensorflow a li ul li b nas b li ul li a href https nni readthedocs io en latest tutorials hello nas html hello nas a li li a href https nni readthedocs io en latest tutorials nasbench as dataset html nas benchmarks a li ul li b compression b li ul li a href https nni readthedocs io en latest tutorials pruning quick start html pruning a li li a href https nni readthedocs io en latest tutorials pruning speed up html pruning speedup a li li a href https nni readthedocs io en latest tutorials quantization quick start html quantization a li li a href https nni readthedocs io en latest tutorials quantization speed up html quantization speedup a li ul ul td tbody table img src docs static img webui gif alt webui width 100 resources nni documentation homepage https nni readthedocs io en stable nni installation guide https nni readthedocs io en stable installation html nni examples https nni readthedocs io en latest examples html python api reference https nni readthedocs io en latest reference python api html releases change log https nni readthedocs io en latest release html related research and publications https nni readthedocs io en latest notes research publications html youtube channel of nni https www youtube com channel uckcafm6861b2mnyhpbzhavw bilibili space of nni https space bilibili com 1649051673 webinar of introducing retiarii a deep learning exploratory training framework on nni https note microsoft com msr webinar retiarii registration live html community discussions https github com microsoft nni discussions contribution guidelines if you want to contribute to nni be sure to review the contribution guidelines https nni readthedocs io en stable notes contributing html which includes instructions of submitting feedbacks best coding practices and code of conduct we use github issues https github com microsoft nni issues to track tracking requests and bugs please use nni discussion https github com microsoft nni discussions for general questions and new ideas for questions of specific use cases please go to stack overflow https stackoverflow com questions tagged nni participating discussions via the following im groups is also welcomed gitter wechat image https user images githubusercontent com 39592018 80665738 e0574a80 8acc 11ea 91bc 0836dc4cbf89 png or image https github com scarlett2018 nniutil raw master wechat png over the past few years nni has received thousands of feedbacks on github issues and pull requests from hundreds of contributors we appreciate all contributions from community to make nni thrive img src https img shields io github contributors anon microsoft nni a href https github com microsoft nni graphs contributors img src https contrib rocks image repo microsoft nni max 240 columns 18 a test status essentials type status fast test build status https msrasrg visualstudio com nniopensource apis build status fast 20test branchname master https msrasrg visualstudio com nniopensource build latest definitionid 54 branchname master full test hpo build status https msrasrg visualstudio com nniopensource apis build status full 20test 20 20hpo reponame microsoft 2fnni branchname master https msrasrg visualstudio com nniopensource build latest definitionid 90 reponame microsoft 2fnni branchname master full test nas build status https msrasrg visualstudio com nniopensource apis build status full 20test 20 20nas reponame microsoft 2fnni branchname master https msrasrg visualstudio com nniopensource build latest definitionid 89 reponame microsoft 2fnni branchname master full test compression build status https msrasrg visualstudio com nniopensource apis build status full 20test 20 20compression reponame microsoft 2fnni branchname master https msrasrg visualstudio com nniopensource build latest definitionid 91 reponame microsoft 2fnni branchname master training services type status local linux build status https msrasrg visualstudio com nniopensource apis build status integration 20test 20 20local 20 20linux branchname master https msrasrg visualstudio com nniopensource build latest definitionid 92 branchname master local windows build status https msrasrg visualstudio com nniopensource apis build status integration 20test 20 20local 20 20windows branchname master https msrasrg visualstudio com nniopensource build latest definitionid 98 branchname master remote linux to linux build status https msrasrg visualstudio com nniopensource apis build status integration 20test 20 20remote 20 20linux 20to 20linux branchname master https msrasrg visualstudio com nniopensource build latest definitionid 64 branchname master remote windows to windows build status https msrasrg visualstudio com nniopensource apis build status integration 20test 20 20remote 20 20windows 20to 20windows branchname master https msrasrg visualstudio com nniopensource build latest definitionid 99 branchname master openpai build status https msrasrg visualstudio com nniopensource apis build status integration 20test 20 20openpai 20 20linux branchname master https msrasrg visualstudio com nniopensource build latest definitionid 65 branchname master frameworkcontroller build status https msrasrg visualstudio com nniopensource apis build status integration 20test 20 20frameworkcontroller branchname master https msrasrg visualstudio com nniopensource build latest definitionid 70 branchname master kubeflow build status https msrasrg visualstudio com nniopensource apis build status integration 20test 20 20kubeflow branchname master https msrasrg visualstudio com nniopensource build latest definitionid 69 branchname master hybrid build status https msrasrg visualstudio com nniopensource apis build status integration 20test 20 20hybrid branchname master https msrasrg visualstudio com nniopensource build latest definitionid 79 branchname master azureml build status https msrasrg visualstudio com nniopensource apis build status integration 20test 20 20aml branchname master https msrasrg visualstudio com nniopensource build latest definitionid 78 branchname master related projects targeting at openness and advancing state of art technology microsoft research msr https www microsoft com en us research group systems and networking research group asia had also released few other open source projects openpai https github com microsoft pai an open source platform that provides complete ai model training and resource management capabilities it is easy to extend and supports on premise cloud and hybrid environments in various scale frameworkcontroller https github com microsoft frameworkcontroller an open source general purpose kubernetes pod controller that orchestrate all kinds of applications on kubernetes by a single controller mmdnn https github com microsoft mmdnn a comprehensive cross framework solution to convert visualize and diagnose deep neural network models the mm in mmdnn stands for model management and dnn is an acronym for deep neural network sptag https github com microsoft sptag space partition tree and graph sptag is an open source library for large scale vector approximate nearest neighbor search scenario nn meter https github com microsoft nn meter an accurate inference latency predictor for dnn models on diverse edge devices we encourage researchers and students leverage these projects to accelerate the ai development and research license the entire codebase is under mit license license | automl deep-learning neural-architecture-search hyperparameter-optimization distributed bayesian-optimization automated-machine-learning machine-learning machine-learning-algorithms data-science tensorflow pytorch neural-network deep-neural-network model-compression feature-engineering nas python hyperparameter-tuning mlops | ai |
natural-language-processing | intro to natural language processing brought to you by lesley cordero http www columbia edu lc2958 and adi https adicu com last major update was in 2017 and isn t being actively maintained table of contents 0 0 setup 00 setup 0 1 python pip 01 python pip 0 2 libraries 02 libraries 0 3 other 03 other 1 0 background 10 background 1 1 what is nlp 11 what is nlp 1 2 why is nlp important 12 why is nlp importance 1 3 why is nlp a hard problem 13 why is nlp a hard problem 1 4 glossary 14 glossary 2 0 sentiment analysis 20 sentiment analysis 2 1 preparing the data 21 preparing the data 2 1 1 training data 211 training data 2 1 2 test data 212 test data 2 2 building a classifier 22 building a classifier 2 3 classification 53 classification 2 4 accuracy 24 accuracy 3 0 regular expressions 30 regular expressions 3 1 simplest form 31 simplest form 3 2 case sensitivity 32 case sensitivity 3 3 disjunctions 33 disjunctions 3 4 ranges 34 ranges 3 5 exclusions 35 exclusions 3 6 question marks 36 question marks 3 7 kleene star 37 kleene star 3 8 wildcards 38 wildcards 3 9 kleene 39 kleene 4 0 word tagging and models 40 word tagging and models 4 1 nltk parts of speech tagger 41 nltk parts of speech tagger 4 1 1 ambiguity 411 ambiguity 4 2 unigram models 42 unigram models 4 3 bigram models 43 bigram models 5 0 normalizing text 40 normalizing text 5 1 stemming 51 stemming 5 1 1 what is stemming 511 what is stemming 5 1 2 types of stemmers 512 types of stemmers 5 2 lemmatization 52 lemmatization 5 2 1 what is lemmatization 521 what is lemmatization 5 2 2 wordnetlemmatizer 522 wordnetlemmatizer 6 0 final words 60 final words 6 1 resources 61 resources 6 2 more stuff 62 mini courses 0 0 setup this guide was written in python 3 6 0 1 python anaconda download python https www python org downloads and pip http docs continuum io anaconda install 0 2 libraries we ll be working with the re library for regular expressions and nltk for natural language processing techniques so make sure to install them to install these libraries enter the following commands into your terminal pip3 install re pip3 install nltk 0 3 other since we ll be working on textual analysis we ll be using datasets that are already well established and widely used to gain access to these datasets enter the following command into your command line note that this might take a few minutes sudo python3 m nltk downloader all lastly download the data we ll be working with in this example positive tweets https github com lesley2958 natural language processing blob master pos tweets txt br negative tweets https github com lesley2958 natural language processing blob master neg tweets txt now you re all set to begin 1 0 background 1 1 what is nlp natural language processing or nlp is an area of computer science that focuses on developing techniques to produce machine driven analyses of text 1 2 why is natural language processing important nlp expands the sheer amount of data that can be used for insight since so much of the data we have available is in the form of text this is extremely important to data science a specific common application of nlp is each time you use a language conversion tool the techniques used to accurately convert text from one language to another very much falls under the umbrella of natural language processing 1 3 why is nlp a hard problem language is inherently ambiguous once person s interpretation of a sentence may very well differ from another person s interpretation because of this inability to consistently be clear it s hard to have an nlp technique that works perfectly 1 4 glossary here is some common terminology that we ll encounter throughout the workshop b corpus b plural corpora a collection of written texts that serve as our datasets b nltk b natural language toolkit the python module we ll be using repeatedly it has a lot of useful built in nlp techniques b token b a string of contiguous characters between two spaces or between a space and punctuation marks a token can also be an integer real or a number with a colon 2 0 sentiment analysis so you might be asking what exactly is sentiment analysis well sentiment analysis involves building a system to collect and determine the emotional tone behind words this is important because it allows you to gain an understanding of the attitudes opinions and emotions of the people in your data at a high level sentiment analysis involves natural language processing and artificial intelligence by taking the actual text element transforming it into a format that a machine can read and using statistics to determine the actual sentiment 2 1 preparing the data to accomplish sentiment analysis computationally we have to use techniques that will allow us to learn from data that s already been labeled so what s the first step formatting the data so that we can actually apply nlp techniques python import nltk def format sentence sent return word true for word in nltk word tokenize sent here format sentence changes a piece of text in this case a tweet into a dictionary of words mapped to true booleans though not obvious from this function alone this will eventually allow us to train our prediction model by splitting the text into its tokens i e i tokenizing i the text true animals true are true the true ever true dogs true best true you ll learn about why this format is important is section 2 2 using the data on the github repo we ll actually format the positively and negatively labeled data python pos with open pos tweets txt as f for i in f pos append format sentence i pos python neg with open neg tweets txt as f for i in f neg append format sentence i neg 2 1 1 training data next we ll split the labeled data we have into two pieces one that can train data and the other to give us insight on how well our model is performing the training data will inform our model on which features are most important python training pos int 9 len pos neg int 9 len neg 2 1 2 test data we won t use the test data until the very end of this section but nevertheless we save the last 10 of the data to check the accuracy of our model python test pos int 1 len pos neg int 1 len neg 2 2 building a classifier all nltk classifiers work with feature structures which can be simple dictionaries mapping a feature name to a feature value in this example we ve used a simple bag of words model where every word is a feature name with a value of true python from nltk classify import naivebayesclassifier classifier naivebayesclassifier train training to see which features informed our model the most we can run this line of code python classifier show most informative features most informative features no true neg pos 20 6 1 0 awesome true pos neg 18 7 1 0 headache true neg pos 18 0 1 0 beautiful true pos neg 14 2 1 0 love true pos neg 14 2 1 0 hi true pos neg 12 7 1 0 glad true pos neg 9 7 1 0 thank true pos neg 9 7 1 0 fan true pos neg 9 7 1 0 lost true neg pos 9 3 1 0 2 3 classification just to see that our model works let s try the classifier out with a positive example python example1 this workshop is awesome print classifier classify format sentence example1 pos now for a negative example python example2 this workshop is awful print classifier classify format sentence example2 neg 2 4 accuracy now there s no point in building a model if it doesn t work well luckily once again nltk comes to the rescue with a built in feature that allows us find the accuracy of our model python from nltk classify util import accuracy print accuracy classifier test 0 9562326869806094 turns out it works decently well but it could be better i think we can agree that the data is kind of messy there are typos abbreviations grammatical errors of all sorts so how do we handle that can we handle that 3 0 regular expressions a regular expression is a sequence of characters that define a string 3 1 simplest form the simplest form of a regular expression is a sequence of characters contained within b two backslashes b for example i python i would be python 3 2 case sensitivity regular expressions are b case sensitive b which means p and p are distinguishable from eachother this means i python i and i python i would have to be represented differently as follows python and python we can check these are different by running python import re re1 re compile python print bool re1 match python 3 3 disjunctions if you want a regular expression to represent both i python i and i python i however you can use b brackets b or the b pipe b symbol as the disjunction of the two forms for example pp ython or python python could represent either i python i or i python i likewise 0123456789 would represent a single integer digit the pipe symbols are typically used for interchangable strings such as in the following example dog cat 3 4 ranges if we want a regular expression to express the disjunction of a range of characters we can use a b dash b for example instead of the previous example we can write 0 9 similarly we can represent all characters of the alphabet with a z 3 5 exclusions brackets can also be used to represent what an expression b cannot b be if you combine it with the b caret b sign for example the expression p represents any character special characters included but p 3 6 question marks question marks can be used to represent the expressions containing zero or one instances of the previous character for example i colou r represents either i color i or i colour i question marks are often used in cases of plurality for example i computers can be either i computers i or i computer i if you want to extend this to more than one character you can put the simple sequence within parenthesis like this feb ruary this would evaluate to either i february i or i feb i 3 7 kleene star to represent the expressions containing zero or b more b instances of the previous character we use an b asterisk b as the kleene star to represent the set of strings containing i a ab abb abbb i the following regular expression would be used ab 3 8 wildcards wildcards are used to represent the possibility of any character and symbolized with a b period b for example beg n from this regular expression the strings i begun begin began i etc can be generated 3 9 kleene to represent the expressions containing at b least b one or more instances of the previous character we use a b plus b sign to represent the set of strings containing i ab abb abbb i the following regular expression would be used ab 4 0 word tagging and models given any sentence you can classify each word as a noun verb conjunction or any other class of words when there are hundreds of thousands of sentences even millions this is obviously a large and tedious task but it s not one that can t be solved computationally 4 1 nltk parts of speech tagger nltk is a package in python that provides libraries for different text processing techniques such as classification tokenization stemming parsing but important to this example tagging python import nltk text nltk word tokenize python is an awesome language nltk pos tag text python python nnp is vbz an dt awesome jj language nn not sure what dt jj or any other tag is just try this in your python shell python nltk help upenn tagset jj jj adjective or numeral ordinal third ill mannered pre war regrettable oiled calamitous first separable ectoplasmic battery powered participatory fourth still to be named multilingual multi disciplinary 4 1 1 ambiguity but what if a word can be tagged as more than one part of speech for example the word sink depending on the content of the sentence it could either be a noun or a verb furthermore what if a piece of text demonstrates a rhetorical device like sarcasm or irony clearly this can mislead the sentiment analyzer to misclassify a regular expression 4 2 unigram models remember our bag of words model from earlier one of its characteristics was that it didn t take the ordering of the words into account that s why we were able to use dictionaries to map each words to true values with that said unigram models are models where the order doesn t make a difference in our model you might be wondering why we care about unigram models since they seem to be so simple but don t let their simplicity fool you they re a foundational block for a lot of more advanced techniques in nlp python from nltk corpus import brown brown tagged sents brown tagged sents categories news brown sents brown sents categories news unigram tagger nltk unigramtagger brown tagged sents unigram tagger tag brown sents 2007 various jj of in the at apartments nns are ber of in the at terrace nn type nn being beg on in the at ground nn floor nn so ql that cs entrance nn is bez direct jj 4 3 bigram models here ordering does matter python bigram tagger nltk bigramtagger brown tagged sents bigram tagger tag brown sents 2007 notice the changes from the last time we tagged the words of this same sentence various jj of in the at apartments nns are ber of in the at terrace nn type nn being beg on in the at ground nn floor nn so cs that cs entrance nn is bez direct jj 5 0 normalizing text the best data is data that s consistent textual data usually isn t but we can make it that way by normalizing it to do this we can do a number of things at the very least we can make all the text so that it s all in lowercase you may have already done this before given a piece of text python raw omg natural language processing is so cool and i m really enjoying this workshop tokens nltk word tokenize raw tokens i lower for i in tokens omg natural language processing is so cool and i m really enjoying this workshop 5 1 stemming but we can do more 5 1 1 what is stemming stemming is the process of converting the words of a sentence to its non changing portions in the example of amusing amusement and amused above the stem would be amus 5 1 2 types of stemmers you re probably wondering how do i convert a series of words to its stems luckily nltk has a few built in and established stemmers available for you to use they work slightly differently since they follow different rules which you use depends on whatever you happen to be working on first let s try the lancaster stemmer python lancaster nltk lancasterstemmer stems lancaster stem i for i in tokens this should have the output omg nat langu process is so cool and i m real enjoy thi workshop secondly we try the porter stemmer python porter nltk porterstemmer stem porter stem i for i in tokens notice how natural maps to natur instead of nat and really maps to realli instead of real in the last stemmer omg natur languag process is so cool and i m realli enjoy thi workshop 5 2 lemmatization 5 2 1 what is lemmatization lemmatization is the process of converting the words of a sentence to its dictionary form for example given the words amusement amusing and amused the lemma for each and all would be amuse 5 2 2 wordnetlemmatizer once again nltk is awesome and has a built in lemmatizer for us to use python from nltk import wordnetlemmatizer lemma nltk wordnetlemmatizer text women in technology are amazing at coding ex i lower for i in text split lemmas lemma lemmatize i for i in ex woman in technology are amazing at coding notice that women is changed to woman 6 0 final words going back to our original sentiment analysis we could have improved our model in a lot of ways by applying some of techniques we just went through the twitter data is seemingly messy and inconsistent so if we really wanted to get a highly accurate model we could have done some preprocessing on the tweets to clean it up secondly the way in which we built our classifier could have been improved our feature extraction was relatively simple and could have been improved by using a bigram model rather than the bag of words model we could have also fixed our bayes classifier so that it only took the most frequent words into considerations 6 1 resources natural language processing with python http bit ly nlp w python br regular expressions cookbook http bit ly regular expressions cb intermediate natural language processing brought to you by lesley cordero http www columbia edu lc2958 this guide assumes some basic knowledge of natural language processing more specifically it assumes knowledge contained in this http learn adicu com nlp tutorial table of contents 0 0 setup 00 setup 0 1 python and pip 01 python pip 0 2 libraries 02 libraries 0 3 other 03 other 1 0 background 10 background 1 1 polarity flippers 11 polarity flippers 1 1 1 negation 111 negation 1 2 multiword expressions 12 multiword expressions 1 3 wordnet 13 wordnet 1 3 1 synsets 131 synsets 1 3 2 negation 132 negations 1 4 sentiwordnet 14 sentiwordnet 1 5 stop words 15 stop words 1 6 testing 16 testing 1 6 1 cross validation 161 cross validation 1 6 2 precision 162 precision 1 7 logistic regression 17 logistic regression 2 0 information extraction 20 information extraction 2 1 data forms 21 data forms 2 2 what is information extraction 22 what is information extraction 3 0 chunking 30 chunking 3 1 noun phrase chunking 31 noun phrase chunking 4 0 named entity extraction 40 named entity extraction 4 1 spacy 41 spacy 4 2 nltk 42 nltk 5 0 relation extraction 50 relation extraction 5 1 rule based systems 51 rule based systems 5 2 machine learning 52 machine learning 6 0 sentiment analysis 60 sentiment analysis 6 1 loading the data 61 loading the data 6 2 preparing the data 62 preparing the data 6 3 linear classifier 63 linear classifier 7 0 final words 70 final words 7 1 resources 71 resources 7 2 mini courses 72 mini courses 0 0 setup this guide was written in python 3 6 0 1 python pip if you haven t already please download python https www python org downloads and pip https pip pypa io en stable installing 0 2 libraries we ll be working with the re library for regular expressions and nltk for natural language processing techniques so make sure to install them to install these libraries enter the following commands into your terminal pip3 install nltk 3 2 4 pip3 install spacy 1 8 2 pip3 install pandas 0 20 1 pip3 install scikit learn 0 18 1 0 3 other sentence boundary detection requires the dependency parse which requires data to be installed so enter the following command in your terminal python3 m spacy en download all 0 4 virtual environment if you d like to work in a virtual environment you can set it up as follows pip3 install virtualenv virtualenv your env and then launch it with source your env bin activate to execute the visualizations in matplotlib do the following cd matplotlib vim matplotlibrc and then write backend tkagg in the file now you should be set up with your virtual environment cool now we re ready to start 1 0 background 1 1 polarity flippers polarity flippers are words that change positive expressions into negative ones or vice versa 1 1 1 negation negations directly change an expression s sentiment by preceding the word before it an example would be the cat is not nice 1 1 2 constructive discourse connectives constructive discourse connectives are words which indirectly change an expression s meaning with words like but an example would be i usually like cats but this cat is evil 1 2 multiword expressions multiword expressions are important because depending on the context can be considered positive or negative for example this song is shit is definitely considered negative whereas this song is the shit is actually considered positive simply because of the addition of the before the word shit 1 3 wordnet wordnet is an english lexical database with emphasis on synonymy sort of like a thesaurus specifically nouns verbs adjectives and adjectives are grouped into synonym sets 1 3 1 synsets nltk has a built in wordnet that we can use to find synonyms we import it as such python from nltk corpus import wordnet as wn if we feed a word to the synsets method the return value will be the class to which belongs for example if we call the method on motorcycle python print wn synsets motorcar we get synset car n 01 awesome stuff but if we want to take it a step further we can we ve previously learned what lemmas are if you want to obtain the lemmas for a given synonym set you can use the following method python print wn synset car n 01 lemma names this will get you car auto automobile machine motorcar even more you can do things like get the definition of a word python print wn synset car n 01 definition again pretty neat stuff a motor vehicle with four wheels usually propelled by an internal combustion engine 1 3 2 negation with wordnet we can easily detect negations this is great because it s not only fast but it requires no training data and has a fairly good predictive accuracy on the other hand it s not able to handle context well or work with multiple word phrases 1 4 sentiwordnet based on wordnet synsets sentiwordnet is a lexical resource for opinion mining where each synset is assigned three sentiment scores positivity negativity and objectivity python from nltk corpus import sentiwordnet as swn cat swn senti synset cat n 03 python cat pos score python cat neg score python cat obj score 1 5 stop words stop words are extremely common words that would be of little value in our analysis are often excluded from the vocabulary entirely some common examples are determiners like the a an another but your list of stop words or b stop list b depends on the context of the problem you re working on 1 6 testing 1 6 1 cross validation cross validation is a model evaluation method that works by not using the entire data set when training the model i e some of the data is removed before training begins once training is completed the removed data is used to test the performance of the learned model on this data this is important because it prevents your model from over learning or overfitting your data 1 6 2 precision precision is the percentage of retrieved instances that are relevant it measures the exactness of a classifier a higher precision means less false positives while a lower precision means more false positives 1 6 3 recall recall is the percentage of relevant instances that are retrieved higher recall means less false negatives while lower recall means more false negatives improving recall can often decrease precision because it gets increasingly harder to be precise as the sample space increases 1 6 4 f measure the f1 score is a measure of a test s accuracy that considers both the precision and the recall 1 7 logistic regression logistic regression is a generalized linear model commonly used for classifying binary data its output is a continuous range of values between 0 and 1 usually representing the probability and its input is some form of discrete predictor 2 0 information extraction information extraction is the process of acquiring meaning from text in a computational manner 2 1 data forms 2 1 1 structured data structured data is when there is a regular and predictable organization of entities and relationships 2 1 2 unstructured data unstructured data as the name suggests assumes no organization this is the case with most written textual data 2 2 what is information extraction with that said information extraction is the means by which you acquire structured data from a given unstructured dataset there are a number of ways in which this can be done but generally information extraction consists of searching for specific types of entities and relationships between those entities an example is being given the following text martin received a 98 on his math exam whereas jacob received a 84 eli who also took the same test received an 89 lastly ojas received a 72 this is clearly unstructured it requires reading for any logical relationships to be extracted through the use of information extraction techniques however we could output structured data such as the following name grade martin 98 jacob 84 eli 89 ojas 72 3 0 chunking chunking is used for entity recognition and segments and labels multitoken sequences this typically involves segmenting multi token sequences and labeling them with entity types such as person organization or time 3 1 noun phrase chunking noun phrase chunking or np chunking is where we search for chunks corresponding to individual noun phrases we can use nltk as is the case most of the time to create a chunk parser we begin with importing nltk and defining a sentence with its parts of speeches tagged which we covered in the previous tutorial python import nltk sentence the dt little jj yellow jj dog nn barked vbd at in the dt cat nn next we define the tag pattern of an np chunk a tag pattern is a sequence of part of speech tags delimited using angle brackets e g dt jj nn this is how the parse tree for a given sentence is acquired python pattern np dt jj nn finally we create the chunk parser with the nltk regexpparser class python npchunker nltk regexpparser pattern and lastly we actually parse the example sentence and display its parse tree python result npchunker parse sentence result draw 4 0 named entity extraction named entities are noun phrases that refer to specific types of individuals such as organizations people dates etc therefore the purpose of a named entity recognition ner system is to identify all textual mentions of the named entities 4 1 spacy in the following exercise we ll build our own named entity recognition system with the python module spacy a python module commonly used for natural language processing in industry python import spacy import pandas as pd using spacy we ll load the built in english tokenizer tagger parser ner and word vectors we indicate this with the parameter en python nlp spacy load en we need an example to actually process so below is some text from columbia s website python review columbia university was founded in 1754 as king s college by royal charter of king george ii of england it is the oldest institution of higher learning in the state of new york and the fifth oldest in the united states controversy preceded the founding of the college with various groups competing to determine its location and religious affiliation advocates of new york city met with success on the first point while the anglicans prevailed on the latter however all constituencies agreed to commit themselves to principles of religious liberty in establishing the policies of the college in july 1754 samuel johnson held the first classes in a new schoolhouse adjoining trinity church located on what is now lower broadway in manhattan there were eight students in the class at king s college the future leaders of colonial society could receive an education designed to enlarge the mind improve the understanding polish the whole man and qualify them to support the brightest characters in all the elevated stations in life one early manifestation of the institution s lofty goals was the establishment in 1767 of the first american medical school to grant the m d degree with this example in mind we feed it into the tokenizer python doc nlp review going along the process of named entity extraction we begin by segmenting the text i e splitting it into a list of sentences python sentences sentence orth for sentence in doc sents list of sentences print there were sentences found format len sentences and we get there were 9 sentences found now we go a step further and count the number of nounphrases by taking advantage of chunk properties python nounphrases np orth np root head orth for np in doc noun chunks print there were noun phrases found format len nounphrases and we get there were 54 noun phrases found lastly we achieve our final goal entity extraction python entities list doc ents converts entities into a list print there were entities found format len entities and we get there were 22 entities found so now we can turn this into a dataframe for better visualization python orgs and people entity orth for entity in entities if entity label in org person pd dataframe orgs and people unsurprisingly columbia university is an entity along with other names like king s college and samuel johnson 0 columbia university 1 king s college 2 king george ii of england 3 samuel johnson 4 trinity church 5 king s college in summary named entity extraction typically follows the process of sentence segmentation noun phrase chunking and finally entity extraction 4 2 nltk next we ll work through a similar example as before this time using the nltk module to extract the named entities through the use of chunk parsing as always we begin by importing our needed modules and example python import nltk import re content starbucks has not been doing well lately then as always we tokenize the sentence and follow up with parts of speech tagging python tokenized nltk word tokenize content tagged nltk pos tag tokenized print tagged great now we ve got something to work with starbucks nnp has vbz not rb been vbn doing vbg well rb lately rb so we take this pos tagged sentence and feed it to the nltk ne chunk method this method returns a nested tree object so we display the content with namedent draw python namedent nltk ne chunk tagged namedent draw now if you wanted to simply get the named entities from the namedent object we created how do you think you would go about doing so 5 0 relation extraction once we have identified named entities in a text we then want to analyze for the relations that exist between them this can be performed using either rule based systems which typically look for specific patterns in the text that connect entities and the intervening words or using machine learning systems that typically attempt to learn such patterns automatically from a training corpus 5 1 rule based systems in the rule based systems approach we look for all triples of the form x a y where x and y are named entities and a is the string of words that indicates the relationship between x and y using regular expressions we can pull out those instances of a that express the relation that we are looking for in the following code we search for strings that contain the word in the special regular expression b ing b allows us to disregard strings such as success in supervising the transition of where in is followed by a gerund python in re compile r bin b b ing for doc in nltk corpus ieer parsed docs nyt 19980315 for rel in nltk sem relextract extract rels org loc doc corpus ieer pattern in print nltk sem relextract rtuple rel and so we get org whyy in loc philadelphia org mcglashan amp sarrail firm in loc san mateo org freedom forum in loc arlington org brookings institution the research group in loc washington org idealab a self described business incubator based in loc los angeles org open text based in loc waterloo org wgbh in loc boston org bastille opera in loc paris org omnicom in loc new york org ddb needham in loc new york org kaplan thaler group in loc new york org bbdo south in loc atlanta org georgia pacific in loc atlanta note that the x and y named entitities types all match with one another object type matching is an important and required part of this process 5 2 machine learning we won t be going through an example of a machine learning based entity extraction algorithm but it s important to note the different machine learning algorithms that can be implemented to accomplish this task of relation extraction most simply logistic regression can be used to classify the objects that relate to one another but additionally algorithms like suport vector machines and random forest could also accomplish the job which algorithm you ultimately choose depends on which outperforms in terms of speed and accuracy in summary it s important to note that while these algorithms will likely have high accurate rates labeling thousands of relations and entities is incredibly expensive 6 0 sentiment analysis as we saw in the previous tutorial sentiment analysis refers to the use of text analysis and statistical learning to identify and extract subjective information in textual data for our last exercise in this tutorial we ll introduce and use linear models in the context of a sentiment analysis problem 6 1 loading the data first we begin by loading the data since we ll be using data available online we ll use the urllib module to avoid having to manually download any data python import urllib request once imported we ll then define the test and training data urls as variables as well as filenames for each of those datasets this is so that we can easily download these to our local computer python test url https dl dropboxusercontent com u 8082731 datasets umich si650 testdata txt train url https dl dropboxusercontent com u 8082731 datasets umich si650 training txt test file test data csv train file train data csv using the links and filenames from above we ll officially download the data using the urlib request urlretrieve method test data f urllib request urlretrieve test url test file train data f urllib request urlretrieve train url train file now that we ve downloaded our datasets we can load them into pandas dataframes with the read csv function we ll start off with our test data and then repeat the same code for our training data python import pandas as pd test data df pd read csv test file header none delimiter t quoting 3 test data df columns text the key difference here is that we set columns to a list of two elements instead of one this is because we need a column to indicate the label otherwise the model won t be able to train for our text data before however we explicitly don t want the training label since our model will be predicting those labels python train data df pd read csv train file header none delimiter t quoting 3 train data df columns sentiment text just to see how the dataframe looks let s call the head method on both dataframes python test data df head and we get text 0 i don t care what anyone says i like hillar 1 have an awesome time at purdue 2 yep i m still in london which is pretty awes 3 have to say i hate paris hilton s behavior bu 4 i will love the lakers python train data df head and we get sentiment text 0 1 the da vinci code book is just awesome 1 1 this was the first clive cussler i ve ever rea 2 1 i liked the da vinci code a lot 3 1 i liked the da vinci code a lot 4 1 i liked the da vinci code but it ultimatly did 6 2 preparing the data to implement our bag of words linear classifier we need our data in a format that allows us to feed it in to the classifer using sklearn feature extraction text countvectorizer in the python scikit learn module we can convert the text documents to a matrix of token counts so first we import all the needed modules python import re import nltk from sklearn feature extraction text import countvectorizer from nltk stem porter import porterstemmer we need to remove punctuations lowercase remove stop words and stem words all these steps can be directly performed by countvectorizer if we pass the right parameter values we can do this as follows we first create a stemmer using the porter stemmer implementation python stemmer porterstemmer def stem tokens tokens stemmer stemmed stemmer stem item for item in tokens return stemmed here we have our tokenizer which removes non letters and stems python def tokenize text text re sub a za z text tokens nltk word tokenize text stems stem tokens tokens stemmer return stems here we init the vectoriser with the countvectorizer class making sure to pass our tokenizer and stemmers as parameters remove stop words and lowercase all characters python vectorizer countvectorizer analyzer word tokenizer tokenize lowercase true stop words english max features 85 next we use the fit transform method to transform our corpus data into feature vectors since the input needed is a list of strings we concatenate all of our training and test data python features vectorizer fit transform train data df text tolist test data df text tolist here we re simply converting the features to an array so we have an easier data structure to use python features nd features toarray 6 3 linear classifier finally we begin building our classifier earlier we learned what a bag of words model here we ll be using a similar model but with some modifications to refresh your mind this kind of model simplifies text to a multi set of terms frequencies so first we ll split our training data to get an evaluation set as we mentioned before we ll use cross validation to split the data sklearn has a built in method that will do this for us all we need to do is provide the data and assign a training percentage in this case 85 python from sklearn cross validation import train test split x train x test y train y test train test split features nd 0 len train data df train data df sentiment train size 0 85 random state 1234 now we re ready to train our classifier we ll be using logistic regression to model this data once again sklearn has a built in model for you to use so we begin by importing the needed modules and calling the class python from sklearn linear model import logisticregression log model logisticregression and as always we need actually do the training so we call the fit method on our data python log model log model fit x x train y y train now we use the classifier to label the evaluation set we created earlier python y pred log model predict x test you can see that this array of labels looks like array 0 1 0 0 1 0 6 4 accuracy in sklearn there is a function called sklearn metrics classification report which calculates several types of predictive scores on a classification model so here we check out how exactly our model is performing python from sklearn metrics import classification report print classification report y test y pred and we get precision recall f1 score support 0 0 98 0 99 0 98 467 1 0 99 0 98 0 99 596 avg total 0 98 0 98 0 98 1063 where precision recall and f1 score are the accuracy values discussed in the section 1 6 support is the number of occurrences of each class in y true and x true 6 5 retraining finally we can re train our model with all the training data and use it for sentiment classification with the original unlabeled test set so we repeat the process from earlier this time with different data python log model logisticregression log model log model fit x features nd 0 len train data df y train data df sentiment test pred log model predict features nd len train data df so again we can see what the predictions look array 1 1 1 1 1 0 and lastly let s actually look at our predictions using the random module to select a random sliver of the data we predicted on we ll print the results python import random spl random sample range len test pred 10 for text sentiment in zip test data df text spl test pred spl print sentiment text recall that 0 indicates a negative sentence and 1 indicates a positive 0 harvard is dumb i mean they really have to be stupid to have not wanted her to be at their school 0 i ve been working on an article and antid oto has been er so upset about the shitty harvard plagiarizer that he hasn t been able to even look at keyboards 0 i hate the lakers 0 boston sucks 0 stupid kids and their need for honda emblems 1 london museums i really love the museums in london because there are a lot for me to see and they are free 0 stupid ucla 1 as title tho i hate london i did love alittle bit about london 1 i love the lakers even tho trav makes fun of me 1 that i love you aaa lllooootttttt 7 0 final words remembering the sentiment analysis we performed with the naive bayes classifier we can see that the logistic regression classifier performs better with accuracy rates of 98 you might be asking yourself why this is remember that the naive bayes classifier was a unigram model in that it failed to consider the words that preceded 7 1 resources natural language processing with python http bit ly nlp w python br regular expressions cookbook http bit ly regular expressions cb | natural-language-processing curriculum | ai |
FlexGen | flexgen high throughput generative inference of large language models with a single gpu paper https arxiv org abs 2303 06865 flexgen is a high throughput generation engine for running large language models with limited gpu memory flexgen allows high throughput generation by io efficient offloading compression and large effective batch sizes motivation in recent years large language models llms have shown great performance across a wide range of tasks increasingly llms have been applied not only to interactive applications such as chat but also to many back of house tasks these tasks include benchmarking information extraction data wrangling and form processing one key characteristic of these applications is that they are throughput oriented they require running llm inferences over millions of tokens in batches e g all the private documents in a company s corpus or all the tasks in the helm https crfm stanford edu helm latest benchmark these workloads are less sensitive to latency the user starts up a job and lets it run overnight but increasing throughput is critical for reducing costs throughput is a measure of tokens processed per second over the job s entire runtime which can be hours throughput oriented workloads provide opportunities to trade off latency for higher throughput which makes it easier to take advantage of low cost commodity gpus the goal of flexgen is to create a high throughput system to enable new and exciting applications of foundation models to throughput oriented tasks on low cost hardware such as a single commodity gpu instead of expensive systems check out the examples examples of what you can run on a single commodity gpu with flexgen including benchmarking and data wrangling limitation as an offloading based system running on weak gpus flexgen also has its limitations flexgen can be significantly slower than the case when you have enough powerful gpus to hold the whole model especially for small batch cases flexgen is mostly optimized for throughput oriented batch processing settings e g classifying or extracting information from many documents in batches on single gpus this project was made possible thanks to a collaboration with a href https cs stanford edu img src https identity stanford edu wp content uploads sites 3 2020 06 wordmark nospace red png height 20 a nbsp nbsp nbsp a href https sky cs berkeley edu img src https upload wikimedia org wikipedia commons thumb 8 82 university of california 2c berkeley logo svg 1280px university of california 2c berkeley logo svg png height 22 a nbsp nbsp nbsp a href https www andrew cmu edu user beidic img src https upload wikimedia org wikipedia commons 9 9b carnegie mellon wordmark svg height 20 a nbsp nbsp nbsp a href https www together xyz img src https images squarespace cdn com content v1 6358bea282189a0adf57fe16 eef09191 631f 40d9 9bfd f875b25bcf0b together logo black transparent2 png height 20 a nbsp nbsp nbsp a href https research yandex com img src https storage yandexcloud net yandex research assets yandex research png height 20 a nbsp nbsp nbsp a href https ds3lab inf ethz ch img src https user images githubusercontent com 1608867 220273382 c09669b3 42fd 47c2 b88c 7ed55cb43820 png height 20 a content installation installation usage and examples usage and examples get started with a single gpu get started with a single gpu run helm benchmark with flexgen run helm benchmark with flexgen run data wrangling tasks with flexgen run data wrangling tasks with flexgen scaling to distributed gpus scaling to distributed gpus api example api example frequently asked questions frequently asked questions performance results performance results how it works how it works roadmap roadmap installation requirements pytorch 1 12 help https pytorch org get started locally method 1 with pip pip install flexgen method 2 from source git clone https github com fminference flexgen git cd flexgen pip install e usage and examples get started with a single gpu opt 1 3b to get started you can try a small model like opt 1 3b first it fits into a single gpu so no offloading is required flexgen will automatically download weights from hugging face python3 m flexgen flex opt model facebook opt 1 3b you should see some text generated by opt 1 3b and the benchmark results opt 30b to run large models like opt 30b you will need to use cpu offloading you can try commands below the percent argument specifies the offloading strategy for parameters attention cache and hidden states separately the exact meaning of this argument can be found here https github com fminference flexgen blob 9d092d848f106cd9eaf305c12ef3590f7bcb0277 flexgen flex opt py l1271 l1279 python3 m flexgen flex opt model facebook opt 30b percent 0 100 100 0 100 0 opt 175b to run opt 175b you need to download the weights from metaseq https github com facebookresearch metaseq tree main projects opt and convert the weights into alpa format https alpa ai tutorials opt serving html convert opt 175b weights into alpa formats you can then try to offloading all weights to disk by python3 m flexgen flex opt model facebook opt 175b percent 0 0 100 0 100 0 offload dir your ssd folder run helm benchmark with flexgen flexgen can be integrated into helm https crfm stanford edu helm a language model benchmark framework as its execution backend you can use the commands below to run a massive multitask language understanding mmlu scenario https crfm stanford edu helm latest group mmlu with a single t4 16gb gpu and 200gb of dram pip install crfm helm python3 m flexgen apps helm run description mmlu model text subject abstract algebra data augmentation canonical pad to seq len 512 model facebook opt 30b percent 20 80 0 100 0 100 gpu batch size 48 num gpu batches 3 max eval instance 100 note that only a subset of helm scenarios is tested see more tested scenarios here flexgen apps helm passed 30b sh run data wrangling tasks with flexgen you can run the examples in this paper can foundation models wrangle your data https arxiv org abs 2205 09911 by following the instructions here flexgen apps data wrangle scaling to distributed gpus if you have multiple machines with gpus flexgen can combine offloading with pipeline parallelism to allow scaling for example if you have 2 gpus but the aggregated gpu memory is less than the model size you still need offloading flexgen allow you to do pipeline parallelism with these 2 gpus to accelerate the generation but to have scaled performance you should have gpus on distributed machines see examples here https github com fminference flexgen tree main benchmark flexgen distributed gpus api example we demonstrate the usage of flexgen api in completion py flexgen apps completion py this example shows how to run generation for two sentences to get the best throughput out of flexgen you typically need to batch more sentences generation api flexgen has a generation api following the style of hugging face s transformers python output ids model generate input ids do sample true temperature 0 7 max new tokens 32 stop stop example commands you can use the example commands below if you do not have enough gpu cpu memory see the handle out of memory handle out of memory section complete with opt 6 7b you need at least 15gb of gpu memory python3 m flexgen apps completion model facebook opt 6 7b complete with opt 30b you need about 90gb of cpu memory python3 m flexgen apps completion model facebook opt 30b percent 0 100 100 0 100 0 complete with instruction tuned opt iml max 30b you need about 90gb of cpu memory python3 m flexgen apps completion model facebook opt iml max 30b percent 0 100 100 0 100 0 frequently asked questions how to set the offloading strategy and percent we will release an automatic policy optimizer later but now you have to manually try a few strategies the idea of high throughput generation is to offload parameters and attention cache as much as possible to the cpu and disk if necessary you can see the reference strategies in our benchmark here https github com fminference flexgen blob 9d092d848f106cd9eaf305c12ef3590f7bcb0277 benchmark flexgen bench suite py l39 l79 to avoid out of memory you can tune the percent to offload more tensors to the cpu and disk how to handle out of memory if you do not have enough gpu cpu memory here are a few things you can try they save more memory but run slower do not pin weights by adding pin weight 0 this can reduce the weight memory usage on cpu by around 20 or more enable weight compression by adding compress weight this can reduce the weight memory usage by around 70 offload all weights to disk by using percent 0 0 100 0 100 0 this requires very little cpu and gpu memory performance results generation throughput token s the corresponding effective batch sizes and lowest offloading devices are in parentheses please see here benchmark batch size table md for more details system opt 6 7b opt 30b opt 175b hugging face accelerate 25 12 2 on gpu 0 62 8 on cpu 0 01 2 on disk deepspeed zero inference 9 28 16 on cpu 0 60 4 on cpu 0 01 1 on disk petals 8 25 2 on gpu 2 84 2 on gpu 0 08 2 on gpu flexgen 25 26 2 on gpu 7 32 144 on cpu 0 69 256 on disk flexgen with compression 29 12 72 on gpu 8 38 512 on cpu 1 12 144 on cpu hardware an nvidia t4 16gb instance on gcp with 208gb of dram and 1 5tb of ssd workload input sequence length 512 output sequence length 32 the batch size is tuned to a large value that maximizes the generation throughput for each system metric generation throughput token s number of the generated tokens time for processing prompts time for generation how to reproduce benchmark flexgen latency throughput trade off the figure below shows the latency and throughput trade off of three offloading based systems on opt 175b left and opt 30b right flexgen achieves a new pareto optimal frontier with significatnly higher maximum throughput for both models other systems cannot further increase throughput due to out of memory flexgen c is flexgen with compression img src https github com fminference flexgen blob main docs throughput vs latency jpg alt image width 500 img how it works flexgen can be flexibly configured under various hardware resource constraints by aggregating memory and computation from the gpu cpu and disk through a linear programming optimizer it searches for the best pattern to store and access the tensors including weights activations and attention key value kv cache flexgen further compresses both weights and kv cache to 4 bits with negligible accuracy loss one key idea of flexgen is to play the latency throughput trade off achieving low latency is inherently challenging for offloading methods but the i o efficiency of offloading can be greatly boosted for throughput oriented scenarios see the figure above flexgen utilizes a block schedule to reuse weight and overlap i o with computation as shown in figure b below while other baseline systems use an inefficient row by row schedule as shown in figure a below img src https github com fminference flexgen raw main docs block schedule jpg alt image width 500 img more technical details see our paper https arxiv org abs 2303 06865 roadmap we plan to work on the following features optimize the performance for multiple gpus on the same machine support more models bloom codegen glm x release the cost model and policy optimizer macbook support m1 and m2 amd support | deep-learning gpt-3 high-throughput large-language-models machine-learning offloading opt | ai |
ManufacturingManager | manufacturingmanager manufacturing information technology application | server |
|
vitals | vital signs extraction system assessment of vital signs is an essential part of surveillance of critically ill patients to detect condition changes and clinical deterioration while most modern electronic medical records allow for vitals to be recorded in a structured format the frequency and quality of what is electronically stored may differ from how often these measures are actually recorded we created a tool that extracts blood pressure heart rate temperature respiratory rate blood oxygen saturation and pain level from nursing and other clinical notes recorded in the course of inpatient care to supplement structured vital sign data if you use this system please cite patterson ov jones m yao y viernes b alba pr iwashyna tj duvall sl extraction of vital signs from clinical notes stud health technol inform 2015 216 1035 available from http www ncbi nlm nih gov pubmed 26262334 pipeline modules createnumericpipeline numericannotator regexannotator groovy config file integernumber doublenumber annotationfilter type numeric with remove children true timestampannotator regexannotator type timestamp excludenumberpattern annotationpatternannotator type numexclude annotationfilter remove type indicator analyzenumbersae removeoverlappingannotations numericexclude numeric number setvalue createtermandindicatorpipeline unitsannotator regexannotator groovy config file unit with concept termannotator regexannotator groovy config file bp term resp term hr term notit term annotationfilter remove type term type unit remove children true annotationfilter remove type term if overlapps with type unit indicatorpatternannotator annotationpatternannotator type indicator annotationfilter remove type indicator termexcludepatternannotator annotationpatternannotator type termexclude start of an irrelevant section createwindowspipeline windowannotator from type indicator 20 tokens to the right hiprecisionwindow windowannotator from type indicator 50 tokens to the right lowerprecisionwindow windowannotator from type termexclude 10 tokens to the right excludeallwindow annotationfilter remove numeric if overlaps with excludeallwindow remove children true createpatternspipeline rangepattern annotationpatternannotator type range adjustrangeannotator remove covered type range adjust span to include only the values set value1 and value2 potentialbppattern annotationpatternannotator type potential bp excludebppattern annotationpatternannotator type ex potential bp annotationfilter remove type potential bp covered by type ex potential bp adjustpotentialbpae remove covered type potential bp adjust span to include only the values set value1 and value2 relationpatternannotator annotationpatternannotator type relation relationwithtimepatternannotator annotationpatternannotator type relation timestamp createvitalrulespipeline marknotitae extracttemperatureae extractso2ae extractbloodpressureae extractrespiratoryae extractheightae extractweightae extractpainae extractheartrateae annotationfilter remove all covered annotatations of the same type pipelinevariables valuetypes all but bp annotationfilter remove all covered annotatations of the same type pipelinevariables valuebptypes annotationfilter remove all general bp annotations if overlap with systolic bp or diastolic bp filtertimestampae remove all time stamps that were not included in any of the output value | ai |
|
MTsSportsLine | m t s sports line m t s sports line is a fictitious brand created just for the final work of databases subject this final work had the followings objectives write the initial requirement analysis create the respective entity relationship model do the relational schema create an database based on the previous relational schema implement an graphic interface in vb net capable of interact with the database created the graphic interface has four main categories stores clients workers and deliveries each one of this is a tab in our graphic interface where the user can check details and procede some operations operations like a simple purchase or a return but is also capable of more complex ones for example list all the store s workers and check all the sells done by worker here is some screenshots of the interface img src resources screenshot 1 png width 749 heigth 387 img src resources screenshot 2 png width 739 heigth 387 to see all the information from our final work open the report pdf https github com tiagoadonis mtssportsline blob master report pdf | server |
|
Internship | backend web development screening test template for backend web development task to implement a backend for useraccounts in a website target 3 users end time 5 pm 08 december fork this repository to your profile after finishing the work commit the code to your branch in your profile and then create a pull request to the main branch of this repository know more about pull requests https docs github com en free pro team latest github collaborating with issues and pull requests creating a pull request know more about forks https docs github com en free pro team latest github collaborating with issues and pull requests about forks know more about collaborative use of github https docs github com en free pro team latest github collaborating with issues and pull requests working with forks visit our website https zstream in contact us at contact zstream in all rights reserved evoura technologies pvt ltd 2020 | front_end |
|
PyCV-time | hackpad https pycv time hackpad com python computer vision time tgksr12rp66 fm demo png challenges experiments members pycv time s1 opencv official samples opencv python2 projects pycv utils techtree docker | ai |
|
Simple-Mooc | simple mooc group homework of database course at school of software engineering sjtu members chuzhe tang saiyang gou xinyi yu shihao chen | server |
|
SensorValueScaling | sensor data conversion in matlab we usually use different sensors in embedded system design the actual value of these sensors need scaling on the fixed point number this matlab function sensor conversion provides a generalized way to convert between fixed point sensor outputs and actual physical values the function takes into account the word length fractional length and physical range of the sensor values to accurately perform the conversion requirements matlab environment how to use copy the function sensor conversion m into your matlab working directory call the function in your matlab script or command window function signature actual value sensor output sensor conversion mode value word length frac length min value max value parameters mode string either toactual for converting sensor output to actual value or tosensor for the reverse value the value to be converted word length the word length of the fixed point representation frac length the fractional length of the fixed point representation min value the minimum physical value the sensor can measure max value the maximum physical value the sensor can measure examples to convert from sensor output to actual value convert a 16 bit sensor output of 6550 to its actual value using a q15 1 format and physical range of 300 to 1100 actual value sensor conversion toactual 6550 16 1 300 1100 to convert from an actual value to sensor output convert an actual value of 800 to a 16 bit sensor output using a q15 1 format and physical range of 300 to 1100 sensor output sensor conversion tosensor 800 16 1 300 1100 | os |
|
Computer_Vision_Literatures | computer vision literatures i put the papers reading notes since 2019 from evernote onto github for sharing mainly are some collections of classic and state of art literatures focused on computer vision | ai |
|
ESSD-110.2-PA1 | essd 110 2 pa1 embedded system software design project 1 | os |
|
Computer-Vision | computer vision this repository contains computer vision notebooks from my website appliedprogramming net http www appliedprogramming net computer vision home html computer vision 1 basics http www appliedprogramming net computer vision basics html 2 opencv basics http www appliedprogramming net computer vision opencvbasics html 3 operations on images http www appliedprogramming net computer vision imageoperations html 4 image processing http www appliedprogramming net computer vision imageprocessing html 5 feature detection http www appliedprogramming net computer vision featuredetection html 6 video analysis http www appliedprogramming net computer vision videoanalysis html 7 camera calibration and 3d reconstruction http www appliedprogramming net computer vision cameracalibration html texture flow http www appliedprogramming net computer vision textureflow html flood fill demo http www appliedprogramming net computer vision floodfill html object detection and path planning http www appliedprogramming net computer vision pathplanning html barcode detector http www appliedprogramming net computer vision barcodedetection html face detection using haar cascades http www appliedprogramming net computer vision facedetection html human detection http www appliedprogramming net computer vision humandetection html digit recognition http www appliedprogramming net computer vision digitrecognition html optical character recognition using k nearest neighbours http www appliedprogramming net computer vision ocr using k nearest neighbours html optical character recognition using support vector machines http www appliedprogramming net computer vision ocr using support vector machines html shape detection http www appliedprogramming net computer vision shapedetection html zooming http www appliedprogramming net computer vision zooming html hr | ai |
|
django-vue-admin | django vue admin img https img shields io badge license mit blue svg https gitee com liqianglog django vue admin blob master license img https img shields io badge python 3e 3 7 x green svg https python org pypi django version badge https img shields io badge django 20versions 3 2 blue https docs djangoproject com zh hans 3 2 img https img shields io badge node 3e 3d 2012 0 0 brightgreen https nodejs org zh cn img https gitee com liqianglog django vue admin badge star svg theme dark https gitee com liqianglog django vue admin readme zh md preview https demo django vue admin com official website document https www django vue admin com qq group https qm qq com cgi bin qm qr k fodnhhc8djlrhgysnyhob8p5rgoga6vs jump from webapi community https bbs django vue admin com plugins market https bbs django vue admin com plugmarket html github https github com liqianglog django vue admin about we are a group of young people who love code in this hot era we hope to calm down and bring some of our colors and colors through code because of love so embrace the future development roadmap please leave your valuable suggestions for creating a more comprehensive dvadmin submit requirements https rgej2wr12o feishu cn share base form shrcnshnfec9urj6rior3xppd3f roadmap https rgej2wr12o feishu cn base kevwbazaeazgd2s8smkc36pjnwb essay competition to promote better community development we are organizing the dvadmin essay competition exciting prizes including perpetual commercial licenses await you click here to view the details https bbs django vue admin com question 462 html framework introduction django vue admin https gitee com dvadmin django vue admin is a set of all open source rapid development platform no reservation for individuals and enterprises free use front end adoption d2admin https github com d2 projects d2 admin vue https cn vuejs org elementui https element eleme cn the backend uses the python language django framework as well as the powerful django rest framework https pypi org project djangorestframework permission authentication use django rest framework simplejwt https pypi org project djangorestframework simplejwt supports the multi terminal authentication system support loading dynamic permission menu multi way easy permission control special thanks d2admin https github com d2 projects d2 admin vue element admin https github com panjiachen vue element admin special thanks jetbrains https www jetbrains com to provide a free intellij idea license for this open source project online experience demo address http demo django vue admin com http demo django vue admin com demo account superadmin demo password admin123456 docs https django vue admin com https django vue admin com communication communication community click here https bbs django vue admin com plugins market click here https bbs django vue admin com plugmarket html django vue admin discussion group 01 full 812482043 click here to join the group chat https qm qq com cgi bin qm qr k ajvwjdvh es4mpjquoo32n0suck22te5 jump from webapi django vue admin discussion group 02 full 687252418 click here to join the group chat https qm qq com cgi bin qm qr k 4jjn4ijwgfxj8yjxbb gtsuwjr34wldc jump from webapi django vue admin discussion group 03 442108213 click here to join the group chat https qm qq com cgi bin qm qr k espuf6a1fcx0xry4w6czcvbnji4knsa0 jump from webapi qr code image img src https foruda gitee com images 1685090287886551832 e3afa9e1 5074988 png width 200 core function 1 menu management configure the system menu operation permissions button permissions back end interface permissions etc 2 department management configure the system organization company department role 3 role management role menu permission allocation data permission allocation set roles according to the department for data range permission division 4 rights specifies the rights of the authorization role 5 user management the user is the system operator this function mainly completes the system user configuration 6 interface whitelist specifies the interface that does not need permission verification 7 dictionary management maintenance of some fixed data frequently used in the system 8 regional management to manage provinces cities counties and regions 9 attachment management unified management of all files and pictures on the platform 10 operation logs log and query the system normal operation log and query system exception information 11 plugins market https bbs django vue admin com plugmarket html based on the django framework vue admin application and plug in development source code url gitee main push https gitee com liqianglog django vue admin https gitee com liqianglog django vue admin github https github com liqianglog django vue admin https github com liqianglog django vue admin project star introduction django vue admin https gitee com liqianglog django vue admin gitee star https gitee com liqianglog django vue admin badge star svg theme white https gitee com liqianglog django vue admin github stars https img shields io github stars liqianglog django vue admin svg style social label stars https github com liqianglog django vue admin management dashboard based on br vue2 element d2admin django django vue3 admin https gitee com huge dream django vue3 admin gitee star https gitee com huge dream django vue3 admin badge star svg theme white https gitee com huge dream django vue3 admin github stars https img shields io github stars huge dream django vue3 admin svg style social label stars https github com huge dream django vue3 admin management dashboard implemented based on br vue3 vue next admin fastcrud django plugins market click here to view the latest development progress https rgej2wr12o feishu cn base kevwbazaeazgd2s8smkc36pjnwb table tblpongo56gp6zn9 view vewpla5hdc plugin market https bbs django vue admin com plugmarket html plugin name development status description dvadmin3 celery https bbs django vue admin com plugmarket 129 html released enables asynchronous tasks in dvadmin3 including task scheduling and record management dvadmin celery https bbs django vue admin com plugmarket 115 html released enables asynchronous tasks in dvadmin3 including task scheduling and record management dvadmin sms https bbs django vue admin com plugmarket 128 html released integrates sms service plugins for various platforms dvadmin vform https bbs django vue admin com plugmarket 118 html released low code form designer plugin dvadmin tenants https bbs django vue admin com plugmarket 124 html released saas mode for multi tenancy management dvadmin third https bbs django vue admin com plugmarket 122 html released plugin for managing third party users dvadmin ak sk https bbs django vue admin com plugmarket 120 html released manages encryption keys for verifying authentication strings dvadmin pay https bbs django vue admin com plugmarket 131 html released payment plugin for dvadmin supports wechat pay and alipay dvadmin uniapp https bbs django vue admin com plugmarket 130 html released uniapp plugin for dvadmin dvadmin cloud storage development plugin for storing files using various cloud storage providers dvadmin es development search plugin for elasticsearch dvadmin low code crud development low code generation plugin dvadmin flow development workflow plugin before start project you need python 3 8 0 nodejs 14 0 mysql 5 7 0 optional the default database is sqlite3 8 0 is recommended redis optional the latest edition frontend bash clone code git clone https gitee com liqianglog django vue admin git enter code dir cd web install dependence npm install registry https registry npm taobao org start service npm run dev visit http localhost 8080 in your browser parameters such as boot port can be configured in the env development file build the production environment npm run build backend bash 1 enter code dir cd backend 2 copy conf env example py to conf dir rename as env py 3 in env py configure database information mysql database recommended version 8 0 mysql database character set utf8mb4 4 install pip dependence pip3 install r requirements txt 5 execute the migration command python3 manage py makemigrations python3 manage py migrate 6 initialization data python3 manage py init 7 initialize provincial municipal and county data python3 manage py init area 8 start backend python3 manage py runserver 0 0 0 0 8000 or gunicorn gunicorn c gunicorn conf py application asgi application visit backend swagger visit url http localhost 8080 http localhost 8080 the default address is this one if you want to change it follow the configuration file account superadmin password admin123456 docker compose shell docker compose up d initialize backend data first execution only docker exec ti dvadmin django bash python manage py makemigrations python manage py migrate python manage py init area python manage py init exit frontend url http 127 0 0 1 8080 backend url http 127 0 0 1 8080 api change 127 0 0 1 to your own public ip address on the server account superadmin password admin123456 docker compose stop docker compose down docker compose restart docker compose restart docker compose on start build docker compose up d build demo screenshot image 01 https foruda gitee com images 1682179942561449504 020863bb 5074988 jpeg image 02 https foruda gitee com images 1682179701820334814 f20eb5e8 5074988 png image 03 https foruda gitee com images 1682179718209143602 e6b6a4b1 5074988 png image 04 https foruda gitee com images 1681118349561624452 d917f8bc 5074988 jpeg image 05 https foruda gitee com images 1681118368415555513 03a8db63 5074988 jpeg image 06 https foruda gitee com images 1681118379484890540 6f9caa75 5074988 jpeg image 07 https foruda gitee com images 1681118387902110958 86d86d80 5074988 jpeg image 08 https foruda gitee com images 1681118398381431700 1e3fa0ec 5074988 jpeg image 09 https foruda gitee com images 1681118450796081811 aa00a240 5074988 png image 10 https foruda gitee com images 1681118482618114892 5cc2e297 5074988 png image 11 https foruda gitee com images 1681118492497719384 52a47252 5074988 png image 12 https foruda gitee com images 1681118517168485285 f34152ba 5074988 png commercial license image 13 https foruda gitee com images 1681118527820910716 43a7c660 5074988 png | python vue django django-rest-framework django-vue-admin d2admin admin element-ui dvadmin | front_end |
Subsets and Splits
No saved queries yet
Save your SQL queries to embed, download, and access them later. Queries will appear here once saved.