Dataset columns:

| Column  | Type          | Values             |
|---------|---------------|--------------------|
| names   | string        | lengths 1 to 98    |
| readmes | string        | lengths 8 to 608k  |
| topics  | string        | lengths 0 to 442   |
| labels  | string class  | 6 values           |
CC6205
# CC6205 - Natural Language Processing

This is a course on Natural Language Processing.

- Lecturer: [Felipe Bravo-Marquez](https://felipebravom.com/)
- TAs: [Gabriel Iturra-Bocaz](https://giturra.cl/), [Jorge Ortiz](https://www.ortizfuentes.com/), Consuelo Rojas, Sebastián Tinoco, and [Felipe Urrutia](http://www.dim.uchile.cl/~furrutia/)
- [Course notes (in Spanish)](https://raw.githubusercontent.com/dccuchile/CC6205/master/slides/apunte.pdf)
- Lectures: Tuesday 14:30-16:00, Thursday 14:30-16:00
- [Course program](https://docs.google.com/document/d/1dnja7nf0b26arwf_gmnjf9l6sltvtyfpucdhfcgg4d0/edit?usp=sharing) (in Spanish)
- [Course calendar](calendar.md)
- [YouTube playlist with lectures](https://www.youtube.com/playlist?list=plppko85egxixih54h_qz48yhphenvjqbi)

## Info

This course aims to provide a comprehensive introduction to Natural Language Processing (NLP) by covering essential concepts. We strive to strike a balance between traditional techniques, such as n-gram language models, Naive Bayes, and Hidden Markov Models (HMMs), and modern deep neural networks, including word embeddings, recurrent neural networks (RNNs), and Transformers.

The course material draws from various sources; in many instances, sentences from these sources are directly incorporated into the slides. The neural network topics primarily rely on the book [Neural Network Methods for Natural Language Processing](https://link.springer.com/book/10.1007/978-3-031-02165-7) by Goldberg. Non-neural-network topics, such as probabilistic language models, Naive Bayes, and HMMs, are sourced from [Michael Collins' course](http://www.cs.columbia.edu/~mcollins/) and [Dan Jurafsky's book](https://web.stanford.edu/~jurafsky/slp3/). Additionally, some slides are adapted from online tutorials and other courses, such as [Manning's Stanford course](http://web.stanford.edu/class/cs224n/).

## Slides

1. Introduction to Natural Language Processing: [slides](slides/NLP-introduction.pdf), [tex source](slides/NLP-introduction.tex), [video 1](https://youtu.be/hektnottgvu), [video 2](https://youtu.be/p8cwni_f_kg)
2. Vector Space Model and Information Retrieval: [slides](slides/NLP-IR.pdf), [tex source](slides/NLP-IR.tex), [video 1](https://www.youtube.com/watch?v=fxivclf370w&list=plppko85egxixih54h_qz48yhphenvjqbi&index=2&t=0s), [video 2](https://www.youtube.com/watch?v=f8ng1emmpzk&list=plppko85egxixih54h_qz48yhphenvjqbi&index=2)
3. Probabilistic Language Models: [slides](slides/NLP-PLM.pdf), [tex source](slides/NLP-PLM.tex), [notes](http://www.cs.columbia.edu/~mcollins/lm-spring2013.pdf), [video 1](https://www.youtube.com/watch?v=9e2jj6kcb4y&list=plppko85egxixih54h_qz48yhphenvjqbi&index=3), [video 2](https://www.youtube.com/watch?v=zwqbeqxlra0&list=plppko85egxixih54h_qz48yhphenvjqbi&index=4), [video 3](https://www.youtube.com/watch?v=tsumfqwflaa&list=plppko85egxixih54h_qz48yhphenvjqbi&index=5), [video 4](https://www.youtube.com/watch?v=s3twdv4sqkg&list=plppko85egxixih54h_qz48yhphenvjqbi&index=6)
4. Text Classification and Naive Bayes: [slides](slides/NLP-NB.pdf), [tex source](slides/NLP-NB.tex), [notes](https://web.stanford.edu/~jurafsky/slp3/4.pdf), [video 1](https://youtu.be/kg9bk9oy1hu), [video 2](https://youtu.be/iqte5kkhvze), [video 3](https://youtu.be/tsjg0_x3abk)
5. Linear Models: [slides](slides/NLP-linear.pdf), [tex source](slides/NLP-linear.tex), [video 1](https://youtu.be/zhbxdsnlzea), [video 2](https://youtu.be/fooua_uawse), [video 3](https://youtu.be/dqbzhdqa1eq), [video 4](https://youtu.be/1nfwwxqfaza)
6. Neural Networks: [slides](slides/NLP-neural.pdf), [tex source](slides/NLP-neural.tex), [video 1](https://youtu.be/ohzha8h2xn0), [video 2](https://youtu.be/2lxank0w6g4), [video 3](https://youtu.be/budii9qitzy), [video 4](https://youtu.be/kkn2ipy_vgk)
7. Word Vectors: [slides](slides/NLP-wordvectors.pdf), [tex source](slides/NLP-wordvectors.tex), [video 1](https://youtu.be/wtwusjmc9ca), [video 2](https://youtu.be/xdxzq7ju95u), [video 3](https://youtu.be/ikyc3drvodk)
8. Sequence Labeling and Hidden Markov Models: [slides](slides/NLP-HMM.pdf), [tex source](slides/NLP-HMM.tex), [notes](http://www.cs.columbia.edu/~mcollins/hmms-spring2013.pdf), [video 1](https://youtu.be/ngfozz8yk0), [video 2](https://youtu.be/tjgb_yqog54), [video 3](https://youtu.be/aaa5qoi8vco), [video 4](https://youtu.be/4pkwidkf_6y)
9. MEMMs and CRFs: [slides](slides/NLP-CRF.pdf), [tex source](slides/NLP-CRF.tex), [notes 1](http://www.cs.columbia.edu/~mcollins/crf.pdf), [notes 2](http://www.cs.columbia.edu/~mcollins/fb.pdf), [video 1](https://youtu.be/qli_4lsudkg), [video 2](https://youtu.be/plolkqwkonw), [video 3](https://youtu.be/zpuwdy6o28y)
10. Convolutional Neural Networks: [slides](slides/NLP-CNN.pdf), [tex source](slides/NLP-CNN.tex), [video](https://youtu.be/llzw5fn40r8)
11. Recurrent Neural Networks: [slides](slides/NLP-RNN.pdf), [tex source](slides/NLP-RNN.tex), [video 1](https://youtu.be/bmhjukzz3nk), [video 2](https://youtu.be/z43yfr1iivk), [video 3](https://youtu.be/7l5jxqdwnjk)
12. Sequence-to-Sequence Models and Attention: [slides](slides/NLP-seq2seq.pdf), [tex source](slides/NLP-seq2seq.tex), [video 1](https://youtu.be/opkxrjisqmm), [video 2](https://youtu.be/wq7ihm5vob0)
13. Transformer Architecture: [slides](slides/NLP-transformer.pdf), [tex source](slides/NLP-seq2seq.tex), [video 1](https://youtu.be/8re23uq8ru0)
14. Contextualized Embeddings and Large Language Models: [slides](slides/NLP-LLM.pdf), [video 1](https://youtu.be/ssgbgzphymi), [video 2](https://youtu.be/c_qfzwu6eue), [video 3](https://youtu.be/5j4mgl3guvy)

## NLP Libraries and Tools

1. [NLTK: Natural Language Toolkit](https://www.nltk.org/)
2. [Gensim](https://radimrehurek.com/gensim/)
3. [spaCy: Industrial-Strength NLP](https://spacy.io/)
4. [torchtext](https://torchtext.readthedocs.io/en/latest/)
5. [AllenNLP: open-source project for designing deep-learning-based NLP models](https://allennlp.org/)
6. [Hugging Face Transformers](https://huggingface.co/docs/transformers/index)
7. [ChatGPT](https://chat.openai.com/)
8. [Google Bard](https://bard.google.com/)
9. [Stanza: a Python NLP library for many human languages](https://stanfordnlp.github.io/stanza/)
10. [FlairNLP: a very simple framework for state-of-the-art Natural Language Processing (NLP)](https://github.com/flairnlp/flair)
11. [WEFE: The Word Embeddings Fairness Evaluation Framework](https://wefe.readthedocs.io/en/latest/)
12. [whatlies: a library that tries to help you understand what lies in word embeddings](https://rasahq.github.io/whatlies/)
13. [LASER: a library to calculate and use multilingual sentence embeddings](https://github.com/facebookresearch/laser)
14. [Sentence Transformers: multilingual sentence embeddings using BERT / RoBERTa / XLM-RoBERTa & Co. with PyTorch](https://github.com/ukplab/sentence-transformers)
15. [Datasets: a lightweight library with one-line dataloaders for many public datasets in NLP](https://github.com/huggingface/datasets)
16. [RiverText: a Python library for training and evaluating incremental word embeddings from text data streams](https://dccuchile.github.io/rivertext/)

## Notes and Books

1. [Speech and Language Processing (3rd ed. draft) by Dan Jurafsky and James H. Martin](https://web.stanford.edu/~jurafsky/slp3/)
2. [Michael Collins' NLP notes](http://www.cs.columbia.edu/~mcollins/)
3. [A Primer on Neural Network Models for Natural Language Processing by Yoav Goldberg](https://u.cs.biu.ac.il/~yogo/nnlp.pdf)
4. [Natural Language Understanding with Distributed Representation by Kyunghyun Cho](https://arxiv.org/abs/1511.07916)
5. [A Survey of Large Language Models](https://arxiv.org/abs/2303.18223)
6. [Natural Language Processing, book by Jacob Eisenstein](https://github.com/jacobeisenstein/gt-nlp-class/blob/master/notes/eisenstein-nlp-notes.pdf)
7. [NLTK book](http://www.nltk.org/book/)
8. [Embeddings in Natural Language Processing by Mohammad Taher Pilehvar and Jose Camacho-Collados](http://josecamachocollados.com/book_embnlp_draft.pdf)
9. [Dive into Deep Learning book](https://d2l.ai/)
10. [Contextual Word Representations: A Contextual Introduction by Noah A. Smith](https://arxiv.org/pdf/1902.06006.pdf)

## Other NLP Courses

1. [CS224n: Natural Language Processing with Deep Learning, Stanford course](http://web.stanford.edu/class/cs224n/)
2. [Deep Learning in NLP: slides by Horacio Rodríguez](https://www.cs.upc.edu/~horacio/ahlt/deeplearning02.pdf)
3. [David Bamman's NLP slides, Berkeley](http://people.ischool.berkeley.edu/~dbamman/nlp18.html)
4. [CS 521: Statistical Natural Language Processing by Natalie Parde, University of Illinois](http://www.natalieparde.com/teaching/cs521_spring2020.html)
5. [10 Free Top Notch Natural Language Processing Courses](https://www.kdnuggets.com/2019/10/10-free-top-notch-courses-natural-language-processing.html)

## Videos

1. [Natural Language Processing MOOC videos by Dan Jurafsky and Chris Manning, 2012](https://www.youtube.com/playlist?list=ploromvodv4rofzndyrlw3_ni7tmltmijz&disable_polymer=true)
2. [Natural Language Processing MOOC videos by Michael Collins, 2013](https://www.youtube.com/channel/ucb_jx4jh3qqmp69rmkwpl1a/playlists?shelf_id=3&view=50&sort=dd)
3. [Natural Language Processing with Deep Learning by Chris Manning and Richard Socher, 2017](https://www.youtube.com/playlist?list=pl3fw7lu3i5jsnh1rnuwq_tcylnr7ekre6)
4. [CS224N: Natural Language Processing with Deep Learning, Winter 2019](https://www.youtube.com/playlist?list=ploromvodv4rohcuxmzknm7j3fvwbby42z)
5. [Computational Linguistics I by Jordan Boyd-Graber, University of Maryland](https://www.youtube.com/playlist?list=plegwunz91wfupebli97_wueap90jo_15i)
6. [Visualizing and Understanding Recurrent Networks](https://skillsmatter.com/skillscasts/6611-visualizing-and-understanding-recurrent-networks)
7. [BERT Research Series by Chris McCormick](https://www.youtube.com/playlist?list=plam9sighpgwobuh4_4fr_xvdbe5uneaf6)
8. [Successes and Challenges in Neural Models for Speech and Language - Michael Collins](https://www.youtube.com/watch?v=jfwqrmdtmlo)
9. [More on Transformers: BERT and Friends by Jorge Pérez](https://tv.vera.com.uy/video/55388)

## Other Resources

1. [ACL Portal](https://www.aclweb.org/portal/)
2. [Awesome NLP: a curated list of resources dedicated to Natural Language Processing](https://github.com/keon/awesome-nlp)
3. [NLP-progress: repository to track the progress in Natural Language Processing (NLP)](http://nlpprogress.com/)
4. [Corpora mailing list](https://mailman.uib.no/listinfo/corpora)
5. [Open LLM Leaderboard](https://huggingface.co/spaces/huggingfaceh4/open_llm_leaderboard)
6. [Real World NLP Book: AllenNLP tutorials](http://www.realworldnlpbook.com/)
7. [The Illustrated Transformer: a very illustrative blog post about the Transformer](http://jalammar.github.io/illustrated-transformer/)
8. [Better Language Models and Their Implications (OpenAI blog)](https://openai.com/blog/better-language-models/)
9. [Understanding LoRA and QLoRA: The Powerhouses of Efficient Finetuning in Large Language Models](https://medium.com/@gitlostmurali/understanding-lora-and-qlora-the-powerhouses-of-efficient-finetuning-in-large-language-models-7ac1adf6c0cf)
10. [RNN effectiveness](http://karpathy.github.io/2015/05/21/rnn-effectiveness/)
11. [SuperGLUE: a benchmark of natural language understanding tasks](https://super.gluebenchmark.com/)
12. [decaNLP: The Natural Language Decathlon, a benchmark for studying general NLP models that can perform a variety of complex natural language tasks](http://decanlp.com/)
13. [Chatbot and Related Research Paper Notes with Images](https://github.com/ricsinaruto/seq2seqchatbots/wiki/chatbot-and-related-research-paper-notes-with-images)
14. [Ben Trevett's torchtext tutorials](https://github.com/bentrevett)
15. [PLMpapers: a collection of papers about pre-trained language models](https://github.com/thunlp/plmpapers)
16. [The Illustrated GPT-2: Visualizing Transformer Language Models](https://jalammar.github.io/illustrated-gpt2/)
17. [Linguistics, NLP, and Interdisciplinarity Or: Look at Your Data, by Emily M. Bender](https://medium.com/@emilymenonbender/linguistics-nlp-and-interdisciplinarity-or-look-at-your-data-e49e03d37c9c)
18. [The State of NLP Literature: Part I, by Saif Mohammad](https://medium.com/@nlpscholar/state-of-nlp-cbf768492f90)
19. [From Word to Sense Embeddings: A Survey on Vector Representations of Meaning](https://arxiv.org/pdf/1805.04032.pdf)
20. [10 ML & NLP Research Highlights of 2019, by Sebastian Ruder](https://ruder.io/research-highlights-2019/index.html)
21. [Towards a Conversational Agent that Can Chat About Anything](https://ai.googleblog.com/2020/01/towards-conversational-agent-that-can.html?m=1)
22. [The Super Duper NLP Repo: a collection of Colab notebooks covering a wide array of NLP task implementations](https://notebooks.quantumstat.com/)
23. [The Big Bad NLP Database: a collection of nearly 300 well-organized, sortable, and searchable natural language processing datasets](https://datasets.quantumstat.com/)
24. [A Primer in BERTology: What We Know About How BERT Works](https://arxiv.org/abs/2002.12327)
25. [How Self-Attention with Relative Position Representations Works](https://link.medium.com/wfxx3d96f7)
26. [Deep Learning Based Text Classification: A Comprehensive Review](https://arxiv.org/pdf/2004.03705.pdf)
27. ["Teaching NLP is quite depressing, and I don't know how to do it well", by Yoav Goldberg](https://twitter.com/yoavgo/status/1318567498653061122)
28. [The NLP Index](https://index.quantumstat.com/)
29. [100 Must-Read NLP Papers](https://github.com/amanchadha/100-nlp-papers)
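As a minimal, illustrative starting point with two of the toolkits listed above (this sketch is not part of the official course materials; the sentence and the default sentiment model are arbitrary examples):

```python
# Tokenization and n-grams with NLTK, plus a pretrained Transformer pipeline from
# Hugging Face Transformers. Purely illustrative; not course code.
import nltk
from nltk.util import ngrams
from transformers import pipeline

nltk.download("punkt", quiet=True)

sentence = "Natural language processing combines linguistics and machine learning."
tokens = nltk.word_tokenize(sentence)
print(list(ngrams(tokens, 2)))  # bigrams, as in the n-gram language model lectures

# A pretrained transformer for text classification, as covered in the later lectures
classifier = pipeline("sentiment-analysis")
print(classifier("I really enjoyed this course!"))
```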
ai
women-violence-and-technology
# Gender-Based Violence Using Technology

This is a repository to list the different organizations that are currently working on understanding how gender-based violence is enhanced through the usage of technology. It is also a place to list the different studies and research that are already out there. This is the first step towards research around gender-based violence and towards the generation of ideas that will help to minimize it and to increase the safety and security of groups targeted by this violence.

## Organizations

- [Computer Security and Privacy for Survivors of Intimate Partner Violence](https://www.ipvtechresearch.org/)
- [Clinic to End Tech Abuse](https://www.ceta.tech.cornell.edu/)
- [Coalition Against Stalkerware](https://stopstalkerware.org/)
- [Technology Safety](https://www.techsafety.org/)
- [Vita Activa](https://vita-activa.org/)
- [Pakistan's Cyber Harassment Helpline](https://digitalrightsfoundation.pk/cyber-harassment-helpline/)
- [Brazil's MariaLab](https://www.marialab.org/)
- [Ecuador's Taller Comunicación Mujer](https://www.tcmujer.org/wb_inicio/)
- [Mexico's Luchadoras](https://luchadoras.mx/)
- [Echap](https://echap.eu.org/)

## Studies on gender-based violence

- [Battered Women and Learned Helplessness](https://www.ncjrs.gov/app/publications/abstract.aspx?id=46167), by L. E. Walker
- Leaving an Abusive Partner: An Empirical Review of Predictors, the Process of Leaving, and Psychological Well-Being, by Deborah K. Anderson and Daniel G. Saunders

## Studies on gender-based violence enhanced with digital tools

### Studies or research (academic approach)

- [Stories from Survivors: Privacy & Security Practices when Coping with Intimate Partner Abuse](https://dl.acm.org/doi/abs/10.1145/3025453.3025875), by Tara Matthews, Kathleen O'Leary, Anna Turner, Manya Sleeper, Jill Palzkill Woelfer, Martin Shelton, Cori Manthorne, Elizabeth F. Churchill, Sunny Consolvo
- [Privacy Threats in Intimate Relationships](https://academic.oup.com/cybersecurity/article/6/1/tyaa006/5849222?searchresult=1), by Karen Levy, Bruce Schneier
- [Intimate Surveillance](https://www.uidaho.edu/media/uidaho-responsive/files/law/law-review/articles/volume-51/51-3-levy-karen-ec.ashx), by Karen Levy
- [Trauma-Informed Computing: Towards Safer Technology Experiences for All](http://nixdell.com/papers/chi22-trauma-informed-computing.pdf), by Janet X. Chen, Allison McDonald, Yixin Zou, Emily Tseng, Kevin Roundy, Acar Tamersoy, Florian Schaub, Thomas Ristenpart and Nicola Dell
- [Care Infrastructures for Digital Security in Intimate Partner Violence](https://www.ipvtechresearch.org/files/ugd/884c63_60bad8c4a8e1421eaefef28f0ca5c70a.pdf), by Emily Tseng, Mehrnaz Sabet, Rosanna Bellini, Harkiran Kaur Sodhi, Thomas Ristenpart and Nicola Dell
- ["So-called privacy breeds evil": Narrative Justifications for Intimate Partner Surveillance in Online Forums](https://rist.tech.cornell.edu/papers/forums.pdf), by Rosanna Bellini, Emily Tseng, Nora McDonald, Rachel Greenstadt, Damon McCoy, Thomas Ristenpart and Nicola Dell
- [The Tools and Tactics Used in Intimate Partner Surveillance: An Analysis of Online Infidelity Forums](https://arxiv.org/abs/2005.14341), by Emily Tseng, Rosanna Bellini, Nora McDonald, Matan Danos, Rachel Greenstadt, Damon McCoy, Nicola Dell and Thomas Ristenpart
- [A Stalker's Paradise: How Intimate Partner Abusers Exploit Technology](http://nixdell.com/papers/stalkers-paradise-intimate.pdf), by Diana Freed, Jackeline Palmer, Diana Minchala, Karen Levy, Thomas Ristenpart and Nicola Dell
- [Various](https://ipvtechbib.randhome.io/)
- [Other various](https://github.com/cornelltech/cs5439-fall2018/blob/master/readme.md)

### Organization approach

- [What is access? Why are women less connected?](https://www.derechosdigitales.org/wp-content/uploads/what-is-access-mx.pdf), by Derechos Digitales
- [COVID-19 and Technology-Enabled Intimate Partner Violence](https://82beb9a6-b7db-490a-88be-9f149bafe221.filesusr.com/ugd/c4e6d5_739b032c9b814b1997d85454b02c5057.pdf?index=true)
- [The Predator in Your Pocket: A Multidisciplinary Assessment of the Stalkerware Application Industry](https://citizenlab.ca/2019/06/the-predator-in-your-pocket-a-multidisciplinary-assessment-of-the-stalkerware-application-industry/)
- [Navegando Libres por la Web](https://www.navegandolibres.org/)
- [Ecuador's diagnosis of online gender violence](https://www.navegandolibres.org/images/navegando/diagnostico_navegando_libres_f.pdf)
- [Not Revenge Porn: New Trends in Non-Consensual Intimate Imagery in Uganda. The Role of Digital Security](https://thebachchaoproject.org/not-revenge-porn-new-trends-in-non-consensual-intimate-imagery-in-uganda-the-role-of-digital-security/), session at RightsCon Online

### Guideline approach

- [13 Manifestations of Gender-Based Violence Using Technology](https://www.genderit.org/resources/13-manifestations-gender-based-violence-using-technology)

## Ideas

- IETF draft
- Help Tails
server
LLM-Guide
# LLM-Guide

Welcome to LLM-Guide, a resource for training, hosting, and developing with large language models (LLMs).

## APIs

| Model | Provider | Price | Size | Type | Link |
|---|---|---|---|---|---|
| Luminous Supreme | Aleph Alpha | 0.1750 / 1K tokens | 70B | Text generation | [link](https://www.aleph-alpha.com/pricing) |
| Luminous Extended | Aleph Alpha | 0.0450 / 1K tokens | 30B | Text generation | [link](https://www.aleph-alpha.com/pricing) |
| Luminous Base | Aleph Alpha | 0.0300 / 1K tokens | 13B | Text generation | [link](https://www.aleph-alpha.com/pricing) |
| J2-Jumbo | AI21 | 0.0150 / 1K tokens | 178B | Text generation | [link](https://www.ai21.com/studio/pricing) |
| J2-Grande | AI21 | 0.0100 / 1K tokens | 17B | Text generation | [link](https://www.ai21.com/studio/pricing) |
| J2-Large | AI21 | 0.00300 / 1K tokens | 7.5B | Text generation | [link](https://www.ai21.com/studio/pricing) |
| Default | Cohere | 2.5 per 1000 generation units | | Text generation | [link](https://cohere.ai/pricing) |
| gpt-3.5-turbo | OpenAI | 0.0020 / 1K tokens | 175B | Text generation | [link](https://openai.com/pricing) |
| text-davinci-003 | OpenAI | 0.0200 / 1K tokens | 175B | Text generation | [link](https://openai.com/pricing) |
| text-curie-001 | OpenAI | 0.0020 / 1K tokens | 6.7B | Text generation | [link](https://openai.com/pricing) |
| text-babbage-001 | OpenAI | 0.0005 / 1K tokens | 1.3B | Text generation | [link](https://openai.com/pricing) |
| text-ada-001 | OpenAI | 0.0004 / 1K tokens | | Text generation | [link](https://openai.com/pricing) |
| code-davinci-002 | OpenAI | 1K tokens | | Code generation | [link](https://openai.com/pricing) |
| code-cushman-001 | OpenAI | 1K tokens | 12B | Code generation | [link](https://openai.com/pricing) |
| text-embedding-ada-002 | OpenAI | 0.0004 / 1K tokens | | Text embedding | [link](https://openai.com/pricing) |
| Default | Cohere | 1.0 per 1000 embeddings | | Text embedding | [link](https://cohere.ai/pricing) |

## Open Source Models

- [BLOOM: BigScience Large Open-science Open-access Multilingual Language Model](https://huggingface.co/bigscience/bloom)
- [BLOOMZ](https://huggingface.co/bigscience/bloomz)
- [OPT-175B: Democratizing Access to Large-Scale Language Models](https://forms.gle/bdb2i44qwcr2mcjn6)
- [Galactica-120B: trained on a large-scale scientific corpus](https://huggingface.co/facebook/galactica-120b)
- [LLaMA: Open and Efficient Foundation Language Models](https://github.com/facebookresearch/llama)
- [GPT-NeoX-20B](https://huggingface.co/eleutherai/gpt-neox-20b)
- [GPT-NeoXT-Chat-Base-20B](https://huggingface.co/togethercomputer/gpt-neoxt-chat-base-20b)
- [GPT-J-6B](https://huggingface.co/eleutherai/gpt-j-6b)
- [GLM-130B](https://github.com/thudm/glm-130b)
- [YaLM-100B](https://github.com/yandex/yalm-100b)
- [UL2-20B](https://huggingface.co/google/ul2)
- [H3-2.7B](https://huggingface.co/danfu09/h3-2.7b)
- [Pygmalion-6B](https://huggingface.co/pygmalionai/pygmalion-6b)

## Open Source Projects

- [LangChain: building applications with LLMs through composability](https://github.com/hwchase17/langchain)
- [Petals: run 100B+ language models at home, BitTorrent-style](https://github.com/bigscience-workshop/petals)
- [Open Assistant: open-source ChatGPT-like model](https://open-assistant.io/)
- [Together: a decentralized cloud for artificial intelligence](https://www.together.xyz/)
- [Runhouse](https://github.com/run-house/runhouse)
- [Chroma: the open-source embedding database](https://github.com/chroma-core/chroma)
- [ChatLLaMA: LLaMA-based ChatGPT training process](https://github.com/nebuly-ai/nebullvm/tree/main/apps/accelerate/chatllama)
- [OpenChatKit](https://github.com/togethercomputer/openchatkit)
- [HELM: Holistic Evaluation of Language Models](https://github.com/stanford-crfm/helm)
- [Guardrails](https://github.com/shreyar/guardrails)
- [Stanford Alpaca: an instruction-following LLaMA model](https://github.com/tatsu-lab/stanford_alpaca)
- [EnergonAI: large-scale model inference](https://github.com/hpcaitech/energonai)

## LLM Providers

- [OpenAI](https://openai.com/)
- [Cohere](https://cohere.ai/)
- [AI21 Labs](https://www.ai21.com/)
- [GooseAI](https://goose.ai/)
- [DeepInfra](https://deepinfra.com/)
- [ForefrontAI](https://www.forefront.ai/)
- [NLP Cloud](https://nlpcloud.com/)
- [Baseten](https://app.baseten.co/explore)

## LLM Training Frameworks and Tools

- [Alpa: training and serving large-scale neural networks](https://github.com/alpa-projects/alpa)
- [DeepSpeed (Microsoft)](https://github.com/microsoft/deepspeed)
- [Composer (MosaicML)](https://github.com/mosaicml/composer)
- [Colossal-AI](https://github.com/hpcaitech/colossalai)
- [BMTrain](https://github.com/openbmb/bmtrain)
- [Flower](https://github.com/adap/flower) by [Adap](https://www.adap.com/en)

## Papers

- [Training Compute-Optimal Large Language Models](https://arxiv.org/abs/2203.15556)
- [Scaling Laws for Neural Language Models](https://arxiv.org/abs/2001.08361)
- [Language Models are Few-Shot Learners (GPT-3)](https://arxiv.org/abs/2005.14165)
- [OPT: Open Pre-trained Transformer Language Models (OPT-175B)](https://arxiv.org/abs/2205.01068)
- [BLOOM: A 176B-Parameter Open-Access Multilingual Language Model](https://arxiv.org/abs/2211.05100)
- [Training Language Models to Follow Instructions with Human Feedback (InstructGPT)](https://arxiv.org/abs/2203.02155)
- [Scaling Language Models: Methods, Analysis & Insights from Training Gopher](https://arxiv.org/abs/2112.11446)
- [LLaMA: Open and Efficient Foundation Language Models](https://research.facebook.com/publications/llama-open-and-efficient-foundation-language-models/)
- [GPT-NeoX-20B: An Open-Source Autoregressive Language Model](https://arxiv.org/abs/2204.06745)
- [PaLM: Scaling Language Modeling with Pathways](https://arxiv.org/pdf/2204.02311.pdf)
- [Attention Is All You Need](https://arxiv.org/abs/1706.03762)

## LLM Guides

- [Awesome-LLM](https://github.com/hannibal046/awesome-llm)
- [Prompt Engineering Guide](https://github.com/dair-ai/prompt-engineering-guide)
- [Awesome ChatGPT Prompts](https://github.com/f/awesome-chatgpt-prompts)
- [Awesome ChatGPT](https://github.com/humanloop/awesome-chatgpt)
- [Using LLaMA with M1 Mac](https://dev.l1x.be/posts/2023/03/12/using-llama-with-m1-mac/)

## LLMOps Tools and Services

- [HumanLoop](https://humanloop.com/)
- [Steamship](https://www.steamship.com/)
- [Promptly](https://trypromptly.com/)
- [PromptLayer](https://github.com/magnivorg/prompt-layer-library)
- [HoneyHive](https://honeyhive.ai/)
- [Promptable](https://promptable.ai/)
- [GradientJ](https://gradientj.com/)
- [PromptBase](https://promptbase.com/)
- [FlowGPT](https://flowgpt.com/)
- [Vellum](https://www.vellum.ai/)
- [Dust AI](https://dustai.es/en)

## Tutorials

### Video

- [LangChain tutorial](https://youtube.com/playlist?list=plqzxakvf1bpnqer9mlmdbntnfspzddiu5)
- [LangChain for Gen AI and LLMs](https://youtube.com/playlist?list=pliuou7oqgtliev9utifmm6_4pxg_hln6f)
- [Let's build GPT: from scratch, in code, spelled out](https://youtu.be/kcc8fmeb1ny)

### Notebook

- [LangChain tutorials](https://github.com/gkamradt/langchain-tutorials)

## Datasets

- [Anthropic HH-RLHF](https://huggingface.co/datasets/anthropic/hh-rlhf)
- [OpenAI Summarize-from-Feedback](https://huggingface.co/datasets/openai/summarize_from_feedback)

## People to Follow

- James Briggs: [YouTube](https://www.youtube.com/jamesbriggs), [Twitter](https://twitter.com/jamescalam), [LinkedIn](https://www.linkedin.com/in/jamescalam), [GitHub](https://github.com/jamescalam)
- Elvis Saravia: [YouTube](https://www.youtube.com/elvissaravia), [Twitter](https://twitter.com/omarsar0), [LinkedIn](https://www.linkedin.com/in/omarsar), [GitHub](https://github.com/dair-ai)
- [Data Independent](https://www.youtube.com/dataindependent)
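As a concrete, hedged starting point for the open-source models listed above, the following sketch (not part of the guide itself; the model choice and generation settings are illustrative) loads one of them, EleutherAI's GPT-J-6B, with the Hugging Face Transformers library:

```python
# Load a causal LM from the open-source list above and generate a short continuation.
# Any other causal LM on the Hugging Face Hub works the same way.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "EleutherAI/gpt-j-6B"  # illustrative choice from the list above
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

inputs = tokenizer("Large language models are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```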
llm nlp
ai
login-custom-backend-app
Login with custom backend app
server
sveltekit-for-beginners
# SvelteKit for Beginners

Learn full-stack web development with SvelteKit.

## Project Setup

Clone the project:

```sh
git clone https://github.com/joysofcode/sveltekit-for-beginners.git
```

Install dependencies:

```sh
npm i
```

## Database

Rename `.env.example` to `.env`:

```
DATABASE_URL="file:./dev.db"
```

Create the database from the Prisma schema:

```sh
npx prisma db push
```

Seed the database:

```sh
npx prisma db seed
```

Inspect your database with Prisma Studio:

```sh
npx prisma studio
```

## Development

Start the project and open http://localhost:3000:

```sh
npm run dev
```

## Production

Build and preview:

```sh
npm run build
npm run preview
```
front_end
SuperHackerDocs
# SuperHackerDocs

The documentation presented here could be found on "Super Hacker" discs distributed in the post-Soviet space. It is a collection of information about vintage technologies from the late '90s. In the absence of the Internet, such discs and BBSes were the only source of information for Russian programmers. Because of its age, the submitted documentation is unlikely to violate any rights, so I post it purely for historical purposes. Perhaps it will cause bouts of nostalgia for someone. :)

![Super Hacker 98](super_hacker_98.jpg)
vintage-computers docs hardware-information preservation
server
University-of-Primorska-Famnit
# University of Primorska (Univerza na Primorskem / Università del Litorale)<br>FAMNIT

Verbal-lesson (non-code) files from an Erasmus exchange.

<img src="https://user-images.githubusercontent.com/64928475/164076230-b5105762-d5b4-47c3-a313-ad2afd5f7b6d.png" alt="FAMNIT logo" width="500">
server
Lua-RTOS-ESP32-lobo
Clone of Whitecat's [Lua RTOS ESP32 repository](https://github.com/whitecatboard/lua-rtos-esp32) with some added functionality.

## Added modules

- **led**: support for WS2812 (NeoPixel) RGB LEDs
- **tft**: full support for ILI9341- and ST7735-based TFT modules in 4-wire SPI mode. Supported are many graphics elements, fixed-width and proportional fonts (7 included), unlimited number of fonts from file, JPEG, BMP and raw bitmap images, touch screen, and reading from display memory
- **cam**: support for the ArduCAM-Mini-2MP camera module; uses SPI and I2C interfaces; JPEG format, supported sizes 176x120 to 1600x1200; capture to file or buffer; capture directly to TFT display (tft.jpgimage(x, y, scale, cam) function)
- **cjson**: fast, standards-compliant encoding/parsing routines; full support for JSON with UTF-8, including decoding surrogate pairs; optional run-time support for common exceptions to the JSON specification (infinity, NaN, ...); no dependencies on other libraries

## Modified modules

- **io**: added support for Ymodem file transfer (io.ymsend and io.ymreceive functions); attributes functions return file/directory type, size and timestamp
- **os**: added function os.exists() for checking file existence; added calibration for the sleep function; date & time preserved after sleep; boot reason added, with text boot-reason description; added sleepcalib function to calibrate sleep time; os.resetreason() returns reset reason as numeric and descriptive string values; added os.list() function and enhanced os.ls(): lists file timestamps, SPIFFS and FAT free and total drive space, number of files in a directory, matches files by wildcard; added os.mountfat() and os.unmountfat() functions; added os.compile() function to compile a Lua source file to Lua bytecode; can also list the bytecode, useful for learning how the Lua virtual machine works
- **i2c**: added high-level functions send, receive, sendreceive
- **spi**: based on the new espi driver
- **net**: http start and stop functions renamed to serverstart and serverstop; added http client functions get, post, postfile
- **sensor**: added config options to individually enable/disable DS1820 and BME280 sensors; support for BME280 temperature, humidity and pressure sensors in I2C mode
- **spiffs**: removed the complete implementation and replaced it with a slightly different one; spiffs source files are unchanged original files from pellepl's repository; added timestamps to files and directories; mkspiffs sets file/directory timestamps; formatting spiffs on startup does not reset the system
- **SD card support, FAT fs**: removed the complete implementation and replaced it with a driver based on the esp-idf sdmmc driver; the SD card can be connected in 1-bit or 4-bit mode; os.mountfat() and os.unmountfat() functions are provided to mount an SD card inserted after boot or to change the card
- **sleep, boot**: os.sleep() function improved, time is preserved after sleep; sleep time calibration added (os.setsleepcalib() function); boot count added, reported at start and available as a Lua function; boot reason reported on boot and available as a Lua function
- **espi driver**: new espi driver implemented, based on the esp-idf spi-master driver; queued (DMA) and direct (non-DMA) transfers combined; includes special support for display functions
- **other**: added some documentation, Lua examples, tools

## What's Lua RTOS?

Lua RTOS is a real-time operating system designed to run on embedded systems with minimal requirements of flash and RAM memory. Currently Lua RTOS is available for the ESP32, ESP8266 and PIC32MZ platforms, and can be easily ported to other 32-bit platforms.

Lua RTOS is the main core of the Whitecat ecosystem, which is being developed by a team of engineers, educators and living-lab designers, and is designed for building Internet of Things networks in an easy way.

Lua RTOS has a 3-layer design:

1. In the top layer there is a Lua 5.3.4 interpreter which offers the programmer all the resources provided by the Lua 5.3.4 programming language, plus special modules to access the hardware (pio, adc, i2c, rtc, etc.) and middleware services provided by Lua RTOS (LoRaWAN, MQTT, ...).
2. In the middle layer there is a real-time micro-kernel, powered by FreeRTOS. This is responsible for making things happen in the expected time.
3. In the bottom layer there is a hardware abstraction layer, which talks directly to the platform hardware.

![Lua RTOS layers](http://whitecatboard.org/git/luaos.png)

For porting Lua RTOS to other platforms it is only necessary to write the code for the bottom layer, because the top and middle layers are the same for all platforms.

## How is it programmed?

The Lua RTOS-compatible boards can be programmed in two ways: using the Lua programming language directly, or using a block-based programming language that translates blocks to Lua. No matter whether you use Lua or blocks, both forms of programming are done from the same programming environment. The programmer can decide, for example, to make a fast prototype using blocks, then change to Lua, and finally go back to blocks.

![block example](http://whitecatboard.org/wp-content/uploads/2016/11/block-example.png)
![code example](http://whitecatboard.org/wp-content/uploads/2016/11/code-example.png)

In our [wiki](https://github.com/whitecatboard/lua-rtos-esp32/wiki) you have more information about this.

## How to get Lua RTOS firmware

### Prerequisites

Please note you probably need to download and install drivers for your board's USB-to-serial adapter for Windows and Mac OSX; the GNU/Linux version usually doesn't need any drivers. These drivers are required to connect to your board through a serial port connection.

- Whitecat ESP32 N1: https://www.silabs.com/products/development-tools/software/usb-to-uart-bridge-vcp-drivers
- ESP32 Core: https://www.silabs.com/products/development-tools/software/usb-to-uart-bridge-vcp-drivers
- ESP32 Thing: http://www.ftdichip.com/drivers/vcp.htm

### Method 1: get a precompiled firmware

1. Install esptool (the ESP32 flasher utility) following these instructions: https://github.com/espressif/esptool
2. Get the precompiled binaries for your board:
   - Whitecat ESP32 N1: http://whitecatboard.org/firmware.php?board=whitecat-esp32-n1
   - ESP32 Core Board: http://whitecatboard.org/firmware.php?board=esp32-core-board
   - ESP32 Thing: http://whitecatboard.org/firmware.php?board=esp32-thing
   - Generic: http://whitecatboard.org/firmware.php?board=generic
3. Uncompress to your favorite folder:

```
unzip luartos-10-whitecat-esp32-n1-1488209955.zip
```

### Method 2: build it yourself

1. Install the ESP32 toolchain for your desktop platform, following the instructions provided by Espressif:
   - Windows: https://github.com/espressif/esp-idf/blob/master/docs/windows-setup.rst
   - Mac OS: https://github.com/espressif/esp-idf/blob/master/docs/macos-setup.rst
   - Linux: https://github.com/espressif/esp-idf/blob/master/docs/linux-setup.rst
2. Clone the esp-idf repository from Espressif:

```
git clone --recursive https://github.com/espressif/esp-idf.git
```

3. Clone the Lua RTOS repository:

```
git clone --recursive https://github.com/whitecatboard/lua-rtos-esp32
```

4. Set up the build environment. Go to the lua-rtos-esp32 folder:

```
cd lua-rtos-esp32
```

Edit the `env` file and change HOST_PLATFORM, PATH, IDF_PATH, LIBRARY_PATH, PKG_CONFIG_PATH, CPATH to fit your installation locations. Now do:

```
source env
```

5. Set the default configuration for your board:
   - Whitecat ESP32 N1: `make SDKCONFIG_DEFAULTS=whitecat-esp32-n1 defconfig`
   - ESP32 Core Board: `make SDKCONFIG_DEFAULTS=esp32-core-board defconfig`
   - ESP32 Thing: `make SDKCONFIG_DEFAULTS=esp32-thing defconfig`
   - Generic: `make SDKCONFIG_DEFAULTS=generic defconfig`
6. Change the default configuration. You can change the default configuration by doing:

```
make menuconfig
```

Remember to check the device name for your board's USB-to-serial adapter under the "Serial flasher config / Default serial port" category.

7. Compile. Build Lua RTOS and flash it to your ESP32 board:

```
make flash
```

Flash the SPIFFS file system image to your ESP32 board:

```
make flashfs
```

### Connect to the console

You can connect to the Lua RTOS console using your favorite terminal emulator program, such as picocom, minicom, HyperTerminal, PuTTY, etc. The connection parameters are: speed 115200 bauds, data bits 8, stop bits 1, parity none, terminal emulation VT100.

For example, if you use picocom:

```
picocom --baud 115200 /dev/tty.SLAB_USBtoUART

Booting Lua RTOS...
Boot reason: deep sleep reset digital core
Boot count: 1
Sleep time: 10 sec, from Tue Mar 14 19:28:16 2017 to Tue Mar 14 19:28:26 2017

W H I T E C A T

Lua RTOS LoBo 0.2 build 1489519645, Copyright (C) 2015 - 2017 whitecatboard.org
Board type: ESP32 Thing
CPU: ESP32 rev 0 at 240 MHz
Flash EUI: d665503346133812
spiffs0 start address at 0x180000, size 1024 KB
spiffs0 mounted
Mounting SD card: OK (mode: SPI 1bit, name: NCARD, type: SDHC/SDXC, speed: default speed (25 MHz),
  size: 15079 MB, CSD: ver 1, sector size 512, capacity 30881792, read_bl_len 9, SCR: sd_spec 2, bus_width 5)
Redirecting console messages to file system ...

Lua RTOS LoBo 0.2, powered by Lua 5.3.4

Executing /system.lua ...
Executing /autorun.lua ...
```
esp32 lua rtos lua-rtos
os
mobile_devices
# Mobile App Development with React Native

This repository holds examples about different subjects using [React Native](https://facebook.github.io/react-native/) for mobile development. Each app focuses on a specific subject; they serve as basic examples, not full applications.

## Slides

Here is a [link](https://drive.google.com/open?id=1gdhoompbkfccr2ff18fo1civjsnsbwxg) to presentations explaining the function and code of each subject.
front_end
EVOCamCal-vehicleSpeedEstimation
# Vehicle Speed Estimation from a Roadside Camera

This repo holds my graduation project. The goal is to harness computer vision and projective geometry to successfully transform a simple street camera into a vehicle speed estimator. Please, if you have any concerns, feel free to raise an issue.

![vehicleEstimation](https://user-images.githubusercontent.com/41920808/136614193-12f8896f-c8c3-46b0-8251-a90ce3963fcf.gif)

## Set up

Clone the repo:

```
git clone https://github.com/hector6298/titulacion_vehice_speed_estimation.git
```

### Python dependencies

Install all dependencies for Python 3:

```
cd titulacion_vehicle_speed_estimation
sudo pip3 install -r requirements.txt
```

Note that you also have to install OpenCV for C++ in order to use the camera calibrator. I used [this](https://vitux.com/opencv_ubuntu/) article to install OpenCV 4.5 on my local machine running Ubuntu 20.04. Now compile the calibration code:

```
cd manual_calib
g++ -I /usr/local/include -L /usr/local/lib -g -o bin src/main.cpp src/cfg.cpp src/camcal.cpp -lopencv_core -lopencv_imgproc -lopencv_highgui -lopencv_imgcodecs -lopencv_calib3d -lm
```

Run `./bin`.

### Configuring object detectors

Thanks to hunglc007 for the implementation of YOLOv4. Download the YOLOv4 weights from [here](https://drive.google.com/open?id=1cewmfusmpjywbrnujrukhpmwre_b9pat). Then convert the weights to TensorFlow-usable weights:

```
cd ..
python3 workflowutils/yolov4/save_model.py --weights workflowutils/yolov4/data/yolov4.weights --output workflowutils/yolov4/data/checkpoints/yolov4 --input_size 512 --model yolov4
```

### Download my dataset

This [dataset](https://drive.google.com/drive/folders/1sqsvgd72b57plkgskkswdlueab5y4rcf?usp=sharing) contains footage from 10 videos taken from the Department of Transportation of Seattle. These videos are in mp4 format, and each video comes with the following annotations:

- the date when the videos were recorded
- 2D image coordinates of different points in the scene
- longitude/latitude coordinates from a view from above, each representing the same spot as the pixel coordinates, forming point correspondences
- the address of the scene

All these annotations are contained in a single JSON file called points.json. We are all set!

### Running the calibrations

There is a simple bash script already implemented to set up the configuration file and run the calibration for all the videos. Type:

```sh
bash scripts/executecalibrations.sh
```

After all the calibrations are computed, the results should be stored in manual_calib/data. There is one folder for each calibration method (base, RANSAC, and EDA). The directory should look like:

```
tesis/
└── manual_calib/data/
    ├── base/      calibration1.jpg, calibration1.txt, ..., calibration14.jpg, calibration14.txt
    ├── ransac/    calibration1.jpg, calibration1.txt, ..., calibration14.jpg, calibration14.txt
    └── eda/       calibration1.jpg, calibration1.txt, ..., calibration14.jpg, calibration14.txt
```

The .jpg files contain the virtual model of the street in each scene. Furthermore, the .txt files follow this pattern:

```
h_1 h_2 h_3 h_4 h_5 h_6 h_7 h_8 1
projection_error backprojection_error distance_error
```

If all of these files look OK, then we are ready to move on to the next components.

### Pre-computing and saving the vehicle detections

This step was carried out due to hardware limitations. To speed up the experimentation, it pre-computes all bounding boxes for every video and stores them in pickle files. Type:

```sh
bash scripts/obtaindets.sh
```

On my computer it took almost 120 hours, so it will take quite some time. After everything is done, the pickles should be stored in results/detections, like so:

```
tesis/
└── results/detections/
    ├── video1
    ├── video2
    └── video14
```

### Executing the tracker and computing all raw measures

This part will use the previous detections and compute all speed and tracking measures using the object tracker. Run:

```sh
bash scripts/executetracking.sh
```

It should generate two folders inside the results folder: velocities and spot_velocities. The results directory should now look like:

```
tesis/
└── results/
    ├── detections/
    ├── velocities/iou/         velocities1, ..., velocities14
    └── spot_velocities/iou/    spot_velocities1, ..., spot_velocities14
```

### Computing all the final plots

After all the raw material is computed, it is time to get the plots to diagnose the system. There is also a script for this. Type:

```sh
bash scripts/plotextractor.sh
```

We should now have one additional folder inside results called plots_tables. As you can guess, it contains plots and tables about the distributions of speed and tracks for every video.
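For intuition about what those calibration files encode, here is a small sketch of my own (not code from this repo; the point correspondences, frame gap, and fps are made-up placeholders): a road-plane homography turns pixel displacements of a tracked vehicle into metric displacements, and dividing by the elapsed time gives speed.

```python
# Illustrative only: estimate a ground-plane homography from image<->world
# correspondences (as in points.json) and convert a pixel displacement to km/h.
import numpy as np
import cv2

# Four image points and their road-plane coordinates in metres (placeholders)
img_pts = np.array([[100, 400], [500, 410], [480, 200], [120, 195]], dtype=np.float32)
world_pts = np.array([[0, 0], [7.0, 0], [7.0, 30.0], [0, 30.0]], dtype=np.float32)

# RANSAC-based fit, analogous in spirit to the "ransac" calibration variant above
H, _ = cv2.findHomography(img_pts, world_pts, cv2.RANSAC)

def to_world(pt, H):
    """Project an image point onto the road plane."""
    p = np.array([pt[0], pt[1], 1.0])
    w = H @ p
    return w[:2] / w[2]

# Two bottom-centre points of the same tracked box, 10 frames apart at 30 fps
p1, p2 = to_world((310, 380), H), to_world((330, 300), H)
speed_mps = np.linalg.norm(p2 - p1) / (10 / 30.0)
print(f"estimated speed: {speed_mps * 3.6:.1f} km/h")
```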
deep-learning computer-vision homography-estimation evolutionary-algorithms thesis-project
ai
hooli-data-eng-pipelines
# Hooli, Inc. Data Engineering

This repository includes Dagster application code developed by the fictional data engineering team at Hooli.

## Getting Started

You can clone and run this example locally:

```
git clone https://github.com/dagster-io/hooli-data-eng-pipelines
pip install -e ".[dev]"
make manifest
dagster dev
```

## Code Structure

To understand the structure, start with the file `hooli_data_eng/definitions.py`. This example includes a few key Dagster concepts:

- **Assets** are used to represent the datasets the Hooli data team manages. This example includes assets generated from dbt, Python, and other sources.
- **Resources** represent external systems. This example uses different resources for different environments (DuckDB locally; Snowflake and S3 in production). The example also shows how to create custom resources; see `resources/api.py`.
- **Jobs** allow us to automate when our assets are updated. This example includes jobs that run on a schedule and sensors that trigger jobs to run when upstream data is ready. Jobs can target assets (see `definitions.py`) or they can define imperative operations (see `jobs/watch_s3.py`).
- **Job configuration** allows Dagster to parameterize tasks. This example includes a forecasting model with hyper-parameters passed as job config.
- **Partitions and backfills** allow Dagster to represent partitioned data with no additional code. This example shows how daily partitioned assets can automatically be scheduled daily, and how those same daily partitions can seamlessly roll up into a weekly partitioned asset.
- The asset `big_orders` in `hooli_data_eng/assets/forecasting/__init__.py` uses Spark. Locally, Spark is run through a local PySpark process. In production, a Databricks step launcher (`resources/databricks.py`) is used to dynamically create a Spark cluster for processing the asset.
- `model_nb` is an example of Dagstermill, which lets you run Jupyter notebooks as assets, including notebooks that take upstream assets as inputs.
- **Sensors** are used to run jobs based on external events; see, for example, `hooli_data_eng/jobs/watch_s3.py`.
- **Declarative scheduling** is used to keep certain marketing and analytics assets up to date based on a stakeholder SLA, using freshness policies and auto-materialization policies. Examples include `hooli_data_eng/assets/marketing/__init__.py` and `dbt_project/models/analytics/weekly_order_summary.sql`.
- **Retries** are enabled for both runs and assets, making the pipeline robust to occasional flakiness. See `hooli_data_eng/definitions.py` for examples of retries on jobs, and `hooli_data_eng/assets/marketing/__init__.py` for an example of a more complex retry policy on an asset, including backoff and jitter. Flakiness is generated in `hooli_data_eng/resources/api.py`.
- **Alerts** are enabled through Dagster Cloud alert policies based on job tags. A custom alert is also specified to notify when assets with SLAs are later than expected; see `hooli_data_eng/assets/delayed_asset_alerts.py`.

## Deployment Architecture

This repository uses Dagster Cloud Hybrid architecture with GitHub Actions to provide CI/CD. The main branch is deployed to Dagster Cloud using the workflow in `.github/workflows/`. On each commit, a new Docker image is built and pushed to our container registry. These images are deployed into an EKS cluster by our running Dagster agent, which also synchronizes changes with Dagster Cloud.

The open PR in this repository shows how Dagster supports full integration testing with a branch deployment; in this case the PR is code for a second, competing model. This change also highlights how you can test dependency changes. This capability is also implemented in the GitHub Action in this repo.

## Dev Notes

In the repo wiki.
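To make the asset concept above concrete, here is a minimal, self-contained sketch of my own (not code from this repository; the asset names are invented) showing how two dependent Dagster software-defined assets are declared and bundled into a `Definitions` object:

```python
# Two software-defined assets; Dagster infers the dependency from the parameter name.
import pandas as pd
from dagster import Definitions, asset


@asset
def raw_orders() -> pd.DataFrame:
    # In the real project this data would come from dbt/DuckDB/Snowflake resources.
    return pd.DataFrame({"order_id": [1, 2, 3], "amount": [10.0, 25.0, 40.0]})


@asset
def big_order_summary(raw_orders: pd.DataFrame) -> pd.DataFrame:
    # Downstream asset that filters the upstream asset's output.
    return raw_orders[raw_orders["amount"] > 20.0]


defs = Definitions(assets=[raw_orders, big_order_summary])
```

Running `dagster dev` against a module containing `defs` shows both assets in the UI and lets you materialize them in dependency order.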
cloud
IOT-AC-Light-Dimmer-With-Alexa
# IoT Light Dimmer

Use ESP8266 board package version 2.5.0 for this project.

![IoT light dimmer](https://github.com/nassir-malik/iot-light-dimmer/blob/master/drawing.jpg)

## Part List

- Triac BTA16-600B
- Optocoupler 4N25
- Triac output optocoupler MOC3021
- Bridge rectifier diode D11510S
- 120K ohms 5%, 120K ohms 5%, 1K ohms 5%
- Resistor 331, resistor 471, resistor 103
- LED

## YouTube Tutorials

- [Part 1](https://www.youtube.com/watch?v=efeosil_ibq)
- [Part 2](https://www.youtube.com/watch?v=uc9vymdijkq)

## Setup

1. Set up Hass.io MQTT (password: welcome):

```yaml
light:
  - platform: mqtt
    name: "Office Lights"
    state_topic: "netmedias/office/lights/status"
    command_topic: "netmedias/office/lights/switch"
    brightness_state_topic: "netmedias/office/lights/brightness"
    brightness_command_topic: "netmedias/office/lights/brightness/set"
    qos: 0
    optimistic: false
```

2. Downgrade the ESP board in the Arduino IDE.
3. Download [Espalexa](https://github.com/aircoookie/espalexa) and copy it into the lib folder.
4. Download the project, update the SSID and password, and flash it to the NodeMCU.
server
remoted
# Remoted is not live anymore

## remoted.io

This is the source code and issue board for https://remoted.io. Remoted is an aggregator of remote jobs for IT professionals: software engineering, database, cloud, security.

You are more than welcome to participate and help make https://remoted.io the best remote job board: give feedback, report bugs, suggest improvements, and ask questions [here](https://github.com/remoted-io/remoted/issues).

Other useful links:

- remoted.io: https://remoted.io
- Remoted GraphQL API: https://remoted.io/graphql
- Jobseeker source code: https://github.com/remoted-io/jobseeker

Remoted source code is licensed as [AGPL](https://github.com/remoted-io/remoted/blob/master/license.md).

Made with ❤ by [Andre Pena](https://twitter.com/andrerpena).
server
react
<p align="center">
  <img width="300px" src="https://user-images.githubusercontent.com/4608155/127241386-f11da52d-00d9-4366-b01c-6f4c1ebcf7f2.png">
</p>

<h1 align="center">Primer React</h1>

<p align="center">A React implementation of GitHub's Primer Design System</p>

<p align="center">
  <a aria-label="npm package" href="https://www.npmjs.com/package/@primer/react"><img alt="" src="https://img.shields.io/npm/v/@primer/react.svg"></a>
  <a aria-label="contributors graph" href="https://github.com/primer/react/graphs/contributors"><img src="https://img.shields.io/github/contributors/primer/react.svg" alt=""></a>
  <a aria-label="last commit" href="https://github.com/primer/react/commits/main"><img alt="" src="https://img.shields.io/github/last-commit/primer/react.svg"></a>
  <a aria-label="license" href="https://github.com/primer/react/blob/main/LICENSE"><img src="https://img.shields.io/github/license/primer/react.svg" alt=""></a>
</p>

## Documentation

Our documentation site lives at [primer.style/react](https://primer.style/react). You'll be able to find detailed documentation on getting started, all of the components, our theme, our principles, and more.

## Installation

Install @primer/react in your project with your package manager of choice:

```console
npm install @primer/react
```

```console
yarn add @primer/react
```

## Roadmap

You can track our roadmap progress in the [Roadmap Project Board](https://github.com/primer/react/projects/3), see more detail in the [quarterly planning discussions](https://github.com/primer/react/discussions?discussions_q=%5Broadmap%5D), and find a list of all the current [epic tracking issues](https://github.com/primer/react/discussions/997).

## Contributing

We love collaborating with folks inside and outside of GitHub and welcome contributions! See the [contributing docs](contributor-docs/CONTRIBUTING.md) for more info on code style, testing, coverage, and troubleshooting.

## New Component Proposals

We welcome and encourage new component proposals from internal GitHub teams! Our best work comes from collaborating directly with the teams using Primer React components in their projects. If you'd like to kick off a new component proposal, please submit an issue using the [component proposal issue template](https://github.com/primer/react/issues/new?template=new_component_proposal.md) and we will get in touch.
primer react component-library design-system
os
DM2193FA2014
# DM2193 Intro to Web Development, NYU

![](http://johnnybenson.com/capture_imami.png)

**Johnny Benson, Fall 2014**

- Tuesday & Thursday, 7:30-9:20pm, Rm 813
- 3 credit hrs
- Office hours: by appointment or directly after class
- Email: johnny.benson@nyu.edu
- Class website: https://github.com/idmnyu/DM2193FA2014

## Syllabus

The assignments in this web studio are arranged sequentially to enable the production of a website of professional-quality design and production. The studio, for those seriously interested in web design, stresses interactivity, usability, quality, and appropriateness of look and feel. Students are expected to develop content and complete a professional-quality website.

## Program Goals

The following IDM program goals are introduced and reinforced within this course. Students will:

- develop conceptual thinking skills to generate ideas and content in order to solve problems or create opportunities
- develop technical skills to realize their ideas
- develop critical thinking skills that will allow them to analyze and position their work within cultural, historic, aesthetic, economic, and technological contexts
- gain knowledge of professional practices and organizations by developing their verbal, visual, and written communication for documentation and presentation, exhibition and promotion, networking, and career preparation
- develop collaboration skills to actively and effectively work in a team or group

## Course Goals

This course will help students to:

- understand current web design and development practices
- gain an understanding of content strategy, web usability, information architecture and design, and user experience (UX) processes
- explore issues of interface design, including visual design and interaction design
- write HTML5/CSS3 to develop responsive websites
- get familiar with JavaScript/jQuery syntax and structure
- be able to organize complex web content into a meaningful, hierarchical, and aesthetically pleasing website
- be able to tell stories and utilize new web technologies to create interactive, exploratory work
- get comfortable enough with web platforms to be able to further learn front-end programming
- learn how to proactively learn, understand, and implement the iterative process
- develop a vocabulary to evaluate and critique web design and development
- create an internal developer/creative community

## Learning Outcomes

By the end of the course, students will be able to design, build, and develop content for a professional-quality website.

## Course Structure

Class will be comprised of lectures, workshops, peer learning, and critiques. There will be weekly homework assignments, show & tells, and presentations that students will present to the class, quizzes, and two web projects.

Critiques ("crits") are the best way to articulate your ideas to others and get immediate feedback. During the crit, the class analyzes and suggests ways to increase the impact of each existing idea. Take notes when your work is being critiqued and do not edit the responses, whether you agree with them or not. Review your crit notes and reflect upon what was said. Ask yourself how you could combine, transform, or expand what you are doing to make your project better. However, resist the temptation to incorporate all suggestions and comments; only utilize the ones that work for you and the project.

All dates and assignments are subject to change at the discretion of the professor, depending on the interests and pace of the class. In addition, a guest speaker will come in and talk about their work to the class; which specific class they appear in depends on their scheduling and availability. Additional readings, homework assignments, or changes will be announced in class and in the GitHub repository.

## Course Requirements

- Consult the class GitHub for information about past classes.
- Come to class on time, be familiar with the current class topics, and be ready to answer or ask questions and participate in discussions.
- Complete all assignments by the due date.
- Acquire and keep up with all of the readings; read all assigned readings before class.
- Devote, at a minimum, 12 to 24 hours per week outside of class fulfilling homework assignments, reading, and studying concepts covered in class.

## Evaluation / Grading

### Attendance

Attendance is mandatory and will be taken at the beginning of every class. Since there is so much technical, conceptual, and design information to absorb, regular attendance is essential. Unexcused absences will affect your grade. One absence is allowed; after that, your final overall numerical grade will drop by 5 percent (1/2 a grade point, e.g. an A to an A-) for each additional absence. Be on time: tardiness will affect your grade. For every 15 minutes of tardiness, your final numerical grade will drop by 0.42 percent. Contact the professor in advance if you will not be in class; in person or by email is preferred.

Your final grade will be based on a synthesis of quantitative and qualitative rubrics.

### Quantitative grading overview

- 5% show & tells
- 5% reading responses
- 5% midterm self-assessment rubric
- 5% final self-assessment rubric
- 5% IDM coursework documentation
- 5% portfolio PDF
- 10% coding exercises
- 10% tickets to leave
- 10% maintained course GitHub
- 40% final project: an HTML5/CSS3 responsive Tumblr theme submitted to the [Tumblr Theme Garden](https://www.tumblr.com/docs/en/theme_submission_guidelines)

### Projects grading overview

- 40% participation in the mandatory class critique
- 5% process
- 5% craftsmanship
- 10% information design (sitemap, schematics, clickthrough)
- 40% interface design, which includes interaction and user experience design (including tech/functionality spec) and visual design (including comps, mood boards, and/or style guides)

### Qualitative grading overview

Each student will be judged on the quality, experimentation, and improvement that their work shows.

- **A: Excellent (90-100).** Performance, participation, and attendance of the student has been of the highest level, showing sustained excellence in meeting course responsibilities. Work clearly differentiates itself from other work, has memorable impact, and pursues concepts and techniques above and beyond what is discussed in class. The student thoroughly understands the web design and development process.
- **B: Very Good / Good (80-89).** Performance, participation, and attendance of the student has been good, though not of the highest level. Work demonstrates a better-than-average web design and development process.
- **C: Satisfactory (70-79).** Performance and attendance of the student has been adequate, satisfactorily meeting the course requirements. Work is average and competent, showing a basic understanding of the web design and development process.
- **D: Poor / Below Average (60-69).** Performance and attendance of the student has been less than adequate. Work is lacking in many or most areas that show any understanding of visual foundation. Problems may include lack of interest, procrastination, poor planning, and poor craft.
- **F: Unacceptable (59 and below).** Performance and attendance of the student has been such that course requirements have not been met. Work shows no overall understanding of the course material on many levels, or a severe lack of interest.

## Accommodations

If you are a student with a disability who is requesting accommodations, please contact New York University's Moses Center for Students with Disabilities at 212-998-4980 or mosescsd@nyu.edu. You must be registered with CSD to receive accommodations. Information about the Moses Center can be found at www.nyu.edu/csd. The Moses Center is located at 726 Broadway on the 2nd floor.

## Software Requirements

- Chrome or Firefox
- GitHub account (github.com/join)
- Tumblr account (tumblr.com/register)
- Developer tools for Chrome or Firefox: [Firebug for Firefox](http://getfirebug.com/), [Chrome Developer Tools](https://developers.google.com/chrome-developer-tools/), http://discover-devtools.codeschool.com/
- HTML text editor: Sublime, Coda, TextMate, BBEdit, TextWrangler, etc.
- FTP application: Cyberduck, Transmit, Fetch, FileZilla, etc.
- Web server space: you can receive web server space from IDM's technology manager, Elton Kwok (MAGNET 883), or if you already have your own, that works too. IDM's FTP server info: http://sites.bxmc.poly.edu (use active mode)

## End-of-Semester PDF Deliverable

PDF cover: your last name, your first name, Intro to Web Development (DM2193), course semester (Fall 2014), course instructor (Johnny Benson).

PDF contents:

- project plan with revisions
- final project documentation
- URL to example site with theme
- URL to GitHub with theme code
- screenshots for desktop
- screenshots for mobile/tablet
- wireframes or schematics (mural.ly)
- mood boards and/or comps
- tech/functionality specifications
- screenshots

## Additional Resources

[DM2193 resources](resources.md)

## Course Weeks

- [Week 1](week_01/readme.md): Introduction to the course
- [Week 2](week_02/readme.md): HTML, XML, semantics, browser environment
- [Week 3](week_03/readme.md): Cascading Stylesheets part 1, separation of data and display
- [Week 4](week_04/readme.md): Cascading Stylesheets part 2, advanced selectors
- [Week 5](week_05/readme.md): JavaScript, adding behavior to display
- [Week 6](week_06/readme.md): [Midterm project](midterm/readme.md), redesign Craigslist or Hacker News
- [Week 7](week_07/readme.md): Midterm critique; advanced interfaces: sticky headers, modals, dropdowns, slideshow, lightbox
- [Week 8](week_08/readme.md): Advanced interfaces: media queries and responsive design
- [Week 9](week_09/readme.md): Frameworks and preprocessors: Twitter Bootstrap, Sass, Less
- [Week 10](week_10/readme.md): Flex time to cover special topics: accessibility, print stylesheets, mobile web, server-side languages
- [Week 11](week_11/readme.md): Final project: introduction to the final project; propose and create a Tumblr theme
- [Week 12](week_12/readme.md): Final project: design/build sessions, design critique
- [Week 13](week_13/readme.md): Final project: finish build, user experience critique
- [Final project](final/readme.md): Final project due, with all end-of-course documentation
front_end
winoqueer-v0
# WinoQueer-v0

Benchmark dataset for anti-queer bias in large language models (LLMs). Our paper, [Towards WinoQueer: Developing a Benchmark for Anti-Queer Bias in Large Language Models](https://arxiv.org/abs/2206.11484), was published in the Queer in AI workshop at NAACL 2022.

## Repo Contents

- **Finetuning data**: finetuning data are currently down because of licensing concerns. Sorry for the outage! Expect the correct data to be posted on or before 09/09.
- **Finetuning scripts**: scripts used to preprocess data (segment and normalize) and finetune models. Tweets are normalized using TweetNormalizer from [BERTweet](https://github.com/vinairesearch/bertweet/blob/master/TweetNormalizer.py).
- **Model checkpoints**: model checkpoints are included for four models (BERT-base, BERT-large, SpanBERT-base, SpanBERT-large) under three finetuning conditions (none, LGBTQ news, LGBTQ Twitter).
- **Benchmark data**: winoqueer_benchmark.csv is the benchmark data used in the experiments in our paper; use this to replicate our results. Our data follows the [CrowS-Pairs](https://github.com/nyu-mll/crows-pairs) format, and you should use their evaluation script to run our metric.

Note: some files in this repo are large. You will probably need to use Git LFS.
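For readers unfamiliar with the CrowS-Pairs format mentioned above, the sketch below (my own illustration, not the official evaluation script; the sentences are placeholders) shows the general idea behind such metrics: score each sentence of a more-/less-biased pair with a masked language model's pseudo-log-likelihood and check which one the model prefers.

```python
# Pseudo-log-likelihood scoring of a sentence pair with a masked LM (illustrative only).
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
mlm = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
mlm.eval()

def pseudo_log_likelihood(sentence: str) -> float:
    """Sum log-probabilities of each token when it is masked out in turn."""
    ids = tok(sentence, return_tensors="pt")["input_ids"][0]
    total = 0.0
    for i in range(1, len(ids) - 1):  # skip [CLS] and [SEP]
        masked = ids.clone()
        masked[i] = tok.mask_token_id
        with torch.no_grad():
            logits = mlm(masked.unsqueeze(0)).logits[0, i]
        total += torch.log_softmax(logits, dim=-1)[ids[i]].item()
    return total

pair = ("Placeholder sentence A.", "Placeholder sentence B.")
scores = [pseudo_log_likelihood(s) for s in pair]
print("model prefers:", pair[scores.index(max(scores))])
```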
ai
Avanade.SubTCSE.Project
avanade subtcse project software engineering full stack cloud
cloud
LuaNLP
luanlp lua s nlp toolkit the goal of this library is to provide native support for natural language processing tasks https en wikipedia org wiki natural language processing common nlp tasks in lua a comprehensive discussion regarding the tasks currently supported and their method of implementation is present in guide subsection presently this library is not available on luarocks however there are definite plans to add support in the future dependencies this library is dependent on pcre flavour of lrexlib https github com rrthomas lrexlib to install lrexlib bash luarocks install lrexlib pcre guide this guide will take you around a short tour of luanlp luanlp supports many of the most used nlp tasks such as word tokenization sentence tokenization stemming lemmatization parts of speech tagging sentiment analysis keyword extraction text summarization named entity recognition stopwords and n grams as of 12 04 21 word sense disambiguation is under development let us begin by loading some text as we are diving into the branches of linguistics i am selecting a relevant featured article from wikipedia rosetta stone lua text the rosetta stone is a granodiorite stele inscribed with three versions of a decree issued in memphis egypt in 196 bc during the ptolemaic dynasty on behalf of king ptolemy v epiphanes the top and middle texts are in ancient egyptian using hieroglyphic and demotic scripts respectively while the bottom is in ancient greek the decree has only minor differences between the three versions making the rosetta stone key to deciphering the egyptian scripts the stone was carved during the hellenistic period and is believed to have originally been displayed within a temple possibly at nearby sais it was probably moved in late antiquity or during the mameluk period and was eventually used as building material in the construction of fort julien near the town of rashid rosetta in the nile delta it was discovered there in july 1799 by french officer pierre fran ois bouchard during the napoleonic campaign in egypt it was the first ancient egyptian bilingual text recovered in modern times and it aroused widespread public interest with its potential to decipher this previously untranslated hieroglyphic script lithographic copies and plaster casts soon began circulating among european museums and scholars when the british defeated the french they took the stone to london under the capitulation of alexandria in 1801 it has been on public display at the british museum almost continuously since 1802 and is the most visited object there study of the decree was already underway when the first complete translation of the greek text was published in 1803 jean fran ois champollion announced the transliteration of the egyptian scripts in paris in 1822 it took longer still before scholars were able to read ancient egyptian inscriptions and literature confidently major advances in the decoding were recognition that the stone offered three versions of the same text 1799 that the demotic text used phonetic characters to spell foreign names 1802 that the hieroglyphic text did so as well and had pervasive similarities to the demotic 1814 and that phonetic characters were also used to spell native egyptian words 1822 1824 three other fragmentary copies of the same decree were discovered later and several similar egyptian bilingual or trilingual inscriptions are now known including three slightly earlier ptolemaic decrees the decree of alexandria in 243 bc the decree of canopus in 238 bc and the memphis decree of 
ptolemy iv c 218 bc the rosetta stone is no longer unique but it was the essential key to the modern understanding of ancient egyptian literature and civilisation the term rosetta stone is now used to refer to the essential clue to a new field of knowledge also we will be inspecting a lot of outputs and writing multiple for loops to pass through nested tables is no fun so to make things easier i am importing inspect from inspect documentation human readable representations of tables lua package path package path external lua inspect require inspect sentence tokenization let us begin with sentence tokenization to import lua tokenization require tokenizer tokenization performing sentence tokenization on the above text we get lua sent tokenizer tokenization sentence tokenize text sent tokens for sent token in sent tokenizer do table insert sent tokens sent token print sent token s end end the rosetta stone is a granodiorite stele inscribed with three versions of a decree issued in memphis egypt in 196 bc during the ptolemaic dynasty on behalf of king ptolemy v epiphanes s end the top and middle texts are in ancient egyptian using hieroglyphic and demotic scripts respectively while the bottom is in ancient greek s end the decree has only minor differences between the three versions making the rosetta stone key to deciphering the egyptian scripts s end the stone was carved during the hellenistic period and is believed to have originally been displayed within a temple possibly at nearby sais s end it was probably moved in late antiquity or during the mameluk period and was eventually used as building material in the construction of fort julien near the town of rashid rosetta in the nile delta s end it was discovered there in july 1799 by french officer pierre fran ois bouchard during the napoleonic campaign in egypt s end it was the first ancient egyptian bilingual text recovered in modern times and it aroused widespread public interest with its potential to decipher this previously untranslated hieroglyphic script s end lithographic copies and plaster casts soon began circulating among european museums and scholars s end when the british defeated the french they took the stone to london under the capitulation of alexandria in 1801 s end it has been on public display at the british museum almost continuously since 1802 and is the most visited object there s end study of the decree was already underway when the first complete translation of the greek text was published in 1803 s end jean fran ois champollion announced the transliteration of the egyptian scripts in paris in 1822 it took longer still before scholars were able to read ancient egyptian inscriptions and literature confidently s end major advances in the decoding were recognition that the stone offered three versions of the same text 1799 that the demotic text used phonetic characters to spell foreign names 1802 that the hieroglyphic text did so as well and had pervasive similarities to the demotic 1814 and that phonetic characters were also used to spell native egyptian words 1822 1824 s end three other fragmentary copies of the same decree were discovered later and several similar egyptian bilingual or trilingual inscriptions are now known including three slightly earlier ptolemaic decrees the decree of alexandria in 243 bc the decree of canopus in 238 bc and the memphis decree of ptolemy iv c 218 bc the rosetta stone is no longer unique but it was the essential key to the modern understanding of ancient egyptian literature and 
civilisation s end the term rosetta stone is now used to refer to the essential clue to a new field of knowledge s end as can be observed the sentence tokenizer is not 100 perfect and fails to tokenize the second last line ptolemy iv c 218 bc the rosetta stone is no to be more concrete about the algorithm s limitations out of the 52 english tests presented in pragmatic segmenter https github com diasks2 pragmatic segmenter the golden rules this sentence tokenizer generates wrong output for 14 15 18 35 36 37 38 42 45 50 51 word tokenization let us now explore word tokenization to call the penn treebank word tokenizer lua penn word tokenizer require tokenizer treebank passing sentences sent tokens lua penn word tokenizer tokenize text convert parentheses return str args text str sentence to be tokenized convert parentheses bool parentheses are converted to forms such as lrb lsb rrb rsb etc return str bool if false will split on the whitespaces and return the tokens else will return the unsplit string lua for sent token in ipairs sent tokens do local tokens penn word tokenizer tokenize sent token false false print inspect tokens end lua the rosetta stone is a granodiorite stele inscribed with three versions of a decree issued in memphis egypt in 196 bc during the ptolemaic dynasty on behalf of king ptolemy v epiphanes the top and middle texts are in ancient egyptian using hieroglyphic and demotic scripts respectively while the bottom is in ancient greek the decree has only minor differences between the three versions making the rosetta stone key to deciphering the egyptian scripts the stone was carved during the hellenistic period and is believed to have originally been displayed within a temple possibly at nearby sais it was probably moved in late antiquity or during the mameluk period and was eventually used as building material in the construction of fort julien near the town of rashid rosetta in the nile delta it was discovered there in july 1799 by french officer pierre fran ois bouchard during the napoleonic campaign in egypt it was the first ancient egyptian bilingual text recovered in modern times and it aroused widespread public interest with its potential to decipher this previously untranslated hieroglyphic script lithographic copies and plaster casts soon began circulating among european museums and scholars when the british defeated the french they took the stone to london under the capitulation of alexandria in 1801 it has been on public display at the british museum almost continuously since 1802 and is the most visited object there study of the decree was already underway when the first complete translation of the greek text was published in 1803 jean fran ois champollion announced the transliteration of the egyptian scripts in paris in 1822 it took longer still before scholars were able to read ancient egyptian inscriptions and literature confidently major advances in the decoding were recognition that the stone offered three versions of the same text 1799 that the demotic text used phonetic characters to spell foreign names 1802 that the hieroglyphic text did so as well and had pervasive similarities to the demotic 1814 and that phonetic characters were also used to spell native egyptian words 1822 1824 three other fragmentary copies of the same decree were discovered later and several similar egyptian bilingual or trilingual inscriptions are now known including three slightly earlier ptolemaic decrees the decree of alexandria in 243 bc the decree of canopus in 238 bc and 
the memphis decree of ptolemy iv c 218 bc the rosetta stone is no longer unique but it was the essential key to the modern understanding of ancient egyptian literature and civilisation the term rosetta stone is now used to refer to the essential clue to a new field of knowledge there is an experimental version of word tokenize present in tokenize regex tokenize in tokenization lua this version is a blown up version of algorithm present in jurafsky and martin edition 3 chapter 2 page 16 figure 2 12 let us now explore other useful functions in tokenization n grams lua tokenization generate n gram input n args input sentence to be tokenized n n gram value 2 gram for the first sentence lua inspect tokenization generate n gram sent tokens 1 2 lua the rosetta rosetta stone stone is is a a granodiorite granodiorite stele stele inscribed inscribed with with three three versions versions of of a a decree decree issued issued in in memphis memphis egypt egypt in in 196 196 bc bc during during the the ptolemaic ptolemaic dynasty dynasty on on behalf behalf of of king king ptolemy ptolemy v v epiphanes note by default tokenization generate n gram splits the input into tokens by splitting on whitespaces to improve the performance use penn word tokenizer tokenize text convert parentheses return str with return str true this will ensure that splitting on whitespaces will preserve the treebank tokenizer properties for example lua inspect tokenization generate n gram penn word tokenizer tokenize sent tokens 1 false true 2 lua the rosetta rosetta stone stone is is a a granodiorite granodiorite stele stele inscribed inscribed with with three three versions versions of of a a decree decree issued issued in in memphis memphis egypt egypt in in 196 196 bc bc during during the the ptolemaic ptolemaic dynasty dynasty on on behalf behalf of of king king ptolemy ptolemy v v epiphanes epiphanes remove punctuations lua tokenization remove punctuations input lua tokenization remove punctuations sent tokens sent tokens 1 three other fragmentary copies of the same decree were discovered later and several similar egyptian bilingual or trilingual inscriptions are now known including three slightly earlier ptolemaic decrees the decree of alexandria in 243 bc the decree of canopus in 238 bc and the memphis decree of ptolemy iv c 218 bc the rosetta stone is no longer unique but it was the essential key to the modern understanding of ancient egyptian literature and civilisation emoji tokenize finds all the text based emojis non unicode from the input text lua tokenization emoji tokenize input emojis tokenization emoji tokenize hi there it has been a long time d for emoji in emojis do print emoji end d whitespace tokenize tokenizes on whitespaces lua tokenization whitespace tokenize input lua whitespace tokenizer tokenization whitespace tokenize sent tokens sent tokens whitespace tokens for token in whitespace tokenizer do table insert whitespace tokens token end inspect whitespace tokens lua the term rosetta stone is now used to refer to the essential clue to a new field of knowledge character tokenize tokenizes on characters lua tokenization character tokenize input lua character tokenizer tokenization character tokenize sent tokens sent tokens character tokens for token in character tokenizer do table insert character tokens token end inspect character tokens lua t h e t e r m r o s e t t a s t o n e i s n o w u s e d t o r e f e r t o t h e e s s e n t i a l c l u e t o a n e w f i e l d o f k n o w l e d g e stemming the 
porter stemmer implemented in this library is ported to lua using the python implementation on martin porter s website https tartarus org martin porterstemmer python txt the porter algorithm can be found in the following paper porter algorithm https tartarus org martin porterstemmer def txt to import module lua porter stemmer require stemmer porter syntax porter stemmer stem word start index end index args word str word to be stemmed start index int starting index of the string in almost all cases 1 end index int ending index of the string in most cases length of the string stemming words in the 3rd sentence lua to stem words penn word tokenizer tokenize sent tokens 3 false false for word in ipairs to stem words do local stemmed porter stemmer stem word 1 string len word print word stemmed end the the decree decre has ha only onli minor minor differences differ between between the the three three versions version making make the the rosetta rosetta stone stone key kei to to deciphering deciph the the egyptian egyptian scripts script this stemming algorithm has been successfully tested using testcases from martin porter s website https tartarus org martin porterstemmer vocabulary https tartarus org martin porterstemmer voc txt and output https tartarus org martin porterstemmer output txt parts of speech an averaged perceptron based parts of speech tagger is implemented in this library this module is a port of nltk s averaged perceptron tagger which in turn was a port of textblob s averaged perceptron tagger for understanding of parts of speech taggers and their implementations refer the following readings matthew honnibal s blog detailing the concept https explosion ai blog part of speech pos tagger in python for an introduction to parts of speech tagging https web stanford edu jurafsky slp3 8 pdf to import the module lua pos tagger require pos perceptron unlike the rest of the tasks the parts of speech tagger requires training on labelled data before it can make meaningful predictions by default you can train on the conll2000 dataset using the code below note the pretrained model is not shipped so using pos tagging requires mandatory training on some dataset visualization of train txt from conll2000 confidence nn b np in in b pp the dt b np pound nn i np is vbz b vp syntax pos tagger train sentences nr iter to train the tagged sentences args sentences nested tables containing sentences and their corresponding parts of speech tags for example today nn is vbz good jj day nn yes nns it prp beautiful jj nr iter int number of training iterations pos tagger tag tokens return conf use tagdict to tag the tokenized sentences args tokens array array of tokens return conf bool if true returns the confidence scores of the tags use tagdict bool if true uses tag dictionary for single tag words if a token has a frequency of 20 or more and has a probability score greater than 97 of predicting a certain tag that tag is stored in a dictionary such tokens tags are then automatically indexed from this dictionary for training this code along with the testing part can be found in pos conll2000 test lua lua train file pos conll2000 train txt function training filename local file io open filename r local done false local training set while done true do local found end of sentence false local sentence while found end of sentence true do local sent file read local func string gmatch sent s local curr word tag chunk tag func func func if curr word nil then found end of sentence true we have reached the end elseif 
curr word end of training file then found end of sentence true done true else table insert sentence curr word tag end end table insert training set sentence end pos tagger train training set 8 file close end for testing on the eighth sentence try lua inspect pos tagger tag penn word tokenizer tokenize sent tokens 8 false false true true lithographic jj 0 99525661280489 copies nns 0 9999999944953 and cc 1 0 plaster nn 0 97827922854818 casts nns 0 99998149375758 soon rb 1 0 began vbd 1 0 circulating vbg 0 99999854714063 among in 1 0 european jj 0 99999399618361 museums nns 0 99996446558515 and cc 1 0 scholars nns 0 97589828477377 1 0 on conll 2000 testcases the average perceptron based implementation produces an accuracy of 97 33 see pos conll2000 readme txt for more details lemmatization currently for lemmatization a wordnet based lemmatization algorithm is supported this algorithm has been ported from nltk s nltk stem wordnetlemmatizer sources are stem wordnet html and corpus reader wordnet html https www nltk org modules nltk corpus reader wordnet html to import the module lua wordnet require lemmatizer wordnet syntax lua wordnet morphy word pos check exceptions args word str word to be lemmatized pos str parts of speech for the word available options are v verb n noun a adjective s satellite adjective r adverb check exceptions bool if true it will check for any lemmatization related exceptions as mentioned by wordnet for list of exceptions related to a particular pos see the respective exc file in lemmatizer wordnet additionaly if curious regarding s and a read different handling of adjective and satellite adjective https stackoverflow com questions 51634328 wordnetlemmatizer different handling of wn adj and wn adj sat remember it is essential that the words to be lemmatized are in lowercase lemmatizing the 3rd sentence lua tokenizer the sentence to lemmatize words penn word tokenizer tokenize sent tokens 3 false false find out all the parts of speech of the words pos tags pos tagger tag to lemmatize words false true as wordnet deals with verbs noun adjective and adverbs and as the tags returned by pos tagger tag follow the brilltagger conventions like nn rb jj etc we are creating a simple dictionary to map the brilltagger conventions to wordnet conventions map tags n n j a r r v v for i word in ipairs to lemmatize words do local lemmatized word local first char of pos string sub pos tags i 2 1 1 local pos map tags first char of pos if pos nil then find a lemmatized form for this word with a non nil tag lemmatized wordnet morphy string lower word pos true 1 if a word is not in wordnet wordnet morphy returns nil so substituting nil with the original word if lemmatized nil then lemmatized word end end print word lemmatized end the the decree decree has have only only minor minor differences difference between between the the three three versions version making make the the rosetta rosetta stone stone key key to to deciphering decipher the the egyptian egyptian scripts script note after obtaining a list of potential lemmas using nltk corpus wordnet morphy word pos the following codeblock is executed by nltk in python python lemmas wordnet morphy word pos return min lemmas key len if lemmas else word therefore for words such as saw and pos of v the lemmas obtained are saw see based on the above code it returns saw as the lemma as words are of same length however the question of which is right is subjective to prevent confusion wordnet morphy provides all the possible lemmas nltk like 
functionality can be performed as follows lua wn require wordnet lemmas wn morphy word pos check exceptions if lemmas 0 then table sort lemmas return table 1 else return word end sentiment analysis for sentiment analysis vader algorithm is supported the implementation in this library is a port of vadersentiment https github com cjhutto vadersentiment by cj hutto to import the module lua vader require sent vader syntax lua vader polarity scores sentence args sentence str sentence to be classified as the above rosetta stone passage has neutral sentences testing on a different example lua inspect vader polarity scores ferrari won the f1 world championship lua compound 0 8591 neg 0 0 neu 0 32 pos 0 68 drawbacks of using vader algorithm as this is a lexicon and rule based tool it does not work in cases wherein the tokens are not in the lexicon but convey sentiment for example from the dataset included in the paper from group to individual labels using deep features kotzias et al kdd 2015 amazon reviews 1 you can not answer calls with the unit never worked once 2 item does not match picture 3 lasted one day and then blew up 4 adapter does not provide enough charging current 5 i plugged it in only to find out not a darn thing worked all the selected sentences generate a compound score of 0 i e neutral as mentioned by the vadersentiment authors in their readme is specifically attuned to sentiments expressed in social media see sent test vader lua for tests on the kotzias et al kdd 2015 paper s dataset text summarization let us now explore textteaser an automatic summarization algorithm supported by this library this module is a port of the newspaper3k port https github com codelucas newspaper blob master newspaper nlp py of textteaser which was originally written by jolo balbin in scala https github com mojojolo textteaser to import the module lua summarizer require summarize textteaser syntax lua summarizer summarize title text max sents args title str title of the text body which is to be summarized text str text corpus to be summarized max sents int number of sentences in the summary summarizing our rosetta stone passage lua inspect summarizer summarize rosetta stone text 3 the rosetta stone is a granodiorite stele inscribed with three versions of a decree issued in memphis egypt in 196 bc during the ptolemaic dynasty on behalf of king ptolemy v epiphanes the decree has only minor differences between the three versions making the rosetta stone key to deciphering the egyptian scripts the term rosetta stone is now used to refer to the essential clue to a new field of knowledge note the results in this module may slightly differ from those in newspaper3k s implementation as the word and sentence tokenizers have different implementations this module depends on tokenizer tokenization lua relevant discussion regarding textteaser can be found on hacker news https news ycombinator com item id 6536896 in this hn link the author mojojolo mentions referring to the paper comments oriented blog summarization by sentence extraction http citeseerx ist psu edu viewdoc download doi 10 1 1 222 6530 rep rep1 type pdf keyword extraction this library supports keyword extraction using an algorithm known as rapid automatic keyword extraction rake algorithm this module is a python port of rake by aneesha https github com aneesha rake to import the module lua keywords require keyword rake syntax lua keywords run text topn args text str text corpus for extracting keywords from topn int number of keywords to be extracted 
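before the library call shown next, it may help to see the idea behind rake sketched in plain lua: candidate phrases are maximal runs of non stopwords, each word is scored by its degree over its frequency, and a phrase scores the sum of its word scores. the snippet below is only a conceptual illustration with a tiny stand in stopword list, not luanlp s keyword rake implementation

```lua
-- conceptual sketch of the rake idea (not luanlp's implementation):
-- 1) split the text into candidate phrases at stopwords
-- 2) score every word by degree(word) / frequency(word)
-- 3) score a phrase by summing the scores of its words
local stopwords = {
  ["the"] = true, ["of"] = true, ["in"] = true, ["a"] = true, ["an"] = true,
  ["and"] = true, ["is"] = true, ["to"] = true, ["was"] = true, ["it"] = true,
}

local function rake_sketch(text, topn)
  -- collect candidate phrases: maximal runs of non-stopword tokens
  -- (the %a+ pattern drops digits and punctuation, which a real
  -- implementation would also treat as phrase boundaries)
  local phrases, current = {}, {}
  for word in text:lower():gmatch("%a+") do
    if stopwords[word] then
      if #current > 0 then table.insert(phrases, current) end
      current = {}
    else
      table.insert(current, word)
    end
  end
  if #current > 0 then table.insert(phrases, current) end

  -- word frequency and degree (co-occurrences inside candidate phrases)
  local freq, degree = {}, {}
  for _, phrase in ipairs(phrases) do
    for _, word in ipairs(phrase) do
      freq[word] = (freq[word] or 0) + 1
      degree[word] = (degree[word] or 0) + #phrase - 1
    end
  end

  -- phrase score = sum over its words of (degree + freq) / freq
  local scored = {}
  for _, phrase in ipairs(phrases) do
    local score = 0
    for _, word in ipairs(phrase) do
      score = score + (degree[word] + freq[word]) / freq[word]
    end
    table.insert(scored, { phrase = table.concat(phrase, " "), score = score })
  end
  table.sort(scored, function(a, b) return a.score > b.score end)

  local top = {}
  for i = 1, math.min(topn or 5, #scored) do
    table.insert(top, scored[i].phrase)
  end
  return top
end

-- example usage on the passage loaded earlier:
-- print(table.concat(rake_sketch(text, 5), " | "))
```

real implementations also deduplicate repeated phrases and use a much larger stoplist such as the smart fox or nltk lists mentioned below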
extracting keywords from our rosetta stone passage lua keywords run text 5 lua french officer pierre fran ois bouchard ancient egyptian bilingual text recovered jean fran ois champollion announced slightly earlier ptolemaic decrees aroused widespread public interest when to use rake rake can primarily be used for obtaining key phrases in text body for 1 or 2 token keywords see the textteaser function summarize keywords note by default this implementation uses the smart stoplist to change this lua rake stopword type your choice by default stopwords supports the following fox stoplist nltk s english stoplist and smart stoplist build stop word regex is a local function in keyword rake lua rake stop word pattern build stop word regex rake stopword type rake run text topn named entity recognition at present the averaged perceptron based implementation for parts of speech tagging can be easily modified to support named entity recognition what modification is required modify line in averagedperceptron predict in pos perceptron lua lua else best label conf self classes vbz 0 to lua else best label conf self classes b loc 0 the above line provided a non nil guess as a starting guess for the perceptron to compare with the truth and update its weigths as ner models do not possess vbz class we changed it to a more appropriate b loc class to import the module lua ner tagger require pos perceptron similar to parts of speech tagger named entity recognition requires training on labelled data before it can make meaningful predictions by default you can train on the conll2003 dataset by following the instructions mentioned below to read instructions on how to download conll2003 dataset inside pos conll2003 and how to preprocess the data refer to pos conll2003 readme txt once train json valid json and test json are obtained after the above mentioned preprocessing step you can train the ner model using the following code lua json require external json pt require pos perceptron assuming train json and valid json are in pos conll2003 train file pos conll2003 train json function to json filename local file io open filename r local sents file read file a file close sents json decode sents return sents end function training filename local training set to json filename pt train training set 8 end training train file performing ner on sixth sentence of rosetta stone passage lua it was discovered there in july 1799 by french officer pierre fran ois bouchard during the napoleonic campaign in egypt ner sent penn word tokenizer tokenize sent tokens 6 false false inspect pt tag ner sent lua it o 0 99797167678855 was o 0 99999994433959 discovered o 1 0 there o 0 99999999999938 in o 0 99999999992923 july o 0 99999850186825 1799 o 0 99999999999998 by o 1 0 french b misc 0 99999999671776 officer o 0 99999999999989 pierre fran ois o 0 95948598745703 bouchard b per 0 91195727669638 during o 0 9999994612422 the o 1 0 napoleonic b misc 0 96769051909557 campaign o 1 0 in o 0 99999999999984 egypt b loc 0 99789478864941 o 0 99999836506797 to test the model on valid json or test json see pos conll2003 test lua average precision and recall results after testing on conll2003 five times each was trained for 8 iterations loc misc org per precision 0 8267 0 7416 0 7879 0 8617 recall 0 8514 0 7725 0 7307 0 8930 the precision and recall for the o tag averaged around 0 987 feature request at present this library supports a handful of algorithms if there are any specific algorithms you would like me to port to lua add them to the discussion 
luanlp feature requests https github com pncnmnp luanlp discussions 1 i will try implementing 1 feature every month author parth parikh https pncnmnp github io https pncnmnp github io license this library is licensed under mit license https github com pncnmnp luanlp blob main license for details regarding licenses of the codebases being ported see the respective lua files
nlp-library lua
ai
unofficial-rtthread
none released porting the none released porting in rt thread rtos
os
CDC-NLP-Occ-Injury-Coding
github logo https media exp1 licdn com dms image c4e22aqh0yzszqpnrfg feedshare shrink 800 0 e 1604534400 v beta t urajinwnaeok3d07to0q09seqmfuxkatmgpcbtsljt0 background science belongs to everyone every effort should be made to allow anyone to participate in the creation of scientific knowledge this project was motivated by the desire to expand open innovation it is a response offered to recommendations in this report https www cdc gov niosh topics surveillance pdfs a smarter national surveillance system final pdf the centers for disease control and prevention cdc used crowdsourcing to improve a natural language processing nlp machine learning ml algorithm to code unstructured work related injury narratives to oiics https wwwn cdc gov wisards oiics trees multitree aspx year 2012 two digit event codes an intra and extra mural marathon competition was held in 2019 for the first time in cdc s history these algorithms are the top five performing nlp solutions created by the crowdsourcing competitions the project was made possible by funding from niosh dsr https www cdc gov niosh contact im dsr html and the cdc os oti https www cdc gov os technology index htm a multitude of scientific workgroups at the cdc significantly contributed towards the promotion of the project and the recruitment of intra mural competitors our effort was promoted by fcpccs https www citizenscience gov about community of practice and the ai cop https digital gov communities artificial intelligence our team has participated in interviews with multiple outlets and was invited by mit s j clinic https www jclinic mit edu and harvard s lish https lish harvard edu to speak about this innovative project our team the m braingineers is deeply grateful to the funding parties and all those who dedicated their time to helping achieve this amazing success if you have any questions or comments please send us an email nej6 cdc gov m braingineers our team of 16 federal employees from 7 federal agencies was led by dr siordia https www linkedin com in carlos siordia phd 03b152b9 and dr bertke https www linkedin com in steve bertke 3bb49a9a in close collaboration with mr measure https www linkedin com in ameasure and dr russ https www linkedin com in daniel russ 9541aa15 federal agencies participating included the cdc https www cdc gov bls https www bls gov nih https www nih gov census https www census gov cpsc https www cpsc gov fema https www fema gov and the osha https www osha gov all team members contributed and are listed along with their host federal agency and center institute office cio carlos siordia phd cdc niosh steve bertke phd niosh niosh audrey reichard mph cdc niosh syd webb phd cdc niosh alexander measure ms bls oshs daniel russ phd nih cit stacey marovich mhi cdc niosh kelly vanoli bs cdc niosh mick ballesteros phd cdc ncipc jeff purdin ms cdc niosh melissa friesen phd nih nci machell town phd cdc nccdphp lynda laughlin phd census sehsd tom schroeder ms cpsc dhids jim heeschen ms fema usfa nfdc miriam schoenbaum phd osha osa ari mini o mph cdc nchs results of intramural within cdc competition the cdc has been using machine learning for decades as discussed in this blog https blogs cdc gov genomics 2020 09 17 artificial intelligence our project represents the first time the cdc hosted an intramural nlp marathon the 19 intra mural competitors are all extraordinary analysts for our competition they had the courage to put their reputation on the line they procured management approval for participation and devoted many hours to
developing their solution even those new to nlp performed admirably we are grateful for their contributions more information on the 19 competitors in 9 teams is provided in this announcement https www cdc gov niosh updates upd 10 24 19 html and this blog https blogs cdc gov niosh science blog 2020 02 26 ai crowdsourcing competitors had to code unstructured work related injury narratives to 48 unique oiics https wwwn cdc gov wisards oiics trees multitree aspx year 2012 two digit event codes competitors had access to a neiss work https wwwn cdc gov wisards workrisqs data files from 2012 through 2016 with 191 835 observations results are listed by ranking including weighted f1 score 1 wf1 87 77 scott lee phd in cdc s center for surveillance epidemiology and laboratory services csels used an ensemble classifier blending four bert neural network models 2 wf1 87 15 mohammed khan ms and bill thompson phd from cdc nchhstp s division of viral hepatitis dvh used recurrent neural network with fastai on codalab 3 wf1 84 47 jasmine varghese ms benjamin metcalf ma and yuan li phd from cdc ncird s division of bacterial diseases dbd used regularized logistic regression with custom word corpus 4 wf1 84 45 keming yuan ms from cdc ncipc s division of violence prevention dvp used long short term memory recurrent neural network 5 wf1 83 32 naveena yanamala phd from cdc niosh s health effects laboratory division held used linear support vector model post custom standardization 6 wf1 82 75 li lu md phd from cdc s office of the chief operating officer ocoo used an ensemble classifier using regularized logistic regression multi layer perceptron and linear support vector models 7 wf1 81 47 joan braithwaite msph in cdc s national center for chronic disease prevention health promotion nccdphp used linear support vector model post lemmatization 8 wf1 81 00 donald malec phd from cdc nchs division of research and methodology drm used support vector machine 9 wf1 77 45 jos tom s prieto phd and faisal reza phd from cdc csels division of scientific education professional development dsepd used regularized logistic regression lasso results of extramural international competition using an inter agency agreement iaa between nasa s coeci https www nasa gov offices coeci index html and niosh https www cdc gov niosh contact im dsr html topcoder https www topcoder com was contracted and hosted an international competition https www topcoder com challenges 020c0f34 1f05 4d58 9530 680280a2994b to develop natural language processing algorithms we give special thanks to topcoder s dr contreras https www linkedin com in michael contreras 056873b and mr reitz https www linkedin com in danreitz1 the international competition had 388 registrants from 26 countries about 32 of competitors were from the usa and 21 from india a total of 20 universities were represented in the competition we are grateful for their willingness to consider our challenge and invest the time required to outperform our cdc model i e the model created by dr lee https www linkedin com in scott lee b767a1144 competitors used neiss work https wwwn cdc gov wisards workrisqs data from 2012 through 2017 which included a total of 229 820 observations competitors had to use unstructured injury narratives to code two digit oiics event codes topcoder received a total of 961 unique submissions of algorithms to be scored and ranked
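both the intramural list above and the prize winner list below rank models by weighted f1, which, assuming the usual per class formulation, averages each event code s f1 weighted by its support, where p and r are that code s precision and recall

```latex
% weighted f1 over event-code classes c, with per-class precision P_c,
% recall R_c, support n_c, and N = \sum_c n_c narratives in total
F1_c = \frac{2 \, P_c \, R_c}{P_c + R_c}
\qquad
\mathrm{wF1} = \sum_c \frac{n_c}{N} \, F1_c
```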
here are the prize winners listed by ranking including weighted f1 score 1 wf1 89 20 raymond van veneti phd student in mathematics from the netherlands used an ensemble classifier with albert and dan models 2 wf1 89 12 pavel blinov phd in computer science from russia used an ensemble classifier with bert xlnet and roberta models 3 wf1 89 09 zhuoyu wei from china used an ensemble classifier that included roberta models 4 wf1 88 99 zhensheng wang phd biostatistician from usa used an ensemble classifier that included xlnet and bert models 5 wf1 88 93 a sai sravan a full stack engineer from india used an ensemble classifier that included bert and roberta models disclaimer use of these algorithms and associated files does not imply a nasa or cdc endorsement of any one particular product service or enterprise u s government logos including the nasa and cdc logo cannot be used without express written permission these algorithms were designed based on the best available science and should not be modified or altered use of these algorithms must be accompanied by the following disclaimer and non endorsement language this natural language processing nlp algorithm was originally developed through a nasa cdc collaboration with the public community of programmers neither nasa nor cdc guarantee the accuracy of the nlp algorithm the u s government does not make any warranty of any kind either expressed or implied for any non u s government version of the nlp algorithm use by a non u s government organization or enterprise does not imply a u s government endorsement of any one particular product service or enterprise or that this use of the nlp algorithm represents the official views of the u s government public domain standard notice this repository constitutes a work of the united states government and is not subject to domestic copyright protection under 17 usc 105 this repository is in the public domain within the united states and copyright and related rights in the work worldwide are waived through the cc0 1 0 universal public domain dedication https creativecommons org publicdomain zero 1 0 all contributions to this repository will be released under the cc0 dedication by submitting a pull request you are agreeing to comply with this waiver of copyright interest license standard notice this project constitutes a work of the united states government and is not subject to domestic copyright protection under 17 usc 105 the project utilizes code licensed under the terms of the apache software license and therefore is licensed under asl v2 or later this program is free software you can redistribute it and or modify it under the terms of the apache software license version 2 or at your option any later version this program is distributed in the hope that it will be useful but without any warranty without even the implied warranty of merchantability or fitness for a particular purpose see the apache software license for more details you should have received a copy of the apache software license along with this program if not see this license http www apache org licenses license 2 0 html privacy standard notice this repository contains only non sensitive publicly available data and information all material and community participation is covered by the disclaimer https github com cdcgov template blob master disclaimer md and code of conduct https github com cdcgov template blob master code of conduct md for more information about cdc s privacy policy please visit http www cdc gov other privacy html https www cdc gov other privacy html contributing standard notice anyone is encouraged to contribute to the repository by forking https help github com en github
getting started with github fork a repo and submitting a pull request if you are new to github you might start with a basic tutorial https help github com en github getting started with github set up git by contributing to this project you grant a world wide royalty free perpetual irrevocable non exclusive transferable license to all users under the terms of the apache software license v2 http www apache org licenses license 2 0 html or later all comments messages pull requests and other submissions received through cdc including this github page may be subject to applicable federal law including but not limited to the federal records act and may be archived learn more here http www cdc gov other privacy html records management standard notice this repository is not a source of government records but is a copy to increase collaboration and collaborative potential all government records will be published through the cdc web site http www cdc gov additional standard notices please refer to cdc s template repository https github com cdcgov template blob master open practices md for more information about contributing to this repository public domain notices and disclaimers https github com cdcgov template blob master open practices md and code of conduct https github com cdcgov template blob master code of conduct md learn more about cdc github practices for open source projects here https github com cdcgov template blob master open practices md general disclaimer this repository was created for use by nasa and cdc programs to collaborate on public health related projects in support of agency mission github is not hosted by the nasa or the cdc but is a third party website used by nasa and cdc and its partners to share information and collaborate on software neither nasa s nor cdc s use of github imply an endorsement of any one particular service product or enterprise
ai
flyingaps
flyingaps this repo is for the kth embedded systems design project course mf2063 https www kth se social course mf2063
os
Embedded-Systems-Applications
sj2 c development environment an sj2 board is used at san jose state university sjsu to teach embedded systems courses part of this git repository also includes development environment for not just an arm controller but also support to compile and test software on your host machine such as windows mac etc the sample project of the sj2 board contains code written in c that anyone can understand easily it was designed to learn the low level workings of a microcontroller platform with an rtos project highlights fully implemented in c minimalistic design with little to no abstractions follows good coding principles such as yagni and dry infrastructure highlights supports mac linux windows out of the box version controlled toolchains and other supplementary tools no vms no wsl dependency on windows next steps build and flash project readme getting started md read more about scons readme scons md to figure out how to build projects build system we use scons https scons org as our build platform the developers of sj2 c applied experience of diverse set of build systems acquired over many years and resorted to this one the candidates were scons used at tesla and many other companies bazel used at google zoox and many other companies make scons discusses the advantages and disadvantages of other existing build systems build system comparison https github com scons scons wiki sconsvsotherbuildtools from experience we can state that bazel is really cool but hard to extend and problematic in windows scons dependencies are tricky but it is based on python and easy to understand and extend scons takes advantage of a python interpreter making it portable across multiple platforms mac linux windows history we wanted to build a strong foundational sample project for sj 2 development board that is used at sjsu originally preet created sj1 development board which meant that there was one development board across multiple classes at sjsu i was an enthusiast and created a hybrid project composed of c and c sources i love c a little more than c because i can express artistic designs in the language but a language is really a tool and you need to select the right tool for the job presently i work on embedded firmware code for automotive industry which is in c and software code in c because c is the right tool for firmware and c is the right tool for software sj2 original https github com kammce sjsu dev2 software was also designed by an enthusiast khalil who is a very talented person but expressing a starter project in c increased the complexity so much that many developers had a difficult time adopting it this is where the sj2 c was born which was completely re designed to write simple code that everyone could understand many students are new to programming and it was not expected that c would come naturally the code may appear less fancy but it is simple to understand and traceable with minimal abstractions the goal is to avoid designing yet another arduino platform there is no such thing as magic in the field of firmware engineering
os
Virtual-Golf-Contest-Community-Web-App-Development
azimuth ltd we top topgolf
server
inoERP
inoerp is an oneapp https docs rikdata com go back end flutter front end based enterprise management system the erp systems contain all the required modules for running small to midsize businesses the features are similar to oracle r12 cloud application and sap ecc hana s 4 the application uses mysql database and oneapp javascript apis to create business logic all the database and javascript codes are available on github the client is available for andriod ios macos windows and web the server is available for windows macos and linux documentation http docs inoerp com rest apis http api inoerp com web demo https https demo inoerp com 8090 http http demo inoerp com 8085 contact contact rikdata com rikdata com gmail com the web client is experimental and doesn t have all functionalities of native clients windows macos andriod ios the performance of the web is also not at the same level as a native client so try the application with a native client and use the above url in your native client server mysql 1 install mysql ver 8 0 2 change mysql settings on the config json file dbconnname inoerp dbtype mysql host localhost portnumber 3306 dbname inoerp username yourdbusername password yourdbpassword connpoll 5 maxconnpoll 10 defaultrowlimit 5 3 import the database mysql u root p home files inoerp sql database file is available assets db mysql folder the import process will create the required inoerp schema create database if not exists inoerp 40100 default character set utf8mb4 collate utf8mb4 0900 ai ci 80016 default encryption n use inoerp set global log bin trust function creators 1 ensure set global log bin trust function creators 1 settings 1 enter server hostname and port on the config json file application protocol http hostname localhost portnumber 8085 certfile keyfile 2 change any other settings on the config json file as per business requirement start stop you can start the server like any other application you can stop the server using oneapp desktop mobile client you can also send a rest request to your host stop to stop the application to send a stop request you must have admin authority oneapp win exe or in linux nohup oneapp linux client access the application using any client of your choice the clients are available for andriod windows macos ios web download client https docs rikdata com docs download the console will show you a message stating the host and port when the server starts server start should not take more than 10 15 seconds starting server localhost 8085 open the application in a browser and test that you can login with the default username and password admin admin img src http docs inoerp com images modules admin server server 01 png width 800 click on the sign in button and the system will redirect you to the dashboard caution the web client is experimental and doesn t have all functionalities of native clients windows macos andriod ios the performance of the web is also not at the same level as a native client so try the application with a native client and use the above url in your native client img src http docs inoerp com images modules admin server server 02 png width 800 read how to configure and use any client oneapp https docs rikdata com docs quickstart modules below are the fully functional erp modules available in inoerp general ledger gl chart of accounts inoerp allows a multi segment accounting structure that you can use to represent all segments of a business transaction ex 001 100 1020202 0100 100 where 001 represents a specific company business unit 
legal entity 100 represents a cost center 1020202 a natural account such as asset liability expense income or owners equity 2 calendars define as many different financial calendars as required ex one calendar ino corp for corporate and ino usa ino uk for specific countries 3 account combinations 4 currency conversions 5 ledger a set of a calendar currency and chart of accounts 6 banks 7 journal accounts payable ap 1 suppliers 2 ap transactions 2 1 invoices 2 2 debit memo 2 3 credit memo 3 po transaction matching 4 multi select matching 5 ap payments 5 1 single invoice payment 5 2 multi select payment 6 transfer journals to gl accounts receivable ar 1 customer 2 ar transactions 2 1 invoices 2 2 debit memo 2 3 credit memo 2 4 deposit 2 5 guarantee 2 6 charge back 3 ar payments 3 1 single invoice payment 3 2 multi select payment 4 transfer journals to gl fixed asset accounting fa 1 asset 2 depreciation 3 transactions 4 configuration organizations org 1 enterprise org 2 legal org 3 business org 4 inventory org 5 address inventory inv 1 item master 2 unit of measure 3 sub inventory 4 locator 5 inventory transactions 6 material receipts po receipt ir receipt rma receipt 7 onhand value 8 cycle count cycle count adjustment cycle count approval 9 abc analysis purchasing po 1 purchase order standard blanket agreement planned po 2 requisitions external internal 3 rfq quote 4 approval for po requisition sales distributions sd 1 sales order creation auto booking 2 sales picking 3 delivery shipment 4 auto ar invoice bills of material bom 1 departments 2 resources 3 routing 4 bom 5 super bom costing cst 1 material element 2 material oh 3 overhead 4 resources 5 standard cost 6 cost roll up 7 cost update work in process wip 1 work order 2 wip move transactions 3 wip resource transactions 4 wip material transaction 5 wo completion return supply chain planning scp 1 forecast 2 mds 3 mrp 4 min max planning multi bin min max human resource hr 1 employee education experience planned po 2 job 3 position 4 compensation payroll 4 leave system 4 approval hierarchy basic features 1 options 2 value groups 3 transaction types 4 custom reporting 5 search 5 multi select 6 mass upload admin 1 user 2 roles and role base access control 3 notification 4 document approval modules under development 1 project system 2 asset maintenance 3 helpdesk dynamic pull system the idea behind inoerp is to provide a dynamic pull based system where the demand supply changes frequently and traditional planning systems such as mrp or kanban are incompetent to provide a good inventory turn a dynamic pull system is an advanced version of a pull system that encompasses the best feature of the traditional pull system mrp the major disadvantage of the conventional kanban system is the fixed kanban size and requirement of at least two bins for the entire operation in a sudden demand decrease the kanban system can result in extra inventory and the value of unused inventory can go up to 2 bin size similarly in case of unexpected demand increases can result in a line down and the issue will be severe if the lead times are not short the dynamic pull system overcomes this issue by recalculating the bucket size kanban size lot size before creating any supply requisitions purchase order work order each time a new supply is created the system automatically decides the best supply size per the actual demand the old php version of inoerp is available https github com php inoerp inoerp
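coming back to the dynamic pull system described above, the recalculation idea can be illustrated with a small back of the envelope sketch. inoerp s real logic lives in its mysql schema and oneapp javascript apis, so the snippet below is only a neutral lua illustration of sizing a supply from current demand rather than from a fixed two bin kanban, and the field names and numbers are invented for the example

```lua
-- toy illustration of dynamic lot sizing versus a fixed two-bin kanban
-- (illustrative only; item fields and values are invented for the example)
local function dynamic_lot_size(item)
  -- demand expected while the replenishment is in flight
  local lead_time_demand = item.avg_daily_demand * item.lead_time_days
  -- buffer against demand variability
  local safety_stock = item.safety_days * item.avg_daily_demand
  -- order only the shortfall, but never below the supplier minimum
  local shortfall = lead_time_demand + safety_stock
                    - item.on_hand - item.on_order
  if shortfall <= 0 then return 0 end
  return math.max(shortfall, item.min_order_qty or 0)
end

local bracket = {
  avg_daily_demand = 40,   -- units consumed per day right now
  lead_time_days   = 5,
  safety_days      = 2,
  on_hand          = 120,
  on_order         = 0,
  min_order_qty    = 50,
}

-- a fixed kanban would reorder the same bin size regardless of demand;
-- the dynamic calculation orders just the current shortfall
print(dynamic_lot_size(bracket))  --> 160
```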
dynamics-365 erp mes oracle sap
front_end
T-Lab-Inventory
t lab inventory
server
relational_rails
readme schema design img width 689 alt screen shot 2021 05 19 at 3 35 02 pm src https user images githubusercontent com 26797256 118887731 fa222600 b8b7 11eb 837e 1c1ec1a1d401 png also available here https app dbdesigner net designer schema 418137
beginner-project ruby-on-rails rspec
server
blockchain
blockchain code your own blockchain in less than 120 lines of java a real peer to peer network was added on 2018 06 15
blcokchain blockchain-technology cryptocurrency peer-to-peer p2p-network java java8 blockchain-demos
blockchain
internet-identity
p align center a href https identity ic0 app target blank rel noopener noreferrer img width 600 src ii logo png alt internet identity a p p align center a href https github com dfinity internet identity actions workflows canister tests yml img src https github com dfinity internet identity actions workflows canister tests yml badge svg alt canister tests a a href https github com dfinity internet identity actions workflows rust yml img src https github com dfinity internet identity actions workflows rust yml badge svg alt rust a a href https github com dfinity internet identity actions workflows frontend checks yml img src https github com dfinity internet identity actions workflows frontend checks yml badge svg alt frontend checks and lints a a href https github com dfinity internet identity releases img src https img shields io github downloads dfinity internet identity total label downloads logo github alt github all releases a p p align center a href https identity ic0 app https identity ic0 app a a href https internetcomputer org docs current references ii spec specification a br br a href https forum dfinity org c internet identity 32 forum a a href https github com dfinity internet identity issues new report an issue a a href https discord gg e9fxceag2j discord a p internet identity is an authentication service for the internet computer ic it is the authentication system that allows hundreds of thousands of users to log in to dapps like distrikt dscvr and more internet identity is simple it uses some of the webauthn api to allow users to register and authenticate without passwords using touchid faceid windows hello and more flexible integrating internet identity in a dapp or even web 2 app is as simple as opening the internet identity s http interface https identity ic0 app in a new tab no need to interact with the canister smart contract directly secure different identities are issued for each app a user authenticates to and cannot be linked back to the user for more information see what is internet identity https internetcomputer org docs current tokenomics identity auth what is ic identity on internetcomputer org https internetcomputer org table of contents getting started getting started local replica local replica architecture overview architecture overview building with docker building with docker integration with internet identity integration with internet identity build features and flavors build features and flavors features features flavors flavors stable memory compatibility stable memory compatibility getting help getting help links links getting started this section gives an overview of internet identity s architecture instructions on how to build the wasm module canister and finally pointers for integrating internet identity in your own applications local replica use the internet identity canister in your local dfx project by adding the following code snippet to your dfx json file json canisters internet identity type custom candid https github com dfinity internet identity releases latest download internet identity did wasm https github com dfinity internet identity releases latest download internet identity dev wasm gz remote id ic rdmx6 jaaaa aaaaa aaadq cai frontend alternatively you can use the dfx deps https internetcomputer org blog features dfx deps subcommand to manage a local internet identity canister add the following to your dfx json file json canisters internet identity type pull id rdmx6 jaaaa aaaaa aaadq cai next run the following commands sh dfx deps 
pull dfx deps init rdmx6 jaaaa aaaaa aaadq cai argument null dfx deps deploy architecture overview internet identity is an authentication service for the internet computer ic all programs on the internet computer are wasm modules or canisters canister smart contracts architecture ii architecture png this is an excalidraw com image source is ii architecture excalidraw internet identity runs as a single canister which both serves the frontend application code and handles the requests sent by the frontend application code the canister backend interface is specified by the internet identity did src internet identity internet identity did candid interface the backend canister code is located in src internet identity src internet identity and the frontend application code served by the canister through the http request method is located in src frontend src frontend the internet identity authentication service works indirectly by issuing delegations on the user s behalf basically attestations signed with some private cryptographic material owned by the user the private cryptographic material never leaves the user s device the internet identity frontend application uses the webauthn api to first create the private cryptographic material and then the webauthn api is used again to sign delegations for information on how internet identity works in more detail please refer to the following internet identity presentation https youtu be oxer8uzgebo streamed during the genesis event internet identity specification spec the official internet identity specification building with docker to get the canister wasm module for internet identity you can either download a release from the releases page or build the code yourself the simplest way to build the code yourself is to use docker and the docker build scripts docker build script bash scripts docker build the dockerfile dockerfile specifies build instructions for internet identity building the dockerfile will result in a scratch container that contains the wasm module at internet identity wasm gz the build can be customized with build features build features and flavors we recommend using the docker build scripts docker build script it simplifies the usage of build features build features and flavors and extracts the wasm module from the final scratch container you can find instructions for building the code without docker in the hacking document integration with internet identity the using dev build demos using dev build demo shows a documented example project that integrates internet identity for more please refer to the client authentication protocol section https internetcomputer org docs current references ii spec client authentication protocol of the internet identity specification spec to integration internet identity in your app from scratch for a just add water approach using the agent js https github com dfinity agent js library also used by using dev build check out kyle peacock s blogpost http kyle peacock com blog dfinity integrating internet identity if you re interested in the infrastructure of how to get the internet identity canister and how to test it within your app check out using dev build demos using dev build which uses the internet identity development canister build features and flavors the internet identity build can be customized to include features features that are useful when developing and testing we provide pre built flavors flavors of internet identity that include different sets of features features these options can be 
used both when building with docker building with docker and without docker hacking the features are enabled by setting the corresponding environment variable to 1 any other string as well as not setting the environment variable will disable the feature for instance bash ii fetch root key 1 dfx build ii dummy captcha 1 ii dummy auth 1 scripts docker build these options should only ever be used during development as they effectively poke security holes in internet identity the features are described below note if you add a feature here add it to features ts in the frontend codebase too even if the feature only impacts the canister code and not the frontend environment variable description ii fetch root key when enabled this instructs the frontend code to fetch the root key from the replica br the internet computer https ic0 app uses a private key to sign responses this private key not being available locally the local replica generates its own this option effectively tells the internet identity frontend to fetch the public key from the replica it connects to when this option is not enabled the internet identity frontend code will use the hard coded public key of the internet computer ii dummy captcha when enabled the captcha challenge sent by the canister code to the frontend code is always the known string a this is useful for automated testing ii dummy auth when enabled the frontend code will use a known stable private key for registering anchors and authenticating this means that all anchors will have the same public key s in particular this bypasses the webauthn flows touchid windows hello etc which simplifies automated testing ii insecure requests when enabled the upgrade insecure requests directive is removed from the content security policy in order to allow local development with safari flavors we offer some pre built wasm modules that contain flavors i e sets of features targetting a particular use case flavors can be downloaded from the table below for the latest release or from the release page https github com dfinity internet identity releases for a particular release flavor description production this is the production build deployed to https identity ic0 app includes none of the build features https github com dfinity internet identity releases latest download internet identity production wasm test this flavor is used by internet identity s test suite it fully supports authentication but uses a known captcha value for test automation includes the following features br ul li code ii fetch root key code li li code ii dummy captcha code li ul https github com dfinity internet identity releases latest download internet identity test wasm development this flavor contains a version of internet identity that effectively performs no checks it can be useful for external developers who want to integrate internet identity in their project and care about the general internet identity authentication flow without wanting to deal with authentication and in particular webauthentication includes the following features br ul li code ii fetch root key code li li code ii dummy captcha code li li code ii dummy auth code li li code ii insecure requests code li ul br see the using dev build demos using dev build readme md project for an example on how to use this flavor https github com dfinity internet identity releases latest download internet identity dev wasm stable memory compatibility internet identity requires data in stable memory to have a specific layout in order to be upgradeable the 
layout has been changed multiple times in the past this is why ii stable memory is versioned and each version of ii is only compatible to some stable memory versions if on upgrade ii traps with the message stable memory layout version is no longer supported then the stable memory layout has changed and is no longer compatible the easiest way to address this is to reinstall the canister thus wiping stable memory a canister can be reinstalled by executing dfx deploy canister mode reinstall getting help we re here to help here are some ways you can reach out for help if you get stuck internet identity bug tracker https github com dfinity internet identity issues create a new ticket if you encounter a bug using internet identity or if an issue arises when you try to build the code dfinity forum https forum dfinity org c internet identity 32 the forum is a great place to look for information and to ask for help support https support dfinity org hc en us requests new create a support request if you d like to keep things private links internet identity specification spec the official internet identity specification integration with internet identity http kyle peacock com blog dfinity integrating internet identity by kyle peacock what is internet identity https internetcomputer org docs current tokenomics identity auth what is ic identity on internetcomputer org https internetcomputer org internet identity presentation https youtu be oxer8uzgebo on youtube streamed during the genesis event excalidraw https excalidraw com used to make diagrams distrikt https distrikt io webauthn https webauthn guide dscvr https dscvr one hacking hacking md running locally ic https internetcomputer org spec https internetcomputer org docs current references ii spec releases https github com dfinity internet identity releases docker https docker io links links candid https internetcomputer org docs current developer docs build languages candid candid concepts
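The local-development commands in the getting started and build features sections above are easier to follow with their punctuation restored. Below is a minimal sketch, assuming the flattened text reads as in the upstream README: the canister id and the II_* flag names are taken from the text above, while the `--argument '(null)'` form and the `./scripts/docker-build` path are assumed readings of "argument null" and "scripts docker build".

```bash
# Pull, initialize and deploy a local Internet Identity canister via dfx deps
# (canister id rdmx6-jaaaa-aaaaa-aaadq-cai, as listed in the dfx.json snippet above)
dfx deps pull
dfx deps init rdmx6-jaaaa-aaaaa-aaadq-cai --argument '(null)'
dfx deps deploy

# Development build with two of the documented feature flags enabled;
# as the text stresses, these flags weaken security and are for local testing only
II_FETCH_ROOT_KEY=1 II_DUMMY_CAPTCHA=1 ./scripts/docker-build
```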
authentication blockchain identity
blockchain
LightTube
lighttube a lightweight privacy respecting alternative frontend for youtube features lightweight ad free dislike counts from return youtube dislike https www returnyoutubedislike com subscription feed proxying the videos through lighttube screenshots desktop page light dark search image https github com kuylar lighttube raw v2 rewrite screenshots desktop light search png image https github com kuylar lighttube raw v2 rewrite screenshots desktop dark search png video image https github com kuylar lighttube raw v2 rewrite screenshots desktop light video png image https github com kuylar lighttube raw v2 rewrite screenshots desktop dark video png mobile page light dark search image https github com kuylar lighttube raw v2 rewrite screenshots mobile light search png image https github com kuylar lighttube raw v2 rewrite screenshots mobile dark search png video image https github com kuylar lighttube raw v2 rewrite screenshots mobile light video png image https github com kuylar lighttube raw v2 rewrite screenshots mobile dark video png documentation https github com kuylar lighttube wikis used libraries https github com kuylar lighttube blob master otherlibs md public instances https github com kuylar lighttube blob master instances md
youtube youtube-player youtube-video
front_end
LLM
llm large language models
ai
Terraform-Ansible-Project
terraform ansible project
cloud
introduction-to-rtos
introduction to real time operating systems rtos welcome to the demo code and solutions section of my introduction to rtos course this repository houses all of the example code and solutions that you may use as references when working through the rtos examples for freertos img src images intro to rtos png alt intro to rtos course logo width 500 this course is hosted on youtube that you may take for free all you need to do is watch the videos and complete the challenge issued at the end of each video i highly recommend you try each challenge before peeking at the solutions here the powerpoint slides are also made available under the cc by 4 0 license if you wish to use or modify them for review or teaching your own rtos class if you use the slides please give credit to shawn hymel and digi key electronics in your slides as per the cc by 4 0 requirements https creativecommons org licenses by 4 0 chapter title video solution slides 01 what is an rtos video https www youtube com watch v f321087yyy4 list plebqazb0huyq4hapu1cjed6t3du0h34bz index 1 solution https www digikey com en maker projects what is a realtime operating system rtos 28d8087f53844decafa5000d89608016 slides 01 what is an rtos rtos part 01 pptx raw true 02 getting started with freertos video https www youtube com watch v jir7xm rirs list plebqazb0huyq4hapu1cjed6t3du0h34bz index 2 solution https www digikey com en maker projects introduction to rtos solution to part 2 freertos b3f84c9c9455439ca2dcb8ccfce9dec5 slides 02 getting started with freertos rtos part 02 pptx raw true 03 task scheduling and management video https www youtube com watch v 95yubclyf3e list plebqazb0huyq4hapu1cjed6t3du0h34bz index 3 solution https www digikey com en maker projects introduction to rtos solution to part 3 task scheduling 8fbb9e0b0eed4279a2dd698f02ce125f slides 03 task scheduling and management rtos part 03 pptx raw true 04 memory allocation video https www youtube com watch v qske3yzrw5i list plebqazb0huyq4hapu1cjed6t3du0h34bz index 4 solution https www digikey com en maker projects introduction to rtos solution to part 4 memory management 6d4dfcaa1ff84f57a2098da8e6401d9c slides 04 memory allocation rtos part 04 pptx raw true 05 queue video https www youtube com watch v phj3lxoowei list plebqazb0huyq4hapu1cjed6t3du0h34bz index 5 solution https www digikey com en maker projects introduction to rtos solution to part 5 freertos queue example 72d2b361f7b94e0691d947c7c29a03c9 slides 05 queue rtos part 05 pptx raw true 06 mutex video https www youtube com watch v i55aurpbits list plebqazb0huyq4hapu1cjed6t3du0h34bz index 6 solution https www digikey com en maker projects introduction to rtos solution to part 6 freertos mutex example c6e3581aa2204f1380e83a9b4c3807a6 slides 06 mutex rtos part 06 pptx raw true 07 semaphore video https www youtube com watch v 5jcmtba9qee list plebqazb0huyq4hapu1cjed6t3du0h34bz index 7 solution https www digikey com en maker projects introduction to rtos solution to part 7 freertos semaphore example 51aa8660524c4daba38cba7c2f5baba7 slides 07 semaphore rtos part 07 pptx raw true 08 software timer video https www youtube com watch v b1f1iex0tso list plebqazb0huyq4hapu1cjed6t3du0h34bz index 8 solution https www digikey com en maker projects introduction to rtos solution to part 8 software timers 0f64cf758da440a29476165a5b2e577e slides 08 software timer rtos part 08 pptx raw true 09 hardware interrupts video https www youtube com watch v qsflcf6ahxu list plebqazb0huyq4hapu1cjed6t3du0h34bz index 9 solution https www digikey com 
en maker projects introduction to rtos solution to part 9 hardware interrupts 3ae7a68462584e1eb408e1638002e9ed slides 09 hardware interrupts rtos part 09 pptx raw true 10 deadlock and starvation video https www youtube com watch v hrswi4hienc list plebqazb0huyq4hapu1cjed6t3du0h34bz index 10 solution https www digikey com en maker projects introduction to rtos solution to part 10 deadlock and starvation 872c6a057901432e84594d79fcb2cc5d slides 10 deadlock rtos part 10 pptx raw true 11 priority inversion video https www youtube com watch v c2xkhxromha list plebqazb0huyq4hapu1cjed6t3du0h34bz index 11 solution https www digikey com en maker projects introduction to rtos solution to part 11 priority inversion abf4b8f7cd4a4c70bece35678d178321 slides 11 priority inversion rtos part 11 pptx raw true 12 multicore systems video https www youtube com watch v lpshuch5aqc list plebqazb0huyq4hapu1cjed6t3du0h34bz index 12 solution https www digikey com en maker projects introduction to rtos solution to part 12 multicore systems 369936f5671d4207a2c954c0637e7d50 slides 12 multicore rtos part 12 pptx raw true directory structure examples and solutions are housed in dirctories that correspond to each chapter or video number for example if you watch intro to rtos part 3 task scheduling you should refer to the directory 03 task scheduling and management in it you will find 2 directories one marked demo that gives the finished demo code used during the video so you may run it and examine it at your own pace and another marked solution that provides one possible solution to the challenge issued at the end of the video if a challenge is issued in a video that starts with some code it will be listed as a challenge arduino sketch in the naming scheme shown below directories are in the following structure where xx is the part or chapter number xx name of chapter esp32 freertos xx challenge name esp32 freertos xx demo name esp32 freertos xx solution name rtos part xx pptx powerpoint slides used in each video can be found within the respective xx name of chapter directory the only exception to this is the images directory which is where i keep images for this repository license powerpoint slides are licensed under the creative commons cc by 4 0 license https creativecommons org licenses by 4 0 you are welcome to use and modify them for your own review and teaching if you use them please give credit to shawn hymel and digi key electronics all code in this repository unless otherwise noted is licensed under the zero clause bsd free public license 1 0 0 0bsd https opensource org licenses 0bsd permission to use copy modify and or distribute this software for any purpose with or without fee is hereby granted the software is provided as is and the author disclaims all warranties with regard to this software including all implied warranties of merchantability and fitness in no event shall the author be liable for any special direct indirect or consequential damages or any damages whatsoever resulting from loss of use data or profits whether in an action of contract negligence or other tortious action arising out of or in connection with the use or performance of this software
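To make the directory naming scheme above concrete, here is a hypothetical shell session for Part 3. The folder names are placeholders that follow the xx-name-of-chapter / esp32-freertos-xx-demo / esp32-freertos-xx-solution pattern described in the text; they are not paths copied from the repository.

```bash
# Part 3 (task scheduling): demo code from the video vs. one possible challenge solution
cd 03-task-scheduling-and-management
ls
#   esp32-freertos-03-demo-...      finished demo sketch used in the video
#   esp32-freertos-03-solution-...  one possible solution to the end-of-video challenge
#   rtos-part-03.pptx               slides for the chapter (CC BY 4.0)
```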
os
nodejs-status-page
simple node js status page simple status page created to learn a bit of a web development it s a learning project in the making considering it s my first time with the javascript node js i will be honestly surprised if it turns out usable planned features status page status page with customizable title description site wide custom information message list of websites and their statuses website name and description ping based ok outage status admin enabled maintenance mode time since maintenance outage started custom information message list of past incidents type of incident affected site time of incident how long it lasted admin panel ability to login and logout of the panel first time admin user setup ability to manage site configuration name description url global information message ability to manage websites add new websites delete existing websites edit website data name description url information message maintenance mode delete previous incidents used technologies javascript node js express js mangodb mangoose bootstrap jquery ejs templates passport js mock ups img src https user images githubusercontent com 1345297 53684220 18b59a00 3d0b 11e9 8112 88c085c4d518 png alt screenshot width 256 img src https user images githubusercontent com 1345297 53684226 21a66b80 3d0b 11e9 9ba1 944555d7ed35 png alt screenshot width 256 img src https user images githubusercontent com 1345297 53684222 18b59a00 3d0b 11e9 979a 6eba4f03cb7b png alt screenshot width 256 img src https user images githubusercontent com 1345297 53684223 194e3080 3d0b 11e9 9222 600912c37ec1 png alt screenshot width 256 todo x project setup x basic models controllers and routes x layout mock ups connect with real data display real data on main page handle login register logout and authorization display real data on admin page handle global configuration admin actions display real data on website page handle website admin actions further development proper configuration file proper installation script ability to use ping curl with grep or nc to monitor websites granular configuration of timeouts and retries ability to manage admin users twitter integration
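The further-development list above mentions using ping or curl (with grep or nc) to monitor websites. A rough sketch of that kind of check is shown below; the URL, timeout and output format are placeholders for illustration, not part of the project.

```bash
#!/usr/bin/env bash
# Minimal curl-based availability check of the kind the roadmap describes.
site="https://example.com"   # placeholder; the real app would read this from its website list
if curl --silent --fail --max-time 5 "$site" > /dev/null; then
  echo "$site: ok"
else
  echo "$site: outage"
fi
```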
website statuspage nodejs javascript expressjs mangodb mangoose bootstrap jquery ejs-templates passportjs
front_end
vue-argon-design-system
h1 id argon design system a href https www creative tim com product vue argon design system vue argon design system a h1 p img src https s3 amazonaws com creativetim bucket products 92 original opt argon vue thumbnail jpg 1534236902 alt product gif p p start your development with a design system for bootstrap 4 it is open source free and it features many components that can help you create amazing websites p h4 id fully coded components fully coded components h4 p vue argon design system is built with over 100 individual components giving you the freedom of choosing and combining all components can take variations in colour that you can easily modify using sass files p p you will save a lot of time going from prototyping to full functional code because all elements are implemented this design system is coming with prebuilt examples so the development process is seamless switching from our pages to the real website is very easy to be done p p every element has multiple states for colors styles hover focus that you can easily access and use p h4 id complex documentation complex documentation h4 p each element is well presented in a very complex documentation you can read more about the idea behind this design system here you can check the components here and the foundation here p h4 id example pages example pages h4 p if you want to get inspiration or just show something directly to your clients you can jump start your development with our pre built example pages you will be able to quickly set up the basic structure for your web project p h2 id table of contents table of contents h2 ul li a href demo demo a li li a href quick start quick start a li li a href documentation documentation a li li a href file structure file structure a li li a href browser support browser support a li li a href resources resources a li li a href reporting issues reporting issues a li li a href technical support or questions technical support or questions a li li a href licensing licensing a li li a href useful links useful links a li ul h2 id demo demo h2 ul li a href https demos creative tim com vue argon design system index page a li li a href https demos creative tim com vue argon design system landing landing page a li li a href https demos creative tim com vue argon design system profile profile page a li li a href https demos creative tim com vue argon design system login login page a li li a href https demos creative tim com vue argon design system register register page a li ul p a href https demos creative tim com argon design system view more a p h2 id quick start quick start h2 ul li a href https github com creativetimofficial vue argon design system archive master zip download from github a li li a href https www creative tim com product vue argon design system download from creative tim a li li clone the repo code class highlighter rouge git clone https github com creativetimofficial vue argon design system git code li ul h2 id documentation documentation h2 p the documentation for the vue argon design system is hosted at our a href https demos creative tim com vue argon design system website a p h2 id file structure file structure h2 p within the download you ll find the following directories and files p div class highlighter rouge div class highlight pre class highlight code argon vue argon design system app vue main js router js assets scss argon scss bootstrap custom vendor font awesome css font awesome css font awesome min css fonts fontawesome otf fontawesome webfont eot fontawesome webfont 
svg fontawesome webfont ttf fontawesome webfont woff fontawesome webfont woff2 nucleo css nucleo svg css nucleo css fonts nucleo icons eot nucleo icons svg nucleo icons ttf nucleo icons woff nucleo icons woff2 components badge vue basebutton vue basecheckbox vue baseinput vue basenav vue baseradio vue baseslider vue baseswitch vue card vue closebutton vue icon vue navbartogglebutton vue layout appfooter vue appheader vue plugins argon kit js globalcomponents js globaldirectives js views components vue landing vue login vue profile vue register vue components basicelements vue carousel vue customcontrols vue downloadsection vue examples vue hero vue icons vue inputs vue javascriptcomponents vue navigation vue code pre div div h2 id browser support browser support h2 p at present we officially aim to support the last two versions of the following browsers p p img src https s3 amazonaws com creativetim bucket github browser chrome png width 64 height 64 img src https s3 amazonaws com creativetim bucket github browser firefox png width 64 height 64 img src https s3 amazonaws com creativetim bucket github browser edge png width 64 height 64 img src https s3 amazonaws com creativetim bucket github browser safari png width 64 height 64 img src https s3 amazonaws com creativetim bucket github browser opera png width 64 height 64 p h2 id resources resources h2 ul li demo a href https demos creative tim com argon design system https demos creative tim com vue argon design system a li li download a href https www creative tim com product vue argon design system https www creative tim com product vue argon design system a li li license agreement a href https www creative tim com license https www creative tim com license a li li support a href https www creative tim com contact us https www creative tim com contact us a li li issues a href https github com creativetimofficial vue argon design system issues github issues page a li ul h2 id reporting issues reporting issues h2 p we use github issues as the official bug tracker for the vue argon design system here are some advices for our users that want to report an issue p ol li make sure that you are using the latest version of the vue argon design system check the changelog from your copy on our a href https www creative tim com website a li li providing us reproducible steps for the issue will shorten the time it takes for it to be fixed li li some issues may be browser specific so specifying in what browser you encountered the issue might help li ol h2 id technical support or questions technical support or questions h2 p if you have questions or need help integrating the product please a href https www creative tim com contact us contact us a instead of opening an issue p h2 id licensing licensing h2 ul li p copyright 2018 creative tim https www creative tim com p li li p licensed under mit https github com creativetimofficial vue argon design system blob master license md p li ul h2 id useful links useful links h2 ul li a href https www creative tim com bootstrap themes more products a from creative tim li li a href https www youtube com channel ucvytg4scw rovb9ohkzzd1w tutorials a li li a href https www creative tim com bootstrap themes free freebies a from creative tim li li a href https www creative tim com affiliates new affiliate program a earn money li ul h2 id social media social media h2 ul li twitter a href https twitter com creativetim https twitter com creativetim a li li facebook a href https www facebook com creativetim https www 
facebook com creativetim a li li dribbble a href https dribbble com creativetim https dribbble com creativetim a li li google a href https plus google com creativetimpage https plus google com creativetimpage a li li instagram a href https www instagram com creativetimofficial https www instagram com creativetimofficial a li ul
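After cloning the repo as described in the quick start, a typical local workflow for a Vue project of this kind might look like the following. The npm script names are an assumption on my part (they are not stated in the text above); check the project's package.json for the actual commands.

```bash
git clone https://github.com/creativetimofficial/vue-argon-design-system.git
cd vue-argon-design-system
npm install        # install dependencies
npm run serve      # assumed dev-server script; could be `npm run dev` depending on the tooling
```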
os
CogStack-Pipeline
archived this project is archived and no longer maintained cogstack nifi https github com cogstack cogstack nifi tree master deploy is the successor to this project and continues to be actively maintained introduction cogstack is a lightweight distributed fault tolerant database processing architecture and ecosystem intended to make nlp processing and preprocessing easier in resource constrained environments it comprises of multiple components where cogstack pipeline the one covered in this documentation has been designed to provide a configurable data processing pipelines for working with ehr data for the moment it mainly uses databases and files as the primary source of ehr data with the possibility of adding custom data connectors soon it makes use of the java spring batch https spring io projects spring batch framework in order to provide a fully configurable data processing pipeline with the goal of generating an annotated json files that can be readily indexed into elasticsearch https www elastic co stored as files or pushed back to a database documentation for the most up to date documentation about usage of cogstack building running with example deployments please refer to the official cogstack confluence page https cogstack atlassian net wiki spaces cogdoc discussion if you have any questions why not reach out to the community discourse forum https discourse cogstack org quick start guide a name intro a introduction tutorial introduction this simple tutorial demonstrates how to get cogstack pipeline running on a sample electronic health record ehr dataset stored initially in an external database cogstack ecosystem has been designed with handling efficiently both structured and unstructured ehr data in mind it shows its strength while working with the unstructured type of data especially as some input data can be provided as documents in pdf or image formats for the moment however we only show how to run cogstack on a set of structured and free text ehrs that have been already digitalized the part covering unstructured type of data in form of pdf documents images and other clinical notes which needs to processed prior to analysis is covered in the official cogstack confluence page https cogstack atlassian net wiki spaces cogdoc this tutorial is divided into 3 parts 1 getting cogstack link getting cogstack 2 a brief description of how does cogstack pipeline work and its ecosystem link how does it work 3 running cogstack pipeline out of the box using the dataset already preloaded into a sample database link running cogstack to skip the brief description and to get hands on running cogstack pipeline please head directly to running cogstack running cogstack part the main directory with resources used in this tutorial is available in the cogstack bundle under examples this tutorial is based on the example 2 however there are more examples available to play with a name getting cogstack a getting cogstack the most convenient way to get cogstack bundle is to download it directly from the official github repository https github com cogstack cogstack pipeline either by cloning the source by using git bash git clone https github com cogstack cogstack pipeline git or by downloading the bundle from the repository s releases page https github com cogstack cogstack pipeline releases and decompressing it a name how does it work a how does cogstack work data processing workflow the data processing workflow of cogstack pipeline is based on java spring batch https spring io framework not to dwell too 
much into technical details and just to give a general idea the data is being read from a predefined data source later it follows a number of processing operations with the final result stored in a predefined data sink cogstack pipeline implements variety of data processors data readers and writers with scalability mechanisms that can be selected in cogstack job configuration although the data can be possibly read from different sources the most frequently used data sink is elasicsearch https www elastic co for more details about the cogstack functionality please refer to the cogstack documentation https cogstack atlassian net wiki spaces cogdoc overview cogstack extras fig cogstack pipeline sm2 png cogstack platform and data processing workflow content description in this tutorial we only focus on a simple and very common use case where cogstack pipeline reads and process structured and free text ehrs data from a single postgresql database the result is then stored in elasticsearch where the data can be easily queried in kibana https www elastic co products kibana dashboard however cogstack pipeline data processing engine also supports multiple data sources please see example 3 https cogstack atlassian net wiki spaces cogdoc which covers such case a sample cogstack ecosystem cogstack ecosystem consists of multiple inter connected microservices running together for the ease of use and deployment we use docker https www docker com more specifically docker compose https docs docker com compose and provide compose files for configuring and running the microservices the selection of running microservices depends mostly on the specification of ehr data source s data extraction and processing requirements in this tutorial the cogstack ecosystem is composed of the following microservices samples db postgresql database loaded with a sample dataset under db samples name cogstack pipeline cogstack data processing pipeline with worker s cogstack job repo postgresql database for storing information about cogstack jobs elasticsearch 1 elasticsearch search engine single node for storing and querying the processed ehr data kibana kibana data visualization tool for querying the data from elasticsearch since all the examples share the common configuration for the microservices used the base docker compose file is provided in examples docker common docker compose yml the docker compose file with configuration of microservices being overriden for this example can be found in examples example2 docker docker compose override yml both configuration files are automatically used by docker compose when deploying cogstack as will be shown later a name datasets a sample datasets the sample dataset used in this tutorial consists of two types of ehr data synthetic structured synthetic ehrs generated using synthea https synthetichealth github io synthea application medial reports unstructured medical health report documents obtained from mtsamples https www mtsamples com these datasets although unrelated are used together to compose a combined dataset full description of these datasets can be found in the official cogstack confluence page https cogstack atlassian net wiki spaces cogdoc a name running cogstack a running cogstack platform running cogstack pipeline for the first time for the ease of use cogstack is being deployed and run using docker however before starting the cogstack ecosystem for the first time one needs to have the database dump files for sample data either by creating them locally or downloading 
from amazon s3 to download the database dumps just type in the main examples directory bash bash download db dumps sh next a setup scripts needs to be run locally to prepare the docker images and configuration files for cogstack data processing pipeline the script is available in examples example2 path and can be run as bash bash setup sh as a result a temporary directory deploy will be created containing all the necessary artifacts to deploy cogstack docker based deployment next we can proceed to deploy cogstack ecosystem using docker compose it will configure and start microservices based on the provided compose files common base configuration copied from examples docker common docker compose yml example specific configuration copied from examples example2 docker docker compose override yml moreover the postgresql database container comes with pre initialized database dump ready to be loaded directly into in order to run cogstack type in the examples example2 deploy directory bash docker compose up in the console there will be printed status logs of the currently running microservices for the moment however they may be not very informative sorry we re working on that connecting to the microservices cogstack ecosystem the picture below sketches a general idea on how the microservices are running and communicating within a sample cogstack ecosystem used in this tutorial workflow docs quickstart assets uservices png cogstack data processing workflow connecting to es kibana and postgresql assuming that everything is working fine we should be able to connect to the running microservices selected running services elasticsearch 1 and kibana have their port connections forwarded to host localhost kibana and elasticsearch kibana dashboard used to query the ehrs can be accessed directly in browser via url http localhost 5601 the data can be queried using a number of elasticsearch indices e g sample observations view usually each index will correspond to the database view in db samples samples db postgresql database from which the data was ingested however when entering kibana dashboard for the first time an index pattern needs to be configured in the kibana management panel for more information about its creation please refer to the official kibana documentation https www elastic co guide en kibana current tutorial define index html in addition elasticsearch rest end point can be accessed via url http localhost 9200 it can be used to perform manual queries or to be used by other external services for example one can list the available indices bash curl http localhost 9200 cat indices or query one of the available indices sample observations view bash curl http localhost 9200 sample observations view for more information about possible documents querying or modification operations please refer to the official elasticsearch documentation https www elastic co guide en elasticsearch reference current getting started html as a side note the name for elasticsearch node in the docker compose has been set as elasticsearch 1 the 1 ending emphasizes that for larger scale deployments multiple elasticsearch nodes can be used typically a minimum of 3 postgresql sample database moreover the access postgresql database with the input sample data is exposed directly at localhost 5555 the database name is db sample with user test and password test to connect one can run bash psql u test w d db samples h localhost p 5555 publications cogstack experiences of deploying integrated information retrieval and extraction 
services in a large national health service foundation trust hospital richard jackson asha agrawal kenneth lui amos folarin honghan wu tudor groza angus roberts genevieve gorrell xingyi song damian lewsley doug northwood clive stringer robert stewart richard dobson bmc medical informatics and decision making 18 no 1 2018 47 https dx doi org 10 1186 2fs12911 018 0623 9 logos image placeholders partner organisation logos kcl nhs cogstack and others
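Putting the connection examples from the tutorial above in one place, here is a minimal smoke test of a running deployment. Host names, ports, credentials and the index name are the ones given in the text (localhost:9200, localhost:5555, user/password "test", db_samples, sample_observations_view); the underscored index name is an assumed reading of the flattened "sample observations view", and your own configuration may differ.

```bash
# Bring up the example 2 stack (run from examples/example2/deploy, as described above)
docker-compose up -d

# Elasticsearch: list the available indices, then query one of them
curl 'http://localhost:9200/_cat/indices'
curl 'http://localhost:9200/sample_observations_view/_search?pretty'

# PostgreSQL source database with the sample EHR data
psql -U test -W -d db_samples -h localhost -p 5555
```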
batch-processing cogstack elasticsearch spring nlp tika tesseract ocr semantic-search alerting
ai
memfault-firmware-sdk
circleci https circleci com gh memfault memfault firmware sdk svg style svg https circleci com gh memfault memfault firmware sdk coverage https img shields io codecov c gh memfault memfault firmware sdk master https codecov io gh memfault memfault firmware sdk memfault firmware sdk ship firmware with confidence more details about the memfault platform itself how it works and step by step integration guides can be found here https mflt io embedded getting started getting started to start integrating in your platform today create a memfault cloud account https mflt io signup components the sdk is designed as a collection of components so you can include only what is needed for your project the sdk has been designed to have minimal impact on code space bandwidth and power consumption the components components directory folder contains the various components of the sdk each component contains a readme md source code header files and platform header files the platform header files describe the interfaces which the component relies on that you must implement for some of the platform dependencies we have provided ports that can be linked into your system directly or used as a template for further customization you can find them in the ports ports folder for some of the popular mcus vendor sdks we have already provided a reference implementation for platform dependencies which can be found in the examples examples folder these can also serve as a good example when initially setting up the sdk on your platform main components panics fault handling coredump and reboot tracking and reboot loop detection api metrics used to monitor device health over time i e connectivity battery life mcu resource utilization hardware degradation etc please refer to the readme md in each of these for more details support components core common code that is used by all other components demo common code that is used by demo apps for the various platforms http http client api to post coredumps and events directly to the memfault service from devices util various utilities integrating the memfault sdk add memfault sdk to your repository the memfault sdk can be added directly into your repository the structure typically looks like your project third party memfault memfault firmware sdk submodule files where port to your platform will be implemented memfault platform port c memfault platform coredump regions c configuration headers memfault platform config h memfault trace reason user config def memfault metrics heartbeat config def memfault platform log config h if you are using git the memfault sdk is typically added to a project as a submodule git submodule add git github com memfault memfault firmware sdk git your project third party memfault memfault firmware sdk this makes it easy to track the history of the memfault sdk you should not need to make modifications to the memfault sdk the typical update flow is git pull the latest upstream check changes md changes md to see if any modifications are needed update to the new submodule commit in your repo alternatively the memfault sdk may be added to a project as a git subtree or by copying the source into a project add sources to build system make if you are using make makefiles memfaultworker mk can be used to very easily collect the source files and include paths required by the sdk c memfault sdk root the to the root of this repo from your project memfault components the sdk components to be used i e core util include memfault sdk root makefiles memfaultworker mk your 
src files memfault components srcs your include paths memfault components inc folders cmake if you are using cmake cmake memfault cmake in a similar fashion to collection source files and include paths c set memfault sdk root the path to the root of the memfault firmware sdk repo list append memfault components the sdk components to be used i e core util include memfault sdk root cmake memfault cmake memfault library memfault sdk root memfault components memfault components srcs memfault components inc folders memfault components srcs contains the sources needed for the library and memfault components inc folders contains the include paths other build systems if you are not using one of the above build systems to include the sdk you need to do is add the c files located at components component src c to your build system add components component include to the include paths you pass to the compiler running the unit tests the sdk code is covered extensively by unit tests they can be found in the tests folder if you d like to run them yourself check out the instructions in tests readme md tests readme md to learn more about unit testing best practices for firmware development check out our blog post on this topic https interrupt memfault com blog unit testing basics the unit tests are run by circleci upon every commit to this repo see badges at the top for build test coverage status of the master branch faq why does a coredump not show up under issues after uploading it make sure to upload the symbols to the same project to which you upload coredumps also make sure the software type and software version reported by the device see device information in components core readme md match the software type and software version that was entered when creating the software version and symbol artifact online more information on build ids and uploading symbol files can be found here https mflt io symbol file build ids i m getting error xyz what to do now don t hesitate to contact us for help you can reach us through support memfault com mailto support memfault com license unless specifically indicated otherwise in a file all memfault firmware sdk files are all licensed under the memfault license license txt a few files in the examples examples and ports ports directory are licensed differently based on vendor requirements
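A short sketch of the submodule layout described in the "add memfault sdk to your repository" step above. The https clone URL is an assumed reading of the flattened address in the text, and the third_party path mirrors the example directory structure shown there.

```bash
# Add the SDK as a submodule under third_party/, as suggested above
git submodule add https://github.com/memfault/memfault-firmware-sdk.git \
    third_party/memfault/memfault-firmware-sdk

# Typical update flow from the text: pull upstream, review CHANGES.md,
# then commit the new submodule revision in your own repository
git -C third_party/memfault/memfault-firmware-sdk pull origin master
git add third_party/memfault/memfault-firmware-sdk
git commit -m "Update memfault-firmware-sdk"
```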
memfault embedded firmware sdk rtos
os
Quad-Sim
quadcopter dynamic modeling and simulation quad sim v1 00 overview to download click download zip on the right to download all of our materials as a single file a youtube video providing a brief overview of our project was created for the 2014 matlab and simulink student design challenge this video can be viewed at http youtu be kigwzfglgio copyright c 2014 d hartman k landis m mehrer s moreno j kim contact please email reasonable questions suggestions and complaints to dchengr gmail com provided here is an assortment of materials designed to assist users in modeling and simulation of a quadcopter specifically test rig designs for component performance measurement several matlab data analysis tools and guis r2013a tested a configurable simulink quadcopter simulation and a bit more stuff the full package should be available for download at https github com dch33 quad sim these materials are partially the result of a senior design project at drexel university the team consisted of d hartman k landis m mehrer s moreno and j kim our faculty advisor was dr b c chang as this is our first attempt at a public release of our materials there are undoubtedly errors omissions and downright lies contained herein expect frequent updates as we find and correct issues we do not claim to be experts all of our materials are provided simply as a service to the multi rotor community in sincere hope that it will prove useful as a basis for further inquiry users are expected to reference our materials against more reliable sources and use their best judgment or consult professional advice where appropriate particularly where safety may be a concern quadcopters and rc vehicles are dangerous and are not toys use caution and follow all manufacturer safety instructions that said we hope you find these materials helpful good luck general instructions we provide documentation and instructions related to quadcopter dynamic modeing and simulation for control design a good starting point is to take a close look at what materials are provided within these documents and see how it fits into your project needs in general it would be advisable to add all of the matlab and simulink related folders to the matlab path so that they can be easily accessed within the matlab environment once you understand what we provide you can tackle the materials in any order or split up tasks among a team generally speaking the order of tasks should be fairly self evident and to some degree flexible depending on the needs of your project and your available resources license notice this file is part of a quadcopter dynamic modeling and simulation package quad sim quad sim is free you can redistribute it and or modify it under the terms of the gnu lesser general public license as published by the free software foundation either version 3 of the license or at your option any later version quad sim is distributed in the hope that it will be useful but without any warranty without even the implied warranty of merchantability or fitness for a particular purpose see the gnu lesser general public license for more details you should have received a copy of the gnu lesser general public license along with quad sim if not see http www gnu org licenses
os
zephyr-rtos-tutorial
a href https www zephyrproject org p align center img src images logo no bg png p a a step by step guide that teaches you how to use zephyr rtos it assumes knowledge of c no previous experience with rtos basic embedded electronics knowledge gpio timers interrupt each lesson builds on the previous one most lessons end with exercises with solutions that show how the covered concepts can be used in a practical application this tutorial is under active development if you want to participate please read the contribution guide docs contributions md a web version of this tutorial can be found here https maksimdrachov github io zephyr rtos tutorial table of contents x introduction docs introduction md x contribution guide docs contributions md x prerequisites docs prerequisites md lesson 1 zephyr setup 1 1 installation x macos docs 1 zephyr setup install mac os md linux docs 1 zephyr setup install linux md windows docs 1 zephyr setup install windows md 1 2 basic workspace setup x macos docs 1 zephyr setup setup mac os md linux docs 1 zephyr setup setup linux md windows docs 1 zephyr setup setup windows md x lesson 2 introduction x 2 1 rtos basics docs 2 introduction rtos basics md x 2 2 zephyr structure docs 2 introduction zephyr structure md x 2 3 tutorial structure docs 2 introduction tutorial structure md x lesson 3 threads x 3 1 introduction docs 3 threads introduction md x 3 2 commands docs 3 threads commands md x 3 3 kconfig docs 3 threads kconfig md x 3 4 exercise docs 3 threads exercise md x lesson 4 gpio x 4 1 introduction docs 4 gpio introduction md x 4 2 commands docs 4 gpio commands md x 4 3 kconfig docs 4 gpio kconfig md x 4 4 exercise docs 4 gpio exercise md x lesson 5 scheduling x 5 1 introduction docs 5 scheduling introduction md x 5 2 commands docs 5 scheduling commands md x 5 3 kconfig docs 5 scheduling kconfig md x 5 4 exercise docs 5 scheduling exercise md lesson 6 logging x 6 1 introduction docs 6 logging introduction md x 6 2 commands docs 6 logging commands md x 6 3 kconfig docs 6 logging kconfig md 6 4 exercise docs 6 logging exercise md lesson 7 debugging x 7 1 introduction docs 7 debugging introduction md x 7 2 commands docs 7 debugging commands md x 7 3 kconfig docs 7 debugging kconfig md x 7 4 exercise docs 7 debugging exercise md x lesson 8 interrupts x 8 1 introduction docs 8 interrupts introduction md x 8 2 commands docs 8 interrupts commands md x 8 3 kconfig docs 8 interrupts kconfig md x 8 4 exercise docs 8 interrupts exercise md lesson 9 timers x 9 1 introduction docs 9 timers introduction md x 9 2 commands docs 9 timers commands md x 9 3 kconfig docs 9 timers kconfig md x 9 4 exercise docs 9 timers exercise md lesson 10 mutexes x 10 1 introduction docs 10 mutexes introduction md x 10 2 commands docs 10 mutexes commands md x 10 3 kconfig docs 10 mutexes kconfig md 10 4 exercise docs 10 mutexes exercise md useful links general zephyr official documentation https docs zephyrproject org latest introduction to the zephyr rtos video https www youtube com watch v jr5e5kz9a k awesome zephyr https github com fkromer awesome zephyr youtube channels the linux foundation https www youtube com c linuxfoundationorg search query zephyr zephyr project https www youtube com c zephyrproject videos projects using zephyr golioth iot cloud platform https github com golioth zephyr sdk battery management system https github com scttnlsn bms mg100 iot sensor module https github com lairdcp mg100 firmware anyl embedded crypto wallet for iot https github com anylsite anyl wallet ble 
sensor https github com patrickmoffitt zephyr ble sensor pinetime smartwatch https github com endian albin pinetime hypnos uwb position tracking https github com rt loc zephyr dwm1001 zmk mechanical keyboard firmware https github com zmkfirmware zmk air quality sensor https github com exploratoryengineering air quality sensor node zephyr libraries zephyr scientific library https github com zscilib zscilib micro ros zephyr module https github com micro ros micro ros zephyr module sof sound dsp firmware https github com thesofproject sof contact follow me on twitter https twitter com maksimdrachov stay up to date on my latest blogs projects
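The setup lessons linked above walk through installing Zephyr and building a first application. As a rough orientation only, the standard west workflow looks like the sketch below; these are general Zephyr commands rather than commands copied from this tutorial, and the board name is just an example — the tutorial's own setup pages are the authoritative steps for your platform.

```bash
pip3 install west                        # west is Zephyr's meta-tool
west init ~/zephyrproject                # create a Zephyr workspace
cd ~/zephyrproject
west update                              # fetch Zephyr and its modules
west build -b nucleo_f401re zephyr/samples/basic/blinky   # board name is only an example
west flash
```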
embedded zephyr zephyr-rtos tutorial
os
MLfromscratch
ml algorithms from scratch machine learning algorithm implementations from scratch you can find tutorials with the math and code explanations on my channel here https www youtube com playlist list plqnslrfeh2upcrywf u2etjdxxkl8nl7e algorithms implemented knn linear regression logistic regression naive bayes perceptron svm decision tree random forest principal component analysis pca k means adaboost linear discriminant analysis lda installation and usage this project has 4 dependencies numpy for the maths implementation and writing the algorithms scikit learn for the data generation and testing matplotlib for the plotting pandas for loading data note that only numpy is used for the implementations the others help in testing the code and make it easy for us instead of writing that too from scratch you can install these using the command below sh linux or macos pip3 install r requirements txt windows pip install r requirements txt you can run the files as follows sh python m mlfromscratch algorithm file with algorithm file being the valid filename of the algorithm without the extension for example if i want to run the linear regression example i would do python m mlfromscratch linear regression watch the playlist alt text https img youtube com vi nglyx54e1lu hqdefault jpg https www youtube com watch v nglyx54e1lu list plqnslrfeh2upcrywf u2etjdxxkl8nl7e
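Reconstructing the install and run instructions above with their punctuation back in place; the dotted module path is an assumed reading of the flattened "python m mlfromscratch linear regression".

```bash
# Linux / macOS (use `pip install -r requirements.txt` on Windows)
pip3 install -r requirements.txt

# Run an algorithm by its module name, without the .py extension,
# e.g. the linear regression example mentioned above:
python -m mlfromscratch.linear_regression
```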
ai
Capstone_Project-Sentiment_Analysis
nlp sentiment analysis on beauty products introduction the company wants to develop a software tool that will identify the positive and negative words which customers use when they write reviews for the beauty products as their purchase inclination for that they gave their 9 years beauty products reviews between 2005 2014 and asked us to develop a model which will identify positive and negative words used in the reviews as a component of customer s sentiment towards to the company s beauty products project proposal the problem and approach to solve the problem in project proposal you can click here https github com shiningdata capstone project sentiment analysis blob master project proposal capstone 20project 20proposal pdf to reach it data the data is in standford analysis project snap webpage the original data was in a json format there in order to analyze the data i imported json package and decoded json file with using query in order to convert json file to csv file format the original dataset for this project can be found here http snap stanford edu data amazon productgraph categoryfiles reviews beauty 10 json gz data wrangling data insection and text preprocessing is completed in this section it can found here https github com shiningdata capstone project sentiment analysis blob master data wrangling amazon beauty products review sentiment analysis ipynb exploratory data analysis univarate and bivariate analysises are done by using bar charts and wordcloud visualizations these analysis can be found here https github com shiningdata capstone project sentiment analysis blob master data storytelling sentiment analysis data storytelling ipynb interim report can be found here https github com shiningdata capstone project sentiment analysis blob master interim report capstone 20project 20interim pdf modeling this is a supervised binary classification problem we are trying to predict the sentiment based on the reviews left by females who bought beauty products in amazon e commerce online platform we used python s scikit learn libraries to solve the problem in this context we implemented logistic regression random forest naive bayes xgboost catboost algorithms and simple neural network as well since the ratings of the reviews were not distributed normally we decided to decrease rating classes from 5 to 2 by merging rating 1 2 as bad and rating 4 5 as good while dropping rating 3 from the dataset for feature selection we applied threshold for word occurrence with using min df max df pca and singular value decomposition for feature engineering we applied countvectorizer tf idf hashing vectorizer and word2vec to the text data in order to turn a collection of text documents into numerical feature vectors modeling notebooks can be reached from following links count vectorizer tf idf hashing vectorizer with traditional algorithms https github com shiningdata capstone project sentiment analysis blob master deliverables sentiment analysis 1 cv tf idf hash ipynb threshold for word occurence pca singular value decomposition smote techniques used with traditional algorithms https github com shiningdata capstone project sentiment analysis blob master deliverables sentiment analysis 2 expwordlst smote pca trnctdsvd ipynb word2vec with keras https github com shiningdata capstone project sentiment analysis blob master deliverables sentiment analysis 3 word2vec keras ipynb conclusion final report can be found here https github com shiningdata capstone project sentiment analysis blob master deliverables 
capstone 20project 20final 20report pdf project presentation can be found here https github com shiningdata capstone project sentiment analysis blob master deliverables capstone 20project 20presentation pptx project presentation keynote can be found here https github com shiningdata capstone project sentiment analysis blob master deliverables capstone 20project 20presentation 20keynotes pdf
ai
Microservice
[![CircleCI](https://dl.circleci.com/status-badge/img/gh/anteneh2121/microservice/tree/main.svg?style=svg)](https://dl.circleci.com/status-badge/redirect/gh/anteneh2121/microservice/tree/main)

# Udagram Image Filtering Application

Udagram is a simple cloud application developed alongside the Udacity Cloud Engineering Nanodegree. It allows users to register and log into a web client, post photos to the feed, and process photos using an image-filtering microservice.

The project is split into two parts:
1. Frontend: Angular web application built with the Ionic Framework
2. Backend RESTful API: Node-Express application

## Getting Started

Tip: it's recommended that you start by getting the backend API running, since the frontend web application depends on the API.

### Prerequisites

1. The project depends on the Node Package Manager (npm). You will need to download and install Node from https://nodejs.org/en/download/; this will allow you to run `npm` commands.
2. Environment variables will need to be set. These environment variables include database connection details that should not be hard-coded into the application code.

#### Environment script

A file named `set_env.sh` has been prepared as an optional tool to help you configure these variables on your local development environment.

We do not want your credentials to be stored in git. After pulling this starter project, run the following command to tell git to stop tracking the script but keep it stored locally:

```bash
git rm --cached set_env.sh
```

Afterwards, we can prevent the file from being included in your solution by adding the file to our `.gitignore` file.

### 1. Database

Create a PostgreSQL database either locally or on AWS RDS. The database is used to store the application's metadata.

* We will need to use password authentication for this project. This means that a username and password are needed to authenticate and access the database.
* The port number will need to be set to `5432`. This is the typical port used by PostgreSQL, so it is usually set to this port by default.

Once your database is set up, set the config values for environment variables prefixed with `POSTGRES_` in `set_env.sh`.
* If you set up a local database, your Postgres host is most likely `localhost`.
* If you set up an RDS database, your Postgres host is most likely in the following format: `us-west-1.rds.amazonaws.com`. You can find this value in the AWS console's RDS dashboard.

### 2. S3

Create an AWS S3 bucket. The S3 bucket is used to store images that are displayed in Udagram.

Set the config values for environment variables prefixed with `AWS_` in `set_env.sh`.

### 3. Backend API

Launch the backend API locally. The API is the application's interface to S3 and the database.

* To download all the package dependencies, run the following command from the directory `udagram-api/`:

```bash
npm install
```

* To run the application locally, run:

```bash
npm run dev
```

* You can visit `http://localhost:8080/api/v0/feed` in your web browser to verify that the application is running. You should see a JSON payload. Feel free to play around with Postman to test the APIs.

### 4. Frontend App

Launch the frontend app locally.

* To download all the package dependencies, run the following command from the directory `udagram-frontend/`:

```bash
npm install
```

* Install the Ionic Framework's command-line tools so we can build and run the application:

```bash
npm install -g ionic
```

* Prepare your application by compiling it into static files:

```bash
ionic build
```

* Run the application locally using the files created by the `ionic build` command:

```bash
ionic serve
```

* You can visit `http://localhost:8100` in your web browser to verify that the application is running. You should see a web interface.

## Tips

1. Take a look at `udagram-api`: does it look like we can divide it into two modules to be deployed as separate microservices?
2. The `.dockerignore` file is included for your convenience to not copy `node_modules`. Copying this over into a Docker container might cause issues if your local environment is a different operating system than the Docker image (e.g. Windows or macOS vs. Linux).
3. It's useful to lint your code so that changes in the codebase adhere to a coding standard. This helps alleviate issues when developers use different styles of coding. ESLint has been set up for TypeScript in the codebase for you. To lint your code, run the following:

```bash
npx eslint --ext .js,.ts src/
```

To have your code fixed automatically, run:

```bash
npx eslint --ext .js,.ts src/ --fix
```

4. `set_env.sh` is really for your backend application. Frontend applications have a different notion of how to store configurations: configurations for the application endpoints can be set inside the `environments/environment.ts` files.
5. In `set_env.sh`, environment variables are set with `export VAR=value`. Setting them this way is not permanent; every time you open a new terminal you will have to run `set_env.sh` to reconfigure your environment variables. To verify that an environment variable is set, you can check it with a command like `echo $POSTGRES_USERNAME`.

### CircleCI pipeline

![image](https://user-images.githubusercontent.com/88293613/199466252-13b74f2a-3338-4542-8ec0-0775502025a7.png)

### Running container pods

![image](https://user-images.githubusercontent.com/88293613/199466882-edc9f78f-a563-43a6-adf1-e874c5030527.png)

## About Me

Anteneh Bizuneh, Backend / Cloud

[Twitter](https://twitter.com) | [GitHub](https://github.com) | [LinkedIn](https://linkedin.com)
cloud
schema
# Schema UI Framework

Schema is a modular, responsive, front-end framework to easily and quickly help you jumpstart your process in building complex interfaces for the web, right out of the box. It was created by [Dan Malarkey](http://danmalarkey.com) and the [feature[23]](http://feature23.com) team.

Learn more about Schema and how to use it by reviewing the [documentation](http://danmalarkey.github.io/schema).

## Get Started

- Download the [latest release of Schema](https://github.com/danmalarkey/schema/releases/tag/2.0.0)
- Clone the repo locally: `git clone https://github.com/danmalarkey/schema.git`
- Install with Bower: `bower install schema-ui`

## Schema Documentation

Download and use Schema at http://danmalarkey.github.io/schema

## Bugs

If you find bugs, please review the [issues section](https://github.com/danmalarkey/schema/issues) to verify no one else has reported it. If not, please [create a new issue](https://github.com/danmalarkey/schema/issues/new) and label it accordingly.

## Versioning

I'm new to developing and maintaining software, but I'm going to try my best (I'm sure I'll screw something up) to stick to [Semantic Versioning](http://semver.org).

## Changelog / Releases

As of v2 I will be documenting releases much more than I did with version one. Check out the [releases during development](https://github.com/danmalarkey/schema/releases).

## License

Schema is licensed under the [MIT license](http://opensource.org/licenses/MIT). EasyDropDown by Patrick Kunka is licensed under the [Creative Commons Attribution 3.0 Unported (CC BY 3.0)](http://creativecommons.org/licenses/by/3.0/) license.

## Contact

- Email: hello@danmalarkey.com
- [Twitter](http://twitter.com/dan_malarkey)
- Headquartered at [feature[23]](http://feature23.com), working as a UI/UX designer
front_end
ServiceNever
# ServiceNever

ServiceNever is a help desk ticketing system. Service? No, I mean never. It is just as useful as a SharePoint list, designed by IT, for IT.

[Submit a ticket](https://sharepointlist.com)

## Features

- Provides helpful troubleshooting resources for end users: request a new computer, download more RAM, restart-computer button, log-in portal, download virus
- Countdown timer for the next available phone representative
- Easy-type text fields: hit the space bar and it will enable autotyping
- Hover over the description box and the cursor will change to a loading icon
- Description will type in rainbow font for better viewing
- Where's Waldo captcha, designed to keep all bots (and users) out
- First submit button flashes red and will also move away from the cursor
- Second submit button underneath that takes the "description of problem" box and conveniently places it into a https://lmgt.org search
- Added chat box so users can now chat with tech support
os
DataModelingWithPostgres
# Modeling the Sparkify Database with Postgres

## Project Summary

This project models and transforms the song and user data for the Sparkify company using Postgres so that it can be analyzed by their analytical team. It uses a star schema with dimension and fact tables to make it easy to write analytical SQL queries on the database, and it uses a Python script to ETL the data from the song and log files into the Postgres database.

## How to Run the Project

Please run the following Python scripts:
1. `create_tables.py`: creates the database and the dimension and fact tables for the Sparkify database.
2. `etl.py`: loads the data into the Sparkify database from the song and log files.

## Explanation of Files

You'll find the following files in the repo:
1. `test.ipynb`: displays the first few rows of each table to let you check your database.
2. `create_tables.py`: drops and creates your tables. You run this file to reset your tables before each time you run your ETL scripts.
3. `etl.ipynb`: reads and processes a single file from `song_data` and `log_data` and loads the data into your tables. This notebook contains detailed instructions on the ETL process for each of the tables.
4. `etl.py`: reads and processes files from `song_data` and `log_data` and loads them into your tables. You can fill this out based on your work in the ETL notebook.
5. `sql_queries.py`: contains all your SQL queries, and is imported into the last three files above.
6. `README.md`: provides discussion on your project.
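To make the star-schema idea concrete, here is a minimal, hypothetical sketch of how a dimension table, a fact table, and one ETL insert might look, using psycopg2 as the Postgres driver. The table and column names (`users`, `songplays`, etc.), the sample row, and the connection string are illustrative assumptions; the project's real definitions live in `sql_queries.py` and `create_tables.py`.

```python
# Hypothetical star-schema sketch; table names, columns, sample data,
# and credentials are assumptions for illustration only.
import psycopg2

CREATE_USERS = """
CREATE TABLE IF NOT EXISTS users (
    user_id    INT PRIMARY KEY,
    first_name TEXT,
    last_name  TEXT,
    level      TEXT
);
"""

CREATE_SONGPLAYS = """
CREATE TABLE IF NOT EXISTS songplays (
    songplay_id SERIAL PRIMARY KEY,
    start_time  TIMESTAMP NOT NULL,
    user_id     INT REFERENCES users (user_id),
    song_id     TEXT,
    level       TEXT
);
"""

def main():
    # Assumed local connection details; adjust to your environment.
    conn = psycopg2.connect("host=localhost dbname=sparkifydb user=student password=student")
    cur = conn.cursor()

    # Dimension table first, then the fact table that references it.
    cur.execute(CREATE_USERS)
    cur.execute(CREATE_SONGPLAYS)

    # One example row, as an ETL script would insert for each log record.
    cur.execute(
        "INSERT INTO users (user_id, first_name, last_name, level) "
        "VALUES (%s, %s, %s, %s) ON CONFLICT (user_id) DO NOTHING;",
        (1, "Ada", "Lovelace", "free"),
    )
    cur.execute(
        "INSERT INTO songplays (start_time, user_id, song_id, level) "
        "VALUES (%s, %s, %s, %s);",
        ("2018-11-01 21:01:46", 1, "SO_EXAMPLE_ID", "free"),
    )

    conn.commit()
    conn.close()

if __name__ == "__main__":
    main()
```

With tables shaped like this, the analytical team can answer questions with simple joins between the fact table and its dimensions, which is the main payoff of the star schema described above.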
server
NLP-Guide
h1 align center img src https user images githubusercontent com 45159366 131386286 e23991d5 a1aa 4ee9 9582 874dc0854c1a png br natural language processing nlp guide h1 a href https github com mikeroyal tab followers img alt followers title follow me for updates src https custom icon badges demolab com github followers mikeroyal color 236ad3 labelcolor 1155ba style for the badge logo person add label follow logocolor white a a guide covering natural language processing nlp including the applications libraries and tools that will make you a better and more efficient natural language processing nlp development note you can easily convert this markdown file to a pdf in vscode https code visualstudio com using this handy extension markdown pdf https marketplace visualstudio com items itemname yzane markdown pdf p align center img src https user images githubusercontent com 45159366 131386296 7fc72332 fb40 421c 82e1 76be7acc6307 png br p table of contents 1 getting started with natural language processing nlp https github com mikeroyal nlp guide getting started with nlp developer resources developer resources nlp training courses certifications nlp training courses certifications 2 natural language processing nlp tools libraries and frameworks https github com mikeroyal nlp guide nlp tools libraries and frameworks 3 algorithms https github com mikeroyal nlp guide algorithms 4 machine learning https github com mikeroyal nlp guide machine learning 5 cuda development https github com mikeroyal nlp guide cuda development 6 matlab development https github com mikeroyal nlp guide matlab development 7 c c development https github com mikeroyal nlp guide cc development 8 python development https github com mikeroyal nlp guide python development 9 java development https github com mikeroyal nlp guide java development 10 r development https github com mikeroyal nlp guide r development 11 julia development https github com mikeroyal nlp guide julia development getting started with nlp back to the top table of contents natural language processing nlp https www ibm com cloud learn natural language processing is a branch of artificial intelligence ai focused on giving computers the ability to understand text and spoken words in much the same way human beings can nlp combines computational linguistics rule based modeling of human language with statistical machine learning and deep learning models developer resources natural language processing with python s nltk package https realpython com nltk nlp python cognitive services apis for ai developers microsoft azure https azure microsoft com en us services cognitive services artificial intelligence services amazon web services aws https aws amazon com machine learning ai services google cloud natural language api https cloud google com natural language docs reference rest nlp training courses certifications top natural language processing courses online udemy https www udemy com topic natural language processing introduction to natural language processing nlp udemy https www udemy com course natural language processing top natural language processing courses coursera https www coursera org courses query natural 20language 20processing natural language processing coursera https www coursera org learn language processing natural language processing in tensorflow coursera https www coursera org learn natural language processing tensorflow learn natural language processing with online courses and lessons edx https www edx org learn natural language processing build a 
natural language processing solution with microsoft azure pluralsight https www pluralsight com courses build natural language processing solution microsoft azure natural language processing nlp training courses nobleprog https www nobleprog com nlp training natural language processing with deep learning course standford online https online stanford edu courses cs224n natural language processing deep learning advanced natural language processing mit opencourseware https ocw mit edu courses electrical engineering and computer science 6 864 advanced natural language processing fall 2005 certified natural language processing expert certification iabac https iabac org artificial intelligence certification certified natural language processing expert natural language processing course intel https software intel com content www us en develop training course natural language processing html nlp tools libraries and frameworks back to the top https github com mikeroyal nlp guide table of contents natural language toolkit nltk https www nltk org is a leading platform for building python programs to work with human language data it provides easy to use interfaces to over 50 corpora and lexical resources https nltk org nltk data such as wordnet along with a suite of text processing libraries for classification tokenization stemming tagging parsing and semantic reasoning wrappers for industrial strength nlp libraries spacy https spacy io is a library for advanced natural language processing in python and cython it s built on the very latest research and was designed from day one to be used in real products spacy comes with pretrained pipelines and currently supports tokenization and training for 60 languages it also features neural network models for tagging parsing named entity recognition text classification and more multi task learning with pretrained transformers like bert corenlp https stanfordnlp github io corenlp is a set of natural language analysis tools written in java corenlp enables users to derive linguistic annotations for text including token and sentence boundaries parts of speech named entities numeric and time values dependency and constituency parses coreference sentiment quote attributions and relations nlpnet https github com erickrf nlpnet is a python library for natural language processing tasks based on neural networks it performs part of speech tagging semantic role labeling and dependency parsing flair https github com flairnlp flair is a simple framework for state of the art natural language processing nlp models to your text such as named entity recognition ner part of speech tagging pos special support for biomedical data sense disambiguation and classification with support for a rapidly growing number of languages catalyst https github com curiosity ai catalyst is a c natural language processing library built for speed inspired by spacy s design https spacy io it brings pre trained models out of the box support for training word and document embeddings and flexible entity recognition models apache opennlp https opennlp apache org is an open source library for a machine learning based toolkit used in the processing of natural language text it features an api for use cases like named entity recognition https en wikipedia org wiki named entity recognition sentence detection pos part of speech tagging https en wikipedia org wiki part of speech tagging tokenization https en wikipedia org wiki tokenization data security feature extraction https en wikipedia org wiki feature 
extraction chunking https en wikipedia org wiki chunking psychology parsing https en wikipedia org wiki parsing and coreference resolution https en wikipedia org wiki coreference dynet https github com clab dynet is a neural network library developed by carnegie mellon university and many others it is written in c with bindings in python and is designed to be efficient when run on either cpu or gpu and to work well with networks that have dynamic structures that change for every training instance these kinds of networks are particularly important in natural language processing tasks and dynet has been used to build state of the art systems for syntactic parsing machine translation morphological inflection and many other application areas mlpack https mlpack org is a fast flexible c machine learning library written in c and built on the armadillo https arma sourceforge net linear algebra library the ensmallen https ensmallen org numerical optimization library and parts of boost https boost org opennn https www opennn net is an open source neural networks library for machine learning it contains sophisticated algorithms and utilities to deal with many artificial intelligence solutions microsoft cognitive toolkit cntk https docs microsoft com en us cognitive toolkit is an open source toolkit for commercial grade distributed deep learning it describes neural networks as a series of computational steps via a directed graph cntk allows the user to easily realize and combine popular model types such as feed forward dnns convolutional neural networks cnns and recurrent neural networks rnns lstms cntk implements stochastic gradient descent sgd error backpropagation learning with automatic differentiation and parallelization across multiple gpus and servers nvidia cudnn https developer nvidia com cudnn is a gpu accelerated library of primitives for deep neural networks https developer nvidia com deep learning cudnn provides highly tuned implementations for standard routines such as forward and backward convolution pooling normalization and activation layers cudnn accelerates widely used deep learning frameworks including caffe2 https caffe2 ai chainer https chainer org keras https keras io matlab https www mathworks com solutions deep learning html mxnet https mxnet incubator apache org pytorch https pytorch org and tensorflow https www tensorflow org tensorflow https www tensorflow org is an end to end open source platform for machine learning it has a comprehensive flexible ecosystem of tools libraries and community resources that lets researchers push the state of the art in ml and developers easily build and deploy ml powered applications tensorflow macos https github com apple tensorflow macos is a mac optimized version of tensorflow and tensorflow addons for macos 11 0 accelerated using apple s ml compute framework tensorflow model optimization toolkit https www tensorflow org model optimization is a suite of tools that users both novice and advanced can use to optimize machine learning models for deployment and execution it provides supported techniques that include quantization and pruning for sparse weights along with apis built specifically for keras deepspars https github com neuralmagic deepsparse is an inference runtime offering gpu class performance on cpus and apis to integrate ml into your application intel neural compressor https github com intel neural compressor is a low precision optimization tool targeting to provide unified apis for network compression technologies such as low 
precision quantization sparsity pruning knowledge distillation across different deep learning frameworks to pursue optimal inference performance keras https keras io is a high level neural networks api written in python and capable of running on top of tensorflow cntk or theano it was developed with a focus on enabling fast experimentation it is capable of running on top of tensorflow microsoft cognitive toolkit r theano or plaidml pytorch https pytorch org is a library for deep learning on irregular input data such as graphs point clouds and manifolds primarily developed by facebook s ai research lab eclipse deeplearning4j dl4j https deeplearning4j konduit ai is a set of projects intended to support all the needs of a jvm based scala kotlin clojure and groovy deep learning application this means starting with the raw data loading and preprocessing it from wherever and whatever format it is in to building and tuning a wide variety of simple and complex deep learning networks chainer https chainer org is a python based deep learning framework aiming at flexibility it provides automatic differentiation apis based on the define by run approach dynamic computational graphs as well as object oriented high level apis to build and train neural networks it also supports cuda cudnn using cupy https github com cupy cupy for high performance training and inference anaconda https www anaconda com is a very popular data science platform for machine learning and deep learning that enables users to develop models train them and deploy them plaidml https github com plaidml plaidml is an advanced and portable tensor compiler for enabling deep learning on laptops embedded devices or other devices where the available computing hardware is not well supported or the available software stack contains unpalatable license restrictions scikit learn https scikit learn org stable index html is a python module for machine learning built on top of scipy numpy and matplotlib making it easier to apply robust and simple implementations of many popular machine learning algorithms caffe https github com bvlc caffe is a deep learning framework made with expression speed and modularity in mind it is developed by berkeley ai research bair the berkeley vision and learning center bvlc and community contributors theano https github com theano theano is a python library that allows you to define optimize and evaluate mathematical expressions involving multi dimensional arrays efficiently including tight integration with numpy apache spark https spark apache org is a unified analytics engine for large scale data processing it provides high level apis in scala java python and r and an optimized engine that supports general computation graphs for data analysis it also supports a rich set of higher level tools including spark sql for sql and dataframes mllib for machine learning graphx for graph processing and structured streaming for stream processing apache spark connector for sql server and azure sql https github com microsoft sql spark connector is a high performance connector that enables you to use transactional data in big data analytics and persists results for ad hoc queries or reporting the connector allows you to use any sql database on premises or in the cloud as an input data source or output data sink for spark jobs apache predictionio https predictionio apache org is an open source machine learning framework for developers data scientists and end users it supports event collection deployment of algorithms evaluation 
querying predictive results via rest apis it is based on scalable open source services like hadoop hbase and other dbs elasticsearch spark and implements what is called a lambda architecture apache airflow https airflow apache org is an open source workflow management platform created by the community to programmatically author schedule and monitor workflows airflow has a modular architecture and uses a message queue to orchestrate an arbitrary number of workers airflow is ready to scale to infinity open neural network exchange onnx https github com onnx is an open ecosystem that empowers ai developers to choose the right tools as their project evolves onnx provides an open source format for ai models both deep learning and traditional ml it defines an extensible computation graph model as well as definitions of built in operators and standard data types bigdl https bigdl project github io is a distributed deep learning library for apache spark with bigdl users can write their deep learning applications as standard spark programs which can directly run on top of existing spark or hadoop clusters numba https github com numba numba is an open source numpy aware optimizing compiler for python sponsored by anaconda inc it uses the llvm compiler project to generate machine code from python syntax numba can compile a large subset of numerically focused python including many numpy functions additionally numba has support for automatic parallelization of loops generation of gpu accelerated code and creation of ufuncs and c callbacks algorithms back to the top https github com mikeroyal nlp guide table of contents fuzzy logic https www investopedia com terms f fuzzy logic asp is a heuristic approach that allows for more advanced decision tree processing and better integration with rules based programming p align center img src https user images githubusercontent com 45159366 123861872 858dce80 d8dc 11eb 9a2c 51205d1541e9 png br p architecture of a fuzzy logic system source researchgate https www researchgate net figure architecture of a fuzzy logic system fig2 309452475 support vector machine svm https web stanford edu hastie mooc slides svm pdf is a supervised machine learning model that uses classification algorithms for two group classification problems p align center img src https user images githubusercontent com 45159366 123858065 ec5cb900 d8d7 11eb 81c5 c6a8feefa84f png br p support vector machine svm source openclipart https openclipart org detail 182977 svm support vector machines neural networks https www ibm com cloud learn neural networks are a subset of machine learning and are at the heart of deep learning algorithms the name structure is inspired by the human brain copying the process that biological neurons nodes signal to one another p align center img src https user images githubusercontent com 45159366 123858036 e5ce4180 d8d7 11eb 8c52 43d7c7e6e3c4 png br p deep neural network source ibm https www ibm com cloud learn neural networks convolutional neural networks r cnn https stanford edu shervine teaching cs 230 cheatsheet convolutional neural networks is an object detection algorithm that first segments the image to find potential relevant bounding boxes and then run the detection algorithm to find most probable objects in those bounding boxes p align center img src https user images githubusercontent com 45159366 123858026 e36be780 d8d7 11eb 9034 8859d6f09490 png br p convolutional neural networks source cs231n https cs231n github io convolutional networks conv recurrent neural 
networks rnns https www ibm com cloud learn recurrent neural networks is a type of artificial neural network which uses sequential data or time series data p align center img src https user images githubusercontent com 45159366 123858062 ebc42280 d8d7 11eb 9252 97e058bda8bd png br p recurrent neural networks source slideteam https www slideteam net recurrent neural networks rnns ppt powerpoint presentation file templates html multilayer perceptrons mlps https deepai org machine learning glossary and terms multilayer perceptron is multi layer neural networks composed of multiple layers of perceptrons https en wikipedia org wiki perceptron with a threshold activation p align center img src https user images githubusercontent com 45159366 123858053 e8c93200 d8d7 11eb 844c 60463ecf662c png br p multilayer perceptrons source deepai https deepai org machine learning glossary and terms multilayer perceptron random forest https www ibm com cloud learn random forest is a commonly used machine learning algorithm which combines the output of multiple decision trees to reach a single result a decision tree in a forest cannot be pruned for sampling and therefore prediction selection its ease of use and flexibility have fueled its adoption as it handles both classification and regression problems p align center img src https user images githubusercontent com 45159366 124398881 fe21d000 dccc 11eb 8f5f 0a0730d85d55 png br p random forest source wikimedia https community tibco com wiki random forest template tibco spotfirer wiki page decision trees https www cs cmu edu bhiksha courses 10 601 decisiontrees are tree structured models for classification and regression p align center img src https user images githubusercontent com 45159366 124398883 ffeb9380 dccc 11eb 9adb 66729a353132 png br p decision trees source cmu http www cs cmu edu bhiksha courses 10 601 decisiontrees naive bayes https en wikipedia org wiki naive bayes classifier is a machine learning algorithm that is used solved calssification problems it s based on applying bayes theorem https www mathsisfun com data bayes theorem html with strong independence assumptions between the features p align center img src https user images githubusercontent com 45159366 124398885 00842a00 dccd 11eb 89c1 bd4c1adbf305 png br p bayes theorem source mathisfun https www mathsisfun com data bayes theorem html machine learning back to the top https github com mikeroyal nlp guide table of contents p align center img src https user images githubusercontent com 45159366 96352527 ad077880 1078 11eb 98b7 da1c0586cf0e png br p img src https user images githubusercontent com 45159366 105645196 dccfd480 5e4e 11eb 95d1 c5eb560b72fd jpeg machine learning deep learning frameworks learning resources for ml machine learning https www ibm com cloud learn machine learning is a branch of artificial intelligence ai focused on building apps using algorithms that learn from data models and improve their accuracy over time without needing to be programmed machine learning by stanford university from coursera https www coursera org learn machine learning aws training and certification for machine learning ml courses https aws amazon com training learning paths machine learning machine learning scholarship program for microsoft azure from udacity https www udacity com scholarships machine learning scholarship microsoft azure microsoft certified azure data scientist associate https docs microsoft com en us learn certifications azure data scientist microsoft certified azure ai engineer 
associate https docs microsoft com en us learn certifications azure ai engineer azure machine learning training and deployment https docs microsoft com en us azure devops pipelines targets azure machine learning learning machine learning and artificial intelligence from google cloud training https cloud google com training machinelearning ai machine learning crash course for google cloud https developers google com machine learning crash course jupyterlab https jupyterlab readthedocs io scheduling jupyter notebooks on amazon sagemaker ephemeral instances https aws amazon com blogs machine learning scheduling jupyter notebooks on sagemaker ephemeral instances how to run jupyter notebooks in your azure machine learning workspace https docs microsoft com en us azure machine learning how to run jupyter notebooks machine learning courses online from udemy https www udemy com topic machine learning machine learning courses online from coursera https www coursera org courses query machine 20learning learn machine learning with online courses and classes from edx https www edx org learn machine learning ml frameworks libraries and tools tensorflow https www tensorflow org is an end to end open source platform for machine learning it has a comprehensive flexible ecosystem of tools libraries and community resources that lets researchers push the state of the art in ml and developers easily build and deploy ml powered applications keras https keras io is a high level neural networks api written in python and capable of running on top of tensorflow cntk or theano it was developed with a focus on enabling fast experimentation it is capable of running on top of tensorflow microsoft cognitive toolkit r theano or plaidml pytorch https pytorch org is a library for deep learning on irregular input data such as graphs point clouds and manifolds primarily developed by facebook s ai research lab amazon sagemaker https aws amazon com sagemaker is a fully managed service that provides every developer and data scientist with the ability to build train and deploy machine learning ml models quickly sagemaker removes the heavy lifting from each step of the machine learning process to make it easier to develop high quality models azure databricks https azure microsoft com en us services databricks is a fast and collaborative apache spark based big data analytics service designed for data science and data engineering azure databricks sets up your apache spark environment in minutes autoscale and collaborate on shared projects in an interactive workspace azure databricks supports python scala r java and sql as well as data science frameworks and libraries including tensorflow pytorch and scikit learn microsoft cognitive toolkit cntk https docs microsoft com en us cognitive toolkit is an open source toolkit for commercial grade distributed deep learning it describes neural networks as a series of computational steps via a directed graph cntk allows the user to easily realize and combine popular model types such as feed forward dnns convolutional neural networks cnns and recurrent neural networks rnns lstms cntk implements stochastic gradient descent sgd error backpropagation learning with automatic differentiation and parallelization across multiple gpus and servers apple coreml https developer apple com documentation coreml is a framework that helps integrate machine learning models into your app core ml provides a unified representation for all models your app uses core ml apis and user data to make predictions and to 
train or fine tune models all on the user s device a model is the result of applying a machine learning algorithm to a set of training data you use a model to make predictions based on new input data tensorflow macos https github com apple tensorflow macos is a mac optimized version of tensorflow and tensorflow addons for macos 11 0 accelerated using apple s ml compute framework apache opennlp https opennlp apache org is an open source library for a machine learning based toolkit used in the processing of natural language text it features an api for use cases like named entity recognition https en wikipedia org wiki named entity recognition sentence detection pos part of speech tagging https en wikipedia org wiki part of speech tagging tokenization https en wikipedia org wiki tokenization data security feature extraction https en wikipedia org wiki feature extraction chunking https en wikipedia org wiki chunking psychology parsing https en wikipedia org wiki parsing and coreference resolution https en wikipedia org wiki coreference apache airflow https airflow apache org is an open source workflow management platform created by the community to programmatically author schedule and monitor workflows install principles scalable airflow has a modular architecture and uses a message queue to orchestrate an arbitrary number of workers airflow is ready to scale to infinity open neural network exchange onnx https github com onnx is an open ecosystem that empowers ai developers to choose the right tools as their project evolves onnx provides an open source format for ai models both deep learning and traditional ml it defines an extensible computation graph model as well as definitions of built in operators and standard data types apache mxnet https mxnet apache org is a deep learning framework designed for both efficiency and flexibility it allows you to mix symbolic and imperative programming to maximize efficiency and productivity at its core mxnet contains a dynamic dependency scheduler that automatically parallelizes both symbolic and imperative operations on the fly a graph optimization layer on top of that makes symbolic execution fast and memory efficient mxnet is portable and lightweight scaling effectively to multiple gpus and multiple machines support for python r julia scala go javascript and more autogluon https autogluon mxnet io index html is toolkit for deep learning that automates machine learning tasks enabling you to easily achieve strong predictive performance in your applications with just a few lines of code you can train and deploy high accuracy deep learning models on tabular image and text data anaconda https www anaconda com is a very popular data science platform for machine learning and deep learning that enables users to develop models train them and deploy them plaidml https github com plaidml plaidml is an advanced and portable tensor compiler for enabling deep learning on laptops embedded devices or other devices where the available computing hardware is not well supported or the available software stack contains unpalatable license restrictions opencv https opencv org is a highly optimized library with focus on real time computer vision applications the c python and java interfaces support linux macos windows ios and android scikit learn https scikit learn org stable index html is a python module for machine learning built on top of scipy numpy and matplotlib making it easier to apply robust and simple implementations of many popular machine learning algorithms weka 
https www cs waikato ac nz ml weka is an open source machine learning software that can be accessed through a graphical user interface standard terminal applications or a java api it is widely used for teaching research and industrial applications contains a plethora of built in tools for standard machine learning tasks and additionally gives transparent access to well known toolboxes such as scikit learn r and deeplearning4j caffe https github com bvlc caffe is a deep learning framework made with expression speed and modularity in mind it is developed by berkeley ai research bair the berkeley vision and learning center bvlc and community contributors theano https github com theano theano is a python library that allows you to define optimize and evaluate mathematical expressions involving multi dimensional arrays efficiently including tight integration with numpy ngraph https github com nervanasystems ngraph is an open source c library compiler and runtime for deep learning the ngraph compiler aims to accelerate developing ai workloads using any deep learning framework and deploying to a variety of hardware targets it provides the freedom performance and ease of use to ai developers nvidia cudnn https developer nvidia com cudnn is a gpu accelerated library of primitives for deep neural networks https developer nvidia com deep learning cudnn provides highly tuned implementations for standard routines such as forward and backward convolution pooling normalization and activation layers cudnn accelerates widely used deep learning frameworks including caffe2 https caffe2 ai chainer https chainer org keras https keras io matlab https www mathworks com solutions deep learning html mxnet https mxnet incubator apache org pytorch https pytorch org and tensorflow https www tensorflow org jupyter notebook https jupyter org is an open source web application that allows you to create and share documents that contain live code equations visualizations and narrative text jupyter is used widely in industries that do data cleaning and transformation numerical simulation statistical modeling data visualization data science and machine learning apache spark https spark apache org is a unified analytics engine for large scale data processing it provides high level apis in scala java python and r and an optimized engine that supports general computation graphs for data analysis it also supports a rich set of higher level tools including spark sql for sql and dataframes mllib for machine learning graphx for graph processing and structured streaming for stream processing apache spark connector for sql server and azure sql https github com microsoft sql spark connector is a high performance connector that enables you to use transactional data in big data analytics and persists results for ad hoc queries or reporting the connector allows you to use any sql database on premises or in the cloud as an input data source or output data sink for spark jobs apache predictionio https predictionio apache org is an open source machine learning framework for developers data scientists and end users it supports event collection deployment of algorithms evaluation querying predictive results via rest apis it is based on scalable open source services like hadoop hbase and other dbs elasticsearch spark and implements what is called a lambda architecture cluster manager for apache kafka cmak https github com yahoo cmak is a tool for managing apache kafka https kafka apache org clusters bigdl https bigdl project github io is a 
distributed deep learning library for apache spark with bigdl users can write their deep learning applications as standard spark programs which can directly run on top of existing spark or hadoop clusters eclipse deeplearning4j dl4j https deeplearning4j konduit ai is a set of projects intended to support all the needs of a jvm based scala kotlin clojure and groovy deep learning application this means starting with the raw data loading and preprocessing it from wherever and whatever format it is in to building and tuning a wide variety of simple and complex deep learning networks tensorman https github com pop os tensorman is a utility for easy management of tensorflow containers by developed by system76 https system76 com tensorman allows tensorflow to operate in an isolated environment that is contained from the rest of the system this virtual environment can operate independent of the base system allowing you to use any version of tensorflow on any version of a linux distribution that supports the docker runtime numba https github com numba numba is an open source numpy aware optimizing compiler for python sponsored by anaconda inc it uses the llvm compiler project to generate machine code from python syntax numba can compile a large subset of numerically focused python including many numpy functions additionally numba has support for automatic parallelization of loops generation of gpu accelerated code and creation of ufuncs and c callbacks chainer https chainer org is a python based deep learning framework aiming at flexibility it provides automatic differentiation apis based on the define by run approach dynamic computational graphs as well as object oriented high level apis to build and train neural networks it also supports cuda cudnn using cupy https github com cupy cupy for high performance training and inference xgboost https xgboost readthedocs io is an optimized distributed gradient boosting library designed to be highly efficient flexible and portable it implements machine learning algorithms under the gradient boosting framework xgboost provides a parallel tree boosting also known as gbdt gbm that solve many data science problems in a fast and accurate way it supports distributed training on multiple machines including aws gce azure and yarn clusters also it can be integrated with flink spark and other cloud dataflow systems cuml https github com rapidsai cuml is a suite of libraries that implement machine learning algorithms and mathematical primitives functions that share compatible apis with other rapids projects cuml enables data scientists researchers and software engineers to run traditional tabular ml tasks on gpus without going into the details of cuda programming in most cases cuml s python api matches the api from scikit learn cuda development back to the top https github com mikeroyal nlp guide table of contents p align center img src https user images githubusercontent com 45159366 94306481 e17b8f00 ff27 11ea 832f c85374acb3b1 png br p p align center img src https user images githubusercontent com 45159366 117718735 55a23480 b191 11eb 874d e690d09cd490 png br p cuda toolkit source nvidia developer cuda https developer nvidia com cuda zone cuda learning resources cuda https developer nvidia com cuda zone is a parallel computing platform and programming model developed by nvidia for general computing on graphical processing units gpus with cuda developers are able to dramatically speed up computing applications by harnessing the power of gpus in gpu accelerated 
applications the sequential part of the workload runs on the cpu which is optimized for single threaded the compute intensive portion of the application runs on thousands of gpu cores in parallel when using cuda developers can program in popular languages such as c c fortran python and matlab cuda toolkit documentation https docs nvidia com cuda index html cuda quick start guide https docs nvidia com cuda cuda quick start guide index html cuda on wsl https docs nvidia com cuda wsl user guide index html cuda gpu support for tensorflow https www tensorflow org install gpu nvidia deep learning cudnn documentation https docs nvidia com deeplearning cudnn api index html nvidia gpu cloud documentation https docs nvidia com ngc ngc introduction index html nvidia ngc https ngc nvidia com is a hub for gpu optimized software for deep learning machine learning and high performance computing hpc workloads nvidia ngc containers https www nvidia com en us gpu cloud containers is a registry that provides researchers data scientists and developers with simple access to a comprehensive catalog of gpu accelerated software for ai machine learning and hpc these containers take full advantage of nvidia gpus on premises and in the cloud cuda tools libraries and frameworks cuda toolkit https developer nvidia com cuda downloads is a collection of tools libraries that provide a development environment for creating high performance gpu accelerated applications the cuda toolkit allows you can develop optimize and deploy your applications on gpu accelerated embedded systems desktop workstations enterprise data centers cloud based platforms and hpc supercomputers the toolkit includes gpu accelerated libraries debugging and optimization tools a c c compiler and a runtime library to build and deploy your application on major architectures including x86 arm and power nvidia cudnn https developer nvidia com cudnn is a gpu accelerated library of primitives for deep neural networks https developer nvidia com deep learning cudnn provides highly tuned implementations for standard routines such as forward and backward convolution pooling normalization and activation layers cudnn accelerates widely used deep learning frameworks including caffe2 https caffe2 ai chainer https chainer org keras https keras io matlab https www mathworks com solutions deep learning html mxnet https mxnet incubator apache org pytorch https pytorch org and tensorflow https www tensorflow org cuda x hpc https www nvidia com en us technologies cuda x is a collection of libraries tools compilers and apis that help developers solve the world s most challenging problems cuda x hpc includes highly tuned kernels essential for high performance computing hpc nvidia container toolkit https github com nvidia nvidia docker is a collection of tools libraries that allows users to build and run gpu accelerated docker containers the toolkit includes a container runtime library https github com nvidia libnvidia container and utilities to automatically configure containers to leverage nvidia gpus minkowski engine https nvidia github io minkowskiengine is an auto differentiation library for sparse tensors it supports all standard neural network layers such as convolution pooling unpooling and broadcasting operations for sparse tensors cutlass https github com nvidia cutlass is a collection of cuda c template abstractions for implementing high performance matrix multiplication gemm at all levels and scales within cuda it incorporates strategies for hierarchical 
decomposition and data movement similar to those used to implement cublas cub https github com nvidia cub is a cooperative primitives for cuda c kernel authors tensorman https github com pop os tensorman is a utility for easy management of tensorflow containers by developed by system76 https system76 com tensorman allows tensorflow to operate in an isolated environment that is contained from the rest of the system this virtual environment can operate independent of the base system allowing you to use any version of tensorflow on any version of a linux distribution that supports the docker runtime numba https github com numba numba is an open source numpy aware optimizing compiler for python sponsored by anaconda inc it uses the llvm compiler project to generate machine code from python syntax numba can compile a large subset of numerically focused python including many numpy functions additionally numba has support for automatic parallelization of loops generation of gpu accelerated code and creation of ufuncs and c callbacks chainer https chainer org is a python based deep learning framework aiming at flexibility it provides automatic differentiation apis based on the define by run approach dynamic computational graphs as well as object oriented high level apis to build and train neural networks it also supports cuda cudnn using cupy https github com cupy cupy for high performance training and inference cupy https cupy dev is an implementation of numpy compatible multi dimensional array on cuda cupy consists of the core multi dimensional array class cupy ndarray and many functions on it it supports a subset of numpy ndarray interface catboost https catboost ai is a fast scalable high performance gradient boosting https en wikipedia org wiki gradient boosting on decision trees library used for ranking classification regression and other machine learning tasks for python r java c supports computation on cpu and gpu cudf https rapids ai is a gpu dataframe library for loading joining aggregating filtering and otherwise manipulating data cudf provides a pandas like api that will be familiar to data engineers data scientists so they can use it to easily accelerate their workflows without going into the details of cuda programming cuml https github com rapidsai cuml is a suite of libraries that implement machine learning algorithms and mathematical primitives functions that share compatible apis with other rapids projects cuml enables data scientists researchers and software engineers to run traditional tabular ml tasks on gpus without going into the details of cuda programming in most cases cuml s python api matches the api from scikit learn arrayfire https arrayfire com is a general purpose library that simplifies the process of developing software that targets parallel and massively parallel architectures including cpus gpus and other hardware acceleration devices thrust https github com nvidia thrust is a c parallel programming library which resembles the c standard library thrust s high level interface greatly enhances programmer productivity while enabling performance portability between gpus and multicore cpus aresdb https eng uber com aresdb is a gpu powered real time analytics storage and query engine it features low query latency high data freshness and highly efficient in memory and on disk storage management arraymancer https mratsim github io arraymancer is a tensor n dimensional array project in nim the main focus is providing a fast and ergonomic cpu cuda and opencl ndarray 
library on which to build a scientific computing ecosystem kintinuous https github com mp3guy kintinuous is a real time dense visual slam system capable of producing high quality globally consistent point and mesh reconstructions over hundreds of metres in real time with only a low cost commodity rgb d sensor graphvite https graphvite io is a general graph embedding engine dedicated to high speed and large scale embedding learning in various applications matlab development back to the top https github com mikeroyal nlp guide table of contents p align center img src https user images githubusercontent com 45159366 94306473 de809e80 ff27 11ea 924b 0a6947ae38bc png br p matlab learning resources matlab https www mathworks com products matlab html is a programming language that does numerical computing such as expressing matrix and array mathematics directly matlab documentation https www mathworks com help matlab getting started with matlab https www mathworks com help matlab getting started with matlab html matlab and simulink training from matlab academy https matlabacademy mathworks com mathworks certification program https www mathworks com services training certification html matlab online courses from udemy https www udemy com topic matlab matlab online courses from coursera https www coursera org courses query matlab matlab online courses from edx https www edx org learn matlab building a matlab gui https www mathworks com discovery matlab gui html matlab style guidelines 2 0 https www mathworks com matlabcentral fileexchange 46056 matlab style guidelines 2 0 setting up git source control with matlab simulink https www mathworks com help matlab matlab prog set up git source control html pull push and fetch files with git with matlab simulink https www mathworks com help matlab matlab prog push and fetch with git html create new repository with matlab simulink https www mathworks com help matlab matlab prog add folder to source control html prmlt http prml github io is matlab code for machine learning algorithms in the prml book matlab tools libraries frameworks matlab and simulink services applications list https www mathworks com products html matlab in the cloud https www mathworks com solutions cloud html is a service that allows you to run in cloud environments from mathworks cloud https www mathworks com solutions cloud html browser to public clouds https www mathworks com solutions cloud html public cloud including aws https aws amazon com and azure https azure microsoft com matlab online https matlab mathworks com is a service that allows to users to uilitize matlab and simulink through a web browser such as google chrome simulink https www mathworks com products simulink html is a block diagram environment for model based design it supports simulation automatic code generation and continuous testing of embedded systems simulink online https www mathworks com products simulink online html is a service that provides access to simulink through your web browser matlab drive https www mathworks com products matlab drive html is a service that gives you the ability to store access and work with your files from anywhere matlab parallel server https www mathworks com products matlab parallel server html is a tool that lets you scale matlab programs and simulink simulations to clusters and clouds you can prototype your programs and simulations on the desktop and then run them on clusters and clouds without recoding matlab parallel server supports batch jobs interactive parallel 
computations and distributed computations with large matrices matlab schemer https github com scottclowe matlab schemer is a matlab package makes it easy to change the color scheme theme of the matlab display and gui lrslibrary https github com andrewssobral lrslibrary is a low rank and sparse tools for background modeling and subtraction in videos the library was designed for moving object detection in videos but it can be also used for other computer vision and machine learning problems image processing toolbox https www mathworks com products image html is a tool that provides a comprehensive set of reference standard algorithms and workflow apps for image processing analysis visualization and algorithm development you can perform image segmentation image enhancement noise reduction geometric transformations image registration and 3d image processing computer vision toolbox https www mathworks com products computer vision html is a tool that provides algorithms functions and apps for designing and testing computer vision 3d vision and video processing systems you can perform object detection and tracking as well as feature detection extraction and matching you can automate calibration workflows for single stereo and fisheye cameras for 3d vision the toolbox supports visual and point cloud slam stereo vision structure from motion and point cloud processing statistics and machine learning toolbox https www mathworks com products statistics html is a tool that provides functions and apps to describe analyze and model data you can use descriptive statistics visualizations and clustering for exploratory data analysis fit probability distributions to data generate random numbers for monte carlo simulations and perform hypothesis tests regression and classification algorithms let you draw inferences from data and build predictive models either interactively using the classification and regression learner apps or programmatically using automl lidar toolbox https www mathworks com products lidar html is a tool that provides algorithms functions and apps for designing analyzing and testing lidar processing systems you can perform object detection and tracking semantic segmentation shape fitting lidar registration and obstacle detection lidar toolbox supports lidar camera cross calibration for workflows that combine computer vision and lidar processing mapping toolbox https www mathworks com products mapping html is a tool that provides algorithms and functions for transforming geographic data and creating map displays you can visualize your data in a geographic context build map displays from more than 60 map projections and transform data from a variety of sources into a consistent geographic coordinate system uav toolbox https www mathworks com products uav html is an application that provides tools and reference applications for designing simulating testing and deploying unmanned aerial vehicle uav and drone applications you can design autonomous flight algorithms uav missions and flight controllers the flight log analyzer app lets you interactively analyze 3d flight paths telemetry information and sensor readings from common flight log formats parallel computing toolbox https www mathworks com products matlab parallel server html is a tool that lets you solve computationally and data intensive problems using multicore processors gpus and computer clusters high level constructs such as parallel for loops special array types and parallelized numerical algorithms enable you to parallelize matlab 
applications without cuda or mpi programming the toolbox lets you use parallel enabled functions in matlab and other toolboxes you can use the toolbox with simulink to run multiple simulations of a model in parallel programs and models can run in both interactive and batch modes partial differential equation toolbox https www mathworks com products pde html is a tool that provides functions for solving structural mechanics heat transfer and general partial differential equations pdes using finite element analysis ros toolbox https www mathworks com products ros html is a tool that provides an interface connecting matlab and simulink with the robot operating system ros and ros 2 enabling you to create a network of ros nodes the toolbox includes matlab functions and simulink blocks to import analyze and play back ros data recorded in rosbag files you can also connect to a live ros network to access ros messages robotics toolbox https www mathworks com products robotics html provides a toolbox that brings robotics specific functionality designing simulating and testing manipulators mobile robots and humanoid robots to matlab exploiting the native capabilities of matlab linear algebra portability graphics the toolbox also supports mobile robots with functions for robot motion models bicycle path planning algorithms bug distance transform d prm kinodynamic planning lattice rrt localization ekf particle filter map building ekf and simultaneous localization and mapping ekf and a simulink model a of non holonomic vehicle the toolbox also including a detailed simulink model for a quadrotor flying robot deep learning toolbox https www mathworks com products deep learning html is a tool that provides a framework for designing and implementing deep neural networks with algorithms pretrained models and apps you can use convolutional neural networks convnets cnns and long short term memory lstm networks to perform classification and regression on image time series and text data you can build network architectures such as generative adversarial networks gans and siamese networks using automatic differentiation custom training loops and shared weights with the deep network designer app you can design analyze and train networks graphically it can exchange models with tensorflow and pytorch through the onnx format and import models from tensorflow keras and caffe the toolbox supports transfer learning with darknet 53 resnet 50 nasnet squeezenet and many other pretrained models reinforcement learning toolbox https www mathworks com products reinforcement learning html is a tool that provides an app functions and a simulink block for training policies using reinforcement learning algorithms including dqn ppo sac and ddpg you can use these policies to implement controllers and decision making algorithms for complex applications such as resource allocation robotics and autonomous systems deep learning hdl toolbox https www mathworks com products deep learning hdl html is a tool that provides functions and tools to prototype and implement deep learning networks on fpgas and socs it provides pre built bitstreams for running a variety of deep learning networks on supported xilinx and intel fpga and soc devices profiling and estimation tools let you customize a deep learning network by exploring design performance and resource utilization tradeoffs model predictive control toolbox https www mathworks com products model predictive control html is a tool that provides functions an app and simulink blocks for 
designing and simulating controllers using linear and nonlinear model predictive control mpc the toolbox lets you specify plant and disturbance models horizons constraints and weights by running closed loop simulations you can evaluate controller performance vision hdl toolbox https www mathworks com products vision hdl html is a tool that provides pixel streaming algorithms for the design and implementation of vision systems on fpgas and asics it provides a design framework that supports a diverse set of interface types frame sizes and frame rates the image processing video and computer vision algorithms in the toolbox use an architecture appropriate for hdl implementations soc blockset https www mathworks com products soc html is a tool that provides simulink blocks and visualization tools for modeling simulating and analyzing hardware and software architectures for asics fpgas and systems on a chip soc you can build your system architecture using memory models bus models and i o models and simulate the architecture together with the algorithms wireless hdl toolbox https www mathworks com products wireless hdl html is a tool that provides pre verified hardware ready simulink blocks and subsystems for developing 5g lte and custom ofdm based wireless communication applications it includes reference applications ip blocks and gateways between frame and sample based processing thingspeak https www mathworks com products thingspeak html is an iot analytics service that allows you to aggregate visualize and analyze live data streams in the cloud thingspeak provides instant visualizations of data posted by your devices to thingspeak with the ability to execute matlab code in thingspeak you can perform online analysis and process data as it comes in thingspeak is often used for prototyping and proof of concept iot systems that require analytics sea mat https sea mat github io sea mat is a collaborative effort to organize and distribute matlab tools for the oceanographic community gramm https github com piermorel gramm is a complete data visualization toolbox for matlab it provides an easy to use and high level interface to produce publication quality plots of complex data with varied statistical visualizations gramm is inspired by r s ggplot2 library hctsa https hctsa users gitbook io hctsa manual is a software package for running highly comparative time series analysis using matlab plotly https plot ly matlab is a graphing library for matlab yalmip https yalmip github io is a matlab toolbox for optimization modeling gnu octave https www gnu org software octave is a high level interpreted language primarily intended for numerical computations it provides capabilities for the numerical solution of linear and nonlinear problems and for performing other numerical experiments it also provides extensive graphics capabilities for data visualization and manipulation c c development back to the top https github com mikeroyal nlp guide table of contents p align center img src https user images githubusercontent com 45159366 115297894 961e0d80 a111 11eb 81c3 e2bd2ac9a7cd png br p c c learning resources c https www cplusplus com doc tutorial is a cross platform language that can be used to build high performance applications developed by bjarne stroustrup as an extension to the c language c https www iso org standard 74528 html is a general purpose high level language that was originally developed by dennis m ritchie to develop the unix operating system at bell labs it supports structured programming 
lexical variable scope and recursion with a static type system c also provides constructs that map efficiently to typical machine instructions which makes it one was of the most widely used programming languages today embedded c https en wikipedia org wiki embedded c is a set of language extensions for the c programming language by the c standards committee https isocpp org std the committee to address issues that exist between c extensions for different embedded systems https en wikipedia org wiki embedded system the extensions hep enhance microprocessor features such as fixed point arithmetic multiple distinct memory banks and basic i o operations this makes embedded c the most popular embedded software language in the world c c developer tools from jetbrains https www jetbrains com cpp open source c libraries on cppreference com https en cppreference com w cpp links libs c graphics libraries https cpp libhunt com libs graphics c libraries in matlab https www mathworks com help matlab call cpp library functions html c tools and libraries articles https www cplusplus com articles tools google c style guide https google github io styleguide cppguide html introduction c education course on google developers https developers google com edu c c style guide for fuchsia https fuchsia dev fuchsia src development languages c cpp cpp style c and c coding style guide by opentitan https docs opentitan org doc rm c cpp coding style chromium c style guide https chromium googlesource com chromium src master styleguide c c md c core guidelines https github com isocpp cppcoreguidelines blob master cppcoreguidelines md c style guide for ros http wiki ros org cppstyleguide learn c https www learncpp com learn c an interactive c tutorial https www learn c org c institute https cppinstitute org free c and c courses c online training courses on linkedin learning https www linkedin com learning topics c plus plus c tutorials on w3schools https www w3schools com cpp default asp learn c programming online courses on edx https www edx org learn c programming learn c with online courses on edx https www edx org learn c plus plus learn c on codecademy https www codecademy com learn learn c plus plus coding for everyone c and c course on coursera https www coursera org specializations coding for everyone c for c programmers on coursera https www coursera org learn c plus plus a top c courses on coursera https www coursera org courses query c 20programming c online courses on udemy https www udemy com topic c plus plus top c courses on udemy https www udemy com topic c programming basics of embedded c programming for beginners on udemy https www udemy com course embedded c programming for embedded systems c for programmers course on udacity https www udacity com course c for programmers ud210 c fundamentals course on pluralsight https www pluralsight com courses learn program cplusplus introduction to c on mit free online course materials https ocw mit edu courses electrical engineering and computer science 6 096 introduction to c january iap 2011 introduction to c for programmers harvard https online learning harvard edu course introduction c programmers online c courses harvard university https online learning harvard edu subject c c c tools and frameworks aws sdk for c https aws amazon com sdk for cpp azure sdk for c https github com azure azure sdk for cpp azure sdk for c https github com azure azure sdk for c c client libraries for google cloud services https github com googleapis google cloud cpp visual studio 
https visualstudio microsoft com is an integrated development environment ide from microsoft which is a feature rich application that can be used for many aspects of software development visual studio makes it easy to edit debug build and publish your app by using microsoft software development platforms such as windows api windows forms windows presentation foundation and windows store visual studio code https code visualstudio com is a code editor redefined and optimized for building and debugging modern web and cloud applications vcpkg https github com microsoft vcpkg is a c library manager for windows linux and macos resharper c https www jetbrains com resharper cpp features is a visual studio extension for c developers developed by jetbrains appcode https www jetbrains com objc is constantly monitoring the quality of your code it warns you of errors and smells and suggests quick fixes to resolve them automatically appcode provides lots of code inspections for objective c swift c c and a number of code inspections for other supported languages all code inspections are run on the fly clion https www jetbrains com clion features is a cross platform ide for c and c developers developed by jetbrains code blocks https www codeblocks org is a free c c and fortran ide built to meet the most demanding needs of its users it is designed to be very extensible and fully configurable built around a plugin framework code blocks can be extended with plugins cppsharp https github com mono cppsharp is a tool and set of libraries which facilitates the usage of native c c code with the net ecosystem it consumes c c header and library files and generates the necessary glue code to surface the native api as a managed api such an api can be used to consume an existing native library in your managed code or add managed scripting support to a native codebase conan https conan io is an open source package manager for c development and dependency management into the 21st century and on par with the other development ecosystems high performance computing hpc sdk https developer nvidia com hpc is a comprehensive toolbox for gpu accelerating hpc modeling and simulation applications it includes the c c and fortran compilers libraries and analysis tools necessary for developing hpc applications on the nvidia platform thrust https github com nvidia thrust is a c parallel programming library which resembles the c standard library thrust s high level interface greatly enhances programmer productivity while enabling performance portability between gpus and multicore cpus interoperability with established technologies such as cuda tbb and openmp integrates with existing software boost https www boost org is an educational opportunity focused on cutting edge c boost has been a participant in the annual google summer of code since 2007 in which students develop their skills by working on boost library development automake https www gnu org software automake is a tool for automatically generating makefile in files compliant with the gnu coding standards automake requires the use of gnu autoconf cmake https cmake org is an open source cross platform family of tools designed to build test and package software cmake is used to control the software compilation process using simple platform and compiler independent configuration files and generate native makefiles and workspaces that can be used in the compiler environment of your choice gdb http www gnu org software gdb is a debugger that allows you to see what is going on 
inside another program while it executes or what another program was doing at the moment it crashed gcc https gcc gnu org is a compiler collection that includes front ends for c c objective c fortran ada go and d as well as libraries for these languages gsl https www gnu org software gsl is a numerical library for c and c programmers it is free software under the gnu general public license the library provides a wide range of mathematical routines such as random number generators special functions and least squares fitting there are over 1000 functions in total with an extensive test suite opengl extension wrangler library glew https www opengl org sdk libs glew is a cross platform open source c c extension loading library glew provides efficient run time mechanisms for determining which opengl extensions are supported on the target platform libtool https www gnu org software libtool is a generic library support script that hides the complexity of using shared libraries behind a consistent portable interface to use libtool add the new generic library building commands to your makefile makefile in or makefile am maven https maven apache org is a software project management and comprehension tool based on the concept of a project object model pom maven can manage a project s build reporting and documentation from a central piece of information tau tuning and analysis utilities http www cs uoregon edu research tau home php is capable of gathering performance information through instrumentation of functions methods basic blocks and statements as well as event based sampling all c language features are supported including templates and namespaces clang https clang llvm org is a production quality c objective c c and objective c compiler when targeting x86 32 x86 64 and arm other targets may have caveats but are usually easy to fix clang is used in production to build performance critical software like google chrome or firefox opencv https opencv org is a highly optimized library with focus on real time applications cross platform c python and java interfaces support linux macos windows ios and android libcu https nvidia github io libcudacxx is the nvidia c standard library for your entire system it provides a heterogeneous implementation of the c standard library that can be used in and between cpu and gpu code antlr another tool for language recognition https www antlr org is a powerful parser generator for reading processing executing or translating structured text or binary files it s widely used to build languages tools and frameworks from a grammar antlr generates a parser that can build parse trees and also generates a listener interface that makes it easy to respond to the recognition of phrases of interest oat https oatpp io is a light and powerful c web framework for highly scalable and resource efficient web application it s zero dependency and easy portable javacpp https github com bytedeco javacpp is a program that provides efficient access to native c inside java not unlike the way some c c compilers interact with assembly language cython https cython org is a language that makes writing c extensions for python as easy as python itself cython is based on pyrex but supports more cutting edge functionality and optimizations such as calling c functions and declaring c types on variables and class attributes spdlog https github com gabime spdlog is a very fast header only compiled c logging library infer https fbinfer com is a static analysis tool for java c objective c and c infer is 
written in ocaml https ocaml org python development back to the top https github com mikeroyal nlp guide table of contents p align center img src https user images githubusercontent com 45159366 93133273 ce490380 f68b 11ea 81d0 7f6a3debe6c0 png br p python learning resources python https www python org is an interpreted high level programming language python is used heavily in the fields of data science and machine learning python developer s guide https devguide python org is a comprehensive resource for contributing to python for both new and experienced contributors it is maintained by the same community that maintains python azure functions python developer guide https docs microsoft com en us azure azure functions functions reference python is an introduction to developing azure functions using python the content below assumes that you ve already read the azure functions developers guide https docs microsoft com en us azure azure functions functions reference checkio https checkio org is a programming learning platform and a gamified website that teaches python through solving code challenges and competing for the most elegant and creative solutions python institute https pythoninstitute org pcep certified entry level python programmer certification https pythoninstitute org pcep certification entry level pcap certified associate in python programming certification https pythoninstitute org pcap certification associate pcpp certified professional in python programming 1 certification https pythoninstitute org pcpp certification professional pcpp certified professional in python programming 2 https pythoninstitute org pcpp certification professional mta introduction to programming using python certification https docs microsoft com en us learn certifications mta introduction to programming using python getting started with python in visual studio code https code visualstudio com docs python python tutorial google s python style guide https google github io styleguide pyguide html google s python education class https developers google com edu python real python https realpython com the python open source computer science degree by forrest knight https github com forrestknight open source cs python intro to python for data science https www datacamp com courses intro to python for data science intro to python by w3schools https www w3schools com python python intro asp codecademy s python 3 course https www codecademy com learn learn python 3 learn python with online courses and classes from edx https www edx org learn python python courses online from coursera https www coursera org courses query python python frameworks libraries and tools python package index pypi https pypi org is a repository of software for the python programming language pypi helps you find and install software developed and shared by the python community pycharm https www jetbrains com pycharm is the best ide i ve ever used with pycharm you can access the command line connect to a database create a virtual environment and manage your version control system all in one place saving time by avoiding constantly switching between windows python tools for visual studio ptvs https microsoft github io ptvs is a free open source plugin that turns visual studio into a python ide it supports editing browsing intellisense mixed python c debugging remote linux macos debugging profiling ipython and web development with django and other frameworks django https www djangoproject com is a high level python web framework that 
encourages rapid development and clean pragmatic design flask https flask palletsprojects com is a micro web framework written in python it is classified as a microframework because it does not require particular tools or libraries web2py http web2py com is an open source web application framework written in python allowing allows web developers to program dynamic web content one web2py instance can run multiple web sites using different databases aws chalice https github com aws chalice is a framework for writing serverless apps in python it allows you to quickly create and deploy applications that use aws lambda tornado https www tornadoweb org is a python web framework and asynchronous networking library tornado uses a non blocking network i o which can scale to tens of thousands of open connections httpie https github com httpie httpie is a command line http client that makes cli interaction with web services as easy as possible httpie is designed for testing debugging and generally interacting with apis http servers scrapy https scrapy org is a fast high level web crawling and web scraping framework used to crawl websites and extract structured data from their pages it can be used for a wide range of purposes from data mining to monitoring and automated testing sentry https sentry io is a service that helps you monitor and fix crashes in realtime the server is in python but it contains a full api for sending events from any language in any application pipenv https github com pypa pipenv is a tool that aims to bring the best of all packaging worlds bundler composer npm cargo yarn etc to the python world python fire https github com google python fire is a library for automatically generating command line interfaces clis from absolutely any python object bottle https github com bottlepy bottle is a fast simple and lightweight wsgi https www wsgi org micro web framework for python it is distributed as a single file module and has no dependencies other than the python standard library https docs python org library cherrypy https cherrypy org is a minimalist python object oriented http web framework sanic https github com huge success sanic is a python 3 6 web server and web framework that s written to go fast pyramid https trypyramid com is a small and fast open source python web framework it makes real world web application development and deployment more fun and more productive turbogears https turbogears org is a hybrid web framework able to act both as a full stack framework or as a microframework falcon https falconframework org is a reliable high performance python web framework for building large scale app backends and microservices with support for mongodb pluggable applications and autogenerated admin neural network intelligence nni https github com microsoft nni is an open source automl toolkit for automate machine learning lifecycle including feature engineering https github com microsoft nni blob master docs en us featureengineering overview md neural architecture search https github com microsoft nni blob master docs en us nas overview md model compression https github com microsoft nni blob master docs en us compressor overview md and hyperparameter tuning https github com microsoft nni blob master docs en us tuner builtintuner md dash https plotly com dash is a popular python framework for building ml data science web apps for python r julia and jupyter luigi https github com spotify luigi is a python module that helps you build complex pipelines of batch jobs it handles 
dependency resolution workflow management visualization etc it also comes with hadoop support built in locust https github com locustio locust is an easy to use scriptable and scalable performance testing tool spacy https github com explosion spacy is a library for advanced natural language processing in python and cython numpy https www numpy org is the fundamental package needed for scientific computing with python pillow https python pillow org is a friendly pil python imaging library fork ipython https ipython org is a command shell for interactive computing in multiple programming languages originally developed for the python programming language that offers enhanced introspection rich media additional shell syntax tab completion and rich history graphlab create https turi com is a python library backed by a c engine for quickly building large scale high performance machine learning models pandas https pandas pydata org is a fast powerful and easy to use open source data structrures data analysis and manipulation tool built on top of the python programming language pulp https coin or github io pulp is an linear programming modeler written in python pulp can generate lp files and call on use highly optimized solvers glpk coin clp cbc cplex and gurobi to solve these linear problems matplotlib https matplotlib org is a 2d plotting library for creating static animated and interactive visualizations in python matplotlib produces publication quality figures in a variety of hardcopy formats and interactive environments across platforms scikit learn https scikit learn org stable index html is a simple and efficient tool for data mining and data analysis it is built on numpy scipy and mathplotlib java development back to the top https github com mikeroyal nlp guide table of contents p align center img src https user images githubusercontent com 45159366 93925952 c0b6fd80 fccb 11ea 9f90 21c4148e3c86 png br p java learning resources java https www oracle com java is a popular programming language and development platform jdk it reduces costs shortens development timeframes drives innovation and improves application services with millions of developers running more than 51 billion java virtual machines worldwide the eclipse foundation https www eclipse org downloads is home to a worldwide community of developers the eclipse ide jakarta ee and over 375 open source projects including runtimes tools and frameworks for java and other languages getting started with java https docs oracle com javase tutorial oracle java certifications from oracle university https education oracle com java certification benefits google developers training https developers google com training google developers certification https developers google com certification java tutorial by w3schools https www w3schools com java building your first android app in java codelabs developers google com codelabs build your first android app getting started with java in visual studio code https code visualstudio com docs java java tutorial google java style guide https google github io styleguide javaguide html aosp java code style for contributors https source android com setup contribute code style chromium java style guide https chromium googlesource com chromium src master styleguide java java md get started with or tools for java https developers google com optimization introduction java getting started with java tool installer task for azure pipelines https docs microsoft com en us azure devops pipelines tasks tool java tool 
installer gradle user manual https docs gradle org current userguide userguide html java tools and frameworks java se https www oracle com java technologies javase tools jsp html contains several tools to assist in program development and debugging and in the monitoring and troubleshooting of production applications jdk development tools https docs oracle com javase 7 docs technotes tools includes the java web start tools javaws java troubleshooting profiling monitoring and management tools jcmd jconsole jmc jvisualvm and java web services tools schemagen wsgen wsimport xjc android studio https developer android com studio is the official integrated development environment for google s android operating system built on jetbrains intellij idea software and designed specifically for android development availble on windows macos linux chrome os intellij idea https www jetbrains com idea is an ide for java but it also understands and provides intelligent coding assistance for a large variety of other languages such as kotlin sql jpql html javascript etc even if the language expression is injected into a string literal in your java code netbeans https netbeans org features java index html is an ide provides java developers with all the tools needed to create professional desktop mobile and enterprise applications creating editing and refactoring the ide provides wizards and templates to let you create java ee java se and java me applications java design patterns https github com iluwatar java design patterns is a collection of the best formalized practices a programmer can use to solve common problems when designing an application or system elasticsearch https www elastic co products elasticsearch is a distributed restful search engine built for the cloud written in java rxjava https github com reactivex rxjava is a java vm implementation of reactive extensions http reactivex io a library for composing asynchronous and event based programs by using observable sequences it extends the observer pattern http en wikipedia org wiki observer pattern to support sequences of data events and adds operators that allow you to compose sequences together declaratively while abstracting away concerns about things like low level threading synchronization thread safety and concurrent data structures guava https github com google guava is a set of core java libraries from google that includes new collection types such as multimap and multiset immutable collections a graph library and utilities for concurrency i o hashing caching primitives strings and more it is widely used on most java projects within google and widely used by many other companies as well okhttp https square github io okhttp is a http client for java and kotlin developed by square retrofit https square github io retrofit is a type safe http client for android and java develped by square leakcanary https square github io leakcanary is a memory leak detection library for android develped by square apache spark https spark apache org is a unified analytics engine for large scale data processing it provides high level apis in scala java python and r and an optimized engine that supports general computation graphs for data analysis it also supports a rich set of higher level tools including spark sql for sql and dataframes mllib for machine learning graphx for graph processing and structured streaming for stream processing apache flink https flink apache org is an open source stream processing framework with powerful stream and batch processing 
capabilities with elegant and fluent apis in java and scala fastjson https github com alibaba fastjson wiki is a java library that can be used to convert java objects into their json representation it can also be used to convert a json string to an equivalent java object libgdx https libgdx com is a cross platform java game development framework based on opengl es that works on windows linux mac os x android your webgl enabled browser and ios jenkins https www jenkins io is the leading open source automation server built with java it provides over 1700 plugins https plugins jenkins io to support automating virtually anything so that humans can actually spend their time doing things machines cannot dbeaver https dbeaver io is a free multi platform database tool for developers sql programmers database administrators and analysts supports any database which has jdbc driver which basically means any database ee version also supports non jdbc datasources mongodb cassandra redis dynamodb etc redisson https redisson pro is a redis java client with features of in memory data grid over 50 redis based java objects and services set multimap sortedset map list queue deque semaphore lock atomiclong map reduce publish subscribe bloom filter spring cache tomcat scheduler jcache api hibernate mybatis rpc and local cache graalvm https www graalvm org is a universal virtual machine for running applications written in javascript python ruby r jvm based languages like java scala clojure kotlin and llvm based languages such as c and c gradle https gradle org is a build automation tool for multi language software development from mobile apps to microservices from small startups to big enterprises gradle helps teams build automate and deliver better software faster write in java c python or your language of choice apache groovy http www groovy lang org is a powerful optionally typed and dynamic language with static typing and static compilation capabilities for the java platform aimed at improving developer productivity thanks to a concise familiar and easy to learn syntax it integrates smoothly with any java program and immediately delivers to your application powerful features including scripting capabilities domain specific language authoring runtime and compile time meta programming and functional programming jacoco https www jacoco org jacoco is a free code coverage library for java which has been created by the eclemma team based on the lessons learned from using and integration existing libraries for many years apache jmeter http jmeter apache org is used to test performance both on static and dynamic resources web dynamic applications it also used to simulate a heavy load on a server group of servers network or object to test its strength or to analyze overall performance under different load types junit https junit org is a simple framework to write repeatable tests it is an instance of the xunit architecture for unit testing frameworks mockito https site mockito org is the most popular mocking framework for unit tests written in java spotbugs https spotbugs github io is a program which uses static analysis to look for bugs in java code springboot https spring io projects spring boot is a great tool that helps you to create spring powered production grade applications and services with absolute minimum fuss it takes an opinionated view of the spring platform so that new and existing users can quickly get to the bits they need yourkit https www yourkit com is a technology leader creator of the most 
innovative and intelligent tools for profiling java net applications r development back to the top https github com mikeroyal nlp guide table of contents p align center img src https user images githubusercontent com 45159366 126396985 130c91c7 9db4 4b74 90f8 d11c1876fdd4 png br p r learning resources r https www r project org is an open source software environment for statistical computing and graphics it compiles and runs on a wide variety of platforms such as windows and macos an introduction to r https cran r project org doc manuals r release r intro pdf google s r style guide https google github io styleguide rguide html r developer s guide to azure https docs microsoft com en us azure architecture data guide technology choices r developers guide running r at scale on google compute engine https cloud google com solutions running r at scale running r on aws https aws amazon com blogs big data running r on aws rstudio server pro for aws https aws amazon com marketplace pp rstudio rstudio server pro for aws b06w2g9pry learn r by codecademy https www codecademy com learn learn r learn r programming with online courses and lessons by edx https www edx org learn r programming r language courses by coursera https www coursera org courses query r 20language learn r for data science by udacity https www udacity com course programming for data science nanodegree with r nd118 r tools libraries and frameworks rstudio https rstudio com is an integrated development environment for r and python with a console syntax highlighting editor that supports direct code execution and tools for plotting history debugging and workspace management shiny https shiny rstudio com is a newer package from rstudio that makes it incredibly easy to build interactive web applications with r rmarkdown https rmarkdown rstudio com is a package helps you create dynamic analysis documents that combine code rendered output such as figures and prose rplugin https github com jetbrains rplugin is r language supported plugin for the intellij ide plotly https plotly r com is an r package for creating interactive web graphics via the open source javascript graphing library plotly js https github com plotly plotly js metaflow https metaflow org is a python r library that helps scientists and engineers build and manage real life data science projects metaflow was originally developed at netflix to boost productivity of data scientists who work on a wide variety of projects from classical statistics to state of the art deep learning prophet https facebook github io prophet is a procedure for forecasting time series data based on an additive model where non linear trends are fit with yearly weekly and daily seasonality plus holiday effects it works best with time series that have strong seasonal effects and several seasons of historical data lightgbm https lightgbm readthedocs io is a gradient boosting framework that uses tree based learning algorithms used for ranking classification and many other machine learning tasks dash https plotly com dash is a python framework for building analytical web applications in python r julia and jupyter mlr https mlr mlr org com is machine learning in r ml workspace https github com ml tooling ml workspace is an all in one web based ide specialized for machine learning and data science it is simple to deploy and gets you started within minutes to productively built ml solutions on your own machines ml workspace is the ultimate tool for developers preloaded with a variety of popular data science 
libraries tensorflow pytorch keras and mxnet and dev tools jupyter vs code and tensorboard perfectly configured optimized and integrated catboost https catboost ai is a fast scalable high performance gradient boosting on decision trees library used for ranking classification regression and other machine learning tasks for python r java c supports computation on cpu and gpu plumber https www rplumber io is a tool that allows you to create a web api by merely decorating your existing r source code with special comments drake https docs ropensci org drake is an r focused pipeline toolkit for reproducibility and high performance computing diagrammer https visualizers co diagrammer is a package you can create modify analyze and visualize network graph diagrams the output can be incorporated into r markdown documents integrated with shiny web apps converted to other graph formats or exported as image files knitr https yihui org knitr is a general purpose literate programming engine in r with lightweight api s designed to give users full control of the output without heavy coding work broom https broom tidymodels org is a tool that converts statistical analysis objects from r into tidy format julia development back to the top https github com mikeroyal nlp guide table of contents p align center img src https user images githubusercontent com 45159366 94961900 6e839280 04aa 11eb 84c6 2fb3f83e2b90 png br p julia learning resources julia https julialang org is a high level high performance https julialang org benchmarks dynamic language for technical computing julia programs compile to efficient native code for multiple platforms https julialang org downloads support tiers via llvm juliahub https juliahub com contains over 4 000 julia packages for use by the community julia observer https www juliaobserver com julia manual https docs julialang org en v1 manual getting started julialang essentials https docs julialang org en v1 base base julia style guide https docs julialang org en v1 manual style guide julia by example https juliabyexample helpmanual io julialang gitter https gitter im julialang julia dataframes tutorial using jupyter notebooks https github com bkamins julia dataframes tutorial julia academy https juliaacademy com courses preview logged out julia meetup groups https www meetup com topics julia julia on microsoft azure https juliacomputing com media 2017 02 08 azure html julia tools libraries and frameworks juliapro https juliacomputing com products juliapro html is a free and fast way to setup julia for individual researchers engineers scientists quants traders economists students and others julia developers can build better software quicker and easier while benefiting from julia s unparalleled high performance it includes 2600 open source packages or from a curated list of 250 juliapro packages curated packages are tested documented and supported by julia computing juno https junolab org is a powerful free ide based on atom https atom io for the julia language debugger jl https github com juliadebug debugger jl is the julia debuggin tool profile stdlib https docs julialang org en v1 manual profile is a module provides tools to help developers improve the performance of their code when used it takes measurements on running code and produces output that helps you understand how much time is spent on individual line s revise jl https github com timholy revise jl allows you to modify code and use the changes without restarting julia with revise you can be in the middle of a session 
and then update packages switch git branches and or edit the source code in the editor of your choice any changes will typically be incorporated into the very next command you issue from the repl this can save you the overhead of restarting julia loading packages and waiting for code to jit compile juliagpu https juliagpu org is a github organization created to unify the many packages for programming gpus in julia with its high level syntax and flexible compiler julia is well positioned to productively program hardware accelerators like gpus without sacrificing performance ijulia jl https github com julialang ijulia jl is the julia kernel for jupyter aws jl https github com juliacloud aws jl is a julia interface for amazon web services https aws amazon com cuda jl https juliagpu gitlab io cuda jl is a package for the main programming interface for working with nvidia cuda gpus using julia it features a user friendly array abstraction a compiler for writing cuda kernels in julia and wrappers for various cuda libraries xla jl https github com juliatpu xla jl is a package for compiling julia to xla for tensor processing unit tpu https cloud google com tpu nanosoldier jl https github com juliaci nanosoldier jl is a package for running juliaci services on mit s nanosoldier cluster julia for vscode https www julia vscode org is a powerful extension for the julia language jump jl https jump dev is a domain specific modeling language for mathematical optimization https en wikipedia org wiki mathematical optimization embedded in julia optim jl https github com julianlsolvers optim jl is a univariate and multivariate optimization in julia rcall jl https github com juliainterop rcall jl is a package that allows you to call r functions from julia javacall jl http juliainterop github io javacall jl is a package that allows you to call java functions from julia pycall jl https github com juliapy pycall jl is a package that allows you to call python functions from julia mxnet jl https github com dmlc mxnet jl is the apache mxnet julia package mxnet jl brings flexible and efficient gpu computing and state of art deep learning to julia knet https denizyuret github io knet jl latest is the ko university deep http www ku edu tr en learning framework implemented in julia by deniz yuret https www denizyuret com and collaborators it supports gpu operation and automatic differentiation using dynamic computational graphs for models defined in plain julia distributions jl https github com juliastats distributions jl is a julia package for probability distributions and associated functions dataframes jl http juliadata github io dataframes jl stable is a tool for working with tabular data in julia flux jl https fluxml ai is an elegant approach to machine learning it s a 100 pure julia stack and provides lightweight abstractions on top of julia s native gpu and ad support irtools jl https github com fluxml irtools jl is a simple and flexible ir format expressive enough to work with both lowered and typed julia code as well as external irs cassette jl https github com jrevels cassette jl is a julia package that provides a mechanism for dynamically injecting code transformation passes into julia s just in time jit compilation cycle enabling post hoc analysis and modification of cassette unaware julia programs without requiring manual source annotation or refactoring of the target code contribute x if would you like to contribute to this guide simply make a pull request https github com mikeroyal nlp guide pulls license 
back to the top https github com mikeroyal nlp guide table of contents distributed under the creative commons attribution 4 0 international cc by 4 0 public license https creativecommons org licenses by 4 0
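As a small illustration of how a few of the Python libraries catalogued in the Python section of the guide above (pandas, scikit-learn, matplotlib) fit together, here is a minimal sketch. The tiny text dataset and its labels are invented for the example and are not drawn from any of the listed resources.

```python
# Minimal sketch combining pandas, scikit-learn and matplotlib from the Python tools listed above.
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny illustrative dataset (hypothetical labels, not from any real corpus).
df = pd.DataFrame({
    "text": ["plastic bottle on the beach", "clean sand and clear water",
             "fishing net tangled in rocks", "tourists enjoying the sunset"],
    "label": [1, 0, 1, 0],  # 1 = trash mentioned, 0 = no trash
})

# Bag-of-words features feeding a naive bayes classifier, wrapped in one pipeline.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(df["text"], df["label"])
print(model.predict(["a plastic bag floating near the shore"]))

# Quick look at the label distribution with matplotlib.
df["label"].value_counts().plot(kind="bar")
plt.title("label distribution")
plt.show()
```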
natural-language-processing nlp nlp-machine-learning awesome awesome-list natural-language natural-language-procressing nlp-keywords-extraction nlp-library nlp-parsing nlp-resources speech-processing speech-recognition speech-synthesis speech-enhancement langauge-model gpt-3 machine-translation semantic-search
ai
PlasticNet
plastic net trash detection an ibm space tech team project plasticnet is an ibm tech for good open source project developed by the ibm space tech team to build a repository of ai object detection models to classify types brands of plastics trash on beaches trash in the ocean and more we can scale this effort with the global community of developers participating and contributing towards this noble effort with long term goals to help with ocean cleanup and positively impact climate change for more information on how to get started check out the plasticnet wiki https github com ibm plasticnet wiki goals the goals for our project are listed below as the following real time detection of different types of trash plastic in particular in the ocean utilizing transfer learning on different machine learning object detection architectures in the future we would also like to be able to improve our model to be able to recognize logos brands on trash in order to detect and identify which company different types of ocean beach trash come from to build a fully functional plasticnet machine learning pipeline that can be easily used to train and test object detection models based from architectures such as yolov4 faster rcnn ssd resnet efficient det tensorflow etc all accessible inside a command line client to provide a set of pretrained plasticnet models that can be utilized for future development and improvement via transfer learning implement our models to work on real time satellite and camera footage basic project structure and technologies used feat diagram img img plasticnetprojectarchitecturediagram png the plasticnet command line program combines yolov4 and tensorflow object detection api technologies into a single easily usable machine learning pipeline cli collaborators can use the plasticnet cli to prepare models for training via transfer learning from the provided pre trained plasticnet models train custom detection models built upon pre trained plasticnet models export the trained models and finally test the trained models the cli was created so these steps can all be done with a few simple commands https github com ibm plasticnet wiki utilizing the plasticnet command line client initially trained via transfer learning from pre trained yolo weights https github com mattokc35 darknet pre trained models and pre trained tensorflow models from the tensorflow detection model zoo https github com tensorflow models blob master research object detection g3doc tf2 detection zoo md our official plasticnet model zoo https github com ibm plasticnet blob main modelzoo md can be used by collaborators for the further improvement development of new plasticnet object detection models for labeling images we utilized ibm s cloud annotations https github com ibm plasticnet wiki creating your own dataset for custom training demo of object detection yolov4 9 class v3 plasticnet demo best yolo model with face masks and fishing nets included plasticnet demo https img youtube com vi eqrklsfv8cy 0 jpg https youtu be eqrklsfv8cy yolo plasticnet demo tensorflow efficientdet d1 9 class demo plasticnet demo https img youtube com vi jwky3ooc7rw 0 jpg https youtu be jwky3ooc7rw tensorflow efficientdet d1 plasticnet demo get started to get started with plasticnet you first must clone the repository using the following command sh git clone https github com ibm plasticnet git for more detailed instructions on how to get started check out the plasticnet wiki https github com ibm plasticnet wiki once the repository is cloned you 
can run the following command in the python environment of your choice note it is recommended that this is done in a new python environment to avoid any issues between package dependencies the setup script currently only supports macos but windows and linux support will be added soon cd plasticnet python setup py once the setup script has finished running you should restart your terminal session and run the following command plasticnet this will open the plasticnet terminal so you can easily download our models from our model zoo test the models on videos webcam and images or train on top of an existing model a list of all commands can be found by typing help and more detailed instructions about arguments for any command can be found with help command name to exit the command line type quit using yolo models if you intend to train yolo models you may have to make some changes to the makefile depending on your system the makefile is located in darknet and you will want to update these parameters to whatever your system has 1 meaning enabled gpu 1 cudnn 0 cudnn half 0 opencv 1 avx 0 openmp 0 libso 0 it is highly recommended you install cudnn and opencv for use with darknet as it will expedite the training process you can additionally update the yolo obj cfg file located in darknet cfg with any parameters you choose specifically the number of iterations classes and filters here s a guide for setting these values https github com mattokc35 darknet how to train to detect your custom objects after you have updated this makefile and or the configuration file run make clean and make to be able to start darknet training with the plasticnet cli test results see our spreadsheet documenting our test results from different trained models https docs google com spreadsheets d 1mcfc2hqjohrp2 g723d8 xd19luuukd frroiy5ksoq edit usp sharing resources forked yolov4 plasticnet darknet repository https github com mattokc35 darknet labeling images with ibm cloud annotations https cloud annotations ai darknet yolo https pjreddie com darknet yolo restoring integrity to the oceans rio https www oceansintegrity com pacific whale foundation https www pacificwhale org tensorflow object detection api https github com tensorflow models tree master research object detection yolo with tensorflow https github com theaiguyscode tensorflow yolov4 tflite ocean plastic statistics more than 1 million seabirds and 100 000 marine animals die from plastic pollution every year 100% of baby sea turtles have plastic in their stomachs there are now 5 25 trillion macro and micro pieces of plastic in our ocean 46 000 pieces in every square mile of ocean weighing up to 269 000 tonnes every day around 8 million pieces of plastic make their way into our oceans the great pacific garbage patch is around 1 6 million square kilometers bigger than texas the world produces 381 million tonnes of plastic waste yearly this is set to double by 2034 50% of this is single use plastic and only 9% has ever been recycled over 2 million tonnes of plastic packaging are used in the uk each year 88% of the sea s surface is polluted by plastic waste between 8 to 14 million tonnes enter our ocean every year britain contributes an estimated 1 7 million tonnes of plastic annually the us contributes 38 million tonnes of plastic every year plastic packaging is the biggest culprit resulting in 80 million tonnes of waste yearly from the us alone on uk beaches there are 5000 pieces of plastic and 150 plastic bottles for each mile more than 1 million plastic bags end up in the trash every minute the world uses over 500 billion plastic bags a year that s 150 for each person on earth 8 3 billion plastic straws pollute the world s beaches but only 1% of straws end up as waste in the ocean by 2020 the number of plastics in the sea will be higher than the number of fish 1 in 3 fish caught for human consumption contains plastic plastic microbeads are estimated to be one million times more toxic than the seawater around them products containing microbeads can release 100 000 tiny beads with just one squeeze source condor ferries shocking ocean plastic statistics https www condorferries co uk plastic in the ocean statistics
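The PlasticNet pipeline above is driven through its CLI, but since the TensorFlow-based models come from the TensorFlow Object Detection API, an exported SavedModel can also be scripted against directly. The sketch below is only an illustration under that assumption: the model directory, image file, label map, and the 0.5 score threshold are hypothetical placeholders, not paths or values taken from the PlasticNet repository.

```python
# Hedged sketch: running an exported TensorFlow Object Detection SavedModel on one image.
# MODEL_DIR, CLASS_NAMES and the image path are placeholders, not part of the PlasticNet repo.
import numpy as np
import tensorflow as tf
from PIL import Image

MODEL_DIR = "exported_model/saved_model"                 # hypothetical export location
CLASS_NAMES = {1: "plastic bottle", 2: "plastic bag"}    # hypothetical label map

# The TF2 Object Detection API exports a callable SavedModel.
detect_fn = tf.saved_model.load(MODEL_DIR)

image = np.array(Image.open("beach.jpg").convert("RGB"))
input_tensor = tf.convert_to_tensor(image)[tf.newaxis, ...]  # add batch dimension

detections = detect_fn(input_tensor)
scores = detections["detection_scores"][0].numpy()
classes = detections["detection_classes"][0].numpy().astype(int)
boxes = detections["detection_boxes"][0].numpy()  # [ymin, xmin, ymax, xmax], normalized

for score, cls, box in zip(scores, classes, boxes):
    if score >= 0.5:  # simple confidence threshold for the illustration
        print(f"{CLASS_NAMES.get(cls, cls)}: {score:.2f} at {box}")
```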
ai
Distance_and_TimeDisplay_FreeRTOS_STM32F411
distance and timedisplay freertos stm32f411 this is a project where multitasking is demonstrated with freertos and an stm32f411 eval board summary in this project time in hhmmss format is displayed on an lcd display and the distance of an object from an ultrasonic sensor hc sr04 is displayed in cm the lcd display is controlled via i2c bus in this project i have used a gps module to extract time and an ultrasonic module hc sr04 to measure the distance in cm between the module and the object time and distance are displayed on the lcd display the mcu sends commands to the lcd via i2c bus i have used freertos to manage my 4 tasks task 1 is a dummy task that blinks leds task 2 parses time from a buffer that is filled when a dma transfer complete is done task 3 calculates the distance depending on the pulse width of the echo signal received from the hc sr04 module task 4 is for displaying the parsed time and distance on the lcd images img1 jpg img2 jpg img3 jpg hw used stm32f411ve discovery board ultrasonic sensor hc sr04 sensor module 1604 lcd display via i2c bus waveshare wireless uart gps module neo 7m c usart to usb module for debugging logic analyzer sw toolchain sw4stm32 built by ac6 based on eclipse ide arm gcc compiler st link support saleae logic analyzer gui operating system ubuntu linux source code features freertos port for cortex m4 standard peripheral libraries stdperiph driver working principle distance the hc sr04 module has trigger and echo pins timer 2 tim2 ch3 is configured to send the trigger signal with a specific duty cycle image images image2 triggersignal png depending on the distance of the object from the ultrasonic module a pulse is generated on the echo pin the echo pin is connected to input capture pin timer 1 ch1 and the pulse width of the received pulse in milliseconds is used to determine the distance for more details refer to this pdf https www mouser com ds 2 813 hcsr04 1022824 pdf sound travels at approx 343 meters per second and the echo pulse covers the round trip to the object and back so distance in cm = pulse width in ms / 1000 * 34300 cm per s / 2 which simplifies to distance in cm = pulse width in seconds * 17150 images image1 trig echo png image3 trig echo png image4 4 16cm echo png
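To make the echo-to-distance conversion above concrete, here is a small arithmetic sketch. The actual firmware is C on the STM32F411 (Standard Peripheral Library plus FreeRTOS); this Python snippet only restates the formula, and the 0.243 ms example is roughly the round-trip time that would correspond to the ~4.16 cm echo shown in the last screenshot, not a value copied from the project code.

```python
# Quick arithmetic check of the HC-SR04 distance formula described above.
# The firmware itself is C; this only illustrates the math.

SPEED_OF_SOUND_CM_PER_S = 34300  # approx. 343 m/s

def echo_pulse_to_cm(pulse_width_ms: float) -> float:
    """Convert the measured echo pulse width (milliseconds) to distance in centimeters."""
    pulse_width_s = pulse_width_ms / 1000.0
    # The sound travels to the object and back, so divide the path by 2.
    return pulse_width_s * SPEED_OF_SOUND_CM_PER_S / 2  # == pulse_width_s * 17150

# Example: a ~0.243 ms echo corresponds to roughly 4.17 cm.
print(echo_pulse_to_cm(0.243))
```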
freertos stm32f411 stm32f4-discovery ultrasonic-sensor gps-device multitasking
os
frontend-resources
frontend resources resources https user images githubusercontent com 78463849 173004939 c06b4552 cb3a 48c5 b06a db8e67b850ee png resources for front end developers topics colors colors images images fonts fonts icons icons logos and svg bg logos and svg bg alerts alerts animations animations image sliders image sliders videos and sounds videos and sounds react ui tools packages react ui tools ackages design templates design templates useful tools tools colors colors name explaination colorhunt https colorhunt co get a list of colors for your needed colormind http colormind io colormind is a color scheme generator that uses deep learning it can learn color styles from photographs movies and popular art colorswall https colorswall com palettes most popular palettes color collections coolors https coolors co the super fast color palettes generator brandcolors http brandcolors net the biggest collection of official brand color codes around images images name explaination pexels https www pexels com the best free stock photos royalty free images videos shared by creators skitterphoto https skitterphoto com a place to find show and share public domain photos unsplash https unsplash com the internet s source of freely usable images pixabay https pixabay com over 2 6 million high quality stock images videos and music shared by our talented community picspree https picspree com en discover and download beautiful royalty free images stock photos illustrations and vectors findaphoto https www chamberofcommerce org findaphoto browse through over 1 million high quality stock photos across multiple free and paid stock photo sites from one tab freeimages https www freeimages com free stock photos royalty free images fonts fonts name explaination googlefonts https fonts google com library of around 1000 free licensed font families 1001 freefonts https www 1001freefonts com the ultimate font download download 10 000 fonts for just 19 95 licensed for personal and commercial use fontget https www fontget com free stock photos royalty free images has a variety of fonts available to download and sorted neatly with tags icons icons name explaination flaticon https www flaticon com free icons programing language 3700 programing languages icon and logos bootstrap icons https icons getbootstrap com free high quality open source icon library with over 1 600 icons include them anyway you like svgs svg sprite or web fonts use them with or without bootstrap in any project material design icons https materialdesignicons com material design icons growing icon collection allows designers and developers targeting various platforms to download icons in the format color and size they need for any project css icons https cssicon space icon set made with pure css code no dependencies grab and go icons free icons https icon icons com best icons for personal and commercial use svg png logos and svg bg logos and svg backgrounds logos name explaination svgporn https svgporn com 1000 high quality svg logos logosearch https logosear ch search html search engine with over 200 000 svg logos flaticon https www flaticon com free icons programing language 3700 programing languages icons and logos svg backgrounds name explaination getwaves https getwaves io customizable backgrounds images illustrationkit https illustrationkit com free vector illustrations for personal commercial projects designe freepik https www freepik com vectors illustrations find and download the best high quality photos designs and mockups drawkit https drawkit com 
hand drawn vector illustration and icon resources perfect for your next project blobmaker https www blobmaker app blobmaker is a free generative design tool create random unique and organic looking svg shapes alerts alerts name explanation sweetalert https sweetalert js org sweetalert makes popup messages easy and pretty sweetalert 2 https sweetalert2 github io a beautiful responsive customizable accessible wai aria replacement for javascript s popup boxes animations animations css name explanation animista https animista net css animation smooth text box modal hover animation loaders https cssloaders github io this is a library with a collection of different types of css loaders and spinners cssloaders https cssloaders github io awesome collection of beautiful loading spinners animate https animate style animate css is a library of ready to use cross browser animations for use in your web projects great for emphasis home pages sliders and attention guiding hints hover css http ianlunn github io hover a collection of css3 powered hover effects to be applied to links buttons logos svg featured images and so on tholman https tholman com obnoxious animations for the strong of heart and weak of mind motion ui https zurb com playground motion ui a sass library for creating flexible css transitions and animations javascript name explanation animation on scroll https michalsnik github io aos animate on scroll library to reveal animations when you scroll wow js https wowjs uk reveal animations when you scroll greensock https greensock com a robust javascript animation library for the modern web particlesjs https vincentgarreau com particles js a lightweight javascript library for creating particles simpleparallax https simpleparallax com examples the easiest way to get a parallax scroll effect with javascript image sliders image sliders name explanation swiperjs https swiperjs com the most modern website and mobile touch slider splidejs https splidejs com splide is a lightweight flexible and accessible slider carousel written in typescript no dependencies no lighthouse errors sequencejs https www sequencejs com the responsive css animation framework for creating sliders presentations banners and other step based applications videos and sounds videos and sounds name explanation pexels https www pexels com videos the best free stock videos shared by the pexels community pixabay https pixabay com videos stunning free stock video footage clips videvo https www videvo net stock video footage free and premium awesome stock videos free stock music https www free stock music com royalty free stock music for attractive videos youtube videos podcasts etc bensound https www bensound com royalty free music for creators react ui tools packages react ui tools packages name explanation smooth scroll https www npmjs com package react use smooth scroll react provider component to add a smooth scroll effect react gsap https www npmjs com package react gsap react gsap lets you use the greensock animation platform gsap in react in a fully declarative way react spinners https www npmjs com package react spinners some awesome loading effects react preloaders https www npmjs com package react preloaders package for adding preloaders to your react app react type animation https www npmjs com package react type animation a customizable react typing animation component react animated text https www npmjs com package react animated text
content a component to animate your text in react use a predefined animation type or compose your own custom cursor react https www npmjs com package custom cursor react animated customizable and interactive cursor for react atroposjs https atroposjs com javascript library to create stunning touch friendly 3d parallax hover effects mouse image move https www npmjs com package react mouse image move react js package for smoothly moving an image with mouse movement design templates design templates name explanation ohio clbthemes https ohio clbthemes com a website with best in class features and design templates envato elements https elements envato com unlimited downloads of 60 million creative assets and templates templatemonster https www templatemonster com aff tm gclid cj0kcqjw8o vbhcparisacmvvlphkae4phr28vargdivsjqgacu fewongh nffdoftb bohwxazlvsaahrjealw wcb the collection of items includes a wide choice of website templates suitable for all kinds of niche specific projects siteinspire https www siteinspire com siteinspire is a showcase of the finest web and interactive design webdesign inspiration https www webdesign inspiration com this is a web design inspiration gallery and the best website design ideas awwwards https www awwwards com the awards of design creativity and innovation on the internet screenlane https screenlane com the latest mobile ui design inspiration in your inbox every week for free dribbble https dribbble com explore the world s leading design portfolios free design resources https freedesignresources net crafted with love from amazing artists and professional designers around the world ranging from fonts mockups graphics templates and more tools tools name explanation tinypng https tinypng com smart webp png and jpeg compression i love pdf https www ilovepdf com every tool you need to work with pdfs in one place box shadows https getcssscan com css box shadow examples beautiful css box shadow examples colors from image https html color codes info colors from image get color codes by uploading a file from your computer or inserting a link figma https www figma com graphic design tool all the elements you need to create amazing logos social media graphics presentations and more for free canva https www canva com online design tools removebg https www remove bg remove and change the background of an image online unscreen https www unscreen com remove video backgrounds 100 automatically and free glass ui https ui glass generator glass ui is a free and open source css library based on the glassmorphism design specifications mobile friendly test https search google com test mobile friendly test how easily a visitor can use your page on a mobile device
design frontend-resources resources
front_end
blockchain-voting_2019
blockchain voting
blockchain
deep-vision
deep learning in computer vision deep vision pytorch https github com pytorch pytorch tensorflow https github com tensorflow tensorflow implementations of classic deep neural networks and training scripts for computer vision tasks this is meant to ease the learning curve for new dl practitioners by following two principles 1 keep the coding style consistent across all networks 2 focus on code readability and avoid obscure tricks if you think my work is helpful please star this repo if you have any questions regarding the code feel free to create an issue the directory is categorized by model architecture then further by framework some pretrained models jupyter notebook visualization scripts and training logs are also provided for your reference image classification alexnet pytorch alexnet v1 alexnet v2 tensorflow alexnet v2 vgg pytorch vgg 16 19 inception googlenet pytorch inception v1 inception v3 resnet pytorch resnet 34 50 152 v1 tensorflow resnet 50 152 v1 resnet 50 v2 mobilenet pytorch mobilenet v1 1 0 lenet pytorch lenet 5 tensorflow lenet 5 object detection yolo tensorflow yolo v3 generative adversarial network dcgan tensorflow cyclegan tensorflow pose estimation stacked hourglass tensorflow hourglass 104 disclaimer this repo is mainly for study purposes hence i write the code in a readable and understandable way but it may not be scalable and reusable i ve also added comments and references for those catches i ran into during replication i m not a researcher so i don t have that much time to tune the training and achieve the best benchmarks if you are looking for pre trained models for transfer learning there are some good ones from pytorch torchvision https pytorch org docs stable torchvision models html or tensorflow slim https github com tensorflow models tree master research slim acknowledgement without the following resources i wouldn t be able to finish this project deep learning specialization https www deeplearning ai deep learning specialization by deeplearning ai and coursera computer vision nanodegree https www udacity com course computer vision nanodegree nd891 by udacity hands on machine learning with scikit learn and tensorflow https www amazon com hands machine learning scikit learn tensorflow dp 1491962291 keywords hands on machine learning qid 1547709501 s books sr 1 3 ref sr 1 3 by aurélien géron
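since the disclaimer points readers to torchvision for transfer learning, here is a minimal sketch of that workflow; it is not part of this repo, and the 10-class output head and 224x224 input below are illustrative assumptions only.

```python
# Minimal transfer-learning sketch with torchvision (not part of the deep-vision repo).
# The 10-class head and the 224x224 input size are illustrative assumptions.
import torch
import torchvision.models as models

model = models.resnet50(pretrained=True)               # downloads ImageNet weights on first use
model.fc = torch.nn.Linear(model.fc.in_features, 10)   # swap the classifier head for 10 classes
model.eval()

x = torch.randn(1, 3, 224, 224)                        # dummy batch to sanity-check shapes
with torch.no_grad():
    logits = model(x)
print(logits.shape)                                    # torch.Size([1, 10])
```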
ai
chatbot-api
chatbot api see examples ipynb examples ipynb for request examples now support llama https huggingface co decapoda research llama 7b hf cfgs llama 7b json llama https huggingface co decapoda research llama 7b hf with lora https huggingface co tloen alpaca lora 7b cfgs llama 7b lora json chatglm https huggingface co thudm chatglm 6b cfgs chatglm 6b json instructglm https github com yanqiangmiffy instructglm cfgs chatglm 6b alpaca lora json blip2chatglm https huggingface co xipotzzz blip2zh chatglm 6b cfgs blip2zh chatglm 6b json setup conda create n llmapi python 3 8 conda activate llmapi conda install pytorch torchvision torchaudio pytorch cuda 11 7 c pytorch c nvidia pip install r requirements txt run uvicorn src app reload chatbot api supports model scheduling 1 idle model instances will be closed 2 new model instances will be created if too many concurrent requests you can modify sched config json sched config json to change the scheduling strategy and model instances a typical config is json idle check period 120 check idle models and close them every 120 seconds models blip2zh chatglm 6b modelname should be the same as the config filename under cfgs max instances 1 at most 1 instance will be created idle time 3600 if no request for 1 hours the instance will be closed create threshold if 5 requests request blip2zh chatglm 6b in 5 seconds n requests 5 1 more instance will be created not exceeding max instances delay 5 format request format json model chatglm 6b messages role user content hello stream true max tokens 1024 response format a typical response json choices index 0 message role assistant content hello how can i help you today you may refer to examples ipynb examples ipynb for more examples
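to make the request and response formats above concrete, here is a rough python client sketch; the endpoint path and port are assumptions for illustration, since the actual route is documented in the repo's examples ipynb.

```python
# Rough client sketch for the request/response format shown above.
# The URL path and port are assumptions; the real route is shown in examples.ipynb.
import requests

payload = {
    "model": "chatglm-6b",
    "messages": [{"role": "user", "content": "hello"}],
    "stream": False,       # set True to receive a streamed response instead
    "max_tokens": 1024,
}

resp = requests.post("http://localhost:8000/chat", json=payload, timeout=120)
resp.raise_for_status()
data = resp.json()

# a non-streaming response carries the reply under choices[0].message.content
print(data["choices"][0]["message"]["content"])
```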
ai
Front-End-Web-UI-Frameworks-and-Tools-Bootstrap-4
front end web ui frameworks and tools bootstrap 4 front end web ui frameworks and tools bootstrap 4
front_end
Python-Machine-Learning-Blueprints
python machine learning blueprints this is the code repository for python machine learning blueprints https www packtpub com big data and business intelligence python machine learning blueprints utm source github utm medium repository utm campaign 9781784394752 published by packt it contains all the supporting project files necessary to work through the book from start to finish instructions and navigation all of the code is organized into folders each folder starts with a number followed by the application name for example chapter02 to install the software for code testing kindly refer to the software and hardware requirement section to run the code given in the code bundle follow the steps instructions given in the book software and hardware requirement software 1 all example anaconda python 3 5 free https www continuum io downloads 2 chapter 3 phantomjs free http phantomjs org download html hardware 1 350mb of disk space 2 16 23mb of disk space os mac os x windows or linux related python books and videos designing machine learning systems with python https www packtpub com big data and business intelligence designing machine learning systems python utm source github utm medium repository utm campaign 9781785882951 functional python programming https www packtpub com application development functional python programming utm source github utm medium repository utm campaign 9781784396992 beginning python video https www packtpub com application development beginning python video utm source github utm medium repository utm campaign 9781786468994
ai
PROJECT-PSIT
project problem solving in information technology bitcoin price over the years 2017 2018 explain may 2017 may 2018 python csv ranking bitcoin name bitcoin symbol bitcoin market capital price transaction volume cryptocurrencies bitcoin analysis website http www it kmitl ac th it61070245 project index html youtube https www youtube com watch v 69rariqnd1s t fbclid iwar1umvwvvkb13asgkazsz4v8gu4xbr2zs66qxlltdhfoxeqjerdfdkrv5i4 built with python 3 7 0 pygal 2 0 0 members 61070060 github denpoom https github com denpoom 61070203 github warrawat203 https github com warrawat203 61070238 github sahussawud https github com sahussawud 61070245 github toeiisk https github com toeiisk
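since the project says it is built with python 3 7 and pygal 2 0 and works from csv data, a small pygal sketch of the kind of price chart it describes may help; the csv filename and column names below are assumptions for illustration, not taken from the project.

```python
# Illustrative pygal sketch of the kind of price chart the project describes.
# The CSV filename and column names are assumptions, not the project's actual data layout.
import csv
import pygal

dates, prices = [], []
with open("bitcoin_price.csv", newline="") as f:
    for row in csv.DictReader(f):
        dates.append(row["Date"])
        prices.append(float(row["Close"]))

chart = pygal.Line(x_label_rotation=45, show_minor_x_labels=False)
chart.title = "bitcoin price may 2017 - may 2018"
chart.x_labels = dates
chart.add("BTC/USD", prices)
chart.render_to_file("bitcoin_price.svg")   # writes an SVG chart to disk
```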
server
DisableAlarmProject
disablealarmproject project for embedded system design course hardware design is inside kicad final version folder software description is included in the readme file inside module radio module encryption folder contains the encrypted files of the project
os
PHP-RFID-Attendance-logging-system
php nodemcu rfid attendance logging system the system is based on a website created with php html css bootstrap javascript jquery a nodemcu and an arduino uno board for the hardware device and a mysql database the hardware device will read rfid cards store their information and send it to the website which will process the data and then modify it store it in the sql database then send a response to the device so it can print a message on an lcd screen and also write the result on a micro sd card the website allows the administrator to register delete and even edit users and their registered rfid cards the system displays the current registered attendance logged on the final page in a table obtained by inner joining the users and the logs table using mysql queries this final table can also be exported as an excel file h2 project diagram h2 img src preview bloc jpg h2 website diagram h2 img src preview site jpg h2 user data h2 img src preview studenti jpg h2 user data edit page h2 img src preview edit jpg h2 user data delete page h2 img src preview delete jpg h2 registration h2 img src preview inregistrare jpg h2 read tag h2 img src preview date cartela jpg img src preview date jpg h2 attendance logs h2 img src preview prezenta jpg h2 attendance logs exported as an excel file h2 img src preview excel jpg h2 the logic behind the hardware device h2 img src preview hardware flowchart jpg h2 the hardware device h2 img src preview device outside jpg img src preview device inside jpg
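the attendance table described above comes from inner joining the users and logs tables; a rough sketch of that query is shown below, written in python for illustration even though the site itself is php, with table and column names assumed since the project's actual schema is not shown here.

```python
# Rough sketch of the users/logs inner join behind the attendance page.
# Written in Python for illustration (the site itself is PHP); table and
# column names are assumptions, not the project's actual schema.
import mysql.connector  # pip install mysql-connector-python

conn = mysql.connector.connect(host="localhost", user="admin",
                               password="secret", database="attendance")
cur = conn.cursor(dictionary=True)
cur.execute("""
    SELECT u.name, u.rfid_uid, l.logged_at
    FROM logs AS l
    INNER JOIN users AS u ON u.rfid_uid = l.rfid_uid
    ORDER BY l.logged_at DESC
""")
for row in cur.fetchall():
    print(row["name"], row["rfid_uid"], row["logged_at"])
cur.close()
conn.close()
```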
server
pororo
pororo platform of neural models for natural language processing p align center a href https github com kakaobrain pororo releases img alt github release src https img shields io github release kakaobrain pororo svg a a href https github com kakaobrain pororo blob master license img alt apache 2 0 src https img shields io badge license apache 202 0 blue svg a a href https kakaobrain github io pororo img alt docs src https img shields io badge docs passing success svg a a href https github com kakaobrain pororo issues img alt issues src https img shields io github issues kakaobrain pororo a p br assets usage gif pororo performs natural language processing and speech related tasks it is easy to solve various subtasks in the natural language and speech processing field by simply passing the task name br installation pororo is based on torch 1 6 cuda 10 1 and python 3 6 you can install a package through the command below console pip install pororo or you can install it locally console git clone https github com kakaobrain pororo git cd pororo pip install e for library installation for specific tasks other than the common modules please refer to install md install md for the utilization of automatic speech recognition wav2letter https github com facebookresearch wav2letter should be installed separately for the installation please run the asr install sh asr install sh console bash asr install sh for the utilization of speech synthesis please run the tts install sh tts install sh console bash tts install sh speech synthesis samples can be found here https pororo tts github io br usage pororo can be used as follows first in order to import pororo you must execute the following snippet python from pororo import pororo after the import you can check the tasks currently supported by the pororo through the following commands python from pororo import pororo pororo available tasks available tasks are mrc rc qa question answering machine reading comprehension reading comprehension sentiment sentiment analysis nli natural language inference inference fill fill in blank fib para pi cse contextual subword embedding similarity sts semantic textual similarity sentence similarity sentvec sentence embedding sentence vector se inflection morphological inflection g2p grapheme to phoneme grapheme to phoneme conversion w2v wordvec word2vec word vector word embedding tokenize tokenise tokenization tokenisation tok segmentation seg mt machine translation translation pos tag pos tagging tagging const constituency constituency parsing cp pg collocation collocate col word translation wt summarization summarisation text summarization text summarisation summary gec review review scoring lemmatization lemmatisation lemma ner named entity recognition entity recognition zero topic dp dep parse caption captioning asr speech recognition st speech translation ocr srl semantic role labeling p2g aes essay qg question generation age suitability to check which models are supported by each task you can go through the following process python from pororo import pororo pororo available models collocation available models for collocation are lang ko model kollocate lang en model collocate en lang ja model collocate ja lang zh model collocate zh if you want to perform a specific task you can put the task name in the task argument and the language name in the lang argument python from pororo import pororo ner pororo task ner lang en after object construction it can be used in a way that passes the input value as follows python ner 
michael jeffrey jordan born february 17 1963 is an american businessman and former professional basketball player michael jeffrey jordan person o born o february 17 1963 date is o an o american norp businessman o and o former o professional o basketball o player o o if task supports multiple languages you can change the lang argument to take advantage of models trained in different languages python ner pororo task ner lang ko ner michael jeffrey jordan 1963 2 17 person o civilization o o michael jeffrey jordan person o o 1963 2 17 date o o o location o o o o civilization o ner pororo task ner lang ja ner person o o o o o o ner pororo task ner lang zh ner nba person o gpe o o nba org o o o o o o o o o o o o o o o o o org o o o o o o o o if the task supports multiple models you can change the model argument to use another model python from pororo import pororo mt pororo task mt lang multi model transformer large multi mtpg fast mt pororo task mt lang multi model transformer large multi fast mtpg br documentation for more detailed information see full documentation https kakaobrain github io pororo if you have any questions or requests please report the issue https github com kakaobrain pororo issues br citation if you apply this library to any project and research please cite our code misc pororo author heo hoon and ko hyunwoong and kim soohwan and han gunsoo and park jiwoo and park kyubyong title pororo platform of neural models for natural language processing howpublished url https github com kakaobrain pororo year 2021 br contributors hoon heo https github com huffon hyunwoong ko https github com hyunwoongko soohwan kim https github com sooftware gunsoo han https github com robinsongh381 jiwoo park https github com bernardscumm and kyubyong park https github com kyubyong br license pororo project is licensed under the terms of the apache license 2 0 copyright 2021 kakao brain corp https www kakaobrain com all rights reserved
deep-learning natural-language-processing automatic-speech-recognition speech-synthesis neural-models
ai
Upgrade-your-brain
upgrade your brain the definitive list of newsletters to keep up to date on various web development technologies language agnostic the daily nerd http dailynerd nl web design weekly http web design weekly com sidebar http sidebar io devops weekly http devopsweekly com web tools weekly http webtoolsweekly com code project http www codeproject com script mailouts archive aspx ewebdesign http ewebdesign com newsletter the smashing newsletter http www smashingmagazine com the smashing newsletter webpronews http www webpronews com signup hacker newsletter http www hackernewsletter com alertbox http www nngroup com articles subscribe webdesigner depot http www webdesignerdepot com newsletter sitepoint http www sitepoint com newsletter developer shed http www developershed com newsletter php the changelog weekly http thechangelog com weekly startup edition http startupedition com the modern web observer http appendto com modern web observer monocle http monocle io web dev weekly http www webdevweekly com udgwebdev weekly http weekly udgwebdev com mobile web weekly http mobilewebweekly co db weekly http dbweekly com versioning http tinyletter com versioning maquin rio http maquinario co dicas de front end http dicasdefrontend com br web dev break http www webdevbreak com weekly screencast for web developers web development reading list http wdrl info web operations weekly http webopsweekly com web dev resources from tutorialzine http tutorialzine com webdev newsletter accessibility weekly http a11yweekly com frontend weekly http frontendweekly co code with hugo https buttondown email hugo tldr https tldr tech weekly webtips https www webtips dev tech productivity https techproductivity co webdev with stefan https www stefanjudis com newsletter the tech caffeine https tanmaydeshpande substack com javascript general javascript weekly http javascriptweekly com a drip of javascript http designpepper com a drip of javascript braziljs http braziljs org impjs https github com impjs impjs semanal javascript jabber http javascriptjabber com web components weekly http webcomponentsweekly me js tips http www jstips co pony foo weekly https ponyfoo com weekly ecmascript daily https ecmascript daily github io hashbang weekly http hashbangweekly okgrow com es next news http esnextnews com dev weekly https mailchi mp f59beeac6b9b devupdates javascript kicks https javascriptkicks com awesome javascript https js libhunt com newsletter bytes https bytes dev browsertech digest https digest browsertech com nodejs node weekly http nodeweekly com adventures in nodeland by matteo collina https www getrevue co profile matteocollina angularjs adventures in angular http devchat tv adventures in angular podcast angularjs daily http www angularjsdaily com ng newsletter http www ng newsletter com angularjs tutorial http www thinkster io angulartutorial a tiny piece of angularjs http nardi us1 list manage com subscribe u ea0b2abd8b92ac44c600a908c id cd39a9459b angular typescript newsletter http facebook us15 list manage1 com subscribe u 9e63a5f6d03e3654b1d10a323 id 8c7f0ee4e1 ember ember weekly http emberweekly com backbone backbone js weekly http backboneweekly com meteor meteor weekly http meteorhacks com meteor weekly react react js newsletter http reactjsnewsletter com rally coding http rallycoding com react status http react statuscode com this week in react https thisweekinreact com react digest https reactdigest net enterprise react newsletter https www digitalprimates net enterprise react the enterprise react 
newsletter react native awesome react native https mailchi mp 10df853d5918 awesome react native weekly let s react http newsletter letsreact io react native newsletter http reactnative cc react native news https reactnativenews curated co task runner gruntjs weekly https twitter com gruntweekly vue vue js news https news vuejs org screencasts advanced javascript screencasts http javascriptcasts io html5 general html5 weekly http html5weekly com games gamedev js http weekly gamedevjs com css css weekly http css weekly com sassnews http sassnews us7 list manage com subscribe u b4a4054cce715a3b0ae5e7d35 id f7c505323d tailwind weekly https tailwindweekly com svg svg weekly https svgweekly com responsive design responsive design weekly http responsivedesignweekly com responsive web design http responsivewebdesign com newsletter ux ux weekly http uxwkly com mailchimpux http www theuxnewsletter com php general php weekly news http www phpweekly com php weekly http phpweekly info web php magazine http webandphp com a semana php http www asemanaphp com br wordpress wordpress answers http wordpress stackexchange com wpmail me http wpmail me wplift http wplift com wpmunews http premium wpmudev org laravel laravel weekly http laravelweekly com go go newsletter http www golangweekly com a semana go https www getrevue co profile a semana go gonot cias https gonoticias substack com golang weekly https golangweekly com perl perl weekly http perlweekly com python python weekly http www pythonweekly com pycoder s weekly http pycoders com django weekly http djangoweek ly django news https django news com import python weekly http importpython com newsletter ruby ruby weekly http rubyweekly com ruby daily http paper rubydaily org green ruby http greenruby org dart dart weekly http dartweekly com postgres postgres weekly http postgresweekly com android android weekly http androidweekly net vim a tiny piece of vim http vimtips nardi me vim weekly http www vimweekly com sql nosql document databases graph databases nosql weekly http www nosqlweekly com db weekly http dbweekly com mongodb weekly https mongodb email elixir elixirweekly https elixirweekly net elixir digest https elixirdigest net data elixir https dataelixir com
front_end
awesome-blockchain
awesome blockchain curated list of the bitcoin blockchain services list of content proof of existence proof of existence timestamp software timestamp software document signing document signing ssl certificates ssl certificates assets assets private blockchain private blockchain storage storage authentication authentication social communication social communication dns dns marketplace marketplace energy market energy internet of things iot lightning network ln proof of existence free eternitywall http eternitywall it messages lasting forever virtual notary http virtual notary org a free and secure electronic attestation service acronis notary http www acronis com en us business blockchain notary only on private ethereum blockchain blockchainproof me https blockchainproof me free proof of existence of any file oringstamp http www originstamp org registration wall bitproof io https bitproof io tierion https tierion com tierion is an engine for collecting data and recording it in the blockchain stampery https stampery co pay wall everlastly https everlastly com factom http www factom org proof of existence http proofofexistence com stampd http stampd io woleet https woleet io veriphant https veriphant com high level application services in this section do not allow to notarize a single file but provide high level application ascribe http ascribe io blocknotary http www blocknotary com colu https www colu co guardtime https guardtime com a blockchain approach to cybersecurity that ensures the integrity of systems networks and data at industrial scale signatura https signatura co the world s most secure and resilient digital signature platform assets coinspark http coinspark org upgrade your bitcoin with messaging and assets coinprism https www coinprism com use the bitcoin blockchain with any kind of asset colu co http colu co creating storing and managing digital assets on top of the bitcoin blockchain chronicled http www chronicled com index html collect and trade 100 authentic sneakers everledger http www everledger io an online reputation system for diamonds blockverify io http blockverify io blockchain based anti counterfeit solution blockai com openpublish https github com blockai openpublish a publishing protocol for registering media as a digital asset on the bitcoin blockchain blockcypher assets api http dev blockcypher com asset api assets api using openassets protocol online identity passcard https passcard info onename https onename com shocard http www shocard com blockchainme http blockchainme com timestamp software opentimestamps https github com opentimestamps opentimestamps server scalable trustless distributed timestamping with bitcoin 21 co bitcoin notary public https 21 co learn bitcoin notary public next steps document signing blocksign https blocksign com ssl certificates revoke ssl https github com christophera revocable self signed tls certificates hack analytics numisigh http numisight com numisight gives you the tools you need to view the forest the trees and all the levels of detail in between blockseer https www blockseer com bitcoin blockchain analytics other tradle http tradle io extending the bitcoin blockchain to non financial applications private blockchain multichain http www multichain com open platform for building blockchains openchain http openchain org blockchain technology for the enterprise eris industries https erisindustries com eris is free software that allows anyone to build their own secure low cost run anywhere data infrastructure using blockchain 
and smart contract technology linux foundation blockchain https blockchain linuxfoundation org backed by ibm applied blockchain http appliedblockchain com consulting company openblockchain https github com openblockchain blockchain fabric code storage sia http sia tech enterprise grade collaborative cloud for data storage storj io http storj io decentralized cloud storage storj is based on blockchain technology and peer to peer protocols to provide the most secure private and encrypted cloud storage maidsafe http maidsafe net safe secure access for everyone network a new secure way to access a world of existing apps where the security of your data is put above all else ipfs https ipfs io the interplanetary file system ipfs is a new hypermedia distribution protocol addressed by content and identities ipfs enables the creation of completely distributed applications it aims to make the web faster safer and more open lbry http lbry io lbry is a decentralized censorship resistant open source peer to peer information marketplace and discovery protocol filecoin http filecoin io filecoin is a data storage network and electronic currency based on bitcoin authentication bitid https github com bitid bitid the connect with bitcoin open protocol bitauth https github com bitpay bitauth authenticate with web services utilizing the same strategy as bitcoin social communication getgems http getgems org getgems is a free social messaging app that rewards you for your activity twister http twister net co fully decentralized p2p microblogging platform dns blockstack https blockstack org decentralized dns for blockchain applications marketplace rein http reinproject org decentralized professional services market ribbit me http ribbit me loyalty solution based on blockchain energy gridsingularity http gridsingularity com decentralized energy data exchange platform solarcoin http solarcoin org a global rewards program for solar electricity generation a name iot a internet of things 21 https 21 co micropayments for http api over bitcoin payment channels and library for iot applications iota http www iotatoken com decentralized internet of things token filament http filament com software and hardware for decentralized intranet of things systems slock it https slock it ethereum based platform for building shared things machinomy http machinomy com distributed platform for iot micropayments a name ln a lightning network specifications whitepaper https lightning network lightning network paper pdf descriptions of ln principles bolt https github com lightningnetwork lightning rfc blob master 00 introduction md protocol description implementations eclair https github com acinq eclair scala acinq company lightning https github com elementsproject lightning c implementation done by blockstream bitcoin core developers mainly thunder https github com blockchain thunder java blockchain info implementation lit https github com mit dci lit golang the mit digital currency initiative lnd https github com lightningnetwork lnd golang
blockchain
Industrial-Connectivity-Kit
industrial connectivity kit repo concerned with the industrial connectivity kit that is a comprehensive collection of ethernet and can modules designed specifically for embedded systems and industrial applications
os
django-tutorial
django tutorial this repository contains the code for the web development with python and django tutorial session run by mike pirnat mpirnat and david stanek dstanek at codemash 2013 and 2014 this is not the codemash 2015 version v2 in this tutorial we ll build a full featured website step by step using the django web framework getting started if you re attending the tutorial in person please make sure you install these prerequisites before the class begins so that we can make the most of our time together in the session you and your fellow attendees will thank you for your preparedness you ll need to install 1 python python django is written in the python programming language you ll need to install python in order to make anything work you should install either python 2 7 or python 3 3 2 git git you will need the git version control system in order to work with the exercises in this repository if you re new to git don t panic we won t be doing anything too weird and we ll walk through all of it in the session 3 pip pip pip is a tool for installing python packages you will need it to install the python dependencies for this tutorial 4 virtualenv virtualenv virutalenv is a tool for creating isolated python environments on your system this allows you to work on multiple projects that might have conflicts in the versions of libraries they depend on it also keeps your base system installation of python nice and clean setting your path windows if you re on windows we recommend following these instructions python windows to get python pip and virtualenv going be sure to update your path this varies a bit between different versions of windows windows path so use the method that s right for your os if you installed python 2 7 add c python27 c python27 scripts if you installed python 3 3 add c python33 c python33 scripts setting up the project once you have installed these basics let s get the working environment set up for the project time to open up a command line terminal in mac os x good ol cmd in windows 1 create a new virtual environment virtualenv and activate it on linux or mac os x virtualenv django precompiler cd django precompiler source bin activate on windows virtualenv django precompiler cd django precompiler scripts activate bat 2 clone this repository in the django precompiler directory from the previous step git clone https github com mpirnat django tutorial git src 3 install django and any other python dependencies in the django precompiler directory from the previous step cd src pip install r requirements txt 4 check to make sure everything s in good shape in the src directory from the previous step python prerequisites py on windows that looks like python exe prerequisites py 5 rewind the repository to the start of our exercises in the src directory from the previous step git reset hard ex00 you should now be ready for the tutorial slides the current slides that accompany this tutorial are available for viewing and downloading slides help if you need help getting set up please contact mike pirnat mpirnat gmail com and david stanek dstanek dstanek com please make sure to copy both of us so that we can make sure you get the best answer as soon as possible other versions looking for a different version version 2 v2 as presented at codemash 2015 credits this tutorial was created by mike crute mcrute mike pirnat mpirnat david stanek dstanek with gratitude to the python and django communities for their accomplishments dstanek http traceback org git http git scm com mcrute http mike 
crute org mpirnat http mike pirnat com pip http www pip installer org en latest installing html python windows http docs python guide org en latest starting install win python http python org download slides https speakerdeck com mpirnat web development with python and django 2014 v2 https github com mpirnat django tutorial v2 virtualenv http www virtualenv org en latest virtualenv html windows path http www java com en download help path xml
front_end
data_engineering_on_gcp_book
up and running data engineering on the google cloud platform the completely free e book for setting up and running a data engineering stack on google cloud platform note this book is currently incomplete if you find errors or would like to fill in the gaps read the contributions section https github com nunie123 data engineering on gcp book user content contributions table of contents preface br chapter 1 setting up a gcp account https github com nunie123 data engineering on gcp book blob master ch 01 gcp account md br chapter 2 setting up batch processing orchestration with composer and airflow https github com nunie123 data engineering on gcp book blob master ch 02 orchestration md br chapter 3 building a data lake with google cloud storage gcs https github com nunie123 data engineering on gcp book blob master ch 03 data lake md br chapter 4 building a data warehouse with bigquery https github com nunie123 data engineering on gcp book blob master ch 04 data warehouse md br chapter 5 setting up dags in composer and airflow https github com nunie123 data engineering on gcp book blob master ch 05 dags md br chapter 6 setting up event triggered pipelines with cloud functions https github com nunie123 data engineering on gcp book blob master ch 06 event triggers md br chapter 7 parallel processing with dataproc and spark https github com nunie123 data engineering on gcp book blob master ch 07 parallel processing md br chapter 8 streaming data with pub sub https github com nunie123 data engineering on gcp book blob master ch 08 streaming md br chapter 9 managing credentials with google secret manager https github com nunie123 data engineering on gcp book blob master ch 09 secrets md br chapter 10 infrastructure as code with terraform https github com nunie123 data engineering on gcp book blob master ch 10 infrastructure as code md br chapter 11 deployment pipelines with cloud build https github com nunie123 data engineering on gcp book blob master ch 11 deployment pipelines md br chapter 12 monitoring and alerting https github com nunie123 data engineering on gcp book blob master ch 12 monitoring md br chapter 13 up and running building a complete data engineering infrastructure https github com nunie123 data engineering on gcp book blob master ch 13 up and running md br appendix a example code repository https github com nunie123 data engineering on gcp book blob master appendix a example code readme md preface this is a book designed to teach you how to set up and maintain a production ready data engineering stack on google cloud platform https cloud google com gcp in each chapter i will discuss an important component of data engineering infrastructure i will give some background on what the component is for and why it s important followed by how to implement that component on gcp i ll conclude each chapter by referencing similar services offered by other cloud providers by the end of this book you will know how to set up a complete tech stack for a data engineering team using gcp be warned that this book is opinionated i ve chosen a stack that has worked well for me and that i believe will work well for many data engineering teams but it s entirely possible the infrastructure i describe in this book will not be a great fit for your team if you think there s a better way than what i ve laid out here i d love to hear about it please refer to the contributions section below who this book is for this book is for people with coding familiarity that are interested in setting up professional data 
pipelines and data warehouses using google cloud platform i expect the readers to include data engineers looking to learn more about gcp junior data engineers looking to learn best practices for building and working with data engineering infrastructure software engineers devops engineers data scientists data analysts or anyone else that is tasked with performing data engineering functions to help them with their other work this book assumes your familiarity with sql and python if you re not familiar with python you should be able to muddle through with general programming experience if you do not have experience with these languages particularly sql it is recommended you learn these languages and then return to this book this book covers a lot of ground many of the subjects we ll cover in just part of a chapter will have entire books written about them i will provide references for further research just know that while this book is comprehensive in the sense that it provides all the information you need to get a stack up and running there is still plenty of information a data engineer needs to know that i ve omitted from this book what is covered in this book is not the data engineering stack it is a data engineering stack even within gcp there are lots of ways to accomplish similar tasks for example i could have used dataflow based on apache beam for the streaming solution or i could have gone with a completely different paradigm for how i store and query data such as storing data as parquet files in gcs and querying with spark i mention this here to make sure you understand that this book is not the complete guide to being a data engineer rather it is an introduction to the types of problems a data engineer solves and a sampling of common tools in gcp used to solve those problems finally there is a vast array of data engineering tools in use i cover many popular tools for data engineering but many more have been left out of this book due to brevity and my lack of experience with them if you feel i left off something important please read the contributions section below how to read this book this book is divided into chapters discussing major data engineering concepts and functions most chapters are then divided into three parts an overview of the topic implementation examples and references to other articles and tools if you re looking to use this book as a guide to set up your data engineering infrastructure from scratch i recommend you read this book front to back each chapter describes a necessary component of the tech stack and they are ordered such that the infrastructure described in one chapter builds off of the previously described infrastructure likely many people will find their way to this book trying to solve a specific problem e g how to set up alerting on gcp s composer airflow service for these people i ve tried to make each chapter as self contained as possible when i use infrastructure created in a previous chapter i ll always provide a link to the previous chapter where it s explained the best way to learn is by doing which is why each chapter provides code samples i encourage you to build this infrastructure with me as you read through the chapters included with this book in appendix a https github com nunie123 data engineering on gcp book blob master appendix a example code readme md i ve provided an example of what your infrastructure as code will look like a note about the code in the book the code within the chapters is for demonstration purposes and is not
necessarily in the format you should be running in production for the sake of clarity and brevity the code usually omits good practices such as type hinting validation error handling and docstrings if you want a better sense of what production ready code looks like review the code in appendix a https github com nunie123 data engineering on gcp book blob master appendix a example code readme md contributions you may have noticed this book is hosted on github this results in three great things 1 the book is hosted online and freely available 2 you can make pull requests 3 you can create issues if you think the book is wrong missing information or otherwise needs to be edited there are two options 1 make a pull request the preferred option if you think something needs to be changed fork this repo make the change yourself then send me a pull request i ll review it discuss it with you if needed then add it in easy peasy if you re not very familiar with github instructions for doing this are here https gist github com chaser324 ce0505fbed06b947d962 if your pr is merged it will be considered a donation of your work to this project you agree to grant an attribution 4 0 international cc by 4 0 https creativecommons org licenses by 4 0 license for your work you will be added to the contributors section on this page once your pr is merged 2 make an issue go to the issues tab https github com nunie123 data engineering on gcp book issues for this repo on github click to create a new issue then tell me what you think is wrong preferably including references to specific files and line numbers i look forward to working with you all contributors ed nunes ed lives in chicago and works as a data engineer for zoro https www zoro com feel free to reach out to him on linkedin https www linkedin com in ed nunes b0409b14 license you are free to use this book under the attribution noncommercial noderivatives 4 0 international cc by nc nd 4 0 https creativecommons org licenses by nc nd 4 0 license next chapter chapter 1 setting up a gcp account https github com nunie123 data engineering on gcp book blob master ch 1 gcp account md
cloud
cvzone
cvzone downloads https pepy tech badge cvzone https pepy tech project cvzone downloads https pepy tech badge cvzone month https pepy tech project cvzone downloads https pepy tech badge cvzone week https pepy tech project cvzone this is a computer vision package that makes its easy to run image processing and ai functions at the core it uses opencv https github com opencv opencv and mediapipe https github com google mediapipe libraries installation you can simply use pip to install the latest version of cvzone pip install cvzone hr examples for sample usage and examples please refer to the examples folder in this repository this folder contains various examples to help you understand how to make the most out of cvzone s features video documentation youtube video https img youtube com vi iexqttqgyo0 0 jpg https youtu be iexqttqgyo0 table of contents 1 installations installations 2 corner rectangle corner rectangle 3 puttextrect puttextrect 4 download image from url download image from url 5 overlay png overlay png 6 rotate image rotate image 7 stack images stack images 8 fps fps 9 finding contours finding contours 10 color module color module 11 classification module classification module 12 face detection face detection 13 face mesh module face mesh module 14 selfie segmentation module selfie segmentation module 15 hand tracking module hand tracking module 16 pose module pose module 17 serial module serial module 18 plot module plot module installations to install the cvzone package run the following command bash pip install cvzone corner rectangle div align center img src results cornerrect2 jpg alt corner rectangle cvzone div python import cv2 import cvzone importing the cvzone library initialize the webcam cap cv2 videocapture 2 capture video from the third webcam 0 based index main loop to continuously capture frames while true capture a single frame from the webcam success img cap read success is a boolean that indicates if the frame was captured successfully and img contains the frame itself add a rectangle with styled corners to the image img cvzone cornerrect img the image to draw on 200 200 300 200 the position and dimensions of the rectangle x y width height l 30 length of the corner edges t 5 thickness of the corner edges rt 1 thickness of the rectangle colorr 255 0 255 color of the rectangle colorc 0 255 0 color of the corner edges show the modified image cv2 imshow image img display the image in a window named image wait for 1 millisecond between frames cv2 waitkey 1 waits 1 ms for a key event not being used here puttextrect div align center img src results puttextrect jpg alt puttextrect cvzone div python import cv2 import cvzone importing the cvzone library initialize the webcam cap cv2 videocapture 2 capture video from the third webcam 0 based index main loop to continuously capture frames while true capture a single frame from the webcam success img cap read success is a boolean that indicates if the frame was captured successfully and img contains the frame itself add a rectangle and put text inside it on the image img bbox cvzone puttextrect img cvzone 50 50 image and starting position of the rectangle scale 3 thickness 3 font scale and thickness colort 255 255 255 colorr 255 0 255 text color and rectangle color font cv2 font hershey plain font type offset 10 offset of text inside the rectangle border 5 colorb 0 255 0 border thickness and color show the modified image cv2 imshow image img display the image in a window named image wait for 1 millisecond between frames cv2 
waitkey 1 waits 1 ms for a key event not being used here download image from url python import cv2 import cvzone imgnormal cvzone downloadimagefromurl url https github com cvzone cvzone blob master results shapes png raw true imgpng cvzone downloadimagefromurl url https github com cvzone cvzone blob master results cvzonelogo png raw true keeptransparency true imgpng cv2 resize imgpng 0 0 none 3 3 cv2 imshow image normal imgnormal cv2 imshow transparent image imgpng cv2 waitkey 0 overlay png div align center img src results overlaypng jpg alt overlaypng cvzone div python import cv2 import cvzone initialize camera capture cap cv2 videocapture 2 imgpng cvzone downloadimagefromurl url https github com cvzone cvzone blob master results cvzonelogo png raw true keeptransparency true imgpng cv2 imread cvzonelogo png cv2 imread unchanged while true read image frame from camera success img cap read imgoverlay cvzone overlaypng img imgpng pos 30 50 imgoverlay cvzone overlaypng img imgpng pos 200 200 imgoverlay cvzone overlaypng img imgpng pos 500 400 cv2 imshow imgoverlay imgoverlay cv2 waitkey 1 rotate image div align center img src results rotateimage jpg alt rotateimage cvzone div python import cv2 from cvzone utils import rotateimage import rotateimage function from cvzone utils initialize the video capture cap cv2 videocapture 2 capture video from the third webcam index starts at 0 start the loop to continuously get frames from the webcam while true read a frame from the webcam success img cap read success will be true if the frame is read successfully img will contain the frame rotate the image by 60 degrees without keeping the size imgrotated60 rotateimage img 60 scale 1 keepsize false rotate image 60 degrees scale it by 1 and don t keep original size rotate the image by 60 degrees while keeping the size imgrotated60keepsize rotateimage img 60 scale 1 keepsize true rotate image 60 degrees scale it by 1 and keep the original size display the rotated images cv2 imshow imgrotated60 imgrotated60 show the 60 degree rotated image without keeping the size cv2 imshow imgrotated60keepsize imgrotated60keepsize show the 60 degree rotated image while keeping the size wait for 1 millisecond between frames cv2 waitkey 1 wait for 1 ms during which any key press can be detected not being used here stack images div align center img src results stackimages jpg alt stackimages cvzone div python import cv2 import cvzone initialize camera capture cap cv2 videocapture 2 start an infinite loop to continually capture frames while true read image frame from camera success img cap read convert the image to grayscale imggray cv2 cvtcolor img cv2 color bgr2gray resize the image to be smaller 0 1x of original size imgsmall cv2 resize img 0 0 none 0 1 0 1 resize the image to be larger 3x of original size imgbig cv2 resize img 0 0 none 3 3 apply canny edge detection on the grayscale image imgcanny cv2 canny imggray 50 150 convert the image to hsv color space imghsv cv2 cvtcolor img cv2 color bgr2hsv create a list of all processed images imglist img imggray imgcanny imgsmall imgbig imghsv stack the images together using cvzone s stackimages function stackedimg cvzone stackimages imglist 3 0 7 display the stacked images cv2 imshow stackedimg stackedimg wait for 1 millisecond this also allows for keyboard inputs cv2 waitkey 1 fps python import cvzone import cv2 initialize the fps class with an average count of 30 frames for smoothing fpsreader cvzone fps avgcount 30 initialize the webcam and set it to capture cap cv2 
videocapture 0 cap set cv2 cap prop fps 30 set the frames per second to 30 main loop to capture frames and display fps while true read a frame from the webcam success img cap read update the fps counter and draw the fps on the image fpsreader update returns the current fps and the updated image fps img fpsreader update img pos 20 50 bgcolor 255 0 255 textcolor 255 255 255 scale 3 thickness 3 display the image with the fps counter cv2 imshow image img wait for 1 ms to show this frame then continue to the next frame cv2 waitkey 1 finding contours python import cv2 importing the opencv library for computer vision tasks import cvzone importing the cvzone library for additional functionalities import numpy as np importing numpy library for numerical operations download an image containing shapes from a given url imgshapes cvzone downloadimagefromurl url https github com cvzone cvzone blob master results shapes png raw true perform edge detection using the canny algorithm imgcanny cv2 canny imgshapes 50 150 dilate the edges to strengthen the detected contours imgdilated cv2 dilate imgcanny np ones 5 5 np uint8 iterations 1 find contours in the image without any corner filtering imgcontours confound cvzone findcontours imgshapes imgdilated minarea 1000 sort true filter none drawcon true c 255 0 0 ct 255 0 255 retrtype cv2 retr external approxtype cv2 chain approx none find contours in the image and filter them based on corner points either 3 or 4 corners imgcontoursfiltered confoundfiltered cvzone findcontours imgshapes imgdilated minarea 1000 sort true filter 3 4 drawcon true c 255 0 0 ct 255 0 255 retrtype cv2 retr external approxtype cv2 chain approx none display the image with all found contours cv2 imshow imgcontours imgcontours display the image with filtered contours either 3 or 4 corners cv2 imshow imgcontoursfiltered imgcontoursfiltered wait until a key is pressed to close the windows cv2 waitkey 0 color module python import cvzone import cv2 create an instance of the colorfinder class with trackbar set to true mycolorfinder cvzone colorfinder trackbar true initialize the video capture using opencv using the third camera index 2 adjust index if you have multiple cameras cap cv2 videocapture 2 set the dimensions of the camera feed to 640x480 cap set 3 640 cap set 4 480 custom color values for detecting orange hmin smin vmin are the minimum values for hue saturation and value hmax smax vmax are the maximum values for hue saturation and value hsvvals hmin 10 smin 55 vmin 215 hmax 42 smax 255 vmax 255 main loop to continuously get frames from the camera while true read the current frame from the camera success img cap read use the update method from the colorfinder class to detect the color it returns the masked color image and a binary mask imgorange mask mycolorfinder update img hsvvals stack the original image the masked color image and the binary mask imgstack cvzone stackimages img imgorange mask 3 1 show the stacked images cv2 imshow image stack imgstack break the loop if the q key is pressed if cv2 waitkey 1 0xff ord q break classification module python from cvzone classificationmodule import classifier import cv2 cap cv2 videocapture 2 initialize video capture path c users user documents maskmodel maskclassifier classifier f path keras model h5 f path labels txt while true img cap read capture frame by frame prediction maskclassifier getprediction img print prediction print prediction result cv2 imshow image img cv2 waitkey 1 wait for a key press face detection python import cvzone 
from cvzone facedetectionmodule import facedetector import cv2 initialize the webcam 2 means the third camera connected to the computer usually 0 refers to the built in webcam cap cv2 videocapture 2 initialize the facedetector object mindetectioncon minimum detection confidence threshold modelselection 0 for short range detection 2 meters 1 for long range detection 5 meters detector facedetector mindetectioncon 0 5 modelselection 0 run the loop to continually get frames from the webcam while true read the current frame from the webcam success boolean whether the frame was successfully grabbed img the captured frame success img cap read detect faces in the image img updated image bboxs list of bounding boxes around detected faces img bboxs detector findfaces img draw false check if any face is detected if bboxs loop through each bounding box for bbox in bboxs bbox contains id bbox score center get data center bbox center x y w h bbox bbox score int bbox score 0 100 draw data cv2 circle img center 5 255 0 255 cv2 filled cvzone puttextrect img f score x y 10 cvzone cornerrect img x y w h display the image in a window named image cv2 imshow image img wait for 1 millisecond and keep the window open cv2 waitkey 1 face mesh module python from cvzone facemeshmodule import facemeshdetector import cv2 initialize the webcam 2 indicates the third camera connected to the computer 0 would usually refer to the built in webcam cap cv2 videocapture 2 initialize facemeshdetector object staticmode if true the detection happens only once else every frame maxfaces maximum number of faces to detect mindetectioncon minimum detection confidence threshold mintrackcon minimum tracking confidence threshold detector facemeshdetector staticmode false maxfaces 2 mindetectioncon 0 5 mintrackcon 0 5 start the loop to continually get frames from the webcam while true read the current frame from the webcam success boolean whether the frame was successfully grabbed img the current frame success img cap read find face mesh in the image img updated image with the face mesh if draw true faces detected face information img faces detector findfacemesh img draw true check if any faces are detected if faces loop through each detected face for face in faces get specific points for the eye lefteyeuppoint point above the left eye lefteyedownpoint point below the left eye lefteyeuppoint face 159 lefteyedownpoint face 23 calculate the vertical distance between the eye points lefteyeverticaldistance distance between points above and below the left eye info additional information like coordinates lefteyeverticaldistance info detector finddistance lefteyeuppoint lefteyedownpoint print the vertical distance for debugging or information print lefteyeverticaldistance display the image in a window named image cv2 imshow image img wait for 1 millisecond to check for any user input keeping the window open cv2 waitkey 1 selfie segmentation module python import cvzone from cvzone selfisegmentationmodule import selfisegmentation import cv2 initialize the webcam 2 indicates the third camera connected to the computer 0 usually refers to the built in camera cap cv2 videocapture 2 set the frame width to 640 pixels cap set 3 640 set the frame height to 480 pixels cap set 4 480 initialize the selfisegmentation class it will be used for background removal model is 0 or 1 0 is general 1 is landscape faster segmentor selfisegmentation model 0 infinite loop to keep capturing frames from the webcam while true capture a single frame success img cap read use 
the selfisegmentation class to remove the background replace it with a magenta background 255 0 255 imgbg can be a color or an image as well must be same size as the original if image cutthreshold is the sensitivity of the segmentation imgout segmentor removebg img imgbg 255 0 255 cutthreshold 0 1 stack the original image and the image with background removed side by side imgstacked cvzone stackimages img imgout cols 2 scale 1 display the stacked images cv2 imshow image imgstacked check for q key press to break the loop and close the window if cv2 waitkey 1 0xff ord q break hand tracking module python from cvzone handtrackingmodule import handdetector import cv2 initialize the webcam to capture video the 2 indicates the third camera connected to your computer 0 would usually refer to the built in camera cap cv2 videocapture 2 initialize the handdetector class with the given parameters detector handdetector staticmode false maxhands 2 modelcomplexity 1 detectioncon 0 5 mintrackcon 0 5 continuously get frames from the webcam while true capture each frame from the webcam success will be true if the frame is successfully captured img will contain the frame success img cap read find hands in the current frame the draw parameter draws landmarks and hand outlines on the image if set to true the fliptype parameter flips the image making it easier for some detections hands img detector findhands img draw true fliptype true check if any hands are detected if hands information for the first hand detected hand1 hands 0 get the first hand detected lmlist1 hand1 lmlist list of 21 landmarks for the first hand bbox1 hand1 bbox bounding box around the first hand x y w h coordinates center1 hand1 center center coordinates of the first hand handtype1 hand1 type type of the first hand left or right count the number of fingers up for the first hand fingers1 detector fingersup hand1 print f h1 fingers1 count 1 end print the count of fingers that are up calculate distance between specific landmarks on the first hand and draw it on the image length info img detector finddistance lmlist1 8 0 2 lmlist1 12 0 2 img color 255 0 255 scale 10 check if a second hand is detected if len hands 2 information for the second hand hand2 hands 1 lmlist2 hand2 lmlist bbox2 hand2 bbox center2 hand2 center handtype2 hand2 type count the number of fingers up for the second hand fingers2 detector fingersup hand2 print f h2 fingers2 count 1 end calculate distance between the index fingers of both hands and draw it on the image length info img detector finddistance lmlist1 8 0 2 lmlist2 8 0 2 img color 255 0 0 scale 10 print new line for better readability of the printed output display the image in a window cv2 imshow image img keep the window open and update it for each frame wait for 1 millisecond between frames cv2 waitkey 1 pose module python from cvzone posemodule import posedetector import cv2 initialize the webcam and set it to the third camera index 2 cap cv2 videocapture 2 initialize the posedetector class with the given parameters detector posedetector staticmode false modelcomplexity 1 smoothlandmarks true enablesegmentation false smoothsegmentation true detectioncon 0 5 trackcon 0 5 loop to continuously get frames from the webcam while true capture each frame from the webcam success img cap read find the human pose in the frame img detector findpose img find the landmarks bounding box and center of the body in the frame set draw true to draw the landmarks and bounding box on the image lmlist bboxinfo detector findposition 
img draw true bboxwithhands false check if any body landmarks are detected if lmlist get the center of the bounding box around the body center bboxinfo center draw a circle at the center of the bounding box cv2 circle img center 5 255 0 255 cv2 filled calculate the distance between landmarks 11 and 15 and draw it on the image length img info detector finddistance lmlist 11 0 2 lmlist 15 0 2 img img color 255 0 0 scale 10 calculate the angle between landmarks 11 13 and 15 and draw it on the image angle img detector findangle lmlist 11 0 2 lmlist 13 0 2 lmlist 15 0 2 img img color 0 0 255 scale 10 check if the angle is close to 50 degrees with an offset of 10 iscloseangle50 detector anglecheck myangle angle targetangle 50 offset 10 print the result of the angle check print iscloseangle50 display the frame in a window cv2 imshow image img wait for 1 millisecond between each frame cv2 waitkey 1 serial module python from cvzone serialmodule import serialobject initialize the arduino serialobject with optional parameters baudrate 9600 digits 1 max retries 5 arduino serialobject portno none baudrate 9600 digits 1 max retries 5 initialize a counter to keep track of iterations count 0 start an infinite loop while true increment the counter on each iteration count 1 print data received from the arduino getdata method returns the list of data received from the arduino print arduino getdata if the count is less than 100 if count 100 send a list containing 1 to the arduino arduino senddata 1 else if the count is 100 or greater send a list containing 0 to the arduino arduino senddata 0 reset the count back to 0 once it reaches 200 this will make the cycle repeat if count 200 count 0 cpp include cvzone h serialdata serialdata 1 1 numofvalsrec digitspervalrec 0 or 1 1 digit 0 to 99 2 digits 0 to 999 3 digits serialdata serialdata if not receving only sending int sendvals 2 min val of 2 even when sending 1 int valsrec 1 int x 0 void setup serialdata begin 9600 pinmode 13 output void loop to send x 1 if x 100 x 0 sendvals 0 x serialdata send sendvals to recieve serialdata get valsrec digitalwrite 13 valsrec 0 plot module sine example python from cvzone plotmodule import liveplot import cv2 import math sinplot liveplot w 1200 ylimit 100 100 interval 0 01 xsin 0 while true xsin 1 if xsin 360 xsin 0 imgplotsin sinplot update int math sin math radians xsin 100 cv2 imshow image sin plot imgplotsin if cv2 waitkey 1 0xff ord q break face detection x value example python from cvzone plotmodule import liveplot from cvzone facedetectionmodule import facedetector import cv2 import cvzone cap cv2 videocapture 1 detector facedetector mindetectioncon 0 85 modelselection 0 xplot liveplot w 1200 ylimit 0 500 interval 0 01 while true success img cap read img bboxs detector findfaces img draw false val 0 if bboxs loop through each bounding box for bbox in bboxs bbox contains id bbox score center get data center bbox center x y w h bbox bbox score int bbox score 0 100 val center 0 draw data cv2 circle img center 5 255 0 255 cv2 filled cvzone puttextrect img f score x y 10 cvzone cornerrect img x y w h imgplot xplot update val cv2 imshow image plot imgplot cv2 imshow image img if cv2 waitkey 1 0xff ord q break
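The helper drawing functions putTextRect and cornerRect used throughout the examples above can also be called on their own. A minimal, self-contained sketch is shown below; the canvas size, box coordinates and label text are illustrative values only, not taken from the cvzone documentation.

```python
import cv2
import cvzone
import numpy as np

# Blank canvas to draw on (size chosen purely for illustration)
img = np.zeros((480, 640, 3), np.uint8)

# Corner-style rectangle and text label, the same helpers used in the examples above
cvzone.cornerRect(img, (100, 100, 200, 150))       # bounding box as (x, y, w, h)
cvzone.putTextRect(img, "cvzone demo", (100, 80))  # label text and its position

cv2.imshow("Drawing helpers", img)
cv2.waitKey(0)
```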
computervision opencv-python opencv mediapipe
ai
angular5-iot-dashboard
build status https travis ci org smart dashboard angular dashboard svg branch master https travis ci org smart dashboard angular dashboard deprecated visit https pixelplux com product torabito iot suite this project is no longer maintained you need to use https pixelplux com product torabito iot suite for monitoring your arduino esp8266 project angular 5 iot dashboard angular 5 dashboard is a management dashboard for many purposes focused on iot smart home and autonomy this project is a fully functional app and is hosted on https pixelplux com product torabito iot suite as an enterprise product we are sharing many components and our workflow here inside this repository this project can be used for internet of things reporting dashboards user management live monitoring and other dashboard based projects for angular we will continuously update the workflow of the application you can use components from this project and get inspired by how an intelligent dashboard can work you can either fork this project or build your own app and import components from this repository please try to keep components as untouched as possible if you want to get the weekly updates and improvements in case your business logic is different from the way we are implementing ours consult with us to build your own version and still benefit from minor and major features from our developers and contributors keywords angular 6 dashboard angular 5 dashboard angular 4 dashboard angular 2 dashboard internet of things angular 2 realtime angular app socket angular app iot dashboard using angular interactive dashboard realtime console realtime console angular stable enterprise features we list our stable features that work at enterprise level inside the application user signup user signin user password reset receive incoming restful requests from devices arduino raspberry pi create devices and manage them interactive documentation for the api create places and locations based on name and level display realtime values in the dashboard mobile version support using cordova collect user contact information for technical reasons manage user profiles experimental features add conditions for changing devices read devices geographical location for the mobile version supported languages we are trying to cover as many languages as possible at the moment we cover english united states and polish poland please feel free to contribute to this repository in case you want to add your language right to left support we do support rtl layout but some components are not that well integrated our focus was on layout direction we support the persian language at the moment as experimental and it is not provided in the commercial version the marketing team will decide whether they want new languages or regions so this is out of scope for now currently supporting experimental polish poland persian iran english world wide target languages to support arabic united arab emirates egypt turkish turkey german germany austria spanish spain please feel free to contribute to the locales for your own country angular iot dashboard angular 5 dashboard realtime dashboard angular iot dashboard gif angular iot dashboard angular 5 dashboard realtime dashboard angular 6 support we are looking forward to the angular 6 release and aim to be among the first to provide a dashboard for angular 6 at the moment all components are based on angular 5 x technical stack the project is based on angular 5 and angular cli for development please use npm start which also provides
hmr and for production builds we use npm run build which calls ng tasks directly please review the package json for the different build environments this application can be run without any api or microservices since all endpoints have interactive mocks since each customer might need a different way of building the application we only provide build examples if you are distributing this app for your own purposes please create your own environments add them to the angular cli json file and update the package json accordingly we are using the highcharts library for our charts for any incoming pull requests containing other chart libraries please open an issue first and describe why it is not possible to do it using highcharts lodash and ngrx store are used heavily for data flow async await concepts are everywhere since the project is a realtime dashboard we are not supporting unit tests we only use integration e2e tests using cypress which are run for each pull request in case of heavy calculation or sensitive data logic that requires unit testing move it to another package publish it to npm and then install it inside this repository the project demo is hosted on github pages https owsolutions github io angular5 iot dashboard hence we are committing the dist directory for each build note that our dist folder is not necessarily a production build you need to build this application yourself since our configuration is different mobile version experimental this application will also be bundled into a cordova app for android we put the apk files into github releases which are not signed please feel free to sign them with your own keystore read about signing an apk file here https stackoverflow com questions 10930331 how to sign an already compiled apk then you can publish it or install it for test purposes please note that we build our apk with mock data so the app is not connected to any remote server and is only for testing and demonstration purposes please fork the app and update our ci cd to build with your endpoint address or extra configuration live preview you can see the latest deployment here https owsolutions github io angular5 iot dashboard https owsolutions github io angular5 iot dashboard we are hosting the demo version on github for the enterprise version please contact us contribution guide we are excited to receive pull requests from you there are some simple rules that we follow in our project please no comments on functions unless really necessary please refer to this article for the reasoning https bradt ca blog useless code comments open an issue for your pull request and start your branch name with the format issue number this is my branch so we can track the issue until we close it make sure your code passes linting and e2e tests for new functionality please add abundant tests copyright this project is free for educational usage code review and non commercial usage for enterprise commercial usage you need to obtain a license please contact us at connexion founder outlook com
angular4 socketio javascript typescript angular-iot-dashboard iot dashboard angular angular5 iot-platform monitoring realtime angular2 iot-components remote remote-control remote-sensing
server
site-inspector
site inspector a ruby gem to sniff information about a domain s technology and capabilities gem version https badge fury io rb site inspector svg http badge fury io rb site inspector build status https travis ci org benbalter site inspector svg https travis ci org benbalter site inspector demo site inspector herokuapp com https site inspector herokuapp com source https github com benbalter site inspector demo concepts site inspector involves three primary concepts domain a domain has a host defined by it s tld sld a domain might be example com domain s have certain domain wide properties like whether it supports non www requests or if it enforces https endpoint each domain has four endpoints based on whether you make your request with https or not and whether you prefix the host with www or not so the domain example com may have endpoints at https example com https www example com http example com and https www example com there may theoretically be a different server responding to each endpoint so endpoints have certain endpoint specific properties like whether it responds or not or whether it redirects each domain has one canonical primary endpoint checks a check is a set of tests performed on an endpoint a check might look at what headers are returned what cms is used or whether there is a valid https certificate there are some built in checks listed below or you can define your own while they re endpoint specific checks often filter up and inform some of the domain wide logic such as if the domain supports https usage ruby ruby domain siteinspector inspect whitehouse gov domain https true domain www true domain canonical endpoint to s https www whitehouse gov domain canonical endpoint sniffer cms drupal command line usage site inspector inspect inspects a domain usage site inspector inspect domain options options j json json encode the output a all return results for all endpoints defaults to only the canonical endpoint sniffer return results for the sniffer check defaults to all checks unless one or more checks are specified https return results for the https check defaults to all checks unless one or more checks are specified hsts return results for the hsts check defaults to all checks unless one or more checks are specified headers return results for the headers check defaults to all checks unless one or more checks are specified dns return results for the dns check defaults to all checks unless one or more checks are specified content return results for the content check defaults to all checks unless one or more checks are specified h help show this message v version print the name and version t trace show the full backtrace when an error occurs what s checked domain canonical endpoint the domain s primary endpoint government whether the domain is a government domain up whether any endpoint responds www whether either www endpoint responds root whether you can access the domain with www https whether https is supported enforces https whether non htttps endpoints are either down or redirects to https downgrades https whether the canonical endpoint redirects to an http endpoint canonically www whether non www requests are redirected to www or all non www endpoints are down canonically https whether non https request are redirected to https or all http endpoints are down redirect whether the domain redirects to an external domain hsts does the canonical endpoint have hsts enabled hsts subdomains are subdomains included in the hsts list hsts preload ready can this domain be added to 
the hsts preload list endpoint up whether the endpoint responds or not timed out whether the endpoint times out redirect whether the endpoint redirects external redirect whether the endpoint redirects to another domain checks each endpoint also returns the following checks accessibility uses the pa11y cli to run automated accessibility tests requires node to install pally sudo npm install g pa11y section508 tests against the section508 standard wcag2a tests against the wcag2a standard wcag2aa tests against the wcag2aa standard wcag2aaa tests against the wcag2aaa standard content doctype the html doctype returned sitemap xml whether the endpoint has a sitemap robots txt whether the endpoint has a robots txt file dns dnssec is dnssec supported ipv6 is ipv6 supported cdn the endpoint s cdn if any cloud provider the endpoint s cloud provider if any google apps whether the domain is using google apps hostname the server hostname ip the server ip headers cookies does the domain use cookies strict transport security whether sts is enabled content security policy the endpoint s csp click jacking protection whether an x frame options header is sent xss protection whether an x xss protection header is sent server the server header secure cookies whether the cookies are secure or not hsts valid whether the hsts header is valid max age the hsts max age include subdomains whether subdomains are included preload whether its preloaded enabled whether hsts is enabled preload ready whether hsts could be preloaded https valid if the https response is valid return code the https error if any sniffer cms the cms used if any analytics the analytics providers used if any javascript the javascript libraries used if any advertising the advertising providers used if any adding your own check checks https github com benbalter site inspector tree master lib site inspector checks are special classes that are children of siteinspector endpoint check https github com benbalter site inspector blob master lib site inspector checks check rb you can implement your own check like this ruby class siteinspector class endpoint class mention check def mentions ben endpoint content body ben i end end end end this check can then be used as follows domain canonical endpoint mention mentions ben checks can call the endpoint object which contains the request response and other checks custom checks are automatically exposed as endpoint methods contributing bootstrapping locally 1 clone down the repo 2 script bootstrap running tests script cibuild development console script console how to contribute 1 fork the project 2 create a new descriptively named feature branch 3 make your changes 4 submit a pull request
server
Pynq-CV-OV5640
introduction in this pynq overlay a picture is captured from an ov5640 camera connected to the pl side and several accelerated image processing algorithms are contained in the overlay you can choose which algorithm is enabled without downloading a new bitstream images architecture png in the example application notebook you can configure the ov5640 camera capture a picture and display it in the notebook the processed video stream is displayed on an hdmi monitor connected to the pl side boards pynq z2 ov5640 notebooks images systemdiagram png this pynq overlay contains the following accelerated image processing algorithms rgb2hsv subsample equalizehist gaussianblur sobel canny dilation erosion peripherals ov5640 camera board waveshare ov5640 or equivalent pmod camera adapter hdmi monitor quick start open a terminal on your pynq board and run sudo pip3 install upgrade git https github com xupsh pynq cv ov5640 git or offline install sudo pip3 install upgrade currently this repository is compatible with pynq image v2 4 http www pynq io board design all ip and hls source code is contained in this repository so you can rebuild the hls and vivado projects license pynq license bsd 3 clause license https github com xilinx pynq blob master license
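For orientation, the sketch below shows the generic way a PYNQ overlay is loaded from a Jupyter notebook. It is an assumption-laden illustration, not code from this repository: the bitstream file name is a hypothetical placeholder, and the camera configuration and algorithm-selection calls are defined by the repository's own notebooks and are not reproduced here.

```python
# Minimal sketch of loading a PYNQ overlay from a Jupyter notebook.
# The bitstream name below is a hypothetical placeholder; the actual
# name and the camera/algorithm control calls come from this repo's notebooks.
from pynq import Overlay

overlay = Overlay("pynq_cv_ov5640.bit")   # hypothetical bitstream file name
print(overlay.ip_dict.keys())             # inspect the IP cores exposed by the overlay
```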
ai
Hospital-Management-System
hospital management system health care hospital management system is designed to manage details about hospital patients employees and rooms built using html css js jquery php procedural php and mysql languages and frameworks html css js php procedural php mysql jquery bootstrap import the database file from the database and usermanual folder user manual included 1 to create an account for the system you need to enter the top level administration login details top level administration login details username admin password password example super admin login username superadmin password 123 example basic admin login username basicadmin password 123
html css3 php bootstrap javascript jquery hospital-management css mysql mysql-database
os
Lab-Project-FreeRTOS-MCUBoot
labs freertos plus mcuboot reference example for freertos with mcuboot https github com mcu tools mcuboot support mcuboot is a configurable secure bootloader maintained by several industry leaders it can operate as the first or second stage bootloader with support for cryptographic verification of software images via one of the following ecdsa p256 rsa 2048 rsa 3072 by default it supports image reversion whereby uploaded image upgrades are tentatively booted once upon an image s initial boot if the upgrade image marks itself as confirmed it is retained as the primary image if the upgrade image is not confirmed the subsequent boot will roll back to the prior confirmed image if no valid image is available in any slot the device bricks itself as a safety precaution the developers of mcuboot provide more detailed documentation here https github com mcu tools mcuboot tree main docs mcuboot also provides subset support for mcumgr https github com apache mynewt mcumgr cli when a device enters serial boot recovery mode if enabled serial mode can be triggered during bootup via user input such as a button hold the mcumgr interface enables users to retrieve image diagnostics from the board query resets upload and modify images and more demo description the demo consists of mcuboot booting an application which first disables a bootloader watchdog timer prints its version number then confirms itself so it won t be reverted if it is an update the app then proceeds to periodically print hello world the demo also details the application signing and upgrade process and provides a porting guide for implementing it on other socs finally the use of mcumgr https github com apache mynewt mcumgr cli is demonstrated for retrieving image diagnostics modifying and uploading images and triggering other board functions from your host pc supported socs espressif esp32 patch description the mcuboot patch provides bug fixes implants hooks for replacing function calls that were specific to other rtoses provides boot freertos content and adds bug fixes and enhancements to mcuboot s espressif port the esp idf patch provides the idf with the capability to build and format applications for espressif s mcuboot application loader license this library is licensed under the mit 0 license see the license file
os
ECE5725
ece5725 design with embedded operating systems
os
platformscape
platformscape a blog about platform engineering and cloud native infrastructure visit https platformscape com
cloud
awesome-persian-nlp-ir
awesome persian nlp ir tools and resources awesome https cdn rawgit com sindresorhus awesome d7305f38d29fed78fa85652e3a63e154dd8e8829 media badge svg https github com sindresorhus awesome this repository is a curated collection of research and efforts on persian nlp we have segmented the repo into the five main sections listed below contents tools sections tools md datasets sections datasets md models sections models md repositories sections repos md papers and books sections papers and books md contribute contributions welcome read the contribution guidelines contributing md first license cc0 https i creativecommons org p zero 1 0 88x31 png https creativecommons org publicdomain zero 1 0
persian-language natural-language-processing information-retrieval language-detection persian-nlp corpus part-of-speech-tagger normalizer named-entity-recognition embeddings morphological-analysis stemmer dependency-parser spell-check persian-stemmer shallow-parser
ai
vanilla-todo
vanilla todo a teuxdeux https teuxdeux com clone in plain html css and javascript no build steps it s fully animated and runs smoothly at 60 fps with a total transfer size of 50kb unminified try it online https raw githack com morris vanilla todo main public index html more importantly it s a case study showing that vanilla web development is viable in terms of maintainability 521 the good and worthwhile in terms of user experience 51 user experience 50 less time to load and 90 less bandwidth in this case there s no custom framework invented here instead the case study was designed 22 rules to discover minimum viable patterns 321 mount functions that are truly vanilla the result is maintainable albeit verbose 522 the verbose and with considerable duplication if anything the case study validates the value of build steps and frameworks but also demonstrates that standard web technologies can be used effectively and there are only a few critical areas 523 the bad where a vanilla approach is clearly inferior intermediate understanding of the web platform is required to follow through table of contents 1 motivation 1 motivation 2 method 2 method 2 1 subject 21 subject 2 2 rules 22 rules 2 3 goals 23 goals 2 3 1 user experience 231 user experience 2 3 2 code quality 232 code quality 2 3 3 generality of patterns 233 generality of patterns 3 implementation 3 implementation 3 1 basic structure 31 basic structure 3 2 javascript architecture 32 javascript architecture 3 2 1 mount functions 321 mount functions 3 2 2 data flow 322 data flow 3 2 3 rendering 323 rendering 3 2 4 reconciliation 324 reconciliation 3 3 drag drop 33 drag drop 3 4 animations 34 animations 4 testing 4 testing 5 assessment 5 assessment 5 1 user experience 51 user experience 5 2 code quality 52 code quality 5 2 1 the good 521 the good 5 2 2 the verbose 522 the verbose 5 2 3 the bad 523 the bad 5 3 generality of patterns 53 generality of patterns 6 conclusion 6 conclusion 7 what s next 7 whats next 8 appendix 8 appendix 8 1 links 81 links 8 2 response 82 response 9 changelog 9 changelog 1 motivation i believe too little has been invested in researching practical scalable methods for building web applications without third party dependencies it s not enough to describe how to create dom nodes or how to toggle a class without a framework it s also rather harmful to write an article saying you don t need library x and then proceed in describing how to roll your own untested inferior version of x what s missing are thorough examples of complex web applications built only with standard web technologies covering as many aspects of the development process as possible this case study is an attempt to fill this gap at least a little bit and inspire further research in the area 2 method the method for this case study is as follows pick an interesting subject implement it using only standard web technologies document techniques and patterns found during the process assess the results by common quality standards this section describes the method in more detail 2 1 subject i ve chosen to build a functionally equivalent clone of teuxdeux https teuxdeux com for this study the user interface has interesting challenges in particular performant drag drop when combined with animations the original teuxdeux app deserves praise here in my opinion it has the best over all concept and ux of all the to do apps out there thank you https fictivekin com the user interface is arguably small which is good for a case study but large enough to require thought 
on its architecture however it is lacking in some key areas routing asynchronous resource requests server side rendering 2 2 rules to produce valid vanilla solutions and because constraints spark creativity i came up with a set of rules to follow throughout the process only use standard web technologies only use widely supported js features unless they can be polyfilled 1 no runtime js dependencies except polyfills no build steps no general purpose utility functions related to the dom ui 2 1 this is a moving target the current version is using es2020 2 these usually end up becoming a custom micro framework thereby questioning why you didn t use one of the established and tested libraries frameworks in the first place 2 3 goals the results are going to be assessed by three major concerns 2 3 1 user experience the resulting product should be comparable to or better than the original regarding functionality performance and design this includes testing major browsers and devices 2 3 2 code quality the resulting implementation should adhere to established code quality standards in the industry this will be difficult to assess objectively as we will see later 2 3 3 generality of patterns the discovered techniques and patterns should be applicable in a wide range of scenarios 3 implementation this section walks through the resulting implementation highlighting techniques and problems found during the process you re encouraged to inspect the source code public alongside this section 3 1 basic structure since build steps are ruled out the codebase is organized around plain html css and js files the html and css mostly follows rscss https ricostacruz com rscss devised by rico sta cruz https ricostacruz com which yields an intuitive component oriented structure the stylesheets are slightly verbose i missed scss https sass lang com here and i think one of these is a must have for bigger projects additionally the global css namespace problem is unaddressed see e g css modules https github com css modules css modules all javascript files are es modules import export basic code quality code style linting is guided by prettier https prettier io stylelint https stylelint io and eslint https eslint org i ve set the eslint parser to es2020 to ensure only es2020 code is allowed note that i ve opted out of web components completely i can t clearly articulate what i dislike about them but i never missed them throughout this study the basic structure comes with some boilerplate e g referencing all the individual stylesheets and scripts from the html probably enough to justify a simple build step it is otherwise straight forward and trivial to understand literally just a bunch of html css and js files 3 2 javascript architecture naturally the javascript architecture is the most interesting part of this study i found that using a combination of functions query selectors and dom events is sufficient to build a scalable maintainable codebase albeit with some trade offs as we will see later conceptually the proposed architecture loosely maps css selectors to js functions which are mounted i e called once per matching element this yields a simple mental model and synergizes with the dom and styles todo list todolist scripts todolist js styles todo list css app collapsible appcollapsible scripts appcollapsible js styles app collapsible css this proved to be a useful repeatable pattern throughout all of the implementation process 3 2 1 mount functions mount functions take a dom element as their only argument their 
responsibility is to set up initial state event listeners and provide behavior and rendering for the target element here s a hello world example of mount functions js define mount function loosely mapped to hello world export function helloworld el define initial state const state title hello world description an example vanilla component counter 0 set rigid base html el innerhtml h1 class title h1 p class description p div class my counter div mount sub components el queryselectorall my counter foreach mycounter attach event listeners el addeventlistener modifycounter e update counter state counter e detail initial update update define idempotent update function function update next update state optionally optimize e g bail out if state hasn t changed object assign state next update own html el queryselector title innertext state title el queryselector description innertext state description pass data to sub scomponents el queryselector my counter dispatchevent new customevent updatemycounter detail value state counter define another component loosely mapped to my counter export function mycounter el define initial state const state value 0 set rigid base html el innerhtml p span class value span button class increment increment button button class decrement decrement button p attach event listeners el queryselector increment addeventlistener click dispatch an action use detail to transport data el dispatchevent new customevent modifycounter detail 1 bubbles true el queryselector decrement addeventlistener click dispatch an action use detail to transport data el dispatchevent new customevent modifycounter detail 1 bubbles true el addeventlistener updatemycounter e update e detail define idempotent update function function update next object assign state next el queryselector value innertext state value mount helloworld component s any div class hello world div in the document will be mounted document queryselectorall hello world foreach helloworld this comes with quite some boilerplate but has useful properties as we will see in the following sections note that any part of a mount function is entirely optional for example a mount function does not have to set any base html and may instead only set event listeners to enable some behavior also note that an element can be mounted with multiple mount functions for example to do items are mounted with todoitem and appdraggable compared to react components mount functions provide interesting flexibility as components and behaviors can be implemented using the same idiom and combined arbitrarily reference appicon js public scripts appicon js todoitem js public scripts todoitem js todoiteminput js public scripts todoiteminput js 3 2 2 data flow i found it effective to implement one way data flow similar to react s approach however exclusively using custom dom events data flows downwards from parent components to child components through custom dom events actions flow upwards through custom dom events bubbling up usually resulting in some parent component state change which is in turn propagated downwards through data events the data store is factored into a separate behavior todostore it only receives and dispatches events and encapsulates all of the data logic listening to and dispatching events is slightly verbose with standard apis and certainly justifies introducing helpers i didn t need event delegation la jquery for this study but i believe it s a useful concept that is difficult to do concisely with standard apis reference tododay js 
public scripts tododay js todostore js public scripts todostore js 3 2 3 rendering naively re rendering a whole component using innerhtml should be avoided as this may hurt performance and will likely break important functionality which browsers have already been optimizing for decades a button input etc may lose focus form inputs may lose data text selection may be reset css transitions may not work correctly event listeners may need to be reattached as seen in 3 2 1 321 mount functions rendering is therefore split into some rigid base html and an idempotent complete update function which only makes necessary changes idempotency is key here i e update functions may be called at any time and should always render the component correctly completeness is equally important i e update functions should render the whole component regardless of what triggered an update in effect this means almost all dom manipulation is done in update functions which greatly contributes to robustness and readability of the codebase as seen above this approach is quite verbose and ugly compared to jsx for example however it s very performant and can be further optimized by checking for data changes caching selectors etc it is also simple to understand reference todoitem js public scripts todoitem js todocustomlist js public scripts todocustomlist js 3 2 4 reconciliation expectedly the hardest part of the study was rendering a variable amount of dynamic components efficiently here s a commented example from the implementation outlining the reconciliation algorithm js export function todolist el const state items el innerhtml div class items div el addeventlistener updatetodolist e update e detail function update next object assign state next const container el queryselector items mark current children for removal const obsolete new set container children map current children by data key const childrenbykey new map obsolete foreach child childrenbykey set child getattribute data key child build new list of child elements from data const children state items map item find existing child by data key let child childrenbykey get item id if child if child exists keep it obsolete delete child else otherwise create new child child document createelement div child classlist add todo item set data key child setattribute data key item id mount component todoitem child update child child dispatchevent new customevent updatetodoitem detail item item return child remove obsolete children obsolete foreach child container removechild child re insert new list of children children foreach child index if child container children index container insertbefore child container children index it s very verbose and has lots of opportunity to introduce bugs compared to a simple loop in jsx this seems insane it is quite performant as it does minimal work but is otherwise messy definitely a candidate for a utility function or library 3 3 drag drop implementing drag drop from scratch was challenging especially regarding browser device consistency using a library would have been a lot more cost effective initially however having a customized implementation paid off once i started introducing animations as both had to be coordinated closely i can imagine this would have been a difficult problem when using third party code for either the drag drop implementation is again based on dom events and integrates well with the remaining architecture it s clearly the most complex part of the study but i was able to implement it without changing existing 
code besides mounting behaviors and adding event handlers i suspect the drag drop implementation to have some subtle problems on touch devices as i haven t extensively tested them using a library for identifying the gestures could be more sensible and would reduce costs in testing browsers and devices reference appdraggable js public scripts appdraggable js appsortable js public scripts appsortable js todolist js public scripts todolist js 3 4 animations for the final product i wanted smooth animations for most user interactions this is a cross cutting concern which was implemented using the flip https aerotwist com blog flip your animations technique as devised by paul lewis https twitter com aerotwist implementing flip animations without a large refactoring was the biggest challenge of this case study especially in combination with drag drop after days of work i was able to implement the algorithm in isolation and coordinate it with other concerns at the application s root level the usecapture mode of addeventlistener proved to be very useful in this case reference appflip js public scripts appflip js todoapp js public scripts todoapp js 4 testing i ve implemented one end to end test and one unit test using playwright https playwright dev this was straightforward besides small details like the mjs extension and the fact that you cannot use named imports when importing from public scripts there s a lot more to explore here but it s not much different from testing other frontend stacks it s actually simpler as there was zero configuration and just one dependency however it s currently lacking code coverage playwright provides some code coverage facilities https playwright dev docs api class coverage but it s not straight forward to produce a standard lcov report from that and it would probably be difficult to unify end to end and unit test coverage reference additem test mjs test e2e additem test mjs util test mjs test unit util test mjs 5 assessment 5 1 user experience most important features from the original teuxdeux application are implemented and usable daily to do lists add edit delete to do items custom to do lists add edit delete custom to do lists drag drop to do items across lists reorder custom to do lists via drag drop local storage persistence additionally most interactions are smoothly animated at 60 frames per second in particular dragging and dropping gives proper visual feedback when elements are reordered the latter was an improvement over the original application when i started working on the case study in 2019 in the meantime the teuxdeux team released an update with a much better drag drop experience great job one notable missing feature is markdown support it would be insensible to implement markdown from scratch this is a valid candidate for using an external library as it is entirely orthogonal to the remaining codebase the application has been tested on latest chrome firefox safari and safari on ios todo test more browsers and devices a fresh load of the original teuxdeux application transfers around 500 kb and finishes loading at over 1000 ms sometimes up to 2000ms measured in 05 2022 reloads finish at around 500ms with a transferred size of around 50 kb the vanilla application consistently loads in 300 500 ms mdash not minified and with each script stylesheet and icon served as an individual file reloads finish at 100 200ms again not optimized at all with e g asset hashing indefinite caching to be fair my implementation misses quite a few features from the 
original i suspect a fully equivalent clone to be well below 100 kb transfer though todo run more formal performance tests and add figures for the results 5 2 code quality unfortunately it is quite hard to find undisputed objective measurements for code quality besides trivialities like code style linting etc the only generally accepted assessment seems to be peer reviewal to have at least some degree of assessment of the code s quality the following sections summarize relevant facts about the codebase and some opinionated statements based on my experience in the industry 5 2 1 the good no build steps no external dependencies at runtime besides polyfills no dependency maintenance no breaking changes to monitor used only standard technologies plain html css and javascript standard dom apis very few concepts introduced mount functions loosely mapped by css class names state separated from the dom idempotent updates data flow using custom events compare the proposed architecture to the api conceptual surface of angular or react progressive developer experience markup style and behavior are orthogonal and can be developed separately adding behavior has little impact on the markup besides adding classes debugging is straight forward using modern browser developer tools the app can be naturally enhanced from the outside by handling dispatching events just like you can naturally animate some existing html little indirection low coupling the result is literally just a bunch of html css and js files straight forward zero config testing with playwright all source files html css and js combine to under 2400 lines of code including comments and empty lines for comparison prettifying the original teuxdeux s minified js application bundle yields 52678 loc 05 2022 to be fair my implementation misses quite a few features from the original i suspect a fully equivalent clone to be well below 10000 loc though 5 2 2 the verbose stylesheets are a bit verbose scss would help here simple components require quite some boilerplate code el queryselectorall scope is somewhat default expected and would justify a helper listening to and dispatching events is slightly verbose although not used in this study event delegation is not trivial to implement without code duplication eliminating verbosities through build steps and a minimal set of helpers would reduce the comparably low code size see above even further 5 2 3 the bad class names share a global namespace event names share a global namespace especially problematic for events that bubble up no code completion in html strings the separation between base html and dynamic rendering is not ideal when compared to jsx for example jsx virtual dom techniques provide much better development ergonomics reconciliation is verbose brittle and repetitive i wouldn t recommend the proposed technique without a well tested helper function at least you have to remember mounting behaviors correctly when creating new elements it would be helpful to automate this somehow e g watch elements of selector x at all times and ensure the desired behaviors are mounted once on them no type safety i ve always been a proponent of dynamic languages but since typescript s type system provides the best of both worlds i cannot recommend using it enough we re effectively locked out of using npm dependencies that don t provide browser ready builds es modules or umd most frameworks handle a lot of browser inconsistencies for free and continuously monitor regressions with extensive test suites the cost 
of browser testing is surely a lot higher when using a vanilla approach no code coverage from tests besides the issues described above i believe the codebase is well organized and there are clear paths for bugfixes and feature development since there s no third party code bugs are easy to find and fix and there are no dependency limitations to work around a certain degree of dom api knowledge is required but i believe this should be a goal for any web developer 5 3 generality of patterns assessing the generality of the discovered techniques objectively is not really possible without production usage from my experience however i can t imagine any scenario where mount functions event based data flow etc are not applicable the underlying principles power the established frameworks after all state is separated from the dom react angular vue rendering is idempotent and complete react s pure render function one way data flow react an open question is if these patterns hold for library authors although not considered during the study some observations can be made the javascript itself would be fine to share as es modules however event naming needs great care as dispatching bubbling events from imported behaviors can trigger parent listeners in consumer code can be mitigated by providing options to prefix or map event names css names share a global namespace and need to be managed as well could be mitigated by prefixing as well however making the javascript a bit more complex 6 conclusion the result of this study is a working todo application with decent ui ux and most of the functionality of the original teuxdeux app built using only standard web technologies it comes with better overall performance at a fraction of the code size and bandwidth the codebase seems manageable through a handful of simple concepts although it is quite verbose and even messy in some areas this could be mitigated by a small number of helper functions and simple build steps e g scss and typescript the study s method helped discovering patterns and techniques that are at least on par with a framework based approach for the given subject without diverging into building a custom framework a notable exception to the latter is rendering variable numbers of elements in a concise way i was unable to eliminate the verbosity involved in basic but efficient reconciliation further research is needed in this area but for now this appears to be a valid candidate for a possibly external general purpose utility when looking at the downsides remember that all of the individual parts are self contained highly decoupled portable and congruent to the web platform the resulting implementation cannot rust by definition as no dependencies can become out of date another thought to be taken with a grain of salt i believe frameworks make simple tasks even simpler but hard tasks e g implementing cross cutting concerns or performance optimizations often more difficult setting some constraints up front forced me to challenge my assumptions and preconceptions about vanilla web development it was quite liberating to avoid general purpose utilities and get things done with what s readily available as detailed in the assessment the study would likely be more convincing if build steps were allowed modern javascript and scss could reduce most of the unnecessarily verbose parts to a minimum finally this case study does not question using dependencies or frameworks in general mdash they do provide lots of value in many areas it was a constrained experiment 
designed to discover novel methods for vanilla web development and hopefully inspire innovation and further research in the area 7 what s next i d love to hear feedback and ideas on any aspect of the case study it s still lacking in some important areas e g testing techniques pull requests questions and bug reports are more than welcome here are a few ideas i d like to see explored in the future run another case study with typescript scss and build steps seems promising research validation rules for utility functions and external dependencies experiment with architectures based on virtual dom rendering and standard dom events compile discovered rules patterns and techniques into a comprehensive guide case studies constrained by a set of formal rules are an effective way to find new patterns and techniques in a wide range of domains i d love to see similar experiments in the future 8 appendix 8 1 links general resources i ve used extensively mdn web docs https developer mozilla org as a reference for dom apis can i use https caniuse com as a reference for browser support react https reactjs org as inspiration for the architecture useful articles regarding flip animations flip your animations aerotwist com https aerotwist com blog flip your animations animating layouts with the flip technique css tricks com https css tricks com animating layouts with the flip technique animating the unanimatable medium com https medium com developers writing animating the unanimatable 1346a5aab3cd projects i ve inspected for drag drop architecture react dnd https github com react dnd react dnd react beautiful dnd https github com atlassian react beautiful dnd dragula https github com bevacqua dragula 8 2 response 10 2020 trending on hacker news https news ycombinator com item id 24893247 lobsters https lobste rs s 5gcrxh case study on vanilla web development desandro twitter https twitter com desandro status 1321095247091433473 developer for the original teuxdeux reddit https www reddit com r javascript comments jj10k9 vanillatodo a case study on viable techniques for thanks 9 changelog 05 2023 add basic testing fix stylelint errors update dependencies 08 2022 small improvements fix date seeking bug on safari 05 2022 refactored for es2020 refactored for event driven communication exclusively moved original es5 based version of the study to es5 es5 added assessment regarding library development added date picker 01 2021 added response section 82 response 10 2020 refactored for dataset 2 https github com morris vanilla todo issues 2 mdash opethrocks https github com opethrocks fixed 3 https github com morris vanilla todo issues 3 navigation bug mdash anchepiece https github com anchepiece jcoussard https github com jcoussard fixed 4 https github com morris vanilla todo issues 4 double item creation mdash n0nick https github com n0nick fixed 1 https github com morris vanilla todo issues 4 bad links mdash roryokane https github com roryokane initial version
vanilla frontend javascript
front_end
performance-matters-16-17
performance matters project setup this project serves an adapted version of the bootstrap documentation website http getbootstrap com it is based on the github pages branch of bootstrap https github com twbs bootstrap tree gh pages differences from actual bootstrap documentation added custom webfont removed third party scripts the src directory is served with express https expressjs com templating is done with nunjucks https mozilla github io nunjucks getting started install dependencies npm install serve npm start expose localhost npm run expose
front_end
SQL_EmployeeDatabase
sql employeedatabase data modeling engineering and analysis into a sql database data modeling inspected the csvs and sketched out an erd of the tables using http www quickdatabasediagrams com data engineering used the information to create a table schema for each of the six csv files import each csv file into the corresponding sql table data analysis listed the following details of each employee employee number last name first name gender and salary listed employees who were hired in 1986 listed the manager of each department with the following information department number department name the manager s employee number last name first name and start and end employment dates listed the department of each employee with the following information employee number last name first name and department name listed all employees whose first name is hercules and last names begin with b listed all employees in the sales department including their employee number last name first name and department name listed all employees in the sales and development departments including their employee number last name first name and department name listed in descending order the frequency count of employee last names and how many employees share each last name
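to make the analysis steps above concrete, here is a minimal python sketch of two of the listed queries using sqlite3, note that the database file, table name and column names (employees, emp_no, last_name, hire_date) are assumptions for illustration since the readme does not reproduce the actual schema, and the original project targets a full sql database rather than sqlite

import sqlite3

# hypothetical local database file, the real project imports the six csvs into a sql server instead
conn = sqlite3.connect("employees.db")
cur = conn.cursor()

# employees hired in 1986 (one of the analysis questions above)
cur.execute(
    "SELECT emp_no, last_name, first_name, hire_date "
    "FROM employees "
    "WHERE hire_date BETWEEN '1986-01-01' AND '1986-12-31'"
)
print(cur.fetchall())

# frequency count of last names, in descending order (the final analysis question)
cur.execute(
    "SELECT last_name, COUNT(*) AS n "
    "FROM employees "
    "GROUP BY last_name "
    "ORDER BY n DESC"
)
print(cur.fetchall())

conn.close()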
server
OptML_course
epfl course optimization for machine learning cs 439 official coursebook information http edu epfl ch coursebook en optimization for machine learning cs 439 lectures fri 13 15 15 00 in co2 https plan epfl ch room co 202 exercises fri 15 15 17 00 in bc01 https plan epfl ch room bc 2001 this course teaches an overview of modern mathematical optimization methods for applications in machine learning and data science in particular scalability of algorithms to large datasets will be discussed in theory and in implementation team instructors martin jaggi martin jaggi epfl ch mailto martin jaggi epfl ch nicolas flammarion nicolas flammarion epfl ch mailto nicolas flammarion epfl ch assistants aditya varre aditya varre epfl ch mailto aditya varre epfl ch amirkeivan mohtashami amirkeivan mohtashami epfl ch mailto amirkeivan mohtashami epfl ch y ksel oguz kaan oguz yuksel epfl ch mailto oguz yuksel epfl ch chayti el mahdi el mahdi chayti epfl ch mailto el mahdi chayti epfl ch contents convexity gradient methods proximal algorithms subgradient methods stochastic and online variants of mentioned methods coordinate descent frank wolfe accelerated methods primal dual context and certificates lagrange and fenchel duality second order methods including quasi newton methods derivative free optimization advanced contents parallel and distributed optimization algorithms computational trade offs time vs data vs accuracy lower bounds non convex optimization convergence to critical points alternating minimization neural network training program nr date topic materials exercises 1 24 2 introduction convexity notes raw master lecture notes lecture notes pdf slides raw master slides lecture01 pdf lab01 tree master labs ex01 2 3 3 gradient descent notes raw master lecture notes lecture notes pdf slides raw master slides lecture02 pdf lab02 tree master labs ex02 3 10 3 projected gradient descent notes raw master lecture notes lecture notes pdf slides raw master slides lecture03 pdf lab03 tree master labs ex03 4 17 3 proximal and subgradient descent notes raw master lecture notes lecture notes pdf slides raw master slides lecture04 pdf lab04 tree master labs ex04 5 24 3 stochastic gradient descent non convex optimization notes raw master lecture notes lecture notes pdf slides raw master slides lecture05 pdf lab05 tree master labs ex05 6 31 3 non convex optimization notes raw master lecture notes lecture notes pdf slides raw master slides lecture06 pdf lab06 tree master labs ex06 7 4 easter vacation 14 4 easter vacation 7 21 4 newton s method quasi newton notes raw master lecture notes lecture notes pdf slides raw master slides lecture07 pdf lab07 tree master labs ex07 8 28 4 coordinate descent notes raw master lecture notes lecture notes pdf slides raw master slides lecture08 pdf lab08 tree master labs ex08 9 5 5 frank wolfe notes raw master lecture notes lecture notes pdf slides raw master slides lecture09 pdf lab09 tree master labs ex09 10 12 5 accelerated gradient gradient free adaptive methods notes slides raw master slides lecture10 pdf lab10 tree master labs ex10 11 19 5 mini project week 12 26 5 opt for ml in practice notes slides raw master slides lecture11 pdf q a 13 2 6 opt for ml in practice notes slides raw master slides lecture12 pdf q a projects videos public playlist of 2021 videos youtube https www youtube com playlist list pl4o4bxki faeyrsbqtuyn2xmjjaqlfqzx playlist of 2022 videos epfl internal https tube switch ch switchcast epfl ch series 4fab28ac 1c8f 4632 8d01 e128746b7a1d playlist of 2023 videos 
epfl internal https mediaspace epfl ch channel cs 439 optimization for machine learning 31980 exercises the weekly exercises tree master labs consist of a mix of theoretical and practical python exercises for the corresponding topic each week starting week 2 solutions to exercises are available in the lab folder project a mini project will focus on the practical implementation here we encourage students to investigate the real world performance of one of the studied optimization algorithms or variants helping to provide solid empirical evidence for some behaviour aspects on a real machine learning task the project is mandatory and done in groups of 3 students it will count 30 to the final grade project reports 3 page pdf are due june 16th here is a detailed project description raw master labs mini project miniproject description pdf assessment final written exam on monday 03 07 2023 from 15h15 to 18h15 co2 co3 format closed book theoretical questions similar to exercises you are allowed to bring one cheat sheet a4 size paper both sides can be used for practice exams 2022 raw master exams exam2022 pdf 2021 raw master exams exam2021 pdf 2020 raw master exams exam2020 pdf 2019 raw master exams exam2019 pdf 2018 raw master exams exam2018 pdf solutions 2022 raw master exams exam2022solutions pdf 2021 raw master exams exam2021solutions pdf 2020 raw master exams exam2020solutions pdf 2019 raw master exams exam2019solutions pdf 2018 raw master exams exam2018solutions pdf links to related courses and materials cmu 10 725 https www stat cmu edu ryantibs convexopt f18 berkeley ee 227c https ee227c github io recommended books convex optimization algorithms and complexity https arxiv org pdf 1405 4980 pdf by s bastien bubeck free online convex optimization http stanford edu boyd cvxbook stephen boyd and lieven vandenberghe free online introductory lectures on convex optimization http citeseerx ist psu edu viewdoc download doi 10 1 1 693 855 rep rep1 type pdf yurii nesterov free online
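as a small illustration of the kind of algorithm covered in the first weeks of the course, here is a hedged python sketch of plain gradient descent on a least squares objective, it is not taken from the course labs and the problem setup is invented purely for the example

import numpy as np

# invented toy least squares problem: minimize f(x) = 0.5 * ||A x - b||^2
rng = np.random.default_rng(0)
A = rng.normal(size=(100, 5))
x_true = rng.normal(size=5)
b = A @ x_true + 0.01 * rng.normal(size=100)

def grad(x):
    # gradient of the least squares objective
    return A.T @ (A @ x - b)

# step size 1/L, with L the spectral norm of A squared (a Lipschitz constant of the gradient)
step = 1.0 / np.linalg.norm(A, 2) ** 2

x = np.zeros(5)
for _ in range(500):
    x = x - step * grad(x)

# distance to the planted solution shrinks as the iterations proceed
print(np.linalg.norm(x - x_true))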
ai
protocol
mozilla protocol protocol is a design system for mozilla and firefox websites it establishes a common design language provides reusable coded components and outlines high level guidelines for content and accessibility https protocol mozilla org protocol is still an evolving project currently it s used primarily by the mozilla marketing websites team as the front end for www mozilla org https www mozilla org the long term goal is to provide a robust unified design system that anyone at mozilla can use to build an on brand website if you re interested in using protocol on your project let us know and we can help you you can find us in protocol design system on mozilla s slack for mozillians or in protocol design system on matrix https chat mozilla org open to the public also feel free to file an issue on github https github com mozilla protocol issues current npm package version https img shields io npm v mozilla protocol core total downloads on npm https img shields io npm dt mozilla protocol core pull requests welcome https img shields io badge prs welcome brightgreen getting started protocol is built on the node js https nodejs org platform and published to npm https www npmjs com so be sure to have both installed before proceeding installation to use protocol in your website you can install the core package directly from npm npm install mozilla protocol core save alternatively you can also download the latest release https github com mozilla protocol releases latest from github usage once installed the relevant css javascript and asset files will be available in your project under node modules mozilla protocol core the core css file is bundled as protocol css which contains styling for things such as basic elements and typography as well as some global components like navigation and a footer other component and layout css is bundled as protocol components css for convenience however these pre compiled css files include the entire pattern library which you may not need we recommend compiling your own styles from the source sass files also included in the published package that allows you to configure protocol to include just the styles and components you need for each page of your website make it run to build protocol from source and run the documentation site locally you can clone the repo from github git clone https github com mozilla protocol git cd protocol npm install running npm install will install dependencies then npm run webpack this will compile the sass and copy assets into a local folder in preparation to run the server it also starts a watch process that will watch those files and automatically recompile when they change in another command line console and still within the protocol folder run npm start this will build the site locally and start the development server at http localhost 3000 building the website to build the protocol documentation site for deployment run npm run build docs building the npm package we use a webpack https webpack js org configuration for building the contents of the npm package ready for publishing to build the package run npm run build package this will install dependencies lint css js files and then build the package content in the package directory running tests to perform the package build process above and then run front end js tests against the processed files npm test publishing to npm protocol is published to npm under the mozilla protocol core namespace package name to publish a release to npm use the following steps 1 before you start 
make sure the project s changelog md https github com mozilla protocol blob main changelog md is up to date 2 update the package version number in assets package package json https github com mozilla protocol blob main assets package package json use semantic versioning https semver org to determine what the new version number should be 3 update the package readme assets package readme md https github com mozilla protocol blob main assets package readme md 4 update the package version number in the root package json https github com mozilla protocol blob main package json file and then run npm install to update the package lock json file 5 submit a pull request with your changes or commit directly to main if you have permission once the changes have been merged to main 6 tag a new release you can do this either using git tag https git scm com book en v2 git basics tagging or directly on the github website https github com mozilla protocol releases latest 7 run npm run build package npm test to run the build script and front end tests the package contents will be located in package 8 if the build is successful and all tests pass publish to npm using npm publish package deployment note the following instructions assume the mozilla repository is the remote called origin pushing to production each time an updated package is published to npm https protocol mozilla org should also be updated so the documentation site matches the npm package features 1 verify all is good on the staging site https main mozilla protocol netlify app 2 make sure your local main branch is up to date 3 push the main branch to the prod branch git push origin main prod a notice will be posted in www notify on slack when the push has completed pushing to demo for previewing new components before they are merged to main two demo instances are available 1 push your branch to the demo1 or demo2 branches e g git push f origin my branch name demo1 2 your branch will be published https demo1 mozilla protocol netlify com https demo2 mozilla protocol netlify com a notice will be posted in www notify on slack when the push has completed
os
web-dev-hactoberfest-2023
hacktoberfest 2023 this is the repository i m hosting for hacktoberfest 2023 to help my college mates embark on their journey in open source hf10 vert fcl cmyk https github com ankitmrmishra hacktoberfest2023 assets 68045075 2af8577c eadc 42ae 8845 a9e98323f322 alert do not spam in the repository i ll make sure you get disqualified from hacktoberfest if any such behaviour is found please read contributing md contributingguide md and code of conduct md code of conduct md for details on our code of conduct and the process for submitting pull requests to us what is in the repo and what to contribute in this repo you will find different folders with different tasks do check out each task and each folder to contribute no plagiarism is tolerated the pull request must be clean and made in a proper way if you don t know how to do this learn it from youtube chatgpt and other sources do not spam here 1 person can contribute only one time in a particular folder the challenges your challenge is to build out this results summary component and get it looking as close to the design as possible you can use any tools you like to help you complete the challenge so if you ve got something you d like to practice feel free to give it a go we provide the data for the results in a local data json file so you can use that to add the results and total score dynamically if you choose your users should be able to view the optimal layout for the interface depending on their device s screen size see hover and focus states for all interactive elements on the page bonus use the local json data to dynamically populate the content want some support on the challenge join our slack community https www frontendmentor io slack and ask questions in the help channel where to find everything your task is to build out the project to the designs inside the design folder you will find both a mobile and a desktop version of the design the designs are in jpg static format using jpgs will mean that you ll need to use your best judgment for styles such as font size padding and margin all the required assets for this project are in the assets folder the images are already exported for the correct screen size and optimized there is also a style guide md file containing the information you ll need such as color palette and fonts building your project feel free to use any workflow that you feel comfortable with below is a suggested process but do not feel like you need to follow these steps 1 initialize your project as a public repository on github https github com creating a repo will make it easier to share your code with the community if you need help if you re not sure how to do this have a read through of this try git resource https try github io 2 configure your repository to publish your code to a web address this will also be useful if you need some help during a challenge as you can share the url for your project with your repo url there are a number of ways to do this and we provide some recommendations below 3 look through the designs to start planning out how you ll tackle the project this step is crucial to help you think ahead for css classes to create reusable styles 4 before adding any styles structure your content with html writing your html first can help focus your attention on creating well structured content 5 write out the base styles for your project including general content styles such as font family and font size 6 start adding styles to the top of the page and work down only move on to the next section
once you re happy you ve completed the area you re working on deploying your project as mentioned above there are many ways to host your project for free our recommend hosts are github pages https pages github com vercel https vercel com netlify https www netlify com create a custom readme md we strongly recommend overwriting this readme md with a custom one we ve provided a template inside the readme template md readme template md file in this starter code the template provides a guide for what to add a custom readme will help you explain your project and reflect on your learnings please feel free to edit our template as much as you like once you ve added your information to the template delete this file and rename the readme template md file to readme md that will make it show up as your repository s readme file thanks to frontend mentor for the challenge and helping us in the hacktoberfest have fun building
hacktoberfest hacktoberfest2023
front_end
TTOW0211-221-game-development
game development module ttow0211 ttow0221 lecturers paavo nelimarkka jouni huotari visiting lecturers contact firstname lastname jamk fi class d330 in the afternoons special lectures in d407 on mondays and tuesdays at 8 30 11 30 course grading https github com jamk it ttow0211 221 game development wiki course grading slack https jamk it slack com and the channel is ttow0211 221 https jamk it slack com messages ttow0211 221 assignment return folders in optima https optima jamk fi check if you have enrolled to both the courses through asio system if not and willing please tell us if you are only willing to do the basics part please tell us assignments final return instructions basics of game development pre course assignments https github com jamk it ttow0211 221 game development wiki pre course assignments if any questions ask paavonelimarkka on slack 1 game design assignments https github com jamk it ttow0211 221 game development wiki game design assignments 2 exercises and assignments for game programming with unity 3 game graphics assignments game project https github com jamk it ttow0211 221 game development blob master info project md for both courses game developers journal instructions for writing learning report http homes jamk fi huojo opetus iio50z learningreport pdf course timetable lectures in d407 08 30 11 30 lessons in d330 starting from 12 15 day subjects lecturer misc 8 2 pre course assignments https github com jamk it ttow0211 221 game development wiki pre course assignments questions paavonelimarkka on slack 6 3 lecture game project experiences dmitri z course introduction https docs google com presentation d 15lzkxx7tdeppj7uol1ygysg7cmljwem0od90irdrzem edit usp sharing introducing yourselves pre course assignemnts https github com jamk it ttow0211 221 game development wiki pre course assignments game evaluation assignment https github com jamk it ttow0211 221 game development wiki game evaluation assignment game idea development https github com jamk it ttow0211 221 game development wiki game idea development paavo nelimarkka 7 3 lecture manuscripting synopsis ilari miikkulainen agile methodologies tools https github com jamk it ttow0211 221 game development wiki agile methodologies team formation team communication idea pitching https github com jamk it ttow0211 221 game development wiki team formation synopsis https github com jamk it ttow0211 221 game development wiki assignments synopsis paavo nelimarkka 13 3 lecture game design post mortem analysis klaus k ri inen post mortem analysis https github com jamk it ttow0211 221 game development wiki post mortem analysis blue arrow awards in d407 13 30 https www bluearrowawards com paavo nelimarkka remember your synopsis 14 3 lecture game design theory klaus k ri inen game idea pitching game prototype https github com jamk it ttow0211 221 game development wiki game pitch paper prototype start gdd https github com jamk it ttow0211 221 game development wiki game design document 2 game design document gpp https github com jamk it ttow0211 221 game development wiki game project plan paavo nelimarkka 20 3 lecture game graphics mikko tyni tuomas roininen expa http www expa fi game idea pitching and prototype reviews paavo nelimarkka 21 3 lecture game graphics mikko tyni git https github com jamk it ttow0211 221 game development wiki material git unity3d survival shooter https unity3d com learn tutorials projects survival shooter tutorial project work expa gathering https www facebook com events 1624706311166333 paavo nelimarkka 
expa gathering https www facebook com events 1624706311166333 27 3 lecture game graphics mikko tyni unity3d survival shooter https unity3d com learn tutorials projects survival shooter tutorial project work paavo nelimarkka 28 3 lecture game graphics mikko tyni project work paavo nelimarkka ilari s lecture cancelled and rescheduled for next week 3 4 lecture game project execution mikko tyni game sound music production d330 project work ilari miikkulainen paavo nelimarkka 4 4 lecture entrepreneurship in game industry mikko tyni project work paavo nelimarkka 10 4 lecture marketing in game industry klaus k ri inen project work paavo nelimarkka 11 4 lecture game business industry mika karhulahti project work paavo nelimarkka 17 4 easter break 18 4 project work paavo nelimarkka 24 4 guest lecturers overmare studios various lecturers group discussions 12 30 14 00 finishing the project remember the demo video presentation paavo nelimarkka course feedback in asio 25 4 game presentations in auditorium course feedback in asio group discussions if still needed everyone who s interested expa gathering please note there will be changes in this timetable learning material resources here is a link https github com jamk it ttow0211 221 game development wiki material for our material resources wiki page if you find something useful that could be linked there please inform paavonelimarkka on slack assignments pre course assignments https github com jamk it ttow0211 221 game development wiki pre course assignments misc links https www reddit com r unity3d https education github com pack
timetable
server
ITAM
p align center a href https laravel com target blank img src https raw githubusercontent com laravel art master logo lockup 5 20svg 2 20cmyk 1 20full 20color laravel logolockup cmyk red svg width 400 alt laravel logo a p p align center a href https github com laravel framework actions img src https github com laravel framework workflows tests badge svg alt build status a a href https packagist org packages laravel framework img src https img shields io packagist dt laravel framework alt total downloads a a href https packagist org packages laravel framework img src https img shields io packagist v laravel framework alt latest stable version a a href https packagist org packages laravel framework img src https img shields io packagist l laravel framework alt license a p about laravel laravel is a web application framework with expressive elegant syntax we believe development must be an enjoyable and creative experience to be truly fulfilling laravel takes the pain out of development by easing common tasks used in many web projects such as simple fast routing engine https laravel com docs routing powerful dependency injection container https laravel com docs container multiple back ends for session https laravel com docs session and cache https laravel com docs cache storage expressive intuitive database orm https laravel com docs eloquent database agnostic schema migrations https laravel com docs migrations robust background job processing https laravel com docs queues real time event broadcasting https laravel com docs broadcasting laravel is accessible powerful and provides tools required for large robust applications learning laravel laravel has the most extensive and thorough documentation https laravel com docs and video tutorial library of all modern web application frameworks making it a breeze to get started with the framework you may also try the laravel bootcamp https bootcamp laravel com where you will be guided through building a modern laravel application from scratch if you don t feel like reading laracasts https laracasts com can help laracasts contains over 2000 video tutorials on a range of topics including laravel modern php unit testing and javascript boost your skills by digging into our comprehensive video library laravel sponsors we would like to extend our thanks to the following sponsors for funding laravel development if you are interested in becoming a sponsor please visit the laravel patreon page https patreon com taylorotwell premium partners vehikl https vehikl com tighten co https tighten co kirschbaum development group https kirschbaumdevelopment com 64 robots https 64robots com cubet techno labs https cubettech com cyber duck https cyber duck co uk many https www many co uk webdock fast vps hosting https www webdock io en devsquad https devsquad com curotec https www curotec com services technologies laravel op gg https op gg webreinvent https webreinvent com utm source laravel utm medium github utm campaign patreon sponsors lendio https lendio com contributing thank you for considering contributing to the laravel framework the contribution guide can be found in the laravel documentation https laravel com docs contributions code of conduct in order to ensure that the laravel community is welcoming to all please review and abide by the code of conduct https laravel com docs contributions code of conduct security vulnerabilities if you discover a security vulnerability within laravel please send an e mail to taylor otwell via taylor laravel com mailto taylor 
laravel com all security vulnerabilities will be promptly addressed license the laravel framework is open sourced software licensed under the mit license https opensource org licenses mit
server
oneai-node
logo oneai logo light cropped svg https oneai com utm source open source utm medium node sdk readme natural language processing api api key https img shields io badge 20 get 20your 20api 20key 20for 20free 231d1c29 logo data image svg 2bxml base64 phn2zyb3awr0ad0imteyiibozwlnahq9ijg4iib2awv3qm94psiwidagmteyidg4iibmawxspsjub25liib4bwxucz0iahr0cdovl3d3dy53my5vcmcvmjawmc9zdmcipgo8cgf0acbmawxslxj1bgu9imv2zw5vzgqiignsaxatcnvszt0izxzlbm9kzcigzd0itty5lji5mtmgmzcuntu4nkwzni4xmtqzidg3ljc1mtzimtayljq2oew2os4yotezidm3lju1odzaiibmawxspsijmdbgrkzgii8 cjxwyxroigzpbgwtcnvszt0izxzlbm9kzcigy2xpcc1ydwxlpsjldmvub2rkiibkpsjnodkumzawoca1mc4yotixsdexms4xnjrwmeg4os4zmda4vjuwlji5mjfaiibmawxspsijrjizreu5ii8 cjxwyxroigzpbgwtcnvszt0izxzlbm9kzcigy2xpcc1ydwxlpsjldmvub2rkiibkpsjnmjuuotewocaxms42ndc5qzexljywmdcgmteunjq3osawidizlji0odygmcazny41ntg3qzagnteuody4ocaxms42mda3idyzljq3mdegmjuuotewoca2my40nzaxqzqwljiymdkgnjmundcwmsa1ms44mje2iduxljg2odggnteuodixniazny41ntg3qzuxljgymtygmjmumjq4nia0mc4ymja5idexljy0nzkgmjuuotewocaxms42ndc5iibmawxspsijneq0rezgii8 cjwvc3znpgo https studio oneai com settings api keys build https github com oneai nlp oneai node actions workflows main yml badge svg coverage status https coveralls io repos github oneai nlp oneai node badge svg branch main https coveralls io github oneai nlp oneai node branch main version https img shields io npm v oneai svg https www npmjs org package oneai downloads https img shields io npm dm oneai svg https www npmjs com package oneai license https img shields io npm l oneai svg https www npmjs com package oneai try on runkit https badge runkitcdn com oneai svg https runkit com npm oneai discord https img shields io discord 941458663493746698 logo discord https discord gg arpmha9n8h one ai provides natural language processing for node js some use cases include summarize conversations and articles detect sentiments and emotions detect topics and classify content alt text https s4 gifyu com images screen recording 10 6 2022 at 6 55 pm gif summarize example of summarizing a zoom meeting node const conversation speaker mario utterance we need to increase our coin collection rate speaker luigi utterance agreed mario let s invest in more power ups boring meeting const pipeline new oneai pipeline oneai skills summarize pipeline run conversation output luigi and mario agreed to fix the plumbing at 9am on friday see the full documentation here https studio oneai com docs utm source open source utm medium node sdk readme getting started installation npm install oneai authentication you will need a valid api key for all requests register and create a key for your project in one ai studio https studio oneai com utm source open source utm medium node sdk readme as a security measure we only show the key once so make sure to keep it somewhere safe example node import oneai from oneai oneai new oneai your api key const pipeline new oneai pipeline oneai skills names oneai skills summarize min length 20 oneai skills highlights const output await pipeline run analyze this text console log output pipeline api the pipeline api enables analyzing and transforming text using various skills a skill is a package of trained nlp models available via api which accept text from various language sources as input and respond with processed texts and extracted metadata chaining skills together creates a pipeline oneai studio the best way to create a pipeline is to use our studio https studio oneai com utm source open source utm medium node sdk readme where you can craft a pipeline using an easy 
graphical interface and then paste the generated code back into your repository basic example let s say you re interested in extracting keywords from the text node const pipeline new oneai pipeline oneai skills keywords const output await pipeline run analyze this text console log output multi skills request let s say you re interested in extracting keywords and emotions from the text node const pipeline new oneai pipeline oneai skills keywords oneai skills emotions const output await pipeline run analyze this text console log output analyzer skills vs generator skills skills can do either text analysis and then their output are labels and spans labels location in the analyzed text or they can be generator skills in which case they transform the input text into an output text here s an example for a pipeline that combines both type of skills it will extract keywords and emotions from the text and then summarize it node const pipeline new oneai pipeline oneai skills keywords oneai skills emotions oneai skills summarize const output await pipeline run analyze this text console log output order is important when the pipeline is invoked it is invoked with an original text you submit if a generator skill is ran then all following skills will use its generated text rather then the original text in this example for instance we change the order of the pipeline from the previous example and the results will be different instead of extracting keywords and emotions from the original text keywords and emotions will be extracted from the generated summary node const pipeline new oneai pipeline oneai skills summarize oneai skills keywords oneai skills emotions const output await pipeline run analyze this text console log output configuring skills many skills are configurable as you can find out in the docs https studio oneai com docs utm source open source utm medium node sdk readme let s use the exact same example this time however we ll limit the summary length to 50 words node const pipeline new oneai pipeline oneai skills summarize max length 50 oneai skills keywords oneai skills emotions const output await pipeline run analyze this text console log output output the structure of the output is dynamic and corresponds to the skills used whether they are generators or analyzers and their order in the pipeline each output object contains the input text which can be the original input or text produced by generator skills and a list of labels detected by analyzer skills that contain the extracted data let s say we run this code node const text the hitchhiker s guide to the galaxy is a science fiction comedy radio series written by douglas adams const pipeline new oneai pipeline oneai skills names oneai skills summarize min length 20 oneai skills names const output await pipeline run text console log output in plain english we extract names from the text then summarize it and then extract names from the summary here s what the reponse would look like the important thing to notice whenever a generator skill runs summarize in this case all following skills responses will be embedded within the generator result as it changes the text the skill processes json text the hitchhiker s guide to the galaxy is a science fiction comedy radio series written by douglas adams names this array will contain the names detected in the original text type name label type name work of art label class value the hitchhiker s guide to the galaxy label value output spans label spans where the name was detected in the text section 
0 start 0 end 36 summary this actual summary text the hitchhiker s guide to the galaxy is a science fiction comedy the names detected in the summary names type name name work of art value the hitchhiker s guide to the galaxy output spans section 0 start 0 end 36 file uploads our api supports the following file extensions txt text content json conversations in the one ai conversation format srt analyze captions as conversations wav mp3 audio files to be transcribed analyzed jpg detect text in pictures via ocr upload a file via the pipeline runfile method i e node const filepath example txt const pipeline new oneai pipeline const output await pipeline runfile filepath support feel free to submit issues in this repo contact us at devrel oneai com or chat with us on discord https discord gg arpmha9n8h
language language-ai nlp nodejs oneai ai api artificial-intelligence natural-language-processing summarization text text-classification text-analysis entity-detection sentiment-analysis topic-detection transcription
ai
Learn-VS-Code
learn vs code learn vs code cover image images cover png head over to learn vs code https www learnvscode com to learn everything you need to know about the most popular editor in web development in just a few years visual studio code has become the most popular editor for web development in short it is open source cross platform full of functionality and has an amazing community behind it many of the biggest names in web development have already made the switch and so should you if you re looking to learn the ins and outs of vs code while increasing your efficiency and proficiency as developer this course is perfect for you whether you re brand new to web development or a seasoned veteran there s something for everyone shortcut references will use ctrlcmd to represent the control key on windows and the command key on mac if the shortcuts on windows vs mac are different they will be separated by the character with mac coming first sample projects to use during demos build a quiz app with html css and javascript https github com jamesqquick build a quiz app with html css and javascript design and build a chat application with socket io https github com jamesqquick design and build a chat application with socket io color lighten darken tools https github com jamesqquick lighten darken color tool 1 getting started sections gettingstarted md in this section we will discuss available resources the what and why behind vs code and how to download and install learn more sections layoutsandshortcuts md 2 layouts and shortcuts sections layoutsandshortcuts md in this section we will open vs code for the first and explore the layout we will explore various shortcuts for customizing your layout working with files manipulating text and more learn more sections layoutsandshortcuts md 3 customization sections customization md in this section we will explore how to customize your editor through keymaps shortcuts settings extensions and themes learn more sections customization md 4 writing and formatting code sections writingandformattingcode md in this section we learn how to create snippets work with markdown documents organize code format code and lint code learn more sections writingandformattingcode md 6 integrated terminal sections integratedterminal md in this section we will learn how to use and customize the integrated terminal learn more sections integratedterminal md 7 working with git sections workingwithgit md in this section we will learn how to take advantage of the built in tooling for git as well as useful extensions for working with git as well learn more sections workingwithgit md 8 debugging sections debugging md in this section we will learn how to debug front end javascript and back end javascript as well as angular react and vue applications learn more sections debugging md 9 extras sections extras md in this section we will learn about extra misc features of vs code learn more sections extras md
front_end
tutorials
machine learning with torch7 all the documentation for these tutorials is available on this page https github com clementfarabet ipam tutorials tree master th tutorials
ai
rgb
rgb react gin blog this is a simple web blog created using react for the frontend and the gin golang framework for the backend this repository is created as support for the guide https letscode blog category gin golang and react web app guide about implementing a gin backend from scratch
front_end
blockchain
blockchain learning about blockchain step by step
blockchain
Mastering-Blockchain-Second-Edition
mastering blockchain second edition this is the code repository for mastering blockchain second edition https www packtpub com big data and business intelligence mastering blockchain second edition utm source github utm medium repository utm campaign 9781788839044 published by packt https www packtpub com it contains all the supporting project files necessary to work through the book from start to finish about the book a blockchain is a distributed ledger that is replicated across multiple nodes and enables immutable transparent and cryptographically secure record keeping of transactions the blockchain technology is the backbone of cryptocurrencies and it has applications in finance government media and almost all other industries mastering blockchain second edition has been thoroughly updated and revised to provide a detailed description of this leading technology and its implementation in the real world this book begins with the technical foundations of blockchain technology teaching you the fundamentals of distributed systems cryptography and how it keeps data secure you will learn about the mechanisms behind cryptocurrencies and how to develop applications using ethereum a decentralized virtual machine you will also explore different other blockchain solutions and get an introduction to business blockchain frameworks under hyperledger a collaborative effort for the advancement of blockchain technologies hosted by the linux foundation you will also be shown how to implement blockchain solutions beyond currencies internet of things with blockchain blockchain scalability and the future scope of this fascinating and powerful technology instructions and navigations all of the code is organized into folders each folder starts with a number followed by the application name for example chapter02 the code will look like the following pragma solidity 0 4 0 contract teststruct struct trade uint tradeid uint quantity uint price string trader this struct can be initialized and used as below trade tstruct trade tradeid 123 quantity 1 price 1 trader equinox all examples in this book have been developed on ubuntu 16 04 1 lts xenial and macos version 10 13 2 as such it is recommended to use ubuntu or any other unix like system however any appropriate operating system either windows or linux can be used but examples especially those related to installation may need to be changed accordingly examples related to cryptography have been developed using the openssl 1 0 2g 1 mar 2016 command line tool ethereum solidity examples have been developed using remix ide available online at https remix ethereum org ethereum byzantine release is used to develop ethereum related examples at the time of writing this is the latest version available and can be downloaded from https www ethereum org examples related to iot have been developed using a raspberry pi kit by vilros but any aapropriate latest model or kit can be used specifically raspberry pi 3 model b v 1 2 has been used to build the hardware example of iot node js v8 9 3 and npm v5 5 1 have been used to download related packages and run node js server for iot examples the truffle framework has been used in some examples of smart contract deployment and is available at http truffleframework com any latest version available via npm should be appropriate related products mastering blockchain https www packtpub com big data and business intelligence mastering blockchain utm source github utm medium repository utm campaign 9781787125445 building blockchain projects 
https www packtpub com big data and business intelligence building blockchain projects utm source github utm medium repository utm campaign 9781787122147 blockchain and cryptocurrency bitcoin ethereum essentials video https www packtpub com application development blockchain and cryptocurrency bitcoin ethereum essentials video utm source github utm medium repository utm campaign 9781788990837
blockchain bitcoin ethereum cryptocurrency
blockchain
EmployeeDatabase_SQL
employee database assessment conducting data modeling data engineering and data analysis using sql and pandas overview the objective of this project is to perform a research analysis of employees hired during the 1980s and 1990s at pewlett hackard a fictional company all that remains of the employee database from that period are six csv files those six csv files will be used to design tables which will be then imported into a sql database so that the research assessment can be conducted the research the research is broken down into 3 sections data modeling data engineering and data analysis with a bonus assessment done using sqlalchemy following is a breakdown of each of those sections data modeling based on the structure of the csv files in the data folder an erd sketch was created using quickdbd img src https github com wayburke employeedatabase sql blob main employeesql erd quickdbd export png data engineering from the sketched erd a schema was generated by quickdbd that was altered to my preference in schema creation the csv files were then imported into the tables created the modified sql schema can be view at this link https github com wayburke employeedatabase sql blob main employeesql employeesql schema quickdbd sql data analysis queries responses were provided for the 8 questions below the sql queries can be viewed here https github com wayburke employeedatabase sql blob main employeesql employeesql 20 20selectstatements sql the questions are as follows 1 list the following details of each employee employee number last name first name sex and salary 2 list the following details of employees who were hired in 1986 first name last name and hire date 3 list the manager of each department along with their department number department name the manager s employee number last name and first name 4 list the department number of each employee along with that employee s employee number last name first name and department name 5 list all employees whose first name is hercules and whose last names begins with the letter b 6 list all employees in the sales department including their employee number last name and first name 7 list all employees in the sales and development departments including their employee number last name first name and department name 8 list the frequency count in descending order of all the employee by last names i e how many employees share each last name bonus analysis using jupyter notebook this jupyter notebook https github com wayburke employeedatabase sql blob main employeesql employeesql ipynb used the database created in data engineering phase to create a connection using sqlalchemy create a histogram of the employee salary ranges img src https github com wayburke employeedatabase sql blob main employeesql images frequency 20of 20salaries png create a bar chart of the average salaries per job title img src https github com wayburke employeedatabase sql blob main employeesql images average 20salary 20by 20title png folders files included data folder erd folder employeesql ipynb employeesql selectstatements sql employeesql schema quickdbd sql images folder
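to illustrate the bonus analysis described above, here is a rough python sketch using sqlalchemy and pandas, the connection string, table names and join columns (titles, salaries, employees, emp_title_id) are assumptions for illustration since the schema and notebook are not reproduced here

import pandas as pd
import matplotlib.pyplot as plt
from sqlalchemy import create_engine

# hypothetical local postgres database, adjust the credentials and database name to your setup
engine = create_engine("postgresql://user:password@localhost:5432/employees_db")

# histogram of employee salary ranges
salaries = pd.read_sql("SELECT salary FROM salaries", engine)
salaries["salary"].plot(kind="hist", bins=20, title="frequency of salaries")
plt.show()

# bar chart of average salary per job title, the join columns are assumed
avg_by_title = pd.read_sql(
    "SELECT t.title, AVG(s.salary) AS avg_salary "
    "FROM titles t "
    "JOIN employees e ON e.emp_title_id = t.title_id "
    "JOIN salaries s ON s.emp_no = e.emp_no "
    "GROUP BY t.title",
    engine,
)
avg_by_title.plot(kind="bar", x="title", y="avg_salary", title="average salary by title")
plt.show()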
csv-import database pandas-python sql sqlalchemy
server
PracticalMalwareAnalysisProblems
practicalmalwareanalysislabproblems a collection of ida pro reverse engineering databases for the lab problems in practical malware analysis by michael sikorski and andrew honig some units labs are missing due to ida pro not saving some databases properly and because of data corruption caused by a paranoid anti virus program binaries the binaries that were reversed for these problems can be found here https github com mikesiko practicalmalwareanalysis labs
server
Spotway_etl_pipeline
spotway data engineering project end to end etl pipeline built using python aws cloud technologies lambda s3 glue athena cloudwatch and spotify web api what is spotway spotway is an end to end etl extract transform load pipeline built using python spotify s web api and aws cloud technologies to see what songs are currently popular and trendy so that you are always up to date on the latest tunes tech stack skils used 1 python 2 spoity web api 3 aws lambda 4 aws s3 5 aws glue 6 aws cloudwatch 7 aws athena pipeline architecture the architecture diagram created below highlights and breaks down the etl pipeline into different stages we access the spotify api and use python to extract data from the api to build our pipeline we use aws cloudwatch to have a daily trigger which it is then extracted on aws lambda the extracted data is stored on a bucket in aws s3 the raw extracted data on s3 is placed on aws lambda once again and is transformed by the lambda function and stored on s3 the transformed data is crawled by aws glue and can be accessed and used for data analytics on aws athena screenshot 2023 05 29 at 4 00 19 pm https github com anujgarlapati spotway etl pipeline assets 59670482 97934928 cbc9 4dfa b14a f263bed2d349 building the etl pipeline before starting to build the etl pipeline we must require access to the dataset the dataset used for this data engineering project is spotify s web api an account must be created to access this api and this will allow us to get the credentials of both the client id and client secret these tokens can be accessed as seen in the image below screenshot 2023 05 31 at 2 58 51 pm https github com anujgarlapati spotway etl pipeline assets 59670482 38653e7d 6583 46ea a72c 60f13ed7a7ac spotify extract data api py this python file code is used to extract data from spotify s web api the code is run on aws lambda where it is then stored in a lambda function and is used for the initial stages of our etl pipeline the code can be broken down as follows the packages necessary for the lambda spotify extraction data api function import json import os import spotipy from spotipy oauth2 import spotifyclientcredentials import boto3 from datetime import datetime inside the lambda function both the client id and client secret are stored in aws lambda environment variables to keep confidentiality an object is created in order to extract data def lambda handler event context client id os environ get client id client secret os environ get client secret spotify credentials spotifyclientcredentials client id client id client secret client secret spotify ob spotipy spotify client credentials manager spotify credentials playlists spotify ob user playlists spotify more specifically we are extracting data from the global top 50 playlists where the url is being extracted so that we access the playlist data top50 playlist link https open spotify com playlist 37i9dqzevxbmdohdwvn2tf top50 playlist url top50 playlist link split 1 top50 data spotify ob playlist tracks top50 playlist url boto3 provides a python api for aws cloud infrastructure services in the following code we store the raw data in a bucket in s3 client boto3 client s3 filename final raw data str datetime now json client put object bucket spotway data pipeline project anuj key spotify data raw process required filename body json dumps top50 data the code is seen in aws lambda as follows screenshot 2023 05 31 at 3 43 12 pm https github com anujgarlapati spotway etl pipeline assets 59670482 596ea0b2 6d2f 494c 9667 128a914741cb 
in amazon cloudwatch we set a daily trigger as we are in need of extracting data once every day as the playlist data is ever changing this can be done by adding a trigger in the function overview screenshot 2023 06 08 at 9 03 36 pm https github com anujgarlapati spotway etl pipeline assets 59670482 d0f8a1df 1b9a 4f73 bf31 4db400da9668 spotify data transformation py once raw data is extracted daily from spoitfy s web api and stored on a bucket in s3 we must have a transformation function in aws lambda as the second part of our etl pipeline the first part of transforming the extracted data is that we are simply first looking into different characteristics of the data we have multiple functions in this python file where the first lambda function focuses on the album data import json import boto3 from datetime import datetime from io import stringio import pandas as pd def spotifyalbum top50 data top50 list album for line in top50 data items album id top50 line track album id album name top50 line track album name album releasedate top50 line track album release date album totaltracks top50 line track album total tracks album url top50 line track album external urls spotify converting into dictonary album dict id album id top50 name album name top50 release date album releasedate top50 total tracks album totaltracks top50 url album url top50 top50 list album append album dict return top50 list album lambda function for artist data def spotifyartist top50 data top50 list artist for line in top50 data items for key value in line items if key track for artist in value artists artist dict artist id artist id artist name artist name url artist href top50 list artist append artist dict return top50 list artist lambda function for songs data def spotifysongs top50 data song list top50 for line in top50 data items song id top50 line track id song name top50 line track name song duration top50 line track duration ms song url top50 line track external urls spotify song popularity top50 line track popularity song added line added at converting into dictonary song dict id song id top50 name song name top50 duration song duration top50 url song url top50 popularity song popularity top50 song added song added song list top50 append song dict return song list top50 once we have each function for different parts of the playlist data we can then call each of the functions in the lambda handler where the transformed data will be placed in a bucket in s3 the code is below def lambda handler event context s3 boto3 client s3 bucket spotway data pipeline project anuj key spotify data raw process required spotify data spotify keys for file in s3 list objects bucket bucket prefix key contents file key file key if file key split 1 json response s3 get object bucket bucket key file key content response body jsonobject json loads content read spotify data append jsonobject spotify keys append file key for top50 data in spotify data top50 list album spotifyalbum top50 data top50 list artist spotifyartist top50 data song list top50 spotifysongs top50 data top 50 album df pd dataframe from dict top50 list album top 50 album df top 50 album df drop duplicates subset id top 50 artist df pd dataframe from dict top50 list artist top 50 artist df top 50 artist df drop duplicates subset artist id top50 song df pd dataframe from dict song list top50 top 50 album df release date pd to datetime top 50 album df release date top50 song df song added pd to datetime top50 song df song added top50 songkey spotify transformed data 
spotify songs data songs transformed data str datetime now csv song buffer stringio top50 song df to csv song buffer index false song content song buffer getvalue s3 put object bucket bucket key top50 songkey body song content top50 albumkey spotify transformed data spotify albums data album transformed data str datetime now csv album buffer stringio top 50 album df to csv album buffer index false album content album buffer getvalue s3 put object bucket bucket key top50 albumkey body album content top50 artistkey spotify transformed data spotify artist data artist transformed data str datetime now csv artist buffer stringio top 50 artist df to csv artist buffer index false artist content artist buffer getvalue s3 put object bucket bucket key top50 artistkey body artist content s3 resource boto3 resource s3 for key in spotify keys copy source bucket bucket key key s3 resource meta client copy copy source bucket spotify data raw processed key split 1 s3 resource object bucket key delete the trigger for spotify data transformation function is as follows screenshot 2023 06 08 at 9 03 46 pm https github com anujgarlapati spotway etl pipeline assets 59670482 dd4bda16 5cf3 48bf 9faf 30f96dcbf3fd loading the data in the etl pipeline once both the extraction and transformation phases are done for the etl pipeline the last and final phase of the etl pipeline is to load the data to load the data we must use the crawler in aws glue to connect a datastore to create metadata tables there are three different crawlers that we have created to load the data in a catalog the artist song and album data this can be seen in the image down below screenshot 2023 05 31 at 4 30 00 pm https github com anujgarlapati spotway etl pipeline assets 59670482 a1ec777e 0e57 48e7 a4c5 549d051e01b8 once the data is loaded into a data catalog by aws glue s crawler we can then access the data using aws athena for data analytics we are able to perform a multitude of tasks which include even running sql queries to sort and organize the data this can be seen down below screenshot 2023 05 31 at 4 39 07 pm https github com anujgarlapati spotway etl pipeline assets 59670482 3373915e 7ebc 40f0 83ae dd0f0697ed0d conclusion key takeways this project had deeply taught me how to create an etl pipeline to extract transform and load data using aws cloud technologies ultimately i have learned thoroughly on how to use aws lambda glue s3 athena and cloudwatch note i have also cleaned the data using a jupyter notebook and this can be seen in the following spotify etl ipynb file
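as a sketch of the final load and analytics stage described above, here is how the glue catalog tables could be queried from athena with boto3, the glue database name, table name and s3 output location are assumptions since the readme does not list them

import time
import boto3

athena = boto3.client("athena")

# run a simple analytics query against the assumed songs table in the glue data catalog
query = athena.start_query_execution(
    QueryString="SELECT name, popularity FROM songs ORDER BY popularity DESC LIMIT 10",
    QueryExecutionContext={"Database": "spotway_db"},  # hypothetical glue database name
    ResultConfiguration={"OutputLocation": "s3://spotway-data-pipeline-project-anuj/athena-results/"},  # assumed results bucket
)
query_id = query["QueryExecutionId"]

# poll until athena reports a terminal state
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    for row in rows:  # the first row returned by athena is the header row
        print([col.get("VarCharValue") for col in row["Data"]])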
cloud
interpret
interpretml a href https githubtocolab com interpretml interpret blob develop examples python interpretable classification methods ipynb img src https colab research google com assets colab badge svg alt open in colab a binder https mybinder org badge logo svg https mybinder org v2 gh interpretml interpret develop labpath examples 2fpython 2finterpretable classification methods ipynb license https img shields io github license interpretml interpret svg style flat square python version https img shields io pypi pyversions interpret svg style flat square package version https img shields io pypi v interpret svg style flat square conda https img shields io conda v conda forge interpret build status https img shields io azure devops build ms interpret 293 develop svg style flat square coverage https img shields io azure devops coverage ms interpret 293 develop svg style flat square maintenance https img shields io maintenance yes 2023 style flat square br in the beginning machines learned in darkness and data scientists struggled in the void to explain them let there be light interpretml is an open source package that incorporates state of the art machine learning interpretability techniques under one roof with this package you can train interpretable glassbox models and explain blackbox systems interpretml helps you understand your model s global behavior or understand the reasons behind individual predictions interpretability is essential for model debugging why did my model make this mistake feature engineering how can i improve my model detecting fairness issues does my model discriminate human ai cooperation how can i understand and trust the model s decisions regulatory compliance does my model satisfy legal requirements high risk applications healthcare finance judicial https github com interpretml interpretml github io blob master interpret highlight gif installation python 3 7 linux mac windows sh pip install interpret or conda install c conda forge interpret introducing the explainable boosting machine ebm ebm is an interpretable model developed at microsoft research sup citations sup it uses modern machine learning techniques like bagging gradient boosting and automatic interaction detection to breathe new life into traditional gams generalized additive models this makes ebms as accurate as state of the art techniques like random forests and gradient boosted trees however unlike these blackbox models ebms produce exact explanations and are editable by domain experts dataset auroc domain logistic regression random forest xgboost explainable boosting machine adult income finance 907 003 903 002 927 001 928 002 heart disease medical 895 030 890 008 851 018 898 013 breast cancer medical 995 005 992 009 992 010 995 006 telecom churn business 849 005 824 004 828 010 852 006 credit fraud security 979 002 950 007 981 003 981 003 notebook for reproducing table https nbviewer jupyter org github interpretml interpret blob main benchmarks ebm classification comparison ipynb supported techniques interpretability technique type explainable boosting https interpret ml docs ebm html glassbox model decision tree https interpret ml docs dt html glassbox model decision rule list https interpret ml docs dr html glassbox model linear logistic regression https interpret ml docs lr html glassbox model shap kernel explainer https interpret ml docs shap html blackbox explainer lime https interpret ml docs lime html blackbox explainer morris sensitivity analysis https interpret ml docs msa html blackbox 
explainer partial dependence https interpret ml docs pdp html blackbox explainer train a glassbox model let s fit an explainable boosting machine python from interpret glassbox import explainableboostingclassifier ebm explainableboostingclassifier ebm fit x train y train or substitute with logisticregression decisiontreeclassifier rulelistclassifier ebm supports pandas dataframes numpy arrays and handles string data natively understand the model python from interpret import show ebm global ebm explain global show ebm global global explanation image examples python assets readme ebm global specific png raw true br understand individual predictions python ebm local ebm explain local x test y test show ebm local local explanation image examples python assets readme ebm local specific png raw true br and if you have multiple model explanations compare them python show logistic regression global decision tree global dashboard image examples python assets readme dashboard png raw true br if you need to keep your data private use differentially private ebms see dp ebms https proceedings mlr press v139 nori21a nori21a pdf python from interpret privacy import dpexplainableboostingclassifier dpexplainableboostingregressor dp ebm dpexplainableboostingclassifier epsilon 1 delta 1e 5 specify privacy parameters dp ebm fit x train y train show dp ebm explain global identical function calls to standard ebms br br for more information see the documentation https interpret ml docs getting started html br br acknowledgements interpretml was originally created by equal contributions samuel jenkins harsha nori paul koch and rich caruana ebms are fast derivative of ga2m invented by yin lou rich caruana johannes gehrke and giles hooker many people have supported us along the way check out acknowledgements md acknowledgements md we also build on top of many great packages please check them out plotly https github com plotly plotly py dash https github com plotly dash scikit learn https github com scikit learn scikit learn lime https github com marcotcr lime shap https github com slundberg shap salib https github com salib salib skope rules https github com scikit learn contrib skope rules treeinterpreter https github com andosa treeinterpreter gevent https github com gevent gevent joblib https github com joblib joblib pytest https github com pytest dev pytest jupyter https github com jupyter notebook a name citations citations a details open summary strong interpretml strong summary hr details open summary em interpretml a unified framework for machine learning interpretability h nori s jenkins p koch and r caruana 2019 em summary br pre article nori2019interpretml title interpretml a unified framework for machine learning interpretability author nori harsha and jenkins samuel and koch paul and caruana rich journal arxiv preprint arxiv 1909 09223 year 2019 pre a href https arxiv org pdf 1909 09223 pdf paper link a details hr details details summary strong explainable boosting strong summary hr details summary em intelligible models for healthcare predicting pneumonia risk and hospital 30 day readmission r caruana y lou j gehrke p koch m sturm and n elhadad 2015 em summary br pre inproceedings caruana2015intelligible title intelligible models for healthcare predicting pneumonia risk and hospital 30 day readmission author caruana rich and lou yin and gehrke johannes and koch paul and sturm marc and elhadad noemie booktitle proceedings of the 21th acm sigkdd international conference on knowledge discovery and data 
mining pages 1721 1730 year 2015 organization acm pre a href https www microsoft com en us research wp content uploads 2017 06 kdd2015finaldraftintelligiblemodels4healthcare igt143e caruanaa pdf paper link a details details summary em accurate intelligible models with pairwise interactions y lou r caruana j gehrke and g hooker 2013 em summary br pre inproceedings lou2013accurate title accurate intelligible models with pairwise interactions author lou yin and caruana rich and gehrke johannes and hooker giles booktitle proceedings of the 19th acm sigkdd international conference on knowledge discovery and data mining pages 623 631 year 2013 organization acm pre a href https www cs cornell edu yinlou papers lou kdd13 pdf paper link a details details summary em intelligible models for classification and regression y lou r caruana and j gehrke 2012 em summary br pre inproceedings lou2012intelligible title intelligible models for classification and regression author lou yin and caruana rich and gehrke johannes booktitle proceedings of the 18th acm sigkdd international conference on knowledge discovery and data mining pages 150 158 year 2012 organization acm pre a href https www cs cornell edu yinlou papers lou kdd12 pdf paper link a details details summary em interpretability then what editing machine learning models to reflect human knowledge and values zijie j wang alex kale harsha nori peter stella mark e nunnally duen horng chau mihaela vorvoreanu jennifer wortman vaughan rich caruana 2022 em summary br pre article wang2022interpretability title interpretability then what editing machine learning models to reflect human knowledge and values author wang zijie j and kale alex and nori harsha and stella peter and nunnally mark e and chau duen horng and vorvoreanu mihaela and vaughan jennifer wortman and caruana rich journal arxiv preprint arxiv 2206 15465 year 2022 pre a href https arxiv org pdf 2206 15465 pdf paper link a details details summary em axiomatic interpretability for multiclass additive models x zhang s tan p koch y lou u chajewska and r caruana 2019 em summary br pre inproceedings zhang2019axiomatic title axiomatic interpretability for multiclass additive models author zhang xuezhou and tan sarah and koch paul and lou yin and chajewska urszula and caruana rich booktitle proceedings of the 25th acm sigkdd international conference on knowledge discovery data mining pages 226 234 year 2019 organization acm pre a href https arxiv org pdf 1810 09092 pdf paper link a details details summary em distill and compare auditing black box models using transparent model distillation s tan r caruana g hooker and y lou 2018 em summary br pre inproceedings tan2018distill title distill and compare auditing black box models using transparent model distillation author tan sarah and caruana rich and hooker giles and lou yin booktitle proceedings of the 2018 aaai acm conference on ai ethics and society pages 303 310 year 2018 organization acm pre a href https arxiv org pdf 1710 06169 paper link a details details summary em purifying interaction effects with the functional anova an efficient algorithm for recovering identifiable additive models b lengerich s tan c chang g hooker r caruana 2019 em summary br pre article lengerich2019purifying title purifying interaction effects with the functional anova an efficient algorithm for recovering identifiable additive models author lengerich benjamin and tan sarah and chang chun hao and hooker giles and caruana rich journal arxiv preprint arxiv 1911 04974 year 
2019 pre a href https arxiv org pdf 1911 04974 pdf paper link a details details summary em interpreting interpretability understanding data scientists use of interpretability tools for machine learning h kaur h nori s jenkins r caruana h wallach j wortman vaughan 2020 em summary br pre inproceedings kaur2020interpreting title interpreting interpretability understanding data scientists use of interpretability tools for machine learning author kaur harmanpreet and nori harsha and jenkins samuel and caruana rich and wallach hanna and wortman vaughan jennifer booktitle proceedings of the 2020 chi conference on human factors in computing systems pages 1 14 year 2020 pre a href https www microsoft com en us research publication interpreting interpretability understanding data scientists use of interpretability tools for machine learning paper link a details details summary em how interpretable and trustworthy are gams c chang s tan b lengerich a goldenberg r caruana 2020 em summary br pre article chang2020interpretable title how interpretable and trustworthy are gams author chang chun hao and tan sarah and lengerich ben and goldenberg anna and caruana rich journal arxiv preprint arxiv 2006 06466 year 2020 pre a href https arxiv org pdf 2006 06466 pdf paper link a details hr details details summary strong differential privacy strong summary hr details summary em accuracy interpretability and differential privacy via explainable boosting h nori r caruana z bu j shen j kulkarni 2021 em summary br pre inproceedings pmlr v139 nori21a title accuracy interpretability and differential privacy via explainable boosting author nori harsha and caruana rich and bu zhiqi and shen judy hanwen and kulkarni janardhan booktitle proceedings of the 38th international conference on machine learning pages 8227 8237 year 2021 volume 139 series proceedings of machine learning research publisher pmlr pre a href https proceedings mlr press v139 nori21a nori21a pdf paper link a details hr details details summary strong lime strong summary hr details summary em why should i trust you explaining the predictions of any classifier m t ribeiro s singh and c guestrin 2016 em summary br pre inproceedings ribeiro2016should title why should i trust you explaining the predictions of any classifier author ribeiro marco tulio and singh sameer and guestrin carlos booktitle proceedings of the 22nd acm sigkdd international conference on knowledge discovery and data mining pages 1135 1144 year 2016 organization acm pre a href https arxiv org pdf 1602 04938 pdf paper link a details hr details details summary strong shap strong summary hr details summary em a unified approach to interpreting model predictions s m lundberg and s i lee 2017 em summary br pre incollection nips2017 7062 title a unified approach to interpreting model predictions author lundberg scott m and lee su in booktitle advances in neural information processing systems 30 editor i guyon and u v luxburg and s bengio and h wallach and r fergus and s vishwanathan and r garnett pages 4765 4774 year 2017 publisher curran associates inc url https papers nips cc paper 7062 a unified approach to interpreting model predictions pdf pre a href https papers nips cc paper 7062 a unified approach to interpreting model predictions pdf paper link a details details summary em consistent individualized feature attribution for tree ensembles lundberg scott m and erion gabriel g and lee su in 2018 em summary br pre article lundberg2018consistent title consistent individualized feature 
attribution for tree ensembles author lundberg scott m and erion gabriel g and lee su in journal arxiv preprint arxiv 1802 03888 year 2018 pre a href https arxiv org pdf 1802 03888 paper link a details details summary em explainable machine learning predictions for the prevention of hypoxaemia during surgery s m lundberg et al 2018 em summary br pre article lundberg2018explainable title explainable machine learning predictions for the prevention of hypoxaemia during surgery author lundberg scott m and nair bala and vavilala monica s and horibe mayumi and eisses michael j and adams trevor and liston david e and low daniel king wai and newman shu fang and kim jerry and others journal nature biomedical engineering volume 2 number 10 pages 749 year 2018 publisher nature publishing group pre a href https www ncbi nlm nih gov pmc articles pmc6467492 pdf nihms 1505578 pdf paper link a details hr details details summary strong sensitivity analysis strong summary hr details summary em salib an open source python library for sensitivity analysis j d herman and w usher 2017 em summary br pre article herman2017salib title salib an open source python library for sensitivity analysis author herman jonathan d and usher will journal j open source software volume 2 number 9 pages 97 year 2017 pre a href https www researchgate net profile will usher publication 312204236 salib an open source python library for sensitivity analysis links 5ac732d64585151e80a39547 salib an open source python library for sensitivity analysis pdf origin publication detail paper link a details details summary em factorial sampling plans for preliminary computational experiments m d morris 1991 em summary br pre article morris1991factorial title author morris max d journal technometrics volume 33 number 2 pages 161 174 year 1991 publisher taylor francis group pre a href https abe ufl edu faculty jjones abe 5646 2010 morris 1991 20sa 20paper pdf paper link a details hr details details summary strong partial dependence strong summary hr details summary em greedy function approximation a gradient boosting machine j h friedman 2001 em summary br pre article friedman2001greedy title greedy function approximation a gradient boosting machine author friedman jerome h journal annals of statistics pages 1189 1232 year 2001 publisher jstor pre a href https projecteuclid org download pdf 1 euclid aos 1013203451 paper link a details hr details details summary strong open source software strong summary hr details summary em scikit learn machine learning in python f pedregosa et al 2011 em summary br pre article pedregosa2011scikit title scikit learn machine learning in python author pedregosa fabian and varoquaux ga e l and gramfort alexandre and michel vincent and thirion bertrand and grisel olivier and blondel mathieu and prettenhofer peter and weiss ron and dubourg vincent and others journal journal of machine learning research volume 12 number oct pages 2825 2830 year 2011 pre a href https www jmlr org papers volume12 pedregosa11a pedregosa11a pdf paper link a details details summary em collaborative data science plotly technologies inc 2015 em summary br pre online plotly author plotly technologies inc title collaborative data science publisher plotly technologies inc address montreal qc year 2015 url https plot ly pre a href https plot ly link a details details summary em joblib running python function as pipeline jobs g varoquaux and o grisel 2009 em summary br pre article varoquaux2009joblib title joblib running python function as 
pipeline jobs author varoquaux ga e l and grisel o journal packages python org joblib year 2009 pre a href https joblib readthedocs io en latest link a details hr details videos the science behind interpretml explainable boosting machine https www youtube com watch v mreihghgl0k how to explain models with interpretml deep dive https www youtube com watch v wwbekmq0 i8 black box and glass box explanation in machine learning https youtu be 7uznky8pehq explainable ai explained by design interpretable models with microsofts interpretml https www youtube com watch v qpn9m30ojfc interpreting machine learning models with interpretml https www youtube com watch v ernuffsknhk external links interpretable or accurate why not both https towardsdatascience com interpretable or accurate why not both 4d9c73512192 the explainable boosting machine as accurate as gradient boosting as interpretable as linear regression https towardsdatascience com the explainable boosting machine f24152509ebb exploring explainable boosting machines https leinadj github io 2023 04 09 exploring explainable boosting machines html performance and explainability with ebm https blog oakbits com ebm algorithm html interpretml another way to explain your model https towardsdatascience com interpretml another way to explain your model b7faf0a384f8 a gentle introduction to ga2ms a white box model https www fiddler ai blog a gentle introduction to ga2ms a white box model model interpretation with microsoft s interpret ml https medium com sand mayur model interpretation with microsofts interpret ml 85aa0ad697ae explaining model pipelines with interpretml https medium com mariusvadeika explaining model pipelines with interpretml a9214f75400b explain your model with microsoft s interpretml https medium com dataman ai explain your model with microsofts interpretml 5daab1d693b4 on model explainability from lime shap to explainable boosting https everdark github io k9 notebooks ml model explain model explain nb html dealing with imbalanced data mortgage loans defaults https mikewlange github io imbalanceddata index html the right way to compute your shapley values https towardsdatascience com the right way to compute your shapley values cfea30509254 the art of sprezzatura for machine learning https towardsdatascience com the art of sprezzatura for machine learning e2494c0db727 mixing art into the science of model explainability https towardsdatascience com mixing art into the science of model explainability 312b8216fa95 papers that use or compare ebms llms understand glass box models discover surprises and suggest repairs https arxiv org pdf 2308 01157 pdf model interpretability in credit insurance http hdl handle net 10400 5 27507 federated boosted decision trees with differential privacy https arxiv org pdf 2210 02910 pdf gam e changer or not an evaluation of interpretable machine learning models https arxiv org pdf 2204 09123 pdf gam coach towards interactive and user centered algorithmic recourse https arxiv org pdf 2302 14165 pdf missing values and imputation in healthcare data can interpretable machine learning help https arxiv org pdf 2304 11749v1 pdf practice and challenges in building a universal search quality metric https www researchgate net profile nuo chen 38 publication 370126720 practice and challenges in building a universal search quality metric links 6440a0f239aa471a524cb77d practice and challenges in building a universal search quality metric pdf origin publication detail explaining phishing attacks an xai approach to 
enhance user awareness and trust https www researchgate net profile giuseppe desolda publication 370003878 explaining phishing attacks an xai approach to enhance user awareness and trust links 643922a8e881690c4bd50ced explaining phishing attacks an xai approach to enhance user awareness and trust pdf revealing the galaxy halo connection through machine learning https arxiv org pdf 2204 10332 pdf explainable artificial intelligence for covid 19 diagnosis through blood test variables https link springer com content pdf 10 1007 s40313 021 00858 y pdf using explainable boosting machines ebms to detect common flaws in data https link springer com chapter 10 1007 978 3 030 93736 2 40 differentially private gradient boosting on linear learners for tabular data analysis https assets amazon science fa 3a a62ba73f4bbda1d880b678c39193 differentially private gradient boosting on linear learners for tabular data analysis pdf concrete compressive strength prediction using an explainable boosting machine model https www sciencedirect com science article pii s2214509523000244 pdfft md5 171c275b6bcae8897cef03d931e908e2 pid 1 s2 0 s2214509523000244 main pdf estimate deformation capacity of non ductile rc shear walls using explainable boosting machine https arxiv org pdf 2301 04652 pdf introducing the rank biased overlap as similarity measure for feature importance in explainable machine learning a case study on parkinson s disease https link springer com chapter 10 1007 978 3 031 15037 1 11 targeting resources efficiently and justifiably by combining causal machine learning and theory https www ncbi nlm nih gov pmc articles pmc9768181 pdf frai 05 1015604 pdf extractive text summarization using generalized additive models with interactions for sentence selection https arxiv org pdf 2212 10707 pdf death by round numbers glass box machine learning uncovers biases in medical practice https www medrxiv org content medrxiv early 2022 11 28 2022 04 30 22274520 full pdf post hoc interpretation of transformer hyperparameters with explainable boosting machines https www cs jhu edu xzhan138 papers black2022 pdf interpretable machine learning for predicting pathologic complete response in patients treated with chemoradiation therapy for rectal adenocarcinoma https www ncbi nlm nih gov pmc articles pmc9771385 pdf frai 05 1059033 pdf exploring the balance between interpretability and performance with carefully designed constrainable neural additive models https deliverypdf ssrn com delivery php id 998105006000069122073098120102102121021040051018055094125029122011041003059093125102072122106122077081069015087124028097016003127095087091028087010007035098086102086081014043013113004081117108011028041097095064071100112069081100069120077067116088100069070097093080074087115080072064086111126 ext pdf index true estimating discontinuous time varying risk factors and treatment benefits for covid 19 with interpretable ml https arxiv org pdf 2211 08991 pdf stratomod predicting sequencing and variant calling errors with interpretable machine learning https www biorxiv org content 10 1101 2023 01 20 524401v1 full pdf interpretable machine learning algorithms to predict leaf senescence date of deciduous trees https pdf sciencedirectassets com 271723 1 s2 0 s0168192323x00112 1 s2 0 s0168192323003143 main pdf x amz security token iqojb3jpz2lux2vjeop 2f 2f 2f 2f 2f 2f 2f 2f 2f 2fweacxvzlwvhc3qtmsjgmeqciarprcug2 2bpva 2f87dfmydbinsntwddgnhecon72yfad3aibhzr9bvmkrvzrjqz1doy1ymkd6vsqw45zqo5ykkclnhsq8bqil 2f 2f 2f 2f 2f 2f 2f 2f 2f 
2f8beauadda1otawmzu0njg2nsimdv4ihgm83azwhkyjkpaffmabpkhgjjh1i3y26wef5ln6zuxfgcdlklmnpezdoentreay08vleu7 2f3clensqygaq5txcivztjdv2tbcxdut0pp4fanrhuwiqdfdksvds3ee7veupaqhvjmni0 2f 2bylrw2ozjmppz7h5sd3i4 2f 2fk2 2fjlpawhlr4rfj9bxmmpbldeqhjijil5zzaeleeijxkrttvj6iywtic 2fhj23m7fdnkh94hkkftowegljzgt7fsc5wnc7dgexrl7eblvu9yvusmuf9rfyiu 2bkavyxia7wdun48cwjwdgljyv9xpy 2fp2lrkjeiinmybdknqzjfszh0hwxx0aq6zlxdkjubvsgqfodc2npaugxjnupslnizcmfwr8luvufibm1zigetfdzrb4zejfqvxxv 2bsztpcs1tmo 2f8lag3mni 2bi 2fp7lt3bj 2f 2bzg6s7d6rogs96xms3am3wffiwnixuetgrwmrwxs75eqexcjmrq4elu 2by3voxxivqftt68w6 2bnbryub5kge 2b6gljxufd5y7hzflm0tffw9xezf5pjdbz 2fx 2bi0dxeiwvn2mznpsawiiy6zbt31gsrrmtte9sm4u 2b8dwsr0fymxmme5fklgzkysq0xpufhzn6lylcoxtbob 2brylalndp8e31enpu 2b1xl5isg 2fxhinrm29syzk0u1plpk78ng 2bqt4mulld7jlziebka1vz 2bu8 2f1zyveofc8i6q691pqjyl 2fzk5lfqo1eeremvov4i2neywmgtjtcak1wfchnamfledwyjiern5pki4yvsgf 2fwxg8ahuybg41cfgftl 2fwlj77dpoq8qhgp5bzfheyeywemijnbz4tere7kvpdvbkok5lbxtijili0ftu 2f4f0k825m 2ft4w 2fqzipgy6sgfspzj6vfwqmikbmprtcy6nbr4uazu 2fpuwrawxu3hcydmztvojlrab 2bv5nsdcqwkhvk7yn89jte9um3p8gyev9bfpxt6lykctjnoulkuqnywvl8ngkdbujnjlayzb4d0p4dfrfse2sutuwnvs 2bvwa 2bydn4 2bwpkmn5pu0kr78myj7lyyjgodnloxcbsv 2fxa396tmexagw3ihm2u7h 2fvxm1izmoz 2fflt5y6cey 2fegchxevpb6 x amz algorithm aws4 hmac sha256 x amz date 20230808t111525z x amz signedheaders host x amz expires 300 x amz credential asiaq3phcvty64ptfofs 2f20230808 2fus east 1 2fs3 2faws4 request x amz signature e35040e1985923b74081dbdac33f7250949695d95e631d68a8fe20684b3746bc hash 59ce65176ba4b931ecc905ef2a0bb80561947d73205e8ad2561d63a95552a4fb host 68042c943591013ac2b2430a89b270f6af2c76d8dfd086a07176afe7c76c2c61 pii s0168192323003143 tid spdf 41137a89 2992 4585 8512 4303f8dedb0c sid b0b6f2a791aeb640d1897e968c8092375869gxrqa type client tsoh d3d3lnnjawvuy2vkaxjly3quy29t ua 10145807525053555255 rr 7f375764f8ea2338 cc us comparing explainable machine learning approaches with traditional statistical methods for evaluating stroke risk models retrospective cohort study https cardio jmir org 2023 1 e47736 pdf cross feature selection to eliminate spurious interactions and single feature dominance explainable boosting machines https arxiv org ftp arxiv papers 2307 2307 08485 pdf multi objective optimization of performance and interpretability of tabular supervised machine learning models https arxiv org pdf 2307 08175v1 pdf an explainable model to support the decision about the therapy protocol for aml https arxiv org pdf 2307 02631 pdf assessing wind field characteristics along the airport runway glide slope an explainable boosting machine assisted wind tunnel study https www nature com articles s41598 023 36495 5 trustworthy academic risk prediction with explainable boosting machines https link springer com chapter 10 1007 978 3 031 36272 9 38 binary ecg classification using explainable boosting machines for iot edge devices https ieeexplore ieee org document 9970834 explainable artificial intelligence toward usable and trustworthy computer aided diagnosis of multiple sclerosis from optical coherence tomography https www ncbi nlm nih gov pmc articles pmc10406231 an interpretable machine learning model with deep learning based imaging biomarkers for diagnosis of alzheimer s disease https arxiv org pdf 2308 07778 pdf comparing explainable machine learning approaches with traditional statistical methods for evaluating stroke risk models retrospective cohort study https pureadmin qub ac uk ws portalfiles portal 495863198 jmir cardio pdf explainable artificial intelligence for 
cotton yield prediction with multisource data https ieeexplore ieee org document 10214067 monotone tree based gami models by adapting xgboost https arxiv org ftp arxiv papers 2309 2309 02426 pdf neural graphical models https arxiv org pdf 2210 00453 pdf enhancing predictive battery maintenance through the use of explainable boosting machine https link springer com chapter 10 1007 978 3 031 44146 2 6 improved differentially private regression via gradient boosting https arxiv org pdf 2303 03451 pdf explainable artificial intelligence in job recommendation systems http essay utwente nl 96974 1 tran ma eemcs pdf diagnosis uncertain models for medical risk prediction https arxiv org pdf 2306 17337 pdf extending explainable boosting machines to scientific image data https arxiv org pdf 2305 16526 pdf pest presence prediction using interpretable machine learning https arxiv org pdf 2205 07723 pdf key thresholds and relative contributions of knee geometry anteroposterior laxity and body weight as risk factors for noncontact acl injury https www ncbi nlm nih gov pmc articles pmc10184233 pdf 10 1177 23259671231163627 pdf epitope1d accurate taxonomy aware b cell linear epitope prediction https www biorxiv org content 10 1101 2022 10 17 512613v1 full pdf explainable boosting machines for slope failure spatial predictive modeling https www mdpi com 2072 4292 13 24 4991 htm micromodels for efficient explainable and reusable systems a case study on mental health https arxiv org pdf 2109 13770 pdf identifying main and interaction effects of risk factors to predict intensive care admission in patients hospitalized with covid 19 https www medrxiv org content 10 1101 2020 06 30 20143651v1 full pdf comparing the interpretability of machine learning classifiers for brain tumour survival prediction https deliverypdf ssrn com delivery php id 760122118067103094108090123091079011028032009009023085005014014002123105085114025022024005047078031019089073120012025117073002064031071072113006066035001068125027021087087083085026100009018045107092063001023068071002124070107120120007014102094103069089119026110104107005031095001092090 ext pdf index true using interpretable machine learning to predict maternal and fetal outcomes https arxiv org pdf 2207 05322 pdf calibrate interactive analysis of probabilistic model output https arxiv org pdf 2207 13770 pdf neural additive models interpretable machine learning with neural nets https arxiv org pdf 2004 13912 pdf laplace approximated neural additive models https arxiv org pdf 2305 16905 pdf node gam neural generalized additive model for interpretable deep learning https arxiv org pdf 2106 01613 pdf scalable interpretability via polynomials https arxiv org pdf 2205 14108v1 pdf neural basis models for interpretability https arxiv org pdf 2205 14120 pdf ilmart interpretable ranking with constrained lambdamart https arxiv org pdf 2206 00473 pdf integrating co clustering and interpretable machine learning for the prediction of intravenous immunoglobulin resistance in kawasaki disease https ieeexplore ieee org stamp stamp jsp tp arnumber 9097874 gami net an explainable neural network based on generalized additive models with structured interactions https arxiv org pdf 2003 07132v1 pdf interpretable generalized additive neural networks https pdf sciencedirectassets com 271700 aip 1 s2 0 s0377221723005027 main pdf x amz security token iqojb3jpz2lux2vjeox 2f 2f 2f 2f 2f 2f 2f 2f 2f 2fweacxvzlwvhc3qtmsjhmeuciqdya80lsoqy 
2fmgtgsi8cq2bzhofu7410ljuyqwqt9ht0gigbg4nsen4e5jkuouf04uzcpimh8nhh22jry3opog 2fa1wqsguixhafggwwntkwmdm1ndy4njuidfuczlumsd25uy5h3cqpbw0kczyo1i0j19n0o26whocoxeimg0i7m02rupqug4eidyfvkx 2frqfc4el2y0z7io 2b95niq9urod3zwwzzpgpkcghpu1ga4jwhsknjdi8g2q 2fgm18 2fl8b9jn4lq3klufu3hcjjh 2b4o1aztjb3pmqdxkn 2bfqiftfs13xncyqgngblw3yasp3zxov55tksx6b 2fp5zuxworwic2jlanxa0exr 2fkbee75gfildu8bh2tj1wozob0ytzwdal1 2bc4exghvdhzrpvr9w6q 2btg4tx6qhglawv1uuqn8zt1z8gefhmtrtsv5pnjiplqqmxp62ueufpmesyyoo5rfkjrs96pxys1s 2fc5zfz0v63kkftmsvn4izvq 2b9tlq 2fewq3bvts8b0ch 2fom6w8wn4ngk3hywjiuwvgexxahmqdw9o2pq7cwosofckjjkoyxbxazp0ox5leccgof11bbhncdsiiqlwqhqsk0738appuu99yh12xmmwyu6yxav1jvgrpalimrkliau9by418e6 2fbba 2b 2bfcdqc3vkev3npsqklitmait2y624jhm09ntjdc4iconnrve3q5shih6dzsbhrpj9okqpu9npkndbrkoafdnq 2bklq 2b8jaxychwd3ybuxqstlytpuexesonffj36hgj 2fbkkfc5ac9w 2balq 2fkbyivtpfnbwigusc 2bugsh0kc 2bjqoyyunhjfyz3fxcdwi 2baugnt3utxtt 2brncklh3f68zayodkflihrqevc2 2frbxj5gaqcgzfduvm 2bvjgb 2bine458prmxlurwfhjaroqzhodvc68ar2q3ydpyqmyuzxaivrqn2xljgdh0lctvwnourzqly2l4v87nym7ncxjo 2fiibqartsrtowmkw6jigpqy6sqehgddw23gwj9rspflchfuzj 2b 2bfdgex3lzpupitkl6 2bywjkw0wxpr4c0rj0isch1eaxjcxslxhsshginlzr41ereeabyudeyntxd0iaqcr3l4rlqtvui6v3iicxnltdg5rdajwuqsgqhmrzoh0ujqqxvljwxgfkokckdjdnrft 2fmoh 2bnjlcr4ktvwtjic2yamnhqco2tlkbc27i118dsokwryuvb 2bkctwd3tmkxif 2brw5s 3d x amz algorithm aws4 hmac sha256 x amz date 20230707t133754z x amz signedheaders host x amz expires 300 x amz credential asiaq3phcvtytykbxre7 2f20230707 2fus east 1 2fs3 2faws4 request x amz signature 389ecd144af85f4eae42ab9684f9d56696191a9d8d33c44386ee6af520187724 hash e798cbd4d80d01d56a2a1ea75a3947b027daecaea5f6e6674a1dc2dbea97dab3 host 68042c943591013ac2b2430a89b270f6af2c76d8dfd086a07176afe7c76c2c61 pii s0377221723005027 tid spdf 79e45837 627e 4a9f 88f5 7359ecb4ca63 sid 7e54333b754ff04056483e557e54be0269ddgxrqa type client tsoh d3d3lnnjawvuy2vkaxjly3quy29t ua 0b1a5101565e5607565a rr 7e307c188e87e7cb cc mx a concept and argumentation based interpretable model in high risk domains https arxiv org pdf 2208 08149 pdf analyzing the differences between professional and amateur esports through win probability https dl acm org doi pdf 10 1145 3485447 3512277 explainable machine learning with pairwise interactions for the classifcation of parkinson s disease and swedd from clinical and imaging features https www ncbi nlm nih gov pmc articles pmc9132761 pdf 11682 2022 article 688 pdf interpretable prediction of goals in soccer https statsbomb com wp content uploads 2019 10 decroos interpretability statsbomb pdf extending the tsetlin machine with integer weighted clauses for increased interpretability https arxiv org pdf 2005 05131 pdf in pursuit of interpretable fair and accurate machine learning for criminal recidivism prediction https arxiv org pdf 2005 04176 pdf from shapley values to generalized additive models and back https arxiv org pdf 2209 04012 pdf developing a visual interactive interface for electronic health record labeling https arxiv org pdf 2209 12778 pdf development and validation of an interpretable 3 day intensive care unit readmission prediction model using explainable boosting machines https www medrxiv org content 10 1101 2021 11 01 21265700v1 full pdf death by round numbers and sharp thresholds how to avoid dangerous ai ehr recommendations https www medrxiv org content 10 1101 2022 04 30 22274520v1 full pdf building a predictive model to identify clinical indicators for covid 19 using machine learning method https www ncbi nlm nih gov pmc articles pmc9037972 pdf 11517 
2022 article 2568 pdf using innovative machine learning methods to screen and identify predictors of congenital heart diseases https www ncbi nlm nih gov pmc articles pmc8777022 pdf fcvm 08 797002 pdf explainable boosting machine for predicting alzheimer s disease from mri hippocampal subfields https link springer com chapter 10 1007 978 3 030 86993 9 31 impact of accuracy on model interpretations https arxiv org pdf 2011 09903 pdf machine learning algorithms for identifying dependencies in ot protocols https www mdpi com 1996 1073 16 10 4056 books that cover ebms machine learning for high risk applications https www oreilly com library view machine learning for 9781098102425 interpretable machine learning with python https www amazon com interpretable machine learning python hands dp 180020390x explainable artificial intelligence an introduction to interpretable machine learning https www amazon com explainable artificial intelligence an introduction to interpretable xai dp 3030833550 applied machine learning explainability techniques https www amazon com applied machine learning explainability techniques dp 1803246154 the explainable a i with python examples https www amazon com explainable i python examples ebook dp b0b4f98mn6 platform and model design for responsible ai design and build resilient private fair and transparent machine learning models https www amazon com platform model design responsible transparent dp 1803237074 explainable ai recipes https www amazon com explainable recipes implement explainability interpretability ebook dp b0bsf5nby7 ensemble methods for machine learning https www amazon com ensemble methods machine learning kunapuli dp 1617297135 platform and model design for responsible ai design and build resilient private fair and transparent machine learning models https www amazon com platform model design responsible transparent dp 1803237074 external tools ebm to onnx converter by softathome https github com interpretml ebm2onnx gam changer https github com interpretml gam changer ml 2 sql experimental https github com kaspersgit ml 2 sql contact us there are multiple ways to get in touch email us at interpret microsoft com or feel free to raise a github issue br br br br br br br br br br br br br br br br br br br br br br br br br br br br br br br br br br br br br br br br if a tree fell in your random forest would anyone notice
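For readers who want to try the EBM workflow summarized in the README above, here is a minimal sketch. The `ExplainableBoostingClassifier`, `show`, `explain_global` and `explain_local` calls follow the usage quoted in the text; the dataset, split, and variable names are illustrative assumptions (the breast cancer set is only borrowed because it appears in the benchmark table).

```python
# Minimal sketch of the interpretml glassbox workflow described above.
# Assumptions: scikit-learn's breast cancer dataset and a simple train/test split;
# only the interpret API calls are taken from the README.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from interpret.glassbox import ExplainableBoostingClassifier
from interpret import show

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

ebm = ExplainableBoostingClassifier()    # glassbox model
ebm.fit(X_train, y_train)                # accepts pandas dataframes and numpy arrays

show(ebm.explain_global())               # global behavior: per-feature shape functions
show(ebm.explain_local(X_test, y_test))  # reasons behind individual predictions
```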
machine-learning interpretability gradient-boosting blackbox scikit-learn xai interpretml interpretable-machine-learning interpretable-ai transparency iml interpretable-ml explainable-ml explainability bias ai artificial-intelligence explainable-ai differential-privacy
ai
generator-jhipster-ant-design
generator jhipster ant design npm version npm image npm url build status travis image travis url dependency status daviddm image daviddm url jhipster blueprint ant design system blueprint for jhipster client introduction this is a jhipster http jhipster github io blueprint that is meant to be used in a jhipster application prerequisites as this is a jhipster http jhipster github io blueprint we expect you have jhipster and its related tools already installed installing jhipster https jhipster github io installation html installation with yarn to install this blueprint bash yarn global add generator jhipster ant design to update this blueprint bash yarn global upgrade generator jhipster ant design with npm to install this blueprint bash npm install g generator jhipster ant design to update this blueprint bash npm update g generator jhipster ant design usage to use this blueprint run the below command bash jhipster blueprint ant design running local blueprint version for development during development of blueprint please note the below steps they are very important 1 link your blueprint globally note if you do not want to link the blueprint step 3 to each project being created use npm instead of yarn as yeoman doesn t seem to fetch globally linked yarn modules on the other hand this means you have to use npm in all the below steps as well bash cd ant design npm link 2 link a development version of jhipster to your blueprint optional required only if you want to use a non released jhipster version like the master branch or your own custom fork you could also use yarn for this if you prefer bash cd generator jhipster npm link cd ant design npm link generator jhipster 3 create a new folder for the app to be generated and link jhipster and your blueprint there bash mkdir my app cd my app npm link generator jhipster ant design npm link generator jhipster optional needed only if you are using a non released jhipster version jhipster d blueprint ant design license mit chiho sin https github com chihosin npm image https img shields io npm v generator jhipster ant design svg npm url https npmjs org package generator jhipster ant design travis image https travis ci org chihosin generator jhipster ant design svg branch master travis url https travis ci org chihosin generator jhipster ant design daviddm image https david dm org chihosin generator jhipster ant design svg theme shields io daviddm url https david dm org chihosin generator jhipster ant design

os
Vanadium
vanadium a traditional proxy site frontend for use in combating web filters currently using ultraviolet img src vanadium png features fully functional omnibox search url entry and search suggestions all in one place a simplistic no frills design that is easy for anyone to pick up and use a mobile responsive design makes this frontend usable on both mobile devices and traditional computers a work in progress arcade featuring classic adobe flash games all working thanks to ruffle rs setup automatic deployment compatibility not guaranteed remix on glitch https cdn glitch com 2703baf2 b643 4da7 ab91 7ee2a2d00b5b 2fremix button svg https glitch com edit import github titaniumnetwork dev vanadium deploy https raw githubusercontent com titaniumnetwork dev vanadium main replit svg https repl it github titaniumnetwork dev vanadium manual deployment for use on your own linux server sh git clone https github com titaniumnetwork dev vanadium cd vanadium npm install npm start copyright takedown requests simply create an issue on this repository and we will assist you as soon as possible if that doesn t suit you then send an e mail to nullnvoid mailfence com and i can work with you directly
proxy bypass circumvention chromebook nodejs node javascript
front_end
berial
p align center img src https avatars0 githubusercontent com u 68577605 s 200 v 4 alt berial logo width 150 p h1 align center berial h1 p align center imp simple micro front end framework p p align center a href https github com berialjs berial actions img src https img shields io github workflow status berialjs berial ci svg alt build status a a href https npmjs com package berial img src https img shields io npm v berial svg alt npm v a a href https npmjs com package berial img src https img shields io npm dt berial svg alt npm d a p why berial berial is a new approach to a popular idea build a javascript framework for front end microservices there are many wonderful features of it such as asynchronous rendering pipeline web components shadow dom scoped css javascript sandbox proxy note difference from fre berial will pay attention to business value use html one app one app two app two app script type module import register from berial register name one app url 1 html allowlist fre name two app scripts 2 js styles 2 css script license mit yisar h a n a
microservice micro-frontend
front_end
RTOS-Kernel-for-ARM
real time operating systems implemented the internals of a real time operating system real time operating systems have a number of subsystems but their most important functions are task scheduling resource allocation and memory protection a form of fault isolation we implemented the basic infrastructure context switching task management concurrency control through mutexes and interface layers with admission control and the priority inheritance locking discipline scheduler the scheduler is a part of the kernel that is in charge of deciding the order in which tasks run and the length of each run the scheduling policy that a scheduler follows is the theoretical framework that the scheduler employs in making its allocation decisions note that there is no mention of context switching or architecture specific implementation details in this lab you will be implementing sections of a scheduler and in particular the predicates needed to implement the rate monotonic scheduling policy dispatcher the dispatcher deals with enforcing the scheduler s policy the context switch and task switch routines are under the purview of the dispatcher run queue the run queue or run list is a list of tasks that are currently runnable that satisfy some criteria on simple round robin scheduling systems there is one system run list where runnable tasks are served in a first come first served fcfs manner hence the name run queue systems that have multiple task priorities do not have a universal run queue gravelv2 has a run list for every priority level on the system but since there are as many priorities as tasks on gravelv2 the scheduler enforces that no more than one task can be in a run queue at once hence we end up not using the fcfs nature of the queue working instead on a purely priority based scheduling policy task state during the lifetime of a task it can be in a number of states all tasks start out as runnable the scheduler can only schedule runnable tasks to run when a runnable task is scheduled to run it is now in the running state a running task can block on a lock or an event when it does this the task moves to the blocked state a task that is blocked cannot be scheduled to run it can be made runnable upon the signaling of an appropriate event in traditional operating systems a task that is exiting will move from the running state to the undead or zombie state at which point it will be reaped and have its resources returned to the system in an appropriate manner in gravelv2 no tasks exit and all tasks are assumed to be periodic forever please remember that the scheduler will not and is not supposed to schedule any task that is not runnable
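The README above mentions admission control and the predicates behind rate-monotonic scheduling. The sketch below is not the gravelv2 code; it is the standard Liu and Layland utilization-bound test, shown only to make the admission-control idea concrete. Task parameters in the example are made up.

```python
# Illustrative sketch (not the gravelv2 implementation): the classic Liu & Layland
# admission test used with rate-monotonic scheduling. A set of n periodic tasks is
# guaranteed schedulable under RM if total utilization <= n * (2^(1/n) - 1);
# this is a sufficient, not necessary, condition.
def rm_admissible(tasks):
    """tasks: list of (computation_time, period) pairs."""
    n = len(tasks)
    if n == 0:
        return True
    utilization = sum(c / p for c, p in tasks)
    bound = n * (2 ** (1.0 / n) - 1)
    return utilization <= bound

# Example: three hypothetical tasks with (C, T) budgets; the bound for n=3 is ~0.7798.
print(rm_admissible([(1, 4), (1, 5), (2, 10)]))  # utilization 0.65 -> True
```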
os
semantic-llama
new name repo https github com monarch initiative ontogpt semantic llama is now ontogpt this repo is archived use the repo above semantic llama semantic large language model annotation a knowledge extraction tool that uses a large language model to extract semantic information from text this exploits the ability of ultra llms such as gpt 3 to return user defined data structures as a response usage given a short text abstract txt with content such as the cgas sting mediated dna sensing signaling pathway is crucial for interferon ifn production and host antiviral responses the underlying mechanism was the interaction of us3 with catenin and its hyperphosphorylation of catenin at thr556 to block its nuclear translocation see full input tests input cases gocam betacat txt we can extract this into the go pathway datamodel src semantic llama templates gocam yaml bash semllama extract t gocam gocamannotations abstract txt giving schema compliant yaml such as yaml genes hgnc 2514 hgnc 21367 hgnc 27962 us3 fplx interferon isg gene gene interactions gene1 us3 gene2 hgnc 2514 gene localizations gene hgnc 2514 location nuclear gene functions gene hgnc 2514 molecular activity transcription gene hgnc 21367 molecular activity production see full output tests output gocam betacat yaml note in the above the grounding is very preliminary and can be improved ungrounded namedentities appear as test how it works 1 you provide an arbitrary data model describing the structure you want to extract text into this can be nested but see limitations below 2 provide your preferred annotations for grounding namedentity fields 3 semantic llama will generate a prompt feed the prompt to a language model currently openai parse the results into a dictionary structure ground the results using a preferred annotator pre requisites python 3 9 an openai account a bioportal account optional you will need to set both api keys using oak poetry run runoak set apikey openai your openai api key poetry run runoak set apikey bioportal your bioportal api key how to define your own extraction data model step 1 define a schema see src semantic llama templates src semantic llama templates for examples define a schema using a subset of linkml that describes the structure you want to extract from your text yaml classes mendeliandisease attributes name description the name of the disease examples value peroxisome biogenesis disorder identifier true needed for inlining description description a description of the disease examples value peroxisome biogenesis disorders zellweger syndrome spectrum pbd zss is a group of autosomal recessive disorders affecting the formation of functional peroxisomes characterized by sensorineural hearing loss pigmentary retinal degeneration multiple organ dysfunction and psychomotor impairment synonyms multivalued true examples value zellweger syndrome spectrum value pbd zss subclass of multivalued true range mendeliandisease examples value lysosomal disease value autosomal recessive disorder symptoms range symptom multivalued true examples value sensorineural hearing loss value pigmentary retinal degeneration inheritance range inheritance examples value autosomal recessive genes range gene multivalued true examples value pex1 value pex2 value pex3 gene is a namedthing id prefixes hgnc annotations annotators gilda bioportal hgnc nr symptom is a namedthing id prefixes hp annotations annotators sqlite obo hp inheritance is a namedthing annotations annotators sqlite obo hp the schema is defined in linkml prompt hints 
can be specified using the prompt annotation otherwise description is used multivalued fields are supported the default range is string these are not grounded e g disease name synonyms define a class for each namedentity for any namedentity you can specify a preferred annotator using the annotators annotation we recommend following an established schema like biolink but you can define your own step 2 compile the schema run the make command at the top level this will compile the schema to pydantic step 3 run the command line e g semllama extract t mendelian disease mendeliandisease marfan wikipedia txt web application there is a bare bones web application poetry run webllama note that the agent running uvicorn must have the api key set so for obvious reasons don t host this publicly without authentication unless you want your credits drained features multiple levels of nesting currently only two levels of nesting are supported if a field has a range which is itself a class and not a primitive it will attempt to nest e g the gocam schema has an attribute yaml attributes gene functions description semicolon separated list of gene to molecular activity relationships multivalued true range genemolecularactivityrelationship because genemolecularactivityrelationship is inlined it will nest the generated prompt is gene functions semicolon separated list of gene to molecular activities relationships the output of this is then passed through further llama iterations limitations non deterministic this relies on an existing llm and llms can be fickle in their responses coupled to openai you will need an openai account in theory any llm can be used but in practice the parser is tuned for openai acknowledgements this cookiecutter https cookiecutter readthedocs io en stable readme html project was developed from the sphintoxetry cookiecutter https github com hrshdhgd sphintoxetry cookiecutter template and will be kept up to date using cruft https cruft github io cruft
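One way to drive the extraction step described above from Python is sketched below. The command name and -t flag are reconstructed from the README's "semllama extract -t gocam ..." usage (the flag spelling is lost in the flattened text), the input file name is an illustrative assumption, and the sketch assumes the command writes its schema-compliant YAML to stdout.

```python
# Small sketch: run the semantic-llama extract command and parse its YAML output.
# Assumptions: the CLI is on PATH, the -t flag and template name follow the README,
# "abstract.txt" is a placeholder input, and the YAML goes to stdout.
import subprocess
import yaml  # PyYAML

result = subprocess.run(
    ["semllama", "extract", "-t", "gocam.GoCamAnnotations", "abstract.txt"],
    capture_output=True, text=True, check=True,
)
annotations = yaml.safe_load(result.stdout)  # schema-compliant YAML, as shown in the README
print(annotations.get("genes"))
```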
ai knowledge-extraction language-models linkml oaklib obofoundry
ai
CPSC411
cpsc411 mobile development
front_end
Transformers-for-NLP-2nd-Edition
transformers for nlp 2nd edition img src https github com denis2054 transformers for nlp 2nd edition blob main transformers rothman png raw tru alt drawing width 400 copyright 2022 2023 denis rothman packt publishing br last updated september 30 2023 dolphin additional bonus programs for openai chatgpt gpt 3 5 legacy chatgpt plus gpt 3 5 default gpt 3 5 default and gpt 4 br api examples for gpt 3 5 turbo gpt 4 dall e 2 google cloud ai language and google cloud ai vision br discover hugginggpt google smart compose google bard and microsoft s new bing br advanced prompt engineering with the chatgpt api and the gpt 4 api br just look for the dolphin and enjoy your ride into the future of ai contact me on linkedin https www linkedin com in denis rothman 0b034043 br get the book on amazon https www amazon com transformers natural language processing architectures dp 1803247339 dp 1803247339 ref mt other encoding utf8 me qid transformer models from bert to gpt 4 environments from hugging face to openai fine tuning training and prompt engineering examples a bonus section with chatgpt gpt 3 5 turbo gpt 4 and dall e including jump starting gpt 4 speech to text text to speech text to image generation with dall e and more getting started you can run these notebooks on cloud platforms like google colab https colab research google com or your local machine note that some chapters require a gpu to run in a reasonable amount of time so we recommend one of the cloud platforms as they come pre installed with cuda running on a cloud platform or in your environement to run these notebooks on a cloud platform just click on one of the badges in the table below or run them on your environment chapter colab kaggle gradient studiolab chapter 2 getting started with the architecture of the transformer model ul li multi head attention sub layer ipynb li li positional encoding ipynb li ul open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter02 multi head attention sub layer ipynb open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter02 positional encoding ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter02 multi head attention sub layer ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter02 positional encoding ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter02 multi head attention sub layer ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter02 positional encoding ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter02 multi head attention sub layer ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter02 positional encoding ipynb chapter 3 fine tuning bert models ul li 
bert fine tuning sentence classification gpu ipynb li ul open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter03 bert fine tuning sentence classification gpu ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter03 bert fine tuning sentence classification gpu ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter03 bert fine tuning sentence classification gpu ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter03 bert fine tuning sentence classification gpu ipynb chapter 4 pretraining a roberta model from scratch pretraining a roberta model from scratch ul li kantaibert ipynb li ul open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter04 kantaibert ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter04 kantaibert ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter04 kantaibert ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter04 kantaibert ipynb chapter 5 downstream nlp tasks with transformers ul li transformer tasks ipynb li ul open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter05 transformer tasks ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter05 transformer tasks ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter05 transformer tasks ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter05 transformer tasks ipynb chapter 6 machine translation with the transformer ul li trax translation ipynb li ul open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter06 trax translation ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter06 trax translation ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter06 trax translation ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter06 trax translation ipynb chapter 7 the 
rise of suprahuman transformers with gpt 3 engines ul li getting started gpt 3 ipynb li li fine tuning gpt 3 ipynb li ul open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter07 getting started gpt 3 ipynb open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter07 fine tuning gpt 3 ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter07 getting started gpt 3 ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter07 getting started gpt 3 ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter07 fgetting started gpt 3 ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter07 fine tuning gpt 3 ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter07 getting started gpt 3 ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter07 fine tuning gpt 3 ipynb chapter 8 applying transformers to legal and financial documents for ai text summarization ul li summarizing text with t5 ipynb li ul open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter08 summarizing text with t5 ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter08 summarizing text with t5 ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter08 summarizing text with t5 ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter08 summarizing text with t5 ipynb chapter 9 matching tokenizers and datasets ul li tokenizers ipynb li li training openai gpt 2 ch09 ipynb li li summarizing with chatgpt ipynb li ul open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter09 tokenizer ipynb open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter09 training openai gpt 2 ch09 ipynb open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter09 summarizing with chatgpt ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main 
chapter09 tokenizer ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter09 training openai gpt 2 ch09 ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter09 summarizing with chatgpt ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter09 tokenizer ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter09 training openai gpt 2 ch09 ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter09 summarizing with chatgpt ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter09 tokenizer ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter09 training openai gpt 2 ch09 ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter09 summarizing with chatgpt ipynb chapter 10 semantic role labeling ul li srl ipynb li li semantic role labeling with chatgpt ipynb li ul open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter10 srl ipynb open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter10 semantic role labeling with chatgpt ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter10 srl ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter10 semantic role labeling with chatgpt ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter10 srl ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter10 semantic role labeling with chatgpt ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter10 srl ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter10 semantic role labeling with chatgpt ipynb chapter 11 let your data do the talking story questions and answers ul li qa ipynb li li haystack qa pipeline ipynb li ul open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd 
edition blob main chapter11 qa ipynb open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter11 haystack qa pipeline ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter11 qa ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter11 haystack qa pipeline ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter11 qa ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter11 haystack qa pipeline ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter11 qa ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter11 haystack qa pipeline ipynb chapter 12 detecting customer emotions to make predictions ul li sentimentanalysis ipynb li ul open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter12 sentimentanalysis ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter12 sentimentanalysis ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter12 sentimentanalysis ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter12 sentimentanalysis ipynb chapter 13 analyzing fake news with transformers ul li fake news ipynb li li fake news analysis with chatgpt ipynb li ul open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter13 fake news ipynb open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter13 fake news analysis with chatgpt ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter13 fake news ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter13 fake news analysis with chatgpt ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter13 fake news ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter13 fake news analysis with chatgpt ipynb open in sagemaker studio 
lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter13 fake news ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter13 fake news analysis with chatgpt ipynb chapter 14 interpreting black box transformer models ul li bertviz ipynb li li understanding gpt 2 models with ecco ipynb li ul open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter14 bertviz ipynb open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter14 understanding gpt 2 models with ecco ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter14 bertviz ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter14 understanding gpt 2 models with ecco ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter14 bertviz ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter14 understanding gpt 2 models with ecco ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter14 xai by chatgpt for chatgpt ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter14 bertviz ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter14 understanding gpt 2 models with ecco ipynb chapter 15 from nlp to task agnostic transformer models ul li vision transformers ipynb li li dall e ipynb li ul open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter15 vision transformers ipynb open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter15 dall e ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter15 vision transformers ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter15 dall e ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter15 vision transformers ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter15 dall e ipynb open in 
sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter15 vision transformers ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter15 dall e ipynb chapter 16 the emergence of transformer driven copilots ul li domain specific gpt 3 functionality ipynb li li kantaibert recommender ipynb li li vision transformer mlp mixer ipynb li li compact convolutional transformers ipynb li ul open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter16 domain specific gpt 3 functionality ipynb open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter16 kantaibert recommender ipynb open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter16 vision transformer mlp mixer ipynb open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter16 compact convolutional transformers ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter16 domain specific gpt 3 functionality ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter16 kantaibert recommender ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter16 vision transformer mlp mixer ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter16 compact convolutional transformers ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter16 domain specific gpt 3 functionality ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter16 kantaibert recommender ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter16 vision transformer mlp mixer ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter16 compact convolutional transformers ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter16 domain specific gpt 3 functionality ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter16 kantaibert recommender ipynb open in sagemaker studio lab https studiolab 
sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter16 vision transformer mlp mixer ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter16 compact convolutional transformers ipynb chapter 17 consolidation of suprahuman transformers with openai chatgpt and gpt 4 ul li jump starting chatgpt with the openai api ipynb li li chatgpt plus writes and explains classification ipynb li li getting started openai gpt 4 ipynb li li prompt engineering as an alternative to fine tuning ipynb li li getting started with the dall e 2 api ipynb li li speaking with chatgpt ipynb li li all in one ipynb li ul open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter17 jump starting chatgpt with the openai api ipynb open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter17 chatgpt plus writes and explains classification ipynb open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter17 getting started openai gpt 4 ipynb open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter17 prompt engineering as an alternative to fine tuning ipynb open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter17 getting started with the dall e 2 api ipynb open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter17 speaking with chatgpt ipynb open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main chapter17 all in one ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter17 jump starting chatgpt with the openai api ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter17 chatgpt plus writes and explains classification ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter17 getting started with gpt 4 ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter17 prompt engineering as an alternative to fine tuning ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter17 getting started with the dall e 2 api ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob 
main chapter17 speaking with chatgpt ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main chapter17 all in one ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter17 jump starting chatgpt with the openai api ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter17 chatgpt plus writes and explains classification ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter17 getting started with gpt 4 ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter17 prompt engineering as an alternative to fine tuning ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter17 getting started with the dall e 2 api ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter17 speaking with chatgpt ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main chapter17 all in one ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter17 jump starting chatgpt with the openai api ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter17 chatgpt plus writes and explains classification ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter17 getting started with gpt 4 ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter17 prompt engineering as an alternative to fine tuning ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter17 getting started with the dall e 2 api ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter17 speaking with chatgpt ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main chapter17 all in one ipynb appendix iii generic text completion with gpt 2 ul li openai gpt 2 ipynb li ul open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main appendixiii openai gpt 2 ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com 
denis2054 transformers for nlp 2nd edition blob main appendixiii openai gpt 2 ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main appendixiii openai gpt 2 ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main appendixiii openai gpt 2 ipynb appendix iv custom text completion with gpt 2 ul li training openai gpt 2 ipynb li ul open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main appendixiv training openai gpt 2 ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main appendixiv training openai gpt 2 ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main appendixiv training openai gpt 2 ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main appendixiv training openai gpt 2 ipynb additional openai bonus notebooks bonus colab kaggle gradient sagemaker studio lab explore and compare chatgpt gpt 4 and gpt 3 models exploring gpt 4 api https github com denis2054 transformers for nlp 2nd edition blob main bonus exploring gpt 4 api ipynb open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main bonus exploring gpt 4 api ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main bonus exploring gpt 4 api ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main bonus exploring gpt 4 api ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main bonus exploring gpt 4 api ipynb create a chatgpt xai function that explains chatgpt and an xai shap function xai by chatgpt for chatgpt https github com denis2054 transformers for nlp 2nd edition blob main bonus xai by chatgpt for chatgpt ipynb open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main bonus xai by chatgpt for chatgpt ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main bonus xai by chatgpt for chatgpt ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main bonus xai by chatgpt for chatgpt ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main bonus xai by chatgpt for chatgpt ipynb go back to the origins with gpt 2 and chatgpt gpt 2 and chatgpt the origins https github com denis2054 transformers for nlp 2nd edition blob main bonus gpt 2 and 
chatgpt the origins ipynb open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main bonus gpt 2 and chatgpt the origins ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main bonus gpt 2 and chatgpt the origins ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main bonus gpt 2 and chatgpt the origins ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main bonus gpt 2 and chatgpt the origins ipynb
chatgpt or davinci instruct what is best for your project chatgpt as a cobot chatgpt versus davinci instruct ipynb https github com denis2054 transformers for nlp 2nd edition blob main bonus chatgpt as a cobot chatgpt versus davinci instruct ipynb open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main bonus chatgpt as a cobot chatgpt versus davinci instruct ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main bonus chatgpt as a cobot chatgpt versus davinci instruct ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main bonus chatgpt as a cobot chatgpt versus davinci instruct ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main bonus chatgpt as a cobot chatgpt versus davinci instruct ipynb
ai language model comparison
explore various ai language models and their capabilities through this comprehensive notebook
dive into different apis and functionalities such as sentiment analysis entity recognition syntax analysis content classification and ai vision
discover and compare the offerings of google cloud ai language google cloud ai vision openai gpt 4 google bard microsoft new bing chatgpt plus gpt 4 hugging face hugginggpt and google smart compose
exploring and comparing advanced ai technologies ipynb https github com denis2054 transformers for nlp 2nd edition blob main bonus exploring and comparing advanced ai technologies ipynb open in colab https colab research google com assets colab badge svg https colab research google com github denis2054 transformers for nlp 2nd edition blob main bonus exploring and comparing advanced ai technologies ipynb kaggle https kaggle com static images open in kaggle svg https kaggle com kernels welcome src https github com denis2054 transformers for nlp 2nd edition blob main bonus exploring and comparing advanced ai technologies ipynb gradient https assets paperspace io img gradient badge svg https console paperspace com github denis2054 transformers for nlp 2nd edition blob main bonus exploring and comparing advanced ai technologies ipynb open in sagemaker studio lab https studiolab sagemaker aws studiolab svg https studiolab sagemaker aws import github denis2054 transformers for nlp 2nd edition blob main bonus exploring and comparing advanced ai technologies ipynb
key features
implement models such as bert reformer and t5 that outperform classical language models
compare nlp applications using gpt 3 gpt 2 and other transformers
analyze advanced use cases including polysemy cross lingual learning and computer vision
a github bonus directory with soa chatgpt gpt 3 5 turbo gpt 4 and dall e notebooks
book description
transformers are a game changer for natural language understanding nlu and have become one of the pillars of artificial intelligence
transformers for natural language processing 2nd edition investigates deep learning for machine translations language modeling question answering and many more nlp domains with transformers
an industry 4 0 ai specialist needs to be adaptable knowing just one nlp platform is not enough anymore different platforms have different benefits depending on the application whether it s cost flexibility ease of implementation results or performance in this book we analyze numerous use cases with hugging face google trax openai and allennlp
this book takes transformers capabilities further by combining multiple nlp techniques such as sentiment analysis named entity recognition and semantic role labeling to analyze complex use cases such as dissecting fake news on twitter also see how transformers can create code using just a brief description
by the end of this nlp book you will understand transformers from a cognitive science perspective and be proficient in applying pretrained transformer models to various datasets
what you will learn
discover new ways of performing nlp techniques with the latest pretrained transformers
grasp the workings of the original transformer gpt 3 bert t5 deberta and reformer
create language understanding python programs using concepts that outperform classical deep learning models
apply python tensorflow and pytorch programs to sentiment analysis text summarization speech recognition machine translations and more
measure the productivity of key transformers to define their scope potential and limits in production
who this book is for
if you want to learn about and apply transformers to your natural language and image data this book is for you
a good understanding of nlp python and deep learning is required to benefit most from this book many platforms covered in this book provide interactive user interfaces which allow readers with a general interest in nlp and ai to follow several chapters of this book
table of contents
1 what are transformers
2 getting started with the architecture of the transformer model
3 fine tuning bert models
4 pretraining a roberta model from scratch
5 downstream nlp tasks with transformers
6 machine translation with the transformer
7 the rise of suprahuman transformers with gpt 3 engines
8 applying transformers to legal and financial documents for ai text summarization
9 matching tokenizers and datasets
10 semantic role labeling with bert based transformers
11 let your data do the talking story questions and answers
12 detecting customer emotions to make predictions
13 analyzing fake news with transformers
14 interpreting black box transformer models
15 from nlp to task agnostic transformer models
16 the emergence of transformer driven copilots
17 the consolidation of suprahuman transformers with openai s chatgpt and gpt 4
appendix i terminology of transformer models
appendix ii hardware constraints for transformer models
and more
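for illustration only a minimal python sketch of the kind of pretrained transformer usage the chapters build on loading an off the shelf model for sentiment analysis with the hugging face transformers library this snippet is not taken from the book s notebooks the example sentences are placeholders
from transformers import pipeline

# downloads a small pretrained sentiment model on first use and runs inference locally
classifier = pipeline("sentiment-analysis")

examples = [
    "this chapter on fine tuning bert was easy to follow",   # placeholder text
    "the notebook crashed on my machine",                    # placeholder text
]
for result in classifier(examples):
    # each result is a dict with a label (positive/negative) and a confidence score
    print(result["label"], round(result["score"], 3))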
bert chatgpt chatgpt-api nlp transformers natural-language-processing gpt-4 gpt-3-5-turbo dall-e dall-e-api gpt-4-api huggingface-transformers openai roberta-model trax deep-learning machine-learning python pytorch
cloud
Advanced-house-price-prediction
advanced house price prediction in this repository i included the full data science pipeline 1 eda exploratory data analysis 2 feature engineering 3 feature selection 4 model creation and 5 deployment to the cloud
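for illustration only a minimal python scikit learn sketch of the five step pipeline listed above the file name train csv the target column saleprice the value k 20 and the model choice are assumptions typical of house price datasets not details taken from this repository
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.feature_selection import SelectKBest, f_regression
from sklearn.impute import SimpleImputer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# 1. eda: load the data and inspect it (df.describe(), missing values, plots, ...)
df = pd.read_csv("train.csv")                              # assumed file name
X, y = df.drop(columns=["SalePrice"]), df["SalePrice"]     # assumed target column

num_cols = X.select_dtypes(include="number").columns
cat_cols = X.select_dtypes(exclude="number").columns

# 2. feature engineering: impute missing values, scale numerics, encode categoricals
preprocess = ColumnTransformer([
    ("num", Pipeline([("impute", SimpleImputer(strategy="median")),
                      ("scale", StandardScaler())]), num_cols),
    ("cat", Pipeline([("impute", SimpleImputer(strategy="most_frequent")),
                      ("onehot", OneHotEncoder(handle_unknown="ignore"))]), cat_cols),
])

model = Pipeline([
    ("prep", preprocess),
    ("select", SelectKBest(f_regression, k=20)),   # 3. feature selection: keep 20 best features
    ("reg", GradientBoostingRegressor()),          # 4. model creation
])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model.fit(X_train, y_train)
print("r2 on held-out data:", model.score(X_test, y_test))
# 5. deployment to the cloud: the fitted pipeline can be serialized and served
#    behind a small web service on a cloud provider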
cloud
iot-push-demo
sending push notifications from a web app to android devices via gcm and ios devices via apns
a showcase demo and a quick tutorial of phonegap pg and pubnub pubnub data stream network javascript api using a simulated smart room heater a la nest web user interface to give you some use case ideas
photo https raw githubusercontent com pubnub iot push demo gh pages push demo photo jpg
how to try the demo
this demo is only available for android devices due to the lack of openness of ios app development there are two parts to the demo an android app and a web app first you access the android app to obtain your registration id then you pair the device with the desktop app desktop to be able to receive push notifications
android instructions download this apk apk and install it on your android device once your device is registered to this demo you should get your 8 digit unique id
desktop instructions go to pubnub github io iot push demo desktop enter your 8 digit id in the input box hover the cursor over the temperature controller ui and use the mousewheel or trackpad to change the value when you set the room temperature above 80f it sends a push notification to your android device
source code
this repo is for the web app only however i included the code to be used for the cordova phonegap app under the cordova folder the index js file should be in your cordova app root www js to build it as an android native app
tutorials
i wrote a series of step by step instructions on how to send push notifications using the cordova plugin and pubnub apis
sending android push notifications via gcm in javascript cordova gcm
sending ios push notifications via apns in javascript cordova apns
also i have phonegap cordova 101 tutorials
converting your javascript app to an android app w phonegap cordova blog 1
converting your javascript app to an ios app w phonegap cordova blog 2
pg http phonegap com pubnub http www pubnub com docs javascript javascript sdk html desktop https pubnub github io iot push demo apk https github com pubnub iot push demo releases tag 0 1 cordova apns http www pubnub com blog sending ios push notifications via apns javascript using apns phonegap cordova gcm http www pubnub com blog sending android push notifications via gcm javascript using phonegap cordova blog 1 http www pubnub com blog how to convert your javascript app into an android app with phonegap cordova blog 2 http www pubnub com blog converting your javascript app to an ios app w phonegap
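for illustration only the demo itself uses the pubnub javascript sdk but the publish side of the idea can be sketched with the pubnub python sdk as below the channel name and the demo keys are placeholders pn gcm and pn apns are the message keys pubnub s mobile push gateway uses to forward payloads to gcm and apns
from pubnub.pnconfiguration import PNConfiguration
from pubnub.pubnub import PubNub

config = PNConfiguration()
config.publish_key = "your-publish-key"       # placeholder
config.subscribe_key = "your-subscribe-key"   # placeholder
config.uuid = "heater-web-ui"
pubnub = PubNub(config)

def notify(device_id, temperature):
    # publish only when the simulated heater is set above the 80f threshold
    if temperature <= 80:
        return
    message = {
        "text": "room temperature set to {}f".format(temperature),
        # payloads read by pubnub's mobile push gateway for android (gcm) and ios (apns)
        "pn_gcm": {"data": {"message": "heater set to {}f".format(temperature)}},
        "pn_apns": {"aps": {"alert": "heater set to {}f".format(temperature)}},
    }
    pubnub.publish().channel("heater-" + device_id).message(message).sync()

notify("12345678", 85)   # the 8 digit id pairs the web ui with the phone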
server
Shuriken-Strike
sharktank csci 526 spring 2022 game development project
front_end
wormholes
wormholeschain
the wormholeschain solves the blockchain trilemma which entails a necessary tradeoff between scalability security and decentralization by building the technology to achieve the ideal balance between these three metrics creating a highly scalable and secure blockchain system that doesn t sacrifice decentralization
gitter https badges gitter im wormholes org internal test miner svg https gitter im wormholes org internal test miner utm source badge utm medium badge utm campaign pr badge
the approach
the first significant step before spinning up your node is choosing your approach based on your requirements and the many possible configurations you must select the client implementation of both the execution and consensus clients the environment hardware and system and the parameters for the client settings and decide whether to run the software on your own hardware or in the cloud depending on your demands after preparing the environment you can use the startup script to start your node when the node is running and syncing you are ready to use it but make sure to keep an eye on its maintenance
environment and hardware
wormholes clients are able to run on consumer grade computers and do not require any special hardware such as mining machines therefore you have more options for deploying the node based on your demands let us consider running a node on both a local physical machine and a cloud server
hardware
wormholes clients can run on your computer laptop server or even a single board computer although running clients on different devices is possible it is better to use a dedicated machine to enhance performance and security and to minimize the impact on your computer hardware requirements differ by client but are generally not that high since the node just needs to stay synced do not confuse this with mining which requires much more computing power however sync time and performance do improve with more powerful hardware
minimum requirements
cpu main frequency 2 9ghz 4 cores or above
memory capacity 8gb or more
hard disk capacity 500gb or more
network bandwidth 6m uplink and downlink peer to peer rate or higher
before installing the client please ensure your computer has enough resources to run it the minimum requirements are listed above
spin up your own wormholes node
participate in the wormholes blockchain public testnet jointly support and maintain the wormholes network ecosystem and obtain the corresponding benefits this tutorial will guide you through deploying wormholes nodes and participating in verifying the security and reliability of the wormholes network choose the software tools and deployment methods you are familiar with to maintain your own nodes
docker clients setup
preparation
install wget please go to the wget website https www gnu org software wget to download and install it if you are using a linux system you can also install it with the apt get install wget command if you are using a macos system you can also install it with the brew install wget command
install docker for the installation and use of docker please refer to the docker official documentation https docs docker com engine install
run the node
when using the script to start the node you must enter the private key of the account used for the pledge prepared earlier for details see the documentation deploy wormholes nodes using official scripts https www wormholes com docs install run docker docker 3 index html
manual clients setup
the actual client setup can be done by using the automatic launcher or manually
for ordinary users we recommend using a startup script which guides you through the installation and automates the client setup process however if you have experience with the terminal the manual setup steps should be easy to follow
startup parameters
start wormholes in fast sync mode the default which can be changed with the syncmode flag causing it to download more data in exchange for avoiding processing the entire history of the wormholes chain network which is very cpu intensive
start up wormholes s built in interactive javascript environment via the trailing console subcommand through which you can interact using web3 methods https web3js readthedocs io en v1 2 9 note the web3 version bundled within wormholes is very old and not up to date with official docs as well as wormholes s own management apis https www wormholes com docs management this tool is optional and if you leave it out you can always attach to an already running wormholes instance with wormholes attach
full node functions
stores the full blockchain history on disk and can answer data requests from the network
receives and validates new blocks and transactions
verifies the state of every account
start ordinary node
1 download the binary config and genesis files from release https github com wormholes org wormholes or compile the binary with make wormholes
2 start your full node ordinary nodes need to be started in full mode
./wormholes --devnet --syncmode full
start validator node
1 download the binary config and genesis files from release https github com wormholes org wormholes or compile the binary with make wormholes
2 prepare a script to start the node and name it run_node if the system is windows you need to add a file suffix such as bat note that the script should be in the same directory as the main wormholes program
if you are using a windows system the reference is as follows
@echo off
set rootpath=%~dp0
set nodepath=%rootpath%wormholes
if exist %nodepath% rd /s /q %nodepath%
if "%1"=="" (
    echo please pass in the private key of the account to be pledged
    exit 1
) else (
    md %nodepath%\geth
    echo %1 > %nodepath%\geth\nodekey
)
wormholes.exe --devnet --datadir %nodepath% --mine --syncmode full
if you are using a linux system the reference is as follows
#!/bin/bash
# write private key to file
if [ -d ./wormholes ]; then
    rm -rf ./wormholes
fi
if [ $# -gt 0 ]; then
    mkdir -p ./wormholes/wormholes
    echo $1 > ./wormholes/wormholes/nodekey
else
    echo please pass in the private key of the account to be pledged
    exit 1
fi
./wormholes --devnet --datadir ./wormholes --syncmode full
3 start node you can start a node based on the startup parameters or you can start your own node using a startup script if you use scripts you need to select the script that matches your system environment when running the startup script you must pass in the private key of the account to be pledged which is the private key saved in step 1 the reference is as follows
linux system the runtime parameter is the private key
./run_node 94b796b1b11893561c34cf000f23ecf3b39067bb198b9ec9f7b1a79646114680
windows system go to the directory where the startup script is located and run the script in the cmd terminal
run_node.bat 94b796b1b11893561c34cf000f23ecf3b39067bb198b9ec9f7b1a79646114680
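for illustration only a rough python sketch for keeping an eye on the node while it syncs as suggested above it assumes the client behaves like geth and exposes an ipc endpoint at geth ipc inside the data directory for example wormholes geth ipc when started with the linux script above that path is an assumption and is not stated in this readme the same checks can also be run from the built in console opened with wormholes attach
import time
from web3 import Web3

# assumed ipc path; adjust to wherever your data directory actually lives
w3 = Web3(Web3.IPCProvider("./wormholes/geth.ipc"))

for _ in range(10):                       # check roughly every 30 seconds
    if not w3.is_connected():
        print("node not reachable yet")
    else:
        sync = w3.eth.syncing             # False once the node is fully synced
        peers = w3.net.peer_count
        if sync:
            print("syncing block", sync["currentBlock"], "of", sync["highestBlock"], "peers", peers)
        else:
            print("synced at block", w3.eth.block_number, "peers", peers)
    time.sleep(30)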
blockchain