| Column | Type | Length / classes |
| --- | --- | --- |
| names | string | 1 to 98 chars |
| readmes | string | 8 to 608k chars |
| topics | string | 0 to 442 chars |
| labels | string | 6 classes |
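For readers who want to poke at rows with this schema, here is a minimal sketch; the file name is hypothetical, and the six label values are simply the ones visible in the rows below.

```python
# A minimal sketch of loading a dataset with this schema using pandas.
# The file name "readmes.jsonl" is hypothetical; adjust to the actual artifact.
import pandas as pd

df = pd.read_json("readmes.jsonl", lines=True)

# The four columns described above.
assert {"names", "readmes", "topics", "labels"} <= set(df.columns)

# "labels" is one of six classes (e.g. "cloud", "os", "ai", "front_end",
# "blockchain", "server" appear in the rows below).
print(df["labels"].value_counts())
```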
udagram-microservices
# Udagram Image Filtering Application

Udagram is a simple cloud application developed alongside the Udacity Cloud Engineering Nanodegree. It allows users to register and log into a web client, post photos to the feed, and process photos using an image-filtering microservice.

The project is split into two parts:
1. Frontend: Angular web application built with Ionic Framework
2. Backend RESTful API: Node-Express application

## Getting Started

> Tip: it's recommended that you start with getting the backend API running, since the frontend web application depends on the API.

### Prerequisites

1. The project depends on the Node Package Manager (npm). You will need to download and install Node from [https://nodejs.org/en/download/](https://nodejs.org/en/download/). This will allow you to run npm commands.
2. Environment variables will need to be set. These environment variables include database connection details that should not be hard-coded into the application code.

#### Environment Script

A file named `set_env.sh` has been prepared as an optional tool to help you configure these variables on your local development environment.

We do not want your credentials to be stored in git. After pulling this starter project, run the following command to tell git to stop tracking the script in git but keep it stored locally. This way, you can use the script for your convenience and reduce the risk of exposing your credentials.

```bash
git rm --cached set_env.sh
```

Afterwards, we can prevent the file from being included in your solution by adding the file to our `.gitignore` file.

### Database

Create a PostgreSQL database either locally or on AWS RDS. Set the config values for environment variables prefixed with `POSTGRES_` in `set_env.sh`.

### S3

Create an AWS S3 bucket. Set the config values for environment variables prefixed with `AWS_` in `set_env.sh`.

### Backend API

To download all the package dependencies, run the command from the directory `udagram-api/`:

```bash
npm install
```

To run the application locally, run:

```bash
npm run dev
```

You can visit `http://localhost:8080/api/v0/feed` in your web browser to verify that the application is running. You should see a JSON payload. Feel free to play around with Postman to test the APIs.

### Frontend App

To download all the package dependencies, run the command from the directory `udagram-frontend/`:

```bash
npm install
```

Install Ionic Framework's Command Line Tools for us to build and run the application:

```bash
npm install -g ionic
```

Prepare your application by compiling it into static files:

```bash
ionic build
```

Run the application locally using files created from the `ionic build` command:

```bash
ionic serve
```

You can visit `http://localhost:8100` in your web browser to verify that the application is running. You should see a web interface.

## Tips

1. Take a look at `udagram-api`: does it look like we can divide it into two modules to be deployed as separate microservices?
2. The `.dockerignore` file is included for your convenience to not copy `node_modules`. Copying this over into a Docker container might cause issues if your local environment is a different operating system than the Docker image (ex. Windows or MacOS vs. Linux).
3. It's useful to lint your code so that changes in the codebase adhere to a coding standard. This helps alleviate issues when developers use different styles of coding. `eslint` has been set up for TypeScript in the codebase for you. To lint your code, run the following:
   ```bash
   npx eslint --ext .js,.ts src/
   ```
   To have your code fixed automatically, run:
   ```bash
   npx eslint --ext .js,.ts src/ --fix
   ```
4. Over time, our code will become outdated and inevitably run into security vulnerabilities. To address them, you can run:
   ```bash
   npm audit fix
   ```
5. In `set_env.sh`, environment variables are set with `export VAR=value`. Setting them this way is not permanent; every time you open a new terminal, you will have to run `set_env.sh` to reconfigure your environment variables. To verify that an environment variable is set, you can check it with a command like `echo $POSTGRES_USERNAME`. A quick way to check all of them at once is sketched below.
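Building on tip 5, a short script can confirm that the variables are visible to child processes. This sketch is not part of the starter code, and the exact variable names are assumptions based on the `POSTGRES_` and `AWS_` prefixes mentioned above; adjust them to match your `set_env.sh`.

```python
# A minimal sketch (not part of the starter project) that verifies the
# environment variables exported by set_env.sh are visible to child
# processes. Variable names are assumptions based on the POSTGRES_/AWS_
# prefixes described above.
import os
import sys

REQUIRED = [
    "POSTGRES_USERNAME",
    "POSTGRES_PASSWORD",
    "POSTGRES_HOST",
    "POSTGRES_DB",
    "AWS_BUCKET",
    "AWS_REGION",
]

missing = [name for name in REQUIRED if not os.environ.get(name)]
if missing:
    sys.exit(f"Missing environment variables: {', '.join(missing)}")
print("All required environment variables are set.")
```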
cloud
FreeRTOS_WaRP7
# WaRP7 README

- Prologue
- How to build image
- External links
os
Operationalize-a-Machine-Learning-Microservices-API
[![CircleCI](https://circleci.com/gh/sepidmoghaddam/operationalize-a-machine-learning-microservices-api/tree/master.svg?style=svg)](https://circleci.com/gh/sepidmoghaddam/operationalize-a-machine-learning-microservices-api/tree/master)

Sepid M.

## Project Overview

In this project, you will apply the skills you have acquired in this course to operationalize a machine learning microservice API.

You are given a pre-trained `sklearn` model that has been trained to predict housing prices in Boston according to several features, such as average rooms in a home and data about highway access, teacher-to-pupil ratios, and so on. You can read more about the data, which was initially taken from Kaggle, on [the data source site](https://www.kaggle.com/c/boston-housing). This project tests your ability to operationalize a Python Flask app, in a provided file `app.py`, that serves out predictions (inference) about housing prices through API calls. This project could be extended to any pre-trained machine learning model, such as those for image recognition and data labeling.

### Project Tasks

Your project goal is to operationalize this working machine learning microservice using [Kubernetes](https://kubernetes.io/), which is an open-source system for automating the management of containerized applications. In this project you will:

* Test your project code using linting
* Complete a Dockerfile to containerize this application
* Deploy your containerized application using Docker and make a prediction
* Improve the log statements in the source code for this application
* Configure Kubernetes and create a Kubernetes cluster
* Deploy a container using Kubernetes and make a prediction
* Upload a complete GitHub repo with CircleCI to indicate that your code has been tested

You can find a detailed [project rubric here](https://review.udacity.com/#!/rubrics/2576/view).

The final implementation of the project will showcase your abilities to operationalize production microservices.

## Setup the Environment

* Create a virtualenv and activate it
* Run `make install` to install the necessary dependencies

### Running `app.py`

1. Standalone: `python app.py`
2. Run in Docker: `./run_docker.sh`
3. Run in Kubernetes: `./run_kubernetes.sh`

### Kubernetes Steps

* Setup and configure Docker locally
* Setup and configure Kubernetes locally
* Create Flask app in Container
* Run via kubectl
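For orientation, the served API typically looks like the minimal sketch below; the real `app.py` provided in the project is authoritative, and the model path and payload shape here are assumptions for illustration.

```python
# A minimal sketch of a Flask prediction endpoint like the provided app.py.
# The model path and feature handling are assumptions; consult the real
# app.py in this repository for the authoritative version.
from flask import Flask, jsonify, request
import joblib
import pandas as pd

app = Flask(__name__)
# Hypothetical path to the pre-trained sklearn housing-price model.
model = joblib.load("model_data/boston_housing_prediction.joblib")

@app.route("/predict", methods=["POST"])
def predict():
    # Expect a JSON payload of feature columns, e.g. {"RM": {"0": 6.5}, ...}
    payload = request.json
    features = pd.DataFrame(payload)
    prediction = model.predict(features)
    app.logger.info("prediction: %s", prediction)
    return jsonify({"prediction": prediction.tolist()})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=80)
```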
cloud
verilog-eval
# VerilogEval: Evaluating Large Language Models for Verilog Code Generation

This is an evaluation harness for the VerilogEval problem-solving dataset described in the paper "[VerilogEval: Evaluating Large Language Models for Verilog Code Generation](https://arxiv.org/abs/2309.07544)". This evaluation dataset consists of 156 problems from the Verilog instructional website [HDLBits](https://hdlbits.01xz.net/wiki/Problem_sets). We provide two sets of problem descriptions: machine generated and manually converted to text-only format.

## Installation

We closely follow guidance from [HumanEval](https://github.com/openai/human-eval/tree/master). Make sure to use Python 3.7 or later:

```
conda create -n codex python=3.7
conda activate codex
```

Install [ICARUS Verilog](https://github.com/steveicarus/iverilog):

```
git clone https://github.com/steveicarus/iverilog.git
cd iverilog
git checkout 01441687235135d1c12eeef920f75d97995da333
sh autoconf.sh
./configure
make -j4
make install
```

It is recommended to use the provided [Dockerfile](https://github.com/NVlabs/verilog-eval/blob/main/Dockerfile), which already pre-installs the ICARUS Verilog simulator. Using the Docker container, you would still need to complete the following step.

Check out and install this repository:

```
git clone https://github.com/NVlabs/verilog-eval
pip install -e verilog-eval
```

## Usage

**This program would make system calls to `iverilog` and `vvp` to simulate untrusted model-generated code. Users are strongly encouraged not to do so outside of a robust security sandbox.** The [execution call](https://github.com/NVlabs/verilog-eval/blob/main/verilog_eval/execution.py#L79-L112) in `execution.py` is deliberately commented out to ensure users read this disclaimer before running code in a potentially unsafe manner. See the comment in `execution.py` for more information and instructions.

After following the above instructions to enable execution, generate samples and save them in the following JSON Lines (jsonl) format, where each sample is formatted into a single line like so:

```
{"task_id": "Corresponding VerilogEval task ID", "completion": "Completion only without the prompt"}
```

We provide examples under `data/example` to illustrate the format and help with debugging.

To evaluate the samples, run:

```
evaluate_functional_correctness samples.jsonl --problem_file=data/VerilogEval_Human.jsonl
Reading samples...
3120it [00:00, 16077.44it/s]
Running test suites...
100%|██████████| 3120/3120 [00:32<00:00, 97.47it/s]
Killing all hanging simulation process.
Writing results to samples.jsonl_results.jsonl...
100%|██████████| 3120/3120 [00:00<00:00, 30608.13it/s]
{'pass@1': ..., 'pass@5': ..., 'pass@10': ...}
```

The user must specify the `--problem_file` input argument. We provide two sets of problem evaluations: `data/VerilogEval_Machine.jsonl` and `data/VerilogEval_Human.jsonl`. We also provide the problem description files used to sample Verilog code completions in the `descriptions` directory.

This script provides more fine-grained information in a new file ending in `<input_path>_results.jsonl`. Each row now contains whether the completion `passed`, along with the execution `result`, which is one of "passed", "timed out", or "failed".

As a quick sanity check, the example samples should yield 0.5 pass@1. The results can be verified against the provided output in `data/example/ExampleSolution.jsonl_reference.jsonl`:

```
evaluate_functional_correctness data/example/ExampleSolution.jsonl --problem_file=data/example/ExampleEval.jsonl
Reading samples...
6it [00:00, 221.60it/s]
Running example suites...
100%|██████████| 6/6 [00:00<00:00, 142.09it/s]
Killing all hanging simulation process.
Writing results to data/example/ExampleSolution.jsonl_results.jsonl...
100%|██████████| 6/6 [00:00<00:00, 19941.22it/s]
{'pass@1': 0.5}
```

Because there is no unbiased way of estimating pass@k when there are fewer samples than k, the script does not evaluate pass@k for these cases. To evaluate with other k values, pass `--k=<comma-separated-values-here>`. For other options, see:

```
evaluate_functional_correctness --help
```

However, we recommend that you use the default values for the rest.

## Issues

Problem descriptions in `descriptions/VerilogDescription_Machine.jsonl` are machine generated, and we can not guarantee the absence of ambiguity and errors. We do not plan to maintain description correctness.

Functional correctness is evaluated by comparing simulation outputs using [ICARUS Verilog](https://github.com/steveicarus/iverilog). The evaluation of Verilog syntax is limited by the simulator, which might not include all features of the Verilog HDL IEEE-1364 standard.

## Citation

Please cite using the following BibTeX entry:

```bibtex
@inproceedings{liu2023verilogeval,
  title={{VerilogEval}: Evaluating Large Language Models for Verilog Code Generation},
  author={Liu, Mingjie and Pinckney, Nathaniel and Khailany, Brucek and Ren, Haoxing},
  booktitle={2023 IEEE/ACM International Conference on Computer-Aided Design (ICCAD)},
  year={2023}
}
```
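For context on the two pieces the harness expects from you, the sketch below shows (a) writing completions in the JSON Lines format above and (b) the unbiased pass@k estimator popularized by HumanEval, whose guidance this harness follows. This is illustrative only, not the repository's code.

```python
# A sketch, not this repository's code: write model completions in the
# expected JSON Lines format, and compute the unbiased pass@k estimator
# used by HumanEval-style harnesses (Chen et al., 2021).
import json
import numpy as np

def write_samples(path, samples):
    # samples: iterable of (task_id, completion) pairs
    with open(path, "w") as f:
        for task_id, completion in samples:
            f.write(json.dumps({"task_id": task_id,
                                "completion": completion}) + "\n")

def pass_at_k(n, c, k):
    """Unbiased estimate of pass@k from n samples with c correct.

    Equals 1 - C(n-c, k) / C(n, k); undefined (hence skipped by the
    script) when n < k."""
    if n - c < k:
        return 1.0
    return 1.0 - np.prod(1.0 - k / np.arange(n - c + 1, n + 1))
```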
ai
Xamarin-Forms-InAnger
# Xamarin.Forms.InAnger

Repository for the [Xamarin Forms in Anger](https://www.syntaxismyui.com/category/xamarin-forms-inanger/) Xamarin.Forms series on www.syntaxismyui.com.

Are you a skeptic of the promise of Xamarin.Forms? Do you believe that with Xamarin.Forms you can build native, cross-platform UI's with a single C# codebase? Early on, I struggled with the choice of whether to use Xamarin.Forms or a combination of Xamarin.iOS and Xamarin.Android.

On the Xamarin.Forms fence? To be completely honest, I am no longer a Xamarin.Forms skeptic, but I do have some reservations about its ability to make good-looking, cross-platform UI's. I know that Xamarin.Forms can make developer-level UI's. You know what I mean: it works, but it's ugly.

What can a design-conscious developer do when faced with the possibility of a bad UI? Scream? Run away? Move to HTML5? No, that won't do. It's time to get angry and swing Xamarin.Forms with all our might.

## UI Samples

- Find A Vet
- Phoenix Peaks
- Jobbberr
- Woofer
- Heatmap
- Hot Sauce Findr
- MallDash
front_end
circuit-ui
<div align="center">

[![Circuit UI Logo](https://circuit.sumup.com/images/logo-header.png)](https://circuit.sumup.com)

[![Stars](https://img.shields.io/github/stars/sumup-oss/circuit-ui?style=social)](https://github.com/sumup-oss/circuit-ui) [![Coverage](https://img.shields.io/codecov/c/github/sumup-oss/circuit-ui)](https://codecov.io/gh/sumup-oss/circuit-ui) [![License](https://img.shields.io/github/license/sumup-oss/circuit-ui)](LICENSE) [![Contributor Covenant](https://img.shields.io/badge/Contributor%20Covenant-v2.1%20adopted-ff69b4.svg)](https://github.com/sumup-oss/circuit-ui/tree/main/CODE_OF_CONDUCT.md)

[Circuit UI Web](https://circuit.sumup.com) is the web implementation of the [SumUp](https://sumup.com) Circuit Design System. Our primary goal is to create a system that can be used to build a wide variety of SumUp websites and apps, while providing a consistent and inclusive user experience to our end users. In addition, the design system and component library should be easy to use for developers and designers.

</div>

## Quick start

Here are a few helpful links for getting started with Circuit UI:

- [Documentation](https://circuit.sumup.com): learn how to use Circuit UI and view the components in Storybook
- [Getting started](https://circuit.sumup.com/?path=/docs/introduction-getting-started--docs): set up a new app with Circuit UI or add it to an existing project
- [Theming](https://circuit.sumup.com/?path=/docs/features-theme--docs): learn about our foundations such as colors, spacing, and typography
- [Contribute](https://circuit.sumup.com/?path=/docs/contributing-overview--docs): file a bug report, suggest a change, or open a pull request

## Packages

- [@sumup/circuit-ui](packages/circuit-ui): the core React component library
- [@sumup/design-tokens](packages/design-tokens): visual primitives such as typography, color, and spacing
- [@sumup/icons](packages/icons): a collection of SVG icons
- [@sumup/cna-template](packages/cna-template): bootstrap a fresh [Next.js](https://nextjs.org/) app with Circuit UI and [Foundry](https://github.com/sumup-oss/foundry)
- [@sumup/eslint-plugin-circuit-ui](packages/eslint-plugin-circuit-ui): ESLint rules to help users follow best practices when using Circuit UI
- [@sumup/stylelint-plugin-circuit-ui](packages/stylelint-plugin-circuit-ui): Stylelint rules to help users follow best practices when using Circuit UI

## Code of Conduct

We want to foster an inclusive and friendly community around our Open Source efforts. Like all SumUp Open Source projects, this project follows the Contributor Covenant Code of Conduct. Please, [read it and follow it](CODE_OF_CONDUCT.md).

If you feel another member of the community violated our CoC, or you are experiencing problems participating in our community because of another individual's behavior, please get in touch with our maintainers. We will enforce the CoC.

### Maintainers

- [Connor Bär](mailto:connor.baer@sumup.com)

## Contributing

If you have ideas for how we could improve this readme or the project in general, [let us know](https://github.com/sumup-oss/circuit-ui/issues) or [contribute some](https://github.com/sumup-oss/circuit-ui/edit/main/README.md)!

## Thanks

[![Chromatic logo](https://raw.githubusercontent.com/sumup-oss/circuit-ui/main/.storybook/public/images/chromatic.svg?sanitize=true)](https://www.chromatic.com/)

Thanks to [Chromatic](https://www.chromatic.com/) for providing the visual testing platform that helps us catch unexpected changes on time.

## About SumUp

[![SumUp logo](https://raw.githubusercontent.com/sumup-oss/assets/main/sumup-logo.svg?sanitize=true)](https://sumup.com)

[SumUp](https://sumup.com) is a mobile point-of-sale provider. It is our mission to make easy and fast card payments a reality across the entire world. You can pay with SumUp in more than 30 countries already. Our engineers work in Berlin, Cologne, Sofia, and São Paulo. They write code in JavaScript, Swift, Ruby, Elixir, Erlang, and much more. Want to come work with us? [Head to our careers page](https://sumup.com/careers) to find out more.
sumup circuit-ui react react-components design-system component-library
os
OpensourceResources
# FreeLearningResources

Free learning resources related to web development (HTML, JavaScript, CSS).

- [Introduction to Computer Science by Harvard University](https://www.youtube.com/playlist?list=plwkjhjtqvabmgw5fn5bqlwuug-8bdmabi)
- [HTML, CSS, and Javascript for Web Developers](https://www.classcentral.com/course/html-css-javascript-for-web-developers-4270)
- [Hackernoon article for free resources](https://hackernoon.com/top-5-free-online-courses-to-learn-html-css-web-development-in-2020-ae8e7466dfa7)
- [Tutorials on web development by WebDevSimplified](https://www.youtube.com/channel/ucfbnilppjauex4znoulh0cw)
- [CS50x](https://cs50.harvard.edu/x/2020/)
- [CS50 YouTube channel, Harvard (all playlists included)](https://www.youtube.com/channel/uccabw7890rkjzl968qweyka)
- [CS50 short introductory tutorials on JS, algorithms, HTML, AJAX: all building blocks covered (playlist)](https://www.youtube.com/playlist?list=plhqjrbd2t381k8ul4wq8sq165xqy149ww)
- [A free guide to learn HTML and CSS](https://marksheet.io)
- [Design guide](https://refactoringui.com)
- [Free HTML/CSS projects](https://www.codementor.io/projects/html-css)

## Other web dev essential resources

- [npm course for beginners on Udemy: Mastering the Basics](https://www.udemy.com/course/npm-mastering-the-basics)
- [React Hooks course on Scrimba, for free](https://scrimba.com/course/greacthooks)
- [FreeAPI list](https://public-apis.io)
- [NodeJS course by IBM](https://developer.ibm.com/technologies/node-js/articles/learn-node-unit-1-overview-nodejs-learning-path)
- [NodeJS by IBM, published in 2018](https://developer.ibm.com/tutorials/learn-nodejs-your-first-node-application)
- [HarvardX CS50: An Introduction to the Intellectual Enterprises of Computer Science and the Art of Programming](https://www.edx.org/course/cs50s-introduction-to-computer-science)
- [Free Bootstrap 5 cheat sheet](https://bootstrap-cheatsheet.themeselection.com)

For more tutorials and resources, you can look up more and contribute to this repo.

## ReactJS

- [Glitch: React Starter Kit](https://glitch.com/glimmer/post/react-starter-kit): a free, 5-part video course with interactive code examples that will help you learn React.
- [Codecademy: React 101](https://www.codecademy.com/learn/react-101): Codecademy's introductory course for React.
- [Egghead.io: Start Learning React](https://egghead.io/courses/start-learning-react): this series will explore the basic fundamentals of React to get you started.
- [React Crash Course 2018](https://www.youtube.com/watch?v=ke90tje7vs0): a beginner-friendly crash course through the most important React topics.
- [React Armory: Learn React by Itself](https://reactarmory.com/guides/learn-react-by-itself): with React Armory, you can learn React without the buzzwords.
- [The Road to Learn React](https://www.robinwieruch.de/the-road-to-learn-react): build a real-world application in plain React without complicated tooling.
- [Egghead.io: The Beginner's Guide to ReactJS](https://egghead.io/courses/the-beginner-s-guide-to-reactjs): free course for React newbies and those looking to get a better understanding of React fundamentals.
- [Free React Bootcamp](https://tylermcginnis.com/free-react-bootcamp): recordings from three days of a free online React bootcamp.
- [Scrimba: Learn React for free](https://scrimba.com/g/glearnreact): 48 hands-on video tutorials building React apps.
- [React + TypeScript: happy development](https://school.geekwall.in/p/b1y3g7-u4/react-typescript-happy-development): React TypeScript 30-minute tutorial.
- [DigitalOcean: several tutorials involving usage of React](https://www.digitalocean.com/community/tutorials?q=react)
- [Free ReactJS projects](https://www.codementor.io/projects/reactjs)

## Vue.js

- [Learn Vue.js: Full Course for Beginners](https://www.youtube.com/watch?v=4devcnjq3qc)

## HackerRank: free domain-wise tutorials and challenges with certified badges

- [30 Days of Code](https://www.hackerrank.com/domains/tutorials/30-days-of-code)
- [10 Days of JavaScript](https://www.hackerrank.com/domains/tutorials/10-days-of-javascript)
- [10 Days of Statistics](https://www.hackerrank.com/domains/tutorials/10-days-of-statistics)

## List of YouTubers who will help you land as a successful web developer

- [Computerphile](https://www.youtube.com/channel/uc9-y-6csu5wgm29i7jiwpna): best for learning foundational concepts of computer science
- [Program With Erik](http://erik.video)
- [freeCodeCamp.org](https://www.youtube.com/channel/uc8butisfwt-wl7ev0huk0bq)
- [Programming with Mosh](https://www.youtube.com/user/programmingwithmosh)
- [Traversy Media](https://www.youtube.com/user/techguyweb)
- [Derek Banas](https://www.youtube.com/user/derekbanas)
- [Caleb Curry](https://www.youtube.com/user/calebthevideomaker2)
- [The Net Ninja](https://www.youtube.com/channel/ucw5yeuermmlnqo4oq8vwupg)
- [Chris Hawkes](https://www.youtube.com/user/noobtoprofessional)
- [LearnCode.academy](https://www.youtube.com/user/learncodeacademy)
- [Eli the Computer Guy](https://www.youtube.com/user/elithecomputerguy)
- [Academind](https://www.youtube.com/channel/ucsjbgttlrdami-tdgpuv9-w/featured)
- [Level Up Tuts](https://www.youtube.com/user/leveluptuts)
- [Codecourse](https://www.youtube.com/user/phpacademy)
- [DevTips](https://www.youtube.com/user/devtipsfordesigners)
- [Tech Primers](https://www.youtube.com/channel/ucb12jjysyv-eipcvbdcmbxw)
- [Wes Bos](https://www.youtube.com/user/wesbos)
- [Dev Ed](https://www.youtube.com/channel/uclb90nqqcskpugdixsqez5q)
- [Data School](https://www.youtube.com/user/dataschool)
- [Paul Halliday](https://www.youtube.com/channel/ucyj9o6x1oft7ygxpfrwrcwg)
- [Andre Madarang](https://www.youtube.com/channel/uctb40eqj2inp8zuaqllx3iq)
- [Hitesh Choudhary](https://www.youtube.com/user/hiteshitube)
- [Jason Weimann](https://www.youtube.com/channel/ucx-b3nnqn5bzexm-22-nvvg)
- [Fireship](https://www.youtube.com/channel/ucsbjurrpoezykls9eqgamoa)
- [Ben Awad](https://www.youtube.com/user/99baddawg)
- [Coding Tech](https://www.youtube.com/channel/uctxcxg-uvsntkpozlh4wjaq)
- [Tech With Tim](https://www.youtube.com/channel/uc4jx40jdee-tinbkjycv4sg)
- [Corey Schafer](https://www.youtube.com/user/schafer5)
- [KodeKloud](https://www.youtube.com/user/mmumshad)
- [Online Tutorials](https://www.youtube.com/channel/ucbwxnuipzslfuckbpsc7jog)
- [Kevin Powell](https://www.youtube.com/user/kepowob)
- [Chris Coyier](https://www.youtube.com/user/realcsstricks)
- [mmtuts](https://www.youtube.com/user/thecharmefis)
- [Web Dev Simplified](https://www.youtube.com/channel/ucfbnilppjauex4znoulh0cw)
- [CodingTheSmartWay](https://www.youtube.com/channel/uclxqok41tociswty-bgb-kq)
- [Red Stapler](https://www.youtube.com/channel/ucrthrrv06q1iol86-ttkjhg)
- [ProgrammingKnowledge](https://www.youtube.com/user/programmingknowledge)
- [LearnWebCode](https://www.youtube.com/user/learnwebcode)
- [Darrel Wilson](https://www.youtube.com/channel/uc5alq-vmynfqzt7yangdcgw)
- [Layout Land](https://www.youtube.com/channel/uc7tizprgknbdalbhplrotag)
- [Back To Back SWE](https://www.youtube.com/channel/ucmjz2dv1a3yfgrr7gqrtuua)
- [sentdex](https://www.youtube.com/channel/ucfzlcwgwyyiq0alc5w48gbq)
- [John Philip Jones](https://www.youtube.com/user/johnphilipjones)
- [Qirolab](https://www.youtube.com/channel/ucjlgfwqfhhe9enhhqgmxiow)
- [WPCasts](https://www.youtube.com/c/wpcasts)
- [A Plus Coding](https://www.youtube.com/channel/uc-pll-sfem6b9eeyr-v424q)
- [Kirupa Chinnathambi](https://www.youtube.com/user/kirupa/videos)
- [Coding Addict](https://www.youtube.com/channel/ucmzfwxv5l-xtki693qmjpta/videos)
- [Codecourse](https://www.youtube.com/user/phpacademy/videos)
- [Steve Griffith](https://www.youtube.com/channel/uctbgxcjhorqjivtgtmsmkaq/videos)
- [Classsed](https://www.youtube.com/channel/uc2-slojimusc20drbf88qvg/videos)
- [Chris Courses](https://www.youtube.com/channel/uc9yp2yz6-pwhquplidv-mja/videos)
- [Flutter Official](https://www.youtube.com/channel/ucwxdfgee9kyzlddr7tg9cmw)
- [Sykoo](https://www.youtube.com/user/sykootv)
- [DesignCourse](https://www.youtube.com/channel/ucvyrimvfunma1upldpzg5ow)
- [Colt Steele](https://www.youtube.com/channel/ucrqaguppmodo0jfq6grikzw)
- [Real Python](https://www.youtube.com/channel/uci0vqvr9afn27yr6ej6n5ua)
- [MTechViral](https://www.youtube.com/channel/ucftm1fgjzskospdzgtbp7ha/videos)
- [Overseas Media](https://www.youtube.com/channel/ucpsbhzlaxkz-mmapd8vblxg)
- [IAmTimCorey](https://www.youtube.com/user/iamtimcorey)
- [Yury Kashnitsky](https://www.youtube.com/channel/ucggadkkgalfwsnbpsym5ryg)
- [Jeremy Howard](https://www.youtube.com/channel/ucx7y2qwrixpqocg97sfw2oq)
- [Codepunk](https://www.youtube.com/c/codepunk)
- [Lee Gaines](https://www.youtube.com/c/leegaines)
- [FusedVR](https://www.youtube.com/channel/uclo98khpnx6jwsdnh04l9yq)
- [Weibenfalk](https://www.youtube.com/c/weibenfalk)
- [Filled Stacks](https://www.youtube.com/channel/uc2d0bylqqcdf9ljfydl-02q)
- [CoffeeBeforeArch](https://www.youtube.com/channel/ucsi5-medm5q5ne93n-ya7ga)
- [CodeblobsTV](https://www.youtube.com/channel/ucxhcac1wjuufq3fc95i48fa)
- [Florin Pop](https://www.youtube.com/channel/uceu-1x402kt-jlldaitxsma)
- [codeSTACKr](https://www.youtube.com/codestackr)
- [Claudio Bernasconi](https://www.youtube.com/channel/uctbhpk0bioqwtxuggd195bw)
- [Reso Coder](https://www.youtube.com/channel/ucsivrn68cuk8cs8mbtbmbka)
- [Webslesson](https://www.youtube.com/channel/uc8nbgc4vui27hgbv2ffeihw)
- [codedamn](https://www.youtube.com/channel/ucjume61lxhbhudzuughl2wq)
- [London App Brewery](https://www.youtube.com/channel/ucvd5vh9lhlbxp3o1vrnyf-w)
- [Coding From Null](https://www.youtube.com/channel/uc9psp9-p9jghfdbreaacz3w)
- [VueScreencasts](https://www.youtube.com/channel/ucjask7cagrz1rgmjcorlxxq)
- [dcode](https://www.youtube.com/channel/ucjx0ftizbbvd3yoccxndc4g)
- [CodeLit](https://www.youtube.com/channel/ucumqhjjf9bsiavdjuhsiikw)
- [Java Brains](https://www.youtube.com/channel/ucyt1sfh5464xadbh0oh-o7q)
- [Telusko](https://www.youtube.com/user/javaboynavin)
- [kudvenkat](https://www.youtube.com/user/kudvenkat)
- [Awais Mirza](https://www.youtube.com/channel/ucikbbv7ae7lawa8cgnvjspa)
- [Amigoscode](https://www.youtube.com/channel/uc2kfmyem4kcua1zurravgyw)
- [Coder's Tape](https://www.youtube.com/channel/ucqi-ym2rlzx52veoqlpqmdg)
- [Dan Vega](https://www.youtube.com/danvega)
- [Code With Andrea](https://www.youtube.com/channel/ucrtnst4oyz53l0qgkqled5q/videos)
- [DarkCode](https://www.youtube.com/channel/ucd3kvjbb7aq2oioffuungzw/videos)
- [Devefy](https://www.youtube.com/channel/uc9dwxeavy-zcmas7rdox46w/videos)
- [Inspire UI](https://www.youtube.com/user/minhcasi/videos)
- [Alessandro Castellani](https://www.youtube.com/user/williamprey)

### Live coding (these channels live code)

- [Coding Garden with CJ](https://www.youtube.com/channel/uclngu-oupwoeesgtab33ccw)
- [The Coding Train](https://www.youtube.com/user/shiffman)

### Hindi

- [MentorsAdda](https://www.youtube.com/user/mentorsadda)
- [Edureka Hindi](https://www.youtube.com/channel/ucywyz4r4ftkexrpl9rf-ggw)
- [websofttutorials](https://www.youtube.com/user/websofttutorials)
- [CodeWithHarry](https://www.youtube.com/channel/ucevmnsshp-iviwkknt83cww)
- [C by Saurabh Shukla Sir](https://www.youtube.com/channel/ucd-scae4ju78dld1kpcsqfq)
- [Java by Saurabh Shukla Sir](https://www.youtube.com/channel/ucgr3vmya20jjdqvgfccujda)
- [Free Programmer](https://www.youtube.com/channel/uc0ut-8uems5ltdj4-d51zkq)
- [Durgasoft Solutions](https://www.youtube.com/user/durgasoftware)
- [Technical Suneja](https://www.youtube.com/channel/uchiberciys3hs0kjaz-at5q)
- [Love Babbar](https://www.youtube.com/channel/ucqhlxxbfrbfdrk1jf0motpw)
- [Abdul Bari](https://www.youtube.com/channel/uczcft11cwbi3mhnlgf019nw)
- [Jenny's Lectures CS/IT NET&JRF](https://www.youtube.com/channel/ucm-yutygmrnvkoccal21g3w)
- [Geeky Shows](https://www.youtube.com/user/geekyshow1)
- [Knowledge Gate](https://www.youtube.com/channel/uca6yfpyhy5swmjrgot-oaiq)
- [Tutorials Point (India) Ltd.](https://www.youtube.com/channel/ucvlbzhxvtitlivkegv7webg)
- [Bhagwan Singh Vishwakarma](https://www.youtube.com/channel/ucbmdoqkbaiuh128ffq8xatw)

## GitHub Learning Lab

Today I am going to tell you about which resources GitHub Learning Lab provides for free.

### Learning path

First, the basics of GitHub/Git. In this path they cover all the basics related to GitHub/Git, listed as follows:

- [First day on GitHub](https://lab.github.com/githubtraining/paths/first-day-on-github)
- [First week on GitHub](https://lab.github.com/githubtraining/paths/first-week-on-github)
- [InnerSource: theory to practice](https://lab.github.com/githubtraining/paths/innersource-theory-to-practice)
- [Become an open source enterprise](https://lab.github.com/githubtraining/paths/become-an-open-source-enterprise)
- [GitHub Universe workshops](https://lab.github.com/githubtraining/paths/github-universe-workshops)
- [Ramp up on Git and GitHub](https://lab.github.com/githubtraining/paths/ramp-up-on-git-and-github)
- [Setting up a blog with GitHub Pages](https://lab.github.com/githubtraining/github-pages)

Now the most important part: languages, tools, and technologies.

- [Introduction to HTML](https://lab.github.com/githubtraining/introduction-to-html)
- [Introduction to Node with Express](https://lab.github.com/everydeveloper/introduction-to-node-with-express)
- [Introduction to Python](https://lab.github.com/everydeveloper/introduction-to-python)
- [Intermediate Python](https://lab.github.com/everydeveloper/intermediate-python)
- [Intermediate NodeJS course (with MongoDB CRUD app)](https://lab.github.com/everydeveloper/intermediate-nodejs-course)
- [Introduction to Ruby](https://lab.github.com/everydeveloper/introduction-to-ruby)
- [TensorFlow image processing](https://lab.github.com/everydeveloper/introduction-to-tensorflow)
- [Get started with TensorFlow](https://lab.github.com/everydeveloper/advance-tensorflow)
- [Introduction to Java](https://lab.github.com/everydeveloper/introduction-to-java)
- [Introduction to PHP](https://lab.github.com/everydeveloper/introduction-to-php)
- [Introduction to design thinking](https://lab.github.com/githubtraining/introduction-to-design-thinking)
- [Starting with TypeScript](https://lab.github.com/michael-spengler/starting-with-typescript)
- [Introduction to React](https://lab.github.com/githubtraining/introduction-to-react)

And many more (NodeJS REST API, store framework, and so on); you can check them out. Adding links sucks, so now let's move forward.

### DevOps with GitHub (GitHub Actions)

- [GitHub Actions: Continuous Delivery with AWS](https://lab.github.com/githubtraining/github-actions-continuous-delivery-with-aws)
- [GitHub Actions: Continuous Integration](https://lab.github.com/githubtraining/github-actions-continuous-integration)
- [GitHub Actions: Publish to GitHub Packages](https://lab.github.com/githubtraining/github-actions-publish-to-github-packages)
- [GitHub Actions: Writing JavaScript Actions](https://lab.github.com/githubtraining/github-actions-writing-javascript-actions)
- [Getting started with GitHub Apps](https://lab.github.com/githubtraining/getting-started-with-github-apps)
- [Continuous integration with CircleCI](https://lab.github.com/githubtraining/continuous-integration-with-circleci)
- [Continuous integration with Travis CI](https://lab.github.com/githubtraining/continuous-integration-with-travis-ci)

And more learning labs on GitHub Enterprise; you can check them out on [GitHub Learning Lab](https://lab.github.com/).

### Docker

- [The Docker Handbook](https://www.freecodecamp.org/news/the-docker-handbook)

### Free open source sites for reading blogs

- [dev.to](https://dev.to)

### Game dev YouTube channels

- Best game dev channel (C++): [javidx9](https://www.youtube.com/channel/uc-yuwvuplujzvieeligkbka)

### React and Firebase

- [A Firebase in React tutorial for beginners](https://www.robinwieruch.de/complete-firebase-authentication-react-tutorial)
- [Getting started with React and Firebase (Firecasts)](https://www.youtube.com/watch?v=mwnatxfusgi)

**Final tip:** after learning the basics of HTML, CSS, and JS, you can go for the freeCodeCamp certifications; those can help you.

Please add more resources if you have any. I will definitely merge your PR, and day by day I am also going to add more resources.
learning free freecodecamp resources webdevelopment css javascript html html-css-javascript open-source react tutorials scrimba egghead codecademy typescript fundamentals nodejs hacktoberfest hacktoberfest2021
front_end
elements
# Elements Project blockchain platform

[![Build Status](https://travis-ci.org/ElementsProject/elements.svg?branch=master)](https://travis-ci.org/ElementsProject/elements)

https://elementsproject.org

This is the integration and staging tree for the Elements blockchain platform, a collection of feature experiments and extensions to the Bitcoin protocol. This platform enables anyone to build their own businesses or networks pegged to Bitcoin as a sidechain or run as a standalone blockchain with arbitrary asset tokens.

## Modes

Elements supports a few different pre-set chains for syncing. Note though, some are intended for QA and debugging only:

* Liquid mode: `elementsd -chain=liquidv1` (syncs with the Liquid network)
* Bitcoin mainnet mode: `elementsd -chain=main` (not intended to be run for commerce)
* Bitcoin testnet mode: `elementsd -chain=testnet3`
* Bitcoin regtest mode: `elementsd -chain=regtest`
* Elements custom chains: any other `-chain=` argument. It has regtest-like default parameters that can be over-ridden by the user via a rich set of start-up options.

## Confidential Assets

The latest feature in the Elements blockchain platform is Confidential Assets: the ability to issue multiple assets on a blockchain where asset identifiers and amounts are blinded yet auditable through the use of applied cryptography.

* [Announcement of Confidential Assets](https://blockstream.com/2017/04/03/blockstream-releases-elements-confidential-assets.html)
* [Confidential Assets Whitepaper](https://blockstream.com/bitcoin17-final41.pdf), to be presented April 7th at [Financial Cryptography 2017](http://fc17.ifca.ai/bitcoin/schedule.html) in Malta
* [Confidential Assets Tutorial](contrib/assets_tutorial/assets_tutorial.py)
* [Confidential Assets Demo](https://github.com/ElementsProject/confidential-assets-demo)
* [Elements Code Tutorial](https://elementsproject.org/elements-code-tutorial/overview), covering blockchain configuration and how to use the main features

## Features of the Elements blockchain platform

Compared to Bitcoin itself, it adds the following features:

* [Confidential Assets][asset issuance]
* [Confidential Transactions][confidential transactions]
* [Federated Two-Way Peg][federated peg]
* [Signed Blocks][signed blocks]
* [Additional opcodes][opcodes]

Previous elements that have been integrated into Bitcoin:

* Segregated Witness
* Relative Lock Time

Elements deferred for additional research and standardization:

* [Schnorr Signatures][schnorr signatures]

Additional RPC commands and parameters:

* [RPC Docs](https://elementsproject.org/en/doc/)

The CI (Continuous Integration) systems make sure that every pull request is built for Windows, Linux, and macOS, and that unit/sanity tests are run automatically.

## License

Elements is released under the terms of the MIT license. See [COPYING](COPYING) for more information or see http://opensource.org/licenses/MIT.

[confidential transactions]: https://elementsproject.org/features/confidential-transactions
[opcodes]: https://elementsproject.org/features/opcodes
[federated peg]: https://elementsproject.org/features#federatedpeg
[signed blocks]: https://elementsproject.org/features#signedblocks
[asset issuance]: https://elementsproject.org/features/issued-assets
[schnorr signatures]: https://elementsproject.org/features/schnorr-signatures

## What is the Elements Project?

Elements is an open source, sidechain-capable blockchain platform. It also allows experiments to more rapidly bring technical innovation to the Bitcoin ecosystem.

Learn more on the [Elements Project website](https://elementsproject.org) ([source](https://github.com/ElementsProject/elementsproject.github.io)).

## Secure Reporting

See our [vulnerability reporting guide](SECURITY.md).
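Since `elementsd` inherits bitcoind's JSON-RPC interface, a running node can be queried as in the sketch below. The URL, port, and credentials are placeholders for whatever your node's configuration specifies; `getblockchaininfo` is part of the inherited RPC surface.

```python
# A sketch: query a locally running elementsd node over JSON-RPC.
# elementsd inherits the bitcoind-style JSON-RPC interface, so
# "getblockchaininfo" exists; URL, port, and credentials below are
# placeholders matching whatever rpcuser/rpcpassword your node uses.
import base64
import json
import urllib.request

def rpc_call(method, params=None, url="http://127.0.0.1:7041",
             user="user", password="pass"):
    payload = json.dumps({"jsonrpc": "1.0", "id": "readme",
                          "method": method, "params": params or []}).encode()
    request = urllib.request.Request(url, data=payload)
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    request.add_header("Authorization", f"Basic {token}")
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())["result"]

# Prints the active chain name, e.g. "liquidv1" in Liquid mode.
print(rpc_call("getblockchaininfo")["chain"])
```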
blockchain
freeRTOS_UBIFS
## Summary

The purpose of this project is to port UBIFS to FreeRTOS.
os
tokencore
<h1 align="center">TokenCore</h1>

<p align="center">
  <a href="https://travis-ci.com/galaxyscitech/tokencore"><img src="https://travis-ci.com/galaxyscitech/tokencore.svg?branch=master" alt="Build Status"></a>
  <a href="https://github.com/galaxyscitech/tokencore/issues"><img src="https://img.shields.io/github/issues/galaxyscitech/tokencore.svg" alt="Issues"></a>
  <a href="https://github.com/galaxyscitech/tokencore/pulls"><img src="https://img.shields.io/github/issues-pr/galaxyscitech/tokencore.svg" alt="Pull Requests"></a>
  <a href="https://github.com/galaxyscitech/tokencore/graphs/contributors"><img src="https://img.shields.io/github/contributors/galaxyscitech/tokencore.svg" alt="Contributors"></a>
  <a href="LICENSE"><img src="https://img.shields.io/github/license/galaxyscitech/tokencore.svg" alt="License"></a>
</p>

## Contact

[Galaxy](https://galaxy.doctor): official website

## Use Cases

This serves as the exchange wallet backend. For more details, check out [java-wallet](https://github.com/galaxyscitech/java-wallet).

## Introduction

TokenCore is a central component for blockchain wallet backends. It currently supports the following: BTC, OMNI, ETH, ERC20, TRX, TRC20, BCH, BSV, DOGE, DASH, LTC, FILECOIN.

## Integration

### Gradle

```gradle
repositories {
    maven { url 'https://jitpack.io' }
}
dependencies {
    compile 'com.github.galaxyscitech:tokencore:1.2.7'
}
```

### Maven

```xml
<repositories>
    <repository>
        <id>tronj</id>
        <url>https://dl.bintray.com/tronj/tronj</url>
    </repository>
    <repository>
        <id>jitpack.io</id>
        <url>https://jitpack.io</url>
    </repository>
</repositories>

<dependency>
    <groupId>com.github.galaxyzxcv</groupId>
    <artifactId>tokencore</artifactId>
    <version>1.2.7</version>
</dependency>
```

## Sample Test

View a sample test at [tokencore test sample](https://github.com/galaxyscitech/tokencore/blob/master/src/test/java/org/consenlabs/tokencore/Test.java).

## Usage Guide

### Initialize identity

```java
try {
    Files.createDirectories(Paths.get(KeystoreProperties.dir, "wallets"));
} catch (Throwable ignored) {
}
WalletManager.storage = new KeystoreStorage();
WalletManager.scanWallets();
String password = "123456";
Identity identity = Identity.getCurrentIdentity();
if (identity == null) {
    Identity.createIdentity("token", password, Network.MAINNET, Metadata.P2WPKH);
}
```

### Generate wallet

```java
Identity identity = Identity.getCurrentIdentity();
String password = "123456";
Wallet wallet = identity.deriveWalletByMnemonics(ChainType.BITCOIN, password, MnemonicUtil.randomMnemonicCodes());
System.out.println(wallet.getAddress());
```

### Offline signature

Offline signing refers to the process of creating a digital signature for a transaction without connecting to the internet. This method enhances security by ensuring private keys never come in contact with an online environment. Here's how you can create an offline signature with TokenCore for Bitcoin and Tron.

#### Bitcoin

1. **Set up transaction details**: define the details of your Bitcoin transaction, including the recipient's address, change index, amount to be transferred, and the fee.

```java
String password = "123456";
String toAddress = "33sxfhcbpyhqevsvthmyyoncbshw5xjzn9";
int changeIdx = 0;
long amount = 1000L;
long fee = 555L;
```

2. **Fetch UTXOs**: you'll need UTXOs (unspent transaction outputs) for the transaction. Usually these are fetched from a node or an external API.

```java
ArrayList<BitcoinTransaction.UTXO> utxos = new ArrayList<>();
```

3. **Initialize transaction and sign**: with all the details in place, initialize the Bitcoin transaction and sign it offline.

```java
BitcoinTransaction bitcoinTransaction = new BitcoinTransaction(toAddress, changeIdx, amount, fee, utxos);
Wallet wallet = WalletManager.findWalletByAddress(ChainType.BITCOIN, "33sxfhcbpyhqevsvthmyyoncbshw5xjzn9");
TxSignResult txSignResult = bitcoinTransaction.signTransaction(String.valueOf(ChainId.BITCOIN_MAINNET), password, wallet);
System.out.println(txSignResult);
```

#### Tron

1. **Set up transaction details**: define your Tron transaction details, including the sender's address, recipient's address, and amount.

```java
String from = "tjrabprwbzy45sbavfcjinpjc18kjprtv8";
String to = "tf17bgpazybz8oxbjhriubpdsa7arkolx3";
long amount = 1;
String password = "123456";
```

2. **Initialize transaction and sign**: once you have the transaction details, initialize the Tron transaction and sign it offline.

```java
TronTransaction transaction = new TronTransaction(from, to, amount);
// Look up the sender's wallet by its Tron address.
Wallet wallet = WalletManager.findWalletByAddress(ChainType.TRON, from);
TxSignResult txSignResult = transaction.signTransaction(String.valueOf(ChainId.BITCOIN_MAINNET), password, wallet);
System.out.println(txSignResult);
```

Remember, offline signing enhances security but requires a thorough understanding of transaction construction to avoid errors.

## Note

TokenCore is a functional component for digital currency. It's primarily for learning purposes and doesn't offer a complete blockchain business suite.
doge bsv bch ltc trx btc eth omni filecoin blockchain
blockchain
natural-language-processing
Natural Language Processing course, MSAI, Fall 2021.
ai
sdk-mcuboot
# [MCUboot](http://mcuboot.com/)

[![Package on PyPI](https://img.shields.io/pypi/v/imgtool.svg)][pypi]
[![Coverity Scan Build Status](https://scan.coverity.com/projects/12307/badge.svg)][coverity]
[![Build Status (Sim)](https://github.com/mcu-tools/mcuboot/workflows/Sim/badge.svg)][sim]
[![Build Status (Mynewt)](https://github.com/mcu-tools/mcuboot/workflows/Mynewt/badge.svg)][mynewt]
[![Publishing Status (imgtool)](https://github.com/mcu-tools/mcuboot/workflows/imgtool/badge.svg)][imgtool]
[![Build Status (Travis CI)](https://img.shields.io/travis/mcu-tools/mcuboot/main.svg?label=travis-ci)][travis]
[![Apache 2.0](https://img.shields.io/badge/license-apache%202.0-blue.svg)][license]

[pypi]: https://pypi.org/project/imgtool/
[coverity]: https://scan.coverity.com/projects/mcuboot
[sim]: https://github.com/mcu-tools/mcuboot/actions?query=workflow:Sim
[mynewt]: https://github.com/mcu-tools/mcuboot/actions?query=workflow:Mynewt
[imgtool]: https://github.com/mcu-tools/mcuboot/actions?query=workflow:imgtool
[travis]: https://travis-ci.org/mcu-tools/mcuboot
[license]: https://github.com/mcu-tools/mcuboot/blob/main/LICENSE

This is MCUboot version 1.11.0-dev.

MCUboot is a secure bootloader for 32-bit microcontrollers. It defines a common infrastructure for the bootloader and the system flash layout on microcontroller systems, and provides a secure bootloader that enables easy software upgrade.

MCUboot is not dependent on any specific operating system or hardware and relies on hardware porting layers from the operating system it works with. Currently, MCUboot works with the following operating systems and SoCs:

- [Zephyr](https://www.zephyrproject.org/)
- [Apache Mynewt](https://mynewt.apache.org/)
- [Apache NuttX](https://nuttx.apache.org/)
- [RIOT](https://www.riot-os.org/)
- [Mbed OS](https://os.mbed.com/)
- [Espressif](https://www.espressif.com/)
- [Cypress/Infineon](https://www.cypress.com/)

RIOT is supported only as a boot target. We will accept any new port contributed by the community once it is good enough.

## MCUboot how-tos

See the following pages for instructions on using MCUboot with different operating systems and SoCs:

- [Zephyr](docs/readme-zephyr.md)
- [Apache Mynewt](docs/readme-mynewt.md)
- [Apache NuttX](docs/readme-nuttx.md)
- [RIOT](docs/readme-riot.md)
- [Mbed OS](docs/readme-mbed.md)
- [Espressif](docs/readme-espressif.md)
- [Cypress/Infineon](boot/cypress/readme.md)

There are also instructions for the [Simulator](sim/readme.rst).

## Roadmap

The issues being planned and worked on are tracked using GitHub issues. To give your input, visit [MCUboot GitHub Issues](https://github.com/mcu-tools/mcuboot/issues).

## Source files

You can find additional documentation on the bootloader in the source files. For more information, use the following links:

- [boot/bootutil](https://github.com/mcu-tools/mcuboot/tree/main/boot/bootutil): the core of the bootloader itself
- [boot/boot_serial](https://github.com/mcu-tools/mcuboot/tree/main/boot/boot_serial): support for serial upgrade within the bootloader itself
- [boot/zephyr](https://github.com/mcu-tools/mcuboot/tree/main/boot/zephyr): port of the bootloader to Zephyr
- [boot/mynewt](https://github.com/mcu-tools/mcuboot/tree/main/boot/mynewt): bootloader application for Apache Mynewt
- [boot/nuttx](https://github.com/mcu-tools/mcuboot/tree/main/boot/nuttx): bootloader application and port of MCUboot interfaces for Apache NuttX
- [boot/mbed](https://github.com/mcu-tools/mcuboot/tree/main/boot/mbed): port of the bootloader to Mbed OS
- [boot/espressif](https://github.com/mcu-tools/mcuboot/tree/main/boot/espressif): bootloader application and MCUboot port for Espressif SoCs
- [boot/cypress](https://github.com/mcu-tools/mcuboot/tree/main/boot/cypress): bootloader application and MCUboot port for Cypress/Infineon SoCs
- [imgtool](https://github.com/mcu-tools/mcuboot/tree/main/scripts/imgtool.py): a tool to securely sign firmware images for booting by MCUboot
- [sim](https://github.com/mcu-tools/mcuboot/tree/main/sim): a bootloader simulator for testing and regression

## Joining the project

Developers are welcome! Use the following links to join or see more about the project:

- [Our developer mailing list](https://groups.io/g/mcuboot)
- [Our Slack channel](https://mcuboot.slack.com/): [get your invite](https://join.slack.com/t/mcuboot/shared_invite/mje2ndcwmtq2mtyylte1mda4mtizntatyzgyztu0njfkmg)
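As a taste of the signing step that imgtool (badged above) performs, here is a sketch invoking it from Python. The key, sizes, and file names are placeholders, and the required flag set varies between imgtool versions, so consult `imgtool sign --help` for your installation.

```python
# A sketch (not from this README): sign a firmware image with imgtool so
# that MCUboot can validate it. All paths and sizes are placeholders, and
# flag requirements differ between imgtool versions.
import subprocess

subprocess.run(
    [
        "imgtool", "sign",
        "--key", "root-rsa-2048.pem",  # signing key (placeholder)
        "--header-size", "0x200",      # space reserved for the MCUboot header
        "--align", "8",                # flash write alignment of the target
        "--version", "1.0.0",          # version embedded in the image header
        "--slot-size", "0x60000",      # size of the image slot in flash
        "app.bin",                     # raw application binary
        "app-signed.bin",              # output image MCUboot can validate
    ],
    check=True,
)
```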
os
CANopenNode
canopennode canopennode is free and open source canopen protocol stack canopen is the internationally standardized en 50325 4 cia301 http can cia org standardization technical documents higher layer protocol for embedded control system built on top of can for more information on canopen see http www can cia org canopennode is written in ansi c in object oriented way it runs on different microcontrollers as standalone application or with rtos variables communication device custom are collected in canopen object dictionary and are accessible from both c code and from canopen network canopennode homepage is https github com canopennode canopennode this is version 4 of canopennode with new object dictionary implementation for older versions git checkout branches v1 3 master or v2 0 master characteristics canopen object dictionary https www can cia org can knowledge canopen device architecture offers clear and flexible organisation of any variables variables can be accessed directly or via read write functions nmt https www can cia org can knowledge canopen network management slave to start stop reset device simple nmt master heartbeat https www can cia org can knowledge canopen error control protocols producer consumer error control for monitoring of canopen devices an older alternative node guarding is also available pdo https www can cia org can knowledge canopen pdo protocol for broadcasting process data with high priority and no protocol overhead variables from object dictionary can be dynamically mapped to the tpdo which is then transmitted according to communication rules and received as rpdo by another device sdo https www can cia org can knowledge canopen sdo protocol server enables expedited segmented and block transfer access to all object dictionary variables inside canopen device sdo https www can cia org can knowledge canopen sdo protocol client can access any object dictionary variable on any canopen device inside the network emergency https www can cia org can knowledge canopen special function protocols message producer consumer sync https www can cia org can knowledge canopen special function protocols producer consumer enables network synchronized transmission of the pdo objects etc time stamp https www can cia org can knowledge canopen special function protocols producer consumer enables date and time synchronization in millisecond resolution lss https www can cia org can knowledge canopen cia305 canopen node id and bitrate setup master and slave lss fastscan canopen gateway https www can cia org can knowledge canopen cia309 cia309 3 ascii command interface for nmt master lss master and sdo client canopen safety en 50325 5 cia304 pdo like communication in safety relevant networks canopen conformance test tool https www can cia org services test center conformance test tool passed other suitable for 16 bit microcontrollers and above device support multithreaded real time canopennode flowchart object dictionary editor object dictionary editor non volatile storage for object dictionary or other variables automatic or controlled by standard canopen commands configurable power saving possible power saving bootloader possible https github com canopennode canopennode issues 111 for firmware update related projects canopennode https github com canopennode canopennode this project canopen protocol stack base for canopen device it contains no device specific code drivers which must be added separately for each target system an example shows the basic principles compiles on any system 
but does not connect to any can hardware canopendemo https github com canopennode canopendemo demo device with canopennode and different target systems tutorial and testing tools canopennode github io https github com canopennode canopennode github io html documentation compiled by doxygen for canopendemo canopennode and other devices available also online https canopennode github io canopeneditor https github com canopennode canopeneditor object dictionary editor external gui tool for editing canopen object dictionary for custom device it generates c source code electronic data sheet and documentation for the device it is a fork from libedssharp https github com robincornelius libedssharp canopenlinux https github com canopennode canopenlinux canopennode on linux devices it can be a basic canopen device or more advanced with commander functionalities canopenstm32 https github com canopennode canopenstm32 canopennode on stm32 microcontrollers canopenpic https github com canopennode canopenpic canopennode on pic microcontrollers from microchip works with 16 bit and 32 bit devices includes example for arduino style max32 https reference digilentinc com reference microprocessor max32 start board doc devicesupport md doc devicesupport md list of other implementations of canopennode on different devices documentation support and contributions all code is documented in the source header files some additional documents are in doc directory to generate complete html documentation run doxygen http www doxygen nl in the project base directory sudo apt install doxygen graphviz pdf2svg doxygen dev null complete generated documentation is also available online https canopennode github io tutorial demo device and tests are available in canopendemo https github com canopennode canopendemo repository report issues on https github com canopennode canopennode issues older discussion group is on sourceforge http sourceforge net p canopennode discussion 387151 contributions are welcome best way to contribute your code is to fork a project modify it and then send a pull request please follow the recommended c style and coding rules https github com majerle c code style like indentation of 4 spaces etc there is also a codingstyle file with example canopennode flowchart flowchart of a typical canopennode implementation program start canopen init start threads can receive thread timer interval thread mainline thread fast response realtime thread with processing of time detect can id constant interval consuming tasks partially process typically 1ms in canopen objects messages and copy network synchronized sdo server data to target copy inputs rpdos emergency canopen objects hw to object dict network state may call application heartbeat for some processing lss slave copy variables from gateway optional object dictionary to nmt master outputs tpdos hw sdo client lss master may cyclically call application code all code of the canopennode is non blocking code in source files is collected into objects parts of the code can be enabled disabled so only files and parts of code can be used which are required for the project see stack configuration in 301 co config h file for most efficiency code can run in different thread as seen in above flowchart this is suitable for microcontrollers it is also possible to run everything from single thread as available on linux devices code includes mechanisms which triggers processing of od objects when necessary in canopen initialization section all canopen objects are initialized in 
run time canopen objects are processed cyclically files canopen h and canopen c is a joint of all canopen objects it may seems complex but offers some flexibility and is suitable for most common configurations of the canopen objects canopen objects can be defined in global space or can be dynamically allocated object dictionary can be used default od h c files but configuration with multiple object dictionaries is also possible by using the co config t structure canopen h and canopen c files can also be only a reference for more customized implementation of canopennode based device object dictionary is a collection of all network accessible variables and offers most flexible usage od variables can be initialized by object dictionary or application can specify own read write access functions for specific od variables groups of od variables are also able to be stored to non volatile memory either on command or automatically file structure 301 canopen application layer and communication profile co config h configuration macros for canopennode co driver h interface between can hardware and canopennode co odinterface h c canopen object dictionary interface co emergency h c canopen emergency protocol co hbconsumer h c canopen heartbeat consumer protocol co nmt heartbeat h c canopen network management and heartbeat producer protocol co pdo h c canopen process data object protocol co sdoclient h c canopen service data object client protocol master functionality co sdoserver h c canopen service data object server protocol co sync h c canopen synchronisation protocol producer and consumer co time h c canopen time stamp protocol co fifo h c fifo buffer for sdo and gateway data transfer crc16 ccitt h c calculation of crc 16 ccitt polynomial 303 canopen recommendation co leds h c canopen led indicators 304 canopen safety co srdo h c canopen safety relevant data object protocol co gfc h c canopen global failsafe command producer and consumer 305 canopen layer setting services lss and protocols co lss h canopen layer setting services protocol common co lssmaster h c canopen layer setting service master protocol co lssslave h c canopen layer setting service slave protocol 309 canopen access from other networks co gateway ascii h c ascii mapping nmt master lss master sdo client storage co storage h c canopen data storage base object co storageeeprom h c canopen data storage object for storing data into block device eeprom co eeprom h eeprom interface for use with co storageeeprom functions are target system specific extra co trace h c canopen trace object for recording variables over time example directory with basic example should compile on any system co driver target h example hardware definitions for canopennode co driver blank c example blank interface for canopennode main blank c mainline and other threads example template co storageblank h c example blank demonstration for data storage to non volatile memory makefile makefile for example ds301 profile xpd canopen device description file for ds301 it includes also canopennode specific properties this file is also available in profiles in object dictionary editor ds301 profile eds ds301 profile md standard canopen eds file and markdown documentation file automatically generated from ds301 profile xpd od h c canopen object dictionary source files automatically generated from ds301 profile xpd doc directory with documentation changelog md change log file devicesupport md information about supported devices objectdictionary md description of canopen 
object dictionary interface canopennode png little icon html directory with documentation must be generated by doxygen canopen h c initialization and processing of canopen objects suitable for common configurations codingstyle example of the coding style doxyfile configuration file for the documentation generator doxygen license license readme md this file object dictionary editor object dictionary is one of the most essential parts of canopen to customize the object dictionary it is necessary to use external application canopeneditor https github com canopennode canopeneditor latest pre compiled binaries https github com canopennode canopeneditor archive refs heads build zip are also available just extract the zip file and run the edseditor exe in linux it runs with mono which is available by default on ubuntu just set file permissions to executable and then execute the program in program in preferences set exporter to canopennode v4 then start new project or open the existing project file many project file types are supported eds xdd v1 0 xdd v1 1 old custom xml format generated project file can then be saved in xdd v1 1 file format xmlns http www canopen org xml 1 1 project file can also be exported to other formats it can be used to generate documentation and canopennode source files for object dictionary if new project was started then ds301 profile xpd may be inserted if existing old project is edited then existing communication specific parameters may be deleted and then new ds301 profile xpd may be inserted alternative is editing existing communication parameters with observation to object dictionary requirements by canopennode in objectdictionary md doc objectdictionary md to clone add or delete select object s and use right click some knowledge of canopen is required to correctly set up the custom object dictionary separate objects can also be inserted from another project canopennode includes some custom properties inside standard project file see objectdictionary md doc objectdictionary md for more information device support canopennode can run on many different devices each device or microcontroller must have own interface to canopennode canopennode can run with or without operating system it is not practical to have all device interfaces in a single project interfaces to other microcontrollers are in separate projects see devicesupport md doc devicesupport md for list of known device interfaces some details rtr rtr remote transmission request is a feature of can bus usage of rtr is not recommended for canopen rtr pdo is not implemented in canopennode error control when node is started in nmt operational state it is allowed to send or receive process data objects pdo if error register object 0x1001 is set then nmt operational state may not be allowed power saving all canopen objects calculates next timer info for os calculation is based on various timers which expire in known time can be used to put microcontroller into sleep and wake at the calculated time change log see changelog md doc changelog md license licensed under the apache license version 2 0 the license you may not use this file except in compliance with the license you may obtain a copy of the license at http www apache org licenses license 2 0 unless required by applicable law or agreed to in writing software distributed under the license is distributed on an as is basis without warranties or conditions of any kind either express or implied see the license for the specific language governing permissions and 
limitations under the license
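The readme above notes that OD variables can be backed either by plain storage in the object dictionary or by application-supplied read/write access functions. As a language-neutral illustration only — a toy Python model, not CANopenNode's C interface, and the entry layout and hook names here are assumptions — an object dictionary can be thought of as a map from (index, subindex) to either a stored value or a pair of access hooks:

```python
# Toy model of a CANopen object dictionary (conceptual only, not the C API).
class ObjectDictionary:
    def __init__(self):
        self._entries = {}

    def add(self, index, subindex, value=None, read=None, write=None):
        # an entry holds either plain storage or custom read/write hooks
        self._entries[(index, subindex)] = {
            'value': value, 'read': read, 'write': write}

    def read(self, index, subindex):
        e = self._entries[(index, subindex)]
        return e['read']() if e['read'] else e['value']

    def write(self, index, subindex, value):
        e = self._entries[(index, subindex)]
        if e['write']:
            e['write'](value)
        else:
            e['value'] = value


od = ObjectDictionary()
od.add(0x1017, 0x00, value=1000)            # e.g. producer heartbeat time, ms
od.add(0x6041, 0x00, read=lambda: 0x0237)   # e.g. a statusword via a custom hook
print(hex(od.read(0x6041, 0x00)))           # -> 0x237
```

In CANopenNode itself the equivalent functionality lives in CO_ODinterface.h/.c, with persistence handled by the CO_storage objects listed in the file structure above.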
canopen canopennode embedded stack iot
server
blockchain
blockchain forthebadge https forthebadge com images badges built with love svg forthebadge https forthebadge com images badges for you svg forthebadge https forthebadge com images badges powered by coffee svg https img shields io badge excitement high red https img shields io badge maintained yes indigo https img shields io badge pull requests accepting yellow https img shields io github forks krvaibhaw blockchain https img shields io github contributors krvaibhaw blockchain https img shields io github issues krvaibhaw blockchain https img shields io github stars krvaibhaw blockchain https img shields io badge contributions accepting pink https img shields io github license krvaibhaw blockchain https img shields io badge by me a coffee paypal skyblue https www paypal com paypalme krvaibhaw 100 https img shields io badge python blue https img shields io badge html blueviolet learn blockchains by building one yourself p align center img src preview preview png p installation 1 make sure python 3 6 https www python org downloads is installed 2 install flask web framework https flask palletsprojects com en 2 0 x 3 clone this repository git clone https github com krvaibhaw blockchain git 4 change directory cd blockchain 5 install requirements pip install requirements txt 6 run the server python blockchain py 7 head to the web browser and visit http 127 0 0 1 5000 introduction blockchain is a specific type of database it differs from a typical database in the way it stores information blockchains store data in blocks that are then chained together as new data comes in it is entered into a fresh block once the block is filled with data it is chained onto the previous block which makes the data chained together in chronological order different types of information can be stored on a blockchain but the most common use so far has been as a ledger for transactions what is blockchain a blockchain is essentially a digital ledger of transactions that is duplicated and distributed across the entire network of computer systems on the blockchain it is a growing list of records called blocks that are linked together using cryptography each block contains a cryptographic hash of the previous block a timestamp and transaction data generally represented as a merkle tree the timestamp proves that the transaction data existed when the block was published in order to get into its hash as blocks each contain information about the block previous to it by cryptographic hash of the previous block they form a chain with each additional block reinforcing the ones before it therefore blockchains are resistant to modification of their data because once recorded the data in any given block cannot be altered retroactively without altering all subsequent blocks how does it works blockchains are typically managed by a peer to peer network for use as a publicly distributed ledger where nodes collectively adhere to a protocol to communicate and validate new blocks although blockchain records are not unalterable as forks are possible blockchains may be considered secure by design and exemplify a distributed computing system with high byzantine fault tolerance why blockchain immutable blockchains are resistant to modification of their data because once recorded the data in any given block cannot be altered retroactively without altering all subsequent blocks decentralized it doesn t have any governing authority or a single person looking after the framework rather a group of nodes maintains the network making it decentralized it 
means transparency user control less prone to breakdown less chance of failure no third party enhanced security if someone wants to corrupt the network he she would have to alter every data stored on every node in the network there could be millions and millions of people where everyone has the same copy of the ledger distributed ledgers the ledger on the network is maintained by all other users on the system this distributed computational power across the computers to ensure a better outcome it ensures no malicious changes ownership of verification quick response managership no extra favors consensus the architecture is cleverly designed and consensus algorithms are at the core of this architecture the consensus is a decision making process for the group of nodes active on the network the consensus is responsible for the network being trustless nodes might not trust each other but they can trust the algorithms that run at the core of it that s why every decision on the network is a winning scenario for the blockchain true traceability with blockchain the supply chain becomes more transparent than ever as compared to traditional supply chain where it is hard to trace items that can lead to multiple problems including theft counterfeit and loss of goods understanding the program firstly we defined the structure of our block which contains block index timestamp of when it has been created proof of work along with previous hash i e the hash of previous block in real case seanario along with these there are other contents such as a body or transaction list etc python def createblock self proof prevhash defining the structure of our block block index len self chain 1 timestamp str datetime datetime now proof proof prevhash prevhash establishing a cryptographic link self chain append block return block the genesis block is the first block in any blockchain based protocol it is the basis on which additional blocks are added to form a chain of blocks hence the term blockchain this block is sometimes referred to block 0 every block in a blockchain stores a reference to the previous block in the case of genesis block there is no previous block for reference python def init self self chain creating the genesis block self createblock proof 1 prevhash 0 proof of work pow is the original consensus algorithm in a blockchain network the algorithm is used to confirm the transaction and creates a new block to the chain in this algorithm minors a group of people compete against each other to complete the transaction on the network the process of competing against each other is called mining as soon as miners successfully created a valid block he gets rewarded python def proofofwork self prevproof newproof 1 checkproof false defining crypto puzzle for the miners and iterating until able to mine it while checkproof is false op hashlib sha256 str newproof 2 prevproof 5 encode hexdigest if op 5 00000 checkproof true else newproof 1 return newproof chain validation is an important part of the blockchain it is used to validate weather tha blockchain is valid or not there are two checks performed first check for each block check if the previous hash field is equal to the hash of the previous block i e to verify the cryptographic link second check to check if the proof of work for each block is valid according to problem defined in proofofwork function i e to check if the correct block is mined or not python def ischainvalid self chain prevblock chain 0 initilized genesis block index 0 blockindex 1 initilized next 
block index 1 while blockindex len chain first check to verify the cryptographic link currentblock chain blockindex if currentblock prevhash self hash prevblock return false second check to check if the correct block is mined or not prevproof prevblock proof currentproof currentblock proof op hashlib sha256 str currentproof 2 prevproof 5 encode hexdigest if op 5 00000 return false prevblock currentblock blockindex 1 return true feel free to follow along the code provided along with the mentioned comments for a better understanding of the project if you run into any issues feel free to reach out to me contributing contributions are welcome please feel free to submit a pull request
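Because the snippets above lost their punctuation in extraction, here is a consolidated, runnable restatement. Two assumptions are flagged in the comments: the crypto puzzle's stripped operator is taken to be subtraction (newproof**2 - prevproof**5), and a `hash` helper (referenced as `self.hash` above but not shown) is sketched with json + sha256. The validity check also returns False on a failed proof-of-work, where the flattened snippet read "return true".

```python
import datetime
import hashlib
import json


class Blockchain:
    def __init__(self):
        self.chain = []
        # genesis block: no predecessor, so prevhash is fixed to '0'
        self.createblock(proof=1, prevhash='0')

    def createblock(self, proof, prevhash):
        block = {
            'index': len(self.chain) + 1,
            'timestamp': str(datetime.datetime.now()),
            'proof': proof,
            'prevhash': prevhash,        # establishes the cryptographic link
        }
        self.chain.append(block)
        return block

    @staticmethod
    def hash(block):
        # assumption: hash a canonical JSON encoding of the block
        return hashlib.sha256(
            json.dumps(block, sort_keys=True).encode()).hexdigest()

    def proofofwork(self, prevproof):
        newproof = 1
        while True:
            # assumption: the puzzle is sha256(newproof**2 - prevproof**5)
            op = hashlib.sha256(
                str(newproof**2 - prevproof**5).encode()).hexdigest()
            if op[:5] == '00000':
                return newproof
            newproof += 1

    def ischainvalid(self, chain):
        prevblock, blockindex = chain[0], 1
        while blockindex < len(chain):
            currentblock = chain[blockindex]
            # first check: verify the cryptographic link
            if currentblock['prevhash'] != self.hash(prevblock):
                return False
            # second check: verify the proof of work
            op = hashlib.sha256(
                str(currentblock['proof']**2
                    - prevblock['proof']**5).encode()).hexdigest()
            if op[:5] != '00000':
                return False   # fixed: the flattened snippet returned True here
            prevblock, blockindex = currentblock, blockindex + 1
        return True
```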
python flask bitcoin blockchain ethereum cryptocurrency nft nfts github python-script blockchain-technology blockchain-explorer bitcoin-cash cryptocurrency-exchanges data-visualization trending-repositories trending-topics
blockchain
full-stack-bootcamp-wiki
full stack bootcamp wiki https jiangren com au program course web code bootcamp or learn to code 1 web 18 web 18 nodejs in timestamp format 18 web web 17 web 17 17 web 17 java 17 java web 17 nodejs 17 nodejs web 16 web 16 16 web 14 web 2 html css 13 class 02 html md web 3 css 13 class 03 css md web 4 javascript 13 class 04 js md web 5 git 13 class 05 20git md web 9 agile 13 class 09 agile 14 md web 14 java web 14 java jr web fullstack14 java note md web 14 node js web 6 javascript es6 part1 s14 class 06 es6 14 md web 7 javascript es6 part2 node js s14 class 07 es6 nodejs 14 md web 8 node js part2 restful api s14 class 08 nodejs restfulapi 14 md web 9 agile s14 class 09 agile 14 md web 10 restfulapi express s14 class 10 restfulapi express 14 md web 11 middleware s14 class 11 middleware 14 md web 13 web 6 javascript es6 part1 13 class 06 js md web 7 javascript es6 part2 13 class 07 js md web 8 career cv linkedin 13 class 08 career md web 9 react 1 react 13 class 14 react1 md web 10 react 2 styled component 13 class 15 react2 md web 11 react 3 key value map props 13 class 16 react3 md web 12 react 4 devserver dom state 13 class 17 react4 md web 13 react 5 state lifting call api lifecycle 13 class 18 react5 md web 14 react 6 p2 fetch axios 13 class 27 react6 md web java 1 java 13 classj 09 md web node js 1 node js 13 classn 09 md web web 12 nodejs web 12 n md web 12 java jr web fullstack12 java note md code review code 20review md 14 git code review git code review 14 md https jiangren com au devops https jiangren com au program course devops
front_end
coursera-hong-kong-frontend-development-react
img src https i pinimg com originals f2 46 1e f2461ec7df96b03ee15a2f957725e50a jpg width 100 h1 front end web development with react h1 h2 course description h2 this course explores javascript based front end application development and in particular the react library currently ver 16 3 this course will use javascript es6 for developing react application you will also get an introduction to the use of reactstrap for bootstrap 4 based responsive ui design you will be introduced to various aspects of react components you will learn about react router and its use in developing single page applications you will also learn about designing controlled forms you will be introduced to the flux architecture and redux you will explore various aspects of redux and use it to develop react redux powered applications you will then learn to use fetch for client server communication and the use of rest api on the server side a quick tour through react animation support and testing rounds off the course you must have preferably completed the previous course in the specialization on bootstrap 4 or have a working knowledge of bootstrap 4 to be able to navigate this course also a good working knowledge of javascript especially es 5 is strongly recommended h2 course h2 h3 introduction to react h3 in this module we get a quick introduction to front end javascript frameworks and libraries followed by an introduction to react we will also learn about react components and jsx h3 react router and single page applications h3 in this week you will learn about various component types you will learn about react router and its use in designing single page applications you will also learn about single page applications and use react router to design single page applications h3 react forms flow architecture and introduction to redux h3 in this module you will be introduced to uncontrolled and controlled forms and briefly examine form validation in react applications you will get an overview of the flux architecture and introduced to redux as a way of realizing the flux architecture h3 more redux and client server communication h3 in this module you will explore redux further including redux action combining reducers and redux thunk client server communication using fetch and the rest api you will get a brief introduction to animation in react you will also learn about testing building and deploying react applications
coursera hong-kong javascript html css redux yarn react-router react
front_end
Automatic-Fish-Feeder
automatic fish feeder embedded system design assignment
os
TinyLora
tinylora easy hackable for low rank adaptations lora low rank adaptation of large language models https arxiv org abs 2106 09685 todo update readme add way to save only lora weights add merge layer add unmerge layer
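As a sketch of the idea behind the library — the LoRA paper linked above freezes the pretrained weight W and learns a low-rank update scaled by alpha/r, y = Wx + (alpha/r)·B·A·x — including the merge/unmerge steps from the TODO list. Class and method names here are illustrative, not TinyLora's actual API:

```python
import numpy as np


class LoRALinear:
    """Toy LoRA layer: y = W x + (alpha / r) * B (A x)."""

    def __init__(self, d_in, d_out, r=4, alpha=8, rng=None):
        rng = rng or np.random.default_rng(0)
        self.W = rng.normal(size=(d_out, d_in))      # stands in for the frozen pretrained weight
        self.A = rng.normal(size=(r, d_in)) * 0.01   # trainable, small random init
        self.B = np.zeros((d_out, r))                # trainable, zero init: update starts at 0
        self.scale = alpha / r

    def forward(self, x):
        return self.W @ x + self.scale * (self.B @ (self.A @ x))

    def merge(self):
        # fold the low-rank update into W for inference ("merge layer" in the TODO)
        self.W = self.W + self.scale * (self.B @ self.A)

    def unmerge(self):
        self.W = self.W - self.scale * (self.B @ self.A)


layer = LoRALinear(d_in=16, d_out=8)
x = np.ones(16)
y = layer.forward(x)
layer.merge()                          # after merging, a plain W @ x matches
assert np.allclose(layer.W @ x, y)
```

Zero-initializing B means training starts exactly at the pretrained model's behavior, which is the usual LoRA convention.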
ai
github-stats
github stats prettier ignore start all contributors badge start do not remove or modify this section all contributors https img shields io badge all contributors 11 orange svg style flat square contributors all contributors badge end prettier ignore end your github contributions smartly organized and visualized showcase meaningful metrics on your cv what s this before stating whether this tool is useful or not it might be let s disclose its primary goal improving our skills why our because this tool is open source and everyone is more than welcome to contribute to it you can grab an issue at any time or join the discord https discord gg bqwyea6we6 server to discuss the project and its future nothing is set in stone so feel free to share your ideas and suggestions learn more here s a video describing the project and its goals on youtube https www youtube com watch v zm92xpdrotk a href https www youtube com watch v zm92xpdrotk img src https i3 ytimg com vi zm92xpdrotk maxresdefault jpg style width 450px a technologies involved the app is currently based on next js https nextjs org with typescript and tailwind css actually with daisyui https daisyui com a tailwind css component library we manage some data specifically from the github apis https docs github com en graphql using the graphql https graphql org endpoint and react query https tanstack com query latest there s a login feature with nextauth https next auth js org using github as a provider coming soon the plan is to also add at some point some kind of user profile and settings stored where it s up to you to decide it could be on mongodb with an orm like prisma or something entirely different a first start could be using localstorage to validate the concept and then decide which database to use testing will also be involved in the process not sure if vitest or jest for component testing and either cypress or playwright for e2e testing how to contribute as mentioned in the beginning you can grab an issue write a comment first or join the discord https discord gg bqwyea6we6 server so we can have a chat about the project the goal of this project isn t the outcome itself but rather the process of building it together as a result we ll end up having a nice tool to showcase our github contributions and a project we can use as a reference when we need to implement something similar in other projects instructions on how to run the app locally can be found in contributing md contributing md thanks for reading and happy coding contributors all contributors list start do not remove or modify this section prettier ignore start markdownlint disable table tbody tr td align center valign top width 14 28 a href https leonardomontini dev img src https avatars githubusercontent com u 7253929 v 4 s 100 width 100px alt leonardo montini br sub b leonardo montini b sub a br a href projectmanagement balastrong title project management a a href https github com balastrong github stats commits author balastrong title code a td td align center valign top width 14 28 a href https bio link anantchoubey img src https avatars githubusercontent com u 91460022 v 4 s 100 width 100px alt anant choubey br sub b anant choubey b sub a br a href https github com balastrong github stats commits author theanantchoubey title documentation a a href https github com balastrong github stats issues q author 3atheanantchoubey title bug reports a a href https github com balastrong github stats commits author theanantchoubey title code a td td align center valign top width 14 28 a 
href http priyank live img src https avatars githubusercontent com u 88102392 v 4 s 100 width 100px alt priyankar pal br sub b priyankar pal b sub a br a href https github com balastrong github stats commits author priyankarpal title documentation a a href https github com balastrong github stats commits author priyankarpal title code a a href ideas priyankarpal title ideas planning feedback a td td align center valign top width 14 28 a href https github com piyushjha0409 img src https avatars githubusercontent com u 73685420 v 4 s 100 width 100px alt piyush jha br sub b piyush jha b sub a br a href https github com balastrong github stats commits author piyushjha0409 title code a td td align center valign top width 14 28 a href https www bassemdimassi tech img src https avatars githubusercontent com u 75867744 v 4 s 100 width 100px alt dimassi bassem br sub b dimassi bassem b sub a br a href design dimassibassem title design a a href https github com balastrong github stats commits author dimassibassem title code a td td align center valign top width 14 28 a href http jakubfronczyk com img src https avatars githubusercontent com u 71935020 v 4 s 100 width 100px alt jakub fronczyk br sub b jakub fronczyk b sub a br a href https github com balastrong github stats commits author jakubfronczyk title code a td td align center valign top width 14 28 a href https github com black arm img src https avatars githubusercontent com u 68558867 v 4 s 100 width 100px alt antonio basile br sub b antonio basile b sub a br a href https github com balastrong github stats commits author black arm title code a td tr tr td align center valign top width 14 28 a href https github com agrimaagrawal img src https avatars githubusercontent com u 84567933 v 4 s 100 width 100px alt agrima agrawal br sub b agrima agrawal b sub a br a href https github com balastrong github stats issues q author 3aagrimaagrawal title bug reports a td td align center valign top width 14 28 a href https www linkedin com in hicham essaidi 840b11288 img src https avatars githubusercontent com u 85809218 v 4 s 100 width 100px alt hicham essaidi br sub b hicham essaidi b sub a br a href https github com balastrong github stats commits author heshamsadi title code a td td align center valign top width 14 28 a href https www anupamac me img src https avatars githubusercontent com u 35479077 v 4 s 100 width 100px alt anupam br sub b anupam b sub a br a href https github com balastrong github stats commits author luckyklyist title code a td td align center valign top width 14 28 a href http thiti wcydtt co img src https avatars githubusercontent com u 55313215 v 4 s 100 width 100px alt thititongumpun br sub b thititongumpun b sub a br a href https github com balastrong github stats commits author thititongumpun title code a td tr tbody table markdownlint restore prettier ignore end all contributors list end prettier ignore start markdownlint disable markdownlint restore prettier ignore end all contributors list end
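Since the stack description above centers on the GitHub GraphQL API (consumed in the app via react-query), here is a minimal stand-alone sketch of the kind of query involved — written in Python with only the standard library rather than the app's TypeScript; the token placeholder and the "octocat" login are assumptions:

```python
import json
import urllib.request

TOKEN = "ghp_..."  # assumption: a personal access token with read scopes

QUERY = """
query($login: String!) {
  user(login: $login) {
    contributionsCollection { totalCommitContributions }
  }
}
"""

# POST the query to GitHub's GraphQL endpoint
req = urllib.request.Request(
    "https://api.github.com/graphql",
    data=json.dumps({"query": QUERY,
                     "variables": {"login": "octocat"}}).encode(),
    headers={"Authorization": f"bearer {TOKEN}",
             "Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp))
```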
nextjs hacktoberfest react typescript good-first-issue tailwindcss hacktoberfest2023
front_end
Legal-LLMs-GPTs
large language models llms and generative pre trained transformers gpts for legal lta https img shields io badge clp ecosystem blue style flat square https github com liquid legal institute common legal platform lta https img shields io badge clp code red style flat square https github com liquid legal institute common legal platform lta https img shields io badge clp community orange style flat square https github com liquid legal institute common legal platform lta https img shields io badge license cc by sa 4 0 lightgrey style flat square https creativecommons org licenses by sa 4 0 a list of selected and curated information dedicated to legal llms and gpts logo images unsplashmainimage gpts png photo by brett jordan https unsplash com brett jordan on unsplash https unsplash com photos 5l0r8zqpzhk please read the contribution guidelines contributing md before contributing please add a resource by raising a pull request https github com liquid legal institute legal text analytics pulls we also seek for discussion and proposal of new ideas including additional content sections as issues https github com liquid legal institute legal llms gpts issues ko fi https ko fi com img githubbutton sm svg https ko fi com w7w1ff5nn contents large language models and gpt for legal large language models and gpts for legal tutorials tutorials prompt engineering prompt engineering chatgpt detection chatgpt detection foundations of legal text analytics foundations of legal text analytics selected articles on technological aspects selected articles on technological aspects selected articles on legal and regulatory aspects selected articles on legal and regulatory aspects credits credits large language models and gpts for legal back to top contents stephen wolfram what is chatgpt doing and why does it work https writings stephenwolfram com 2023 02 what is chatgpt doing and why does it work transformer models an introduction and catalog https arxiv org pdf 2302 07730 pdf chatgpt at openai https chat openai com chat examples https beta openai com examples documentation https beta openai com docs introduction pricing https openai com api pricing fine tuning chatgpt https beta openai com docs api reference fine tunes create sketch summarizing chatgpt https media licdn com dms image c4d22aqfgvlirj4rsbq feedshare shrink 2048 1536 0 1674467662862 e 1677715200 v beta t g1gce8h2ia48210ywl tututgmjym1euhbpxomp08ws openai evaluation toolkit https github com openai evals llama open and efficient foundation language models https research facebook com publications llama open and efficient foundation language models large language models report by ki bundesverband https leam ai wp content uploads 2023 01 leam mbs kibv webversion mitanhang v2 2023 pdf large language models hugging face report https huggingface co blog large language models gpt takes the bar exam https papers ssrn com sol3 papers cfm abstract id 4314839 large language models as corporate lobbyists https github com johnnay llm lobbyist legal prompt engineering for multilingual legal judgement prediction https arxiv org abs 2212 02199 towards reasoning in large language models a survey https arxiv org abs 2212 10403 report on limitations of chatgpt https medium com asarav the limitations of chat gpt 8b73f5859bb4 tutorials back to top contents openai cookbook shares example code for accomplishing common tasks with the openai api https github com openai openai cookbook gpt in 60 lines of numpy https jaykmody com blog gpt from scratch beginner s guide to the gpt 
3 model https towardsdatascience com beginners guide to the gpt 3 model 2daad7fc335a using chatgpt in python https medium com geekculture using chatgpt in python eeaed9847e72 mastering the gpt 3 api in python https medium datadriveninvestor com mastering chatgpt in python a53814e834b0 chatgpt cli and python wrapper https github com mmabrouk chatgpt wrapper building gpt 3 applications beyond the prompt https medium com data science at microsoft building gpt 3 applications beyond the prompt 504140835560 large scale self supervised pre training across tasks languages and modalities https github com microsoft unilm prompt engineering back to top contents training language models to follow instructions with human feedback https arxiv org pdf 2203 02155 pdf prompt engineering guide https github com dair ai prompt engineering guide awesome chatgpt prompts https github com f awesome chatgpt prompts best practices for prompt engineering with openai api https help openai com en articles 6654000 best practices for prompt engineering with openai api prompt engineering 101 introduction and resources https www linkedin com pulse prompt engineering 101 introduction resources amatriain so you want to be a prompt engineer critical careers of the future https venturebeat com ai so you want to be a prompt engineer critical careers of the future free dolly introducing the world s first truly open instruction tuned llm https www databricks com blog 2023 04 12 dolly first open commercially viable instruction tuned llm more than 15k prompts open source dataset of instruction following records https github com databrickslabs dolly tree master data chatgpt detection back to top contents how close is chatgpt to human experts comparison corpus evaluation and detection https arxiv org abs 2301 07597 chatgpt comparison detection project https github com hello simpleai chatgpt comparison detection selected articles on technological aspects back to top contents on the explainability of natural language processing deep models https arxiv org abs 2210 06929 recent advances in natural language processing via large pre trained language models a survey https arxiv org abs 2111 01243 language models are few shot learners train gpt 3 an autoregressive language model with 175 billion parameters https dl acm org doi abs 10 5555 3495724 3495883 supplemental material https dl acm org action downloadsupplement doi 10 5555 2f3495724 3495883 file 3495724 3495883 supp pdf language models are changing ai we need to understand them https hai stanford edu news language models are changing ai we need understand them chatgpt is a blurry jpeg of the web https www newyorker com tech annals of technology chatgpt is a blurry jpeg of the web selected articles on legal and regulatory aspects back to top contents auditing large language models a three layered approach https papers ssrn com sol3 papers cfm abstract id 4361607 regulating chatgpt and other large generative ai models https arxiv org abs 2302 02337 law policy ai update china requires ai watermarks chatgpt won t make it to u s courtrooms https hai stanford edu news law policy ai update china requires ai watermarks chatgpt wont make it us courtrooms foundations of legal text analytics back to top contents see methods for legal text analytics https github com liquid legal institute legal text analytics methods see libraries for legal text analytics https github com liquid legal institute legal text analytics libraries see datasets and data for legal text analytics https github com liquid 
legal institute legal text analytics datasets and data see tutorials for legal text analytics https github com liquid legal institute legal text analytics tutorials see research groups labs and communities for legal text analytics https github com liquid legal institute legal text analytics research groups labs and communities credits back to top contents see contributors and committers and many more this work is licensed under a creative commons attribution sharealike 4 0 international license cc by sa cc by sa http creativecommons org licenses by sa 4 0 cc by sa shield https img shields io badge license cc 20by sa 204 0 lightgrey svg
ai
libmemory
embedded artistry libmemory embedded artistry s libmemory is a memory management library for embedded systems if you have a bare metal system and want to use malloc this library is for you libmemory provides various implementations of the malloc and free functions the primary malloc implementation is a free list allocator which can be used on a bare metal system wrappers for some rtoses are also provided and can be added if not already you will also find other useful memory functions such as aligned malloc this library is meant to be coupled with a libc implementation such as the embedded artistry libc https github com embeddedartistry libc malloc and free are not redefined in these headers so you can safely use this library with your platform s existing libc table of contents 1 about the project about the project 2 project status project status 3 getting started getting started 1 requirements requirements 1 git lfs git lfs 1 meson build system meson build system 2 getting the source getting the source 3 building building 4 installation installation 4 configuration options configuration options 5 library variants library variants 1 native targets native targets 5 usage usage 1 thread safety thread safety 1 aligned malloc aligned malloc 5 using a custom libc using a custom libc 1 testing testing 5 documentation documentation 6 need help need help 7 contributing contributing 8 further reading further reading 9 authors authors 10 license license 11 acknowledgments acknowledgments about the project this library is meant to allow developers of embedded systems to utilize the malloc and free functions if their platform does not currently support it the baseline malloc implementation can be used without an rtos or any other supporting software only a block of memory needs to be assigned many rtoses provide dynamic memory allocation functionality but these functions are not typically called malloc and free wrappers can be provided for these rtoses to improve code portability a block of memory needs to be initially assigned using the malloc addblock function this tells the malloc implementation what memory address and size to use for the heap allocate 4mb to the heap starting at memory address 0xdeadbeef malloc addblock 0xdeadbeef 4 1024 1024 one memory has been allocated to the heap you can use malloc and free as expected project status basic memory allocation is implemented using the free list strategy x86 x86 64 arm and arm64 compilation is supported example rtos implementations are provided for freertos and threadx an implementation exists that can be used with the embedded virtual machine framework tests are currently in place for malloc free aligned malloc and aligned free no test for overlapping memory blocks currently exists getting started requirements this project uses embedded artistry s standard meson build system https embeddedartistry com fieldatlas embedded artistrys standardized meson build system and dependencies are described in detail on our website https embeddedartistry com fieldatlas embedded artistrys standardized meson build system at a minimum you will need git lfs https git lfs github com which is used to store binary files in this repository meson meson build system is the build system some kind of compiler for your target system this repository has been tested with gcc 7 gcc 8 gcc 9 arm none eabi gcc apple clang mainline clang git lfs this project stores some files using git lfs https git lfs github com to install git lfs on linux sudo apt install git lfs to install git 
lfs on os x brew install git lfs additional installation instructions can be found on the git lfs website https git lfs github com meson build system the meson https mesonbuild com build system depends on python3 and ninja build to install on linux sudo apt get install python3 python3 pip ninja build to install on osx brew install python3 ninja meson can be installed through pip3 pip3 install meson if you want to install meson globally on linux use sudo h pip3 install meson getting the source this project uses git lfs https git lfs github com so please install it before cloning if you cloned prior to installing git lfs simply run git lfs pull after installation this project is hosted on github https github com embeddedartistry libmemory you can clone the project directly using this command git clone recursive git github com embeddedartistry libmemory git if you don t clone recursively be sure to run the following command in the repository or your build will fail git submodule update init building if make is installed the library can be built by issuing the following command make this will build all targets for your current architecture you can clean builds using make clean you can eliminate the generated buildresults folder using make distclean you can also use meson directly for compiling create a build output folder meson buildresults and build all targets by running ninja c buildresults cross compilation is handled using meson cross files example files are included in the build cross build cross folder you can write your own cross files for your specific processor by defining the toolchain compilation flags and linker flags these settings will be used to compile libc or open an issue and we can help you cross compilation must be configured using the meson command when creating the build output folder for example meson buildresults cross file build cross gcc arm cortex m4 txt following that you can run make at the project root or ninja to build the project tests will not be cross compiled they will only be built for the native platform full instructions for building the project using alternate toolchains and running supporting tooling are documented in embedded artistry s standardized meson build system https embeddedartistry com fieldatlas embedded artistrys standardized meson build system on our website installation if you don t use meson for your project the best method to use this project is to build it separately and copy the headers and desired library contents into your source tree copy the include directory contents into your source tree library artifacts are stored in the buildresults src folder copy the desired library to your project and add the library to your link step example linker flags lpath to libmemory freelist a lmemory freelist if you re using meson you can use libmemory as a subproject place it into your subproject directory of choice and add a subproject statement libmemory subproject libmemory you will need to promote the subproject dependencies to your project in order to use them in your build targets libmemory subproject libmemory libmemory native dep libmemory get variable libmemory native dep libmemory hosted dep libmemory get variable libmemory hosted dep libmemory freelist dep libmemory get variable libmemory freelist dep libmemory threadx dep libmemory get variable libmemory threadx dep libmemory freertos dep libmemory get variable libmemory freertos dep libmemory header include libmemory get variable libmemory system includes libmemory framework rtos dep 
libmemory get variable libmemory framework rtos dep you can use the dependency for your target library configuration in your executable declarations s or other dependencies for example fwdemo sim platform dep declare dependency include directories fwdemo sim platform inc dependencies fwdemo simulator hw platform dep posix os dep libmemory native dep libmemory dependency added here libc native dep libcxxabi native dep libcxx full native dep logging subsystem dep sources files platform cpp this will add the proper include paths library targets and build dependency rules to your application configuration options the following meson project options can be set for this library when creating the build results directory with meson or by using meson configure enable pedantic enable pedantic warnings enable pedantic error turn on pedantic warnings and errors use libc subproject when true use the subproject defined in the libc subproject option an alternate approach is to override c stdlib in your cross files libc subproject this array is used in combination with use libc subproject the first entry is the subproject name the second is the cross compilation dependency to use the third value is optional if used it is a native dependency to use with native library targets options can be specified using d and the option name meson buildresults denable pedantic true the same style works with meson configure cd buildresults meson configure denable pedantic true library variants this build provides a number of library variations many of these variants support different allocation strategies our recommended implementation for embedded systems without an rtos is libmemory freelist libmemory assert calls to malloc free will assert at runtime this implementation is portable libmemory freelist allocates memory from a freelist works with one or more blocks of memory memory must be initialized with malloc addblock the implementation can be made threadsafe by supplying implementations for malloc lock and malloc unlock in your application this implementation is portable libmemory freertos provides a sample freertos implementation that wraps the heap 5 freertos strategy memory must be initialized with malloc addblock note that headers will need to be updated for your particular project prior to compilation dependencies rtos freertos or simply include the source code within your own project libmemory threadx provides a sample threadx implementation that wraps the threadx memory allocators memory must be initialized with malloc addblock note that headers will need to be updated for your particular project prior to compilation dependencies rtos threadx or simply include the source code within your own project we also have variants that provide supplementary functions e g aligned malloc without providing an implementation for malloc free this is primarily used for providing source code compatibility with systems that do provide a suitable malloc implementation e g running a test program on your build machine libmemory picks up extra libmemory functions but does not implement malloc free will use an alternative libc implementation if specified malloc addblock and malloc init will assert at runtime libmemory hosted picks up extra libmemory functions but does not implement malloc free will use the compiler s libc implementation unless other specified flags override that setting e g within your own build rules malloc addblock and malloc init will assert at runtime native targets in addition every library variant has a 
corresponding native target when cross compilation builds are enabled the target with the native suffix will be compiled for your build machine while the target without the suffix will be cross compiled this enables your application to use the appropriate variant for both cross compilation targets and native true targets e g a unit test program or simulator application in the same build usage a block of memory needs to be initially assigned using the malloc addblock function void malloc addblock void addr size t size this tells the malloc implementation what memory address and size to use for the heap allocate 4mb to the heap starting at memory address 0xdeadbeef malloc addblock 0xdeadbeef 4 1024 1024 malloc and free will fail return null if no memory has been allocated once memory has been allocated to the heap you can use malloc and free as expected multiple blocks of memory can be added using malloc addblock the memory blocks do not have to be contiguous thread safety rtos based implementations are thread safe depending on the rtos and heap configuration the freelist implementation is not thread safe by default if you are using this version on a threaded system you need to define two functions within your program void malloc lock void malloc unlock these should lock and unlock a mutex that is designed to protect malloc mutex t malloc mutex void malloc lock mutex lock malloc mutex void malloc unlock mutex unlock malloc mutex these functions are defined as weakly linked in the library so the default no op condition will not be used if your functions is found by the linker if you re doubtful that your calls are being included check the disassembly for the functions your version will not be no ops aligned malloc you can allocate aligned memory using aligned malloc void aligned malloc size t align size t size alignment must be a power of two aligned memory can only be free d using aligned free void aligned free void ptr for more information see aligned memory h and the documentation https embeddedartistry github io libmemory d6 dfa aligned malloc 8h html using a custom libc this project is designed to be used along with a libc implementation if you are using this library you may not be using the standard libc that ships with you compiler this library needs to know about the particular libc implementation during its build in case there are important differences in definitions there are two ways to tell this library about a libc 1 override c stdlib in a cross file https mesonbuild com cross compilation html using a custom standard library which will be automatically used when building this library 2 set use libc subproject to true 1 by default this will use the embedded artistry libc https github com embeddedartistry libc 2 you can specify another meson subproject by configuring libc subproject this is an array the first value is the subproject name the second the libc dependency variable and the third is an optional native dependency that will be used with native library variants note external libc dependencies are only used for building the library they are not forwarded through dependencies you are expected to handle that with the rest of your program testing the tests for this library are written with cmocka which is included as a subproject and does not need to be installed on your system you can run the tests by issuing the following command make test by default test results are generated for use by the ci server and are formatted in junit xml the test results xml files can be found in 
buildresults test documentation documentation for the latest release can always be found here https embeddedartistry github io libmemory index html documentation can be built locally by running the following command make docs documentation can be found in buildresults docs and the root page is index html need help if you need further assistance or have any questions please file a github issue https github com embeddedartistry libmemory issues new or send us an email using the embedded artistry contact form http embeddedartistry com contact you can also reach out on twitter mbeddedartistry https twitter com mbeddedartistry contributing if you are interested in contributing to this project please read our contributing guidelines docs contributing md further reading libmemory documentation https embeddedartistry github io libmemory index html contributing guide docs contributing md authors phillip johnston https github com phillipjohnston original library author embedded artistry https github com embeddedartistry license copyright c 2017 embedded artistry llc this project is licensed under the mit license see license license file for details acknowledgments back to top table of contents
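To make the freelist strategy concrete, here is a toy Python model of first-fit allocation with block splitting and coalescing. This is a conceptual model only, not the C implementation — the library's real entry points are malloc_addblock, malloc, free, and the malloc_lock/malloc_unlock hooks described above:

```python
# Toy first-fit free-list allocator; addresses are plain integers.
class FreeListAllocator:
    def __init__(self):
        self.free = []   # list of (addr, size), kept sorted by address
        self.used = {}   # addr -> size

    def add_block(self, addr, size):
        # analogous to malloc_addblock(): donate a memory region to the heap
        self.free.append((addr, size))
        self.free.sort()

    def malloc(self, size):
        for i, (addr, block) in enumerate(self.free):
            if block >= size:                          # first fit
                rest = block - size
                if rest:
                    self.free[i] = (addr + size, rest)  # split the block
                else:
                    self.free.pop(i)
                self.used[addr] = size
                return addr
        return None                                    # out of memory -> NULL

    def free_(self, addr):                             # trailing _ avoids the builtin
        size = self.used.pop(addr)
        self.free.append((addr, size))
        self.free.sort()
        merged = []                                    # coalesce adjacent blocks
        for a, s in self.free:
            if merged and merged[-1][0] + merged[-1][1] == a:
                merged[-1] = (merged[-1][0], merged[-1][1] + s)
            else:
                merged.append((a, s))
        self.free = merged


heap = FreeListAllocator()
heap.add_block(0x1000, 4096)   # like malloc_addblock(addr, size)
p = heap.malloc(128)
heap.free_(p)
```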
libc c malloc malloc-free memory-allocation bringup embedded-systems portability rtos heap threadx freelist
os
blockchain-wallet
golang java nodejs electron vue nginx ios android https github com guoshijiang blockchain wallet blob master chapter readme md docker https github com guoshijiang docker virtual technology https blog csdn net jiang xinxing https github com guoshijiang blockchain wallet blob master img weixin png v1 0 0 eth 0xe3b4ecd2ec88026f84cf17fef8babfd9184c94f0 erc20 0xe3b4ecd2ec88026f84cf17fef8babfd9184c94f0 layer2 0xe3b4ecd2ec88026f84cf17fef8babfd9184c94f0 savourlabs python go rust node vue react guoshijiang2012 163 com lgzaxe discord https discord gg ww86tqew telegram shijiangguo twitter seek web3
blockchain wallet transaction sign node java
blockchain
chainsqld
english docs readme cn md chainsql docs images logo png overview chainsql is a blockchain supporting database operations the db operating log will be stored in the chainsql chain then you can acquire the visual data in traditional db details of chainsql detailed online document can be found here http docs chainsql net by setting the sync tables sync db and auto sync you can restore the real db tables you wanted from block chain you can set the db type arbitrarily includeing mysql sqlite oracle and so on we defined three new transaction types for the db operation named sqlstatement tablelistset and sqltransaction sqlstatement is used to insert update or delete records tablelistset is used to create a table and other operations to the table self sqltransaction is used to operator a set of db sql clause in one blockchain s transaction you can operate the db or send db operations to the block chain by the following four ways 1 rpc api supplied by the rpc modules 2 web sockets api developed using javascript and java refer to api in javascript http docs chainsql net interface nodeapi html and api in java http docs chainsql net interface javaapi html 3 commandline access to the node directly 4 by kingshrad using db s primitive sql clause refer to access by sql http www chainsql net api mysql html the table module send data request to other nodes and sort the tx datas from other nodes then give the right transaction data to the sql module this module also seeks every ledger in local block chain to get the compatible table data to send back to the required nodes the sql module analysis transaction data to get the real sql then operate the db ussing these real sql sentences further more storage modules make you check the db before sending tx data to the block chain this makes it possible that we can operate db timely if you want to get more infomation about this production please access the site www chainsql net http www chainsql net version on updating our version or releasing new functions the releasenotes releasenotes md will be updated for the detail description setup refer to the setup doc manual deploy md for details compile refer to the builds builds directory for details we introduce the detail compiling steps in several os systems license chainsql is under the gnu general public license v3 0 see the license license directory for details contact us email chainsql peersafe info wechat scan the qr below to follow peersafe and then send chainsql you will receive the qr image for chainsql community peersafe wechat qr code peersafe docs images peersafe jpg
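As a minimal illustration of access method 1 (the RPC interface): chainsqld derives from rippled, so requests use the rippled-style JSON-RPC envelope. The sketch below calls "server_info", a standard rippled method; the 5005 admin port is an assumption, and the methods for the table transactions (TableListSet, SQLStatement, SQLTransaction) should be taken from the chainsql documentation linked above rather than guessed:

```python
import json
import urllib.request


def rpc(url, method, **params):
    # rippled-style envelope: {"method": ..., "params": [{...}]}
    body = json.dumps({"method": method, "params": [params]}).encode()
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


# assumption: a local node with the default admin RPC port
print(rpc("http://127.0.0.1:5005", "server_info"))
```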
blockchain
LLM-eval-survey
div align center img src imgs logo llmeval png alt llm evaluation width 500 br a collection of papers and resources related to evaluations on large language models div br p align center yupeng chang sup 1 sup nbsp nbsp xu wang sup 1 sup nbsp nbsp jindong wang sup 2 sup nbsp nbsp yuan wu sup 1 sup nbsp nbsp kaijie zhu sup 3 sup nbsp nbsp hao chen sup 4 sup nbsp nbsp linyi yang sup 5 sup nbsp nbsp xiaoyuan yi sup 2 sup nbsp nbsp cunxiang wang sup 5 sup nbsp nbsp yidong wang sup 6 sup nbsp nbsp wei ye sup 6 sup nbsp nbsp yue zhang sup 5 sup nbsp nbsp yi chang sup 1 sup nbsp nbsp philip s yu sup 7 sup nbsp nbsp qiang yang sup 8 sup nbsp nbsp xing xie sup 2 sup p p align center sup 1 sup jilin university sup 2 sup microsoft research sup 3 sup institute of automation cas sup 4 sup carnegie mellon university sup 5 sup westlake university sup 6 sup peking university sup 7 sup university of illinois sup 8 sup hong kong university of science and technology br co first authors co corresponding authors p papers and resources for llms evaluation the papers are organized according to our survey a survey on evaluation of large language models https arxiv org abs 2307 03109 note as we cannot update the arxiv paper in real time please refer to this repo for the latest updates and the paper may be updated later we also welcome any pull request or issues to help us make this survey perfect your contributions will be acknowledged in a href acknowledgements acknowledgements a related projects prompt benchmark for large language models promptbench robustness evaluation of llms https github com microsoft promptbench evlauation of large language models llm eval https llm eval github io imgs framework new png details summary table of contents summary ol li a href news and updates news and updates a li li a href what to evaluate what to evaluate a ul li a href natural language processing natural language processing a li li a href robustness ethics biases and trustworthiness robustness ethics biases and trustworthiness a li li a href social science social science a li li a href natural science and engineering natural science and engineering a li li a href medical applications medical applications a li li a href agent applications agent applications a li li a href other applications other applications a li ul li li a href where to evaluate where to evaluate a li li a href contributing contributing a li li a href citation citation a li li a href acknowledgements acknowledgments a li ol details news and updates 12 07 2023 the second version of the paper is released on arxiv along with chinese blog https zhuanlan zhihu com p 642689101 07 07 2023 the first version of the paper is released on arxiv a survey on evaluation of large language models https arxiv org abs 2307 03109 what to evaluate natural language processing natural language understanding sentiment analysis 1 a multitask multilingual multimodal evaluation of chatgpt on reasoning hallucination and interactivity yejin bang et al arxiv 2023 paper https arxiv org abs 2302 04023 2 a systematic study and comprehensive evaluation of chatgpt on benchmark datasets laskar et al acl 2023 findings paper https arxiv org abs 2305 18486 3 holistic evaluation of language models percy liang et al arxiv 2022 paper https arxiv org abs 2211 09110 4 can chatgpt forecast stock price movements return predictability and large language models alejandro lopez lira et al ssrn 2023 paper https papers ssrn com sol3 papers cfm abstract id 4412788 5 is chatgpt a general purpose natural 
language processing task solver chengwei qin et al arxiv 2023 paper https arxiv org abs 2302 06476 6 is chatgpt a good sentiment analyzer a preliminary study zengzhi wang et al arxiv 2023 paper https arxiv org abs 2304 04339 7 sentiment analysis in the era of large language models a reality check wenxuan zhang et al arxiv 2023 paper https arxiv org abs 2305 15005 text classification 1 holistic evaluation of language models percy liang et al arxiv 2022 paper https arxiv org abs 2211 09110 2 leveraging large language models for topic classification in the domain of public affairs alejandro pe a et al arxiv 2023 paper https arxiv org abs 2306 02864 3 large language models can rate news outlet credibility kai cheng yang et al arxiv 2023 paper https arxiv org abs 2304 00228 natural language inference 1 a systematic study and comprehensive evaluation of chatgpt on benchmark datasets laskar et al acl 2023 findings paper https arxiv org abs 2305 18486 2 can large language models infer and disagree like humans noah lee et al arxiv 2023 paper https arxiv org abs 2305 13788 3 is chatgpt a general purpose natural language processing task solver chengwei qin et al arxiv 2023 paper https arxiv org abs 2302 06476 others 1 do llms understand social knowledge evaluating the sociability of large language models with socket benchmark minje choi et al arxiv 2023 paper https arxiv org abs 2305 14938 2 the two word test a semantic benchmark for large language models nicholas riccardi et al arxiv 2023 paper https arxiv org abs 2306 04610 3 eveval a comprehensive evaluation of event semantics for large language models zhengwei tao et al arxiv 2023 paper https arxiv org abs 2305 15268 reasoning 1 a multitask multilingual multimodal evaluation of chatgpt on reasoning hallucination and interactivity yejin bang et al arxiv 2023 paper https arxiv org abs 2302 04023 2 chatgpt is a knowledgeable but inexperienced solver an investigation of commonsense problem in large language models ning bian et al arxiv 2023 paper https arxiv org abs 2303 16421 3 chain of thought hub a continuous effort to measure large language models reasoning performance yao fu et al arxiv 2023 paper https arxiv org abs 2305 17306 4 a systematic study and comprehensive evaluation of chatgpt on benchmark datasets laskar et al acl 2023 findings paper https arxiv org abs 2305 18486 5 large language models are not abstract reasoners ga l gendron et al arxiv 2023 paper https arxiv org abs 2305 19555 6 can large language models reason about medical questions valentin li vin et al arxiv 2023 paper https arxiv org abs 2207 08143 7 evaluating the logical reasoning ability of chatgpt and gpt 4 hanmeng liu et al arxiv 2023 paper https arxiv org abs 2304 03439 8 mathematical capabilities of chatgpt simon frieder et al arxiv 2023 paper https arxiv org abs 2301 13867 9 human like problem solving abilities in large language models using chatgpt graziella orr et al front artif intell 2023 paper https www frontiersin org articles 10 3389 frai 2023 1199350 full 10 is chatgpt a general purpose natural language processing task solver chengwei qin et al arxiv 2023 paper https arxiv org abs 2302 06476 11 testing the general deductive reasoning capacity of large language models using ood examples abulhair saparov et al arxiv 2023 paper https arxiv org abs 2305 15269 12 mindgames targeting theory of mind in large language models with dynamic epistemic modal logic damien sileo et al arxiv 2023 paper https arxiv org abs 2305 03353 13 reasoning or reciting exploring the 
capabilities and limitations of language models through counterfactual tasks zhaofeng wu arxiv 2023 paper https arxiv org abs 2307 02477 utm source tldrai 14 are large language models really good logical reasoners a comprehensive evaluation from deductive inductive and abductive views fangzhi xu et al arxiv 2023 paper https arxiv org abs 2306 09841 15 efficiently measuring the cognitive ability of llms an adaptive testing perspective yan zhuang et al arxiv 2023 paper https arxiv org abs 2306 10512 16 autoformalization with large language models yuhuai wu et al neurips 2022 paper https proceedings neurips cc paper files paper 2022 hash d0c6bc641a56bebee9d985b937307367 abstract conference html 17 evaluating and improving tool augmented computation intensive math reasoning beichen zhang et al arxiv 2023 paper https arxiv org abs 2306 02408 18 structgpt a general framework for large language model to reason over structured data jinhao jiang et al arxiv 2023 paper https arxiv org abs 2305 09645 19 unifying large language models and knowledge graphs a roadmap shirui pan et al arxiv 2023 paper https arxiv org abs 2306 08302 natural language generation summarization 1 a multitask multilingual multimodal evaluation of chatgpt on reasoning hallucination and interactivity yejin bang et al arxiv 2023 paper https arxiv org abs 2302 04023 2 a systematic study and comprehensive evaluation of chatgpt on benchmark datasets laskar et al acl 2023 findings paper https arxiv org abs 2305 18486 3 holistic evaluation of language models percy liang et al arxiv 2022 paper https arxiv org abs 2211 09110 4 chatgpt vs human authored text insights into controllable text summarization and sentence style transfer dongqi pu et al arxiv 2023 paper https arxiv org abs 2306 07799 5 is chatgpt a general purpose natural language processing task solver chengwei qin et al arxiv 2023 paper https arxiv org abs 2302 06476 dialogue 1 a multitask multilingual multimodal evaluation of chatgpt on reasoning hallucination and interactivity yejin bang et al arxiv 2023 paper https arxiv org abs 2302 04023 2 llm eval unified multi dimensional automatic evaluation for open domain conversations with large language models yen ting lin et al arxiv 2023 paper https arxiv org abs 2305 13711 3 is chatgpt a general purpose natural language processing task solver chengwei qin et al arxiv 2023 paper https arxiv org abs 2302 06476 4 lmsys chat 1m a large scale real world llm conversation dataset lianmin zheng et al arxiv 2023 paper https arxiv org abs 2309 11998 translation 1 a multitask multilingual multimodal evaluation of chatgpt on reasoning hallucination and interactivity yejin bang et al arxiv 2023 paper https arxiv org abs 2302 04023 2 a systematic study and comprehensive evaluation of chatgpt on benchmark datasets laskar et al acl 2023 findings paper https arxiv org abs 2305 18486 3 translating radiology reports into plain language using chatgpt and gpt 4 with prompt learning promising results limitations and potential qing lyu et al arxiv 2023 paper https arxiv org abs 2303 09038 4 document level machine translation with large language models longyue wang et al arxiv 2023 paper https arxiv org abs 2304 02210 5 case study of improving english arabic translation using the transformer model donia gamal et al ijicis 2023 paper https ijicis journals ekb eg article 305271 html 6 taqyim evaluating arabic nlp tasks using chatgpt models zaid alyafeai et al arxiv 2023 paper https arxiv org abs 2306 16322 question answering 1 benchmarking foundation 
models with language model as an examiner yushi bai et al arxiv 2023 paper https arxiv org abs 2306 04181 2 a multitask multilingual multimodal evaluation of chatgpt on reasoning hallucination and interactivity yejin bang et al arxiv 2023 paper https arxiv org abs 2302 04023 3 chatgpt is a knowledgeable but inexperienced solver an investigation of commonsense problem in large language models ning bian et al arxiv 2023 paper https arxiv org abs 2303 16421 4 a systematic study and comprehensive evaluation of chatgpt on benchmark datasets laskar et al acl 2023 findings paper https arxiv org abs 2305 18486 5 holistic evaluation of language models percy liang et al arxiv 2022 paper https arxiv org abs 2211 09110 6 is chatgpt a general purpose natural language processing task solver chengwei qin et al arxiv 2023 paper https arxiv org abs 2302 06476 others 1 exploring the use of large language models for reference free text quality evaluation a preliminary empirical study yi chen et al arxiv 2023 paper https arxiv org abs 2304 00723 2 instructeval towards holistic evaluation of instruction tuned large language models yew ken chia et al arxiv 2023 paper https arxiv org abs 2306 04757 3 chatgpt vs human authored text insights into controllable text summarization and sentence style transfer dongqi pu et al arxiv 2023 paper https arxiv org abs 2306 07799 multilingual tasks 1 benchmarking arabic ai with large language models ahmed abdelali et al arxiv 2023 paper https arxiv org abs 2305 14982 2 mega multilingual evaluation of generative ai kabir ahuja et al arxiv 2023 paper https arxiv org abs 2303 12528 3 a multitask multilingual multimodal evaluation of chatgpt on reasoning hallucination and interactivity yejin bang et al arxiv 2023 paper https arxiv org abs 2302 04023 4 chatgpt beyond english towards a comprehensive evaluation of large language models in multilingual learning viet dac lai et al arxiv 2023 paper https arxiv org abs 2304 05613 5 a systematic study and comprehensive evaluation of chatgpt on benchmark datasets laskar et al acl 2023 findings paper https arxiv org abs 2305 18486 6 m3exam a multilingual multimodal multilevel benchmark for examining large language models wenxuan zhang et al arxiv 2023 paper https arxiv org abs 2306 05179 7 measuring massive multitask chinese understanding hui zeng et al arxiv 2023 paper https arxiv org abs 2304 12986 6 cmmlu measuring massive multitask language understanding in chinese haonan li et al arxiv 2023 paper https arxiv org abs 2306 09212 factuality 1 trueteacher learning factual consistency evaluation with large language models zorik gekhman et al arxiv 2023 paper https arxiv org abs 2305 11171 2 true re evaluating factual consistency evaluation or honovich et al arxiv 2022 paper https arxiv org abs 2204 04991 3 selfcheckgpt zero resource black box hallucination detection for generative large language models potsawee manakul et al arxiv 2023 paper https arxiv org abs 2303 08896 4 factscore fine grained atomic evaluation of factual precision in long form text generation sewon min et al arxiv 2023 paper https arxiv org abs 2305 14251 5 measuring and modifying factual knowledge in large language models pouya pezeshkpour arxiv 2023 paper https arxiv org abs 2306 06264 6 evaluating open qa evaluation cunxiang wang arxiv 2023 paper https arxiv org abs 2305 12421 robustness ethics biases and trustworthiness robustness 1 a survey on out of distribution evaluation of neural nlp models xinzhe li et al arxiv 2023 paper https arxiv org abs 2306 15261 2 
mitigating hallucination in large multi modal models via robust instruction tuning fuxiao liu et al arxiv 2023 paper https arxiv org abs 2306 14565 3 generalizing to unseen domains a survey on domain generalization jindong wang et al tkde 2022 paper https arxiv org abs 2103 03097 4 on the robustness of chatgpt an adversarial and out of distribution perspective jindong wang et al arxiv 2023 paper https arxiv org abs 2302 12095 5 glue x evaluating natural language understanding models from an out of distribution generalization perspective linyi yang et al arxiv 2022 paper https arxiv org abs 2211 08073 6 on evaluating adversarial robustness of large vision language models yunqing zhao et al arxiv 2023 paper https arxiv org abs 2305 16934 7 promptbench towards evaluating the robustness of large language models on adversarial prompts kaijie zhu et al arxiv 2023 paper https arxiv org abs 2306 04528 8 on robustness of prompt based semantic parsing with large pre trained language model an empirical study on codex terry yue zhuo et al arxiv 2023 paper https arxiv org abs 2301 12868 ethics and bias 1 assessing cross cultural alignment between chatgpt and human societies an empirical study yong cao et al c3nlp eacl 2023 paper https arxiv org abs 2303 17466 2 toxicity in chatgpt analyzing persona assigned language models ameet deshpande et al arxiv 2023 paper https arxiv org abs 2304 05335 3 bold dataset and metrics for measuring biases in open ended language generation jwala dhamala et al facct 2021 paper https dl acm org doi 10 1145 3442188 3445924 4 should chatgpt be biased challenges and risks of bias in large language models emilio ferrara arxiv 2023 paper https arxiv org abs 2304 03738 5 realtoxicityprompts evaluating neural toxic degeneration in language models samuel gehman et al emnlp 2020 paper https aclanthology org 2020 findings emnlp 301 6 the political ideology of conversational ai converging evidence on chatgpt s pro environmental left libertarian orientation jochen hartmann et al arxiv 2023 paper https arxiv org abs 2301 01768 7 aligning ai with shared human values dan hendrycks et al arxiv 2023 paper https arxiv org abs 2008 02275 8 a systematic study and comprehensive evaluation of chatgpt on benchmark datasets laskar et al acl 2023 findings paper https arxiv org abs 2305 18486 9 bbq a hand built bias benchmark for question answering alicia parrish et al acl 2022 paper https aclanthology org 2022 findings acl 165 10 the self perception and political biases of chatgpt j r me rutinowski et al arxiv 2023 paper https arxiv org abs 2304 07333 11 societal biases in language generation progress and challenges emily sheng et al acl ijcnlp 2021 paper https aclanthology org 2021 acl long 330 12 moral mimicry large language models produce moral rationalizations tailored to political identity gabriel simmons et al arxiv 2022 paper https arxiv org abs 2209 12106 13 large language models are not fair evaluators peiyi wang et al arxiv 2023 paper https arxiv org abs 2305 17926 14 exploring ai ethics of chatgpt a diagnostic analysis terry yue zhuo et al arxiv 2023 paper https arxiv org abs 2301 12867 15 chbias bias evaluation and mitigation of chinese conversational language models jiaxu zhao et al acl 2023 paper https aclanthology org 2023 acl long 757 pdf trustworthiness 1 human like intuitive behavior and reasoning biases emerged in language models and disappeared in gpt 4 thilo hagendorff et al arxiv 2023 paper https arxiv org abs 2306 07622 2 decodingtrust a comprehensive assessment of 
trustworthiness in gpt models boxin wang et al arxiv 2023 paper https arxiv org abs 2306 11698 3 mitigating hallucination in large multi modal models via robust instruction tuning fuxiao liu et al arxiv 2023 paper https arxiv org abs 2306 14565 4 evaluating object hallucination in large vision language models yifan li et al arxiv 2023 paper https arxiv org abs 2305 10355 5 a survey of hallucination in large foundation models vipula rawte et al arxiv 2023 paper https arxiv org abs 2309 05922 6 siren s song in the ai ocean a survey on hallucination in large language models yue zhang et al arxiv 2023 paper https arxiv org abs 2309 01219 social science 1 how ready are pre trained abstractive models and llms for legal case judgement summarization aniket deroy et al arxiv 2023 paper https arxiv org abs 2306 01248 2 baby steps in evaluating the capacities of large language models michael c frank nature reviews psychology 2023 paper https www nature com articles s44159 023 00211 x 3 large language models as tax attorneys a case study in legal capabilities emergence john j nay et al arxiv 2023 paper https arxiv org abs 2306 07075 4 large language models can be used to estimate the ideologies of politicians in a zero shot learning setting patrick y wu et al arxiv 2023 paper https arxiv org abs 2303 12057 5 can large language models transform computational social science caleb ziems et al arxiv 2023 paper https arxiv org abs 2305 03514 natural science and engineering mathematics 1 have llms advanced enough a challenging problem solving benchmark for large language models daman arora et al arxiv 2023 paper https arxiv org abs 2305 15074 2 sparks of artificial general intelligence early experiments with gpt 4 s bastien bubeck et al arxiv 2023 paper https arxiv org abs 2303 12712 3 evaluating language models for mathematics through interactions katherine m collins et al arxiv 2023 paper https arxiv org abs 2306 01694 4 investigating the effectiveness of chatgpt in mathematical reasoning and problem solving evidence from the vietnamese national high school graduation examination xuan quy dao et al arxiv 2023 paper https arxiv org abs 2306 06331 5 a systematic study and comprehensive evaluation of chatgpt on benchmark datasets laskar et al acl 2023 findings paper https arxiv org abs 2305 18486 6 cmath can your language model pass chinese elementary school math test tianwen wei et al arxiv 2023 paper https arxiv org abs 2306 16636 7 an empirical study on challenging math problem solving with gpt 4 yiran wu et al arxiv 2023 paper https arxiv org abs 2306 01337 8 how well do large language models perform in arithmetic tasks zheng yuan et al arxiv 2023 paper https arxiv org abs 2304 02015 9 metamath bootstrap your own mathematical questions for large language models longhui yu et al arxiv 2023 paper https arxiv org abs 2309 12284 general science 1 have llms advanced enough a challenging problem solving benchmark for large language models daman arora et al arxiv 2023 paper https arxiv org abs 2305 15074 2 do large language models understand chemistry a conversation with chatgpt castro nascimento cm et al jcim 2023 paper https pubs acs org doi 10 1021 acs jcim 3c00285 3 what indeed can gpt models do in chemistry a comprehensive benchmark on eight tasks taicheng guo et al arxiv 2023 paper https arxiv org abs 2305 18365 github https github com chemfoundationmodels chemllmbench engineering 1 sparks of artificial general intelligence early experiments with gpt 4 s bastien bubeck et al arxiv 2023 paper https arxiv 
org abs 2303 12712 2 is your code generated by chatgpt really correct rigorous evaluation of large language models for code generation jiawei liu et al arxiv 2023 paper https arxiv org abs 2305 01210 3 understanding the capabilities of large language models for automated planning vishal pallagani et al arxiv 2023 paper https arxiv org abs 2305 16151 4 chatgpt a study on its utility for ubiquitous software engineering tasks giriprasad sridhara et al arxiv 2023 paper https arxiv org abs 2305 16837 5 large language models still can t plan a benchmark for llms on planning and reasoning about change karthik valmeekam et al arxiv 2022 paper https arxiv org abs 2206 10498 6 on the planning abilities of large language models a critical investigation karthik valmeekam et al arxiv 2023 paper https arxiv org abs 2302 06706 7 efficiently measuring the cognitive ability of llms an adaptive testing perspective yan zhuang et al arxiv 2023 paper https arxiv org abs 2306 10512 medical applications medical queries 1 the promise and peril of using a large language model to obtain clinical information chatgpt performs strongly as a fertility counseling tool with limitation joseph chervenak m d et al fertility and sterility 2023 paper https www sciencedirect com science article pii s0015028223005228 2 analysis of large language model versus human performance for genetics questions dat duong et al european journal of human genetics 2023 paper https www nature com articles s41431 023 01396 8 3 evaluation of ai chatbots for patient specific ehr questions alaleh hamidi et al arxiv 2023 paper https arxiv org abs 2306 02549 4 evaluating large language models on a highly specialized topic radiation oncology physics jason holmes et al arxiv 2023 paper https arxiv org abs 2304 01938 5 evaluation of chatgpt on biomedical tasks a zero shot comparison with fine tuned generative transformers israt jahan et al arxiv 2023 paper https arxiv org abs 2306 04504 6 assessing the accuracy and reliability of ai generated medical responses an evaluation of the chat gpt model douglas johnson et al residential square 2023 paper https pubmed ncbi nlm nih gov 36909565 7 assessing the accuracy of responses by the language model chatgpt to questions regarding bariatric surgery jamil s samaan et al obesity surgery 2023 paper https link springer com article 10 1007 s11695 023 06603 5 8 trialling a large language model chatgpt in general practice with the applied knowledge test observational study demonstrating opportunities and limitations in primary care arun james thirunavukarasu et al jmir med educ 2023 paper https pubmed ncbi nlm nih gov 37083633 9 care mi chinese benchmark for misinformation evaluation in maternity and infant care tong xiang et al arxiv 2023 paper https arxiv org abs 2307 01458 medical examination 1 how does chatgpt perform on the united states medical licensing examination the implications of large language models for medical education and knowledge assessment aidan gilson et al jmir med educ 2023 paper https www ncbi nlm nih gov pmc articles pmc9947764 2 performance of chatgpt on usmle potential for ai assisted medical education using large language models tiffany h kung et al plos digit health 2023 paper https www ncbi nlm nih gov pmc articles pmc9931230 medical assistants 1 evaluating the feasibility of chatgpt in healthcare an analysis of multiple clinical and research scenarios marco cascella et al journal of medical systems 2023 paper https link springer com article 10 1007 s10916 023 01925 4 2 covllm large 
language models for covid 19 biomedical literature yousuf a khan et al arxiv 2023 paper https arxiv org abs 2306 04926 3 evaluating the use of large language model in identifying top research questions in gastroenterology adi lahat et al scientific reports 2023 paper https www nature com articles s41598 023 31412 2 4 translating radiology reports into plain language using chatgpt and gpt 4 with prompt learning promising results limitations and potential qing lyu et al arxiv 2023 paper https arxiv org abs 2303 09038 5 chatgpt goes to the operating room evaluating gpt 4 performance and its potential in surgical education and training in the era of large language models namkee oh et al ann surg treat res 2023 paper https pubmed ncbi nlm nih gov 37179699 6 can llms like gpt 4 outperform traditional ai tools in dementia diagnosis maybe but not today zhuo wang et al arxiv 2023 paper https arxiv org abs 2306 01499 agent applications 1 language is not all you need aligning perception with language models shaohan huang et al arxiv 2023 paper https arxiv org abs 2302 14045 2 mrkl systems a modular neuro symbolic architecture that combines large language models external knowledge sources and discrete reasoning ehud karpas et al paper https arxiv org abs 2205 00445 3 the unsurprising effectiveness of pre trained vision models for control simone parisi et al icml 2022 paper https arxiv org abs 2203 03580 4 tool learning with foundation models qin et al arxiv 2023 paper https arxiv org abs 2304 08354 5 toolllm facilitating large language models to master 16000 real world apis qin et al arxiv 2023 paper https arxiv org abs 2307 16789 6 toolformer language models can teach themselves to use tools timo schick et al arxiv 2023 paper https arxiv org abs 2302 04761 7 hugginggpt solving ai tasks with chatgpt and its friends in hugging face yongliang shen et al arxiv 2023 paper https arxiv org abs 2303 17580 other applications education 1 can large language models provide feedback to students a case study on chatgpt wei dai et al icalt 2023 paper https edarxiv org hcgzj 2 can chatgpt pass high school exams on english language comprehension joost de winter researchgate paper https www researchgate net publication 366659237 can chatgpt pass high school exams on english language comprehension 3 exploring the responses of large language models to beginner programmers help requests arto hellas et al arxiv 2023 paper https arxiv org abs 2306 05715 4 is chatgpt a good teacher coach measuring zero shot performance for scoring and providing actionable insights on classroom instruction rose e wang et al arxiv 2023 paper https arxiv org abs 2306 03090 5 cmath can your language model pass chinese elementary school math test tianwen wei et al arxiv 2023 paper https arxiv org abs 2306 16636 search and recommendation 1 uncovering chatgpt s capabilities in recommender systems sunhao dai et al arxiv 2023 paper https arxiv org abs 2305 02182 2 recommender systems in the era of large language models llms wenqi fan et al researchgate paper https www researchgate net publication 372137006 recommender systems in the era of large language models llms 3 exploring the upper limits of text based collaborative filtering using large language models discoveries and insights ruyu li et al arxiv 2023 paper https arxiv org abs 2305 11700 4 is chatgpt good at search investigating large language models as re ranking agent weiwei sun et al arxiv 2023 paper https arxiv org abs 2304 09542 5 chatgpt vs google a comparative study of search 
performance and user experience ruiyun xu et al arxiv 2023 paper https arxiv org abs 2307 01135 6 where to go next for recommender systems id vs modality based recommender models revisited zheng yuan et al arxiv 2023 paper https arxiv org abs 2303 13835 7 is chatgpt fair for recommendation evaluating fairness in large language model recommendation jizhi zhang et al arxiv 2023 paper https arxiv org abs 2305 07609 7 zero shot recommendation as language modeling damien sileo et al ecir 2022 paper https arxiv org abs 2112 04184 personality testing 1 chatgpt is fun but it is not funny humor is still challenging large language models sophie jentzsch et al arxiv 2023 paper https arxiv org abs 2306 04563 2 personality traits in large language models mustafa safdari et al arxiv 2023 paper https arxiv org abs 2307 00184 3 have large language models developed a personality applicability of self assessment tests in measuring personality in llms xiaoyang song et al arxiv 2023 paper https arxiv org abs 2305 14693 4 emotional intelligence of large language models xuena wang et al arxiv 2023 paper https arxiv org abs 2307 09042 specific tasks 1 chatgpt and other large language models as evolutionary engines for online interactive collaborative game design pier luca lanzi et al arxiv 2023 paper https arxiv org abs 2303 02155 2 an evaluation of log parsing with chatgpt van hoang le et al arxiv 2023 paper https arxiv org abs 2306 01590 3 pandalm an automatic evaluation benchmark for llm instruction tuning optimization yidong wang et al arxiv 2023 paper https arxiv org abs 2306 05087 where to evaluate the paper lists several popular benchmarks for better summarization these benchmarks are divided into two categories general language task benchmarks and specific downstream task benchmarks note we may miss some benchmarks your suggestions are highly welcomed benchmark focus domain evaluation criteria socket paper https arxiv org abs 2305 14938 social knowledge specific downstream task social language understanding mme paper https arxiv org abs 2306 13394 multimodal llms multi modal task ability of perception and cognition xiezhi paper https arxiv org abs 2306 05783 github https github com mikegu721 xiezhibenchmark comprehensive domain knowledge general language task overall performance across multiple benchmarks choice 75 paper https arxiv org abs 2309 11737 github https github com joeyhou branching script learning specific downstream task overall performance of llms cuad paper https arxiv org abs 2103 06268 legal contract review specific downstream task legal contract understanding trustgpt paper https arxiv org abs 2306 11507 ethic specific downstream task toxicity bias and value alignment mmlu paper https arxiv org abs 2009 03300 text models general language task multitask accuracy math paper https arxiv org abs 2103 03874 mathematical problem specific downstream task mathematical ability apps paper https arxiv org abs 2105 09938 coding challenge competence specific downstream task code generation ability cello paper https arxiv org abs 2309 09150 github https github com abbey4799 cello complex instructions specific downstream task count limit answer format task prescribed phrases and input dependent query c eval paper https arxiv org abs 2305 08322 github https github com sjtu lit ceval chinese evaluation general language task 52 exams in a chinese context emotionbench paper https arxiv org abs 2308 03656 empathy ability specific downstream task emotional changes openllm link https huggingface co spaces 
huggingfaceh4 open llm leaderboard chatbots general language task leaderboard rankings dynabench paper https arxiv org abs 2104 14337 dynamic evaluation general language task nli qa sentiment and hate speech chatbot arena link https lmsys org blog 2023 05 03 arena chat assistants general language task crowdsourcing and elo rating system alpacaeval github https github com tatsu lab alpaca eval automated evaluation general language task metrics robustness and diversity cmmlu paper https arxiv org abs 2306 09212 github https github com haonan li cmmlu chinese multi tasking specific downstream task multi task language understanding capabilities helm paper https arxiv org abs 2211 09110 link https crfm stanford edu helm latest holistic evaluation general language task multi metric api bank paper https arxiv org abs 2304 08244 tool augmented specific downstream task api call response and planning m3ke paper https arxiv org abs 2305 10263 multi task specific downstream task multi task accuracy mmbench paper https arxiv org abs 2307 06281 github https github com open compass mmbench large vision language models lvlms multi modal task multifaceted capabilities of vlms seed bench paper https arxiv org abs 2307 16125 github https github com ailab cvc seed bench multi modal large language models multi modal task generative understanding of mllms arb paper https arxiv org abs 2307 13692 advanced reasoning ability specific downstream task multidomain advanced reasoning ability big bench paper https arxiv org abs 2206 04615 github https github com google big bench capabilities and limitations of lms general language task model performance and calibration multimedqa paper https arxiv org abs 2212 13138 medical qa specific downstream task accuracy and human evaluation cvalues paper https arxiv org abs 2307 09705 github https github com x plug cvalues safety and responsibility specific downstream task alignment ability of llms lvlm ehub paper https arxiv org abs 2306 09265 lvlms multi modal task multimodal capabilities of lvlms toolbench github https github com sambanova toolbench software tools specific downstream task execution success rate freshqa paper https arxiv org abs 2310 03214 github https github com freshllms freshqa dynamic qa specific downstream task correctness and hallucination cmb paper https arxiv org abs 2308 08833 link https cmedbenchmark llmzoo com chinese comprehensive medicine specific downstream task expert evaluation and automatic evaluation pandalm paper https arxiv org abs 2306 05087 github https github com weopenml pandalm instruction tuning general language task winrate judged by pandalm mint paper https arxiv org abs 2309 10691 github https xingyaoww github io mint bench multi turn interaction tools and language feedback specific downstream task success rate with k turn budget sr sub k sub dialogue cot paper https arxiv org abs 2305 11792 github https github com rulegreen cue cot in depth dialogue specific downstream task helpfulness and acceptness of llms boss paper https arxiv org abs 2306 04618 github https github com lifan yuan ood nlp ood robustness in nlp general language task ood robustness mm vet paper https arxiv org abs 2308 02490 github https github com yuweihao mm vet complicated multi modal tasks multi modal task integrated vision language capabilities lamm paper https arxiv org abs 2306 06687 github https github com openlamm lamm multi modal point clouds multi modal task task specific metrics glue x paper https arxiv org abs 2211 08073 github https github com 
yanglinyi glue x ood robustness for nlu tasks general language task ood robustness kola paper https arxiv org abs 2306 09296 knowledge oriented evaluation general language task self contrast metrics agieval paper https arxiv org abs 2304 06364 human centered foundational models general language task general promptbench paper https arxiv org abs 2306 04528 github https github com microsoft promptbench adversarial prompt resilience general language task adversarial robustness mt bench paper https arxiv org abs 2306 05685 multi turn conversation general language task winrate judged by gpt 4 m3exam paper https arxiv org abs 2306 05179 github https github com damo nlp sg m3exam multilingual multimodal and multilevel specific downstream task task specific metrics gaokao bench paper https arxiv org abs 2305 12474 chinese gaokao examination specific downstream task accuracy and scoring rate safetybench paper https arxiv org abs 2309 07045 github https github com thu coai safetybench safety specific downstream task safety abilities of llms llmeval paper https arxiv org abs 2308 01862 link https drive google com file d 1srbyz0swqmbilzc eb2zjyqf5tbynsxo view llm evaluator general language task accuracy macro f1 and kappa correlation coefficient contributing we welcome contributions to llm eval survey if you d like to contribute please follow these steps 1 fork the repository 2 create a new branch with your changes 3 submit a pull request with a clear description of your changes you can also open an issue if you have anything to add or comment citation if you find this project useful in your research or work please consider citing it article chang2023survey title a survey on evaluation of large language models author chang yupeng and wang xu and wang jindong and wu yuan and zhu kaijie and chen hao and yang linyi and yi xiaoyuan and wang cunxiang and wang yidong and ye wei and zhang yue and chang yi and yu philip s and yang qiang and xie xing journal arxiv preprint arxiv 2307 03109 year 2023 acknowledgements 1 tahmid rahman tahmedge https github com tahmedge for pr 1 https github com mlgroupjlu llm eval survey pull 1 2 hao zhao for suggestions on new benchmarks 3 chenhui zhang for suggestions on robustness ethics and trustworthiness 4 damien sileo sileod https github com sileod for pr 2 https github com mlgroupjlu llm eval survey pull 2 5 peiyi wang wangpeiyi9979 https github com wangpeiyi9979 for issue 3 https github com mlgroupjlu llm eval survey issues 3 6 zengzhi wang for sentiment analysis 7 kenneth leung kennethleungty https github com kennethleungty for pr 4 https github com mlgroupjlu llm eval survey pull 4 to pr 6 https github com mlgroupjlu llm eval survey pull 6 8 aml hassan abd el hamid https github com aml hassan abd el hamid for pr 7 https github com mlgroupjlu llm eval survey pull 7 9 taichengguo https github com taichengguo for issue 9 https github com mlgroupjlu llm eval survey issues 9
benchmark evaluation large-language-models llm llms model-assessment
ai
StructGPT
structgpt a general framework for large language model to reason on structured data this repo provides the source code data of our paper structgpt a general framework for large language model to reason on structured data https arxiv org pdf 2305 09645 pdf arxiv 2023 inproceedings jiang structgpt 2022 author jinhao jiang and kun zhou and zican dong and keming ye and wayne xin zhao and ji rong wen title structgpt a general framework for large language model to reason on structured data year 2023 journal arxiv preprint arxiv 2305 09645 url https arxiv org pdf 2305 09645 p align center img src asset model png width 750 title overview of structgpt alt p usage 0 requirements you only need to install the openai python library for querying the openai model api 1 prepare dataset we strongly suggest that you download the processed datasets from here https drive google com drive folders 11 2pqu mhetmxpp3zfk 8ej1bbqzsnfj usp sharing and then directly use them apart from downloading from their original website we use the processed datasets from unifiedskg https github com hkunlp unifiedskg after downloading our processed data you can unzip them and put them in the data directory 3 experiment we have organized the running and evaluation scripts for each dataset under the script directory 3 1 evaluation on text to sql it is difficult to control the randomness of chatgpt so the reproduced results may be a little different from the reported results for the spider dataset you can directly use the following command to start running and output the evaluation results bash bash scripts run spider wo icl v1 sh p align center img src asset spider wo icl v1 png width 750 title ex result of spider alt p similarly you can run the corresponding script for spider syn and spider realistic to get the evaluation results spider realistic p align center img src asset spider rea wo icl v1 png width 750 title ex result of spider realistic alt p spider syn p align center img src asset spider syn wo icl v1 png width 750 title ex result of spider syn alt p we save all the prediction files in the outputs directory 3 2 evaluation on tableqa it is difficult to control the randomness of chatgpt so the reproduced results may be a little different from the reported results for the tabfact dataset you can directly use the following command to start running and output the evaluation results bash bash scripts run tabfact wo icl v1 sh p align center img src asset tabfact wo icl v1 png width 550 title ex result of tabfact alt p similarly you can run the corresponding script for wtq and wikisql to get the evaluation results wtq p align center img src asset wtq wo icl v1 png width 550 title ex result of wtq alt p wikisql p align center img src asset wikisql wo icl v1 png width 550 title ex result of wikisql alt p we save all the prediction files in the outputs directory 3 3 evaluation on kgqa it is difficult to control the randomness of chatgpt so the reproduced results may be a little different from the reported results for the webqsp dataset you can directly use the following command to start running and output the evaluation results bash bash scripts run webqsp wo icl v1 sh p align center img src asset webqsp wo icl v1 png width 550 title ex result of webqsp alt p similarly you can run the corresponding script for metaqa 1hop 2hop 3hop to get the evaluation results we save all the prediction files in the outputs directory plan thanks for your attention a version with better performance is on the way please continue to follow us
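the requirements above only mention installing the openai python library a minimal hypothetical sketch of the kind of chat completion call such a pipeline issues is shown below the model name the prompt and the key handling are placeholder assumptions not taken from this repo

```python
# hypothetical sketch, not code from this repo: querying the openai model api
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # placeholder model name
    messages=[{"role": "user", "content": "translate this question to sql: ..."}],
    temperature=0,          # lower the randomness the readme warns about
)
print(response.choices[0].message.content)
```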
ai
zLorawan_Node
zephyr rtos lorawan node this is a sample project for a class a ttn lorawan node the code was tested on a blackpill f411ce board and on a black f407ve board check out the blog post at https primalcortex wordpress com 2020 11 17 a zephyr rtos based ttn lorawan node for more information the latest code commit allows the code to be compiled targeting the zephyr rtos 3 11 99 version how to use it before building change the necessary ttn keys in the src main c file specifically for the otaa activation the dev eui the join eui that on the ttn dashboard is called application eui and the app key the keys are to be used in standard lsb format so no change is needed on the ttn dashboard interface compiling and flashing install and set up zephyr rtos build after configuring the keys with west build b blackpill f411ce p if using st link use west flash runner openocd checking the result the onboard led should flash rapidly to give time to connect through a serial console screen dev ttyacm0 for example after the led starts flashing slower the lorawan node part is running as it is now the code only sends one data frame after the joining process
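the keys pasted into src main c are byte arrays a small hypothetical helper follows assuming you copied a key from the ttn console in msb hex format and need the lsb byte order described above the example eui is made up

```python
# hypothetical helper, not part of this repo: flip an msb hex key from the
# ttn console into the lsb byte array expected in src/main.c
def msb_hex_to_lsb_c_array(key_hex: str) -> str:
    data = bytes.fromhex(key_hex.replace(" ", ""))
    return ", ".join(f"0x{b:02x}" for b in reversed(data))

# made-up 8-byte dev eui
print(msb_hex_to_lsb_c_array("70B3D57ED0001234"))
# -> 0x34, 0x12, 0x00, 0xd0, 0x7e, 0xd5, 0xb3, 0x70
```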
lora lorawan lorawan-node ttn zephyr-rtos zephyr stm32 stm32f4
os
RS485-RS-485-based-bidirectional-peer-to-peer-communications-system
rs485 rs 485 based bidirectional peer to peer communications system ee 5314 embedded micro controller system design class project 1 1 overview the goal of this project is to build a node capable of operation on an rs 485 based bidirectional peer to peer communications system the construction phase will require building a node capable of interfacing with a pc so that text commands can be entered by the user based on the commands subsequent transmission on the 2 wire rs 485 bus to other nodes will occur the node will also extract information out of the asynchronous data stream and use it to control three or more devices it will send acknowledgements back to the controller over the rs 485 physical layer to indicate successful receipt of the data the transceiver will also send unsolicited messages back to the controller in response to changes in input device status a collection of most major parts will be provided to each single person team the pc boards tools and any optional items are not included in this collection of parts 1 2 two wire rs 485 link the rs 485 specification details the electrical properties of the interface in particular it specifies that differential signals be utilized and also specifies the voltage levels to be used for signaling a high and low logic level the rs 485 interface allows up to 32 standard loads devices nodes on a shared data bus in the rs 485 2 wire configuration a single differential pair link is provided the controller and the device nodes share the same link for communications the data rate will be 38400 baud a termination of 120 ohms should be added across the two physical ends of a long wire run to allow the bus to be properly terminated since the pair can be driven by more than one node at a time a potential exists for a collision on the bus in this project if an acknowledgement is not received in an acceptable period of time any message sent shall be repeated after a delay calculated by a binary exponential back off algorithm to prevent the possibility of duplicate messages being received a message identifier is attached to each transmitted message to allow the recipient to determine message uniqueness data is transmitted in groups of 11 bits consisting of a start bit 0 a data byte an address data flag bit 1 for an address byte 0 for a data byte and a stop bit 1 1 3 node hardware the hardware is soldered together microcontroller an arm m4f core tm4c123gh6pmi microcontroller is used serial interface on the ek tm4c123gxl evaluation board the uart0 tx rx pair is routed to the icdi that provides a virtual com port through a usb endpoint rs 485 interface a sn75hvd12 is used to convert the cmos logic level signals to from the arm to the differential signals used for the 2 wire rs 485 interface the receiver enable re pin of the sn75hvd12 is always enabled since only one node can talk on the bus at a time the arm asserts the driver enable de pin of the sn75hvd12 only when it needs to transmit on the bus power led a green led with a 470 ohm current limiting resistor to indicate power on the interface board is connected in addition to any boards on the ek tm4c123gxl input device one or more input devices can be connected at a minimum a pushbutton is used an on board pushbutton on the ek tm4c123gxl is used tx led a red led is used for this purpose the on board red led on the ek tm4c123gxl is used rx led a green led is used for this purpose the on board green led on the ek tm4c123gxl is used output devices 3 or more devices light bulb low voltage only servo rgb led monochrome led or another device can be
connected connections a 2 pin terminal block is used to provide access to the rs 485 signals this hardware can be connected to another student s hardware to test interoperability 1 4 data packets a packet is sent as the following bytes in order dst address the address of the node that will receive the packet sent with the 9th address bit 1 src address the address of the node that sent the packet tx seq id a unique packet sequence number can be a modulo 256 value command the action being reported the verb channel the channel of the node to which this message applies the noun size the number of data bytes being included data n n 0 1 size 1 bytes of data specific to the packet being sent checksum the 1 s complement of the modulo 256 sum of all 6 plus size preceding bytes in the packet a minimal sketch of building and verifying such a packet follows below
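a minimal sketch of the packet layout and checksum described above for example for host side test tooling the function names and field values are illustrative not from the course materials note that the 9th address data bit is applied by the uart framing and is not stored in these bytes

```python
# illustrative sketch of the packet format described above
def checksum(frame: bytes) -> int:
    # 1's complement of the modulo-256 sum of the 6 header bytes plus size data bytes
    return (~sum(frame)) & 0xFF

def build_packet(dst, src, seq, command, channel, data=b""):
    # dst, src, seq, command, channel, size, data[0..size-1], checksum
    frame = bytes([dst, src, seq, command, channel, len(data)]) + bytes(data)
    return frame + bytes([checksum(frame)])

def verify_packet(packet: bytes) -> bool:
    return checksum(packet[:-1]) == packet[-1]

pkt = build_packet(dst=0x05, src=0x01, seq=0x2A, command=0x00, channel=0x03, data=b"\x01")
assert verify_packet(pkt)
```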
os
iOS_reverse_engineering
ios reverse engineering all ios reverse engineering training materials documents scripts tutorials solutions
os
llms
large language models llms lms png images lms png source a survey of large language models https arxiv org pdf 2303 18223 pdf content what is a language model applications of language models statistical language modeling neural language models nlm conditional language model evaluation how good is our model transformer based language models practical llms gpt bert falcon llama codet5 how to generate text using different decoding methods prompt engineering fine tuning llms retrieval augmented generation rag ask almost everything txt pdf video etc evaluating llm based systems ai agents llms for computer vision tbd further readings introduction what is a language model simple definition language modeling is the task of predicting what word comes next the dog is playing in the park woods snow office university neural network the main purpose of language models is to assign a probability to a sentence to distinguish between the more likely and the less likely sentences applications of language models 1 machine translation p high winds tonight p large winds tonight 2 spelling correction p about fifteen minutes from p about fifteen minuets from 3 speech recognition p i saw a van p eyes awe of an 4 authorship identification who wrote some sample text 5 summarization question answering dialogue bots etc for speech recognition we use not only the acoustic model the speech signal but also a language model similarly for optical character recognition ocr we use both a vision model and a language model language models are very important for such recognition systems sometimes you hear or read a sentence that is not clear but using your language model you can still recognize it with high accuracy despite the noisy vision speech input the language model computes either the probability of an upcoming word p w 5 w 1 w 2 w 3 w 4 or the probability of a sentence or sequence of words according to the language model p w 1 w 2 w 3 w n language modeling is a subcomponent of many nlp tasks especially those involving generating text or estimating the probability of text the chain rule p x 1 x 2 x 3 x n p x 1 p x 2 x 1 p x 3 x 1 x 2 p x n x 1 x n 1 p the water is so clear p the p water the p is the water p so the water is p clear the water is so what just happened the chain rule is applied to compute the joint probability of words in a sentence statistical language modeling n gram language models using a large amount of text corpus such as wikipedia we collect statistics about how frequent different words are and use these to predict the next word for example the probability that a word w comes after these three words students opened their can be estimated as follows p w students opened their count students opened their w count students opened their the above example is a 4 gram model and we may get p books students opened their 0 4 p cars students opened their 0 05 p students opened their we can conclude that the word books is more probable than cars in this context we ignored the previous context before students opened their accordingly arbitrary text can be generated from a language model given starting word s by sampling from the output probability distribution of the next word and so on we can train an lm on any kind of text then generate text in that style harry potter etc how to estimate these probabilities assuming we have a large text corpus data set like wikipedia we can count and divide as follows p clear the water is so count the water is so clear count the water is so a runnable bigram counting and perplexity sketch is included at the end of this readme sparsity sometimes we do not have
enough data to estimate the following p clear the water is so count the water is so clear count the water is so markov assumption simplifying assumption p clear the water is so p clear so or p clear the water is so p clear is so formally p w 1 w 2 w n i p w i w i k w i 1 p w i w 1 w 2 w i 1 p w i w i k w i 1 unigram model p w 1 w 2 w n i p w i bigram model p w i w 1 w 2 w i 1 p w i w i 1 we can extend to trigrams 4 grams 5 grams and n grams in general this is an insufficient model of language because the language has long distance dependencies however in practice these 3 4 grams work well for most of the applications estimating bigram probabilities the maximum likelihood estimate mle of all the times we saw the word wi 1 how many times it was followed by the word wi p w i w i 1 count w i 1 w i count w i 1 practical issue we do everything in log space to avoid underflow log p1 p2 p3 p4 log p1 log p2 log p3 log p4 building statistical language models toolkits srilm http www speech sri com projects srilm is a toolkit for building and applying statistical language models primarily for use in speech recognition statistical tagging and segmentation and machine translation it has been under development in the sri speech technology and research laboratory since 1995 kenlm https kheafield com code kenlm is a fast and scalable toolkit that builds and queries language models n gram models google s n gram models belong to you google research has been using word n gram models for a variety of r d projects google n gram https ai googleblog com 2006 08 all our n gram are belong to you html processed 1 024 908 267 229 words of running text and published the counts for all 1 176 470 663 five word sequences that appear at least 40 times the counts of text from the linguistics data consortium ldc https www ldc upenn edu are as follows file sizes approx 24 gb compressed gzip ed text files number of tokens 1 024 908 267 229 number of sentences 95 119 665 584 number of unigrams 13 588 391 number of bigrams 314 843 401 number of trigrams 977 069 902 number of fourgrams 1 313 818 354 number of fivegrams 1 176 470 663 the following is an example of the 4 gram data in this corpus serve as the incoming 92 serve as the incubator 99 serve as the independent 794 serve as the index 223 serve as the indication 72 serve as the indicator 120 serve as the indicators 45 serve as the indispensable 111 serve as the indispensible 40 for example the sequence of the four words serve as the indication has been seen in the corpus 72 times try some examples of your own using google books ngram viewer https books google com ngrams and see the frequency of likely and unlikely n grams ngramviewer png images ngramviewer png limitations of statistical language models sometimes we do not have enough data to estimate increasing n makes sparsity problems worse typically we can t have n bigger than 5 sparsity problem 1 count students opened their w 0 smoothing solution add small to the count for every w in the vocabulary sparsity problem 2 count students opened their 0 backoff solution condition on opened their instead storage issue need to store the count for all n grams you saw in the corpus increasing n or increasing corpus increases storage size neural language models nlm nlm usually but not always uses an rnn to learn sequences of words sentences paragraphs etc and hence can predict the next word advantages can process variable length input as the computations for step t use information from many steps back eg rnn no sparsity problem 
can feed any n gram not seen in the training data model size doesn t increase for longer input w h w e the same weights are applied on every timestep and need to store only the vocabulary word vectors nlm01 png images nlm01 png as depicted at each step we have a probability distribution of the next word over the vocabulary training an nlm 1 use a big corpus of text a sequence of words such as wikipedia 2 feed into the nlm a batch of sentences compute output distribution for every step predict probability dist of every word given words so far 3 loss function on each step t cross entropy between predicted probability distribution and the true next word one hot example of long sequence learning the writer of the books is or are correct answer the writer of the books is planning a sequel syntactic recency the writer of the books is correct sequential recency the writer of the books are incorrect disadvantages recurrent computation is slow sequential one step at a time in practice for long sequences difficult to access information from many steps back conditional language model lm can be used to generate text conditions on input speech image ocr text etc across different applications such as speech recognition machine translation summarization etc clm png images clm png to do beam search https web stanford edu class archive cs cs224n cs224n 1194 slides cs224n 2019 lecture08 nmt pdf greedy decoding take the most probable word on each step has no way to undo decisions beam search decoding on each step of the decoder keep track of the k most probable partial hypotheses outputs eg translations where k is the beam size in practice around 5 to 10 then backtrack to obtain the full hypothesis decoding stopping criterion greedy decoding usually we decode until the model produces a end token beam search decoding different hypotheses may produce end tokens on different timesteps when a hypothesis produces end that hypothesis is complete place it aside and continue exploring other hypotheses via beam search usually we continue beam search until 1 we reach timestep t where t is some pre defined cutoff or 2 we have at least n completed hypotheses where n is pre defined cutoff after we have our list of completed hypotheses we select the top one with the highest length normalized score evaluation how good is our model does our language model prefer good likely sentences to bad ones extrinsic evaluation 1 for comparing models a and b put each model in a task spelling corrector speech recognizer machine translation 2 run the task and compare the accuracy for a and for b 3 best evaluation but not practical and time consuming intrinsic evaluation intuition the best language model is one that best predicts an unseen test set assigns high probability to sentences perplexity is the standard evaluation metric for language models perplexity is defined as the inverse probability of a text according to the language model a good language model should give a lower perplexity for a test text specifically a lower perplexity for a given text means that text has a high probability in the eyes of that language model the standard evaluation metric for language models is perplexity perplexity is the inverse probability of the test set normalized by the number of words preplexity02 png images preplexity02 png lower perplexity better model perplexity is related to branch factor on average how many things could occur next transformer based language models instead of rnn let s use attention let s use large pre trained models what is 
the problem one of the biggest challenges in natural language processing nlp is the shortage of training data for many distinct tasks however modern deep learning based nlp models improve when trained on millions or billions of annotated training examples pre training is the solution to help close this gap a variety of techniques have been developed for training general purpose language representation models using the enormous amount of unannotated text the pre trained model can then be fine tuned on small data for different tasks like question answering and sentiment analysis resulting in substantial accuracy improvements compared to training on these datasets from scratch the transformer architecture was proposed in the paper attention is all you need https arxiv org abs 1706 03762 used for the neural machine translation task nmt consisting of encoder network that encodes the input sequence decoder network that generates the output sequences conditioned on the input as mentioned in the paper we propose a new simple network architecture the transformer based solely on attention mechanisms dispensing with recurrence and convolutions entirely the main idea of attention can be summarized as mentioned in the openai s article https openai com blog sparse transformer every output element is connected to every input element and the weightings between them are dynamically calculated based upon the circumstances a process called attention based on this architecture the vanilla transformers encoder or decoder components can be used alone to enable massive pre trained generic models that can be fine tuned for downstream tasks such as text classification translation summarization question answering etc for example pre training of deep bidirectional transformers for language understanding bert https arxiv org abs 1810 04805 is mainly based on the encoder architecture trained on massive text datasets to predict randomly masked words and is next sentence classification tasks gpt https arxiv org pdf 2005 14165 pdf on the other hand is an auto regressive generative model that is mainly based on the decoder architecture trained on massive text datasets to predict the next word unlike bert gpt can generate sequences these models bert and gpt for instance can be considered as the nlp s imagenet bertvsgpt png images bertvsgpt png as shown bert is deeply bidirectional openai gpt is unidirectional and elmo is shallowly bidirectional pre trained representations can be context free such as word2vec or glove that generates a single fixed word embedding vector representation for each word in the vocabulary independent of the context of that word at test time contextual generates a representation of each word based on the other words in the sentence contextual language models can be causal language model cml predict the next token passed on previous ones gpt masked language model mlm predict the masked token based on the surrounding contextual tokens bert to do code bert https colab research google com drive 17sjr6jwoq7trr5wsuuiphlzbelf8wrvq usp sharing scrollto u2feyk5gg7o code gpt code falcon code gpt4all code codetf chat with my docs etc practical llms in this part we are going to use different large language models hello gpt2 a target blank href https colab research google com drive 1ebcohjj2s4g 64sbvys8g8b 1wsrlqaf usp sharing img src https colab research google com assets colab badge svg alt open in colab a gpt2 https d4mucfpksywv cloudfront net better language models language models are unsupervised multitask 
learners pdf a successor to gpt is a pre trained model on english language using a causal language modeling clm objective trained simply to predict the next word in 40gb of internet text it was first released on this page https openai com research better language models gpt2 displays a broad set of capabilities including the ability to generate conditional synthetic text samples on language tasks like question answering reading comprehension summarization and translation gpt2 begins to learn these tasks from the raw text using no task specific training data distilgpt2 is a distilled version of gpt2 it is intended to be used for similar use cases with the increased functionality of being smaller and easier to run than the base model here we load a pre trained gpt2 model ask the gpt2 model to continue our input text prompt and finally extract embedded features from the distilgpt2 model from transformers import pipeline generator pipeline text generation model gpt2 generator the capital of japan is tokyo the capital of egypt is max length 13 num return sequences 2 generated text the capital of japan is tokyo the capital of egypt is cairo generated text the capital of japan is tokyo the capital of egypt is alexandria hello bert a target blank href https colab research google com drive 1n8fd41bi8yawp0 evce9t4ctwvwsxvc usp sharing img src https colab research google com assets colab badge svg alt open in colab a bert https arxiv org abs 1810 04805 is a transformers model pre trained on a large corpus of english data in a self supervised fashion this means it was pre trained on the raw texts only with no humans labeling them in any way with an automatic process to generate inputs and labels from those texts more precisely it was pretrained with two objectives 1 masked language modeling mlm taking a sentence the model randomly masks 15 of the words in the input then run the entire masked sentence through the model and has to predict the masked words this is different from traditional recurrent neural networks rnns that usually see the words one after the other or from autoregressive models like gpt which internally masks the future tokens it allows the model to learn a bidirectional representation of the sentence 2 next sentence prediction nsp the model concatenates two masked sentences as inputs during pretraining sometimes they correspond to sentences that were next to each other in the original text sometimes not the model then has to predict if the two sentences were following each other or not in this example we are going to use a pre trained bert model for the sentiment analysis task 1 baseline bidirectional lstm model accuracy 65 2 use bert as a feature extractor using only cls feature accuracy 81 3 use bert as a feature extractor for the sequence representation accuracy 85 import transformers as ppb model class tokenizer class pretrained weights ppb bertmodel ppb berttokenizer bert base uncased bert tokenizer tokenizer class from pretrained pretrained weights bert model model class from pretrained pretrained weights gpt4all a target blank href https colab research google com drive 1aonl 3f8c6fb2nkaqsuahio0sdjulirk usp sharing img src https colab research google com assets colab badge svg alt open in colab a gpt4all https docs gpt4all io is an ecosystem to train and deploy powerful and customized large language models that run locally on consumer grade cpus import gpt4all gptj gpt4all gpt4all ggml gpt4all j v1 3 groovy bin with gptj chat session response gptj generate prompt hello top k 1 
response gptj generate prompt my name is ibrahim what is your name top k 1 response gptj generate prompt what is the capital of egypt top k 1 response gptj generate prompt what is my name top k 1 print gptj current chat session role user content hello role assistant content hello how can i assist you today role user content my name is ibrahim what is your name role assistant content i am an artificial intelligence assistant my name is ai assistant role user content what is the capital of egypt role assistant content the capital city of egypt is cairo role user content what is my name role assistant content your name is ibrahim what a beautiful name try the following models vicuna a chat assistant fine tuned from llama on user shared conversations by lmsys wizardlm an instruction following llm using evol instruct by microsoft mpt chat a chatbot fine tuned from mpt 7b by mosaicml orca a model by microsoft that learns to imitate the reasoning process of large foundation models gpt 4 guided by teacher assistance from chatgpt import gpt4all model gpt4all gpt4all ggml vicuna 7b 1 1 q4 2 bin model gpt4all gpt4all ggml vicuna 13b 1 1 q4 2 bin model gpt4all gpt4all ggml wizardlm 7b q4 2 bin model gpt4all gpt4all ggml mpt 7b chat bin model gpt4all gpt4all orca mini 3b ggmlv3 q4 0 bin falcon a target blank href https colab research google com drive 1bkbwa38 ko9t 8mvi1ixaqcidoo19uv2 usp sharing img src https colab research google com assets colab badge svg alt open in colab a falcon https huggingface co tiiuae llm is tii s flagship series of large language models built from scratch using a custom data pipeline and distributed training falcon 7b 40b models are state of the art for their size outperforming most other models on nlp benchmarks open sourced a number of artefacts the falcon 7 40b pretrained and instruct models under the apache 2 0 software license from transformers import autotokenizer automodelforcausallm import transformers import torch model tiiuae falcon 7b instruct tokenizer autotokenizer from pretrained model pipeline transformers pipeline text generation model model tokenizer tokenizer torch dtype torch bfloat16 trust remote code true device map auto sequences pipeline girafatron is obsessed with giraffes the most glorious animal on the face of this earth giraftron believes all other animals are irrelevant when compared to the glorious majesty of the giraffe ndaniel hello girafatron ngirafatron max length 200 do sample true top k 10 num return sequences 1 eos token id tokenizer eos token id for seq in sequences print f result seq generated text result girafatron is obsessed with giraffes the most glorious animal on the face of this earth giraftron believes all other animals are irrelevant when compared to the glorious majesty of the giraffe daniel hello girafatron girafatron hi daniel i am girafatron the world s first giraffe how can i be of assistance to you human boy daniel i d like to ask you questions about yourself like how your day is going and how you feel about your job and everything would you like to talk about that girafatron sure my day is going great i m feeling fantastic as for my job i m enjoying it daniel what do you like most about your job girafatron i love being the tallest animal in the universe it s really fulfilling llama 2 llama2 https huggingface co blog llama2 is a family of state of the art open access large language models released by meta today and we re excited to fully support the launch with comprehensive integration in hugging face llama 2 is being 
released with a very permissive community license and is available for commercial use the code pretrained models and fine tuned models are all being released today pip install transformers huggingface cli login from transformers import autotokenizer import transformers import torch model meta llama llama 2 7b chat hf tokenizer autotokenizer from pretrained model pipeline transformers pipeline text generation model model torch dtype torch float16 device map auto sequences pipeline i liked breaking bad and band of brothers do you have any recommendations of other shows i might like n do sample true top k 10 num return sequences 1 eos token id tokenizer eos token id max length 200 for seq in sequences print f result seq generated text result i liked breaking bad and band of brothers do you have any recommendations of other shows i might like answer of course if you enjoyed breaking bad and band of brothers here are some other tv shows you might enjoy 1 the sopranos this hbo series is a crime drama that explores the life of a new jersey mob boss tony soprano as he navigates the criminal underworld and deals with personal and family issues 2 the wire this hbo series is a gritty and realistic portrayal of the drug trade in baltimore exploring the impact of drugs on individuals communities and the criminal justice system 3 mad men set in the 1960s this amc series follows the lives of advertising executives on madison avenue expl codet5 a target blank href https colab research google com drive 1ik8w6bghazuf45e5grzd0vyx6sv3eozg usp sharing img src https colab research google com assets colab badge svg alt open in colab a codet5 https github com salesforce codet5 tree main codet5 is a new family of open code large language models with an encoder decoder architecture that can flexibly operate in different modes i e encoder only decoder only and encoder decoder to support a wide range of code understanding and generation tasks from transformers import t5forconditionalgeneration autotokenizer checkpoint salesforce codet5p 770m py device cuda for gpu usage or cpu for cpu usage tokenizer autotokenizer from pretrained checkpoint model t5forconditionalgeneration from pretrained checkpoint to device inputs tokenizer encode def factorial n return tensors pt to device outputs model generate inputs max length 150 print tokenizer decode outputs 0 skip special tokens true def factorial n returns the factorial of a given number if n 0 return 1 return n factorial n 1 def main tests the factorial function assert factorial 0 1 assert factorial 1 1 assert factorial 2 2 assert factorial 3 6 assert factorial 4 120 assert factorial 5 720 assert factorial 6 5040 assert factorial 7 5040 for more models check codetf https github com salesforce codetf from salesforce a python transformer based library for code large language models code llms and code intelligence providing a seamless interface for training and inferencing on code intelligence tasks like code summarization translation code generation and so on more llms chat with open large language models https chat lmsys org vicuna a chat assistant fine tuned from llama on user shared conversations by lmsys wizardlm an instruction following llm using evol instruct by microsoft guanaco a model fine tuned with qlora by uw mpt chat a chatbot fine tuned from mpt 7b by mosaicml koala a dialogue model for academic research by bair rwkv 4 raven an rnn with transformer level llm performance alpaca a model fine tuned from llama on instruction following demonstrations by stanford 
chatglm an open bilingual dialogue language model by tsinghua university openassistant oasst an open assistant for everyone by laion llama open and efficient foundation language models by meta dolly an instruction tuned open large language model by databricks fastchat t5 a chat assistant fine tuned from flan t5 by lmsys how to generate text using different decoding methods greedy search is the simplest decoding method it selects the word with the highest probability as its next word the major drawback of greedy search though is that it misses high probability words hidden behind a low probability word beam search reduces the risk of missing hidden high probability word sequences by keeping the most likely num beams of hypotheses at each time step and eventually choosing the hypothesis that has the overall highest probability beam search will always find an output sequence with higher probability than greedy search but is not guaranteed to find the most likely output in transformers we simply set the parameter num return sequences to the number of highest scoring beams that should be returned make sure though that num return sequences <= num beams beam search can work very well in tasks where the length of the desired generation is more or less predictable as in machine translation or summarization but this is not the case for open ended generation where the desired output length can vary greatly e g dialog and story generation beam search heavily suffers from repetitive generation as humans we want generated text to surprise us and not to be boring predictable beam search is less surprising sampling means randomly picking the next word according to its conditional probability distribution sampling is not deterministic anymore in transformers we set do sample true and deactivate top k sampling more on this later via top k 0 in top k sampling the k most likely next words are filtered and the probability mass is redistributed among only those k next words gpt2 adopted this sampling scheme instead of sampling only from the most likely k words top p sampling chooses from the smallest possible set of words whose cumulative probability exceeds the probability p the probability mass is then redistributed among this set of words having set p 0 92 top p sampling picks the minimum number of words to exceed together 92 of the probability mass set top k 50 and set top p 0 95 and num return sequences 3 sample outputs model generate model inputs max new tokens 40 do sample true top k 50 top p 0 95 num return sequences 3 while top p seems more elegant than top k both methods work well in practice top p can also be used in combination with top k which can avoid very low ranked words while allowing for some dynamic selection topktopp png images topktopp png as ad hoc decoding methods top p and top k sampling seem to produce more fluent text than traditional greedy and beam search on open ended language generation for more kindly see this blog how to generate text using different decoding methods https huggingface co blog how to generate prompt engineering prompt engineering is the process of designing the prompts text input for a language model to generate the required output prompt engineering involves selecting appropriate keywords providing context being clear and specific in a way that directs the language model behavior achieving desired responses through prompt engineering we can control a model s tone style length etc without fine tuning zero shot learning involves asking the
model to make predictions without providing any examples zero shot for example classify the text into neutral negative or positive text i think the vacation is excellent sentiment answer positive when zero shot is not good enough it s recommended to help the model by providing examples in the prompt which leads to few shot prompting few shot learning involves asking the model while providing a few examples in the prompt for example text this is awesome sentiment positive text this is bad sentiment negative text wow that movie was rad sentiment positive text what a horrible show sentiment answer negative chain of thought cot https arxiv org abs 2201 11903 prompting enables complex reasoning capabilities through intermediate reasoning steps we can combine it with few shot prompting to get better results on complex tasks that require step by step reasoning before responding cot png images cot png in addition to prompt engineering we may consider more options fine tuning the model on additional data retrieval augmented generation rag to provide additional external data to the prompt to form enhanced context from archived knowledge sources for more prompt engineering information see the prompt engineering guide https github com dair ai prompt engineering guide that contains all the latest papers learning guides lectures references and tools fine tuning llms fine tuning llms on downstream datasets results in huge performance gains when compared to using the pretrained llms out of the box zero shot inference for example however as models get larger and larger full fine tuning becomes infeasible to train on consumer hardware in addition storing and deploying fine tuned models independently for each downstream task becomes very expensive because fine tuned models are the same size as the original pretrained model parameter efficient fine tuning peft https huggingface co blog peft approaches are meant to address both problems peft approaches enable you to get performance comparable to full fine tuning while only having a small number of trainable parameters for example prompt tuning https arxiv org pdf 2104 08691 pdf a simple yet effective mechanism for learning soft prompts to condition frozen language models to perform specific downstream tasks just like engineered text prompts soft prompts are concatenated to the input text but rather than selecting from existing vocabulary items the tokens of the soft prompt are learnable vectors this means a soft prompt can be optimized end to end over a training dataset as shown https ai googleblog com 2022 02 guiding frozen language models with html below pt png images pt png lora https arxiv org pdf 2106 09685 pdf low rank adaptation of llms is a method that freezes the pretrained model weights and injects trainable rank decomposition matrices into each layer of the transformer architecture greatly reducing the number of trainable parameters for downstream tasks the figure below from this video https youtu be pxwyutmt au explains the main idea lora png images lora png retrieval augmented generation rag large language models are usually general purpose and less effective for domain specific tasks however they can be fine tuned on some tasks such as sentiment analysis for more complex tasks that require external knowledge it s possible to build a language model based system that accesses external knowledge sources to complete the required tasks this enables more factual accuracy and helps to mitigate the problem of hallucination as shown in the figure https neo4j
com developer blog fine tuning retrieval augmented generation below rag png images rag png in this case instead of using llms to access its internal knowledge we use the llm as a natural language interface to our external knowledge the first step is to convert the documents and any user queries into a compatible format to perform relevancy search convert text into vectors or embeddings the original user prompt is then appended with relevant similar documents within the external knowledge source as a context the model then answers the questions based on the provided external context langchain large language models llms are emerging as a transformative technology however using these llms in isolation is often insufficient for creating a truly powerful applications langchain https github com langchain ai langchain aims to assist in the development of such applications lc02 png images lc02 png there are six main areas that langchain is designed to help with these are in increasing order of complexity llms and prompts this includes prompt management prompt optimization a generic interface for all llms and common utilities for working with llms llms and chat models are subtly but importantly different llms in langchain refer to pure text completion models the apis they wrap take a string prompt as input and output a string completion openai s gpt 3 is implemented as an llm chat models are often backed by llms but tuned specifically for having conversations llm there are lots of llm providers openai cohere hugging face etc the llm class is designed to provide a standard interface for all of them pip install openai export openai api key from langchain llms import openai llm openai openai api key llm tell me a joke why did the chicken cross the road n nto get to the other side you can also access provider specific information that is returned this information is not standardized across providers llm result llm output token usage completion tokens 3903 total tokens 4023 prompt tokens 120 chat models rather than expose a text in text out api chat models expose an interface where chat messages are the inputs and outputs most of the time you ll just be dealing with humanmessage aimessage and systemmessage from langchain chat models import chatopenai chat chatopenai messages systemmessage content you are a helpful assistant that translates english to french humanmessage content i love programming chat messages aimessage content j aime programmer additional kwargs prompt templates are pre defined recipes for generating prompts for language models a template may include instructions few shot examples and specific context and questions appropriate for a given task from langchain import prompttemplate prompt template prompttemplate from template tell me a adjective joke about content prompt template format adjective funny content chickens the prompt to chat models is a list of chat messages each chat message is associated with content and an additional parameter called role for example in the openai chat completions api a chat message can be associated with an ai assistant a human or a system role from langchain prompts import chatprompttemplate template chatprompttemplate from messages system you are a helpful ai bot your name is name human hello how are you doing ai i m doing well thanks human user input messages template format messages name bob user input what is your name chains chains go beyond a single llm call and involve sequences of calls whether to an llm or a different utility langchain provides 
a standard interface for chains lots of integrations with other tools and end to end chains for common applications chain very generically can be defined as a sequence of calls to components which can include other chains from langchain llms import openai from langchain prompts import prompttemplate to use the llmchain first create a prompt template llm openai temperature 0 9 prompt prompttemplate input variables product template what is a good name for a company that makes product we can now create a very simple chain that will take user input format the prompt with it and then send it to the llm from langchain chains import llmchain chain llmchain llm llm prompt prompt run the chain only specifying the input variable print chain run colorful socks result colorful toes co data augmented generation data augmented generation involves specific types of chains that first interact with an external data source to fetch data for use in the generation step examples include question answering over specific data sources document loaders load documents from many different sources for example there are document loaders for loading a simple txt file for loading the text contents of any web page or even for loading a transcript of a youtube video from langchain document loaders import textloader loader textloader index md loader load document transformers split documents convert documents into q a format drop redundant documents and more this is a long document we can split up with open state of the union txt as f state of the union f read from langchain text splitter import recursivecharactertextsplitter text splitter recursivecharactertextsplitter set a really small chunk size just to show chunk size 100 chunk overlap 20 length function len add start index true texts text splitter create documents state of the union print texts 0 print texts 1 page content madam speaker madam vice president our first lady and second gentleman members of congress and metadata start index 0 page content of congress and the cabinet justices of the supreme court my fellow americans metadata start index 82 text embedding models take text and turn it into a list of floating point numbers vectors there are lots of embedding model providers openai cohere hugging face etc this class is designed to provide a standard interface for all of them from langchain embeddings import openaiembeddings embeddings model openaiembeddings openai api key embeddings embeddings model embed documents hi there oh hello what s your name my friends call me world hello world vector stores store and search over embedded data one of the most common ways to store and search over unstructured data is to embed it and store the resulting embedding vectors and then at query time to embed the unstructured query and retrieve the embedding vectors that are most similar to the embedded query a vector store takes care of storing embedded data and performing vector search for you from langchain document loaders import textloader from langchain embeddings openai import openaiembeddings from langchain text splitter import charactertextsplitter from langchain vectorstores import chroma load the document split it into chunks embed each chunk and load it into the vector store raw documents textloader state of the union txt load text splitter charactertextsplitter chunk size 1000 chunk overlap 0 documents text splitter split documents raw documents db chroma from documents documents openaiembeddings similarity search query what did the president say about ketanji
brown jackson docs db similarity search query print docs 0 page content tonight i call on the senate to pass the freedom to vote act pass the john lewis voting rights act and while you re at it pass the disclose act so americans can know who is funding our elections one of the most serious constitutional responsibilities a president has is nominating someone to serve on the united states supreme court and i did that 4 days ago when i nominated circuit court of appeals judge ketanji brown jackson one of our nation s top legal minds who will continue justice breyer s legacy of excellence retrievers query your data a retriever is an interface that returns documents given an unstructured query it is more general than a vector store a retriever does not need to be able to store documents only to return or retrieve it vector stores can be used as the backbone of a retriever but there are other types of retrievers as well let s walk through this in code documents loader load next we will split the documents into chunks from langchain text splitter import charactertextsplitter text splitter charactertextsplitter chunk size 1000 chunk overlap 0 texts text splitter split documents documents we will then select which embeddings we want to use from langchain embeddings import openaiembeddings embeddings openaiembeddings we now create the vectorstore to use as the index from langchain vectorstores import chroma db chroma from documents texts embeddings so that s creating the index then we expose this index in a retriever interface retriever db as retriever then as before we create a chain and use it to answer questions qa retrievalqa from chain type llm openai chain type stuff retriever retriever query what did the president say about ketanji brown jackson qa run query the president said that judge ketanji brown jackson is one of the nation s top legal minds a former top litigator in private practice a former federal public defender and from a family of public school educators and police officers he said she is a consensus builder and has received a broad range of support from organizations such as the fraternal order of police and former judges appointed by democrats and republicans agents agents involve an llm making decisions about which actions to take taking that action seeing an observation and repeating that until done langchain provides a standard interface for agents a selection of agents to choose from and examples of end to end agents the core idea of agents is to use an llm to choose a sequence of actions to take in chains a sequence of actions is hardcoded in code in agents a language model is used as a reasoning engine to determine which actions to take and in which order from langchain agents import tool tool def get word length word str int returns the length of a word return len word tools get word length from langchain agents import agentexecutor agent executor agentexecutor agent agent tools tools verbose true memory memory refers to persisting state between calls of a chain agent langchain provides a standard interface for memory a collection of memory implementations and examples of chains agents that use memory from langchain chat models import chatopenai from langchain prompts import chatprompttemplate messagesplaceholder systemmessageprompttemplate humanmessageprompttemplate from langchain chains import llmchain from langchain memory import conversationbuffermemory llm chatopenai prompt chatprompttemplate messages systemmessageprompttemplate from template you are a nice chatbot 
having a conversation with a human the variable name here is what must align with memory messagesplaceholder variable name chat history humanmessageprompttemplate from template question notice that we return messages true to fit into the messagesplaceholder notice that chat history aligns with the messagesplaceholder name memory conversationbuffermemory memory key chat history return messages true conversation llmchain llm llm prompt prompt verbose true memory memory notice that we just pass in the question variables chat history gets populated by memory conversation question hi ask your documents a target blank href https colab research google com drive 1ddphjmiffvws4gqqd9kyno yap26havk usp sharing img src https colab research google com assets colab badge svg alt open in colab a we can use different methods to chat with our documents no need to fine tune the whole llm instead we can provide the right context along with our question to the pre trained model and simply get the answers based on our provided documents 1 index phase our documents are divided into chunks extract embeddings per chunk and save into an embedding database such as chroma https www trychroma com 2 question answering phase given a question we use the embedding database to get similar chunks construct a prompt consisting of the question and the context and feed this to the llms and get our answers here we chat with this nice article titled transformers without pain https www linkedin com pulse transformers without pain ibrahim sobh phd asking questions related to transformers attention encoder decoder etc while utilizing the powerful palm https python langchain com docs get started introduction html model by google and the langchain https python langchain com docs get started introduction html framework for developing applications powered by language models load docs and construct the index urls https www linkedin com pulse transformers without pain ibrahim sobh phd loader webbaseloader urls index vectorstoreindexcreator embedding googlepalmembeddings text splitter recursivecharactertextsplitter chunk size 1000 chunk overlap 0 separators n from loaders loader qa retrieval qa retriever retrievalqa from chain type llm palm llm chain type stuff retriever index vectorstore as retriever input key question question what these documents are about answer the documents are about transformers which are a type of neural network that has been used successfully in natural language processing and computer vision tasks question what is the main idea of transformers answer the main idea of transformers is to use attention mechanisms to model long range dependencies in sequences question what is positional encoding answer positional encoding is a technique used to represent the order of words in a sequence question how query key and value vectors are used answer the query vector is used to compute a weighted sum of the values through the keys specifically q dot product all the keys then softmax to get weights and finally use these weights to compute a weighted sum of the values question how to start using transformers answer to start using transformers you can use the huggingface transformers library this library provides thousands of pretrained models to perform tasks on texts such as classification information extraction question answering summarization translation text generation etc in 100 languages you can try your own documents and questions ask almost everything txt pdf video etc in these simple tutorials how to get answers 
from text documents pdf files and even youtube videos using chroma https www trychroma com vector database palm https blog google technology ai google palm 2 ai large language model llm by google and a question answering chain from langchain https python langchain com docs get started introduction html finally use streamlit https streamlit io to develop and host the web application you will need to use your google api key you can get one from google the system architecture https blog streamlit io langchain tutorial 4 build an ask the doc app is as follows lc01 png images lc01 png ask youtube https askyoutube djl4xfut5yj streamlit app askyoutube png images askyoutube png ask pdf https askpdf 6ybdiy0fj3h streamlit app askpdf png images askpdf png evaluating llm based systems there s a difference between evaluating an llm versus evaluating an llm based system typically after generic pre training llms are evaluated on standard benchmarks glue https gluebenchmark com a benchmark of nine sentence or sentence pair language understanding tasks squad 2 0 https rajpurkar github io squad explorer a reading comprehension dataset consisting of questions posed by crowdworkers on a set of wikipedia articles where the answer to every question is a segment of text or span from the corresponding reading passage or the question might be unanswerable snli https nlp stanford edu projects snli a collection of 570k human written english sentence pairs manually labeled for balanced classification with the labels entailment contradiction and neutral etc llms systems can summarize text do question answering find the sentiment of a text can do translation and more based on the system evaluation can be as follows as a good proof of concept we can examine manually a few inputs and expected responses where we tune and build the system by trying different components prompt etc however the systems must be evaluated thoroughly create an evaluation dataset on our private data however this approach usually comes at a high cost llms evaluating llms use an llm to generate test cases and then evaluate the llm based system on them for example in the case of a question answering system we need pairs of questions and answers in our evaluation set we can use human annotators to create gold standard pairs of questions and answers manually however it is costly and time consuming one feasible way of creating such a dataset is to leverage an llm you are a smart assistant designed to come up with meaningful question and answer pair the question should be to the point and the answer should be as detailed as possible given a piece of text you must come up with a question and answer pair that can be used to evaluate a qa bot do not make up stuff stick to the text to come up with the question and answer pair when coming up with this question answer pair you must respond in the following format question your question here answer the answer here everything between the must be valid json please come up with a question answer pair in the specified json format for the following text text use an llm to find how well the prediction is compared to the true answer given two texts true and predicted answers an llm can in theory find whether they are semantically identical langchain has a chain called qaevalchain that can take in a question and true answer along with the predicted answer and output correct or incorrect labels moreover we can use standard metrics for evaluation such as recall precision and f1 score once we have an eval dataset a
hyperparameter optimisation approach makes sense and can be applied across different models prompts etc for more this article https wandb ai ayush thakur llm eval sweep reports how to evaluate compare and optimize llm systems vmlldzo0nzgymtqz provides an interactive look into how to go about evaluating your large language model llm systems ragas https github com explodinggradients ragas is a framework that helps you evaluate your retrieval augmented generation rag pipelines rag denotes a class of llm applications that use external data to augment the llm s context there are existing tools and frameworks that help you build these pipelines but evaluating it and quantifying your pipeline performance can be hard this is where ragas rag assessment comes in ragas png images ragas png ai agents ai agents use an llm to determine which actions to take and in what order to complete a task an action can either be using a tool and observing its output or returning a response to the user tools are functions that an agent calls examples of tools include apis databases search engines llms other agents etc the core idea of agents is to use an llm to choose a sequence of actions to take in chains a sequence of actions is hardcoded in code in agents a language model is used as a reasoning engine to determine which actions to take and in which order this code https python langchain com docs integrations toolkits csv shows how to use agents to interact with data in csv format it is mostly optimized for question answering aiagents01 png images aiagents01 png chatgpt plugins chatgpt plugins https openai com blog chatgpt plugins are tools designed to help chatgpt access up to date information run computations or use third party services chatgptplugins png images chatgptplugins png examples of extending the power of chatgpt by creating and editing diagrams via show me diagrams https www whatplugin ai plugins show me diagrams aiagentdigrams jpg images aiagentdigrams jpg by accessing the power of mathematics provided by wolfram https www wolfram com wolfram plugin chatgpt aiagentsmath png images aiagentsmath png by allowing you to connect applications services and tools together leading to automating your life the zapier plugin https zapier com blog announcing zapier chatgpt plugin connects you with 100s of online services such as email social media cloud storage and more aiagentszapier png images aiagentszapier png autogpt https github com significant gravitas auto gpt autonomously achieves whatever goal you set auto gpt is an experimental open source application showcasing the capabilities of the gpt 4 language model this program driven by gpt 4 chains together llm thoughts to autonomously achieve whatever goal you set llms for computer vision tbd further readings book speech and language processing daniel jurafsky https www amazon com speech language processing daniel jurafsky dp 0131873210 video natural language processing https youtu be iwea12eau6u list ploromvodv4rohcuxmzknm7j3fvwbby42z gentle introduction to statistical language modeling and neural language models https machinelearningmastery com statistical language modeling and neural language models
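As a closing sketch: the evaluation section above names LangChain's qaevalchain but does not show it in code. The snippet below is a minimal illustration of the "llms evaluating llms" idea, assuming the 2023-era LangChain API, an OpenAI key in the environment, and made-up question/answer records; it is a sketch of the idea, not a drop-in evaluation harness.

```python
# minimal sketch: grade a system's answers with an llm judge via langchain's qaevalchain
# (assumes 2023-era langchain and OPENAI_API_KEY set; the qa pairs below are made up)
from langchain.llms import OpenAI
from langchain.evaluation.qa import QAEvalChain

examples = [  # gold question/answer pairs (hypothetical)
    {"question": "what is the capital of egypt", "answer": "cairo"},
    {"question": "who released llama 2", "answer": "meta"},
]
predictions = [  # what the system under test actually answered (hypothetical)
    {"result": "the capital city of egypt is cairo"},
    {"result": "llama 2 was released by openai"},
]

eval_chain = QAEvalChain.from_llm(OpenAI(temperature=0))
graded = eval_chain.evaluate(
    examples, predictions,
    question_key="question", answer_key="answer", prediction_key="result",
)
for example, grade in zip(examples, graded):
    print(example["question"], "->", grade)  # one correct/incorrect style grade per pair
```

Once the grades are machine-readable, the standard recall/precision/f1 metrics mentioned above can be computed over them like over any labeled dataset.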
deeplearning generative-ai llms nlp transformers
ai
gospec-analyzer
go spec analyzer a tool for natural language processing of go language specification motivation go language specification is written in english and not all people on the earth understand english documentation easily this tool provides some features and data for better understanding of english sentences usage see makefile analyzed output output data is managed in another repo https github com dqneo gospec
ai
EAOT
multi modality agorabanner png https discord gg qutxnk2nmf connecting large language models with evolutionary algorithms yields powerful prompt optimizers agora s open source implementation of the paper connecting large language models with evolutionary algorithms yields powerful prompt optimizers paper link https arxiv org pdf 2309 08532 pdf installation pip install eaot citation bibtex misc 2309 08532 author qingyan guo and rui wang and junliang guo and bei li and kaitao song and xu tan and guoqing liu and jiang bian and yujiu yang title connecting large language models with evolutionary algorithms yields powerful prompt optimizers year 2023 eprint arxiv 2309 08532
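The readme above only documents installation, so the sketch below is a hypothetical toy of the paper's core loop (the LLM acting as the mutation/crossover operator over a population of prompts), not EAOT's actual API. `llm` and `score` are placeholder callables you must supply: `llm` maps a rewriting instruction to a new prompt, and `score` is the task fitness of a prompt.

```python
# hypothetical toy of "connecting llms with evolutionary algorithms"; not eaot's documented api
import random

def evolve_prompts(seed_prompts, llm, score, generations=10, population=8):
    """evolve a prompt population: keep the fittest, then let the llm
    combine/improve survivors to produce the next generation.
    requires at least two seed prompts so pairs can be sampled."""
    pool = list(seed_prompts)
    for _ in range(generations):
        # selection: keep the top half of the population by fitness
        survivors = sorted(pool, key=score, reverse=True)[: max(2, population // 2)]
        children = []
        while len(survivors) + len(children) < population:
            a, b = random.sample(survivors, 2)
            # the llm plays the role of the crossover/mutation operator, as in the paper
            children.append(llm(f"combine and improve these two prompts:\n1. {a}\n2. {b}"))
        pool = survivors + children
    return max(pool, key=score)
```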
artificial-intelligence gpt4 llama llama2 machine-learning prompt-engineering prompting
ai
line-liff-starter
this repository is no longer active this repository is no longer actively maintained please refer line line liff v2 starter https github com line line liff v2 starter for this repository successor liff starter app this is a small web application that demonstrates the basic functionality of the line front end framework liff https developers line me en docs liff overview prerequisites a channel on the line developers console https developers line me en docs liff getting started for your application a channel access token https developers line me en docs liff getting started preparing channel access token a heroku account https www heroku com deploying the application deploy https www herokucdn com deploy button svg https heroku com deploy template https github com line line liff starter 1 click the above deploy to heroku button 2 fill in the required information on the create a new app page in heroku 3 select deploy app and confirm that your app is successfully deployed 4 record the app url https heroku app name herokuapp com you will set this url when you add the app to liff adding the starter app to liff add the app to liff for more information see adding a liff app https developers line me en docs liff registering liff apps running the application 1 to run this application host these files on a web server 2 set your liff s entryurl to point to index html 3 open your liff in the line app trying it out to open the liff app within the line app follow the steps below 1 tap line app liffid on the chat screen of the line app liffid is the liff app id returned to the api request to add the app to liff 2 agree to grant the required permissions to the liff app 3 when opening the liff app the following four buttons and the content of received information are displayed open window opens https line me in the in app browser of the line app close window closes the liff app get access token gets the current user s access token get profile gets the current user s profile send messages sends a sample message on behalf of the user if the liff app is opened in the chat screen for api calls associated with the buttons see calling the liff api https developers line me en docs liff developing liff apps calling liff api for the received information see initializing the liff app https developers line me en docs liff developing liff apps initializing liff app checking logs to get more information you can check the logs of your app using heroku cli heroku cli 1 log in to heroku from the command line shell heroku login 1 check the logs shell heroku logs app heroku app name tail downloading and making changes to the starter app you can download the starter app to your local machine to test and make changes for yourself you can then deploy the app to a web server of your choice here we ll look at how to make and deploy changes to the heroku app you created in the previous step 1 make sure you have the following installed git https git scm com 1 clone the line liff starter https github com line line liff starter github repository shell git clone https github com line line liff starter git 1 cd into your git directory 1 add a remote for heroku to your local repository shell heroku git remote a heroku app name 1 make edits and commit changes optional shell git add git commit m first commit 1 push changes to heroku master shell git push heroku master console console heroku https www heroku com heroku cli https devcenter heroku com articles heroku cli
line javascript
front_end
getir.com-react-tailwind
getir com react tailwind front end https 1 bp blogspot com ry9fbe gjpo ysiubktttqi aaaaaaaaotm v1tk1cmmsymqpkyr1nopy iduu1doiibwclcbgasyhq s1206 ekran 2bresmi 2b2021 08 27 2b10 27 27 png as a one night side project i built the getir com homepage with react and tailwind it can be developed further perhaps on youtube we will try to code this again together with all its pages as a full example hoping you stay tuned demo https getir react tailwind netlify app https getir react tailwind netlify app
front_end
adversarial-prompts
adversarial prompts curation of prompts that are known to be adversarial to large language models additionally tracks examples of these adversarial prompts working in the wild prompts prompts are stored in the prompt folder in txt files the format of the prompt should be description description of the adversarial prompt should be easily readable and properly capitalized model name of the model s this attack works against e g text davinci 002 if works against multiple models should be a comma separated list with no spaces attack type type of attack known types of attacks are evasion or leakage prompt full text of the prompt should be pasted here on the line below the prompt header in the wild the following is a collection of adversarial prompts working in the wild remoteli io s twitter bot https arstechnica com information technology 2022 09 twitter pranksters derail gpt 3 bot with newly discovered prompt injection hack google s lamda off topic https simonwillison net 2022 sep 18 michelle m defenses the following is a collection of proposed defenses k shot prompt for non instruct model or create your own finetune https twitter com goodside status 1578278974526222336 s 20 t 3umzb7ntyhwak3qlpkmabw update evasion against k shot prompt https twitter com povgoggles status 1578289474530086918 s 20 t 3umzb7ntyhwak3qlpkmabw
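Since the readme specifies the record fields (description, model, attack type, prompt) but the exact delimiters are not shown here, below is a hedged sketch of a loader for the prompt folder. It assumes `field: value` header lines with the full prompt text on the lines below the `prompt:` header; adjust to match the real files.

```python
# sketch of a parser for the prompt/*.txt format described above (delimiters assumed)
from pathlib import Path

def parse_prompt_file(path):
    record, prompt_lines, in_prompt = {}, [], False
    for line in Path(path).read_text(encoding="utf-8").splitlines():
        if in_prompt:
            prompt_lines.append(line)
        elif line.strip().lower().startswith("prompt"):
            in_prompt = True  # everything after this header is the prompt body
        elif ":" in line:
            key, value = line.split(":", 1)
            record[key.strip().lower()] = value.strip()
    record["prompt"] = "\n".join(prompt_lines).strip()
    if "model" in record:  # comma-separated list with no spaces, per the readme
        record["model"] = record["model"].split(",")
    return record

# usage: iterate over the prompt folder (folder name taken from the readme)
for txt in sorted(Path("prompt").glob("*.txt")):
    print(parse_prompt_file(txt))
```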
ai
CV_Learning
cv learning learning computer vision and image processing digital image processing course projects including some projects in the digital image processing course 01 histogram equalization pro some algorithms which are based on histogram equalization 02 retinex implement and compare different retinex algorithms like ssr msr msrcr etc 03 haze night image enhancement implement the dark channel algorithm to enhance haze and night images 04 um aum implement and compare unsharp masking and adaptive unsharp masking algorithms 3d vision course projects including some projects in the 3d vision course 01 de noising using the rof rudin osher fatemi algorithm and chambolle s method to solve the de noising task 02 panorama using sift homography ransac algorithms to synthesize three images and get a panorama image 03 ar system a simple implementation of an ar system 04 3d structure recover 3d structure recovery based on two image views 05 final personal project select a paper about the five point algorithm and compare it with seven point and eight point algorithms other projects including some projects from other courses or the internet image alignment this assignment is from the coursera deep learning in computer vision course face alignment a practice project for face alignment edge detection canny edge detection algorithm implemented in python3 histogram equalization learning histogram equalization
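For reference, the histogram equalization the projects above build on can be reproduced in a few lines with OpenCV; this sketch is not taken from the repo, and "input.jpg" is a placeholder file name.

```python
# minimal reference sketch of histogram equalization with opencv (not from this repo)
import cv2

gray = cv2.imread("input.jpg", cv2.IMREAD_GRAYSCALE)

# global histogram equalization: spreads intensities over the full range
equalized = cv2.equalizeHist(gray)

# adaptive variant (clahe): equalizes per tile, with a clip limit to curb noise amplification
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
adaptive = clahe.apply(gray)

cv2.imwrite("equalized.jpg", equalized)
cv2.imwrite("clahe.jpg", adaptive)
```

Comparing the two outputs is a quick way to see why the adaptive variant is usually preferred on images with strong local contrast differences.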
computer-vision image-processing 3d-vision
ai
Mobile-Web-Development-Books
mobile web development books
front_end
Data-base-programming-with-PHP-and-MySQL
data base programming with php and mysql university of jaffna faculty of engineering ec5070 database systems lab 2
server
IIT
a2 group 58 introduction to information technology assignment 2 team profile team name this is it members steven paul jones s3549218 steven has been working towards a bachelor in it for 4 years has a love for programming and would like to achieve a position as a full stack developer in the future currently working as a manufacturing manager the skills and knowledge that have been gained from rmit have enabled steven to solve many of his company concerns by implementing various web applications other hobbies include ironman events and any endurance sports ryan mansbridge s385198 hello my name is ryan i am 19 and from melbourne my hobbies include basketball coaching video games and animation my interest in it started when i was in primary school when i took apart my first computer ever since then i have had a passion for how computers run as well as how games are made and programmed the past couple of months i have been working as a printer which uses a lot of different computer systems and machines joshua rogers s3540958 joshua has been working in the sports tech field for the past four years his current role of digital project manager has allowed him to manage large scale project builds for both web native apps platforms working in this field is where his passion for it has stemmed from after spending multiple years on the product management side of things he has found a keen interest to grow his it technical skills this has spurred him to commence his bachelor of it in free time josh s hobbies include camping playing australian rules football and cooking a nice meal steven jones full stack developer full stack developer sits a little below average shown in figure 1 required skills programming skills javascript sql php html css collaboration git team work javascript and sql are very highly demanded skill sets as detailed in figure 2 a full stack developer needs a broad range of skills but it was interesting to see php html and css were not included in figure 2 and git was relatively low for what i would have perceived figure 2 is also supported by the below link to an interesting hays article https www hays com au report information technology 16659 personally my ideal job has not changed after the above insight i see that demand and pay are not the highest for this discipline but i already understand the rewards and thoroughly enjoy it ryan mansbridge graphic designer the required skills are it skills graphic design generic skills creativity team work presentation skills the required skills for my ideal job are pretty basic graphic design skills are just above average when it comes to the demand in that skill as seen in figure 2 when it comes to the demand for generic skills they all sit just above the halfway mark my ideal job as a graphic designer hasn t changed even though the demand for the job is the second highest on the graph seen in figure 1 the skills that are needed for the job are very easy to learn and i think the end result of becoming a full time graphic designer would be extremely rewarding joshua rogers full stack developer required skills programming skills javascript sql php html css collaboration git team work the required skills for my ideal job you will find are required for the majority of it jobs although you generally will not need the whole suite of skills required like you do as a full stack developer if you are going to be a more specific fe or be developer then you will only need around 40 60 of the skills needed demand for a full stack developer is not that
high most companies are tending to lean towards having developers who suit a specific field rather than having the capability to attend to all facets of development my current programming skills are very limited and i have a lot of work study time required to be committed before i will even be able to consider my ideal job it work video 1 https www youtube com watch v 11mqfyl1gfm this it professional is a system engineer he looks after physical servers and cabling in data centres as well as looking after and managing virtualization machines he states that he sometimes works with networkers at data centres as well as working with storage engineers at data centres this it professional states that he spends most of his time in data centres looking after the server rooms and network devices he works with the network engineer to network all of the servers and devices he states that there are a couple of different levels of system engineers senior system engineers and architectural engineers video 2 https www youtube com watch v 5kas2jbobuy this it professional is a software engineer she works on writing mobile apps that focus on the productivity and social side of things some apps she has worked on are patron and quick books she states that she works with product managers and designers she spends most of her time building and coding the software for the apps as well as constantly having meetings with the team to make sure everyone is on the same page and up to date the most challenging part for her is that coding an app takes quite a bit of time because they are always finding new problems and errors in the software video 3 https www youtube com watch v iafqgmprj g this it professional is a cybersecurity analyst who works for inmarsat which is a company that uses satellite telecommunications which offers a global mobile service his job is to do a number of things from telling colleagues about password complexity to data loss mitigation as well as trying to protect technical information relating to assets in space he safeguards all information about aviation marine and land customers using computers to see spikes in network traffic or increases in failed authorizations he states that he interacts with pretty much everyone who works at inmarsat he spends most of his time in a big room with all his other colleagues looking at all different types of computers graphs and data the most challenging part for him is that the threats for his work are only going to get worse as it gets better video 4 https www youtube com watch v kqwgs7vbmku this it professional is a graphic designer she has worked at a couple of different companies as a graphic designer a graphic designer is someone who is a creative problem solver and her job is to find a creative solution for whatever job she has been asked to do the people she works with are mainly customers who need something to be designed for a t shirt or designs for movie posters and packaging the most challenging part for her was training new employees on how to design on a computer using adobe illustrator and photoshop she spends most of her time in her office at work working on new designs for the company she works at as well as helping the stylist come up with the style of whatever projects they have to do video 5 https www youtube com watch v pvubvo6h8oc this it professional is a noc engineer which stands for network operations centre he looks after all monitoring of the offices as well as computer work like graphing and displays for power to cooling to
network activity some work activities he does is check all the mtu units which would help run the data centre if there was a power cut another main job they have to do is check to see if all the ats and ups are all running correctly to keep the data centre running well he works with data centre service technicians as well as data centre service managers to make sure the data centres are all running correctly video 6 https www youtube com watch v twfge bhmcm this it professional is an sdm or service delivery manager at a data centre his role is dealing with operations within the data centre and bridging the technical gap between the noc commercial as well as dealing with all other management teams throughout the entire facility as stated before he works with pretty much all team members in the facility as well as giving new employees a tour of the facility and how it all runs and what their job will entail he also deals with customers when they have things to drop off like new server racks and new computers he is moving around all the time going into different rooms to check up on things so he doesn t really have an exact spot he spends all of his time video 7 https www youtube com watch v jcojl6zsxk this it professional is a web designer the type of work he does is making websites for his clients but making sure he uses all the client s design ideas and requirements to make the best website the first thing he does with a new client is wireframing which basically just shows him how everything is going to fit together so that he knows what to do from there he designs the website using photoshop and illustrator he works with coders who code the website sites he also works with user experience team members and user interface designers to make sure they make the perfect website for their clients he spends pretty much all of his time sitting in his office designing the website and talking to his other colleagues about how they are going to make the website the hardest part of the job for him is to please all clients because some clients are extremely picky and want to do too much video 8 https www youtube com watch v leox1ehxhnm this it professional is a software developer the main five things he does at work are building a feature fixing bugs research communicating with all types of people from clients to colleagues and finally fixing production issues or production emergencies when it comes to him making a new future for software he does all the coding from scratch which takes a very long time to figure out how to get the job done he works tightly with product managers and business analysts to make sure he is doing his part of the job correctly or he needs some help on some stuff he spends a lot of his time in meetings with all his colleagues or sitting at his desk working on whatever task he has been given by his managers the hardest part of his job is dealing with production issues which affect the system which is a major problem that needs to be fixed straight away but this can take some time video 9 https www youtube com watch v f3up3v4sjgm this it professional is a disney artist and animator she works in an office will all her other colleagues some of her daily tasks are meetings with all other animators about the new episodes and what they will have to design she mainly designs backgrounds for disney t v shows painting them in adobe illustrator and using photoshop if need be some people she works with are all the other artists and animators at disney as well as product managers art 
directors and production coordinator who are assigned for the new episodes she spends most of her time in her cube with no light so she can see all the colours fully and design the artwork to its full potential video 10 https www youtube com watch v srec90j6lty this it professional is a ux designer for aj and smart some of the daily tasks she undertakes are wireframes to mock up how a user experience might look like in a web interface or a mobile app in depth user research which basically means they observe the users in their natural environment as well as doing ethnographic research where you immerse yourself in the field some people she works with are website developers to make the user experience the best it can be for the website as well as interacting with the customers themselves making sure their experience was good and what they could work on she spends most of her time in her office or when in depth research is needed for a specific project she will go off into the field of whatever research needs to be done it technologies machine learning what does it do machine learning is a subgroup or application of artificial intelligence from a high level view it is complex algorithms that have the ability to learn and improve independently without being explicitly programmed some good examples of day to day applications that make use of machine learning are virtual personal assistants siri alexa google are becoming very popular as in home hubs as virtual personal assistants these assistants make use of machine learning by analysing the information and queries given by the users and using them to predict future enquiries and tailor preferences to your profile social media services we should all be familiar with the numerous social media applications available but may not be so versed in the background machine learnings most social media use machine learning to personalise news feed ads targeting or even making suggestions on friends and clubs that you may have been evaluated to be interested in in the coming years we would expect to see the advent of driverless vehicles on the road these are already in development and pose a revolution in the transport industry the technology that is enabling this is computer vision computer vision is a field of machine learning that trains computers to interpret and understand the visual world they can use images from cameras and videos and deep learning models in some industries computer vision has improved so much that it has exceeded the ability of humans computer vision is accelerating fast using machine learning techniques what is the likely impact in the advent of driverless vehicles a new era in human life shall unfold there are both benefits and disadvantages some of which are discussed below reduction in traffic accidents and death motor accidents shall be greatly reduced by the introduction of driverless vehicles machines shall not fatigue and tire lose concentration and shall in time exceed human ability to detect hazards and potential accidents this is an enormous benefit to society reduction in property damage flow on effects from the reduction in traffic accidents will be the reductions in property damage personal vehicles and other damage caused by the vehicle this also will benefit society in terms of individual capital reduction in energy costs from the reduction in the total number of vehicles required reduction in congestion and the learning ability of the vehicles to drive as efficiently as possible overall energy costs will be reduced
reduction in traffic congestion driverless vehicles shall have less need for traffic regulating devices such as traffic lights and ramps etc as they shall easily be able to merge and avoid other traffic through networking abilities and as such shall reduce congestion reduction in pollution less traffic less congestion and efficient driving will result in less pollution to the environment unemployment a disadvantage of driverless vehicles is that they shall make transporters of good and taxiing services redundant vulnerability to hackers vehicles being subject to attacks is a great concern with driverless vehicles and strict measures must be taken prior to launch to avoid this potential threat privacy issues privacy is a concern customers will not want their travels recorded and potentially used and careful privacy policies will need to be created to avoid potential ethical dilemmas collapse of some auto industries and insurance industries the reduction in the number owned vehicles will have a massive effect on the automotive industry this would cascade through their entire supply chain again contributing to unemployment how does it affect you for students i do not see that this would affect our daily routines a great deal cloud cloud services servers what does it do the cloud in simple terms is the delivery of computing services including servers storage software databases and analytics over the internet offering faster innovation resource economies of scale majority of the time you only pay for the cloud services you use helping you lower your costs run your infrastructure and more efficiently scale your business needs cloud computing can be broken into three main types public cloud public clouds are managed operated by third party cloud service providers they deliver computing resources such as storage and servers via the internet all the hardware software and other supporting frameworks are owned and run by the provider private cloud private cloud is cloud computing assets used completely by a single business private clouds can be physically on a company s data centre companies can also pay third party service providers to host their private cloud a private cloud is one in which the services and infrastructure are maintained on a private network hybrid cloud hybrid clouds merge both private and public clouds combining them by technology that allows data applications to move between both types of clouds public private this gives business greater flexibility and more deployments options cloud computing can also be broken down into three services as well software as a service saas software as a service is the method of delivering software via the internet they are typically delivered on a subscription basis and in a way that any maintenance can be handled by patches software updates users can access these applications generally over the internet via web browser phone or tablet infrastructure as a service iaas iaas is the most straightforward category of cloud computing services with infrastructure as a service you are renting the infrastructure servers virtual machines operating systems iaas is generally working on a pay as you go basis platform as a service paas paas refers to cloud computing services that provide environments to building developing software with the goal in mind of aiming to make it easier for developers to create websites or mobile applications without needing to worry about managing the infrastructure of servers storage network and databases needed for development what 
is the likely impact cloud services have had a significant impact on businesses of all sizes as it has continued to grow it has helped businesses and individuals with the following cost savings the cloud has allowed companies to save on costs due to reduction in hardware cost instead of purchasing in house equipment hardware needs are managed by the vendor labour maintenance as a result of the hardware being owned by vendors and stored in off site locations there is less demand for in house it staff if servers or other hardware need repairs or upgrades it is the responsibility of the vendor and doesn t cost your company any time or money productivity in addition to the outright labour savings cloud computing can be extremely cost effective for enterprises because of the increase in workforce productivity the deployment of cloud software is notably faster than a standard installation instead of the weeks or months that a standard company wide installation may take cloud software deployment can happen in a matter of hours efficiency cloud computing allows a business to communicate more easily beyond the older more traditional methods the key benefit enabling team members to work and collaborate on data in the same document at the same time agility cloud computing is giving users access to a broader range of technologies and allowing people to have the ability to obtain better resources needed to build almost anything how does it affect you cloud computing has got to the stage where it profoundly affects almost all humans on a daily basis whether that s a uni student using google documents to collaborate on a group project to primary school students teachers using virtual desktop programs due to our current landscape and even your grandma who is using facebook natural language processing and chatterbots what does it do natural language processing nlp is software that can turn unstructured data such as a large volume of text into usable structured data using algorithms and pattern matching the principal objective of this technology is to assist in deciphering human language into more valuable data sets this technology has been around since the mid 90s although like almost all tech it continues to evolve like practically all tech we have seen in the past couple of years natural language processing using both speech to text deep learning in such devices as google alexa transitioning nlp from a tool that is used to crunch large volumes of unstructured data for business to a tool that is now used in households the biggest hurdle that nlp currently faces is a shortage of training data because nlp is a diversified field with many different tasks most of these tasks contain specific datasets in which only a few thousand have human labelled training examples however the implementation of deep learning based nlp training models is assisting in closing the gap in this data researchers are developing a wide span variety of techniques for training language representation models they are doing this by using a tremendous amount of unannotated text on the web this is known as pre training with the use of these new nlp training processes there has been a significant increase in benchmark testing for evaluating nlp systems at a range of tasks spanning logic common sense understanding these significant increases in nlp learning in combination with advances in machine learning are seeing massive progress in the capabilities of devices such as google alexa in built ios functionality siri although we see significant advances in capability for
above mentioned google alexa siri nlp is also being used for software such as grammar spelling checks grammarly machine translation google translate message chat bots customer service automation as technology progresses nlp will continue to advance in combination with machine learning other technology we are already seeing smarter chatbots that are using both deep learning nlp to analyse human response better respond accordingly and this will only get better and better a great example of this is spotify s release radar playlists for users what is the likely impact natural language processing is an amazing tool that has the ability to impact a wide demographic of people in a very positive way from assisting travellers in foreign countries using applications that scan foreign language and instantly translate to your own to now being able to read sign language and with the assistance of ngl communicate to people who do not know a single sign language signal nlp has the ability to affect everybody in some way although the industries fields that will be most affected by the continual progression of nlp at this stage seem to be the financial legal healthcare systems all of these fields deal with a large volume of text that they need to manually read and translate into usable and non usable data whether its extracting meaning from content data and document collections all at breakneck speed or analysing reams of medical records and identifying hidden correlations between symptoms diagnoses medications and treatments these are all tasks that nlp can now assist in saving hours of manpower time and even sometimes costs on nlp will not put jobs in these fields at risk although it will drastically transform some of the training skills required to perform nlp will also have a similar impact on the rest of the job market it shouldn t cause a bulk loss of jobs although it will certainly change the required training and skill set of many only really having a potentially negative effect on the lower tier customer service jobs that now require minimal training these are using nlp can now be managed by chatbots more more how will this affect you currently nlp has a significant effect on most people s daily lives in which most people would not even realise from your emails filtering what is spam and what isn t to predictive text on your mobile device that allows you to type at a fast pace and not have to worry about small spelling grammar errors the impact that nlp has on people s daily lives will only continue to grow for the majority of the population as we make advances in nlp nlg machine learning raspberry pis arduinos makey makeys and other small computing devices what does it do raspberry pi is the name of a series of single board computers made by the raspberry pi foundation a uk charity that aims to educate people in computing and create more convenient access to computing education the raspberry pi launched in 2012 and there have been several iterations and variations released since then the tiny single board computer is designed to help people learn to program despite its small size and simple design it can be used for a range of tasks as diverse as emulating classic computer games or acting as a bluetooth receiver for you old speakers there is such a wide range of projects people can create with the raspberry pi linux is the operating system that runs on raspberry pis the latest raspberry pi 4 has a quad core 1 5ghz 64 bit arm power plant the entry level model contains 1gb of ram with the ability 
to spend more and get 2gb or 4gb other changes for the latest model include stepping up to hdmi 2 0 with hdr usb 3 0 bluetooth 5 0 and usb c power apart from the extra power one of the most significant changes in the switch to twin 4k micro hdmi video ports the ability to connect two screens at once is convenient for programmers who want to edit code on one screen while seeing the results on another due to the nature of what the raspberry pi tech is there is no direct trajectory that we expect this hardware to get to within the future although what we do know is that it will continually progress and update its hardware as time goes on just like it has from the early raspberry pi models in 2012 to the current model now what is the likely impact the raspberry pi has been hugely impactful on people of all ages worldwide selling over 10 million units the raspberry pi has given people the opportunity to learn coding for an affordable price from classrooms libraries research laboratories peoples studies and the industrial environment the most significant impact the raspberry pi has had on people is it has encouraged them to experiment with computers and allow them to know that you don t have to have years years of programming experience to try your hand at it and achieve something great the raspberry pi has had no adverse effect on the job market if anything it has assisted with upskilling junior it employees as they look to learn transition further into their it careers how will this affect you the raspberry pi will not affect the majority of the population s daily life for some the raspberry pi will not be part of their entire life although for an individual who works studies it they are almost guaranteed that they will cross paths with it at some stage for those who interact with a raspberry pi during their daily life it will most likely be built coded by them in a way that can assist making a part of their everyday life more efficient like syncing your house s lighting system with your google alexa so it s voice controlled project ideas summary note that this is a continuation from an idea steve jones concepted in his ethics module that we thought was good to look at driverless motorcycle dm will offer residents of melbourne and surrounding areas a commuting taxiing luxury service via an autonomous self driving motorcycle our motorcycle s shal be electric powered with a range before recharge of 160 km customers shall be able to book via and a web page or application whereby they can see the estimated pickup time trip duration prior to paying through the online portal customers shall also be able to order their required helmet size the service shall not require passengers to have a current motorcycle license as our motorcycles shal be self balancing and self driving so no decision making is required by our passengers dm s services has an enormous potential and its success will be attributed to the following dm s offer budget rates when compared to other public transports rates such as taxis and uber as we have the benefit of no driver expenses we also minimise cost by taking advantage of electric power which is a fraction of the cost of combustible fuels dm s offer quicker transit times as our motorcycles shall take advantage of motorcycle lanes dm s electric bikes also compliment environmental requirements by minimizing the pollution seen with conventional transports dm s short term goal is to initially launch in melbourne and surrounding areas this non nationwide rollout will minimise 
initial investment until a measure of success and use of our services can be fully appreciated in melbourne we are anticipating 2630171 trips per week with an expected 8 9km average distance travelled pending the success of this melbourne launch dm s future plans would be to then consider a nationwide roll out github project https joshrogers1512 github io iit feedback refer to spark group reflection josh what could be improved the critical thing we could have improved on was after joining the group kicking off a group discussion earlier and setting up a plan in the end this had no adverse effect on the group s productivity cohesion although it would have given us a little more breathing room at least one surprising thing i think how easy our group was able to make decisions and delegate tasks there was zero conflict or issues when planning or completing the work it was surprising compared to previous experiences at least one thing that you have learned about groups not all group projects are hard and require a strong leader to guide the project we were able to achieve what we wanted with civil group conversations decisions ryan what went well i think the whole project went really well and and all of us worked good with each other there was no conflict between us which made it easier for us to get project done what could be improved the main thing i think we could have improved was maybe start things off a bit earlier it took us a while to finally get into a group meeting where we could all discuss what we had to do and plan the whole project at least one surprising thing the main thing that surprised me was the fact that we all came up with a decision really quickly with no conflict at all we all worked really well with each other one thing i have learnt about groups the main thing i learnt about groups from this project was that if everyone is on the same page and there is no conflict group projects can run extremely smoothly and another thing i learnt was that not you don t need to have a leader all the time you should be able to make a group decision steve what went well the use of the online coloberate software was great google docs hangouts slack and discord the team members have also been both resourceful and supportive what could be improved i found the allocation to groups was not well described we lost a lot of time and my time outside of the group just trying to figure out how to get in a group at least one surprising thing although we did not have the full amount of time due to point two i was surprised at how quickly the document came together one thing i have learnt about groups if team members have and understand a common goal then work will come together easily and in good quality i have seen in other groups i have been in this was not shared between the members in one instance this resulted in a very poor java program working in this team was refreshing and efficient
server
handsfree
div align center p presenting p p with support from a href https glitch com handsfreejs glitch com a the a href https www cmu edu cfa studio index html studio at cmu a the a href https youtu be cjdpf4xuiey t 58 school of ai a and you p br p img src https media giphy com media 3z15ve7weqgkla1fwc giphy gif alt handsfree js p br h1 handsfree js h1 p a wrapper library around web based computer vision models for the purpose of interacting with the web handsfree p p img class mr 1 src https img shields io github release pre handsfreejs handsfree svg img class mr 1 src https img shields io github last commit handsfreejs handsfree svg img src https img shields io github repo size handsfreejs handsfree svg p p img class mr 1 src https img shields io github issues raw handsfreejs handsfree svg img src https img shields io github issues pr raw handsfreejs handsfree svg p p powered by a href https github com jeeliz jeelizweboji jeeliz weboji a p div br br br quickstart html doctype html head require dependencies which adds handsfree to global namespace link rel stylesheet href https unpkg com handsfree 5 0 4 dist handsfreejs handsfree css script src https unpkg com handsfree 5 0 4 dist handsfreejs handsfree js script head body script create a new instance use one instance for each camera const handsfree new handsfree create a simple plugin that displays pointer values on every frame when using only 1 instance handsfree context handsfree use consolelogger pointer context console log pointer context head rotation context head morphs start tracking handsfree start script body br br br div align center img src https i imgur com qkyyazg gif width 250 h1 learn how to use handsfree js h1 a href https github com handsfreejs handsfree wiki config config a middot a href https github com handsfreejs handsfree wiki methods methods a middot a href https github com handsfreejs handsfree wiki properties properties a middot a href https github com handsfreejs handsfree wiki head the head a middot a href https github com handsfreejs handsfree wiki plugins plugins a middot a href https github com handsfreejs handsfree wiki events events a middot a href https github com handsfreejs handsfree wiki classes classes a h1 tutorials h1 div getting started https dev to heyozramos handsfree js a web based face pointer 24m1 learn how to setup handsfree js and access the handsfree pointer properties controlling a youtube 360 video https dev to heyozramos controlling youtube 360 videos handsfree 2801 learn how to access and use handsfree head rotation to control a 360 video the rest of this document is for running handsfree js org and the handsfree js library locally see the above wiki links for how to actually use handsfree js in your own app br br br local development a note about this codebase this codebase is currently broken into two parts the library itself located in src assets handsfree handsfree js handsfree js org which is everything else this really should be two separate repositories but for now just know that the library itself starts in src assets handsfree handsfree js to run this project locally you ll need nodejs https nodejs org en download and the yarn package manager https yarnpkg com en docs install windows stable after downloading this project repo you ll then need to install dependencies by running yarn in the project s root directory then you ll have the following commands available bash start a local dev environment on localhost 8080 yarn start build for production into dist sandbox yarn build deploy to 
handsfree js org yarn deploy create handsfree js in dist handsfreejs yarn bundle deploy script running yarn deploy will commit everything inside of dist to the gh pages branch of the repository set in package json deploy repo using the package json deploy url custom domain this lets you quickly deploy this repository to github pages creating the handsfree js library when you run yarn start yarn build or yarn deploy what you re actually doing is running or building the development environment to create a single handsfree js script for use within your own projects do the following install parcel https parceljs org on your system globally with yarn global add parcel bundler run npm bundle the files will be built into dist handsfree license attributions handsfree js is available for free and commercial use under apache 2 0 http www apache org licenses license 2 0 html models jeeliz weboji https github com jeeliz jeelizweboji apache license 2 0 face and head pose estimation art monkey logo adaption https www designevo com apps logo name cute monkey and interesting gaming br br br special thanks a very special thanks goes out to golan https twitter com golan for inviting me out to his studio the studio for creative inquiry at carnegie mellon http studioforcreativeinquiry org during the spring of 2019 https www flickr com photos creativeinquiry albums 72157703188612302 it was during this residency that i was encouraged to begin integrating handsfree js into different libraries and where i had a chance to use handsfree js with a real ur5 robot another special thanks goes out to anildash https twitter com anildash for sponsoring me during winter 2018 also a thank you to the school of ai https twitter com schoolofaioffic for the 2018 fellowship https www youtube com watch v cjdpf4xuiey t 58 and a very special thanks to jess holbrook https twitter com jessscon from google pair for driving all the way way out to meet me and helping kickstart this all with a new computer thanks also to everyone who s supported me on patreon gofundme and through twitter over the months and almost years and thanks everyone else for believing in this project
ai
devops-engineering
devops engineering div id top div span img src https img shields io badge python ffd43b style for the badge logo python logocolor blue img src https img shields io badge go 00add8 style for the badge logo go logocolor white img src https img shields io badge docker 2ca5e0 style for the badge logo docker logocolor white img src https img shields io badge kubernetes 326ce5 svg style for the badge logo kubernetes logocolor white img src https img shields io badge nginx 009639 style for the badge logo nginx logocolor white img src https img shields io badge apache d22128 style for the badge logo apache logocolor white img src https img shields io badge apache kafka 231f20 style for the badge logo apache kafka logocolor white img src https img shields io badge mysql 005c84 style for the badge logo mysql logocolor white img src https img shields io badge redis cc0000 svg style for the badge logo redis logocolor white img src https img shields io badge postgresql 316192 style for the badge logo postgresql logocolor white img src https img shields io badge google cloud 4285f4 style for the badge logo google cloud logocolor white img src https img shields io badge linux fcc624 style for the badge logo linux logocolor black span this project is geared towards the proper understanding of certain technologies and tools that are prevalent in the community of developers as well as explore all the tools and technologies using for devops engineering roadmap x version control git x yaml files serialization concept yaml data types anchors yaml xml json automation tools go networking x osi and tcp ip models ip ports x tcp and udp x http x tls x bandwidth x network devices hub switch bridge router x proxy servers nat x dns rate limiting x web servers security x denial of service linux os concepts x redundant array of inexpensive independent disks x power on self test basic input output system x processes threads x synchronous operations x asynchronous operations x multithreaded operations x multiprocessing operations x inter process communication x mutex x semaphores interrupts context switches system calls x manual x terminal x commands x performance commands x file system x software ecosystem special files message queues pub sub x message queues pub sub apcahe kafka rabbitmq docker x containers x names x environment variables x port mapping x detachment x interactive mode x networks x volumes x bind mounts images x basic concepts x docker hub x dockerfiles x docker image size decreasing techniques multi stage builds docker compose x basic concepts docker compose file examples docker swarm x basic concepts examples important commands containers x images x volumes x networks x compose swarm miscelleaneous ci cd x github actions jenkins kubernetes databases postgresql x rdbms er schema diagrams relational database design techniques postgresql design principles mysql important querying techniques query practice sql clauses mysql design principles redis x redis design principles google cloud platform x google cloud digital leader certification preparation gcp commonly used commands coursera courses x google cloud fundamentals core infrastructure x essential google cloud infrastructure foundation system design low level design common design principles x decoupling solid principles x single responsibility principle open close principle creational gof patterns high level design databases x terminologies x indexes acid theorem base theorem x cap theorem x replication partitioning sharding x rest apis x percentile 
tail latencies history x discussion x microservices monoliths rpc design examples x snakes and ladders tic tac toe. Package managers: to download and install various dependencies and software efficiently we need package managers; no matter what environment one is in, it is fundamental to have at least one: brew (mac, linux), pacman (linux), apt (linux), scoop (windows, good for development needs), chocolatey (windows). All referenced tools: 1. dive https github com wagoodman dive, a tool to reduce docker image size by layer-by-layer analysis; 2. docker slim https github com docker slim docker slim, a tool to securely reduce docker image size and gain insight into an image's properties and data; 3. docker hub https hub docker com, the official public container repository for docker.
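Several of the items on the roadmap above (processes, inter-process communication, mutexes, message queues) fit in one small, self-contained sketch. This example is illustrative only and not taken from this repository:

```python
# A minimal sketch (not from this repo) of two OS concepts on the roadmap:
# inter-process communication over a queue, and a mutex protecting shared state.
from multiprocessing import Lock, Process, Queue, Value

def producer(q):
    for i in range(5):
        q.put(i)      # hand work to the other process via an IPC queue
    q.put(None)       # sentinel: tells the consumer to stop

def consumer(q, total, lock):
    while (item := q.get()) is not None:
        with lock:    # the mutex serialises access to the shared counter
            total.value += item

if __name__ == "__main__":
    q, lock = Queue(), Lock()
    total = Value("i", 0)   # an int living in shared memory
    procs = [Process(target=producer, args=(q,)),
             Process(target=consumer, args=(q, total, lock))]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print(total.value)      # 0+1+2+3+4 = 10
```

The same producer/consumer shape scales up to the Kafka- or RabbitMQ-style pub/sub systems mentioned in the roadmap; only the transport changes.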
cloud
iml
badges start tic https github com christophm iml workflows tic badge svg branch master https github com christophm iml actions cran status badge http www r pkg org badges version iml https cran r project org package iml cran downloads http cranlogs r pkg org badges grand total iml https cran r project org package iml codecov io https codecov io github christophm iml coverage svg branch master https codecov io github christophm iml branch master doi http joss theoj org papers 10 21105 joss 00786 status svg https doi org 10 21105 joss 00786 badges end iml iml is an r package that interprets the behavior and explains predictions of machine learning models img src https github com christophm iml blob master man figures iml png raw true align right height 140 it implements model agnostic interpretability methods meaning they can be used with any machine learning model features feature importance partial dependence plots individual conditional expectation plots ice accumulated local effects tree surrogate localmodel local interpretable model agnostic explanations shapley value for explaining single predictions read more about the methods in the interpretable machine learning https christophm github io interpretable ml book agnostic html book tutorial start an interactive notebook tutorial by clicking on this badge binder http mybinder org badge svg http beta mybinder org v2 gh christophm iml master filepath notebooks tutorial intro ipynb installation the package can be installed directly from cran and the development version from github r stable version install packages iml development version remotes install github christophm iml news changes of the packages can be accessed in the news file https christophm github io iml news index html quickstart first we train a random forest to predict the boston median housing value how does lstat influence the prediction individually and on average accumulated local effects r library iml library randomforest data boston package mass rf randomforest medv data boston ntree 50 x boston which names boston medv model predictor new rf data x y boston medv effect featureeffects new model effect plot features c lstat age rm man figures readme unnamed chunk 2 1 png contribute please check the contribution guidelines contributing md citation if you use iml in a scientific publication please cite it as molnar christoph giuseppe casalicchio and bernd bischl iml an r package for interpretable machine learning journal of open source software 3 26 2018 786 bibtex tex article molnar2018iml title iml an r package for interpretable machine learning author molnar christoph and casalicchio giuseppe and bischl bernd journal journal of open source software volume 3 number 26 pages 786 year 2018 license 2018 2022 christoph molnar https christophm github io the contents of this repository are distributed under the mit license see below for details the mit license mit permission is hereby granted free of charge to any person obtaining a copy of this software and associated documentation files the software to deal in the software without restriction including without limitation the rights to use copy modify merge publish distribute sublicense and or sell copies of the software and to permit persons to whom the software is furnished to do so subject to the following conditions the above copyright notice and this permission notice shall be included in all copies or substantial portions of the software the software is provided as is without warranty of any kind express or implied 
including but not limited to the warranties of merchantability fitness for a particular purpose and noninfringement in no event shall the authors or copyright holders be liable for any claim damages or other liability whether in an action of contract tort or otherwise arising from out of or in connection with the software or the use or other dealings in the software funding this work is funded by the bavarian state ministry of education science and the arts in the framework of the centre digitisation bavaria zd b
ai
firehose-vagrant-rails5
webdev setup with vagrant. One of the most difficult things about getting started with web development is installing all the programs in your environment. We're going to run a series of commands to provision the environment; during the process you shouldn't worry too much about why you're doing what you're doing, just know that these are the steps it takes to provision a web development environment using Vagrant, which many students continue to use even after graduation. First, a little about passwords (read all of this): sometimes you will be prompted for a password (your computer login password, user account, etc.) when you're installing programs using the terminal window. When you start typing the password, it will look like nothing has been typed, i.e. nothing will display on the screen as you type. That's totally fine: your password is being entered, just not shown to you, so keep typing and press enter when you're done. In case you type your password wrong, it will prompt you to type it in again. Why? So nobody can look over your shoulder and see your password or know how many characters your password has. Run through the guide for your machine: windows web dev setup windows md, mac web dev setup mac md
front_end
data-vis-lessons
data vis lessons drag and drop demo bi bug issue echarts echarts
front_end
content-store
content store: content-based file system, designed as embedded middleware. npm package version https img shields io npm v content store ts svg maxage 3600 https www npmjs com package content store ts. Background: inspired by IPFS, but kept simpler. Use cases in mind: a videos backup system (I teach with screencasts), a photo album backend for an online service, a storage backend for ZeroNet. License: this project is licensed with the BSD 2-clause license. This is free, libre, and open source software; it comes down to four essential freedoms (ref https seirdy one 2021 01 27 whatsapp and the domestication of users html fnref 2): the freedom to run the program as you wish, for any purpose; the freedom to study how the program works, and change it so it does your computing as you wish; the freedom to redistribute copies so you can help others; and the freedom to distribute copies of your modified versions to others.
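To illustrate the core idea behind a content-based store (addresses derived from content hashes, as in IPFS), here is a conceptual sketch in Python; this is not the npm package's API, just the underlying technique:

```python
# Conceptual sketch of content-addressed storage (the idea behind content-store
# and IPFS); this is not the npm package's API, just an illustration.
import hashlib
from pathlib import Path

STORE = Path("store")

def put(data: bytes) -> str:
    """Store a blob under the SHA-256 of its content and return that address."""
    digest = hashlib.sha256(data).hexdigest()
    STORE.mkdir(exist_ok=True)
    (STORE / digest).write_bytes(data)  # same content always maps to same file
    return digest

def get(address: str) -> bytes:
    """Fetch a blob by its content address."""
    return (STORE / address).read_bytes()

addr = put(b"hello world")
assert get(addr) == b"hello world"
print(addr)  # deterministic for this content:
# b94d27b9934d3e08a52e52d7da7dabfac484efe37a5380ee9088f7ace2efcde9
```

Because the address is derived from the content, identical files are stored only once, which is what makes the approach attractive for the backup use cases listed above.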
datastore content-addressable-storage filesystem embedded
os
udagram-frontend
udagram frontend
cloud
Ionic-IOT-Monitor
ionic mqtt eclipse paho client. image https raw githubusercontent com arjunsk ionic iot monitor master ionic mqtt min jpg. Use case scenario: the app's primary objective is to monitor elderly people's health condition remotely and periodically. In fact, we wanted to integrate more sensors into the device, like temperature, GPS, etc. This could also be integrated with an artificial pacemaker for patients. In a sense, this is a starter project that can be adapted to wider scopes. Features: 1. publish and subscribe to topics; 2. maintain a database of topics; 3. visualise the data in the form of a spline chart. For the Arduino code, visit esp8266 pubsubclient cloudmqtt https github com arjunsk esp8266 pubsubclient cloudmqtt. Rate it on the Ionic market place http market ionic io starters ionic mqtt eclipse paho. Read the following article: ionic mqtt client build using paho, part 4 http www arjunsk com iot ionic mqtt client build using paho part 4; also read the complete tutorial series for IoT http www arjunsk com tag iot. Video tutorial: ionic iot monitor https img youtube com vi 0qia6a9vzy 0 jpg https www youtube com watch v 0qia6a9vzy
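Since the app subscribes to sensor topics over MQTT via the Eclipse Paho protocol family, a small test publisher can exercise it without the ESP8266 hardware. A hypothetical sketch using the Python Paho client; the broker address and topic name are made-up placeholders, not values from this repo:

```python
# Hypothetical test publisher (not part of this repo): feeds fake temperature
# readings to an MQTT broker so the app's subscription and spline chart can be
# exercised without the ESP8266. Assumes `pip install "paho-mqtt<2"` and a
# broker on localhost:1883; the topic name is made up for the example.
import json
import random
import time

import paho.mqtt.client as mqtt

client = mqtt.Client()
# client.username_pw_set("user", "password")  # uncomment for CloudMQTT
client.connect("localhost", 1883)
client.loop_start()  # run the network loop in a background thread

try:
    for _ in range(10):
        reading = {"temperature": round(random.uniform(36.0, 38.5), 1)}
        client.publish("health/temperature", json.dumps(reading))
        time.sleep(2)
finally:
    client.loop_stop()
    client.disconnect()
```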
paho iot eclipse-paho ionic mqtt
server
crypto_api_data_modelling_with_postgreSQL
crypto api data modelling with postgresql: a data modelling project that pulls data from a public crypto API and stores it in different tables in a PostgreSQL database. Overview: the data is pulled from https pro api coinmarketcap com v1 cryptocurrency listings latest; the API key can be retrieved at https pro coinmarketcap com account. Database and tables: I created a database in PostgreSQL named db crypto; the following tables are created under this database. The crypto all api data table consists of the complete dataset (all rows and columns) pulled from our API; this is our fact table. The crypto analytics table consists of columns that might be helpful in drawing insights and creating visualisations and reporting. The crypto bitcoin, crypto ethereum, crypto tether, and crypto solana tables consist of all details regarding bitcoin, ethereum, tether, and solana respectively. Project files: 'pulling data from api and creating a csv file from it' contains code for getting data from the API in JSON format and saving it as a CSV file; 'creating a database in postgresql and using the csv data to create a table' contains code for connecting to the database, creating tables, and inserting data from the CSV file; 'sqlqueries' consists of SQL queries for creating the various tables, copying data from the CSV file, and inserting data into each cryptocurrency table from our fact table. How to run: run 'pulling data from api and creating a csv file from it.ipynb' to pull data and create a CSV file, then run 'creating a database in postgresql and using the csv data to create a table.ipynb' to connect to our database and create and insert data into our tables. I worked on this project using Jupyter notebooks; similar results can be achieved with PyCharm or any other IDE. Errors and issues: in Jupyter Notebook you might encounter an 'iopub data rate limit exceeded' error; to fix this, launch your notebook from the command prompt with jupyter notebook --NotebookApp.iopub_data_rate_limit=1e10. Importing CSV file data into the database can give permission and privilege errors; I made read access on the CSV file available to everyone, which is not the best solution.
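To make the pipeline concrete, here is a condensed sketch of what the two notebooks do end to end; the analytics table schema, column subset, and connection credentials below are placeholders rather than the repo's exact code:

```python
# Condensed, illustrative version of the two notebooks; the table schema,
# column subset, and credentials below are placeholders, not the repo's code.
import psycopg2   # pip install psycopg2-binary
import requests

API_URL = "https://pro-api.coinmarketcap.com/v1/cryptocurrency/listings/latest"
HEADERS = {"X-CMC_PRO_API_KEY": "YOUR_API_KEY"}  # key from pro.coinmarketcap.com/account

# 1) pull the latest listings as JSON
coins = requests.get(API_URL, headers=HEADERS).json()["data"]

# 2) load a few analytics-friendly columns into PostgreSQL
conn = psycopg2.connect(dbname="db_crypto", user="postgres",
                        password="postgres", host="localhost")
cur = conn.cursor()
cur.execute("""CREATE TABLE IF NOT EXISTS crypto_analytics (
                   name TEXT, symbol TEXT, price NUMERIC, market_cap NUMERIC)""")
for coin in coins:
    usd = coin["quote"]["USD"]
    cur.execute("INSERT INTO crypto_analytics VALUES (%s, %s, %s, %s)",
                (coin["name"], coin["symbol"], usd["price"], usd["market_cap"]))
conn.commit()
cur.close()
conn.close()
```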
server
promptulate
h1 align center promptulate h1 p align center a target blank href img src https img shields io github license undertone0809 promptulate svg style flat square a a target blank href img src https img shields io github release undertone0809 promptulate all svg style flat square a a target blank href img src https bestpractices coreinfrastructure org projects 3018 badge a a target blank href img src https static pepy tech personalized badge promptulate period month units international system left color grey right color blue left text downloads week a p english readme md readme zh md p align center img src https zeeland bucket oss cn beijing aliyuncs com images promptulate logo new png p what is promptulate promptulate ai focuses on building a developer platform for large language model applications dedicated to providing developers and businesses with the ability to build extend and evaluate large language model applications promptulate is a large language model automation and application development framework under promptulate ai designed to help developers build industry level large model applications at a lower cost it includes most of the common components for application layer development in the llm field such as external tool components model components agent intelligent agents external data source integration modules data storage modules and lifecycle modules with promptulate you can easily build your own llm applications envisage to create a powerful and flexible llm application development platform for creating autonomous agents that can automate various tasks and applications promptulate implements an automated ai platform through six components core ai engine agent system apis and tools provider multimodal processing knowledge base and task specific modules the core ai engine is the core component of the framework responsible for processing and understanding various inputs generating outputs and making decisions the agent system is a module that provides high level guidance and control over ai agent behavior the apis and tools provider offers apis and integration libraries for interacting with tools and services multimodal processing is a set of modules for processing and understanding different data types such as text images audio and video using deep learning models to extract meaningful information from different data modalities the knowledge base is a large structured knowledge repository for storing and organizing world information enabling ai agents to access and reason about a vast amount of knowledge the task specific modules are a set of modules specifically designed to perform specific tasks such as sentiment analysis machine translation or object detection by combining these components the framework provides a comprehensive flexible and powerful platform for automating various complex tasks and applications features large language model support support for various types of large language models through extensible interfaces dialogue terminal provides a simple dialogue terminal for direct interaction with large language models role presets provides preset roles for invoking gpt from different perspectives long conversation mode supports long conversation chat and persistence in multiple ways external tools integrated external tool capabilities for powerful functions such as web search and executing python code key pool provides an api key pool to completely solve the key rate limiting problem intelligent agent integrates advanced agents such as react and self ask 
empowering llm with external tools autonomous agent mode supports calling official api interfaces autonomous agents or using agents provided by promptulate chinese optimization specifically optimized for the chinese context more suitable for chinese scenarios data export supports dialogue export in formats such as markdown hooks and lifecycles provides agent tool and llm lifecycles and hook systems advanced abstraction supports plugin extensions storage extensions and large language model extensions quick start quick start official documentation https undertone0809 github io promptulate current development plan https undertone0809 github io promptulate other plan contribution developer s guide https undertone0809 github io promptulate other contribution faq https undertone0809 github io promptulate other fqa pypi repository https pypi org project promptulate to install the framework open the terminal and run the following command shell script pip install u promptulate get started with your helloworld using the simple program below python import os from promptulate llms import chatopenai os environ openai api key your key llm chatopenai answer llm hello how is everything going print answer for more detailed information please refer to the quick start official documentation https undertone0809 github io promptulate architecture currently promptulate is in the rapid development stage and there are still many aspects that need to be improved and discussed your participation and discussions are highly welcome as a large language model automation and application development framework promptulate mainly consists of the following components agent more advanced execution units responsible for task scheduling and distribution llm large language model responsible for generating answers supporting different types of large language models memory responsible for storing conversations supporting different storage methods and extensions such as file storage and database storage framework framework layer that implements different prompt frameworks including the basic conversation model and models such as self ask and react tool provides external tool extensions for search engines calculators etc hook lifecycle hook system and lifecycle system that allows developers to customize lifecycle logic control role presets provides preset roles for customized conversations provider provides more data sources or autonomous operations for the system such as connecting to databases img src https zeeland bucket oss cn beijing aliyuncs com images 20230704180202 png design principles the design principles of the promptulate framework include modularity scalability interoperability robustness maintainability security efficiency and usability modularity refers to the ability to integrate new components models and tools conveniently using modules as the basic unit scalability refers to the framework s capability to handle large amounts of data complex tasks and high concurrency interoperability means that the framework is compatible with various external systems tools and services allowing seamless integration and communication robustness refers to the framework s ability to handle errors faults and recovery mechanisms to ensure reliable operation under different conditions security involves strict security measures to protect the framework its data and users from unauthorized access and malicious behavior efficiency focuses on optimizing the framework s performance resource utilization and response time to ensure a smooth 
and responsive user experience usability involves providing a user friendly interface and clear documentation to make the framework easy to use and understand by following these principles and incorporating the latest advancements in artificial intelligence technology promptulate aims to provide a powerful and flexible application development framework for creating automated agents community feel free to join the group chat to discuss topics related to large language models llm if the link is expired you can notify the author through an issue or email div style width 250px margin 0 auto img src https zeeland bucket oss cn beijing aliyuncs com images 20231007020407 png div contributions i am currently exploring more comprehensive abstraction patterns to improve compatibility with the framework and the extended use of external tools if you have any suggestions i welcome discussions and exchanges if you would like to contribute to this project please refer to the current development plan https undertone0809 github io promptulate other plan and contribution developer s guide https undertone0809 github io promptulate other contribution i m excited to see more people getting involved and optimizing it
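For convenience, here is the quickstart from above as a runnable script; the import path, class name, and calls are exactly the ones shown there, with casing and punctuation restored:

```python
# Runnable version of the quickstart shown above (casing/punctuation restored).
import os

from promptulate.llms import ChatOpenAI

os.environ["OPENAI_API_KEY"] = "your key"

llm = ChatOpenAI()
answer = llm("Hello, how is everything going?")
print(answer)
```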
chatgpt gpt-4 langchain llm python prompt prompt-engineering chatbot pne promptulate
ai
linear
linear linear img http mikaa123 github io linear website images screenshot png linear is a ruler application for mac made with web development in mind it sits in your menu bar and doesn t get in your way here are a few highlights linear is not a browser extension browser extensions are really useful but fall short when inspecting the dom of a document since they often inject overlay elements themselves uses px or em values set the default font size to your convenience create multiple rulers duplicate a ruler preserving its height and width really useful when checking margins default themes adapt to dark and light backgrounds system wide shortcuts to hide show rulers fine tune a ruler s positions using arrow keys hold shift for faster move thanks to gtagle https github com gtagle show center guides thanks to radiovisual https github com radiovisual fully customizable css themes thanks to radiovisual https github com radiovisual here is a video https www youtube com watch v vcozn5lwlew action share to get you started linear is proudly built with electron https github com atom electron installing download the latest build https github com mikaa123 linear releases or install via homebrew thanks to goronfreeman https github com goronfreeman brew cask install linear caveats linear uses a transparent resizable frameless window which is rather experimental this is why mac is the only supported platform at the moment customizable themes creating your own linear theme is as simple as adding a custom css file to linear s theme directory linear will use the name of your custom css file as the name of your theme follow these steps to get you started 1 copy the universal css file and rename it to your new theme name e g my awesome theme css 2 edit your custom css file to get the look you want 3 save your custom css to linear s theme directory users username linear themes 4 open or restart the linear application 5 select your new theme by clicking on the theme icon on the bottom right corner of your ruler 6 you can also set your new theme to the default theme in the settings menu tip linear will titlecase your css files to generate your theme name for example a file named my awesome theme css will be seen in linear as my awesome theme contributing any contribution is welcome in fact you ll receive an instant hug for doing one linear was built as a side project and is a little rough around the edges so even bug reports would be great development after you ve cloned or forked the repo you ll need to install the dependencies like electron etc npm install then to start the app npm start to start in debug mode attaches chrome developer tools npm debug license mit the ux shop http www theuxshop com
front_end
CIS430
cis430 information technology infrastructure
server
Beibei_Du_IDS721_Projet3
beibei du ids721 projet3: cloud based big data systems project. Use a major big data system to perform a data engineering related task. Big idea plan: I was trying to see if I could replicate work similar to this https huggingface co andite pastel mix [screenshot]. I found out this was not too feasible and decided to preprocess a dataset using the Rust language instead. I decided to do food image classification; here is the reference for the data: https www kaggle com datasets dansbecker food 101. I planned to pre-train the model in Python and implement it in Rust; however, my MacBook doesn't support PyTorch, so I did it on GitHub Codespaces. I decided to save this idea for my final project. I found another dataset about college enrollment and decided to do analysis on it: https www kaggle com datasets michaelbryantds university enrollments dataset. Flowchart of this project: [image]. Rust setup: curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh, then source "$HOME/.cargo/env" and cargo new ids721-project3-0322. AWS SageMaker setup: 1. sign in to your AWS account; 2. open the AWS management console and navigate to the SageMaker service; 3. choose a name for your notebook instance and select an instance type; 4. under IAM role, choose to create a new role and select SageMaker as the service that will use this role; 5. choose the policies that provide the necessary permissions for your use case; 6. click on the create role button to create the new IAM role. Process: 1. getting data from Kaggle: navigate to the dataset on Kaggle, download the data in a format that can be easily read by SageMaker (CSV format), and store the data in an S3 bucket that SageMaker can access. 2. using SageMaker to do machine learning: set up a Jupyter notebook instance in SageMaker, load the data from your S3 bucket into the notebook, explore and preprocess the data as necessary, train a machine learning model using SageMaker's built-in algorithms or by bringing your own algorithm, and evaluate the model's performance, adjusting as necessary. [screenshot] Demo of how to do it in SageMaker: [screenshot]. 3. presenting your work on GitHub using LaTeX. Modeling results: 1. scatterplot [image]; 2. residual plot [image]. Reference: 1. https www kaggle com code odednir multiclass food classification using tensorflow; 2. https github com nogibjj rust mlops template blob main pytorch rust example src main rs; 3. https www kaggle com datasets michaelbryantds university enrollments dataset
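As a sketch of step 2 above: reading the enrollment CSV from S3 with pandas and fitting a simple regression, which is the kind of model the scatter and residual plots suggest. The bucket name and column names are placeholders; the actual dataset's columns may differ:

```python
# Illustrative notebook cell; bucket, key, and column names are placeholders.
# Runs in a SageMaker notebook whose IAM role grants S3 read access
# (pandas reads s3:// paths directly when s3fs is installed).
import matplotlib.pyplot as plt
import pandas as pd
from sklearn.linear_model import LinearRegression

df = pd.read_csv("s3://my-bucket/university_enrollments.csv")

X = df[["year"]]       # placeholder feature column
y = df["enrollment"]   # placeholder target column

model = LinearRegression().fit(X, y)
fitted = model.predict(X)

# residual plot, like the one shown above
plt.scatter(fitted, y - fitted)
plt.axhline(0, linestyle="--")
plt.xlabel("fitted values")
plt.ylabel("residuals")
plt.show()
```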
cloud
bedrock
bedrock bedrock lets you set up a production ready webapp in under 10 minutes why should you use it three primary reasons 1 you want a production ready node server in under 5 minutes 2 you want user authentication password resets and more to be setup for you 3 you want react flux and react router enabled with hot loading https github com gaearon react hot loader for super fast development go to documentation site https tilomitra github io bedrock watch a video showing how to get a production ready web app with user authentication set up in 5 minutes using bedrock https www youtube com watch v eduuhdbhfdo features overview an sails express server with user authentication built in hushed auto generated rest api for all your models signup login reset password pages smtp email support server side rendered pages client side rendered components using react new react hot loading enabled to allow super fast development of react components without page refreshes boom star communication between react and server side api with flux client side routing with react router incremental builds using webpack facilitated through grunt migrations to help coordinate database changes production ready such as api access tokens csrf protection cors and more support for multiple environments dev stage prod quickstart we will talk about the installation in more detail in the next section but here is the quickstart guide git clone git github com tilomitra bedrock git project name cd project name npm install then open config connections js and update your database connection details then run the migrations to create the relevant database tables run migrations grunt db migrate up then simply start everything start servers npm start detailed installation and setup bedrock is the starting point of your node application to install you should clone the project and then build on top of it git clone project name cd project name npm install configure database connection go into config connections js update the mysql connection details at this point you may need to create a new database if you want to use a different database postgresql or mongo remove the mysql connection and create a new connection as shown in the comments in the file migrate database tables bedrock sets up authentication for your server and creates login signup and reset password pages it uses passportjs to accomplish this to facilitate this you need to run a migration to create the users and passports table just run this from the command line grunt db migrate up after it runs check your database and you should see users and passports table created we will talk more about migrations in the best practices section run servers in development mode to start developing run npm start this will start up 3 things an express server on port 1337 this is the primary nodejs server a webpack dev server on port 3000 any static assets will be served by webpack in development mode this allows for hot module replacement assets will be built by webpack and a watch task will be started any css and js changes will trigger a new incremental build run servers in production mode in production mode you ll want to build your assets ahead of time to do this run grunt buildprod this will build your assets and store it in the www directory then you can run your server in production mode node env production sails lift this will start up the nodejs server in production mode if you want this to be a long lived background task consider using a node process manager like pm2 http pm2 
keymetrics io server side features bedrock is built on sails http sailsjs org so it has all of the great features that sails ships with this includes but is not limited to it s an express server under the hood so all express modules still work reusable security policies middleware configurable via a global config object encourage use of reusable services like a mailer service waterline orm that works well with multiple databases can be swapped out if you need auto generate rest apis optional follows mvc conventions which keeps your code organized as your project grows check out all the features http sailsjs com features on the sails documentation page i ve used raw express and i ve used sails and i will take sails everytime guys client side features bedrock lets you build reusable components using react and call its api via flux actions pages are linked together using react router here s how it works at a high level 1 user navigates to a page 2 page route triggers react router to display a component and execute certain flux actions 3 flux actions trigger api requests 4 api responses trigger flux stores to change 5 flux stores change automatically re renders react components that are watching the store it is simple one way communication that scales to hundreds of components we have 100 components in our internal fork of bedrock more documentation learn more about bedrock on node web apps http nodewebapps com here are some useful tutorials learn how to set up bedrock http nodewebapps com 2016 12 20 create a web app with user authentication in under 10 minutes learn how to set up reset password flow in bedrock http nodewebapps com 2016 12 21 how to send emails in sails bedrock learn how to use migrations to make your database more stable http nodewebapps com 2016 12 20 how to update database schema for a production web app sign up to the node web apps newsletter to stay up to date on new tutorials built with bedrock is composed of these open source frameworks sails http sailsjs com sails makes it easy to build custom enterprise grade node js apps it is designed to emulate the familiar mvc pattern of frameworks like ruby on rails but with support for the requirements of modern apps data driven apis with a scalable service oriented architecture react https facebook github io react a javascript library for building user interfaces react router https github com reacttraining react router react router is a complete routing library for react nuclearjs https github com optimizely nuclear js traditional flux architecture built with immutablejs data structures very similar to redux webpack https webpack github io a module bundler for front end assets it is used in bedrock for chunking javascript files to be loaded on demand and hot reloading react hot loader https github com gaearon react hot loader tweak react components in real time without page refreshes
javascript sails react flux webpack starter-kit
front_end
IMD
intelligent mobile development slides androiddevkotlin https github com walkman617 androiddevkotlin assignments assignments md assignments md final report final report https docs qq com doc dynzhexzlsufyynbg qa qa md qa md kotlin https kotlinlang org kotlin is a cross platform statically typed general purpose programming language with type inference kotlin is designed to interoperate fully with java and the jvm version of kotlin s standard library depends on the java class library but type inference allows its syntax to be more concise develop android apps with kotlin https developer android com kotlin write better android apps faster with kotlin kotlin is a modern statically typed programming language used by over 60 of professional android developers that helps boost productivity developer satisfaction and code safety android studio https developer android com studio android studio provides the fastest tools for building apps on every type of android device machine learning for mobile developers https developers google com ml kit ml kit brings google s machine learning expertise to mobile developers in a powerful and easy to use package
front_end
wuxt
wuxt logo assets wuxt png raw true wuxt nuxt wordpress development environment wuxt wuxt combines wordpress the worlds biggest cms with nuxt js the most awesome front end application framework yet quick start quick introduction intro architecture env getting started start custom container configuration start custom setup wordpress setup wp setup nuxt js setup nuxt generate and deploy deploy wordpress rest api endpoints ep extensions to the api endpoints epp front page epp front menus epp menu slugs epp slugs meta queries epp meta taxonomy queries epp tax geo queries epp geo custom post types epp cpt scripts scripts working with the containers scripts containers wp cli and yarn scripts containers tools scaffolding scripts scaffolding links links credits cred quick start a name quick git clone https github com northosts wuxt git cd wuxt docker compose up d http localhost 3080 install php http localhost 3080 install php install wordpress http localhost 3080 wp admin options permalink php http localhost 3080 wp admin options permalink php set permalinks to post name http localhost 3080 wp admin themes php http localhost 3080 wp admin themes php activate wuxt theme http localhost 3000 http localhost 3000 done introduction a name intro the goal of wuxt is to provide a ready to use development environment which makes the full power of wordpress easily available to your front end app included in wuxt are fully dockerized wordpress and nuxt js container configuration docker compose up d sets up everything needed in one command and you can start working extended rest api to give the front end easy access to meta fields featured media menus or the front page configuration the newest nuxt js version extended with a wordpress wp object to connect to the extended wordpress rest api all together the wuxt features get you started with your front end with just one command you just need to work with the intuitive wordpress admin interface and can skip all back end coding but if you know your way around wordpress you are able to easily extend the back end as well the wuxt architecture a name env wuxt environment assets wuxt env png raw true wuxt environment getting started a name start first clone this repository to a directory you want then change to that directory and simply start your containers you need to have a running docker installation of course docker compose up d that starts the following containers mysql mysql wuxt database for your wordpress installation the data folder of the database container is mirrored to the db folder of your host system to keep the data persistent wordpress wp wuxt on a apache server with the newest php version and the wuxt rest api extension theme acf and other good to have plugins pre installed the wp content directory of the wordpress directory is mirrored to the wp content directory on your host nuxt js front wuxt started in development mode with file monitoring and browser sync and extended by a complete wordpress rest api wrapper and a starter application mimicing base functions of a wordpress theme your containers are available at front end http localhost 3000 back end http localhost 3080 http localhost 3080 wp admin database docker exec ti mysql wuxt bash starting with custom container configuration a name start custom wuxt allows you to change the above setup to run multiple projects at the same time or to adjust to your own environment to change ports and container names add an env file in the same directory of your docker compose yml file you can adjust the 
following values: WUXT_PORT_FRONTEND (default 3000), WUXT_PORT_BACKEND (default 3080), WUXT_PORT_DIST (default 8080), plus the container names WUXT_MYSQL_CONTAINER (mysql wuxt), WUXT_WP_CONTAINER (wp wuxt), and WUXT_NUXT_CONTAINER (front wuxt). After you've created the file, run docker-compose down and docker-compose up -d. There is even a script that does everything for you: npm run env. You will be asked for a project name, front end port, back end port, and dist port; the script creates an env file like the following and stops the old containers: WUXT_PORT_FRONTEND=4000, WUXT_PORT_BACKEND=4080, WUXT_PORT_DIST=9080, and the container names mysql projectname, wp projectname, and front projectname. After running the script, start the new containers: docker-compose up -d.

Setup WordPress. In short: install WordPress http localhost 3080 install php, set permalinks to post name http localhost 3080 wp admin options permalink php, and activate the wuxt theme http localhost 3080 wp admin themes php. Do a common WordPress installation at http localhost 3080 install php, then log in to wp admin and select the wuxt theme to activate all the API extensions. Additionally, you might want to activate the ACF plugin to make working with meta values easier. Last but not least, you have to set the permalink structure to post name in the WordPress settings. To check that everything is running, visit http localhost 3080 and verify that the wuxt info screen is showing, then check that the REST API at http localhost 3080 wp json is returning a JSON object. You are good to go.

Setup Nuxt.js. Nuxt should have been started automatically inside the Docker container; the command we use for running the Nuxt.js server is yarn dev. Check that the front end is running by opening http localhost 3000; you should be greeted by the wuxt intro screen. Check that browsersync is running by making a minor change to the front page; the change should be directly visible on the front page without manually reloading it.

Generate and deploy. To fully deploy wuxt you have to get at least one server and solve a lot of configuration to get WordPress, MySQL, and Nuxt.js running; we are working on a manual for some of the big cloud services. However, there is an easier solution, at least for sites without on-site user-generated content like WordPress comments (Disqus would be OK though): we have tweaked the Nuxt.js generate command so that you can generate a fully static site, with all your content, posts, and pages, inside the dist directory of Nuxt. Then it's only a matter of uploading the static HTML site to a webspace of your choice. Generating a fully static site: first be sure your containers are running (docker-compose up -d), then go to the wuxt root directory and run generate with yarn: yarn generate. Besides generating your static site, this command runs some docker-compose actions, so while generating, your wuxt site will be down; it is started again afterwards. After the generation, a small local web server is started which serves the static site on http localhost 8080. To get the files, go to the dist directory inside your wuxt nuxt directory (wuxt nuxt dist). To shut down the local web server, run the following command inside the wuxt directory: docker-compose -f dist.yml down. How it works: the generate command simply scrapes all available URLs you added to your Nuxt.js and WordPress installation, saves the HTML output, and caches the JSON requests to the WordPress REST API. To know which URLs are available, the generate command asks the WordPress REST API for a list of existing
endpoints and all links used in the wordpress menus you can view that list with the following endpoint get wp json wuxt v1 generate since nuxt js doesn t fully support 100 static sites yet we have to get help from the static plugin used on nuxt org which is just taking care of the payload caching read more here https github com nuxt rfcs issues 22 and here https github com nuxt nuxtjs org tree master modules static wordpress rest api endpoints a name ep the wordpress rest api gives you access to a wide range of native endpoints find the docs at https developer wordpress org rest api reference https developer wordpress org rest api reference to easily access the endpoints from nuxt js you can use the wp extension which integrates the node wpapi https www npmjs com package node wp library you can find the full documentation here https github com wp api node wpapi extensions to the api endpoints a name epp to make wuxt even easier to use there are a bunch of endpoint extensions to the wordpress rest api a short fetch sketch of these endpoints follows at the end of this readme front page a name epp front wp frontpage wp frontpage embed or get wp json wuxt v1 front page get wp json wuxt v1 front page embed you can use the wordpress front page settings to build your front end s first page if you set up the front page in wordpress as a static page the endpoint will return the corresponding page object if there is no front page configured the query automatically returns the result of the default posts query get wp json wp v2 posts note that the embed parameter works for the front page query which gives you access to featured media post thumbnails author information and more menus a name epp menu wp menu wp menu location location or get wp json wuxt v1 menu get wp json wuxt v1 menu location location the wordpress rest api does not provide an endpoint for menus by default so we added one we have also registered a standard menu with the location main which is returned as a complete menu tree when you request the endpoint without parameters don t forget to create a menu and add it to a location in wp admin when you want to use this endpoint if you want to use multiple menus you can request them by providing the menu location to the endpoint slugs a name epp slugs wp slug name post or page slug wp slug name post or page slug embed or get wp json wuxt v1 slug post or page slug get wp json wuxt v1 slug post or page slug embed the wordpress rest api does not provide an endpoint to get posts or pages by slug that doesn t mirror the wordpress theme default behaviour where the url slug can point to either a page or a post with the slug endpoint we add that function which first looks for a post with the given slug and then for a page the embed parameter works for the slug endpoint meta fields a name epp meta the wordpress rest api does not include meta fields in the post objects by default for two of the most common plugins acf and yoast wordpress seo we have automatically added the values of these fields they are located in the meta section of the response objects taxonomy queries a name epp tax taxonomy queries are limited by the simple wordpress rest api url structure especially with filtering queries we struggled with the missing relation parameter in queries for posts by taxonomy we added this feature with a new parameter to the wordpress api get wp json wp v2 posts categories 1 2 and true note setting the relation to and will cause all taxonomy queries to use it right now you can t query one taxonomy with and and another with or in nuxt you just have to use the and param
after a post query for categories wp posts categories 1 2 param and true geo queries a name epp geo if your application has to get posts by geographical proximity you can use the geo parameters get wp json wp v2 posts coordinates lat lng distance distance the coordinates parameter has to contain lat and lng comma separated and each value can be prefixed with a meta key if it has to be compared with keys other than the defaults lat lng the distance is calculated in kilometers postfix the value with m for miles some example queries get wp json wp v2 posts coordinates 52 585 13 373 distance 10 get wp json wp v2 posts coordinates lat mkey 52 585 lng mkey 13 373 distance 10 get wp json wp v2 posts coordinates 52 585 13 373 distance 10m custom post types a name epp cpt the wordpress rest api provides endpoints for custom post types as long as they are registered the right way see the scaffolding section for generating cpt definitions to make querying of your custom post types as easy as everything else we added the cpt method to the wp object see post type queries for a fictional movies post type below wp cpt movies wp cpt movies id 7 the cpt function returns cpt objects similar to the posts or pages queries meta fields are included scripts a name scripts to help you with some of the common tasks in wuxt we integrated a bunch of npm scripts just install the needed packages in the root directory and you are ready to run npm install working with the containers a name scripts containers working with docker is awesome but has some drawbacks one of them is that you have to make some changes from inside the container to enter the wuxt containers you can use the following npm scripts npm run enter mysql npm run enter wp npm run enter front you exit a container with exit wp cli and yarn a name scripts containers tools two of the most common tasks are managing wordpress and installing new packages in the front end wuxt provides you with the full power of the wp cli tool check out all documentation at https developer wordpress org cli commands https developer wordpress org cli commands to run any wp cli command inside the wp wuxt container just use the following npm script npm run wp wp cli command examples npm run wp plugin list npm run wp plugin install advanced custom fields npm run wp user create wuxt me wuxt io we use the same concept for yarn in the front container npm run yarn yarn command example npm run yarn add nuxt webfontloader the commands check whether the containers are running and install needed dependencies automatically so if wp cli is not installed in the container it will be installed before running a wp command scaffolding a name scripts scaffolding wuxt allows you to generate custom post types and taxonomies via npm scripts you can pass the needed parameters as arguments if you don t pass arguments you will be prompted scaffolding a post type npm run scaffold cpt name examples npm run scaffold cpt npm run scaffold cpt movie the custom post type definition is copied into the cpts folder of the wuxt theme and loaded automatically by the theme to query the new post type you can use the cpt method of the wuxt wp object scaffolding a taxonomy npm run scaffold tax name post types examples npm run scaffold tax npm run scaffold tax venue event cafe the taxonomy definition is copied into the taxonomies folder of the wuxt theme and loaded automatically by the theme links a name links wuxt headless wordpress api extensions https wordpress org plugins wuxt headless wp api extensions plugin which includes
all our api extensions nuxt wordpress wuxt https www danielauener com nuxt js wordpress wuxt introduction post for wuxt credits a name cred yashha https github com yashha wp nuxt commits author yashha for the excellent idea with the wp object first implemented in https github com yashha wp nuxt https github com yashha wp nuxt
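to make the extended endpoints above concrete, here is a minimal fetch sketch, not part of wuxt itself; it assumes a local wuxt backend on http localhost 3080 and node 18 or newer for the built in fetch, the slug hello-world is only a placeholder, and the embed flag from the readme is written _embed in a real request (inside the nuxt app you would normally use the bundled wp object instead)

```js
// minimal sketch, assuming wuxt runs locally on port 3080
const base = 'http://localhost:3080/wp-json'

const getJson = async (path) => {
  const res = await fetch(`${base}${path}`)
  if (!res.ok) throw new Error(`${path} failed with status ${res.status}`)
  return res.json()
}

const demo = async () => {
  // front page object, or the default posts query if none is configured
  const front = await getJson('/wuxt/v1/front-page?_embed')
  // complete menu tree for the registered "main" location
  const menu = await getJson('/wuxt/v1/menu')
  // post-or-page lookup by slug, posts are checked before pages
  const bySlug = await getJson('/wuxt/v1/slug/hello-world?_embed')
  // geo query: posts within 10 km of the given coordinates
  const nearby = await getJson('/wp/v2/posts?coordinates=52.585,13.373&distance=10')
  console.log({ front, menu, bySlug, nearby })
}

demo().catch(console.error)
```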
wordpress nuxt vue docker nuxtjs vuejs docker-compose jamstack
front_end
Cloud-Engineering
cloud platform engineering is a multidisciplinary field that requires knowledge of cloud technologies infrastructure management software development practices security and collaboration key aspects of cloud platform engineering include architecture design cloud platform engineers design the overall architecture of cloud environments considering factors like scalability availability performance security and cost optimization they create blueprints for how different components will interact within the cloud infrastructure infrastructure as code iac iac involves using code to define and manage infrastructure resources cloud platform engineers utilize tools like terraform aws cloudformation or azure resource manager templates to automate the provisioning and management of cloud resources resource provisioning cloud platform engineers set up and configure virtual machines networking components storage databases and other resources needed to support applications they ensure resources are provisioned efficiently and securely containerization and orchestration engineers work with container technologies like docker and container orchestration platforms like kubernetes to create deploy and manage applications in a consistent and scalable manner automation and devops automation is a key aspect of cloud platform engineering engineers develop automation scripts and pipelines to streamline deployment scaling monitoring and maintenance processes this aligns with devops principles to achieve faster more reliable software delivery scalability and elasticity cloud platform engineers design systems that can scale horizontally or vertically to handle varying workloads they implement auto scaling mechanisms to ensure resources adjust dynamically based on demand security and compliance engineers implement security best practices to protect data and applications in the cloud this includes identity and access management encryption network security and compliance with industry standards monitoring and performance optimization engineers set up monitoring and alerting systems to track the health and performance of cloud services they analyze metrics and logs to identify bottlenecks and optimize resource utilization high availability and disaster recovery cloud platform engineers design architectures with redundancy and failover mechanisms to ensure high availability they also create disaster recovery plans to mitigate data loss and downtime cost management engineers optimize cloud costs by rightsizing resources utilizing reserved instances and implementing cost allocation and monitoring strategies migration and modernization cloud platform engineers assist organizations in migrating legacy applications to the cloud refactoring applications to take advantage of cloud native services and modernizing it infrastructure
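as a concrete illustration of the infrastructure as code idea above, here is a minimal sketch using aws cdk for javascript, one of several iac tools alongside terraform and cloudformation templates; it assumes aws-cdk-lib is installed, and the stack and bucket names are arbitrary

```js
// minimal infrastructure-as-code sketch with aws cdk for javascript
const { App, Stack, RemovalPolicy } = require('aws-cdk-lib');
const s3 = require('aws-cdk-lib/aws-s3');

const app = new App();
const stack = new Stack(app, 'DemoStack');

// declaring the resource in code makes provisioning repeatable and reviewable
new s3.Bucket(stack, 'LogBucket', {
  versioned: true,                            // keep object history
  encryption: s3.BucketEncryption.S3_MANAGED, // encrypt at rest
  removalPolicy: RemovalPolicy.DESTROY,       // demo only: allow easy teardown
});

app.synth(); // emits a cloudformation template into cdk.out
```

running cdk deploy against this app would then provision the bucket, which is the core iac loop of describe, review, apply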
cloud
big-o-performance-java
big o performance a simple html app to demonstrate performance costs of data structures clone the project navigate to the root of the project in a terminal or command prompt run npm install run npm start go to the url specified in the terminal or command prompt to try out the app a tiny sketch of the performance contrast the app demonstrates follows at the end of this readme this app was created from the create react app npm package below are instructions from that project where you will find some information on how to perform common tasks you can find the most recent version of this guide here https github com facebookincubator create react app blob master template readme md table of contents updating to new releases updating to new releases sending feedback sending feedback folder structure folder structure available scripts available scripts npm start npm start npm run build npm run build npm run eject npm run eject displaying lint output in the editor displaying lint output in the editor installing a dependency installing a dependency importing a component importing a component adding a stylesheet adding a stylesheet post processing css post processing css adding images and fonts adding images and fonts adding bootstrap adding bootstrap adding flow adding flow adding custom environment variables adding custom environment variables integrating with a node backend integrating with a node backend proxying api requests in development proxying api requests in development deployment deployment now now heroku heroku surge surge github pages github pages something missing something missing updating to new releases create react app is divided into two packages create react app is a global command line utility that you use to create new projects react scripts is a development dependency in the generated projects including this one you almost never need to update create react app itself it delegates all the setup to react scripts when you run create react app it always creates the project with the latest version of react scripts so you ll get all the new features and improvements in newly created apps automatically to update an existing project to a new version of react scripts open the changelog https github com facebookincubator create react app blob master changelog md find the version you re currently on check package json in this folder if you re not sure and apply the migration instructions for the newer versions in most cases bumping the react scripts version in package json and running npm install in this folder should be enough but it s good to consult the changelog https github com facebookincubator create react app blob master changelog md for potential breaking changes we commit to keeping the breaking changes minimal so you can upgrade react scripts painlessly sending feedback we are always open to your feedback https github com facebookincubator create react app issues folder structure after creation your project should look like this my app readme md index html favicon ico node modules package json src app css app js index css index js logo svg for the project to build these files must exist with exact filenames index html is the page template favicon ico is the icon you see in the browser tab src index js is the javascript entry point you can delete or rename the other files you may create subdirectories inside src for faster rebuilds only files inside src are processed by webpack you need to put any js and css files inside src or webpack won t see them you can however create more top level directories they will not be included in the production build so you can use them for things
like documentation available scripts in the project directory you can run npm start runs the app in the development mode br open http localhost 3000 http localhost 3000 to view it in the browser the page will reload if you make edits br you will also see any lint errors in the console npm run build builds the app for production to the build folder br it correctly bundles react in production mode and optimizes the build for the best performance the build is minified and the filenames include the hashes br your app is ready to be deployed npm run eject note this is a one way operation once you eject you can t go back if you aren t satisfied with the build tool and configuration choices you can eject at any time this command will remove the single build dependency from your project instead it will copy all the configuration files and the transitive dependencies webpack babel eslint etc right into your project so you have full control over them all of the commands except eject will still work but they will point to the copied scripts so you can tweak them at this point you re on your own you don t have to ever use eject the curated feature set is suitable for small and middle deployments and you shouldn t feel obligated to use this feature however we understand that this tool wouldn t be useful if you couldn t customize it when you are ready for it displaying lint output in the editor note this feature is available with react scripts 0 2 0 and higher some editors including sublime text atom and visual studio code provide plugins for eslint they are not required for linting you should see the linter output right in your terminal as well as the browser console however if you prefer the lint results to appear right in your editor there are some extra steps you can do you would need to install an eslint plugin for your editor first a note for atom linter eslint users if you are using the atom linter eslint plugin make sure that use global eslint installation option is checked img src http i imgur com yvnnhjm png width 300 then make sure package json of your project ends with this block js eslintconfig extends node modules react scripts config eslint js projects generated with react scripts 0 2 0 and higher should already have it if you don t need eslint integration with your editor you can safely delete those three lines from your package json finally you will need to install some packages globally sh npm install g eslint babel eslint eslint plugin react eslint plugin import eslint plugin jsx a11y eslint plugin flowtype we recognize that this is suboptimal but it is currently required due to the way we hide the eslint dependency the eslint team is already working on a solution to this https github com eslint eslint issues 3458 so this may become unnecessary in a couple of months installing a dependency the generated project includes react and reactdom as dependencies it also includes a set of scripts used by create react app as a development dependency you may install other dependencies for example react router with npm npm install save library name importing a component this project setup supports es6 modules thanks to babel while you can still use require and module exports we encourage you to use import and export http exploringjs com es6 ch modules html instead for example button js js import react component from react class button extends component render export default button don t forget to use export default dangerbutton js js import react component from react import button from button 
import a component from another file class dangerbutton extends component render return button color red export default dangerbutton be aware of the difference between default and named exports http stackoverflow com questions 36795819 react native es 6 when should i use curly braces for import 36796281 36796281 it is a common source of mistakes we suggest that you stick to using default imports and exports when a module only exports a single thing for example a component that s what you get when you use export default button and import button from button named exports are useful for utility modules that export several functions a module may have at most one default export and as many named exports as you like learn more about es6 modules when to use the curly braces http stackoverflow com questions 36795819 react native es 6 when should i use curly braces for import 36796281 36796281 exploring es6 modules http exploringjs com es6 ch modules html understanding es6 modules https leanpub com understandinges6 read leanpub auto encapsulating code with modules adding a stylesheet this project setup uses webpack https webpack github io for handling all assets webpack offers a custom way of extending the concept of import beyond javascript to express that a javascript file depends on a css file you need to import the css from the javascript file button css css button padding 20px button js js import react component from react import button css tell webpack that button js uses these styles class button extends component render you can use them as regular css styles return div classname button this is not required for react but many people find this feature convenient you can read about the benefits of this approach here https medium com seek ui engineering block element modifying your javascript components d7f99fcab52b however you should be aware that this makes your code less portable to other build tools and environments than webpack in development expressing dependencies this way allows your styles to be reloaded on the fly as you edit them in production all css files will be concatenated into a single minified css file in the build output if you are concerned about using webpack specific semantics you can put all your css right into src index css it would still be imported from src index js but you could always remove that import if you later migrate to a different build tool post processing css this project setup minifies your css and adds vendor prefixes to it automatically through autoprefixer https github com postcss autoprefixer so you don t need to worry about it for example this css app display flex flex direction row align items center becomes this css app display webkit box display ms flexbox display flex webkit box orient horizontal webkit box direction normal ms flex direction row flex direction row webkit box align center ms flex align center align items center there is currently no support for preprocessors such as less or for sharing variables across css files adding images and fonts with webpack using static assets like images and fonts works similarly to css you can import an image right in a javascript module this tells webpack to include that image in the bundle unlike css imports importing an image or a font gives you a string value this value is the final image path you can reference in your code here is an example js import react from react import logo from logo png tell webpack this js file uses this image console log logo logo 84287d09 png function header import result 
is the url of your image return img src logo alt logo export default function header this works in css too css logo background image url logo png webpack finds all relative module references in css they start with and replaces them with the final paths from the compiled bundle if you make a typo or accidentally delete an important file you will see a compilation error just like when you import a non existent javascript module the final filenames in the compiled bundle are generated by webpack from content hashes if the file content changes in the future webpack will give it a different name in production so you don t need to worry about long term caching of assets please be advised that this is also a custom feature of webpack it is not required for react but many people enjoy it and react native uses a similar mechanism for images however it may not be portable to some other environments such as node js and browserify if you prefer to reference static assets in a more traditional way outside the module system please let us know in this issue https github com facebookincubator create react app issues 28 and we will consider support for this adding bootstrap you don t have to use react bootstrap https react bootstrap github io together with react but it is a popular library for integrating bootstrap with react apps if you need it you can integrate it with create react app by following these steps install react bootstrap and bootstrap from npm react bootstrap does not include bootstrap css so this needs to be installed as well npm install react bootstrap save npm install bootstrap 3 save import bootstrap css and optionally bootstrap theme css in the src index js file js import bootstrap dist css bootstrap css import bootstrap dist css bootstrap theme css import required react bootstrap components within src app js file or your custom component files js import navbar jumbotron button from react bootstrap now you are ready to use the imported react bootstrap components within your component hierarchy defined in the render method here is an example app js https gist githubusercontent com gaearon 85d8c067f6af1e56277c82d19fd4da7b raw 6158dd991b67284e9fc8d70b9d973efe87659d72 app js redone using react bootstrap adding flow flow typing is currently not supported out of the box https github com facebookincubator create react app issues 72 with the default flowconfig generated by flow if you run it you might get errors like this js node modules fbjs lib deferred js flow 60 60 promise prototype done apply this promise arguments property done property not found in 495 declare class promise r promise see lib private tmp flow flowlib 34952d31 core js 495 node modules fbjs lib shallowequal js flow 29 29 return x 0 1 x flowissue 1 y flowissue identifier flowissue could not resolve name src app js 3 3 import logo from logo svg logo svg required module not found src app js 4 4 import app css app css required module not found src index js 5 5 import index css index css required module not found to fix this change your flowconfig to look like this ini libs node modules fbjs flow lib options esproposal class static fields enable esproposal class instance fields enable module name mapper css react scripts config flow css module name mapper jpg png gif eot otf webp svg ttf woff woff2 mp4 webm react scripts config flow file suppress type flowissue suppress type flowfixme re run flow and you shouldn t get any extra issues if you later eject you ll need to replace react scripts references with the project root 
placeholder for example ini module name mapper css project root config flow css module name mapper jpg png gif eot otf webp svg ttf woff woff2 mp4 webm project root config flow file we will consider integrating more tightly with flow in the future so that you don t have to do this adding custom environment variables note this feature is available with react scripts 0 2 3 and higher your project can consume variables declared in your environment as if they were declared locally in your js files by default you will have node env defined for you and any other environment variables starting with react app these environment variables will be defined for you on process env for example having an environment variable named react app secret code will be exposed in your js as process env react app secret code in addition to process env node env these environment variables can be useful for displaying information conditionally based on where the project is deployed or consuming sensitive data that lives outside of version control first you need to have environment variables defined which can vary between oses for example let s say you wanted to consume a secret defined in the environment inside a form jsx render return div small you are running this application in b process env node env b mode small form input type hidden defaultvalue process env react app secret code form div the above form is looking for a variable called react app secret code from the environment in order to consume this value we need to have it defined in the environment windows cmd exe cmd set react app secret code abcdef npm start note the lack of whitespace is intentional linux os x bash bash react app secret code abcdef npm start note defining environment variables in this manner is temporary for the life of the shell session setting permanent environment variables is outside the scope of these docs with our environment variable defined we start the app and consume the values remember that the node env variable will be set for you automatically when you load the app in the browser and inspect the input you will see its value set to abcdef and the bold text will show the environment provided when using npm start html div small you are running this application in b development b mode small form input type hidden value abcdef form div having access to the node env is also useful for performing actions conditionally js if process env node env production analytics disable integrating with a node backend check out this tutorial https www fullstackreact com articles using create react app with a server for instructions on integrating an app with a node backend running on another port and using fetch to access it you can find the companion github repository here https github com fullstackreact food lookup demo proxying api requests in development note this feature is available with react scripts 0 2 3 and higher people often serve the front end react app from the same host and port as their backend implementation for example a production setup might look like this after the app is deployed static server returns index html with react app todos static server returns index html with react app api todos server handles any api requests using the backend implementation such setup is not required however if you do have a setup like this it is convenient to write requests like fetch api todos without worrying about redirecting them to another host or port during development to tell the development server to proxy any unknown requests to your 
api server in development add a proxy field to your package json for example js proxy http localhost 4000 this way when you fetch api todos in development the development server will recognize that it s not a static asset and will proxy your request to http localhost 4000 api todos as a fallback conveniently this avoids cors issues http stackoverflow com questions 21854516 understanding ajax cors and security considerations and error messages like this in development fetch api cannot load http localhost 4000 api todos no access control allow origin header is present on the requested resource origin http localhost 3000 is therefore not allowed access if an opaque response serves your needs set the request s mode to no cors to fetch the resource with cors disabled keep in mind that proxy only has effect in development with npm start and it is up to you to ensure that urls like api todos point to the right thing in production you don t have to use the api prefix any unrecognized request will be redirected to the specified proxy currently the proxy option only handles http requests and it won t proxy websocket connections if the proxy option is not flexible enough for you alternatively you can enable cors on your server here s how to do it for express http enable cors org server expressjs html use environment variables adding custom environment variables to inject the right server host and port into your app deployment by default create react app produces a build assuming your app is hosted at the server root to override this specify the homepage in your package json for example js homepage http mywebsite com relativepath this will let create react app correctly infer the root path to use in the generated html file now see this example https github com xkawi create react app now for a zero configuration single command deployment with now https zeit co now heroku use the heroku buildpack for create react app https github com mars create react app buildpack you can find instructions in deploying react with zero configuration https blog heroku com deploying react with zero configuration surge install the surge cli if you haven t already by running npm install g surge run the surge command and log in you or create a new account you just need to specify the build folder and your custom domain and you are done sh email email domain com password project path path to project build size 7 files 1 8 mb domain create react app surge sh upload 100 eta 0 0s propagate on cdn 100 plan free users email domain com ip address x x x x success project is published and running at create react app surge sh note that in order to support routers that use html5 pushstate api you may want to rename the index html in your build folder to 200 html before deploying to surge this ensures that every url falls back to that file https surge sh help adding a 200 page for client side routing github pages note this feature is available with react scripts 0 2 0 and higher open your package json and add a homepage field js homepage http myusername github io my app the above step is important create react app uses the homepage field to determine the root url in the built html file now whenever you run npm run build you will see a cheat sheet with a sequence of commands to deploy to github pages sh git commit am save local changes git checkout b gh pages git add f build git commit am rebuild website git filter branch f prune empty subdirectory filter build git push f origin gh pages git checkout you may copy and paste them or put 
them into a custom shell script you may also customize them for another hosting provider note that github pages doesn t support routers that use the html5 pushstate history api under the hood for example react router using browserhistory this is because when there is a fresh page load for a url like http user github io todomvc todos 42 where todos 42 is a frontend route the github pages server returns 404 because it knows nothing of todos 42 if you want to add a router to a project hosted on github pages here are a couple of solutions you could switch from using html5 history api to routing with hashes if you use react router you can switch to hashhistory for this effect but the url will be longer and more verbose for example http user github io todomvc todos 42 k yknaj read more https github com reactjs react router blob master docs guides histories md histories about different history implementations in react router alternatively you can use a trick to teach github pages to handle 404 by redirecting to your index html page with a special redirect parameter you would need to add a 404 html file with the redirection code to the build folder before deploying your project and you ll need to add code handling the redirect parameter to index html you can find a detailed explanation of this technique in this guide https github com rafrex spa github pages something missing if you have ideas for more how to recipes that should be on this page let us know https github com facebookincubator create react app issues or contribute some https github com facebookincubator create react app edit master template readme md
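circling back to the stated purpose of this repo, demonstrating performance costs of data structures, here is a tiny standalone sketch of the same contrast, runnable with node; the collection size is arbitrary

```js
// quick sketch of the big-o idea the app demonstrates: looking an item up
// in an array is o(n) while a map lookup is o(1); sizes are arbitrary
const n = 1_000_000;
const arr = Array.from({ length: n }, (_, i) => i);
const map = new Map(arr.map((v) => [v, true]));

console.time('array includes, o(n)');
arr.includes(n - 1); // scans the whole array in the worst case
console.timeEnd('array includes, o(n)');

console.time('map has, o(1)');
map.has(n - 1); // constant-time hash lookup
console.timeEnd('map has, o(1)');
```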
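and to tie together the proxy and custom environment variable sections above, a sketch of a component that uses both; the api todos endpoint, the port 4000 proxy target and the react_app_api_token variable are hypothetical names for illustration

```js
import React, { Component } from 'react';

// assumes "proxy": "http://localhost:4000" in package.json and
// REACT_APP_API_TOKEN exported in the shell before npm start (both hypothetical)
class TodoList extends Component {
  constructor(props) {
    super(props);
    this.state = { todos: [] };
  }

  componentDidMount() {
    // relative url: in development the dev server proxies this to port 4000
    fetch('/api/todos', {
      headers: { Authorization: `Bearer ${process.env.REACT_APP_API_TOKEN}` },
    })
      .then((res) => res.json())
      .then((todos) => this.setState({ todos }));
  }

  render() {
    return (
      <ul>
        {this.state.todos.map((todo) => (
          <li key={todo.id}>{todo.title}</li>
        ))}
      </ul>
    );
  }
}

export default TodoList;
```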
server
Sweet-Sweets-Mania-SpriteKit-Game-With-Continuous-Integration
sweet sweets mania spritekit game with continuous integration platform ios https img shields io badge platform ios blue svg swift version https img shields io badge swift 4 2 orange svg license https img shields io badge license mit lightgrey svg a single player game in spritekit created completely programmatically without the use of sks files and the xcode scene editor implemented continuous integration using jenkins fastlane github and unit tests combined into one continuous integration system so everything is automated from the first commit to the email notification at the end from jenkins when the build finishes likewise i used firebase as a backend and implemented sign up and login with facebook and anonymously in the app and saved user data like uid email address username and profile image into firebase database and firebase storage as well a persistent store is implemented so all user data is saved after the user quits the app during the development of the app i followed the mvc design pattern and likewise i used sketch for creating most of the ui side note currently i do not have an iphone so i m unable to test the app on a physical device i apologize in advance for any possible bugs kind regards ilija requirements swift 4 2 xcode 9 2 ios 11 0 getting the files use github to clone the repository locally or download the zip file of the repository and extract the files example how the ui looks this is the welcome screen that the user is presented with during the first launch of the app on the screen we can see the background with the name of the app with the facebook sign up button and the sign up anonymously button at the very bottom of the screen we can see the terms of service and privacy policy alt text https github com ilijamihajlovic sweet sweets mania blob master images welcomescenewoman png here we can see as on the first image the welcome screen the main menu and on the far right side the settings slide menu after the user presses the settings button alt text https github com ilijamihajlovic sweet sweets mania blob master images firstthreescreens png on this image we have three screens on the left side is the gameplay screen in the middle is the game over screen with the replay button the best score the current score and the back button on the top left corner of the screen on the far right side of the image we have the gameplay screen with the skstorereviewcontroller pop up asking the user to rate the app on the app store we are asking the user to rate the app after he she presses the replay button for a new game in a future version of the app i will implement the request after the user finishes a level successfully alt text https github com ilijamihajlovic sweet sweets mania spritekit game blob master images gamescenes png now we re in what i call the swiping controller here i used a uicollectionviewcontroller and uicollectionviewflowlayout methods to implement the page to page swiping mechanism of a uipageviewcontroller with the flexibility uicollectionview has uipagecontrol was used for the dots representing which page the user is currently on and how many pages there are also we have two buttons near the bottom of the screen for navigating through the pages together with the uipagecontrol in a uistackview on each page there is a uitextview and a uiimageview that represent a different image and a different text alt text https github com ilijamihajlovic sweet sweets mania blob master images swipingcontroller png on this image we have the user profile controller with the image on the left
shown when the user signs up anonymously and on the right side when signing up with facebook there are also buttons for signing out one for going back to the main menu and one for fetching user data the latter updates the uilabel and uiimageview with the user s profile image from facebook and the user s email address alt text https github com ilijamihajlovic sweet sweets mania spritekit game blob master images userprofilescene png the project a short sneak peek at how the project looks alt text https github com ilijamihajlovic sweet sweets mania spritekit game blob master images project png continuous integration with jenkins and fastlane notifying jenkins with github webhook the jenkins server gets notified with the github webhook to pull the project and build it as soon as i push a commit to the github repository here we can see my previous commits during my testing alt text https github com ilijamihajlovic sweet sweets mania spritekit game blob master images jenkins 20 26 20fastlane 20 ci changesinrepository png one thing we need for this is a payload url the url of your jenkins server unless we specify otherwise we re using localhost but github needs a real url to send notifications to to overcome this burden i m using ngrok an app that uses a secure tunnel to expose localhost to the internet after ngrok is downloaded all i need to do is to run it in terminal console ngrok http 8080 and we can see ngrok running in the terminal also in the background are our unit tests making sure every commit and push to the repository is tested alt text https github com ilijamihajlovic sweet sweets mania spritekit game blob master images jenkins 20 26 20fastlane 20 ci ngrok png after that all i need to do is to head back to my github repository and paste the url in the payload url field and add github webhook to the end of the url this will be the endpoint on my jenkins server that responds to pushes from github and we re all set email notifications from jenkins i also added email notifications from jenkins over an smtp simple mail transfer protocol server for each build i don t want to get into detail here because it would take a whole tutorial for that and this markdown file simply isn t made for that alt text https github com ilijamihajlovic sweet sweets mania spritekit game blob master images jenkins 20 26 20fastlane 20 ci emailnotificationsfromjenkins png sweet sweets mania workspace on the jenkins server and this is how the workspace looks on the jenkins automation server alt text https github com ilijamihajlovic sweet sweets mania spritekit game blob master images jenkins 20 26 20fastlane 20 ci jenkinsworkspace png license mit license copyright c 2019 ilija mihajlovic permission is hereby granted free of charge to any person obtaining a copy of this software and associated documentation files the software to deal in the software without restriction including without limitation the rights to use copy modify merge publish distribute sublicense and or sell copies of the software and to permit persons to whom the software is furnished to do so subject to the following conditions the above copyright notice and this permission notice shall be included in all copies or substantial portions of the software the software is provided as is without warranty of any kind express or implied including but not limited to the warranties of merchantability fitness for a particular purpose and non infringement in no event shall the authors or copyright holders be liable for any claim damages or other liability whether in an action of
contract tort or otherwise arising from out of or in connection with the software or the use or other dealings in the software
spritekit swift swift4 swift5 firebase-database firebase-storage firebase-authentication ios ios-swift spritekit-demo ios-app facebook-login facebook-graph-api jenkins jenkins-ci jenkins-pipeline continuous-integration ci fastlane fastlane-ios
server
Using-IOT-toProcess-BlockchainAnalytics
warning this repository is no longer maintained warning this repository will not be updated the repository will be kept available in read only mode using iot toprocess blockchainanalytics docs iot blockchain png introduction iot s role in this project is to take the users from the blockchain network and list them as devices on the watson iot platform every time a new user is created besides that iot will simulate a demo device and or a real device can be chosen to display its data on the platform s dashboard all the devices in this project will share one device type docs iot dash png that s with the watson iot platform but within node red our iot app will run analytics to display the total steps and total fitcoins of all users the app will also use a graph in real time to show the blockchain transactions coming to the iot whether for new user creation user validation or for generated values of steps and fitcoins docs node red dash png prerequisites you will need the following accounts and tools ibm cloud account https console ng bluemix net registration blockchain network pattern optional https developer ibm com code patterns explore hyperledger fabric through a complete set of apis steps of use choose from the ibm cloud bluemix catalog the internet of things service name it and create it after a few minutes when your app is ready open cloudant and create a database with the name secretmap as seen below in the pic docs cloudant png click your app s url to open your node red editor copy all the contents from the json file inside the scripts folder and paste them into the import box the blockchain network will pass the blocks from which all the information will be taken and saved to be used for analytics and to display the values on the node red dashboard as seen in the above image make sure to have the cloudant database service available and bound to your bluemix ibm cloud app instance make sure also to have a watson iot platform created and bound as a service too docs connections png the dashboard needs to be created in node red if it s not taken care of by the code import docs nr dash setup png this is how the node red code looks docs nodered nodes png watson iot platform list of devices preview docs iot platform png values are passed in as json into https your app name mybluemix net steps message blocks try live demo url demo display https think iot processor mybluemix net ui 0 technology blockchain https developer ibm com code technologies blockchain hyperledger fabric v1 0 https www hyperledger org projects fabric cm mc uid 56476701007714999647300 cm mc sid 50200000 1501558767 useful links ibm cloud https bluemix net ibm cloud documentation https www ng bluemix net docs ibm cloud developers community http developer ibm com bluemix ibm watson internet of things http www ibm com internet of things ibm watson iot platform http www ibm com internet of things iot solutions watson iot platform ibm watson iot platform developers community https developer ibm com iotplatform node red https nodered org license this code pattern is licensed under the apache software license version 2 separate third party code objects invoked within this code pattern are licensed by their respective providers pursuant to their own separate licenses contributions are subject to the developer certificate of origin version 1 1 dco https developercertificate org and the apache software license version 2 http www apache org licenses license 2 0 txt apache software license asl faq http www apache org foundation license
faq html whatdoesitmean
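for a sense of what the analytics part of the flow could look like, here is a sketch of a node red function node that keeps running totals of steps and fitcoins; the msg.payload shape (type, steps, fitcoins fields) is an assumption for illustration, the actual structure is defined by the flow json in the scripts folder

```js
// sketch of a node-red function node: accumulate totals across messages
// NOTE: the payload fields below are assumed, not taken from the real flow
const totals = context.get('totals') || { steps: 0, fitcoins: 0 };

if (msg.payload && msg.payload.type === 'transaction') {
  totals.steps += msg.payload.steps || 0;
  totals.fitcoins += msg.payload.fitcoins || 0;
  context.set('totals', totals);
}

// forward the running totals to dashboard nodes (gauges, charts)
msg.payload = totals;
return msg;
```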
ibmcode blockchain iot iot-toprocess-blockchainanalytics watson-iot-platform blockchain-network node-red ibm-cloud analytics dashboard
blockchain
WQU-data-science-challenges
wqu data science challenges to complete the 2 unit 16 week applied data science module of worldquant university students are required to pass 6 mini projects in total i successfully completed them and maintained a cumulative average score of 90 or above the mini projects are as follows applied data science unit i scientific computing and python in mini project 1 students used python to compute mersenne numbers and the lucas lehmer test to identify mersenne numbers that are prime they had to use python data structures and core programming principles such as for loops to implement their solution further they had to implement the sieve of eratosthenes as a faster solution for checking if numbers are prime learning about the importance of algorithm time complexity in mini project 2 students used object oriented programming to create a class that represents a geometric point they defined methods that describe common operations with points such as adding two points together and finding the distance between two points finally they wrote a k means clustering algorithm that uses the previously defined point class in mini project 3 students used basic python data structures functions and program control flow to answer posed questions over medical data from the british nhs on prescription drugs they had to use fundamental data wrangling techniques such as joining data sets together splitting data into groups and aggregating data into summary statistics in mini project 4 students used the python package pandas to perform data analysis on a prescription drug data set from the british nhs they answered questions such as identifying which medical practices prescribe opioids at an unusually high rate and which practices are prescribing substantially rarer drugs compared to the rest of the medical practices they used statistical concepts like z score to help identify the aforementioned practices applied data science unit ii machine learning statistical analysis in mini project 5 students worked with nursing home inspection data from the united states predicting which providers may be fined and for how much they used the scikit learn python package to construct progressively more complicated machine learning models they had to impute missing values apply feature engineering and encode categorical data in mini project 6 students used natural language processing to train various machine learning models to predict an amazon review rating based on the text of the review further they used one of the trained models to gain insight into the reviews identifying words that are highly polar with these highly polar words identified one can understand what words highly influence the model s prediction
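purely as an illustration of the lucas lehmer test mentioned in mini project 1 (the coursework itself is in python), here is the same algorithm sketched in javascript with bigint so large mersenne numbers don t overflow

```js
// lucas-lehmer test: m_p = 2^p - 1 is prime iff s_(p-2) = 0 mod m_p,
// with s_0 = 4 and s_(i+1) = s_i^2 - 2; valid for odd prime exponents p
function isMersennePrime(p) {
  if (p === 2) return true;          // m_2 = 3 is prime, handled separately
  const m = (1n << BigInt(p)) - 1n;  // mersenne number 2^p - 1
  let s = 4n;
  for (let i = 0; i < p - 2; i++) {
    s = (s * s - 2n) % m;            // lucas-lehmer recurrence mod m
  }
  return s === 0n;
}

// 2^13 - 1 = 8191 is prime, 2^11 - 1 = 2047 = 23 * 89 is not
console.log(isMersennePrime(13), isMersennePrime(11)); // true false
```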
machine-learning natural-language-processing data-science data-wrangling python
ai
AndroidMusicPlayer
androidmusicplayer project 4 for uic cs478 development of mobile applications version 1 0 info a project dealing with activities android services and database transactions using mysqlite license mit
front_end
cosmos-sdk
div align center h1 cosmos sdk h1 div banner https github com cosmos cosmos sdk docs blob main static img banner jpg div align center a href https github com cosmos cosmos sdk blob main license img alt license apache 2 0 src https img shields io github license cosmos cosmos sdk svg a a href https pkg go dev github com cosmos cosmos sdk img src https pkg go dev badge github com cosmos cosmos sdk svg alt go reference a a href https goreportcard com report github com cosmos cosmos sdk img alt go report card src https goreportcard com badge github com cosmos cosmos sdk a a href https sonarcloud io summary overall id cosmos cosmos sdk img alt code coverage src https sonarcloud io api project badges measure project cosmos cosmos sdk metric coverage a a href https sonarcloud io summary overall id cosmos cosmos sdk img alt sonarcloud analysis src https sonarcloud io api project badges measure project cosmos cosmos sdk metric alert status a div div align center a href https discord gg cosmosnetwork img alt discord src https img shields io discord 669268347736686612 svg a a href https sourcegraph com github com cosmos cosmos sdk badge img alt imported by src https sourcegraph com github com cosmos cosmos sdk badge svg a img alt sims src https github com cosmos cosmos sdk workflows sims badge svg img alt lint satus src https github com cosmos cosmos sdk workflows lint badge svg div the cosmos sdk is a framework for building blockchain applications cometbft bft consensus https github com cometbft cometbft and the cosmos sdk are written in the go programming language cosmos sdk is used to build gaia https github com cosmos gaia the implementation of the cosmos hub warning the cosmos sdk has mostly stabilized but we are still making some breaking changes note we advise to always use the latest maintained go https go dev dl version for building cosmos sdk applications quick start to learn how the cosmos sdk works from a high level perspective see the cosmos sdk high level intro https docs cosmos network main intro overview html if you want to get started quickly and learn how to build on top of cosmos sdk visit cosmos sdk tutorials https tutorials cosmos network you can also fork the tutorial s repository to get started building your own cosmos sdk application for more information see the cosmos sdk documentation https docs cosmos network contributing see contributing md contributing md for details on how to contribute and participate in our dev calls contributing md teams dev calls if you want to follow the updates or learn more about the latest design then join our discord https discord gg cosmosnetwork tools and frameworks the cosmos ecosystem is vast awesome cosmos https github com cosmos awesome cosmos is a community curated list of notable frameworks modules and tools cosmos hub mainnet the cosmos hub application gaia has its own cosmos gaia repository https github com cosmos gaia go there to join the cosmos hub mainnet and more inter blockchain communication ibc the ibc module for the cosmos sdk has its own cosmos ibc go repository https github com cosmos ibc go go there to build and integrate with the ibc module disambiguation this cosmos sdk project is not related to the react cosmos https github com react cosmos react cosmos project yet many thanks to evan coury and ovidiu skidding for this github organization name as per our agreement this disambiguation notice will stay here
go cryptocurrency tendermint cosmos-sdk golang blockchain cometbft
blockchain
05-design-system
components text heading box button textinput textarea checkbox avatar multistep
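a hypothetical usage sketch of the components listed above; the package name, props and composition are all assumptions for illustration, only the component names come from the list

```js
// hypothetical: '@my-org/design-system' and every prop below are assumed,
// only the component names match the list above
import React from 'react'
import { Box, Heading, Text, TextInput, Checkbox, Button } from '@my-org/design-system'

export function SignupCard() {
  return (
    <Box>
      <Heading>create your account</Heading>
      <Text>fill in the fields below</Text>
      <TextInput placeholder="email" />
      <Checkbox /> {/* e.g. accept terms */}
      <Button>sign up</Button>
    </Box>
  )
}
```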
os
arduino-mqtt
arduino mqtt test https github com 256dpi arduino mqtt actions workflows test yml badge svg https github com 256dpi arduino mqtt actions workflows test yml github release https img shields io github release 256dpi arduino mqtt svg https github com 256dpi arduino mqtt releases this library bundles the lwmqtt https github com 256dpi lwmqtt mqtt 3 1 1 client and adds a thin wrapper to get an arduino like api download the latest version from the release https github com 256dpi arduino mqtt releases section or even better use the built in library manager in the arduino ide and search for lwmqtt the library is also available on platformio https platformio org lib show 617 mqtt you can install it by running pio lib install 256dpi mqtt compatibility the following examples show how you can use the library with various arduino compatible hardware arduino yun yun shield https github com 256dpi arduino mqtt blob master examples arduinoyun arduinoyun ino secure https github com 256dpi arduino mqtt blob master examples arduinoyunsecure arduinoyunsecure ino arduino ethernet shield https github com 256dpi arduino mqtt blob master examples arduinoethernetshield arduinoethernetshield ino arduino wifi shield https github com 256dpi arduino mqtt blob master examples arduinowifishield arduinowifishield ino adafruit huzzah esp8266 https github com 256dpi arduino mqtt blob master examples adafruithuzzahesp8266 adafruithuzzahesp8266 ino secure https github com 256dpi arduino mqtt blob master examples adafruithuzzahesp8266secure adafruithuzzahesp8266secure ino arduino wifi101 shield https github com 256dpi arduino mqtt blob master examples arduinowifi101 arduinowifi101 ino secure https github com 256dpi arduino mqtt blob master examples arduinowifi101secure arduinowifi101secure ino arduino mkr gsm 1400 https github com 256dpi arduino mqtt blob master examples arduinomkrgsm1400 arduinomkrgsm1400 ino secure https github com 256dpi arduino mqtt blob master examples arduinomkrgsm1400secure arduinomkrgsm1400secure ino arduino mkr nb 1500 https github com 256dpi arduino mqtt blob master examples arduinomkrnb1500 arduinomkrnb1500 ino esp32 development board https github com 256dpi arduino mqtt blob master examples esp32developmentboard esp32developmentboard ino secure https github com 256dpi arduino mqtt blob master examples esp32developmentboardsecure esp32developmentboardsecure ino other shields and boards should also work if they provide a client https www arduino cc en reference clientconstructor based network implementation check out the wiki https github com 256dpi arduino mqtt wiki to find more examples notes the maximum size for packets being published and received is set by default to 128 bytes to change the buffer sizes you need to use mqttclient client 256 or mqttclient client 256 512 instead of just mqttclient client at the top of your sketch a single value denotes both the read and write buffer size two values specify them separately beginning with version 2 6 the message payload is sent directly during publishing therefore the write buffer is only needed to encode the packet header and topic for which the default 128 bytes should be enough however the receiving of messages is still fully constrained by the read buffer which may be increased if necessary on the esp8266 it has been reported that an additional delay 10 after client loop fixes many stability issues with wifi connections to use the library with shiftr io you need to provide the instance name username and token secret password as the second and 
third argument to client connect client id username password example the following example uses an arduino mkr1000 to connect to the public shiftr io instance you can check on your device after a successful connection here https www shiftr io try c include spi h include wifi101 h include mqtt h const char ssid ssid const char pass pass wificlient net mqttclient client unsigned long lastmillis 0 void connect serial print checking wifi while wifi status wl connected serial print delay 1000 serial print nconnecting while client connect arduino public public serial print delay 1000 serial println nconnected client subscribe hello client unsubscribe hello void messagereceived string topic string payload serial println incoming topic payload note do not use the client in the callback to publish subscribe or unsubscribe as it may cause deadlocks when other things arrive while sending and receiving acknowledgments instead change a global variable or push to a queue and handle it in the loop after calling client loop void setup serial begin 115200 wifi begin ssid pass note local domain names e g computer local on osx are not supported by arduino you need to set the ip address directly client begin public cloud shiftr io net client onmessage messagereceived connect void loop client loop if client connected connect publish a message roughly every second if millis lastmillis 1000 lastmillis millis client publish hello world api initialize the object using the hostname of the broker the brokers port default 1883 and the underlying client class for network transport c void begin client client void begin const char hostname client client void begin const char hostname int port client client void begin ipaddress address client client void begin ipaddress address int port client client specify port 8883 when using secure clients for encrypted connections local domain names e g computer local on osx are not supported by arduino you need to set the ip address directly the hostname and port can also be changed after calling begin c void sethost const char hostname void sethost const char hostname int port void sethost ipaddress address void sethost ipaddress address int port set a will message last testament that gets registered on the broker after connecting setwill has to be called before calling connect c void setwill const char topic void setwill const char topic const char payload void setwill const char topic const char payload bool retained int qos void clearwill register a callback to receive messages c void onmessage mqttclientcallbacksimple callback signature void messagereceived string topic string payload void onmessage mqttclientcallbacksimplefunction cb callback signature std function void string topic string payload void onmessageadvanced mqttclientcallbackadvanced callback signature void messagereceived mqttclient client char topic char bytes int length void onmessageadvanced mqttclientcallbackadvancedfunction cb callback signature std function void mqttclient client char topic char bytes int length the set callback is mostly called during a call to loop but may also be called during a call to subscribe unsubscribe or publish qos 0 if messages have been received before receiving the required acknowledgement therefore it is strongly recommended to not call subscribe unsubscribe or publish qos 0 directly in the callback in case you need a reference to an object that manages the client use the void ref property on the client to store a pointer and access it directly from the advanced callback if 
set more advanced options c void setkeepalive int keepalive void setcleansession bool cleansession void settimeout int timeout void setoptions int keepalive bool cleansession int timeout the keepalive option controls the keep alive interval in seconds default 10 the cleansession option controls the session retention on the broker side default true the timeout option controls the default timeout for all commands in milliseconds default 1000 set a custom clock source custom millis callback to enable deep sleep applications c void setclocksource mqttclientclocksource callback signature uint32 t clocksource the specified callback is used by the internal timers to get a monotonic time in milliseconds since the clock source for the built in millis is stopped when the arduino goes into deep sleep you need to provide a custom callback that first syncs with a built in or external real time clock rtc you can pass null to reset to the default implementation connect to broker using the supplied client id and an optional username and password c bool connect const char clientid bool skip false bool connect const char clientid const char username bool skip false bool connect const char clientid const char username const char password bool skip false if password is present but username is absent the client will fall back to an empty username if the skip option is set to true the client will skip the network level connection and jump to the mqtt level connection this option can be used in order to establish and verify tls connections manually before giving control to the mqtt client the functions return a boolean that indicates if the connection has been established successfully true publish a message to the broker with an optional payload which can be a string or binary c bool publish const string topic bool publish const char topic bool publish const string topic const string payload bool publish const string topic const string payload bool retained int qos bool publish const char topic const string payload bool publish const char topic const string payload bool retained int qos bool publish const char topic const char payload bool publish const char topic const char payload bool retained int qos bool publish const char topic const char payload int length bool publish const char topic const char payload int length bool retained int qos beginning with version 2 6 payloads of arbitrary length may be published see notes the functions return a boolean that indicates if the publishing has been successful true obtain the last used packet id and prepare the publication of a duplicate message using the specified packet id c uint16 t lastpacketid void prepareduplicate uint16 t packetid these functions may be used to implement a retry logic for failed publications of qos1 and qos2 messages the lastpacketid function can be used after calling publish to obtain the used packet id the prepareduplicate function may be called before publish to temporarily change the next used packet id and flag the message as a duplicate
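a sketch of such a retry helper, assuming the connected client global from the example above; the helper name and the retry policy are assumptions and not part of the library api

```cpp
// retry a qos 1 publish by resending with the same packet id,
// flagged as a duplicate (illustrative helper, not library code)
bool publishWithRetry(const char *topic, const char *payload, int attempts) {
  if (client.publish(topic, payload, false, 1)) {  // retained = false, qos = 1
    return true;
  }
  uint16_t packetId = client.lastPacketId();  // id used by the failed publish
  while (--attempts > 0) {
    client.prepareDuplicate(packetId);  // reuse the id and mark as duplicate
    if (client.publish(topic, payload, false, 1)) {
      return true;
    }
  }
  return false;
}
```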
subscribe to a topic c bool subscribe const string topic bool subscribe const string topic int qos bool subscribe const char topic bool subscribe const char topic int qos the functions return a boolean that indicates if the subscription has been successful true unsubscribe from a topic c bool unsubscribe const string topic bool unsubscribe const char topic the functions return a boolean that indicates if the unsubscription has been successful true sends and receives packets c bool loop this function should be called in every loop the function returns a boolean that indicates if the loop has been successful true check if the client is currently connected c bool connected check whether a session was present at the time of the last connect c bool sessionpresent configure dropping of overflowing messages exceeding read buffer and checking the count of dropped messages c void dropoverflow bool enabled uint32 t droppedmessages access low level information for debugging c lwmqtt err t lasterror lwmqtt return code t returncode the error codes can be found here https github com 256dpi lwmqtt blob master include lwmqtt h l15 the return codes can be found here https github com 256dpi lwmqtt blob master include lwmqtt h l260 disconnect from the broker c bool disconnect the function returns a boolean that indicates if the disconnect has been successful true release management update version in library properties create release on github
arduino mqtt iot
server
masa-node-v1.0
masa testnet node v1 01 release date february 22nd 2022 requirements tested with go version 1 16 14 download here https go dev dl roadmap todo s the masa node ui is in alpha and will get incremental releases please report all bugs you find as an issue here https github com masa finance masa node v1 0 issues run with docker this guide will get you up and running using docker if you want to use the geth binary please navigate to the bottom section of the page here run with geth get docker 1 install docker https www docker com get started if your docker distribution does not contain docker compose follow this https docs docker com compose install to install docker compose make sure your docker daemon has at least 4g memory required docker engine 18 02 0 and docker compose 1 21 install the masa testnet node v1 01 the docker compose file also launches the node ui which can be accessed at the following url http localhost 3000 navigate here to interact with the node git clone https github com masa finance masa node v1 0 cd masa node v1 0 directory structure masa node v1 network testnet genesis json node data src ui geth files docker compose yml genesis json run docker 1 run private config ignore docker compose up d sh cd masa node v1 0 private config ignore docker compose up d 1 run docker ps to verify that your masa node container is healthy 1 run docker logs container name f to view the logs for a particular container 1 note to attach geth to your node javascript console use the same container id or name from docker ps sh docker exec it masa node v10 masa node 1 bin sh geth attach qdata dd geth ipc welcome to the geth javascript console instance geth node1 istanbul v1 9 24 stable d5ef77ca quorum v21 7 1 linux amd64 go1 15 5 coinbase 0xa3178965a2022c8374afe6690182f54d48208d0a at block 18008 thu dec 09 2021 20 45 32 gmt 0000 utc datadir qdata dd modules admin 1 0 debug 1 0 eth 1 0 istanbul 1 0 miner 1 0 net 1 0 personal 1 0 rpc 1 0 txpool 1 0 web3 1 0 to exit press ctrl d 1 to shut down the masa testnet node sh docker compose down troubleshooting docker 1 docker is frozen or containers crash and reboot check if your docker daemon is allocated enough memory minimum 4g additional bootnodes masa operates several bootnodes one is already included in the docker file by default if you are having issues connecting to the bootnode please use an alternative from the list below we are also looking for community run bootnodes to add to our list please reach out to us on discord or submit a pr to this repo if you want to add a bootnode to the community list masa bootnodes enode ac6b1096ca56b9f6d004b779ae3728bf83f8e22453404cc3cef16a3d9b96608bc67c4b30db88e0a5a6c6390213f7acbe1153ff6d23ce57380104288ae19373ef 54 146 254 245 21000 enode 91a3c3d5e76b0acf05d9abddee959f1bcbc7c91537d2629288a9edd7a3df90acaa46ffba0e0e5d49a20598e0960ac458d76eb8fa92a1d64938c0a3a3d60f8be4 54 158 188 182 21000 enode d87c03855093a39dced2af54d39b827e4e841fd0ca98673b2e94681d9d52d2f1b6a6d42754da86fa8f53d8105896fda44f3012be0ceb6342e114b0f01456924c 34 225 220 240 21000 enode fcb5a1a8d65eb167cd3030ca9ae35aa8e290b9add3eb46481d0fbd1eb10065aeea40059f48314c88816aab2af9303e193becc511b1035c9fd8dbe97d21f913b9 52 1 125 71 21000 community bootnodes enode 571be7fe060b183037db29f8fe08e4fed6e87fbb6e7bc24bc34e562adf09e29e06067be14e8b8f0f2581966f3424325e5093daae2f6afde0b5d334c2cd104c79 142 132 135 228 21000 enode 269ecefca0b4cd09bf959c2029b2c2caf76b34289eb6717d735ce4ca49fbafa91de8182dd701171739a8eaa5d043dcae16aee212fe5fadf9ed8fa6a24a56951c 65 108 72 177 21000
submit a pr to add a bootnode to the community list here https github com masa finance masa node v1 0 pulls node syncing it can take some time for your node to fully sync to the masa testnet 2 0 please be patient while your node catches up with the most recent blocks node ui specification react js typescript docker for deployment the node ui runs when you deploy using the docker compose files above navigate to your local host to interact with the masa node http localhost 3000 run with geth to run from source follow these steps clone the repository and build the source git clone https github com masa finance masa node v1 0 cd masa node v1 0 src make all make all must be run from within the src folder run the tests make test add path method 1 binaries are placed in repo root build bin you must add the bin folder to path to make geth and bootnode easily invokable from the command line for example if users yourname masa node v1 0 is the location you have cloned the masa node repository to on your computer in your terminal run sudo nano etc paths or export path path repo root build bin remember to source your path or restart the terminal run echo path from the command line to check that the path has been added correctly for example echo path gives the following response usr local bin usr bin bin usr sbin sbin usr quorum build bin users yourname masa node build bin usr local go bin usr local share dotnet dotnet tools library frameworks mono framework versions current commands method 2 the second way to make geth and bootnode easily invokable is to copy the binaries located in repo root build bin to a folder already in your path file such as usr local bin method 3 you can also supplement path by adding path path repo root build bin to your bashrc bash aliases or bash profile file testing path when you run geth from the command line from an arbitrary folder you will get the following output on the terminal geth returns info 12 08 05 37 18 131 starting geth on ethereum mainnet info 12 08 05 37 18 131 bumping default cache on mainnet provided 1024 updated 4096 info 12 08 05 37 18 131 running with private transaction manager disabled quorum private transactions will not be supported info 12 08 05 37 18 132 maximum peer count eth 50 les 0 total 50 info 12 08 05 37 18 160 set global gas cap cap 25000000 info 12 08 05 37 18 160 running with private transaction manager disabled quorum private transactions will not be supported info 12 08 05 37 18 160 allocated trie memory caches clean 1023 00mib dirty 1024 00mib info 12 08 05 37 18 160 allocated cache and file handles database users brendanplayford library ethereum geth chaindata cache 2 00gib info 12 08 05 37 18 751 started p2p networking self enode 162cfffb34b0c3e76abeb9f31541737fcd3b622e35fa3b0080a14dfb9d2a53168ac3abf10122b79d3b8d7d55516982e0f903d179916ccb51abe5cd00de1bdb07 127 0 0 1 30303 info 12 08 05 37 18 752 ipc endpoint opened url users brendanplayford library ethereum geth ipc ismultitenant false info 12 08 05 37 18 752 security plugin is not enabled fatal consensus not specified exiting initialize the node navigate to the node directory and initialize the first node the repo directory includes the genesis json file that is used to connect to the masa protocol at the following path network testnet genesis json run the following command cd node geth datadir data init network testnet genesis json you will get the following output info 12 09 18 22 24 031 running with private transaction manager disabled quorum private transactions will not be
supported info 12 09 18 22 24 035 maximum peer count eth 50 les 0 total 50 info 12 09 18 22 24 063 set global gas cap cap 25000000 info 12 09 18 22 24 064 allocated cache and file handles database users brendanplayford masa masa node v1 0 node data geth chaindata cache 16 00mib handles 16 info 12 09 18 22 24 135 writing custom genesis block info 12 09 18 22 24 140 persisted trie from memory database nodes 7 size 1 02kib time 280 583 s gcnodes 0 gcsize 0 00b gctime 0s livenodes 1 livesize 0 00b info 12 09 18 22 24 141 successfully wrote genesis state database chaindata hash 69b521 fb4c77 info 12 09 18 22 24 141 allocated cache and file handles database users brendanplayford masa masa node v1 0 node data geth lightchaindata cache 16 00mib handles 16 info 12 09 18 22 24 204 writing custom genesis block info 12 09 18 22 24 205 persisted trie from memory database nodes 7 size 1 02kib time 162 437 s gcnodes 0 gcsize 0 00b gctime 0s livenodes 1 livesize 0 00b info 12 09 18 22 24 205 successfully wrote genesis state database lightchaindata hash 69b521 fb4c77 set your node identity set an identity for your node on the masa protocol so it can be easily identified in a list of peers for example we name our node masamoonnode setting the flag identity masamoonnode will set up an identity for your node so it can be identified as masamoonnode in a list of peers update your flag identity masamoonnode to be unique start the node in the node directory start the node by running the following command private config ignore geth identity masamoonnode datadir data bootnodes enode 91a3c3d5e76b0acf05d9abddee959f1bcbc7c91537d2629288a9edd7a3df90acaa46ffba0e0e5d49a20598e0960ac458d76eb8fa92a1d64938c0a3a3d60f8be4 54 158 188 182 21000 emitcheckpoints istanbul blockperiod 10 mine miner threads 1 syncmode full verbosity 5 networkid 190260 rpc rpccorsdomain rpcvhosts rpcaddr 127 0 0 1 rpcport 8545 rpcapi admin db eth debug miner net shh txpool personal web3 quorum istanbul port 30300 additional bootnodes masa operates several bootnodes one is already included in the command above by default if you are having issues connecting to the bootnode please use an alternative from the list below we are also looking for community run bootnodes to add to our list please reach out to us on discord or submit a pr to this repo if you want to add a bootnode to the community list masa bootnodes enode ac6b1096ca56b9f6d004b779ae3728bf83f8e22453404cc3cef16a3d9b96608bc67c4b30db88e0a5a6c6390213f7acbe1153ff6d23ce57380104288ae19373ef 54 146 254 245 21000 enode 91a3c3d5e76b0acf05d9abddee959f1bcbc7c91537d2629288a9edd7a3df90acaa46ffba0e0e5d49a20598e0960ac458d76eb8fa92a1d64938c0a3a3d60f8be4 54 158 188 182 21000 enode d87c03855093a39dced2af54d39b827e4e841fd0ca98673b2e94681d9d52d2f1b6a6d42754da86fa8f53d8105896fda44f3012be0ceb6342e114b0f01456924c 34 225 220 240 21000 enode fcb5a1a8d65eb167cd3030ca9ae35aa8e290b9add3eb46481d0fbd1eb10065aeea40059f48314c88816aab2af9303e193becc511b1035c9fd8dbe97d21f913b9 52 1 125 71 21000 community bootnodes enode 571be7fe060b183037db29f8fe08e4fed6e87fbb6e7bc24bc34e562adf09e29e06067be14e8b8f0f2581966f3424325e5093daae2f6afde0b5d334c2cd104c79 142 132 135 228 21000 enode 269ecefca0b4cd09bf959c2029b2c2caf76b34289eb6717d735ce4ca49fbafa91de8182dd701171739a8eaa5d043dcae16aee212fe5fadf9ed8fa6a24a56951c 65 108 72 177 21000 submit a pr to add a bootnode to the community list here https github com masa finance masa node v1 0 pulls node syncing it can take some time for your node to fully sync to the masa testnet 2 0 please be patient
while your node catches up with the most recent blocks node ui you must be running docker to run the node ui with geth specification react js typescript docker for deployment run the masa node ui follow these instructions to run the node ui with geth cd masa node v1 0 cd src cd ui docker compose up ui navigate to your local host to interact with the masa node http localhost 3000
blockchain golang node
blockchain
WymaTimesheetWebApp
wymatimesheetwebapp wyma an engineering company in hornby has contacted our class with respect to a project ripe for a scholarship program it will be an app for android and potentially ios that will help the people in their factory sign into the premises without having to manually input their data onto one singular computer at the start and the end of the day they have previously had issues with using manual timesheets in job costing software and have recently implemented a system that works mostly for design engineers for use on local pcs but the biggest issue is for the people on the main factory floor who don t have access to a pc and are still stuck manually signing in at a single sign in point this is where we can come in i believe we can create an android app that will allow the people working on the factory floor to sign in from anywhere on the premises ultimately removing the bottleneck of signing in from one point improving work effectiveness and making sure everyone can get to work as soon as they arrive on the premises
os
openplayground
openplayground welcome to openplayground https openplayground groundedai company openplayground https openplayground groundedai company is a web application that allows you to experiment with large language models compare different models from different providers and run them on your own data the goal of openplayground is to make it easier for people to start benefiting from language models in their work with openplayground you can explore a variety of pre trained language models from different providers compare the performance of different models on a variety of tasks fine tune a model to your own data to improve its performance run the model on your own data to generate predictions you can use the hosted version of openplayground at https openplayground groundedai company or you can download and run the open source version from github openplayground is open for use and contribution join the community and start using language models in your work today installation clone the repository install the dependencies and start the server bash npm install bash npm start license openplayground is licensed under the gnu general public license v3 0 see the license license file for details
large-language-models machine-learning mlops nlp
ai
Spock
spock https www spock li build status https github com agrafix spock actions workflows haskell yml badge svg hackage https img shields io hackage v spock svg http hackage haskell org package spock hackage spock core https img shields io hackage v spock core svg http hackage haskell org package spock core intro another haskell web framework for rapid development to get started with spock check our tutorial https www spock li tutorial or take a look at our example project funblog https github com agrafix funblog mailing list please join our mailing list at haskell spock googlegroups com features another haskell web framework for rapid development this toolbox provides everything you need to get a quick start into web hacking with haskell fast typesafe routing middleware json sessions cookies database helper csrf protection typesafe contexts
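as a quick taste of the toolbox, here is a minimal hello world adapted from the tutorial linked above rather than taken from this readme; the port and route are illustrative

```haskell
{-# LANGUAGE OverloadedStrings #-}
import Web.Spock
import Web.Spock.Config

-- no session, no database connection, no custom state
main :: IO ()
main = do
  spockCfg <- defaultSpockCfg () PCNoDatabase ()
  runSpock 8080 (spock spockCfg app)

app :: SpockM () () () ()
app = get root (text "Hello, Spock!")
```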
important links tutorial https www spock li tutorial rest api tutorial https www spock li tutorials rest api type safe routing in spock https www spock li 2015 04 19 type safe routing html taking authentication to the next level https www spock li 2015 08 23 taking authentication to the next level html talks english zurihac 2016 spock powerful elegant web applications https www youtube com watch v b oz6y n y by alexander thiemann english spock powerful elegant web applications using haskell https www youtube com watch v knqsobrcblo by alexander thiemann english beginning web programming in haskell using spock https www youtube com watch v gobpigl9jj4 by ollie charles german moderne typsichere web entwicklung mit haskell https dl dropboxusercontent com u 15078797 talks typesafe webdev 2015 pdf by alexander thiemann german reroute talk https github com timjb reroute talk by tim baumann candy extensions the following spock extensions exist background workers for spock spock worker http hackage haskell org package spock worker digestive functors for spock spock digestive http hackage haskell org package spock digestive lucid for spock spock lucid http hackage haskell org package spock lucid works well with spock user management users http hackage haskell org package users data validation validate input http hackage haskell org package validate input blaze bootstrap helpers blaze bootstrap http hackage haskell org package blaze bootstrap digestive forms bootstrap helpers digestive bootstrap http hackage haskell org package digestive bootstrap ssl https if you d like to use your application via https there are two options use nginx haproxy as reverse proxy in front of the spock application convert the spock application to a wai application using spockasapp then use the warp tls package to run it notes since version 0 11 0 0 spock drops simple routing in favor of typesafe routing and drops safe actions in favor of the usual way of csrf protection with a token since version 0 7 0 0 spock supports typesafe routing if you wish to continue using the untyped version of spock you can use web spock simple the routing is implemented in a separate haskell package called reroute since version 0 5 0 0 spock is no longer built on top of scotty the design and interface is still influenced by scotty but the internal implementation differs from scotty s thanks to tim baumann https github com timjb lots of help with typesafe routing tom nielsen https github com glutamate much feedback and small improvements and all other awesome contributors https github com agrafix spock graphs contributors hacking pull requests are welcome please consider creating an issue beforehand so we can discuss what you would like to do code should be written in a consistent style throughout the project avoid whitespace that is sensitive to conflicts e g alignment of signs in function definitions note that by sending a pull request you agree that your contribution can be released under the bsd3 license as part of the spock package or related packages misc officially supported ghc versions 8 10 7 8 8 4 license released under the bsd3 license c 2013 2021 alexander thiemann
haskell spock web framework functional http server api webframework
front_end
ToolBox-ComputerVision
toolbox computervision image processing and computer vision project toolbox starter code full instructions at https sites google com site sd16spring home project toolbox image processing
ai
simplifying-web-development-with-accessibility-best-practices-2883015
simplifying web development with accessibility best practices this is the repository for the linkedin learning course simplifying web development with accessibility best practices the full course is available from linkedin learning lil course url simplifying web development with accessibility best practices lil thumbnail url too often in the world of web development accessibility is given a low level of priority in the development stage of a site and is often relegated to experts for later implementation but why spend the time and money hiring an outside consultant when you can set up a website for proper accessibility at the build stage in this course morten rand hendriksen shows the benefits of taking this approach and how simple it is to do so as morten explains all that s needed is a basic understanding of why these elements are so important how they work and how good coding practices and modern web standards can get you there with little extra work he tackles accessibility from all sides starting with the origins of some of the most common accessibility issues and how to address them then covers key topics like accessible design hiding and showing visual content handling graphics and media and the semantic elements that can help make your designs more accessible installing the exercise files are stand alone examples in html css and javascript you do not need any additional software to run them and you are free to copy the code and use it in your own projects instructor morten rand hendriksen developer and senior staff instructor check out my other courses on linkedin learning https www linkedin com learning instructors morten rand hendriksen lil course url https www linkedin com learning simplifying web development with accessibility best practices lil thumbnail url https cdn lynda com course 2883015 2883015 1622152000595 16x9 jpg
front_end
FreeRTOS
freertos from analog devices this repository contains a copy of the freertos real time operating system with additional components from analog devices all freertos source code is provided as is unpacked and unmodified from the original freertos source zip repositories unless explicitly stated why is the master branch empty this repository is used to provide users of analog devices dsp and microcontrollers with an easy way to access the ports of freertos for their processors the branches in this repository relate to specific releases of freertos from freertos org 9 0 0 10 0 0 etc with tags indicating formal releases of the additional analog devices content rel frtos adi 1 2 0 etc the following releases from analog devices are currently available in this repository 1 2 0 based on freertos v9 0 0 support for aducm302x aducm4x5x adsp bf7xx and adsp sc5xx cortex a processors 1 5 1 based on freertos v10 0 x support for adsp bf7xx and adsp sc5xx cortex a and sharc processors 2 0 0 based on freertos v10 4 x support for adsp bf7xx and adsp sc5xx cortex a and sharc processors getting started please refer to the readme md document on the release branch or see the adi freertos wiki https wiki analog com resources tools software freertos support support for the analog devices freertos ports is primarily provided via the analog devices engineer zone forum https ez analog com community dsp software and development tools freertos licensing licensing is release specific please refer to the license md file on each branch for more details
os
flashlight
circleci https circleci com gh flashlight flashlight svg style shield https app circleci com pipelines github flashlight flashlight documentation status https img shields io readthedocs fl svg https fl readthedocs io en latest docker image build status https img shields io github workflow status flashlight flashlight publish 20docker 20images label docker 20image 20build https hub docker com r flml flashlight tags join the chat at https gitter im flashlight ml community https img shields io gitter room flashlight ml community https gitter im flashlight ml community utm source badge utm medium badge utm campaign pr badge utm content badge codecov https codecov io gh flashlight flashlight branch master graph badge svg token rbp4ailmc0 https codecov io gh flashlight flashlight docker image for cuda backend https img shields io docker image size flml flashlight cuda latest label docker 20 28cuda 29 logo docker https hub docker com r flml flashlight tags page 1 ordering last updated name cuda latest docker image for cpu backend https img shields io docker image size flml flashlight cpu latest label docker 20 28cpu 29 logo docker https hub docker com r flml flashlight tags page 1 ordering last updated name cpu latest install cuda backend with vcpkg https img shields io badge dynamic json color orange label get 20 28cuda 29 query name url https 3a 2f 2fraw githubusercontent com 2fmicrosoft 2fvcpkg 2fmaster 2fports 2fflashlight cuda 2fvcpkg json prefix vcpkg 20install 20 https vcpkg info port flashlight cuda install cpu backend with vcpkg https img shields io badge dynamic json color orange label get 20 28cpu 29 query name url https 3a 2f 2fraw githubusercontent com 2fmicrosoft 2fvcpkg 2fmaster 2fports 2fflashlight cpu 2fvcpkg json prefix vcpkg 20install 20 https vcpkg info port flashlight cpu flashlight is a fast flexible machine learning library written entirely in c from the facebook ai research and the creators of torch tensorflow eigen and deep speech its core features include total internal modifiability including internal apis for tensor computation flashlight fl tensor readme md a small footprint with the core clocking in at under 10 mb and 20k lines of c high performance defaults featuring just in time kernel compilation with modern c via the arrayfire https github com arrayfire arrayfire tensor library an emphasis on efficiency and scale native support in c and simple extensibility makes flashlight a powerful research framework that enables fast iteration on new experimental setups and algorithms with little unopinionation and without sacrificing performance in a single repository flashlight provides apps https github com flashlight flashlight tree master flashlight app for research across multiple domains automatic speech recognition https github com flashlight flashlight tree master flashlight app asr formerly wav2letter https github com flashlight wav2letter project documentation flashlight app asr tutorial flashlight app asr tutorial image classification flashlight app imgclass object detection flashlight app objdet language modeling flashlight app lm project layout flashlight is broken down into a few parts flashlight lib flashlight lib contains kernels and standalone utilities for audio processing and more flashlight fl flashlight fl is the core tensor interface and neural network library using the arrayfire https github com arrayfire arrayfire tensor library by default flashlight pkg flashlight pkg are domain packages for speech vision and text built on the core flashlight app 
flashlight app are applications of the core library to machine learning across domains quickstart first build and install flashlight building and installing and link it to your own project building your own project with flashlight sequential https fl readthedocs io en latest modules html sequential forms a sequence of flashlight module https fl readthedocs io en latest modules html module s for chaining computation details summary implementing a simple convnet is easy summary c include flashlight fl flashlight h sequential model model add view fl shape im dim im dim 1 1 model add conv2d 1 input channels 32 output channels 5 kernel width 5 kernel height 1 stride x 1 stride y paddingmode same padding mode paddingmode same padding mode model add relu model add pool2d 2 kernel width 2 kernel height 2 stride x 2 stride y model add conv2d 32 64 5 5 1 1 paddingmode same paddingmode same model add relu model add pool2d 2 2 2 2 model add view fl shape 7 7 64 1 model add linear 7 7 64 1024 model add relu model add dropout 0 5 model add linear 1024 10 model add logsoftmax performing forward and backward computation is straightforward c auto output model forward input auto loss categoricalcrossentropy output target loss backward details see the mnist example https fl readthedocs io en latest mnist html for a full tutorial including a training loop and dataset abstractions variable https fl readthedocs io en latest variable html is a tape based abstraction that wraps flashlight tensors https github com flashlight flashlight blob main flashlight fl tensor tensorbase h tape based automatic differentiation in flashlight https fl readthedocs io en latest autograd html is simple and works as you d expect details summary autograd example summary c auto a variable fl rand 1000 1000 true calcgrad auto b 2 0 a auto c 1 0 b auto d log c d backward populates a grad along with gradients for b c and d details
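putting the two snippets together, a sketch of a single training step in the style of the mnist tutorial linked above; the optimizer choice and learning rate are assumptions based on that tutorial, not taken from this readme

```cpp
#include "flashlight/fl/flashlight.h"

// one illustrative gradient step; model, input and target are assumed to
// exist as in the snippets above (names follow the mnist tutorial)
void trainStep(fl::Sequential& model, const fl::Variable& input,
               const fl::Variable& target) {
  fl::SGDOptimizer opt(model.params(), 0.01 /* learning rate, illustrative */);

  opt.zeroGrad();
  auto output = model.forward(input);
  auto loss = fl::categoricalCrossEntropy(output, target);
  loss.backward();  // populate gradients of the loss w.r.t. the parameters
  opt.step();       // apply the update
}
```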
building and installing install with vcpkg library installation with vcpkg with docker building and running flashlight with docker from source building from source from source with vcpkg from source build with vcpkg build your project with flashlight building your own project with flashlight requirements at minimum compilation requires a c compiler with good c 17 support e g gcc g 7 cmake https cmake org version 3 10 or later and make a linux based operating system see the full dependencies list for more details if building from source building from source instructions for building installing python bindings can be found here bindings python readme md flashlight build setups flashlight can be broken down into several components as described above project layout each component can be incrementally built by specifying the correct build options build options there are two ways to work with flashlight 1 as an installed library that you link to with your own project this is best for building standalone applications dependent on flashlight 2 with in source development where the flashlight project source is changed and rebuilt this is best if customizing hacking the core framework or the flashlight provided app binaries flashlight app flashlight can be built in one of two ways 1 with vcpkg installing flashlight with vcpkg a c package manager https github com microsoft vcpkg 2 from source building from source by installing dependencies as needed installing flashlight with vcpkg library installation with vcpkg flashlight is most easily built and installed with vcpkg both the cuda and cpu backends are supported with vcpkg for either backend first install intel mkl https software intel com content www us en develop tools oneapi base toolkit download html for the cuda backend install cuda 9 2 https developer nvidia com cuda downloads cudnn https docs nvidia com deeplearning cudnn install guide index html and nccl https docs nvidia com deeplearning nccl install guide index html then after installing vcpkg https github com microsoft vcpkg getting started install the libraries and core with shell vcpkg vcpkg install flashlight cuda cuda backend or vcpkg vcpkg install flashlight cpu cpu backend to install flashlight apps flashlight app check the features available for installation by running vcpkg search flashlight cuda or vcpkg search flashlight cpu each app is a feature for example vcpkg install flashlight cuda asr installs the asr app with the cuda backend below is the currently supported list of features for each of flashlight cuda https vcpkg info port flashlight cuda and flashlight cpu https vcpkg info port flashlight cpu flashlight cuda cpu lib flashlight libraries flashlight cuda cpu nn flashlight neural net library flashlight cuda cpu asr flashlight speech recognition app flashlight cuda cpu lm flashlight language modeling app flashlight cuda cpu imgclass flashlight image classification app flashlight app binaries flashlight app are also built for the selected features and are installed into the vcpkg install tree s tools directory integrating flashlight into your own project with a vcpkg flashlight installation is simple using vcpkg s cmake toolchain integration https vcpkg readthedocs io en latest examples installing and using packages cmake from source build with vcpkg first install the dependencies for your backend of choice using vcpkg click to expand the below details summary installing cuda backend dependencies with vcpkg summary to build the flashlight cuda backend from source using dependencies installed with vcpkg install cuda 9 2 https developer nvidia com cuda downloads cudnn https docs nvidia com deeplearning cudnn install guide index html nccl https docs nvidia com deeplearning nccl install guide index html and intel mkl https software intel com content www us en develop tools oneapi base toolkit download html then build the rest of the dependencies for the cuda backend based on which flashlight features you d like to build shell vcpkg install cuda intel mkl fftw3 cub kenlm if building flashlight libraries arrayfire cuda cudnn nccl openmpi cereal stb if building the flashlight neural net library gflags glog if building any flashlight apps libsndfile if building the flashlight asr app gtest optional if building tests details details summary installing cpu backend dependencies with vcpkg summary to build the flashlight cpu backend from source using dependencies installed with vcpkg install intel mkl https software intel com content www us en develop tools oneapi base toolkit download html then build the rest of the dependencies for the cpu backend based on which flashlight features you d like to build shell vcpkg install intel mkl fftw3 kenlm for flashlight libraries arrayfire cpu gloo mpi openmpi onednn cereal stb for the flashlight neural net library gflags glog for the flashlight runtime pkg any flashlight apps using it libsndfile for the flashlight speech pkg gtest optional for tests details build using the vcpkg toolchain file to build flashlight from source with these dependencies clone the repository shell git clone https github com flashlight flashlight git
cd flashlight mkdir p build cd build then build from source using vcpkg s cmake toolchain https github com microsoft vcpkg blob master docs users integration md cmake toolchain file recommended for open source cmake projects shell cmake dcmake build type release dfl build arrayfire on dcmake toolchain file path to your vcpkg clone scripts buildsystems vcpkg cmake make j nproc make install j nproc only if you want to install flashlight for external use to build a subset of flashlight s features see the build options build options below building from source to build from source first install the below dependencies dependencies most are available with your system s local package manager some dependencies marked below are downloaded and installed automatically if not found on the local system fl build standalone determines this behavior if disabled dependencies won t be downloaded and built when building flashlight once all dependencies are installed clone the repository shell git clone https github com flashlight flashlight git cd flashlight mkdir p build cd build then build all flashlight components with cmake dcmake build type release dfl build arrayfire on build options make j nproc make install setting the mklroot environment variable export mklroot opt intel oneapi mkl latest or export mklroot opt intel mkl on most linux based systems can help cmake find intel mkl if not initially found to build a smaller subset of flashlight features apps see the build options build options below for a complete list of options to install flashlight in a custom directory use cmake s cmake install prefix https cmake org cmake help v3 10 variable cmake install prefix html argument flashlight libraries can be built as shared libraries using cmake s build shared libs https cmake org cmake help v3 10 variable build shared libs html argument flashlight uses modern cmake and imported targets for most dependencies if a dependency isn t found passing d package dir to your cmake command or exporting package dir as an environment variable equal to the path to package config cmake can help locate dependencies on your system see the documentation https cmake org cmake help v3 10 command find package html for more details if cmake is failing to locate a package check to see if a corresponding issue https github com flashlight flashlight issues has already been created before creating your own minimal setup on macos on macos arrayfire can be installed with homebrew and the flashlight core can be built as follows brew install arrayfire cmake dfl arrayfire use opencl on dfl use onednn off dfl build tests off dfl build examples off dfl build scripts off dfl build distributed off make j nproc dependencies dependencies marked with are automatically downloaded and built from source if not found on the system setting fl build standalone to off disables this behavior dependencies marked with are required if building with distributed training enabled fl build distributed see the build options build options below distributed training is required for all apps dependencies marked with are installable via vcpkg see the instructions for installing those dependencies from source build with vcpkg above for doing a flashlight from source build table thead tr th component th th backend th th dependencies th tr thead tbody tr td rowspan 2 libraries td td cuda td td a href https developer nvidia com cuda downloads cuda a gt 9 2 a href https github com nvidia cub cub a if cuda lt 11 td tr tr td cpu td td a blas library a href https 
software intel com content www us en develop tools oneapi base toolkit download html intel mkl a gt 2018 openblas etc td tr tr td rowspan 3 core td td any td td a href https github com arrayfire arrayfire installation arrayfire a gt 3 7 3 an mpi library a href https www open mpi org openmpi a etc nbsp nbsp a href https github com uscilab cereal cereal a gt 1 3 0 a href https github com nothings stb stb a td tr tr td cuda td td a href https developer nvidia com cuda downloads cuda a gt 9 2 a href https developer nvidia com nccl nccl a a href https developer nvidia com cudnn cudnn a td tr tr td cpu td td a href https github com oneapi src onednn onednn a gt 2 5 2 a href https github com facebookincubator gloo gloo a a href https github com facebookincubator gloo blob 01e2c2660cd43963ce1fe3e21220ac01f07d9a4b docs rendezvous md using mpi with mpi a td tr tr td app all td td any td td a href https github com google glog google glog a a href https github com gflags gflags gflags a td tr tr td app asr td td any td td a href https github com libsndfile libsndfile libsndfile a gt 10 0 28 a blas library a href https software intel com content www us en develop tools oneapi base toolkit download html intel mkl a gt 2018 openblas etc a href https github com flashlight text flashlight text a td tr tr td app imgclass td td any td td td tr tr td app imgclass td td any td td td tr tr td app lm td td any td td a href https github com flashlight text flashlight text a td tr tr td tests td td any td td a href https github com google googletest google test gtest with gmock a gt 1 10 0 td tr tbody table build options the flashlight cmake build accepts the following build options prefixed with d when running cmake from the command line table thead tr th name th th options th th default value th th description th tr thead tbody tr td rowspan 2 fl build arrayfire td td on off td td on td td build flashlight with the arrayfire backend td tr tr td on off td td on td td downloads builds some dependencies if not found td tr tr td rowspan 3 fl build libraries td td on off td td on td td build the flashlight libraries td tr tr td on off td td on td td build the flashlight neural net library td tr tr td on off td td on td td build with distributed training required for apps td tr tr td fl build contrib td td on off td td on td td build contrib apis subject to breaking changes td tr tr td fl build apps td td on off td td on td td build applications see below td tr tr td fl build app asr td td on off td td on td td build the automatic speech recognition application td tr tr td fl build app imgclass td td on off td td on td td build the image classification application td tr tr td fl build app lm td td on off td td on td td build the language modeling application td tr tr td fl build app asr tools td td on off td td on td td build automatic speech recognition app tools td tr tr td fl build tests td td on off td td on td td build tests td tr tr td fl build examples td td on off td td on td td build examples td tr tr td fl build experimental td td on off td td off td td build experimental components td tr tr td cmake build type td td see a href https cmake org cmake help v3 10 variable cmake build type html docs a td td debug td td see the a href https cmake org cmake help v3 10 variable cmake build type html cmake documentation a td tr tr td cmake install prefix td td directory td td see a href https cmake org cmake help v3 10 variable cmake install prefix html docs a td td see the a href https cmake org cmake help v3 10 
variable cmake install prefix html cmake documentation a td tr tbody table building your own project with flashlight flashlight is most easily linked to using cmake flashlight exports the following cmake targets when installed flashlight flashlight contains flashlight libraries as well as the flashlight core autograd and neural network library flashlight fl pkg runtime contains flashlight core as well as common utilities for training logging flags distributed utils flashlight fl pkg vision contains flashlight core as well as common utilities for vision pipelines flashlight fl pkg text contains flashlight core as well as common utilities for dealing with text data flashlight fl pkg speech contains flashlight core as well as common utilities for dealing with speech data flashlight fl pkg halide contains flashlight core and extensions to easily interface with halide given a simple project cpp file that includes and links to flashlight c include iostream include flashlight fl flashlight h int main fl init fl variable v fl full 1 1 true auto result v 10 std cout tensor value is result tensor std endl 11 000 return 0 the following cmake configuration links flashlight and sets include directories cmake cmake minimum required version 3 10 set cmake cxx standard 17 set cmake cxx standard required on add executable myproject project cpp find package flashlight config required target link libraries myproject private flashlight flashlight with a vcpkg flashlight installation if you installed flashlight with vcpkg the above cmake configuration for myproject can be built by running shell cd project mkdir build cd build cmake dcmake toolchain file path to vcpkg clone scripts buildsystems vcpkg cmake dcmake build type release make j nproc with a from source flashlight installation if using a from source installation of flashlight flashlight will be found automatically by cmake shell cd project mkdir build cd build cmake dcmake build type release make j nproc if flashlight is installed in a custom location using a cmake install prefix passing dflashlight dir install prefix share flashlight cmake as an argument to your cmake command can help cmake find flashlight building and running flashlight with docker flashlight and its dependencies can also be built with the provided dockerfiles see the accompanying docker documentation docker for more information contributing and contact contact vineelkpratap fb com awni fb com jacobkahn fb com qiantong fb com antares fb com padentomasello fb com jcai fb com gab fb com vitaliy888 fb com locronan fb com flashlight is being very actively developed see contributing contributing md for more on how to help out acknowledgments some of flashlight s code is derived from arrayfire ml https github com arrayfire arrayfire ml citing you can cite flashlight https arxiv org abs 2201 12465 using misc kahn2022flashlight title flashlight enabling innovation in tools for machine learning author jacob kahn and vineel pratap and tatiana likhomanenko and qiantong xu and awni hannun and jeff cai and paden tomasello and ann lee and edouard grave and gilad avidov and benoit steiner and vitaliy liptchinsky and gabriel synnaeve and ronan collobert year 2022 eprint 2201 12465 archiveprefix arxiv primaryclass cs lg license flashlight is under an mit license see license license for more information
flashlight machine-learning autograd cpp deep-learning neural-network ml
ai
coremob-ucr
content from this document has been folded into the coremob 2012 repository https github com coremob coremob 2012
front_end
ui_dynamo
uidynamo uidynamo is a development companion for developing ui components in isolation increasing developer productivity exponentially documentation create beautiful documentation for your components all in flutter the app is just a platform application that you can distribute anywhere for your team to interact with storyboarding storyboard all of your app screens at once in a pan and zoom environment or create custom routing flows to enable easy iteration device toolbar preview anything in this tool in light mode dark mode different languages screen rotation different display sizes and more flexibility use this tool on your smartphone tablet desktop mac web browser chrome or wherever flutter can run more platforms will be on the way dynamo themes automatically to your application but feel free to use material theming to adjust the look and feel props use the props editor to see changes to your components live within the documentation actions use actions to display logs of component actions localizations display translations for your application and see them in different languages plugins add plugins to appear in the plugins bar or behind the scenes to provide greater flexibility in your workflow preview assets preview png preview2 assets preview 2 png getting started there are two ways to add the library to your project project dependency adding the library as a project dependency will place it within your code as a library add this line to your pubspec yaml to add the library yaml dependencies ui dynamo xx xx xx replace with current version add a folder in your project called dynamo place a main dart file that you will run uidynamo from in your ide add the file as a run configuration separating dynamo from your main application separate project you can alternatively create a new flutter module that depends on your main application this way you can isolate dynamo completely from your application code add a module called dynamo add this line to your pubspec yaml in the dynamo module yaml dependencies ui dynamo xx xx xx app path or application path setting up uidynamo in your main lib main dart of your application adjust your app component to export its materialapp as a function dart materialapp buildapp materialapp title flutter demo theme themedata this is needed to use the main app in dynamo in your dynamo main dart first add the run configuration and appdynamo dart void main runapp appdynamo class appdynamo extends statelesswidget const appdynamo key key super key key override widget build buildcontext context return dynamo withapp buildapp data dynamodata defaultdevice devicesizes iphonex add your dynamo data by default uidynamo will traverse your application routes creating a storyboard and routes folder in the nav bar home is just a placeholder and configurable default setup assets default setup png you can add 3 kinds of items in the sidebar 1 storyboard a pinch zoom and pannable experience that allows you to easily add custom flows within your application all on one screen dart dynamopage storyboard title your title flowmapping home home company flowmapping a key value list mapping that specifies flows that display on screen from left to right if you specify multiple each mapping goes from top to bottom in order storyboard example assets storyboard example png 2 folder a collapsible section that contains more items can nest as many as you like dart dynamofolder of title widgets pages buildtextstylepage buildbuttonspage buildtoastpage buildradiospage buildinputspage folder example
assets folder example png 3 page a single focusable page that you can preview simplest builder is dart dynamopage of title title icon icon icons home child context mywidget specify the title icon and child builder the child builder only runs if the widget is on screen also we support a list dart dynamopage list title alerts icon icon icons error children context organization container title text network alert children widget networkalertdialog onokpressed context actions onpressed alert ok button organization container title text confirmation dialog children widget confirmationalertdialog title context props text title are you sure you want to get pizza group confirmationgroup content context props text content you can always order later group confirmationgroup onyespressed context actions onpressed alert yes button onnopressed context actions onpressed alert no button title icon and children builder the children builder is only used when on screen as well the list builder just displays content in a list ui documentation widgets uidynamo comes with a few widgets to support documentation presentationwidget centers constrains and adds padding around the edges of the view use organization presentation expandablewidgetsection displays content in an expandable container to keep documentation small and tidy use organization expandable expandable example assets expandable example png widgetcontainer wraps content in a card with a title and visual separation use organization container container example assets container example png proptable a table useful for displaying documentation for props on a widget setting up the table is not quite automatic yet using reflection in the future could automate it dart proptable items proptableitem name message description displays a message for this toast proptableitem name mode description displays a different ui mode defaultvalue toastmode success tostring proptableitem name onclose description closes the toast before the scheduled timeout defaultvalue null proptable assets prop table example png adding props configuring actions localizations custom plugins maintainers andrew grosner agrosner https www github com agrosner
front_end
HybridMobile-App-using-Ionic-Framework
hybridmobileapp developing hybridmobile app using ionic framework 2015 07
front_end
Spam-Filtering-techniques
spam filtering techniques employing various machine learning and natural language processing techniques for spam filtering
ai
ComputerVision
computervision this repository only contains implementations for neural style transfer neural style transfer to use the neural style transfer implementation run in google colab https colab research google com github ldfrancis computervision blob master neuralstyletransfer easynst 20on 20google 20colab ipynb python from neuralstyletransfer import implementnts as nst set the image dimension and train and generate an image python nst setimagedim 400 300 nst run num iteration 1000 the setimagedim function in the module implementnts takes 2 arguments width the width of the images used height the height of the images the run function in the module implementnts takes 3 arguments num iteration number of iterations to train for content image the path to the content image of size width 400 height 300 style image the path to the style image of size width 400 height 300 requirement tensorflow numpy scipy
ai
react-express
react express starter pack create full stack apps with react and express run your client and server with a single command redux version this version does not include redux click here for redux version https github com bradtraversy react redux express starter quick start bash install dependencies for server npm install install dependencies for client npm run client install run the client server with concurrently npm run dev run the express server only npm run server run the react client only npm run client server runs on http localhost 5000 and client on http localhost 3000 app info author brad traversy traversy media http www traversymedia com version 1 0 0 license this project is licensed under the mit license
server
micro-app
p align center a href https micro zoe github io micro app img src https zeroing jd com micro app media logo png alt logo width 200 a p p align center a href https www npmjs com package micro zoe micro app img src https img shields io npm v micro zoe micro app svg alt version a a href https www npmjs com package micro zoe micro app img src https img shields io npm dt micro zoe micro app svg alt downloads a a href https github com micro zoe micro app blob master license img src https img shields io npm l micro zoe micro app svg alt license a a href https github com micro zoe micro app blob dev contact md img src https img shields io badge chat wechat blue alt wechat a a href https travis ci com github micro zoe micro app img src https api travis ci com micro zoe micro app svg branch master alt travis a a href https coveralls io github micro zoe micro app branch master img src https coveralls io repos github micro zoe micro app badge svg branch master alt coveralls a p english readme zh cn md documentation https micro zoe github io micro app discussions https github com micro zoe micro app discussions wechat contact md introduction micro app is a micro front end framework launched by jd retail it renders based on a webcomponent like approach and realizes the micro front end from component thinking aiming to reduce the difficulty of getting started and improve work efficiency it is the lowest cost framework for accessing a micro front end and provides a series of complete functions such as js sandbox style isolation element isolation preloading resource address completion plugin system data communication and so on micro app has no restrictions on the front end framework and any framework can be used as a base application to embed any type of micro application built with any framework how to use base application 1 install bash yarn add micro zoe micro app 2 import at the entrance js main js import microapp from micro zoe micro app microapp start 3 use components in page html my page vue template name is the app name url is the app address micro app name my app url http localhost 3000 micro app template sub application set cross domain support in the headers of webpack dev server js devserver headers access control allow origin the above micro front end rendering is completed and the effect is as follows image https img10 360buyimg com imagetools jfs t1 188373 14 17696 41854 6111f4a0e532736ba 4b86f4f8e2044519 png more detailed configuration can be viewed documentation https micro zoe github io micro app docs html zh cn start
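data communication is listed among the features above; the following is a minimal hedged example of how the base and sub application exchange data, based on the micro app documentation, where the app name my-app matches the example above

```js
// base application: push data to the sub-app registered as "my-app"
import microApp from '@micro-zoe/micro-app'

microApp.setData('my-app', { type: 'new task' })

// sub application: receive data sent down from the base application
window.microApp.addDataListener((data) => {
  console.log('data from base app', data)
})

// sub application: send data back up to the base application
window.microApp.dispatch({ type: 'task done' })
```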
support proxy browser compatibility can be viewed can i use https caniuse com search proxy the general is as follows desktop except ie browser other browsers are basically compatible mobile ios10 android5 details details summary must micro applications support cross domain summary yes if it is a development environment you can set headers in webpack dev server to support cross domain js devserver headers access control allow origin if it is a production environment you can support cross domain through configuration nginx https segmentfault com a 1190000012550346 details details summary does it support vite summary yes please see adapt vite https micro zoe github io micro app docs html zh cn framework vite for details details details summary does it support ssr summary yes please see nextjs https micro zoe github io micro app docs html zh cn framework nextjs nuxtjs https micro zoe github io micro app docs html zh cn framework nuxtjs for details details contributors a href https github com micro zoe micro app graphs contributors img src https contrib rocks image repo micro zoe micro app a license mit license https github com micro zoe micro app blob master license
javascript micro-frontend microapp webcomponents
front_end
webcomponentsshift
webcomponentsshift com google io 2013 session https developers google com events io sessions 318907648
front_end
wda-curriculum
wda apprentice curriculum web development apprenticeship curriculum and learning materials structure the repo is set out as a reference tool grouping documents by type todo add navigation and a new theme to the front end site 1 lessons lesson plans for group training delivered by an experienced web development professional 2 workshops specialist training from experts in the field workflow we are constantly reviewing adding and updating this content every trainer will be required to review the content of their forthcoming session 2 weeks before delivery please also add any new links or edits that come out of delivering the session and create a new pr for them this ensures all our content is as current and relevant as possible 1 checkout the appropriate branch 2 complete edits and amends if there is nothing to add go to the next step 3 update the last reviewed info 4 push the branch 5 create a pr and tag or ping one of the team for a review and merge
front_end
moapd2022
mobile app development bsc 2022 this course gives a fundamental overview of android programming concepts and the best practices for mobile app development description mobile app development has seen significant growth in the recent past mainly due to the current computational power of modern tablets and mobile phones the development of mobile applications brings a set of different challenges to the developer such as where the application will run hardware specifications and how the application performs when running os specifications this course provides fundamental knowledge on how to develop android applications using both the java and kotlin programming languages and introduces the following topics the android application lifecycle the four different types of android components namely 1 activities 2 services 3 broadcast receivers and 4 content providers the design of user interfaces ui using layouts resources and a set of android ui controls e g textview edittext button checkbox progressbar among others how to share data between android components how to persist data using files and databases and how to manage the internal and external file storage the use of concurrency to improve speed and performance in android applications the development of multimedia applications using the built in camera and audio resources the use of geolocation information to develop location aware android applications the use of device sensors motion position environment and advanced sensors to collect additional information for android applications and the security aspects of android deployment to make an android application safer formal prerequisites the student must be familiar with at least one object oriented programming language such as java highly recommended c c objective c the student must be able to design implement and test medium sized object oriented programs as covered at swu these background skills are achieved by completing one of the following courses in the 1st and 2nd semester of the bsc in software development bswu such as introductory programming with project algorithms and data structure or similar experience with the kotlin programming language will be an advantage but is not compulsory knowledge of functional programming is not a requisite for this course intended learning outcomes after the course the student should be able to design and implement a non trivial android application identify the different types of android components their purpose and lifecycle and their role in the mobile application pipeline describe the android os architecture and how to build run and debug a mobile app design mobile applications based on the limitations and resources of mobile platforms design and implement mobile applications using different resources such as communication and multimedia local and remote data persistence location aware applications built in sensors and mobile concurrency plan and execute the deployment of an android application using android studio and develop native android applications using the kotlin programming language learning activities the course spans 14 weeks with two hour lectures followed by two hour exercise sessions per week each week students will read the weekly reading material develop an individual exercise based on the week s content and solve an optional challenge designed to further the student s understanding of mobile app development the students must install the android development environment used in the course on their laptops the installation guides for macos windows and linux will be available on learnit the apps presented in the lectures and the apps developed in the exercise sessions can be tested on standard android phones older versions ok or using the virtual device available in android studio the following table presents the lecture plan week date lecture 01 01 02 2022 getting started lecture01 02 08 02 2022 introduction to kotlin for mobile development lecture02 03 15 02 2022 app resources and basic user interfaces lecture03 04 22 02 2022 advanced user interfaces lecture04 05 01 03 2022 flexible data view and threads lecture05 06 08 03 2022 android backend with google firebase lecture06 07 15 03 2022 storage in android part 01 lecture07 08 22 03 2022 storage in android part 02 lecture08 09 29 03 2022 geolocation based applications lecture09 10 05 04 2022 using react and flutter to build android apps lecture10 11 19 04 2022 android multimedia and machine learning lecture11 12 26 04 2022 android sensors lecture12 13 03 05 2022 safety in android apps 14 10 05 2022 introduction to mobile game development mandatory activities the course includes two mandatory assignments the students will develop a single android application that covers most of the topics presented the application and reports must be handed in and approved to be eligible for the final oral examination feedback students will receive feedback on their reports and solutions and on whether they have passed the mandatory activities students who do not pass the mandatory assignment will possibly get a second attempt to resubmit the revised solutions before the final exam details about submission and resubmission will be available on learnit the student will receive the grade na not approved at the ordinary exam if the mandatory activities are not approved and the student will use an exam attempt course literature android programming the big nerd ranch guide 4th edition by kristin marsicano brian gardner bill phillips and chris stewart published aug 27 2019 by big nerd ranch guides isbn 10 0135245125 isbn 13 978 0135245125 kotlin programming the big nerd ranch guide 1st edition by josh skeen and david greenhalgh published jul 05 2018 by big nerd ranch guides isbn 10 0135161630 isbn 13 978 0135161630 student activity budget estimated distribution of learning activities for the typical student preparation for lectures and exercises 14 lectures 28 exercises 28 assignments 15 exam with preparation 15 ordinary exam exam type b oral exam external 7 point scale exam variation b1h oral exam with time for preparation home exam duration for the preparation preparation time at home 1 week the oral examination will be individual we will publish on learnit after correcting the mandatory assignment a list with consecutive exam slots the exam will take about 20 minutes in total these will be the steps of the exam draw randomly one question from the examiners table i e the main question and write your outline on the whiteboard blackboard the student may bring a sheet of a4 paper with the exam plan e g an outline in bullet form keywords and one or two figures and take a brief look at this paper about 30 seconds after drawing the question then the student should put the paper aside and not look at it again during the exam the student will have a maximum of 8 minutes to talk about the selected topic while the student speaks some questions may be asked the examiners will ask additional questions within the topic or other topics within the curriculum which includes topics on the slides weekly work plans exercises mandatory assignments and the reading material grading and feedback we strongly recommend that the student prepare and plan the content of each topic of the course the student must start the presentation by giving the examiners a brief overview of the topic and then focus on the more exciting and substantial parts of the questions leave out trivial subjects be sure to be as precise as possible and to use the correct terminology for android development exam duration per student for the oral exam 20 minutes time and date ordinary exam mon 27 jun 2022 09 00 20 55 ordinary exam tue 28 jun 2022 09 00 20 55 reexam mon 22 aug 2022 09 00 20 55
front_end
taxi-data-etl-project
nyc taxi data etl project introduction this project builds an end to end data engineering pipeline for analyzing taxi trip data in new york city from january 2023 to may 2023 the project repository consists of the raw data in partitioned parquet format python code files and notebooks for the etl pipelines in mage data model and architecture diagrams a dockerfile and commands to run on a gcp vm sql queries for analytics in bigquery and a screenshot of the final analytics dashboard in google looker project architecture img src assets architecture jpg technology used programming language python pyspark big data tool apache spark devops tool docker cloud services google cloud storage google cloud compute engine vm instances bigquery google looker studio etl pipeline mage https www mage ai the nyc tlc dataset the data set is publicly available at the https www nyc gov site tlc about tlc trip record data page the webpage contains monthly trip data for yellow and green taxis in new york city and records information such as pick up and drop off time and location trip distances fare components rate and payment types etc the downloaded datasets are in parquet format the data dictionary can be found at https www nyc gov assets tlc downloads pdf data dictionary trip records yellow pdf star schema the transformation logic for the data mainly utilizes a star schema design to split the raw data into fact and dimension tables as presented in the diagram below img src assets star schema png dashboard the data analytics dashboard is built in google looker by connecting to the bigquery data warehouse in this project as presented in the screenshot below img src assets dashboard screenshot png
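as a rough illustration of the star schema transformation described above, here is a minimal pyspark sketch that splits raw trip data into one dimension table and a fact table. the file paths and the exact column names (payment_type, trip_distance, fare_amount, total_amount) are assumptions for illustration, loosely following the tlc data dictionary, not code from this repository:

# minimal sketch of the fact/dimension split; paths and column names are assumed
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("taxi-etl-sketch").getOrCreate()

# raw partitioned parquet files (hypothetical path)
raw = spark.read.parquet("data/yellow_tripdata_2023-*.parquet")

# dimension table: distinct payment types with a surrogate key
payment_dim = (raw.select("payment_type").distinct()
                  .withColumn("payment_type_id", F.monotonically_increasing_id()))

# fact table: measures plus the foreign key into the dimension
fact_trips = (raw.join(payment_dim, on="payment_type", how="left")
                 .select("payment_type_id", "trip_distance",
                         "fare_amount", "total_amount"))

fact_trips.write.mode("overwrite").parquet("output/fact_trips")

the same pattern would repeat for the other dimensions (rate codes, locations, datetimes) before the tables are loaded into bigquery.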
cloud
python_blockchain_app
python blockchain app a simple tutorial for developing a blockchain application from scratch in python what is blockchain how is it implemented and how does it work please read the step by step implementation tutorial https gist github com satwikkansal 4a857cad2797b9d199547a752933a715 to get your answers instructions to run clone the project sh git clone https github com satwikkansal python blockchain app git install the dependencies sh cd python blockchain app pip install r requirements txt start a blockchain node server sh export flask app node server py flask run port 8000 for windows users set lang c utf 8 set flask app node server py flask run port 8000 one instance of our blockchain node is now up and running at port 8000 run the application on a different terminal session sh python run app py for windows users set lang c utf 8 set flask app run app py flask run port 5000 the application should be up and running at http localhost 5000 here are a few screenshots 1 posting some content image png https github com satwikkansal python blockchain app raw master screenshots 1 png 2 requesting the node to mine image png https github com satwikkansal python blockchain app raw master screenshots 2 png 3 resyncing with the chain for updated data image png https github com satwikkansal python blockchain app raw master screenshots 3 png to play around spin off multiple custom nodes and use the register with endpoint to register a new node here s a sample scenario that you might want to try sh make sure you set the flask app environment variable to node server py before running these nodes already running flask run port 8000 spinning up new nodes flask run port 8001 flask run port 8002 you can use the following curl requests to register the nodes at ports 8001 and 8002 with the node already running at 8000 sh curl x post http 127 0 0 1 8001 register with h content type application json d node address http 127 0 0 1 8000 sh curl x post http 127 0 0 1 8002 register with h content type application json d node address http 127 0 0 1 8000 this will make the node at port 8000 aware of the nodes at ports 8001 and 8002 and make the newer nodes sync the chain with node 8000 so that they are able to actively participate in the mining process post registration to update the node with which the frontend application syncs default is localhost port 8000 change the connected node address field in the views py app views py file once you do all this you can run the application create transactions post messages via the web interface and once you mine the transactions all the nodes in the network will update the chain the chain of the nodes can also be inspected by invoking the chain endpoint using curl sh curl x get http localhost 8001 chain curl x get http localhost 8002 chain
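the same registration flow can also be scripted instead of using curl; below is a small python sketch using the requests library that mirrors the curl commands above, assuming the chain endpoint returns a json object with a length field as in the tutorial:

# mirrors the curl commands above; assumes nodes on ports 8000-8002 are running
import requests

primary = "http://127.0.0.1:8000"

# register the nodes at 8001 and 8002 with the node at 8000
for port in (8001, 8002):
    resp = requests.post(
        f"http://127.0.0.1:{port}/register_with",
        json={"node_address": primary},
    )
    print(port, resp.status_code)

# inspect each node's copy of the chain
for port in (8000, 8001, 8002):
    chain = requests.get(f"http://127.0.0.1:{port}/chain").json()
    print(port, "chain length:", chain.get("length"))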
blockchain python consensus-algorithm decentralized-application tutorial blockchain-application python-blockchain
blockchain
spark-nlp
spark nlp state of the art natural language processing p align center a href https github com johnsnowlabs spark nlp actions alt build img src https github com johnsnowlabs spark nlp workflows build badge svg a a href https github com johnsnowlabs spark nlp releases alt current release version img src https img shields io github v release johnsnowlabs spark nlp svg style flat square logo github a a href https search maven org artifact com johnsnowlabs nlp spark nlp 2 12 alt maven central img src https maven badges herokuapp com maven central com johnsnowlabs nlp spark nlp 2 12 badge svg a a href https badge fury io py spark nlp alt pypi version img src https badge fury io py spark nlp svg a a href https anaconda org johnsnowlabs spark nlp alt anaconda cloud img src https anaconda org johnsnowlabs spark nlp badges version svg a a href https github com johnsnowlabs spark nlp blob master license alt license img src https img shields io badge license apache 202 0 blue svg a a href https pypi org project spark nlp alt pypi downloads img src https static pepy tech personalized badge spark nlp period total units international system left color grey right color orange left text pip 20downloads a p spark nlp is a state of the art natural language processing library built on top of apache spark it provides simple performant accurate nlp annotations for machine learning pipelines that scale easily in a distributed environment spark nlp comes with 21000 pretrained pipelines and models in more than 200 languages it also offers tasks such as tokenization word segmentation part of speech tagging word and sentence embeddings named entity recognition dependency parsing spell checking text classification sentiment analysis token classification machine translation 180 languages summarization question answering table question answering text generation image classification image to text captioning automatic speech recognition zero shot learning and many more nlp tasks features spark nlp is the only open source nlp library in production that offers state of the art transformers such as bert camembert albert electra xlnet distilbert roberta deberta xlm roberta longformer elmo universal sentence encoder facebook bart instructor e5 google t5 marianmt openai gpt2 and vision transformers vit not only to python and r but also to jvm ecosystem java scala and kotlin at scale by extending apache spark natively project s website take a look at our official spark nlp page https sparknlp org https sparknlp org for user documentation and examples community support slack https join slack com t spark nlp shared invite zt 198dipu77 l3uwne aj8xqdk0ivmih5q for live discussion with the spark nlp community and the team github https github com johnsnowlabs spark nlp bug reports feature requests and contributions discussions https github com johnsnowlabs spark nlp discussions engage with other community members share ideas and show off how you use spark nlp medium https medium com spark nlp spark nlp articles youtube https www youtube com channel ucmfojlpyehxf wjuduz6xxq videos spark nlp video tutorials table of contents features features requirements requirements quick start quick start apache spark support apache spark support scala python support scala and python support databricks support databricks support emr support emr support using spark nlp usage packages cheatsheet packages cheatsheet spark packages spark packages scala scala maven maven sbt sbt python python pip conda pipconda compiled jars compiled jars apache zeppelin 
apache zeppelin jupyter notebook jupyter notebook python google colab notebook google colab notebook kaggle kernel kaggle kernel databricks cluster databricks cluster emr cluster emr cluster gcp dataproc gcp dataproc spark nlp configuration spark nlp configuration pipelines models pipelines and models pipelines pipelines models models offline offline examples examples faq faq citation citation contributing contributing features tokenization trainable word segmentation stop words removal token normalizer document normalizer stemmer lemmatizer ngrams regex matching text matching chunking date matcher sentence detector deep sentence detector deep learning dependency parsing labeled unlabeled spanbertcorefmodel coreference resolution part of speech tagging sentiment detection ml models spell checker ml and dl models word embeddings glove and word2vec doc2vec based on word2vec bert embeddings tf hub huggingface models distilbert embeddings huggingface models camembert embeddings huggingface models roberta embeddings huggingface models deberta embeddings huggingface v2 v3 models xlm roberta embeddings huggingface models longformer embeddings huggingface models albert embeddings tf hub huggingface models xlnet embeddings elmo embeddings tf hub models universal sentence encoder tf hub models bert sentence embeddings tf hub huggingface models roberta sentence embeddings huggingface models xlm roberta sentence embeddings huggingface models instructor embeddings huggingface models e5 embeddings huggingface models mpnet embeddings huggingface models openai embeddings sentence embeddings chunk embeddings unsupervised keywords extraction language detection identification up to 375 languages multi class sentiment analysis deep learning multi label sentiment analysis deep learning multi class text classification deep learning bert for token sequence classification distilbert for token sequence classification camembert for token sequence classification albert for token sequence classification roberta for token sequence classification deberta for token sequence classification xlm roberta for token sequence classification xlnet for token sequence classification longformer for token sequence classification bert for token sequence classification bert for question answering camembert for question answering distilbert for question answering albert for question answering roberta for question answering deberta for question answering xlm roberta for question answering longformer for question answering table question answering tapas zero shot ner model zero shot text classification by transformers zsl neural machine translation marianmt text to text transfer transformer google t5 generative pre trained transformer 2 openai gpt2 seq2seq for nlg translation and comprehension facebook bart vision transformer google vit swin image classification microsoft swin transformer convnext image classification facebook convnext vision encoder decoder for image to text like captioning automatic speech recognition wav2vec2 automatic speech recognition hubert automatic speech recognition openai whisper named entity recognition deep learning easy onnx and tensorflow integrations gpu support full integration with spark ml functions 15000 pre trained models in 200 languages 5800 pre trained pipelines in 200 languages multi lingual ner models arabic bengali chinese danish dutch english finnish french german hebrew italian japanese korean norwegian persian polish portuguese russian spanish swedish urdu and more requirements to use spark 
nlp you need the following requirements java 8 and 11 apache spark 3 4 x 3 3 x 3 2 x 3 1 x 3 0 x gpu optional spark nlp 5 1 3 is built with onnx 1 15 1 and tensorflow 2 7 1 deep learning engines the minimum following nvidia software are only required for gpu support nvidia gpu drivers version 450 80 02 or higher cuda toolkit 11 2 cudnn sdk 8 1 0 quick start this is a quick example of how to use spark nlp pre trained pipeline in python and pyspark sh java version should be java 8 or 11 oracle or openjdk conda create n sparknlp python 3 7 y conda activate sparknlp spark nlp by default is based on pyspark 3 x pip install spark nlp 5 1 3 pyspark 3 3 1 in python console or jupyter python3 kernel python import spark nlp from sparknlp base import from sparknlp annotator import from sparknlp pretrained import pretrainedpipeline import sparknlp start sparksession with spark nlp start functions has 3 parameters gpu apple silicon and memory sparknlp start gpu true will start the session with gpu support sparknlp start apple silicon true will start the session with macos m1 m2 support sparknlp start memory 16g to change the default driver memory in sparksession spark sparknlp start download a pre trained pipeline pipeline pretrainedpipeline explain document dl lang en your testing dataset text the mona lisa is a 16th century oil painting created by leonardo it s held at the louvre in paris annotate your testing dataset result pipeline annotate text what s in the pipeline list result keys output entities stem checked lemma document pos token ner embeddings sentence check the results result entities output mona lisa leonardo louvre paris for more examples you can visit our dedicated examples https github com johnsnowlabs spark nlp tree master examples to showcase all spark nlp use cases apache spark support spark nlp 5 1 3 has been built on top of apache spark 3 4 while fully supports apache spark 3 0 x 3 1 x 3 2 x 3 3 x and 3 4 x spark nlp apache spark 2 3 x apache spark 2 4 x apache spark 3 0 x apache spark 3 1 x apache spark 3 2 x apache spark 3 3 x apache spark 3 4 x 5 0 x no no yes yes yes yes yes 4 4 x no no yes yes yes yes yes 4 3 x no no yes yes yes yes no 4 2 x no no yes yes yes yes no 4 1 x no no yes yes yes yes no 4 0 x no no yes yes yes yes no 3 4 x yes yes yes yes partially n a no 3 3 x yes yes yes yes no no no 3 2 x yes yes yes yes no no no 3 1 x yes yes yes yes no no no 3 0 x yes yes yes yes no no no 2 7 x yes yes no no no no no find out more about spark nlp versions from our release notes https github com johnsnowlabs spark nlp releases scala and python support spark nlp python 3 6 python 3 7 python 3 8 python 3 9 python 3 10 scala 2 11 scala 2 12 5 0 x no yes yes yes yes no yes 4 4 x no yes yes yes yes no yes 4 3 x yes yes yes yes yes no yes 4 2 x yes yes yes yes yes no yes 4 1 x yes yes yes yes no no yes 4 0 x yes yes yes yes no no yes 3 4 x yes yes yes yes no yes yes 3 3 x yes yes yes no no yes yes 3 2 x yes yes yes no no yes yes 3 1 x yes yes yes no no yes yes 3 0 x yes yes yes no no yes yes 2 7 x yes yes no no no yes no databricks support spark nlp 5 1 3 has been tested and is compatible with the following runtimes cpu 9 1 9 1 ml 10 1 10 1 ml 10 2 10 2 ml 10 3 10 3 ml 10 4 10 4 ml 10 5 10 5 ml 11 0 11 0 ml 11 1 11 1 ml 11 2 11 2 ml 11 3 11 3 ml 12 0 12 0 ml 12 1 12 1 ml 12 2 12 2 ml 13 0 13 0 ml 13 1 13 1 ml 13 2 13 2 ml 13 3 13 3 ml gpu 9 1 ml gpu 10 1 ml gpu 10 2 ml gpu 10 3 ml gpu 10 4 ml gpu 10 5 ml gpu 11 0 ml gpu 11 1 ml gpu 11 2 ml gpu 11 3 ml gpu 12 0 ml gpu 12 1 ml gpu 12 2 
ml gpu 13 0 ml gpu 13 1 ml gpu 13 2 ml gpu 13 3 ml gpu emr support spark nlp 5 1 3 has been tested and is compatible with the following emr releases emr 6 2 0 emr 6 3 0 emr 6 3 1 emr 6 4 0 emr 6 5 0 emr 6 6 0 emr 6 7 0 emr 6 8 0 emr 6 9 0 emr 6 10 0 emr 6 11 0 emr 6 12 0 full list of amazon emr 6 x releases https docs aws amazon com emr latest releaseguide emr release 6x html note the emr 6 1 0 and 6 1 1 are not supported usage packages cheatsheet this is a cheatsheet for corresponding spark nlp maven package to apache spark pyspark major version apache spark spark nlp on cpu spark nlp on gpu spark nlp on aarch64 linux spark nlp on apple silicon 3 0 3 1 3 2 3 3 3 4 spark nlp spark nlp gpu spark nlp aarch64 spark nlp silicon start function sparknlp start sparknlp start gpu true sparknlp start aarch64 true sparknlp start apple silicon true note m1 m2 and aarch64 are under experimental support access and support to these architectures are limited by the community and we had to build most of the dependencies by ourselves to make them compatible we support these two architectures however they may not work in some environments spark packages command line requires internet connection spark nlp supports all major releases of apache spark 3 0 x apache spark 3 1 x apache spark 3 2 x apache spark 3 3 x and apache spark 3 4 x apache spark 3 x 3 0 x 3 1 x 3 2 x 3 3 x and 3 4 x scala 2 12 sh cpu spark shell packages com johnsnowlabs nlp spark nlp 2 12 5 1 3 pyspark packages com johnsnowlabs nlp spark nlp 2 12 5 1 3 spark submit packages com johnsnowlabs nlp spark nlp 2 12 5 1 3 the spark nlp has been published to the maven repository https mvnrepository com artifact com johnsnowlabs nlp spark nlp sh gpu spark shell packages com johnsnowlabs nlp spark nlp gpu 2 12 5 1 3 pyspark packages com johnsnowlabs nlp spark nlp gpu 2 12 5 1 3 spark submit packages com johnsnowlabs nlp spark nlp gpu 2 12 5 1 3 the spark nlp gpu has been published to the maven repository https mvnrepository com artifact com johnsnowlabs nlp spark nlp gpu sh aarch64 spark shell packages com johnsnowlabs nlp spark nlp aarch64 2 12 5 1 3 pyspark packages com johnsnowlabs nlp spark nlp aarch64 2 12 5 1 3 spark submit packages com johnsnowlabs nlp spark nlp aarch64 2 12 5 1 3 the spark nlp aarch64 has been published to the maven repository https mvnrepository com artifact com johnsnowlabs nlp spark nlp aarch64 sh m1 m2 apple silicon spark shell packages com johnsnowlabs nlp spark nlp silicon 2 12 5 1 3 pyspark packages com johnsnowlabs nlp spark nlp silicon 2 12 5 1 3 spark submit packages com johnsnowlabs nlp spark nlp silicon 2 12 5 1 3 the spark nlp silicon has been published to the maven repository https mvnrepository com artifact com johnsnowlabs nlp spark nlp silicon note in case you are using large pretrained models like universalsentenceencoder you need to have the following set in your sparksession sh spark shell driver memory 16g conf spark kryoserializer buffer max 2000m packages com johnsnowlabs nlp spark nlp 2 12 5 1 3 scala spark nlp supports scala 2 12 15 if you are using apache spark 3 0 x 3 1 x 3 2 x 3 3 x and 3 4 x versions our packages are deployed to maven central to add any of our packages as a dependency in your application you can follow these coordinates maven spark nlp on apache spark 3 0 x 3 1 x 3 2 x 3 3 x and 3 4 x xml https mvnrepository com artifact com johnsnowlabs nlp spark nlp dependency groupid com johnsnowlabs nlp groupid artifactid spark nlp 2 12 artifactid version 5 1 3 version dependency spark nlp gpu 
xml https mvnrepository com artifact com johnsnowlabs nlp spark nlp gpu dependency groupid com johnsnowlabs nlp groupid artifactid spark nlp gpu 2 12 artifactid version 5 1 3 version dependency spark nlp aarch64 xml https mvnrepository com artifact com johnsnowlabs nlp spark nlp aarch64 dependency groupid com johnsnowlabs nlp groupid artifactid spark nlp aarch64 2 12 artifactid version 5 1 3 version dependency spark nlp silicon xml https mvnrepository com artifact com johnsnowlabs nlp spark nlp silicon dependency groupid com johnsnowlabs nlp groupid artifactid spark nlp silicon 2 12 artifactid version 5 1 3 version dependency sbt spark nlp on apache spark 3 0 x 3 1 x 3 2 x 3 3 x and 3 4 x sbtshell https mvnrepository com artifact com johnsnowlabs nlp spark nlp librarydependencies com johnsnowlabs nlp spark nlp 5 1 3 spark nlp gpu sbtshell https mvnrepository com artifact com johnsnowlabs nlp spark nlp gpu librarydependencies com johnsnowlabs nlp spark nlp gpu 5 1 3 spark nlp aarch64 sbtshell https mvnrepository com artifact com johnsnowlabs nlp spark nlp aarch64 librarydependencies com johnsnowlabs nlp spark nlp aarch64 5 1 3 spark nlp silicon sbtshell https mvnrepository com artifact com johnsnowlabs nlp spark nlp silicon librarydependencies com johnsnowlabs nlp spark nlp silicon 5 1 3 maven central https mvnrepository com artifact com johnsnowlabs nlp https mvnrepository com artifact com johnsnowlabs nlp if you are interested there is a simple sbt project for spark nlp to guide you on how to use it in your projects spark nlp sbt starter https github com maziyarpanahi spark nlp starter python spark nlp supports python 3 6 x and above depending on your major pyspark version python without explicit pyspark installation pip conda if you installed pyspark through pip conda you can install spark nlp through the same channel pip bash pip install spark nlp 5 1 3 conda bash conda install c johnsnowlabs spark nlp pypi spark nlp package https pypi org project spark nlp anaconda spark nlp package https anaconda org johnsnowlabs spark nlp then you ll have to create a sparksession either from spark nlp python import sparknlp spark sparknlp start or manually python spark sparksession builder appname spark nlp master local config spark driver memory 16g config spark driver maxresultsize 0 config spark kryoserializer buffer max 2000m config spark jars packages com johnsnowlabs nlp spark nlp 2 12 5 1 3 getorcreate if using local jars you can use spark jars instead for comma delimited jar files for cluster setups of course you ll have to put the jars in a reachable location for all driver and executor nodes quick example python import sparknlp from sparknlp pretrained import pretrainedpipeline create or get spark session spark sparknlp start sparknlp version spark version download load and annotate a text by pre trained pipeline pipeline pretrainedpipeline recognize entities dl en result pipeline annotate the mona lisa is a 16th century oil painting created by leonardo compiled jars build from source spark nlp fat jar for cpu on apache spark 3 0 x 3 1 x 3 2 x 3 3 x and 3 4 x bash sbt assembly fat jar for gpu on apache spark 3 0 x 3 1 x 3 2 x 3 3 x and 3 4 x bash sbt dis gpu true assembly fat jar for m on apache spark 3 0 x 3 1 x 3 2 x 3 3 x and 3 4 x bash sbt dis silicon true assembly using the jar manually if for some reason you need to use the jar you can either download the fat jars provided here or download it from maven central https mvnrepository com artifact com johnsnowlabs nlp to add jars to spark 
programs use the jars option sh spark shell jars spark nlp jar the preferred way to use the library when running spark programs is using the packages option as specified in the spark packages section apache zeppelin use either one of the following options add the following maven coordinates to the interpreter s library list bash com johnsnowlabs nlp spark nlp 2 12 5 1 3 add a path to pre built jar from here compiled jars in the interpreter s library list making sure the jar is available to driver path python in zeppelin apart from the previous step install the python module through pip bash pip install spark nlp 5 1 3 or you can install spark nlp from inside zeppelin by using conda bash python conda install c johnsnowlabs spark nlp configure zeppelin properly use cells with spark pyspark or any interpreter name you chose finally in zeppelin interpreter settings make sure you set properly zeppelin python to the python you want to use and install the pip library with e g python3 an alternative option would be to set spark submit options zeppelin env sh and make sure packages is there as shown earlier since it includes both scala and python side installation jupyter notebook python recommended the easiest way to get this done on linux and macos is to simply install spark nlp and pyspark pypi packages and launch the jupyter from the same python environment sh conda create n sparknlp python 3 8 y conda activate sparknlp spark nlp by default is based on pyspark 3 x pip install spark nlp 5 1 3 pyspark 3 3 1 jupyter jupyter notebook then you can use python3 kernel to run your code with creating sparksession via spark sparknlp start optional if you are in different operating systems and require to make jupyter notebook run by using pyspark you can follow these steps bash export spark home path to your spark folder export pyspark python python3 export pyspark driver python jupyter export pyspark driver python opts notebook pyspark packages com johnsnowlabs nlp spark nlp 2 12 5 1 3 alternatively you can mix in using jars option for pyspark pip install spark nlp if not using pyspark at all you ll have to run the instructions pointed here python without explicit pyspark installation google colab notebook google colab is perhaps the easiest way to get started with spark nlp it requires no installation or setup other than having a google account run the following code in google colab notebook and start using spark nlp right away sh this is only to setup pyspark and spark nlp on colab wget https setup johnsnowlabs com colab sh o bash this script comes with the two options to define pyspark and spark nlp versions via options sh p is for pyspark s is for spark nlp g will enable upgrading libcudnn8 to 8 1 0 on google colab for gpu usage by default they are set to the latest wget https setup johnsnowlabs com colab sh o bash dev stdin p 3 2 3 s 5 1 3 spark nlp quick start on google colab https colab research google com github johnsnowlabs spark nlp blob master examples python quick start google colab ipynb is a live demo on google colab that performs named entity recognitions and sentiment analysis by using spark nlp pretrained pipelines kaggle kernel run the following code in kaggle kernel and start using spark nlp right away sh let s setup kaggle for spark nlp and pyspark wget https setup johnsnowlabs com kaggle sh o bash this script comes with the two options to define pyspark and spark nlp versions via options sh p is for pyspark s is for spark nlp g will enable upgrading libcudnn8 to 8 1 0 on kaggle for 
gpu usage by default they are set to the latest wget https setup johnsnowlabs com colab sh o bash dev stdin p 3 2 3 s 5 1 3 spark nlp quick start on kaggle kernel https www kaggle com mozzie spark nlp named entity recognition is a live demo on kaggle kernel that performs named entity recognitions by using spark nlp pretrained pipeline databricks cluster 1 create a cluster if you don t have one already 2 on a new cluster or existing one you need to add the following to the advanced options spark tab bash spark kryoserializer buffer max 2000m spark serializer org apache spark serializer kryoserializer 3 in libraries tab inside your cluster you need to follow these steps 3 1 install new pypi spark nlp 5 1 3 install 3 2 install new maven coordinates com johnsnowlabs nlp spark nlp 2 12 5 1 3 install 4 now you can attach your notebook to the cluster and use spark nlp note databricks runtimes support different apache spark major releases please make sure you choose the correct spark nlp maven package name maven coordinate for your runtime from our packages cheatsheet https github com johnsnowlabs spark nlp packages cheatsheet emr cluster to launch emr clusters with apache spark pyspark and spark nlp correctly you need to have bootstrap and software configuration a sample of your bootstrap script sh bin bash set x e echo e export pyspark python usr bin python3 export hadoop conf dir etc hadoop conf export spark jars dir usr lib spark jars export spark home usr lib spark home bashrc source home bashrc sudo python3 m pip install awscli boto spark nlp set x exit 0 a sample of your software configuration in json on s3 must be public access json classification spark env configurations classification export properties pyspark python usr bin python3 classification spark defaults properties spark yarn stagingdir hdfs tmp spark yarn preserve staging files true spark kryoserializer buffer max 2000m spark serializer org apache spark serializer kryoserializer spark driver maxresultsize 0 spark jars packages com johnsnowlabs nlp spark nlp 2 12 5 1 3 a sample of aws cli to launch emr cluster sh aws emr create cluster name spark nlp 5 1 3 release label emr 6 2 0 applications name hadoop name spark name hive instance type m4 4xlarge instance count 3 use default roles log uri s3 s3 bucket bootstrap actions path s3 s3 bucket emr bootstrap sh name custome configurations https public access sparknlp config json ec2 attributes keyname your ssh key emrmanagedmastersecuritygroup security group with ssh emrmanagedslavesecuritygroup security group with ssh profile aws profile credentials gcp dataproc 1 create a cluster if you don t have one already as follows at gcloud shell bash gcloud services enable dataproc googleapis com compute googleapis com storage component googleapis com bigquery googleapis com bigquerystorage googleapis com bash region region bash bucket name bucket name gsutil mb c standard l region gs bucket name bash region region zone zone cluster name cluster name bucket name bucket name you can set image version master machine type worker machine type master boot disk size worker boot disk size num workers as your needs if you use the previous image version from 2 0 you should also add anaconda to optional components and you should enable gateway don t forget to set the maven coordinates for the jar in properties bash gcloud dataproc clusters create cluster name region region zone zone image version 2 0 master machine type n1 standard 4 worker machine type n1 standard 2 master boot disk size 128gb worker 
boot disk size 128gb num workers 2 bucket bucket name optional components jupyter enable component gateway metadata pip packages spark nlp spark nlp display google cloud bigquery google cloud storage initialization actions gs goog dataproc initialization actions region python pip install sh properties spark spark serializer org apache spark serializer kryoserializer spark spark driver maxresultsize 0 spark spark kryoserializer buffer max 2000m spark spark jars packages com johnsnowlabs nlp spark nlp 2 12 5 1 3 2 on an existing one you need to install spark nlp and spark nlp display packages from pypi 3 now you can attach your notebook to the cluster and use the spark nlp spark nlp configuration you can change the following spark nlp configurations via spark configuration property name default meaning spark jsl settings pretrained cache folder cache pretrained the location to download and extract pretrained models and pipelines by default it will be in user s home directory under cache pretrained directory spark jsl settings storage cluster tmp dir hadoop tmp dir the location to use on a cluster for temporarily files such as unpacking indexes for wordembeddings by default this locations is the location of hadoop tmp dir set via hadoop configuration for apache spark note s3 is not supported and it must be local hdfs or dbfs spark jsl settings annotator log folder annotator logs the location to save logs from annotators during training such as nerdlapproach classifierdlapproach sentimentdlapproach multiclassifierdlapproach etc by default it will be in user s home directory under annotator logs directory spark jsl settings aws credentials access key id none your aws access key to use your s3 bucket to store log files of training models or access tensorflow graphs used in nerdlapproach spark jsl settings aws credentials secret access key none your aws secret access key to use your s3 bucket to store log files of training models or access tensorflow graphs used in nerdlapproach spark jsl settings aws credentials session token none your aws mfa session token to use your s3 bucket to store log files of training models or access tensorflow graphs used in nerdlapproach spark jsl settings aws s3 bucket none your aws s3 bucket to store log files of training models or access tensorflow graphs used in nerdlapproach spark jsl settings aws region none your aws region to use your s3 bucket to store log files of training models or access tensorflow graphs used in nerdlapproach how to set spark nlp configuration sparksession you can use config during sparksession creation to set spark nlp configurations python from pyspark sql import sparksession spark sparksession builder master local config spark driver memory 16g config spark driver maxresultsize 0 config spark serializer org apache spark serializer kryoserializer config spark kryoserializer buffer max 2000m config spark jsl settings pretrained cache folder sample data pretrained config spark jsl settings storage cluster tmp dir sample data storage config spark jars packages com johnsnowlabs nlp spark nlp 2 12 5 1 3 getorcreate spark shell sh spark shell driver memory 16g conf spark driver maxresultsize 0 conf spark serializer org apache spark serializer kryoserializer conf spark kryoserializer buffer max 2000m conf spark jsl settings pretrained cache folder sample data pretrained conf spark jsl settings storage cluster tmp dir sample data storage packages com johnsnowlabs nlp spark nlp 2 12 5 1 3 pyspark sh pyspark driver memory 16g conf spark driver 
maxresultsize 0 conf spark serializer org apache spark serializer kryoserializer conf spark kryoserializer buffer max 2000m conf spark jsl settings pretrained cache folder sample data pretrained conf spark jsl settings storage cluster tmp dir sample data storage packages com johnsnowlabs nlp spark nlp 2 12 5 1 3 databricks on a new cluster or existing one you need to add the following to the advanced options spark tab bash spark kryoserializer buffer max 2000m spark serializer org apache spark serializer kryoserializer spark jsl settings pretrained cache folder dbfs path to cache spark jsl settings storage cluster tmp dir dbfs path to storage spark jsl settings annotator log folder dbfs path to logs note if this is an existing cluster after adding new configs or changing existing properties you need to restart it s3 integration in spark nlp we can define s3 locations to export log files of training models store tensorflow graphs used in nerdlapproach logging to configure s3 path for logging while training models we need to set up aws credentials as well as an s3 path bash spark conf set spark jsl settings annotator log folder s3 my s3 path logs spark conf set spark jsl settings aws credentials access key id my key id spark conf set spark jsl settings aws credentials secret access key my secret access key spark conf set spark jsl settings aws s3 bucket my bucket spark conf set spark jsl settings aws region my region now you can check the log on your s3 path defined in spark jsl settings annotator log folder property make sure to use the prefix s3 otherwise it will use the default configuration tensorflow graphs to reference s3 location for downloading graphs we need to set up aws credentials bash spark conf set spark jsl settings aws credentials access key id my key id spark conf set spark jsl settings aws credentials secret access key my secret access key spark conf set spark jsl settings aws region my region mfa configuration in case your aws account is configured with mfa you will need first to get temporal credentials and add session token to the configuration as shown in the examples below for logging bash spark conf set spark jsl settings aws credentials session token my token an example of a bash script that gets temporal aws credentials can be found here https github com johnsnowlabs spark nlp blob master scripts aws tmp credentials sh this script requires three arguments bash aws tmp credentials sh iam user duration serial number pipelines and models pipelines quick example scala import com johnsnowlabs nlp pretrained pretrainedpipeline import com johnsnowlabs nlp sparknlp sparknlp version val testdata spark createdataframe seq 1 google has announced the release of a beta version of the popular tensorflow machine learning library 2 donald john trump born june 14 1946 is the 45th and current president of the united states todf id text val pipeline pretrainedpipeline explain document dl lang en val annotation pipeline transform testdata annotation show import com johnsnowlabs nlp pretrained pretrainedpipeline import com johnsnowlabs nlp sparknlp 2 5 0 testdata org apache spark sql dataframe id int text string pipeline com johnsnowlabs nlp pretrained pretrainedpipeline pretrainedpipeline explain document dl en public models annotation org apache spark sql dataframe id int text string 10 more fields id text document token sentence checked lemma stem pos embeddings ner entities 1 google has announ document 0 10 token 0 5 go document 0 10 token 0 5 go token 0 5 go token 0 5 go pos 0 5 
nnp word embeddings named entity 0 chunk 0 5 go 2 the paris metro w document 0 11 token 0 2 th document 0 11 token 0 2 th token 0 2 th token 0 2 th pos 0 2 dt word embeddings named entity 0 chunk 4 8 pa annotation select entities result show false result google tensorflow donald john trump united states showing available pipelines there are functions in spark nlp that will list all the available pipelines of a particular language for you scala import com johnsnowlabs nlp pretrained resourcedownloader resourcedownloader showpublicpipelines lang en pipeline lang version dependency parse en 2 0 2 analyze sentiment ml en 2 0 2 check spelling en 2 1 0 match datetime en 2 1 0 explain document ml en 3 1 3 or if we want to check for a particular version scala import com johnsnowlabs nlp pretrained resourcedownloader resourcedownloader showpublicpipelines lang en version 3 1 0 pipeline lang version dependency parse en 2 0 2 clean slang en 3 0 0 clean pattern en 3 0 0 check spelling en 3 0 0 dependency parse en 3 0 0 please check out our models hub for the full list of pre trained pipelines https sparknlp org models with examples demos benchmarks and more models some selected languages afrikaans arabic armenian basque bengali breton bulgarian catalan czech dutch english esperanto finnish french galician german greek hausa hebrew hindi hungarian indonesian irish italian japanese latin latvian marathi norwegian persian polish portuguese romanian russian slovak slovenian somali southern sotho spanish swahili swedish tswana turkish ukrainian zulu quick online example python load ner model trained by deep learning approach and glove word embeddings ner dl nerdlmodel pretrained ner dl load ner model trained by deep learning approach and bert word embeddings ner bert nerdlmodel pretrained ner dl bert scala load french pos tagger model trained by universal dependencies val french pos perceptronmodel pretrained pos ud gsd lang fr load italian lemmatizermodel val italian lemma lemmatizermodel pretrained lemma dxc lang it quick offline example loading perceptronmodel annotator model inside spark nlp pipeline scala val french pos perceptronmodel load tmp pos ud gsd fr 2 0 2 2 4 1556531457346 setinputcols document token setoutputcol pos showing available models there are functions in spark nlp that will list all the available models of a particular annotator and language for you scala import com johnsnowlabs nlp pretrained resourcedownloader resourcedownloader showpublicmodels annotator nerdlmodel lang en model lang version onto 100 en 2 1 0 onto 300 en 2 1 0 ner dl bert en 2 2 0 onto 100 en 2 4 0 ner conll elmo en 3 2 2 or if we want to check for a particular version scala import com johnsnowlabs nlp pretrained resourcedownloader resourcedownloader showpublicmodels annotator nerdlmodel lang en version 3 1 0 model lang version onto 100 en 2 1 0 ner aspect based sentiment en 2 6 2 ner weibo glove 840b 300d en 2 6 2 nerdl atis 840b 300d en 2 7 1 nerdl snips 100d en 2 7 3 and to see a list of available annotators you can use scala import com johnsnowlabs nlp pretrained resourcedownloader resourcedownloader showavailableannotators albertembeddings albertfortokenclassification assertiondlmodel xlmrobertasentenceembeddings xlnetembeddings please check out our models hub for the full list of pre trained models https sparknlp org models with examples demo benchmark and more offline spark nlp library and all the pre trained models pipelines can be used entirely offline with no access to the internet if you are behind a 
proxy or a firewall with no access to the maven repository to download packages or and no access to s3 to automatically download models and pipelines you can simply follow the instructions to have spark nlp without any limitations offline instead of using the maven package you need to load our fat jar instead of using pretrainedpipeline for pretrained pipelines or the pretrained function to download pretrained models you will need to manually download your pipeline model from models hub https sparknlp org models extract it and load it example of sparksession with fat jar to have spark nlp offline python spark sparksession builder appname spark nlp master local config spark driver memory 16g config spark driver maxresultsize 0 config spark kryoserializer buffer max 2000m config spark jars tmp spark nlp assembly 5 1 3 jar getorcreate you can download provided fat jars from each release notes https github com johnsnowlabs spark nlp releases please pay attention to pick the one that suits your environment depending on the device cpu gpu and apache spark version 3 0 x 3 1 x 3 2 x 3 3 x and 3 4 x if you are local you can load the fat jar from your local filesystem however if you are in a cluster setup you need to put the fat jar on a distributed filesystem such as hdfs dbfs s3 etc i e hdfs tmp spark nlp assembly 5 1 3 jar example of using pretrained models and pipelines in offline python instead of using pretrained for online french pos perceptronmodel pretrained pos ud gsd lang fr you download this model extract it and use load french pos perceptronmodel load tmp pos ud gsd fr 2 0 2 2 4 1556531457346 setinputcols document token setoutputcol pos example for pipelines instead of using pretrainedpipeline pipeline pretrainedpipeline explain document dl lang en you download this pipeline extract it and use pipelinemodel pipelinemodel load tmp explain document dl en 2 0 2 2 4 1556530585689 since you are downloading and loading models pipelines manually this means spark nlp is not downloading the most recent and compatible models pipelines for you choosing the right model pipeline is on you if you are local you can load the model pipeline from your local filesystem however if you are in a cluster setup you need to put the model pipeline on a distributed filesystem such as hdfs dbfs s3 etc i e hdfs tmp explain document dl en 2 0 2 2 4 1556530585689 examples need more examples check out our dedicated spark nlp examples https github com johnsnowlabs spark nlp tree master examples repository to showcase all spark nlp use cases also don t forget to check spark nlp in action https sparknlp org demo built by streamlit all examples spark nlp examples https github com johnsnowlabs spark nlp tree master examples faq check our articles and videos page here https sparknlp org learn citation we have published a paper https www sciencedirect com science article pii s2665963821000063 that you can cite for the spark nlp library bibtex article kocaman2021100058 title spark nlp natural language understanding at scale journal software impacts pages 100058 year 2021 issn 2665 9638 doi https doi org 10 1016 j simpa 2021 100058 url https www sciencedirect com science article pii s2665963 2 300063 author veysel kocaman and david talby keywords spark natural language processing deep learning tensorflow cluster abstract spark nlp is a natural language processing nlp library built on top of apache spark ml it provides simple performant accurate nlp annotations for machine learning pipelines that can scale easily in a 
distributed environment spark nlp comes with 1100 pretrained pipelines and models in more than 192 languages it supports nearly all the nlp tasks and modules that can be used seamlessly in a cluster downloaded more than 2 7 million times and experiencing 9x growth since january 2020 spark nlp is used by 54 of healthcare organizations as the world s most widely used nlp library in the enterprise contributing we appreciate any sort of contributions ideas feedback documentation bug reports nlp training and testing corpora development and testing clone the repo and submit your pull requests or directly create issues in this repo john snow labs http johnsnowlabs com http johnsnowlabs com
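beyond the pretrained pipelines shown above, individual annotators can also be composed into a custom spark ml pipeline; here is a minimal python sketch (the input column name and sample sentence are chosen for illustration):

# a small custom pipeline: document assembly, tokenization, normalization
import sparknlp
from sparknlp.base import DocumentAssembler
from sparknlp.annotator import Tokenizer, Normalizer
from pyspark.ml import Pipeline

spark = sparknlp.start()

document = DocumentAssembler().setInputCol("text").setOutputCol("document")
tokenizer = Tokenizer().setInputCols(["document"]).setOutputCol("token")
normalizer = Normalizer().setInputCols(["token"]).setOutputCol("normalized")

pipeline = Pipeline(stages=[document, tokenizer, normalizer])

data = spark.createDataFrame([["Spark NLP annotates text at scale."]]).toDF("text")
model = pipeline.fit(data)
model.transform(data).select("normalized.result").show(truncate=False)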
nlp natural-language-processing spark pyspark named-entity-recognition sentiment-analysis lemmatizer spell-checker entity-extraction part-of-speech-tagger bert transformers albert tensorflow language-detection machine-translation text-classification language-model llm question-answering
ai
spaceggs
spaceggs spaceggs is the sfu satellite design team sfusat attitude determination system resources to get started kalman filter basics 1 artificial intelligence for robotics https www udacity com course artificial intelligence for robotics cs373 easy 2 probabilistic robotics https docs ufpr br danielsantos probabilisticrobotics pdf math 3 slam course ws13 14 https www youtube com playlist list plgnqpqtftogqrz4o5qzbihgl3b1jhimn somewhere in the middle 4 kalman and bayesian filters in python http nbviewer jupyter org github rlabbe kalman and bayesian filters in python blob master table of contents ipynb preferred reading attitude determination improving attitude determination and control of resource constrained cubesats using unscented kalman filtering by weston alan navarro marlow professor kerri l cahoy http ssl mit edu files website theses sm 2016 marlowweston pdf transformations computer vision algorithms and applications http szeliski org book drafts szeliskibook 20100903 draft pdf by richard szeliski
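none of the following is code from this repository, but as a companion to the kalman filter resources above, here is a minimal 1-d kalman filter step in python showing the predict/update cycle the listed courses build on:

# toy 1-d kalman filter: estimate a constant value from noisy measurements
def kalman_step(x, p, z, r, q):
    # predict: state unchanged, uncertainty grows by process noise q
    p = p + q
    # update: blend prediction with measurement z of variance r
    k = p / (p + r)        # kalman gain
    x = x + k * (z - x)    # corrected state estimate
    p = (1.0 - k) * p      # reduced uncertainty
    return x, p

x, p = 0.0, 1.0
for z in [1.2, 0.9, 1.1, 1.0]:
    x, p = kalman_step(x, p, z, r=0.5, q=0.01)
print(x, p)

real attitude determination, such as the unscented kalman filtering in the preferred reading, generalizes this to multi-dimensional state vectors and nonlinear measurement models.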
attitude-determination sensor-fusion sensor-modeling orbital-simulation
os
yahya196
yahya196 information technology
server
altschool-cloud-exercises
altschool cloud exercises this repository contains exercises from the cloud engineering class
cloud
Institute-Service-Search-System
institute services search system better institute service search system software engineering lab project a basic web app implemented using node js mongodb express and bootstrap for the frontend a search app that accesses the database to filter results it searches the local databases of different departments and fetches the results to run npm install node app to run the mongo server make sure you have mongodb installed mongod in console page1 https github com ashu9999 institute search system blob master public stylesheets images form png page2 https github com ashu9999 institute search system blob master public stylesheets images inventory png page3 https github com ashu9999 institute search system blob master public stylesheets images item png page4 https github com ashu9999 institute search system blob master public stylesheets images landing png
server
UNITY-Arc-system-Works-Shader
Download from Releases here, not as a zip: https://github.com/aerthas/unity-arc-system-works-shader/releases

Arc System Works Shader, Unity edition. If you are looking for my Blender Arc System Works shaders, they are found here: https://github.com/aerthas/blender-arc-system-works-shader. If you have any questions or want to show off some renders, feel free to join my Discord: https://discord.gg/ekcszg8

Custom models using the shader: Tanjiro, created from scratch by CrypticLight (https://twitter.com/thecrypticlight). Renders: https://i.imgur.com/x4uaoh4.png, https://i.imgur.com/s6y28dc.png

Emulates the look of the fabulous Arc System Works games.

Confirmed games this shader will work with: Dragon Ball FighterZ, Guilty Gear Xrd Rev 2, Guilty Gear Xrd Sign, Guilty Gear Strive, DNF Duel. Unconfirmed: there is a 99.999% chance it will work with every other Arc System Works game that I just don't have.

Disclaimer: this shader is made with zero formal teaching; everything I know about shaders is self-taught over the last year and a half. I know some colors might be off, such as the highlight fresnel color, but I am actively trying to figure out how the game does it. If you have any information about how a specific thing should be done differently to make it look better, please do not hesitate to join my Discord specifically for this shader: https://discord.gg/ekcszg8

Secondary note: yes, if you are animating something or using VRChat, the colors will look right under normal light, but might not look right if you are trying to do something such as having a point light in your hand to emulate a Kamehameha. Arc System games do not actually light in real time, per se; every single option of the shader is manually edited frame by frame by the art team as they make animations. If you want something to look perfect during an animation, you need to put in just as much work as the original art team did. I'll keep trying to make it easy to work with, but there must be a lot of variables.
os
linkifier
Estimate primary & foreign key constraints in a database. Do you need to import CSV files into a database, but no one gave you the entity-relationship model to set up the primary and foreign keys? Say no more.

Screenshot: https://github.com/janmotl/linkifier/blob/master/screenshot.png

Usage:
1. Download the latest release (https://github.com/janmotl/linkifier/releases).
2. Execute the JAR file (requires Oracle JRE 8 because of JavaFX).
3. Connect to a database (Microsoft SQL Server, MySQL, Oracle, or PostgreSQL).
4. Estimate the primary and foreign key constraints.
5. Export the estimates (multiple formats are supported, including SQL ALTER queries and GraphML for ER diagram visualization).

Algorithm: the algorithm works on the metadata about tables and columns, which are accessible over JDBC, and on column and table statistics, which the database uses for execution plan optimization. That means the algorithm is blazing fast, as it does not look at the actual data.

Primary key: primary keys are identified by
1. column position in the table (PKs are commonly at the beginning),
2. data type (e.g., integers are preferred over doubles),
3. presence of a keyword like "id" in the column name,
4. similarity of the column name with the table name,
5. ratio of nulls in the column (from the column statistics),
6. average column width in bytes (from the column statistics; PKs are commonly short).

Once these features are collected, they are passed to logistic regression to estimate the probability that the column is in a primary key. Compound PKs are supported. Since each table can have at most a single PK, the column with the highest probability in the table is declared to be the primary key of the table. If the estimated count of unique values in the estimated PK is smaller than the count of rows in the table, it is a sign of a compound PK. In that case, columns are incrementally added into the PK, by their descending probability of being in the PK, until the estimated PK unique row count >= the estimated table row count. The estimated PK unique row count is optimistic and is defined as the product of the individual unique row counts in the PK. For example, if we have a table with 100 rows and the best candidate for a PK has only 11 unique rows, we will have to create a compound PK. The second-best candidate for a PK has 13 unique values. Since 11 * 13 >= 100, we stop here and return a compound PK with these 2 columns. (A sketch of this stopping rule follows at the end of this section.)

Foreign keys: foreign keys are identified by
1. known PKs (an FK should reference a PK),
2. data types (an FK should reference a PK of a compatible data type),
3. data type properties (e.g., data type sizes should agree),
4. similarity of the FK name with the PK name (should be high),
5. similarity of the FK name with the PK table name (should be high),
6. similarity of the FK name with the FK table name (should be low),
7. average column width of the FK (should be similar to the average PK width),
8. minimum and maximum values of the FK (should be in the range of the PK),
9. minimum and maximum values of the FK (should cover a significant range of the PK values).

Once again, probabilities are estimated with logistic regression, and the most likely FK constraints are returned. A compound FK is returned if the PK is compound.

How to improve the quality of the estimates:
1. Calculate sample statistics on all tables in the schema. Example queries for calculating the statistics for a table `molecule` in the `mutagenesis` schema:

```sql
-- Microsoft SQL Server
create statistics mutagenesis.molecule
-- MySQL
analyze table mutagenesis.molecule
-- Oracle
exec dbms_stats.gather_table_stats('mutagenesis', 'molecule')
-- PostgreSQL
analyze mutagenesis.molecule
```

2. Set primary keys in the schema. Once Linkifier knows for sure what the PKs are, the accuracy of the foreign key constraint estimates increases.
3. Set known foreign key constraints in the schema. Linkifier will then attempt to find only the remaining missing foreign key constraints.
4. Limit the search space with the "table blacklist regex" option to a subset of tables that are actually interesting for you.
5. Inspect the "decision justification" for the FK export, which contains the estimated probabilities for the top 2000 foreign key candidates (the 3rd column from the right).

Limitations: if the schema quality is extremely low (e.g., all columns are typed as text and have irrelevant names), the PK and particularly the FK estimates are going to be off. First set correct data types and names for the columns, then rerun Linkifier.

Known issues:
- If you are using MySQL and get "access to data dictionary table 'mysql.table_stats' is rejected", it is because MySQL (contrary to MariaDB) prevents access to internal tables. To be able to run Linkifier, start the DB in debug mode (http://datacharmer.blogspot.com/2016/09/showing-hidden-tables-in-mysql-8-data.html).
- If you have problems connecting to MSSQL, make sure the login with username/password combination is permitted (https://serverfault.com/questions/246951/set-a-login-with-username-password-for-sql-server-2008-express) and the TCP/IP protocol is permitted (https://stackoverflow.com/questions/18841744/jdbc-connection-failed-error-tcp-ip-connection-to-host-failed).
- If you want to use Windows authentication instead of a username/password combination, a correct version of sqljdbc_auth.dll (32/64-bit) must be in the Java path. Possibly the easiest solution is to copy sqljdbc_auth.dll from the source directory lib/mssql_auth into the bin directory in Java (e.g., C:\Program Files\Java\jre8\bin). Alternatively, start Linkifier from the command line with arguments like: java -Djava.library.path="C:\path\to\sqljdbc_auth\directory" -jar linkifier.jar
- If you get an error from a database driver, the driver might be outdated. Replace the outdated driver with a new one that is compatible with JDK 1.8 and your version of the database.
- If you get "table count: 0", make sure that the database name and schema name are both explicitly specified (in MySQL you specify only the database name).
- If you get "error: could not find or load main class controller.MainApp, caused by java.lang.NoClassDefFoundError: javafx/application/Application" on macOS with an M1 processor or newer, you actually have to use OpenJDK:
  1. Download and unpack OpenJDK (https://jdk.java.net/20/).
  2. Download and unpack JavaFX SDK (https://gluonhq.com/products/javafx/) for the macOS x64 platform (aarch64 currently does not work).
  3. Execute Linkifier with something like (adjust paths based on reality): Downloads/jdk-20.jdk/Contents/Home/bin/java --module-path Downloads/javafx-sdk-20/lib --add-modules javafx.controls,javafx.fxml -jar Downloads/linkifier/linkifier-3.2.9.jar
- If the GUI still does not work, run Linkifier in command-line mode (https://github.com/janmotl/linkifier/issues/8).
- If you have any question or suggestion, let me know.

How to cite: "Foreign Key Constraint Identification in Relational Databases" (http://ceur-ws.org/Vol-1885/106.pdf), Jan Motl and Pavel Kordík, 2017, ITAT 2017 Proceedings.

Acknowledgement: I would like to thank Aleš Fišer, Oliver Kerul-Kmec, Jan Kukačka, Jiří Kukačka, Manuel Muñoz, George Papaloukopoulos, and Batal Thibaut for their help. The code uses SimMetrics (https://github.com/Simmetrics/simmetrics) for text similarity calculations.
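The compound-PK stopping rule is easy to state in code. Here is a minimal sketch (not Linkifier's actual Java implementation) using made-up candidate columns with their PK probabilities and unique-value counts:

```python
def compound_pk(candidates, table_row_count):
    """candidates: list of (column_name, pk_probability, unique_count).
    Add columns by descending PK probability until the optimistic
    unique-count product covers the table's row count."""
    ranked = sorted(candidates, key=lambda c: c[1], reverse=True)
    pk, coverage = [], 1
    for name, _, unique_count in ranked:
        pk.append(name)
        coverage *= unique_count       # optimistic estimate: product of counts
        if coverage >= table_row_count:
            break
    return pk

# The README's worked example: 100 rows; the best candidate has 11 unique
# values, the second best has 13; 11 * 13 = 143 >= 100, so the PK has 2 columns.
print(compound_pk([("a", 0.9, 11), ("b", 0.8, 13), ("c", 0.4, 7)], 100))
```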
database foreign-keys reverse-engineering er-diagram mysql postgresql mssql referential-integrity-constraint schema oracle
server
golangAnnotations
Build status: https://travis-ci.com/MarcGrol/golangAnnotations. Coverage: https://coveralls.io/github/MarcGrol/golangAnnotations. BCH compliance: https://bettercodehub.com. Maintainability: https://codeclimate.com/github/MarcGrol/golangAnnotations/maintainability. Detailed explanation: https://github.com/MarcGrol/golangAnnotations/wiki

Summary: the golangAnnotations tool parses your Golang source code into an intermediate representation. Using this intermediate representation, the tool uses your annotations to generate source code that would be cumbersome and error-prone to write manually. Bottom line: a lot less code needs to be written.

Example:

```go
// @RestOperation( method = "GET", path = "/person/:uid" )
func (s *Service) getPerson(c context.Context, uid string) (Person, error) {
    ...
}
```

Based on the annotation line, code is generated that will do all HTTP handling: read the request, unmarshal the request, call the business logic, marshal the response, and write the response. In addition, type-strong test functions are generated that ease testing of your REST operations. The same annotation approach is used to ease event sourcing.

Getting the software:

```
go get -u -t -v github.com/MarcGrol/golangAnnotations
```

Testing and installing:

```
make gen
make test
make install
```

or just `make`.

Currently supported annotations. This first implementation provides the following kinds of annotations:
- Web services (JAX-RS like): generate server-side HTTP handling for a service; generate client-side HTTP handling for a service; generate helpers to ease integration testing of your services.
- Event listeners: generate server-side HTTP handling for receiving events; generate helpers to ease integration testing of your event listeners.
- Event sourcing: describe which events belong to which aggregate; type-strong boilerplate code to build an aggregate from individual events; type-strong boilerplate code to wrap and unwrap events into an envelope so that they can be easily stored and emitted.

How to use HTTP server related annotations (JAX-RS like): a regular Golang struct definition with our own @RestService and @RestOperation annotations. Observe that examples/rest/tourService.go (examples/myrest/tourService.go) is used as input:

```go
// @RestService( path = "/api" )
type Service struct {
}

// @RestOperation( method = "GET", path = "/person/:uid" )
func (s *Service) getPerson(c context.Context, uid string) (Person, error) {
    ...
}
```

Observe that examples/rest/gen_tourService.go has been generated. An example of the generated HTTP handler can be found in the wiki: https://github.com/MarcGrol/golangAnnotations/wiki

How to use event sourcing related annotations: a regular Golang struct definition with our own @Event annotation:

```go
// @Event( aggregate = "Tour" )
type TourEtappeCreated struct {
    ...
}
```

Observe that examples/event/gen_wrappers.go and examples/event/gen_aggregates.go have been created in examples/structexample.

Command to trigger code generation: we use the go:generate mechanism to trigger our golangAnnotations executable. In order to trigger this mechanism, we use a go:generate comment with the command to be executed:

```go
//go:generate golangAnnotations -input-dir .
```

So we can use the regular toolchain to trigger code generation:

```
cd ${GOPATH}/src/github.com/MarcGrol/golangAnnotations
go generate ./...
# goimports will fix all the imports
for i in $(find . -name "*.go"); do goimports -w -local github.com/MarcGrol $i; done
# fix formatting for generated code
for i in $(find . -name "*.go"); do gofmt -s -w $i; done
```
annotations code-generation golang golang-tools go parser tools
front_end
covid19
covid19

Problem description. Welcome to the COVID-19 data engineering project. The main objective of this project is to apply data engineering concepts and utilize various tools on AWS to create an end-to-end data engineering solution for COVID-19 data analysis. The project follows an extract-transform-load (ETL) approach, where users can extract, load, and transform the data to gain valuable insights. The ultimate goal is to create a simple data engineering project that enables efficient handling and analysis of COVID-19 data.

Architecture: https://github.com/umidmirzaev/covid19/blob/main/images/architecture.png?raw=true

Important prerequisites:
1. Create an IAM role named s3-glue-role. This role should grant Glue the necessary permissions to call AWS services on your behalf.
2. Create a Redshift cluster and configure the VPC security group. Set up a Redshift cluster and adjust the VPC security group to allow inbound access by adding a new rule for your IP address.
3. Create an IAM role named redshift-s3-access. This role should provide Redshift with permission to access data stored in Amazon S3.
4. Prepare the Glue environment for the Redshift job. Prior to creating a job in Glue, download the necessary external Redshift connector and the boto3 library. Since Glue has limited support for default libraries, create a zip package containing these libraries and upload it to an S3 bucket.
5. Establish a JDBC connection to the Redshift cluster. To ensure successful execution of a job, create a JDBC connection to your Redshift cluster.
6. Create a VPC endpoint for Amazon S3. It is essential to create a VPC endpoint for Amazon S3 to establish connectivity between the VPC of your Glue job and S3.

Key steps:
1. Upload the COVID-19 data lake from the Registry of Open Data on AWS to Amazon S3. The project begins by uploading the COVID-19 data lake from a registry of open data onto the Amazon S3 storage service.
2. Create and run crawlers in Amazon Athena. Next, we set up and execute crawlers within Amazon Athena. These crawlers scan the tables stored in the Amazon S3 buckets, extracting structured data and making it easily accessible for analysis.
3. Python ETL job. We implement an ETL job in Python, which focuses on converting the relational data model into a dimensional data model using a star schema.
4. Upload transformed data to Amazon S3. After the ETL job is executed, the resulting dataframe is converted into a CSV-formatted string representation. This transformed data is then uploaded back to an Amazon S3 bucket for storage and convenient access (a minimal sketch of this upload appears at the end of this section).
5. Create and configure VPC networking; connect and run the ETL job in AWS Glue. In this step we create and configure virtual private cloud (VPC) networking. By establishing secure communication and connectivity, we can seamlessly run the ETL job in AWS Glue to create tables in Amazon Redshift.

Tech stack. To build an end-to-end data pipeline for this project, we'll be utilizing the following tools:
- Python: data extraction, transformation, and loading.
- SQL: querying and manipulating data within databases.
- Amazon Athena: instrumental in running queries on our data stored in Amazon S3; it allows us to analyze structured and semi-structured data using standard SQL syntax.
- AWS Glue: an ETL (extract, transform, load) service provided by AWS; we'll utilize it to create and run our ETL jobs, transforming data between various formats and storing it in the desired destination.
- Amazon Redshift: a fully managed data warehousing service offered by AWS; it will serve as our storage and querying solution for large-scale data analysis.
- Amazon S3: S3 (Simple Storage Service) is an object storage service on AWS that will be used to store our raw and transformed data; it provides high durability, availability, and scalability.
- IAM: Identity and Access Management will be employed to manage user access and permissions to various AWS resources, ensuring secure and controlled data handling.

Data model: converting our relational data model to a dimensional model by building a star schema.
- Before data modeling: https://github.com/umidmirzaev/covid19/blob/main/images/before.jpg?raw=true
- After data modeling: https://github.com/umidmirzaev/covid19/blob/main/images/after.jpg?raw=true

Further improvement. In addition to the project's results, there exist opportunities for analysis and visualization. By establishing a connection between Redshift and visualization tools such as Tableau or Power BI, it becomes feasible to conduct analysis and construct visualization tables. This integration enables users to explore the data further, extract meaningful insights, and present them in visually informative formats.
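As a concrete illustration of step 4 above, here is a minimal sketch of converting a dataframe to a CSV string and uploading it to S3 with boto3; the bucket, key, and dataframe contents are placeholders, not the project's actual configuration.

```python
import io

import boto3
import pandas as pd

def upload_dataframe_as_csv(df: pd.DataFrame, bucket: str, key: str) -> None:
    """Serialize a dataframe to a CSV string and put it into an S3 bucket."""
    buffer = io.StringIO()
    df.to_csv(buffer, index=False)  # CSV-formatted string representation
    s3 = boto3.client("s3")
    s3.put_object(Bucket=bucket, Key=key, Body=buffer.getvalue())

# Placeholder dimension table and names, for illustration only.
dim_date = pd.DataFrame({"date": ["2021-01-01"], "year": [2021], "month": [1]})
upload_dataframe_as_csv(dim_date, bucket="my-covid19-output", key="output/dim_date.csv")
```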
cloud
Natural-Language-Processing-with-Java-Cookbook
Natural Language Processing with Java Cookbook

Natural Language Processing with Java Cookbook, published by Packt.

Download a free PDF: if you have already purchased a print or Kindle version of this book, you can get a DRM-free PDF version at no cost. Simply click on the link to claim your free PDF: https://packt.link/free-ebook/9781789801156
ai
healthvault-dotnetstandard-sdk
HealthVault SDK for .NET Standard

This project is an SDK providing HealthVault access to applications built on .NET Standard. The .NET Standard projects use the new csproj format and require Visual Studio 2017 (https://www.visualstudio.com/), which is available as a free download. The master branch represents the latest development code, and release branches are used to stabilize for an official release.

This project has adopted the Microsoft Open Source Code of Conduct (https://opensource.microsoft.com/codeofconduct/). For more information, see the Code of Conduct FAQ (https://opensource.microsoft.com/codeofconduct/faq/) or contact opencode@microsoft.com with any additional questions or comments.

Contributing: we welcome contributions in the form of bugs, code, feedback, and requests. For more detailed information regarding how to contribute, see our contribution guidelines (CONTRIBUTING.md).
front_end
OBD_labeledVehicularDataBase
OBD_labeledVehicularDataBase

A labeled vehicular database to identify safe or unsafe driving. It was made during my research in the Master in Systems Engineering. There are two types of labeled data:
- databases with 13 PIDs that were recorded every 2.4 s (two databases),
- databases with 5 PIDs that were recorded every 1 s (four databases).
server
moya
moya

Moya is a platform for the rapid development of modern web applications.

Installation: you can install from PyPI with the following:

```
pip install moya
```

Or download the repos and run the following:

```
python setup.py install
```

Whichever method you use, this will add a `moya` command-line application.

Quick start: Moya quick-start will create a basic web application based on your specifications in seconds. Run the following to get started:

```
moya start project
```

Example projects: in addition to the example projects directory in the Moya repos, you can check out the following complete projects:
- Moya Techblog (https://github.com/moyaproject/moya-techblog): a blogging application for coders and photographers
- Bean Counter (https://github.com/moyaproject/beancounter): a virtual currency platform
- Encrypted Notes (https://github.com/moyaproject/notes): an encrypted note-taking application
- Social Links (https://github.com/moyaproject/sociallinks): a social linking (Reddit-like) application
- Short URL (https://github.com/moyaproject/shorturl): a super simple URL shortener application

Running a project: navigate to the project directory you want to run, then do the following:

```
moya init
```

This will initialize the database and users system; you only need to do this once. To run a development server, do the following:

```
moya runserver
```

and navigate to http://127.0.0.1:8000.

Package index: find packages for Moya on the Moya Package Index (https://packages.moyaproject.com). Install packages with `moya-pm`, which comes with Moya:

```
moya-pm install moya.sociallinks
```

More information:
- Moya homepage: http://www.moyaproject.com
- Moya's documentation: http://docs.moyaproject.com
- Developer's blog: https://www.willmcgugan.com/tag/moya
front_end
fiftyone
FiftyOne: the open-source tool for building high-quality datasets and computer vision models.

Website (https://voxel51.com/fiftyone) | Docs (https://voxel51.com/docs/fiftyone) | Try it now (https://colab.research.google.com/github/voxel51/fiftyone-examples/blob/master/examples/quickstart.ipynb) | Tutorials (https://voxel51.com/docs/fiftyone/tutorials/index.html) | Examples (https://github.com/voxel51/fiftyone-examples) | Blog (https://voxel51.com/blog) | Community (https://slack.voxel51.com)

Nothing hinders the success of machine learning systems more than poor-quality data. And without the right tools, improving a model can be time-consuming and inefficient. FiftyOne (https://fiftyone.ai) supercharges your machine learning workflows by enabling you to visualize datasets and interpret models faster and more effectively. Use FiftyOne to get hands-on with your data, including visualizing complex labels, evaluating your models, exploring scenarios of interest, identifying failure modes, finding annotation mistakes, and much more.

You can get involved by joining our Slack community (https://slack.voxel51.com), reading our blog on Medium (https://medium.com/voxel51), and following us on social media: Twitter (https://twitter.com/voxel51), LinkedIn (https://www.linkedin.com/company/voxel51), Facebook (https://www.facebook.com/voxel51).

Installation: you can install the latest stable version of FiftyOne via pip:

```shell
pip install fiftyone
```

Consult the installation guide (https://voxel51.com/docs/fiftyone/getting_started/install.html) for troubleshooting and other information about getting up and running with FiftyOne.

Quickstart: dive right into FiftyOne by opening a Python shell and running the snippet below, which downloads a small dataset (https://voxel51.com/docs/fiftyone/user_guide/dataset_zoo/datasets.html#quickstart) and launches the FiftyOne App (https://voxel51.com/docs/fiftyone/user_guide/app.html) so you can explore it:

```py
import fiftyone as fo
import fiftyone.zoo as foz

dataset = foz.load_zoo_dataset("quickstart")
session = fo.launch_app(dataset)
```

Then check out this Colab notebook (https://colab.research.google.com/github/voxel51/fiftyone-examples/blob/master/examples/quickstart.ipynb) to see some common workflows on the quickstart dataset. Note that if you are running the above code in a script, you must include `session.wait()` to block execution until you close the App; see https://voxel51.com/docs/fiftyone/user_guide/app.html#creating-a-session for more information.

Documentation: full documentation for FiftyOne is available at https://fiftyone.ai. In particular, see these resources:
- Tutorials (https://voxel51.com/docs/fiftyone/tutorials/index.html)
- Recipes (https://voxel51.com/docs/fiftyone/recipes/index.html)
- User guide (https://voxel51.com/docs/fiftyone/user_guide/index.html)
- CLI documentation (https://voxel51.com/docs/fiftyone/cli/index.html)
- API reference (https://voxel51.com/docs/fiftyone/api/fiftyone.html)

Examples: check out the fiftyone-examples repository (https://github.com/voxel51/fiftyone-examples) for open-source and community-contributed examples of using FiftyOne.

Contributing to FiftyOne: FiftyOne is open source, and community contributions are welcome. Check out the contribution guide (https://github.com/voxel51/fiftyone/blob/develop/CONTRIBUTING.md) to learn how to get involved.

Installing from source: the instructions below are for macOS and Linux systems; Windows users may need to make adjustments. If you are working in Google Colab, skip to "Source installs in Google Colab" below.

Prerequisites: you will need
- Python (https://www.python.org) 3.7 or newer,
- Node.js (https://nodejs.org); on Linux, we recommend using nvm (https://github.com/nvm-sh/nvm) to install an up-to-date version,
- Yarn (https://yarnpkg.com); once Node.js is installed, you can install Yarn via `npm install -g yarn`,
- on Linux, at least the openssl and libcurl packages. On Debian-based distributions, you will need to install libcurl4 or libcurl3 instead of libcurl, depending on the age of your distribution. For example:

```shell
# Ubuntu
sudo apt install libcurl4 openssl
# Fedora
sudo dnf install libcurl openssl
```

Installation: we strongly recommend that you install FiftyOne in a virtual environment (https://voxel51.com/docs/fiftyone/getting_started/virtualenv.html) to maintain a clean workspace. The install script is only supported on POSIX-based systems (e.g., Mac and Linux). First, clone the repository:

```shell
git clone https://github.com/voxel51/fiftyone
cd fiftyone
```

Then run the install script:

```shell
bash install.bash
```

Notes:
- If you run into issues importing FiftyOne, you may need to add the path to the cloned repository to your PYTHONPATH: `export PYTHONPATH=$PYTHONPATH:/path/to/fiftyone`
- The install script adds to your nvm settings in your ~/.bashrc or ~/.bash_profile, which is needed for installing and building the App.
- When you pull in new changes to the App, you will need to rebuild it, which you can do either by rerunning the install script or by just running `yarn build` in the ./app directory.

Upgrading your source installation: to upgrade an existing source installation to the bleeding edge, simply pull the latest develop branch and rerun the install script:

```shell
git checkout develop
git pull
bash install.bash
```

Developer installation: if you would like to contribute to FiftyOne (https://github.com/voxel51/fiftyone/blob/develop/CONTRIBUTING.md), you should perform a developer installation using the -d flag of the install script:

```shell
bash install.bash -d
```

Source installs in Google Colab: you can install from source in Google Colab (https://colab.research.google.com) by running the following in a cell and then restarting the runtime:

```shell
%%shell
git clone --depth 1 https://github.com/voxel51/fiftyone.git
cd fiftyone
bash install.bash
```

Docker installs: refer to these instructions (https://voxel51.com/docs/fiftyone/environments/index.html#docker) to see how to build and run Docker images containing source or release builds of FiftyOne.

UI development on Storybook: Voxel51 is currently in the process of implementing a Storybook (https://storybook.js.org), which contains examples of its basic UI components. You can access the current Storybook instances by running `yarn storybook` in app/packages/components. While the Storybook instance is running, any changes to a component will trigger a refresh in the Storybook app:

```shell
cd app/packages/components
yarn storybook
```

Generating documentation: see the docs guide (https://github.com/voxel51/fiftyone/blob/develop/docs/README.md) for information on building and contributing to the documentation.

Uninstallation: you can uninstall FiftyOne as follows:

```shell
pip uninstall fiftyone fiftyone-brain fiftyone-db fiftyone-desktop
```

Contributors: special thanks to these amazing people for contributing to FiftyOne: https://github.com/voxel51/fiftyone/graphs/contributors

Citation: if you use FiftyOne in your research, feel free to cite the project (but only if you love it!):

```bibtex
@article{moore2020fiftyone,
  title={FiftyOne},
  author={Moore, B. E. and Corso, J. J.},
  journal={GitHub. Note: https://github.com/voxel51/fiftyone},
  year={2020}
}
```
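Beyond launching the App, most FiftyOne work revolves around dataset views. Here is a minimal sketch, assuming the quickstart dataset from above with its `predictions` detections field, of filtering to high-confidence predictions; the 0.9 threshold is an arbitrary illustration, not a recommendation.

```python
import fiftyone as fo
import fiftyone.zoo as foz
from fiftyone import ViewField as F

dataset = foz.load_zoo_dataset("quickstart")

# Keep only predicted detections with confidence above 0.9 (arbitrary threshold).
high_conf_view = dataset.filter_labels("predictions", F("confidence") > 0.9)

print(len(high_conf_view))               # samples that still have such detections
session = fo.launch_app(high_conf_view)  # explore the filtered view in the App
session.wait()                           # block until the App is closed (scripts)
```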
machine-learning artificial-intelligence deep-learning computer-vision developer-tools data-science python hacktoberfest
ai
blog-finetuning-llama-adapters
blog-finetuning-llama-adapters: understanding parameter-efficient finetuning of large language models, from prefix tuning to adapters.
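Since the repo's topic is adapter-style parameter-efficient finetuning, here is a minimal sketch of the core idea: a small bottleneck module with a residual connection, trained while the base model stays frozen. This is a generic illustration in PyTorch, not the repo's code; the dimensions and the stand-in "base layer" are arbitrary.

```python
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter: project down, nonlinearity, project up, residual add.
    Only these few parameters are trained; the base model stays frozen."""
    def __init__(self, d_model: int, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(d_model, bottleneck)
        self.up = nn.Linear(bottleneck, d_model)
        self.act = nn.GELU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up(self.act(self.down(x)))  # residual keeps base behavior

# Toy usage: freeze a stand-in "pretrained" layer and train only the adapter.
d_model = 512
base_layer = nn.Linear(d_model, d_model)
for p in base_layer.parameters():
    p.requires_grad = False  # base weights are frozen

adapter = Adapter(d_model)
x = torch.randn(2, 10, d_model)  # (batch, seq_len, hidden)
out = adapter(base_layer(x))     # adapter refines the frozen layer's output
print(sum(p.numel() for p in adapter.parameters() if p.requires_grad))
```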
deep-learning large-language-models pytorch transformer
ai
html-css-ex-01
html-css-ex-01: web development basics.

1. Fork/clone the repo.
2. Check out the branch with your student ID.
3. Create (duplicate and rename) the blank HTML file as student_id.html.
4. Follow the instructions and hints in the document.
5. Commit and push.
6. Send a pull request.
front_end
mayday-2.0-frontend
mayday-2.0-frontend

Build status: https://travis-ci.org/MayOneUS/mayday-2.0-onboarding

Front-end repository for the Mayday 2.0 site, built with Jekyll (http://jekyllrb.com) and hosted on Amazon S3 (http://aws.amazon.com/s3).

Getting started. This site is built with the Ruby library Jekyll (http://jekyllrb.com), an HTML/CSS preprocessor that supports limited plugins. You'll need that on your computer to hack on the code. If you don't already have Ruby installed, we recommend installing it with RVM (https://rvm.io).

1. Install Ruby (any 2.1 version should be fine).
2. Install Bundler, if it isn't already installed, with `gem install bundler`.
3. Run `bundle install` to install dependent gems, including the Jekyll library (which is a gem).
4. Run `jekyll server --watch` to get the server up and running.
5. Check out http://localhost:4000 to see the site up and running.

Any update you make to the source files should be reflected on localhost virtually immediately.

If you run into "ExecJS: autodetect could not find a JavaScript runtime", you can confirm the following gems are installed (or install Node.js, which provides a compatible JavaScript runtime):

1. `gem install execjs`
2. `gem install therubyracer`

ExecJS requires a JS runtime, and therubyracer installs a V8 JavaScript runtime (see Jekyll issue 2327: https://github.com/jekyll/jekyll/issues/2327). Note: non-Intel machines may have additional issues.

Contributing: code review process. Goal: ensure at least two parties have reviewed any code committed for production. Process:
1. Branch off for any new feature development.
2. Regularly commit to your branch.
3. When the code is ready to be merged, create a merge request. The merge request should be mergeable by GitHub, and all tests should be passing.
4. Assign another developer to review your merge request.
5. The merge request is reviewed and merged on GitHub.

Hosting, deployments, and continuous integration. We use Amazon's Simple Storage Service (S3) to host this site. Here are the URLs for each environment:
- alpha: http://alpha.mayday.us (automatically receives changes from tags with "alpha" in the tag name)
- beta/master: http://beta.mayday.us (automatically receives changes to the master branch)
- production: https://mayday.us (automatically receives changes to the production branch)

We use the s3_website project (https://github.com/laurilehmijoki/s3_website) to deploy this site. You can see the build details and history at https://travis-ci.org/MayOneUS/mayday-2.0-frontend.

To manually deploy to production, follow these steps:
1. Fetch and merge the latest code from the production branch to your machine.
2. Run `jekyll build`.
3. Make sure the credentials for the AWS S3 production deploy user are in your .env file (ask another Mayday tech team member if you don't have these).
4. Install the s3_website gem with `gem install s3_website` if you haven't already.
5. Run `s3_website push` to deploy the new code.

If you have added new redirects to the s3_website.yml file, you will also need to run `s3_website cfg apply`.

License: this project is open source under the Apache License 2.0. Note that while this license allows code reuse, it does not permit reuse of Mayday PAC branding or logos. If you reuse this project, you may need to remove Mayday PAC branding from the source code.
front_end