names stringlengths 1-98 | readmes stringlengths 8-608k | topics stringlengths 0-442 | labels stringclasses 6 values |
---|---|---|---|
tuto_python_kafka_cassandra_postgres | h1 align center python kafka cassandra postgresql tutorial h1 what is it data engineering project consume data from kafka and insert into cassandra amp postgresql databases there is also some data enrichment use case in this use case we are to get data from kafka broker consuming and inserting into databases cassandra postgresql simultaneously how does it work initialization before we start the project we need to update the conf sample ini file that holds configurations then rename it to conf ini how to use it python python3 initialize database py todo add other data pipeline tools in the design and see how they work together design a website to monitor all the pipeline trigger data generation kafka topics messages stats records in cassandra postgresql tables license this project is licensed under the mit license see the license license file for details credits see as you fit contact if you have any questions or would like to get in touch please open an issue on github or send me an email mike kenneth47 gmail com | server |
|
1-hanyumeng-643 | 1 hanyumeng 643 the notes about technological information | server |
|
Computer-Vision-Projects-with-OpenCV-and-Python-3 | computer vision projects with opencv and python 3 a href https www packtpub com big data and business intelligence computer vision projects opencv and python 3 utm source github utm medium repository utm campaign 9781789954555 img src https d1ldz4te4covpm cloudfront net sites default files imagecache ppv4 main book cover 4555 20 b13293 png alt computer vision projects with opencv and python 3 height 256px align right a this is the code repository for computer vision projects with opencv and python 3 https www packtpub com big data and business intelligence computer vision projects opencv and python 3 utm source github utm medium repository utm campaign 9781789954555 published by packt six end to end projects built using machine learning with opencv python and tensorflow what is this book about python is the ideal programming language for rapidly prototyping and developing production grade codes for image processing and computer vision with its robust syntax and wealth of powerful libraries this book will help you design and develop production grade computer vision projects tackling real world problems this book covers the following exciting features install and run major computer vision packages within python apply powerful support vector machines for simple digit classification understand deep learning with tensorflow build a deep learning classifier for general images use lstms for automated image captioning read text from real world images extract human pose data from images if you feel this book is for you get your copy https www amazon com dp 178995455x today a href https www packtpub com utm source github utm medium banner utm campaign githubbanner img src https raw githubusercontent com packtpublishing github master github png alt https www packtpub com border 5 a instructions and navigations all of the code is organized into folders for example chapter02 the code will look like the following testfile test images dog jpeg figure imshow imread testfile following is what you need for this book python programmers and machine learning developers who wish to build exciting computer vision projects using the power of machine learning and opencv will find this book useful the only prerequisite for this book is that you should have a sound knowledge of python programming with the following software and hardware list you can run all code files present in the book chapter 1 7 software and hardware list chapter software required os required 1 7 anaconda python 3 x jupyter notebook windows mac os x and linux any we also provide a pdf file that has color images of the screenshots diagrams used in this book click here to download it http www packtpub com sites default files downloads 9781789954555 colorimages pdf related products deep learning for computer vision packt https www packtpub com big data and business intelligence deep learning computer vision utm source github utm medium repository utm campaign 9781788295628 amazon https www amazon com dp b072l1cg5x practical computer vision packt https www packtpub com big data and business intelligence practical computer vision utm source github utm medium repository utm campaign 9781788297684 amazon https www amazon com dp b079qxg3wr get to know the author matthew rever matthew rever is an image processing and computer vision engineer at a major national laboratory he has years of experience in automating the analysis of complex scientific data as well as in controlling sophisticated instruments 
he has applied computer vision technology to save a great many hours of valuable human labor he is also enthusiastic about making the latest developments in computer vision accessible to developers of all backgrounds suggestions and feedback click here https docs google com forms d e 1faipqlsdy7datc6qmel81fiuuymz0wy9vh1jhkvpy57oimekgqib ow viewform if you have any feedback or suggestions download a free pdf i if you have already purchased a print or kindle version of this book you can get a drm free pdf version at no cost br simply click on the link to claim your free pdf i p align center a href https packt link free ebook 9781789954555 https packt link free ebook 9781789954555 a p | ai |
|
Exemplos-FreeRTOS | freertos examples example projects with freertos and esp32 using the arduino framework material used for teaching the freertos api hardware esp32 board leds and resistors push buttons potentiometer oled i2c display breadboard jumpers environment vscode platformio arduino framework for esp32 examples 00 blinks the on board led at 1 second intervals example to understand the development flow in pio 01 example to verify that an rtos works using freertos 02 example to create two tasks with different stack sizes 03 example to delete a task 04 example to stop and restart a task 05 06 07 08 09 10 11 12 13 14 15 16 17 18 extras 1 espnowreceiver 2 espnowsender 3 esp32 getmac | os |
|
DB | db third year database engineering lab | server |
|
dns-tools | dns tools build status https travis ci org egymgmbh dns tools svg branch master https travis ci org egymgmbh dns tools go report card https goreportcard com badge github com egymgmbh dns tools https goreportcard com report github com egymgmbh dns tools dns tools that we use for sre work license copyright 2017 egym gmbh licensed under the apache license version 2 0 the license you may not use this file except in compliance with the license you may obtain a copy of the license at http www apache org licenses license 2 0 unless required by applicable law or agreed to in writing software distributed under the license is distributed on an as is basis without warranties or conditions of any kind either express or implied see the license for the specific language governing permissions and limitations under the license | cloud |
|
PeTaL | petal periodic table of life the periodic table of life petal pronounced petal is a design tool aimed at allowing users to seamlessly move from ideas from nature or other sources to design petal is comprised of multiple interconnected services this repository is for the reactjs web front end client there are other repositories for the api and database https github com nasa petal db and labeller https github com nasa petal labeller getting started these instructions will get you a copy of the project up and running on your local machine for development and testing purposes see deployment for notes on how to deploy the project to a production system uses the create react app material ui example https github com mui org material ui tree master examples create react app after cloning this repo run npm install npm start or yarn install yarn start contributing we re using the git feature branch workflow https www atlassian com git tutorials comparing workflows feature branch workflow to contribute code to the code base to summarize master should always contain only stable tested code to work on a change first create a feature branch off of master perform your changes and when your code is ready submit a pull request to merge your branch with master deployment npm run deploy https create react app dev docs deployment github pages view site at https nasa petal github io petal https nasa petal github io petal layout legacy code for the django version as it existed in 2020 see the master legacy 2020 branch for flask version of petal as it existed in 2019 see the legacy directory in the master legacy 2020 branch since the html css javascript is similar to what is currently used and some code is shared this is kept close by potentially serving as a reference for future developers to build upon one module in the legacy code biomole makes use of code written by soren knudsen the code is available at https github com sknudsen biomole https github com sknudsen biomole the code makes use of data from asknature that is not in the published source due to property rights instead you can get the data from the links below to see a working version http biomole asknature org json hierarchy json http biomole asknature org json strategies json useful tutorials https www youtube com watch v bybgopx44vo https www youtube com watch v dlx62g4lc44 project team vik shyam principal investigator colleen unsworth product owner https www mountaingoatsoftware com agile scrum roles product owner herb schilling data science lead consultant facilitator alternate mentor calvin robinson data architect defines how the data will be stored consumed integrated and managed by different data entities and it systems as well as any applications using or processing that data in some way alex ralevski data scientist and machine learning engineer paht juangphanich technical support helps team members overcome any technical challenges they face brandon ruffridge scrum master https www agilealliance org glossary scrum master and technical lead petal 0 4 shruti janardhanan 08 2020 jerry qian 08 2020 petal 0 3 lucas saldyt backend development machine learning 01 2020 04 2020 olakunle akinpelu backend development 01 2020 06 2020 kei kojima elliot hill benjamin huynh petal 0 2 angeera naser front end development and intern team lead summer 2018 allie calvin ui development and data collection bishoy boktor backend development brian whiteaker text classification connor baumler backend development courtney schiebout backend development isaias 
reyes 3d visualizations interactions full redesign jonathan dowdall backend development kaylee gabus front end development kei kojima front end development lauren friend backend development manju johny backend development petal 0 1 r nicholas bense backend development allie calvin ui development and data collection victoria kravets backend development | os |
|
SDSIT | seminar in data science welcome this course is jointly taught by uc berkeley and the tsinghua berkeley shenzhen institute tbsi notice the password access to all links below is shared in the course wechat group please check slides with annotation in real time courses are also offered with title named like lect annot instructors prof laurent el ghaoui uc berkeley https people eecs berkeley edu elghaoui elghaoui at berkeley edu ta zifeng wang tbsi wangzf18 at mails tsinghua edu cn final project 8 12 update the final project presentation has been decided on monday 8 30 am 8 17 8 10 update guideline note project guideline sdsit 2020 pdf for the final project schedule week lec date beijing time california time hw given hw due office hour 29 1 2020 7 13 8 30am 10 05am 5 30pm 7 05pm 1 2 2020 7 15 8 30am 10 05am 5 30pm 7 05pm 3 2020 7 17 8 30am 10 05am 5 30pm 7 05pm yes 30 4 2020 7 20 8 30am 10 05am 5 30pm 7 05pm 5 2020 7 22 8 30am 10 05am 5 30pm 7 05pm 6 2020 7 24 8 30am 10 05am 5 30pm 7 05pm 2 1 yes 31 7 2020 7 27 8 30am 10 05am 5 30pm 7 05pm yes 8 2020 7 29 8 30am 10 05am 5 30pm 7 05pm 9 2020 7 31 8 30am 10 05am 5 30pm 7 05pm 32 10 2020 8 3 8 30am 10 05am 5 30pm 7 05pm yes 11 2020 8 5 8 30am 10 05am 5 30pm 7 05pm 3 2 12 2020 8 7 8 30am 10 05am 5 30pm 7 05pm 33 13 2020 8 10 8 30am 10 05am 5 30pm 7 05pm yes 14 2020 8 12 8 30am 10 05am 5 30pm 7 05pm 15 2020 8 14 8 30am 10 05am 5 30pm 7 05pm yes 34 16 8 17 8 30am 3 hours 3 slides recordings day topic speaker slides notes real time lecture recordings hw 1 intro optimization models laurent el ghaoui lect1 https cloud tsinghua edu cn f a4b7d6177c834266bafa br br lect1 annot https cloud tsinghua edu cn f ede7beab10054de9beac rec1 https cloud tsinghua edu cn f e95a96694ab4411db9ef hw1 https cloud tsinghua edu cn f 1df50b71c25248799321 br br hw1 data https cloud tsinghua edu cn f 3fc33723e0f04836b831 2 intro machine learning laurent el ghaoui lect2 https cloud tsinghua edu cn f 68587cb22b1f4391b347 br br lect2 annot https cloud tsinghua edu cn f 150c09df49ec4bf6a4f9 rec2 https cloud tsinghua edu cn f 9c2c3e3416db475997c5 3 covariance estimation laurent el ghaoui lect3 https cloud tsinghua edu cn f 6165ac02bb2d42588171 br br lect3 annot https cloud tsinghua edu cn f 9c7ecd27ee4943ea820b rec3 https cloud tsinghua edu cn f 7cce93b3a920468a82ec 4 principal components analysis laurent el ghaoui lect4 https cloud tsinghua edu cn f 693d1672a90d40388ca5 rec4 https cloud tsinghua edu cn f 4504e7efd31b4aedbfad 5 generalized low rank models and matrix completion laurent el ghaoui lect5 https cloud tsinghua edu cn f 1226fb0aa4f042fdb876 rec5 https cloud tsinghua edu cn f 3058381cddc54a73b604 6 regression and penalization laurent el ghaoui lect6 https cloud tsinghua edu cn f 452bddbdb755415a8e94 rec6 https cloud tsinghua edu cn f cc6f3fe5646f4ca0abed 7 classification laurent el ghaoui lect7 https cloud tsinghua edu cn f 02cf0b8f6dfe4300853a rec7 https cloud tsinghua edu cn f 260333e5125f425a98b1 hw1 sol https cloud tsinghua edu cn f 66d903134cb743038782 8 kernel methods feature engineering and encoding laurent el ghaoui lect8 https cloud tsinghua edu cn f a5619c7987ea4e209e03 rec8 https zoom com cn rec share nven7fm6ttladbl9vhmapidt5bnx6a81yky8vyoxufu zngcuzajjgjqyxfhtr8 hw2 https cloud tsinghua edu cn f e9e0ab1150284df3bc70 br br hw2 data https cloud tsinghua edu cn f 95f5f27d03b24fd094de 9 neural networks laurent el ghaoui rec9 https zoom com cn rec share psjefzuq9d9otnlp lzvcf4tokffx6a80haz afbn0i umbnne0stszmpzdhdoae hw2 sol https cloud tsinghua edu cn f 
9a809ddfe32249b98060 10 convex functions laurent el ghaoui lec10 https cloud tsinghua edu cn f a2c7fb758d7449d18ab5 rec10 https zoom com cn rec share 1mbfcuj5xhnogp2c51nyau0qqy ft6a8hiblrvcmxuyq3stif4 sovgyxylutxxy 11 convex optimization laurent el ghaoui lec11 https cloud tsinghua edu cn f 63e83e7200364c33ace8 rec11 https zoom com cn rec share 1pjzc43wxejocs qxn7va6okacp6t6a82iax pmixuojao0eymta 0xv0iog1ll 12 conic models laurent el ghaoui lec12 https cloud tsinghua edu cn f 6dd2bb63317945e1bcab rec12 https zoom com cn rec share 7ufskudq3uvob4315gzcqkd8bqdxaaa80ced vpymenqvlnv2mqjcyljhoxzdihp 13 robust optimization laurent el ghaoui lec13 https cloud tsinghua edu cn f ccd0f59e540a4c03b5f2 rec13 https zoom com cn rec share 6tqpc7b6ttpie43v1h le 8dd53sx6a8gsliqpqkyx61sovunfx1ggdvz9b391hm 14 affine recourse for multi period decision models laurent el ghaoui lec14 https cloud tsinghua edu cn f 4cd4ef6b837a4f1ebe3c rec14 https zoom com cn rec share 3nvqcpfz5njlg5wr40 ox6lxgiv4eaa80ymc vfykl nnsztekpocdp mpfuxii 15 review laurent el ghaoui lec15 https cloud tsinghua edu cn f 3cc58506126f4d3697a7 rec15 https zoom com cn rec share 2svsfrlz6t5ot6ptwwqyvyj wyjkeaa8h3afq6dbme44vysuavphlpxd5uwsagc2 16 final presentation lecture note live book optimization models and applications http livebooklabs com keeppies c5a5868ce26b8125 extensive slides eecs 127 uc berkeley https cloud tsinghua edu cn d 7e93767fcfd9493e990d | server |
|
libpeer | libpeer portable webrtc library for iot embedded device pear ci https github com sepfy pear actions workflows pear ci yml badge svg libpeer is a webrtc implementation written in c developed with bsd socket the library aims to integrate iot embedded device video audio streaming with webrtc such as esp32 and raspberry pi features video audio codec h264 g 711 pcm a law g 711 pcm mu law opus datachannel stun turn signaling whip https www ietf org archive id draft ietf wish whip 01 html mqtt dependencies mbedtls https github com mbed tls mbedtls libsrtp https github com cisco libsrtp usrsctp https github com sctplab usrsctp cjson https github com davegamble cjson git corehttp https github com freertos corehttp coremqtt https github com freertos coremqtt getting started bash sudo apt y install git cmake git clone recursive https github com sepfy libpeer cd libpeer build third party sh mkdir cmake cd cmake cmake make wget http www live555 com livemedia public 264 test 264 download test video file wget https mauvecloud net sounds alaw08m wav download test audio file examples sample sample examples esp32 https github com sepfy libpeer tree main examples esp32 mjpeg over datachannel esp32 s3 https github com sepfy libpeer tree main examples esp32s3 push video and audio to media server raspberry pi https github com sepfy libpeer tree main examples raspberrypi video and two way audio stream | webrtc raspberry-pi c iot linux h264 | server |
Instruction_Finetuning_Tutorials | pycon 2023 workshop instruction finetuning icon pycon jpg welcome to the repository this work is created for participants of a workshop track at pycon 2023 https in pycon org 2023 there are multiple notebooks available there here are short descriptions which may help to quickly follow 1 inference zero shot few shot tutorial notebook 1 this utilizes llama 2 7b https huggingface co meta llama llama 2 7b chat hf and gpt 3 5 turbo https platform openai com docs models gpt 3 5 models for predicting sentiment on financial statements https huggingface co datasets financial phrasebank i will be walking over this notebook to develop basic intuition about llms and issues with prompt engineering in real case scenarios 2 inference zero shot few shot practice notebook 2 this utilizes llama 2 7b and gpt 3 5 turbo models for predicting language clarity categories on esg statements https huggingface co datasets abhijeet3922 esg prospectus clarity category from prospectus this notebook is meant for participants to practice 3 peft prompt tuning sentiment notebook 3 prompt tuning is a parameter efficient finetuning technique which utilizes soft tokens extra parameters to learn a specific task along with utilizing the pretrained knowledge this demonstrates finetuning an llm with few examples 1k examples to get much improved performance the notebook utilizes opt https huggingface co facebook opt 1 3b language model and sentiment dataset to run over google colab https colab research google com 4 peft lora tuning sentiment notebook 4 low rank adaptation lora is a popular technique to train low rank adapter matrices on a specific task and merge parameters with the pretrained language model this notebook uses the same dataset and language model as notebook 3 5 peft prompt tuning esg notebook 5 this practice notebook utilizes the opt model for predicting language clarity categories on esg statements from prospectus 6 peft lora tuning esg notebook 6 this practice notebook utilizes the opt model for predicting language clarity categories on esg statements from prospectus 7 full finetuning lora summarization notebook 7 this is an instruction finetuning example using flan t5 base https huggingface co google flan t5 base model on a sequence to sequence summarization task https huggingface co datasets samsum which we will walk through in the workshop installation installing these libraries using pip over google colab will be sufficient for notebooks to run accelerate 0 21 0 bitsandbytes 0 40 2 transformers 4 31 0 rouge score 0 1 2 peft 0 4 0 trl 0 4 7 sentencepiece xformers evaluate py7zr | ai |
|
irita | irita inter realm industry trust alliance irita is the 1st enterprise level permissioned blockchain product in cosmos ecosystem powered by tendermint and iris sdk it has 6 core technical advantages such as privacy protecting data sharing with authorization highly efficient consensus protocol cutting edge cross chain technology highly practical on off chain inter operation capabilities flexible digital asset modeling and exchange and business analytics powered by big data thus irita can be widely used in various industries such as the financial industry providing value based on blockchain trust machines 6 core technical advantages 1 tendermint consensus the tendermint consensus engine is the first internet level byzantine consensus protocol it is also the consensus technology referred by libra tendermint powers more than 40 of the pos blockchain which provides a high performance consistent and secure byzantine consensus engine tendermint is very suitable for scaling heterogeneous systems including public blockchains and permissioned consortium chains which focus on performance 2 inter blockchain communication protocol ibc ibc is a well known cutting edge design and implementation of cross chain technology standards contributed by bianjie s international partner the tendermint team ibc supports interoperability between various heterogeneous chains among them the bianjie team contributed to the development of the ics20 cross chain transfer a very important module in ibc 3 data sharing with authorization data is encrypted and stored on chain to protect data ownership and privacy of all parties data is shared with third parties only if it is authorized by the owner 4 iservice iservice aims to deliver the entire life cycle of off chain services from definition binding registration of service provider invocation to governance profiling and dispute resolution to cross the gap between blockchain and traditional business application the iris sdk supports service semantics through enhanced ibc processing logic to allow distributed business services to be available on the blockchain internet 5 digital asset modeling and trading it supports a flexible modeling of the multiple structures of digital asset data irita is based on nft non fungible token and supports asset digitalization in various fields such as supply chain intellectual property medical treatment and evidence storage 6 big data support irita s built in storage layer supports cloud storage and distributed storage and supports efficient on chain data life cycle query through the combination of database and chain data | blockchain |
|
goldenrepo | golden repo m3 sctp cloud engineering | cloud |
|
easychain | easychain a simple python blockchain this is the result of a bit of research and some tinkering to understand the fundamental concepts of a blockchain readers are directed to ilya gritorik s fantastic minimum viable blockchain https www igvita com 2014 05 05 minimum viable block chain article for more general information about what blockchains are and how they work conceptually example this is a quick example a more detailed example including hostile tampering and detection is in example py there are also a few basic unit tests msg1 message simple message msg2 message with a sender and receiver alice bob each of the above now has a hashed payload link them by adding to a block block1 block block1 add message msg1 block1 add message msg2 or using the constructor block1 block msg1 msg2 messages are now fully hashed and linked msg2 depends on msg1 tampering can be detected block2 block block2 add message message just need a second block for an example now link the blocks together chain blockchain chain add block block1 chain add block block2 now the blocks are linked block2 depends on block1 i do not pretend to be a python expert so some of this can certainly be written better what it isn t this implementation focuses only on the hashed ledger concept it specifically does not include any concept of mining or any other form of distributed consensus it also abstracts the concept of a transaction to that of a message in general narrowing the scope to a blockchain as a cryptographic primitive instead of as a method of distributed consensus the concept of a header and payload in messages and blocks is adapted from bitcoin note that the entire point of a blockchain is to provide a mechanism for distributed consensus this is typically accomplished through a proof of work mechanism bitcoin requires the network to engage in significant computational effort by way of a cryptographic puzzle competition in order to create a block this is the mining part of bitcoin the computer that solves the puzzle creates the next block marking it with the answer to the puzzle and distributing the block to all other computers the winning computer is also awarded a prize of bitcoins currently 12 5 bitcoins it is a strict constraint of the bitcoin system that all participating nodes only accept the block that contains the answer to the puzzle and also that all nodes accept the longest chain of valid blocks as the correct blockchain anything else allows attackers to exploit the system and commit fraud destroying the entire system see satoshi nakamoto s e mail response http satoshi nakamotoinstitute org emails cryptography 6 for more details on why so understand that while this project is helpful in explaining how the blockchain is structured it does not in any way address how it operates in a network such as bitcoin in that sense this is analyzing the static semantics of the blockchain not its dynamic semantics more on semantics http cs lmu edu ray notes plspec doing that would require writing client code that would send out transactions and listen for them on the network include a means for synchronizing a puzzle to be solved work to solve the puzzle package the outstanding transactions into a block once the puzzle is solved distribute the block to the rest of the network and robustly deal with bad transactions and blocks when they arrive classes conceptually it works somewhat like this image from satoshi nakamoto s original bitcoin paper https bitcoin org bitcoin pdf incidentally if you haven t read it its 
utterly brilliant only nine pages and much more approachable than you think simplified bitcoin blockchain https i imgur com hzobtjn png a message transaction item in bitcoin etc is simply some data an optional sender and receiver and some metadata timestamp and data size which is then hashed together for integrity a message has two sections a header and a payload each section is managed and hashed separately first the payload is populated and hashed when the message is created later when the message is eventually added to a block the header is finalized and the entire message is then hashed the block will then link the messages by populating the current message s prev hash field with the last message s hash unless it is the first message seal the message and validate it calling seal computes the message s hash from a combination of its payload hash and prev hash value you can manually link and seal messages together but the block will handle this for you from this point any tampering of the message or of any previous message can be easily detected by recalculating hashes validate will verify the message integrity it won t verify the prev hash comes from the prior message that is the responsibility of the block a block consists of 1 m message objects linked together in a sequential chain as each message is added to the block that message s prev hash field is updated with the hash from the last message in the block when no more messages are added and the block is eventually added to the blockchain it is linked to the previous block sealed and validated again you can do this manually but the blockchain handles it automatically when sealing the most recent message hash is pulled and combined with the block s prev hash and the current timestamp only the most recent message hash is required since it transitively includes all prior messages in the block due to the message chaining described above once it is sealed and the block hash has been computed tampering of any message in the block is detected by recomputing the block hash and comparing expected vs actual validate will verify block integrity by calling validate on each message in sequence but as with messages it won t verify inter block links are valid the blockchain is responsible for that a blockchain consists of 1 m block objects linked together just like message objects are once the block is added it is linked to the previous block and sealed this sequence of hashed blocks forms the block chain the integrity of the entire chain can be checked at any time by calling validate which calls validate on each block in sequence | blockchain |
|
swiftGuard | logo div align center picture source media prefers color scheme dark srcset img banner banner dark png width 600vw source media prefers color scheme light srcset img banner banner light png width 600vw img alt application banner src img banner banner light png width 600vw picture div br badges div align center a href https github com lennolium swiftguard branches img src https img shields io github last commit lennolium swiftguard label last 20updated color orange alt last updated a a a href https app codacy com gh lennolium swiftguard dashboard utm source gh utm medium referral utm content utm campaign badge grade img src https app codacy com project badge grade 7e4271efc8894c9fab80e2f27f896a87 alt code quality a a a href https github com lennolium swiftguard commits main img src https img shields io github commit activity m lennolium swiftguard label commit 20activity alt commit activity a a a href https github com lennolium swiftguard releases img src https img shields io badge version 0 0 2 brightgreen alt stable version br a href https github com lennolium swiftguard issues img src https img shields io github issues raw lennolium swiftguard label open 20issues color critical alt open issues a href https github com lennolium swiftguard issues q is 3aissue is 3aclosed img src https img shields io github issues closed raw lennolium swiftguard label closed 20issues color inactive alt closed issues a href img src https img shields io github repo size lennolium swiftguard label repo 20size color yellow alt repo size a href https github com lennolium swiftguard blob main license img src https img shields io github license lennolium swiftguard label license color blueviolet alt license a a a a a a a a a a div title div align center h1 h1 div description div align center anti forensic macos tray application designed to safeguard your system by monitoring usb ports it ensures your device s security by automatically initiating either a system shutdown or hibernation if an unauthorized device connects or a connected device is unplugged it offers the flexibility to whitelist designated devices to select an action to be executed and to set a countdown timer allowing to disarm the shutdown process br br donate https img shields io badge donate paypal blue style flat square logo paypal https www paypal me smogg buymeacoffee https img shields io badge buy 20me 20a coffee f5d132 style flat square logo buymeacoffee https buymeacoffee com lennolium div div align center h3 h3 div nbsp table of contents contents features features screenshots screenshots why should you care why should you care installation installation usage usage gui gui cli cli development development roadmap roadmap security code quality security code quality contributors contributors credits credits license license nbsp features features monitoring continuously monitors usb ports for device activity even in sleep mode whitelisting allows users to whitelist authorized devices ensuring hassle free connectivity discrete operates in the macos system tray minimizing interruptions customizable allows users to configure various settings including action shutdown hibernate countdown timer and auto start lightweight designed to consume minimal system resources for optimal performance privacy only connects to the internet to check for updates at startup open source provides transparency and allows community contributions for continuous development nbsp screenshots screenshots div align center picture source srcset img screenshots 
screenshots png width 600vw img alt application screenshots src img screenshots screenshots png width 600vw picture left manipulation button to defuse the alarm right whitelist and settings menu div br nbsp why why should you care a few reasons to use this tool anti forensic measures in case the police or other thugs break in the police often use a mouse jiggler https en wikipedia org wiki mouse jiggler to prevent the screen saver or sleep mode from being activated prevent data exfiltration you do not want someone adding or copying documents to or from your computer via usb public environments if you frequently use your mac in public places like libraries or cafes swiftguard acts as an additional layer of security against physical attacks in a potentially vulnerable https www ccn com fbi illegally stole ross ulbrichts laptop brought silk road setting server protection you want to improve the security of your home or company server e g your raspberry pi nas etc data protection regulations many industries and organizations are subject to strict data protection regulations swiftguard helps maintain compliance by preventing unauthorized data transfers and access through usb ports tip you might also want to use a cord to attach a usb key to your wrist then plug the key into your computer and run swiftguard if your computer is robbed the usb is removed and the computer shuts down immediately nbsp installation installation 1 obtain the most recent version by downloading it from releases https github com lennolium swiftguard releases 2 open the downloaded swiftguard dmg file 3 drag the swiftguard application into the applications folder 4 open the swiftguard application from the applications folder by right clicking and selecting open see note below 5 swiftguard should now appear in the macos system tray 6 test at least once if the shutdown or hibernation is executed correctly on first run you will be asked to grant the necessary permissions by macos 7 automatic startup at login can be enabled in the app s settings menu nbsp important make sure you use filevault macos s built in disk encryption feature encrypt your entire disk ensuring that your data remains secure even if your device falls into the wrong hands otherwise unauthorized users may gain access to your data easily system preferences security privacy security filevault do not enable icloud recovery note if you get a warning that the application is from an unidentified developer you have to open system preferences security privacy security and click open anyway to allow the application to run nbsp usage usage gui 1 open the swiftguard application from the applications folder 2 click on the application icon in the macos system tray to open the main menu 3 click the guarding inactive entry to start or pause the guarding of your usb ports 4 the devices menu displays all allowed and connected devices allowed devices are indicated with a checkmark even if they are not connected 5 to add or remove a device from the whitelist simply click on the corresponding device entry 6 if manipulation is detected an alert manipulation will appear in the main menu clicking on it will reset the alarm the exit button will not work 7 in the settings menu you can set a delay 0 60 seconds and an action shutdown or hibernate the delay determines how long swiftguard will wait for you to reset defuse the alarm before executing the action nbsp notes swiftguard alerts you if devices are removed that were connected before or while the application was started except you 
add them to the whitelist connecting new devices will always trigger an alert if these devices are not whitelisted if you encounter any problems please check the log file in the library logs swiftguard folder your settings and whitelisted devices are stored in the library preferences swiftguard swiftguard ini file nbsp cli you can run swiftguard as a simple python script from the command line without a graphical user interface gui this is useful when operating swiftguard on a headless system or saving system resources however you will lose the ability to defuse the shutdown process via the gui but you can kill the swiftguard process from the command line instead the preferences and whitelists are stored in the same location as the gui version and can be edited manually for further information please refer to the src swiftguard cli py https github com lennolium swiftguard blob main src swiftguard cli py file 1 open a terminal and navigate to the desired install directory bash cd desktop 2 clone the repository bash git clone https github com lennolium swiftguard git 3 navigate to the swiftguard directory bash cd swiftguard 4 create a virtual environment and activate it bash python3 m venv venv source venv bin activate pip install poetry 5 install poetry in the venv bash pip install poetry 6 install swiftguard in development mode bash poetry install this installs swiftguard and its python packages in the virtual environment venv bin swiftguard and venv lib python3 11 site packages in development mode so you can change code in the src swiftguard folder and immediately test it in the terminal 7 run it in cli mode bash swiftguard gui mode swiftguardgui notes settings whitelist library preferences swiftguard swiftguard ini logs library logs swiftguard swiftguard log logs are rotated every 2 mb with a maximum of 5 files you can set the log level debug 1 critical 5 and the log output file syslog stdout required file in the swiftguard ini file nbsp development development as an open source project i strive for transparency and collaboration in my development process i greatly appreciate any contributions members of our community can provide whether you are fixing bugs proposing features improving documentation or spreading awareness your involvement strengthens the project please review the code of conduct https github com lennolium swiftguard blob main github code of conduct md to understand how we work together respectfully bug report if you are experiencing an issue while using the application please create an issue https github com lennolium swiftguard issues new choose feature request make this project better by submitting a feature request https github com lennolium swiftguard discussions 2 documentation improve our documentation by adding a wiki page https github com lennolium swiftguard wiki community support help others on github discussions https github com lennolium swiftguard discussions security report report critical security issues via our template https github com lennolium swiftguard blob main github security md nbsp roadmap roadmap now next later unit tests linux support ci cd code quality bluetooth and wifi detection apple watch website docs wiki custom system wide hotkey for defusing auto update yet just notifying encrypted configuration e mail notification native apple silicon support code sign apple countdown dialog more actions wipe ram delete files folders email user defined actions passwort protected defusing dialog translations professional security audit nbsp security 
security code quality regarding swiftguard is a security application and therefore security is of the utmost importance i am committed to ensuring that it is secure and reliable for all users i am grateful for any feedback regarding security issues and will do my best to address them as quickly as possible please refer to the security policy https github com lennolium swiftguard blob main github security md for more information additionally i let my code be checked by several code quality and security tools bandit black codacy codeql pmd cpd prospector pylint pysa pyre trivy and radon the results can be found by clicking on the badges below these routines are no replacement for a manual code and security audit but they help to find errors and vulnerabilities please note that the results of these tools are not always accurate and may contain false positives div align center a href https app codacy com gh lennolium swiftguard dashboard utm source gh utm medium referral utm content utm campaign badge grade img src https app codacy com project badge grade 7e4271efc8894c9fab80e2f27f896a87 alt codacy a a a href https github com lennolium swiftguard actions workflows black yml img src https github com lennolium swiftguard actions workflows black yml badge svg alt black a a a href https github com lennolium swiftguard actions workflows github code scanning codeql img src https github com lennolium swiftguard actions workflows github code scanning codeql badge svg alt codeql a a a href https github com lennolium swiftguard actions workflows pyre yml img src https github com lennolium swiftguard actions workflows pyre yml badge svg event status alt pyre br a href https github com lennolium swiftguard actions workflows pysa yml img src https github com lennolium swiftguard actions workflows pysa yml badge svg event status alt pysa a a a a a div nbsp contributors contributors thank you so much for giving feedback implementing features and improving the code and project a href https github com lennolium swiftguard graphs contributors img src https contrib rocks image repo lennolium swiftguard a nbsp credits credits this application is heavily inspired and based on project usbkill https github com hephaest0s usbkill by hephaestos and buskill https github com buskill buskill app by michael altfield i want to thank him and all the other great contributors of usbkill for their great work inspiration and help i firmly believe in the principles of the open source community which call for the sharing and enhancement of one another work the purpose of this project is to revive an abandoned project and to support others in learning and comprehending the fundamentals of python qt and macos and to develop their own projects many more credits are in the acknowledgments https github com lennolium swiftguard blob main acknowledgments file nbsp license license provided under the terms of the gnu gpl3 license https www gnu org licenses gpl 3 0 en html lennart haack 2023 see license https github com lennolium swiftguard blob main license file for details for the licenses of used third party libraries and software please refer to the acknowledgments https github com lennolium swiftguard blob main acknowledgments file | anti-forensics macos opsec physical-security security defensive-security tampering-detection | os |
BlockchainNetwork-CompositeJourney | disclaimer as of august 2018 ibm will not be contributing new features to hyperledger composer and will only be maintaining it through fabric 2 x releases ibm recommends using hyperledger composer solely for demos and proof of concepts ibm does not provide support for networks using hyperledger composer in production this includes the cli javascript apis rest server and web playground this pattern has been upgraded to fabric v2 0 read this in other languages readme ko md readme ja md blockchainnetwork compositejourney build your first network byfn welcome to the first in a series of building a blockchain application part 1 will show you how to create a hyperledger composer business network archive bna file for commodity trade and deploy it on a hyperledger fabric this will be the hello world of hyperledger composer samples so beginner developers should be able to manage this this pattern has been updated to support hyperledger composer v0 20 5 hyperledger fabric v1 2 hyperledger fabric is a blockchain framework implementation and one of the hyperledger projects hosted by the linux foundation intended as a foundation for developing applications or solutions with a modular architecture hyperledger fabric allows components such as consensus and membership services to be plug and playin part 2 https github com ibm blockchainsmartcontracttrading compositejourney we will explore more about creating a complex network with multiple participants and using access control rules acl to provide them network access permissions in this journey you will run hyperledger fabric locally you can use hyperledger composer https github com hyperledger composer to quickly model your current business network containing your existing assets and the transactions related to them assets are tangible or intangible goods services or property as part of your business network model you define the transactions which can interact with assets business networks also include the participants who interact with them each of which can be associated with a unique identity across multiple business networks a business network definition consists of model cto script js and acl acl files packaged and exported as an archive bna file the archive file is then deployed to a hyperledger fabric network included components hyperledger fabric v1 2 hyperledger composer for v20 5 docker v1 13 application workflow diagram application workflow images arch blockchain network1 png 1 install the network dependencies a cryptogen b configtxgen c configtxlator d peer 2 configure the network a generating the network artifacts b starting up the network prerequisites we find that blockchain can be finicky when it comes to installing node we want to share this stackoverflow response https stackoverflow com questions 49744276 error cannot find module api hyperledger composer because many times the errors you see with composer are derived in having installed either the wrong node version or took an approach that is not supported by composer docker https www docker com products v1 13 or higher docker compose https docs docker com compose overview v1 8 or higher npm https www npmjs com get npm v5 6 0 or higher nvm https github com creationix nvm blob master readme md v8 11 3 use to download and set what node version you are using node js https nodejs org en download node v8 11 3 don t install in sudo mode git client https git scm com downloads v 2 9 x or higher python https www python org downloads 2 7 x steps 1 installing 
hyperledger composer development tools 1 installing hyperledger composer development tools 2 starting hyperledger fabric 2 starting hyperledger fabric 3 generate the business network archive bna 3 generate the business network archive bna 4 deploy the business network archive using composer playground 4 deploy the business network archive using composer playground 5 deploy the business network archive on hyperledger composer running locally 5 deploy the business network archive on hyperledger composer running locally 1 installing hyperledger composer development tools note check your node version nvm version if you are not using node version 8 11 some of the composer commands won t run correctly switch node versions using nvm use 8 note you will be installing the latest version of composer cli 0 20 5 if you have an older versions installed go ahead and remove it by using the command npm uninstall composer cli npm uninstall generator hyperledger composer npm uninstall composer rest server npm uninstall yo the composer cli contains all the command line operations for developing business networks to install composer cli run the following command npm install g composer cli 0 20 5 the generator hyperledger composer is a yeoman plugin that creates bespoke e g customized applications for your business network yeoman is an open source client side development stack consisting of tools and frameworks intended to help developers build web applications to install generator hyperledger composer run the following command npm install g generator hyperledger composer 0 20 5 the composer rest server uses the hyperledger composer loopback connector to connect to a business network extract the models and then present a page containing the rest apis that have been generated for the model to install composer rest server run the following command npm install g composer rest server 0 20 5 when combining yeoman with the generator hyperledger composer component it can interpret business networks and generate applications based on them to install yeoman run the following command npm install g yo 2 0 5 2 starting hyperledger fabric note p if you have previously used an older version of strong hyperledger composer strong and are now setting up a new install you may want to kill and remove all previous docker containers which you can do with these commands p div class highlight pre code class language data lang docker kill docker ps q docker rm docker ps aq docker rmi f docker images q code pre div first download the docker files for fabric in preparation for creating a composer profile hyperledger composer uses connection profiles to connect to a runtime a connection profile is a json document that lives in the user s home directory or may come from an environment variable and is referenced by name when using the composer apis or the command line tools using connection profiles ensures that code and scripts are easily portable from one runtime instance to another the peeradmin card is a special id card used to administer the local hyperledger fabric in a development installation such as the one on your computer the peeradmin id card is created when you install the local hyperledger fabric the form for a peeradmin card for a hyperledger fabric v1 0 network is peeradmin hlfv1 in general the peeradmin is a special role reserved for functions such as deploying business networks creating issuing and revoking id cards for business network admins first clone the contents of this repo locally and cd into the project folder 
by running these commands bash git clone https github com ibm blockchainnetwork compositejourney cd blockchainnetwork compositejourney then start the fabric and create a composer profile using the following commands bash downloadfabric sh startfabric sh createpeeradmincard sh no need to do it now however as an fyi you can stop and tear down fabric using stopfabric sh teardownfabric sh 3 generate the business network archive bna this business network defines participant trader asset commodity transaction trade commodity is owned by a trader and the owner of a commodity can be modified by submitting a trade transaction the next step is to generate a business network archive bna file for your business network definition the bna file is the deployable unit a file that can be deployed to the composer runtime for execution use the following command to generate the network archive bash npm install you should see the following output bash creating business network archive looking for package json of business network definition input directory users ishan documents git demo blockchainnetwork compositejourney found description sample trade network name my network identifier my network 0 0 1 written business network definition archive file to output file dist my network bna command succeeded the composer archive create command has created a file called my network bna in the dist folder you can test the business network definition against the embedded runtime that stores the state of the blockchain in memory in a node js process this embedded runtime is very useful for unit testing as it allows you to focus on testing the business logic rather than configuring an entire fabric from your project working directory blockchainnetwork compositejourney run the command npm test you should see the following output bash my network 0 0 1 test users laurabennett 2017 newrole code blockchainnetwork compositejourney mocha recursive commodity trading tradecommodity should be able to trade a commodity 198ms 1 passing 1s 4 deploy the business network archive using composer playground open composer playground https composer playground mybluemix net by default the basic sample network is imported if you have previously used playground be sure to clear your browser local storage by running localstorage clear in your browser console now import the my network bna file and click on deploy button if you don t know how to import take a tour of composer playground https www youtube com watch time continue 29 v jqmh dq6wxc you can also setup composer playground locally https hyperledger github io composer latest installing development tools html you will see the following p align center img width 400 height 200 src images composerplayground jpg p to test your business network definition first click on the test tab click on the create new participant button p align center img width 200 height 100 src images createparticipantbtn png p create trader participants class org acme mynetwork trader tradeid tradera firstname tobias lastname funke class org acme mynetwork trader tradeid traderb firstname simon lastname stone highlight the commodity tab on the far left hand side and create a commodity asset with owner as tradera class org acme mynetwork commodity tradingsymbol commoditya description sample commodity mainexchange dollar quantity 100 owner resource org acme mynetwork trader tradera click on the submit transaction button on the lower left hand side and submit a trade transaction to change the owner of commodity commoditya 
class org acme mynetwork trade commodity resource org acme mynetwork commodity commoditya newowner resource org acme mynetwork trader traderb you can verify the new owner by clicking on the commodity registry also you can view all the transactions by selecting the all transactions registry example of transaction view p align center img width 400 height 200 src images transactionsview png p 5 deploy the business network archive on hyperledger composer running locally alternative deployment approach deploying a business network to the hyperledger fabric requires the hyperledger composer chaincode to be installed on the peer then the business network archive bna must be sent to the peer and a new participant identity and associated card must be created to be the network administrator finally the network administrator business network card must be imported for use and the network can then be pinged to check it is responding change directory to the dist folder containing my network bna file the composer network install command requires a peeradmin business network card in this case one has been created and imported in advance and the name of the business network to install the composer runtime run the following command cd dist composer network install card peeradmin hlfv1 archivefile my network bna the composer network start command requires a business network card as well as the name of the admin identity for the business network the file path of the bna and the name of the file to be created ready to import as a business network card to deploy the business network run the following command composer network start networkname my network networkversion 0 0 1 networkadmin admin networkadminenrollsecret adminpw card peeradmin hlfv1 file networkadmin card the composer card import command requires the filename specified in composer network start to create a card to import the network administrator identity as a usable business network card run the following command composer card import file networkadmin card you can verify that the network has been deployed by typing composer network ping card admin my network you should see the the output as follows the connection to the network was successfully tested my network business network version 0 0 1 version 0 20 5 participant org hyperledger composer system identity 82c679fbcb1541eafeff1bc71edad4f2c980a0e17a5333a6a611124c2addf4ba command succeeded to integrate with the deployed business network creating assets participants and submitting transactions we can either use the composer node sdk or we can generate a rest api to create the rest api we need to launch the composer rest server and tell it how to connect to our deployed business network now launch the server by changing directory to the project working directory and type bash cd composer rest server answer the questions posed at startup these allow the composer rest server to connect to hyperledger fabric and configure how the rest api is generated enter admin my network as the card name select never use namespaces when asked whether to use namespaces in the generated api select no when asked whether to secure the generated api select no when asked whether to enable authentication with passport select no when asked if you want to enable the explorer test interface select no when asked if you want to enable dynamic logging select yes when asked whether to enable event publication select no when asked whether to enable tls security if the composer rest server started successfully you should see these 
output lines discovering types from business network definition discovered types from business network definition generating schemas for all types in business network definition generated schemas for all types in business network definition adding schemas for all types to loopback added schemas for all types to loopback web server listening at http localhost 3000 browse your rest api at http localhost 3000 explorer open a web browser and navigate to http localhost 3000 explorer you should see the loopback api explorer allowing you to inspect and test the generated rest api follow the instructions to test business network definition as mentioned above in the composer section ready to move to step 2 congratulations you have completed step 1 of this composite journey move onto step 2 https github com ibm blockchainsmartcontracttrading compositejourney additional resources hyperledger fabric docs https hyperledger fabric readthedocs io en latest license this code pattern is licensed under the apache software license version 2 separate third party code objects invoked within this code pattern are licensed by their respective providers pursuant to their own separate licenses contributions are subject to the developer certificate of origin version 1 1 dco https developercertificate org and the apache software license version 2 https www apache org licenses license 2 0 txt apache software license asl faq https www apache org foundation license faq html whatdoesitmean | ibmcode blockchain hyperledger-fabric hyperledger-composer call-for-code | blockchain |
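For the trading network row above, once composer-rest-server is running on localhost:3000 a Trade transaction like the one shown at the top of the row can also be submitted over the generated REST API. The sketch below is illustrative only: the /api/Trade path assumes the default composer-rest-server mapping with namespaces set to "never" (as chosen in the prompts above), and the commodity/trader identifiers are placeholders reconstructed from the flattened example, not part of the original code pattern.

```python
# Minimal sketch: submit a Trade transaction to the REST API generated by
# composer-rest-server (assumed to listen on http://localhost:3000).
# ASSUMPTIONS: the /api/Trade path and the identifiers below are illustrative,
# reconstructed from the example in the row above, not taken from the pattern.
import json
import urllib.request

trade = {
    "$class": "org.acme.mynetwork.Trade",
    "commodity": "resource:org.acme.mynetwork.Commodity#commodityA",
    "newOwner": "resource:org.acme.mynetwork.Trader#traderB",
}

req = urllib.request.Request(
    "http://localhost:3000/api/Trade",
    data=json.dumps(trade).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print(resp.status, resp.read().decode("utf-8"))
```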
perfil-politico-frontend | perfil político front end for perfil politico https github com okfn brasil perfil politico a platform for profiling candidates in the brazilian 2022 general election based entirely on open data prerequisites node js https nodejs org en 16 npm 8 17 you can use nvm to manage multiple installations of node js on your computer check nvm installation guides for macos and linux here https github com nvm sh nvm and for windows here https docs microsoft com en us windows nodejs setup on windows once you have it you can use the right node version for the project using these commands project setup npm install compiles and hot reloads for development npm run serve it will make your website available here localhost 8080 http localhost 8080 compiles and minifies for production npm run build then serve the contents of the dist directory run your unit tests npm run test unit lints and fixes files npm run lint | vuejs | front_end |
mineral-ui | build status https travis ci com mineral ui mineral ui svg branch master https travis ci com mineral ui mineral ui dependency status https david dm org mineral ui mineral ui svg https david dm org mineral ui mineral ui commitizen friendly https img shields io badge commitizen friendly brightgreen svg style flat https github com commitizen cz cli code style prettier https img shields io badge code style prettier ff69b4 svg https github com prettier prettier mineral ui mineral ui https mineral ui com is a design system and react https reactjs org component library for the web that lets you quickly build high quality accessible apps please read important updates regarding the status of this project docs status of mineral ui md project goals consistent thoughtful design accessible and inclusive performant responsive powerful and easy to develop with inside your app getting started installation install the mineral ui package https www npmjs com package mineral ui and its dependencies bash npm install mineral ui emotion theming emotion core emotion is prop valid emotion styled react react dom or bash yarn add mineral ui emotion theming emotion core emotion is prop valid emotion styled react react dom usage jsx import react from react import render from react dom import button from mineral ui button import themeprovider from mineral ui themes function app return themeprovider button hello world button themeprovider render app document getelementbyid app your app must be wrapped in a themeprovider https mineral ui com components theme provider at its root in order for the styles to apply correctly also please see our import syntax guide https mineral ui com import syntax open sans font mineral ui was designed around open sans https fonts google com specimen open sans to get the components to look right you will need to include this font in your project yourself or our styles will fall back to system fonts to quickly include this font in your app copy this code into the head of your html document html link href https fonts googleapis com css family open sans 300 300i 400 400i 600 600i 700 700i 800 800i rel stylesheet for more options loading this font from google check out the selected family popup in the specimen https fonts google com specimen open sans selection family open sans you can also download the font file and serve it yourself if you d like but we ll leave that to you styling this project uses css in js and emotion https emotion sh for component styling refer to the styling page https mineral ui com styling for details questions if you have read through the documentation but are still facing difficulties we are happy to help please reproduce your error by forking our mineral ui starter https codesandbox io s v410y75m0 and adding the relevant code there then send us a message on the flowdock mineral styleguide flow or open an issue including the relevant link contributing we welcome all contributors who abide by our code of conduct code of conduct md please see the contributors guide contributing md and developer docs docs readme md for more details on submitting a pr setting up a local dev environment running tests etc how you can help all of the work for this project is accomplished via pull requests and issues you can submit a pr or issue to update components pr update tests pr improve documentation pr suggest a change improvement feature issue report a bug issue thank you for offering your time expertise and feedback it s greatly appreciated versioning until this project 
reaches a 1 0 milestone minor version numbers will simply be incremented during each release the changelog changelog md will continue to document the different types of updates including any breaking changes after the 1 0 milestone this project will follow semver http semver org browser support mineral ui supports the latest versions of chrome firefox safari edge and internet explorer 11 roadmap future plans and high priority features and enhancements can be found on the roadmap https mineral ui com roadmap license this project is licensed under the apache 2 0 license see the license license md file for details | mineral-ui react design-systems components accessibility | os |
electroneum-blockchain-explorer | electroneum blockchain explorer official explorer urls mainnet https blockexplorer electroneum com https blockexplorer electroneum com testnet https blockexplorer thesecurityteam rocks https blockexplorer thesecurityteam rocks development branch current development branch https github com electroneum electroneum blockchain explorer tree develop note develop branch of the explorer follows master branch of the main blockchain codebase compilation on ubuntu 16 04 18 04 electroneum download and compilation in order for the explorer to compile you will need to compile the main blockchain code first bash change to the home directory for your user home user it s important that you use this specific directory cd clone recursively for the latest release branch and set up the submodules git clone recursive b release 4 0 0 0 http github com electroneum electroneum enter the cloned repository cd electroneum sanity check optional git submodule init git submodule update compile the explorer will not compile unless you build within a single build directory use single builddir 1 make compile and run the explorer once the electroneum compiles the explorer can be downloaded and compiled as follows bash go to home folder if still in electroneum cd download the source code git clone https github com electroneum electroneum blockchain explorer git enter the downloaded sourced code folder cd electroneum blockchain explorer make a build folder and enter it mkdir build cd build create the makefile cmake compile cd make to run it etnblocks by default it will look for blockchain in its default location i e electroneum lmdb you can use b option if its in different location for example bash etnblocks b home mwo non defult electroneum location lmdb example output bash mwo arch electroneum blockchain explorer etnblocks 2016 may 28 10 04 49 160280 blockchain initialized last block 669785 d0 h0 m12 s47 time ago current difficulty 14786 2016 05 28 02 04 49 info crow 0 1 server is running local port 8081 go to your browser http 127 0 0 1 8081 the explorer s command line options etnblocks onion electroneum blockchain explorer h help arg 1 0 produce help message t testnet arg 1 0 use testnet blockchain s stagenet arg 1 0 use stagenet blockchain enable pusher arg 1 0 enable signed transaction pusher enable mixin details arg 1 0 enable mixin details for key images e g timescale mixin of mixins in tx context enable key image checker arg 1 0 enable key images file checker enable output key checker arg 1 0 enable outputs key file checker enable json api arg 1 1 enable json rest api enable tx cache arg 1 0 enable caching of transaction details show cache times arg 1 0 show times of getting data from cache vs no cache enable block cache arg 1 0 enable caching of block details enable js arg 1 0 enable checking outputs and proving txs using javascript on client side enable autorefresh option arg 1 0 enable users to have the index page on autorefresh enable emission monitor arg 1 0 enable electroneum total emission monitoring thread p port arg 8081 default explorer port testnet url arg you can specify testnet url if you run it on mainnet or stagenet link will show on front page to testnet explorer stagenet url arg you can specify stagenet url if you run it on mainnet or testnet link will show on front page to stagenet explorer mainnet url arg you can specify mainnet url if you run it on testnet or stagenet link will show on front page to mainnet explorer no blocks on index arg 10 number of last blocks 
to be shown on index page mempool info timeout arg 5000 maximum time in milliseconds to wait for mempool data for the front page mempool refresh time arg 5 time in seconds for each refresh of mempool state b bc path arg path to lmdb folder of the blockchain e g electroneum lmdb ssl crt file arg path to crt file for ssl https functionality ssl key file arg path to key file for ssl https functionality d deamon url arg http 127 0 0 1 26968 electroneum deamon url example usage defined as bash aliases bash for mainnet explorer alias etnblocksmainnet electroneum blockchain explorer build etnblocks port 8081 testnet url http 139 162 32 245 8082 enable pusher enable emission monitor for testnet explorer alias etnblockstestnet electroneum blockchain explorer build etnblocks t port 8082 mainnet url http 139 162 32 245 8081 enable pusher enable emission monitor enable electroneum emission obtaining current electroneum emission amount is not straight forward thus by default it is disabled to enable it use enable emission monitor flag e g bash etnblocks enable emission monitor this flag will enable emission monitoring thread when started the thread will initially scan the entire blockchain and calculate the cumulative emission based on each block since it is a separate thread the explorer will work as usual during this time every 10000 blocks the thread will save current emission in a file by default in electroneum lmdb emission amount txt for testnet or stagenet networks it is electroneum testnet lmdb emission amount txt or electroneum stagenet lmdb emission amount txt this file is used so that we don t need to rescan entire blockchain whenever the explorer is restarted when the explorer restarts the thread will first check if electroneum lmdb emission amount txt is present read its values and continue from there if possible subsequently only the initial use of the tread is time consuming once the thread scans the entire blockchain it updates the emission amount using new blocks as they come since the explorer writes this file there can be only one instance of it running for mainnet testnet and stagenet thus for example you cant have two explorers for mainnet running at the same time as they will be trying to write and read the same file at the same time leading to unexpected results off course having one instance for mainnet and one instance for testnet is fine as they write to different files when the emission monitor is enabled information about current emission of coinbase and fees is displayed on the front page the values given can be checked using electroneum daemon s print coinbase tx sum command for example print coinbase tx sum 0 1313449 to disable the monitor simply restart the explorer without enable emission monitor flag enable javascript for decoding proving transactions by default decoding and proving tx s outputs are done on the server side to do this on the client side private view and tx keys are not send to the server javascript based decoding can be enabled etnblocks enable js enable ssl https by default the explorer does not use ssl but it has such a functionality as an example you can generate your own ssl certificates as follows bash cd tmp example folder openssl genrsa out server key 1024 openssl req new key server key out server csr openssl x509 req days 3650 in server csr signkey server key out server crt having the crt and key files run etnblocks in the following way bash etnblocks ssl crt file tmp server crt ssl key file tmp server key note because we generated our own 
certificate modern browsers will complain about it as they cant verify the signatures against any third party so probably for any practical use need to have properly issued ssl certificates json api the explorer has json api for the api it uses conventions defined by jsend https labs omniti com labs jsend by default the api is disabled to enable it use enable json api flag e g etnblocks enable json api api transaction tx hash bash curl w n x get http 127 0 0 1 8081 api transaction 6093260dbe79fd6277694d14789dc8718f1bd54457df8bab338c2efa3bb0f03d partial results shown json data block height 1268252 coinbase false confirmations 1 current height 1268253 extra 01be23e277aed6b5f41f66b05244bf994c13108347366ec678ae16657f0fc3a22b inputs amount 0 key image 67838fd0ffd79f13e735830d3ec60412aed59e53e1f997feb6f73d088b949611 mixins block no 1238623 public key 0a5b853c55303c10e1326acfb085b9e246e088b1ccac7e37f7a810d46a28a914 block no 1246942 public key 527cf86f5abbfb006c970f7c6eb40493786d4751306f8985c6a43f98a88c0dff mixin 9 outputs amount 0 public key 525779873776e4a42f517fd79b72e7c31c3ba03e730fc32287f6414fb702c1d7 amount 0 public key e25f00fceb77af841d780b68647618812695b4ca6ebe338faba6e077f758ac30 payment id payment id8 rct type 1 timestamp 1489753456 timestamp utc 2017 03 17 12 24 16 tx fee 12517785574 tx hash 6093260dbe79fd6277694d14789dc8718f1bd54457df8bab338c2efa3bb0f03d tx size 13323 tx version 2 etn inputs 0 etn outputs 0 status success api transactions transactions in last 25 blocks bash curl w n x get http 127 0 0 1 8081 api transactions partial results shown json data blocks age 33 16 49 53 height 1268252 size 105390000000000000 timestamp 1489753456 timestamp utc 2017 03 17 12 24 16 txs coinbase true mixin 0 outputs 8491554678365 rct type 0 tx fee 0 tx hash 7c4286f64544568265bb5418df84ae69afaa3567749210e46f8340c247f4803f tx size 151000000000000 tx version 2 coinbase false mixin 5 outputs 0 rct type 2 tx fee 17882516700 tx hash 2bfbccb918ee5f050808dd040ce03943b7315b81788e9cdee59cf86b557ba48c tx size 19586000000000000 tx version 2 limit 25 page 0 status success api transactions page page no limit tx per page bash curl w n x get http 127 0 0 1 8081 api transactions page 2 limit 10 result analogical to the one above api block block number block hash bash curl w n x get http 139 162 32 245 8081 api block 1293257 partial results shown json data block height 1293257 block reward 0 current height 1293264 hash 9ef6bb8f9b8bd253fc6390e5c2cdc45c8ee99fad16447437108bf301fe6bd6e1 size 141244 timestamp 1492761974 timestamp utc 2017 04 21 08 06 14 txs coinbase true extra 018ae9560eb85d5ebd22d3beaed55c21d469eab430c5e3cac61b3fe2f5ad156770020800000001a9030800 mixin 0 payment id payment id8 rct type 0 tx fee 0 tx hash 3ff71b65bec34c9261e01a856e6a03594cf0472acf6b77db3f17ebd18eaa30bf tx size 95 tx version 2 etn inputs 0 etn outputs 8025365394426 status success api mempool return all txs in the mempool bash curl w n x get http 127 0 0 1 8081 api mempool partial results shown json data limit 100000000 page 0 total page no 0 txs coinbase false extra 022100325f677d96f94155a4840a84d8e0c905f7a4697a25744633bcb438feb1e51fb2012eda81bf552c53c2168f4130dbe0265c3a7898f3a7eee7c1fed955a778167b5d mixin 3 payment id 325f677d96f94155a4840a84d8e0c905f7a4697a25744633bcb438feb1e51fb2 payment id8 rct type 2 timestamp 1494470894 timestamp utc 2017 05 11 02 48 14 tx fee 15894840000 tx hash 9f3374f8ac67febaab153eab297937a3d0d2c706601e496bf5028146da0c9aef tx size 13291 tx version 2 etn inputs 0 etn outputs 0 txs no 7 status success limit of 
100000000 is just default value above to ensure that all mempool txs are fetched if no specific limit given api mempool limit no of top txs return number of newest mempool txs e g only 10 bash curl w n x get http 127 0 0 1 8081 api mempool limit 10 result analogical to the one above api search block number tx hash block hash bash curl w n x get http 127 0 0 1 8081 api search 1293669 partial results shown json data block height 1293669 current height 1293670 hash 5d55b8fabf85b0b4c959d66ad509eb92ddfe5c2b0e84e1760abcb090195c1913 size 118026 timestamp 1492815321 timestamp utc 2017 04 21 22 55 21 title block txs coinbase true extra 01cb7fda09033a5fa06dc601b9295ef3790397cf3c645e958e34cf7ab699d2f5230208000000027f030200 mixin 0 payment id payment id8 rct type 0 tx fee 0 tx hash 479ba432f5c88736b438dd4446a11a13046a752d469f7828151f5c5b86be4e9a tx size 95 tx version 2 etn inputs 0 etn outputs 7992697599717 status success api outputs txhash tx hash address address viewkey viewkey txprove 0 1 for txprove 0 we check which outputs belong to given address and corresponding viewkey for txprove 1 we use to prove to the recipient that we sent them founds for this we use recipient s address and our tx private key as a viewkey value i e viewkey tx private key checking outputs bash we use here official electroneum project s donation address as an example curl w n x get http 127 0 0 1 8081 api outputs txhash 17049bc5f2d9fbca1ce8dae443bbbbed2fc02f1ee003ffdd0571996905faa831 address 44affq5ksigboz4nmdwytn18obc8aems33dblws3h7otxft3xjrpdtqgv7sqssabybb98unbr2vbbet7f2wfn3rvgqbep3a viewkey f359631075708155cc3d92a32b75a7d02a5dcf27756707b47a2b31b21c389501 txprove 0 json data address 42f18fc61586554095b0799b5c4b6f00cdeb26a93b20540d366932c6001617b75db35109fbba7d5f275fef4b9c49e0cc1c84b219ec6ff652fda54f89f7f63c88 outputs amount 34980000000000 match true output idx 0 output pubkey 35d7200229e725c2bce0da3a2f20ef0720d242ecf88bfcb71eff2025c2501fdb amount 0 match false output idx 1 output pubkey 44efccab9f9b42e83c12da7988785d6c4eb3ec6e7aa2ae1234e2f0f7cb9ed6dd tx hash 17049bc5f2d9fbca1ce8dae443bbbbed2fc02f1ee003ffdd0571996905faa831 tx prove false viewkey f359631075708155cc3d92a32b75a7d02a5dcf27756707b47a2b31b21c389501 status success proving transfer we use recipient s address i e not our address from which we sent etn to recipient for the viewkey we use tx private key although the get variable is still called viewkey that we obtained by sending this txs bash this is for testnet transaction curl w n x get http 127 0 0 1 8082 api outputs txhash 94782a8c0aa8d8768afa0c040ef0544b63eb5148ca971a024ac402cad313d3b3 address 9wuf8ucputb2huk7rphbw5pfcykoskxqtgxbckbdnztcprdnfjjljtuht87zhtgsffcb21qmjxjj18pw7cbnrctckhrub7n viewkey e94b5bfc599d2f741d6f07e3ab2a83f915e96fb374dfb2cd3dbe730e34ecb40b txprove 1 json data address 71bef5945b70bc0a31dbbe6cd0bd5884fe694bbfd18fff5f68f709438554fb88a51b1291e378e2f46a0155108782c242cc1be78af229242c36d4f4d1c4f72da2 outputs amount 1000000000000 match true output idx 0 output pubkey c1bf4dd020b5f0ab70bd672d2f9e800ea7b8ab108b080825c1d6cfc0b7f7ee00 amount 0 match false output idx 1 output pubkey 8c61fae6ada2a103565dfdd307c7145b2479ddb1dab1eaadfa6c34db65d189d5 tx hash 94782a8c0aa8d8768afa0c040ef0544b63eb5148ca971a024ac402cad313d3b3 tx prove true viewkey e94b5bfc599d2f741d6f07e3ab2a83f915e96fb374dfb2cd3dbe730e34ecb40b status success result analogical to the one above api networkinfo bash curl w n x get http 127 0 0 1 8081 api networkinfo json data alt blocks count 0 block size limit 600000 cumulative difficulty 
2091549555696348 difficulty 7941560081 fee per kb 303970000 grey peerlist size 4991 hash rate 66179667 height 1310423 incoming connections count 0 outgoing connections count 5 start time 1494822692 status ok target 120 target height 0 testnet false top block hash 76f9e85d62415312758bc09e0b9b48fd2b005231ad1eee435a8081e551203f82 tx count 1219048 tx pool size 2 white peerlist size 1000 status success api outputsblocks search for our outputs in last few blocks up to 5 blocks using provided address and viewkey bash testnet address curl w n x get http 127 0 0 1 8081 api outputsblocks address 9sdynu82ih1gdhdgrqhbecfsdfasjfgxl9b9v5f1aytfurysvej7bd9pyx5sw2qlk8hggdfm8qj5dnecqghm24ce6qwegdi viewkey 807079280293998634d66e745562edaaca45c0a75c8290603578b54e9397e90a limit 5 mempool 1 example result json data address 0182d5be0f708cecf2b6f9889738bde5c930fad846d5b530e021afd1ae7e24a687ad50af3a5d38896655669079ad0163b4a369f6c852cc816dace5fc7792b72f height 960526 limit 5 mempool true outputs amount 33000000000000 block no 0 in mempool true output idx 1 output pubkey 2417b24fc99b2cbd9459278b532b37f15eab6b09bbfc44f9d17e15cd25d5b44f payment id tx hash 9233708004c51d15f44e86ac1a3b99582ed2bede4aaac6e2dd71424a9147b06f amount 2000000000000 block no 960525 in mempool false output idx 0 output pubkey 9984101f5471dda461f091962f1f970b122d4469077aed6b978a910dc3ed4576 payment id 0000000000000055 tx hash 37825d0feb2e96cd10fa9ec0b990ac2e97d2648c0f23e4f7d68d2298996acefd amount 96947454120000 block no 960525 in mempool false output idx 1 output pubkey e4bded8e2a9ec4d41682a34d0a37596ec62742b28e74b897fcc00a47fcaa8629 payment id 0000000000000000000000000000000000000000000000000000000000001234 tx hash 4fad5f2bdb6dbd7efc2ce7efa3dd20edbd2a91640ce35e54c6887f0ee5a1a679 viewkey 807079280293998634d66e745562edaaca45c0a75c8290603578b54e9397e90a status success api emission bash curl w n x get http 127 0 0 1 8081 api emission json data blk no 1313969 coinbase 14489473877253413000 fee 52601974988641130 status success emission only works when the emission monitoring thread is enabled api version bash curl w n x get http 127 0 0 1 8081 api version json data api 65536 blockchain height 1357031 git branch name update to current electroneum last git commit date 2017 07 25 last git commit hash a549f25 electroneum version full 0 10 3 1 ab594cfe status success api number is store as uint32 t in this case 65536 represents major version 1 and minor version 0 in javascript to get these numbers one can do as follows javascript var api major response data api 16 var api minor response data api 0xffff api rawblock block number block hash return raw json block data as represented in electroneum bash curl w n x get http 139 162 32 245 8081 api rawblock 669786 example result not shown api rawtransaction tx hash return raw json tx data as represented in electroneum bash curl w n x get http 139 162 32 245 8081 api rawtransaction 6093260dbe79fd6277694d14789dc8718f1bd54457df8bab338c2efa3bb0f03d example result not shown other electroneum examples other examples can be found on github https github com electroneumexamples tab repositories please know that some of the examples repositories are not finished and may not work as intended how can you help constructive criticism code and website edits are always good they can be made through github | blockchain |
|
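Since the explorer's JSON API described in the row above follows JSend conventions, the curl calls translate directly into any HTTP client. A minimal Python sketch is shown below; it assumes the explorer was started with --enable-json-api on 127.0.0.1:8081 as in the examples, the field names (tx_fee, confirmations) are reassembled from the flattened JSON with underscores assumed, and the transaction hash is just the one used in the examples.

```python
# Minimal sketch: query the block explorer's JSON API (JSend-style responses).
# ASSUMPTIONS: host/port and --enable-json-api come from the examples above;
# underscored field names are reconstructed from the flattened sample output.
import json
import urllib.request

BASE = "http://127.0.0.1:8081/api"

def get(path):
    with urllib.request.urlopen(f"{BASE}/{path}") as resp:
        return json.loads(resp.read().decode("utf-8"))

info = get("networkinfo")
if info.get("status") == "success":
    print("height:", info["data"]["height"], "difficulty:", info["data"]["difficulty"])

tx = get("transaction/6093260dbe79fd6277694d14789dc8718f1bd54457df8bab338c2efa3bb0f03d")
if tx.get("status") == "success":
    print("fee:", tx["data"]["tx_fee"], "confirmations:", tx["data"]["confirmations"])
```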
mobile | mobile mobile development | front_end |
|
Yasiru_HelloWorldLabs | yasiru helloworldlabs first repository of digital design and embedded systems | os |
|
machine-learning-101 | machine learning 101 a repository related to datasets for machine learning link to tutorial blogs https medium com machine learning 101 chapter 1 theory about naive bayes https medium com machine learning 101 chapter 1 supervised learning and naive bayes classification part 1 theory 8b9e361897d5 coding example https medium com machine learning 101 chapter 1 supervised learning and naive bayes classification part 2 coding 5966f25f1475 chapter 2 support vector machine theory https medium com machine learning 101 chapter 2 svm support vector machine theory f0812effc72 coding https medium com machine learning 101 chapter 2 svm support vector machine coding edd8f1cf8f2d chapter 3 decision tree classifier theory https medium com machine learning 101 chapter 3 decision trees theory e7398adac567 coding https medium com machine learning 101 chapter 3 decision tree classifier coding ae7df4284e99 chapter 4 k nearest neighbors classifier https medium com machine learning 101 k nearest neighbors classifier 1c1ff404d265 chapter 5 random forest classifier https medium com machine learning 101 chapter 5 random forest classifier 56dc7425c3e1 | ai |
|
noFilter | nofilter a social network application built for ios for tufts web engineering implemented so far a log in button a log out button a feed of posts posts with profile pictures text and image content debug coloration of different areas in progress api integration to pull data from the website instead of it being hard coded authentication and authorization of users visuals | os |
|
Powerpointer-For-Local-LLMs | powerpoint generator using python pptx and local large language models this is a powerpoint generator that uses python pptx and local llm s using the oobabooga text generation webui api to generate beautiful and informative presentations powerpointer doesn t use marp it directly creates the powerpoints so you can easily make changes to them or finish it within powerpoint it also makes placeholders for images you can even select between 7 designs to make the powerpoints more beautiful this is a port from my powerpointer which uses the gpt 3 5 openai api powerpointer https github com cybertimon powerpointer the goal was to have this running completely local with no costs using for example a llama based model you can support this by giving this repo a star i optimized the prompts to work with the vicuna and alpaca like models you can select the model type in the powerpointer py file or can create a new prompt format in the prompts py file how it works it asks the user for information about the powerpoint then it generates the text for the powerpoint using some hacky prompts and the text generation webui api the python pptx library converts the generated text using my powerpoint format into a powerpoint presentation how to use this to make this work clone the repository and install the following packages pip install python pptx regex collection after this start your oobabooga text generation webui instance with an instruct finetuned model and the api extension extensions api 13b models and upwards work the best but you sometimes also receive good output with 7b models if you run oobabooga on a remote machine or not on a different port ip you have to open powerpointer py and change the host or url variable when you are there also make sure that the model type for the prompt format is set correctly vicuna or alpaca finally start the powerpoint generator by running python3 powerpointer py known issues because of the limitation of small and sometimes dumb local models the generator easily hallucinates things the generator sometimes ignores the selected slide count the generator sometimes forgets to include the additional info because of my pro code it s complicated to add new templates i m searching for an easier way please report any issues and feel free to make a pull request to fix my code i wrote at night made by cybertimon timon cybertimon ch demo screenshots here are some screenshots from the local generated powerpoints alt text https raw githubusercontent com cybertimon powerpointer for local llms main examples ai sample png alt text https raw githubusercontent com cybertimon powerpointer for local llms main examples ai sample2 png alt text https raw githubusercontent com cybertimon powerpointer for local llms main examples example run png | ai |
|
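The conversion step the Powerpointer row describes — turning the model's generated text into slides with python-pptx — can be pictured with a short sketch. This is not the repository's actual parser or slide format; the "title | bullet; bullet" line format here is invented purely for illustration.

```python
# Illustration only: convert LLM-generated text into slides with python-pptx.
# ASSUMPTION: the "title | bullet; bullet" line format is made up for this
# example; the real project defines its own prompt/output format.
from pptx import Presentation

generated = (
    "Introduction | What the talk covers; Why it matters\n"
    "Key Points | First point; Second point; Third point"
)

prs = Presentation()
layout = prs.slide_layouts[1]  # built-in "title and content" layout
for line in generated.splitlines():
    title, _, bullets = line.partition("|")
    slide = prs.slides.add_slide(layout)
    slide.shapes.title.text = title.strip()
    body = slide.placeholders[1].text_frame
    for i, bullet in enumerate(b.strip() for b in bullets.split(";")):
        para = body.paragraphs[0] if i == 0 else body.add_paragraph()
        para.text = bullet
prs.save("generated_presentation.pptx")
```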
frontend-test | sitepoint frontend test you need to create a simple counter application that can do the following add a named counter to a list of counters increment any of the counters decrement any of the counters delete a counter show a sum of all the counter values it must persist data back to the server we have provided compiled directory of static static index html that will be served at localhost 3000 when the server is running static app js and static app css will be used automatically by static index html if you need other publicly available files other than index html app js app css you will have to modify the server code in index js some other notes the design layout ux is all up to you you can change anything you want server stuff included as long as the above list is completed this isn t a backend test don t make it require any databases if you decide to use a precompiler of any kind js css etc we need to be able to run it with npm run build we don t want to run any npm install g whatever commands no global dependencies tests are good a possible layout could be counter app input x bob 5 x steve 1 x pat 4 total 10 install and start the server npm install npm start npm run build optional use for any precompilers you choose api endpoints examples the following endpoints are expecting a content type application json get api v1 counters post title bob api v1 counter id asdf title bob count 0 post title steve api v1 counter id asdf title bob count 0 id qwer title steve count 0 post id asdf api v1 counter inc id asdf title bob count 1 id qwer title steve count 0 post id qwer api v1 counter dec id asdf title bob count 1 id qwer title steve count 1 delete id qwer api v1 counter id asdf title bob count 1 get api v1 counters id asdf title bob count 1 note each request returns the current state of all counters | front_end |
|
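The counter endpoints listed in the frontend-test row above are plain JSON over HTTP, so any client can drive them. A rough Python sketch against the provided server on localhost:3000 follows; the paths and payloads are taken from the row's examples, and it assumes the newest counter is appended last in the returned list, as the sample responses suggest.

```python
# Rough sketch of driving the counters API described above.
# ASSUMPTIONS: server address and endpoint paths come from the readme;
# the newest counter is assumed to appear last in the returned array.
import json
import urllib.request

BASE = "http://localhost:3000/api/v1"

def call(method, path, payload=None):
    data = json.dumps(payload).encode("utf-8") if payload is not None else None
    req = urllib.request.Request(
        f"{BASE}/{path}", data=data, method=method,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

counters = call("POST", "counter", {"title": "bob"})    # add a named counter
bob_id = counters[-1]["id"]
counters = call("POST", "counter/inc", {"id": bob_id})  # increment it
counters = call("GET", "counters")                      # read current state
print("total:", sum(c["count"] for c in counters))
```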
Awesome_Computer_Vision | computervision awesome git state of art commit 1 2 https zh d2l ai chapter prerequisite index html | ai |
|
ITEC362 | itec362 information technology website | server |
|
raspberry-pi-computer-vision-programming | raspberry pi computer vision programming second edition a href https www packtpub com in data raspberry pi computer vision programming second edition utm source github utm medium repository utm campaign 9781800207219 img src https www packtpub com media catalog product cache bf3310292d6e1b4ca15aeea773aca35e 9 7 9781800207219 original 42 jpeg alt raspberry pi computer vision programming second edition height 256px align right a this is the code repository for raspberry pi computer vision programming second edition https www packtpub com in data raspberry pi computer vision programming second edition utm source github utm medium repository utm campaign 9781800207219 published by packt design and implement computer vision applications with raspberry pi opencv and python 3 what is this book about raspberry pi is one of the popular single board computers of our generation all the major image processing and computer vision algorithms and operations can be implemented easily with opencv on raspberry pi this updated second edition is packed with cutting edge examples and new topics and covers the latest versions of key technologies such as python 3 raspberry pi and opencv this book will equip you with the skills required to successfully design and implement your own opencv raspberry pi and python based computer vision projects this book covers the following exciting features set up a raspberry pi for computer vision applications perform basic image processing with libraries such as numpy matplotlib and opencv demonstrate arithmetic logical and other operations on images work with a usb webcam and the raspberry pi camera module implement low pass filters and high pass filters and understand their applications in image processing if you feel this book is for you get your copy https www amazon com dp 1800207212 today a href https www packtpub com utm source github utm medium banner utm campaign githubbanner img src https raw githubusercontent com packtpublishing github master github png alt https www packtpub com border 5 a instructions and navigations all of the code is organized into folders the code will look like the following p2 person p2 name jan p2 age 20 print p2 name int p2 age following is what you need for this book this book is for python 3 developers computer vision professionals and raspberry pi enthusiasts who are looking to implement computer vision applications on a low cost platform basic knowledge of programming mathematics and electronics will be beneficial however even beginners in this area will be comfortable with covering the contents of the book as they are presented in a step by step manner with the following software and hardware list you can run all code files present in the book chapter 1 13 software and hardware list software required os required raspberry pi with raspbian os raspberry pi camera module and a usb webcam windows mac os x python 3 interpreter windows mac os x we also provide a pdf file that has color images of the screenshots diagrams used in this book click here to download it https static packt cdn com downloads 9781800207219 colorimages pdf code in action click on the following link to see the code in action https www youtube com playlist list plelcvrwle187ns94zljc9u9qhbz hr3po related products other books you may enjoy mastering computer vision with tensorflow 2 x packt https www packtpub com in data advanced computer vision with tensorflow 2 x utm source github utm medium repository utm campaign 9781838827069 
amazon https www amazon com dp 1838827064 pytorch computer vision cookbook packt https www packtpub com in data pytorch computer vision cookbook utm source github utm medium repository utm campaign 9781838644833 amazon https www amazon com dp 1838644830 get to know the author ashwin pajankar is a polymath he is a science popularizer a programmer a maker an author and a youtuber he graduated from iiit hyderabad with an mtech in computer science and engineering he has a keen interest in the promotion of science technology engineering and mathematics stem education other book by the author raspberry pi computer vision programming https www packtpub com in hardware and creative raspberry pi computer vision programming utm source github utm medium repository utm campaign 9781784398286 suggestions and feedback click here https docs google com forms d e 1faipqlsdy7datc6qmel81fiuuymz0wy9vh1jhkvpy57oimekgqib ow viewform if you have any feedback or suggestions | ai |
|
HX711_zephyr_driver | out of tree hx711 sensor driver module for zephyr rtos hx711 is a precision 24 bit analog to digital converter adc designed for weigh scale applications supported zephyr versions 3 2 0 september 2022 2 7 0 lts2 release october 2021 usage module installation add this project to your west yml manifest yaml name hx711 path modules hx711 revision refs tags zephyr v3 2 0 url https github com nobodyguy hx711 zephyr driver so your projects should look something like this yaml manifest projects name zephyr url https github com zephyrproject rtos zephyr revision refs tags zephyr v3 2 0 import true name hx711 path modules hx711 revision refs tags zephyr v3 2 0 url https github com nobodyguy hx711 zephyr driver this will import the driver and allow you to use it in your code additionally make sure that you run west update when you ve added this entry to your west yml driver configuration enable sensor driver subsystem and hx711 driver by adding these entries to your prj conf ini config sensor y config hx711 y define hx711 in your board overlay like this example dts hx711 compatible avia hx711 status okay label hx711 dout gpios gpio0 26 gpio active high gpio pull up sck gpios gpio0 27 gpio active high rate gpios gpio0 2 gpio active high driver usage c include zephyr kernel h include zephyr device h include zephyr devicetree h include zephyr drivers sensor h include zephyr logging log h include zephyr sys util h include zephyr types h include sensor hx711 hx711 h include stddef h log module register main log level inf const struct device hx711 dev void set rate enum hx711 rate rate static struct sensor value rate val rate val val1 rate sensor attr set hx711 dev hx711 sensor chan weight sensor attr sampling frequency rate val void measure void static struct sensor value weight int ret ret sensor sample fetch hx711 dev if ret 0 log err cannot take measurement d ret else sensor channel get hx711 dev hx711 sensor chan weight weight log inf weight d 06d grams weight val1 weight val2 void main void int calibration weight 100 grams hx711 dev device dt get any avia hx711 assert hx711 dev null failed to get device binding log inf device is p name is s hx711 dev hx711 dev name log inf calculating offset avia hx711 tare hx711 dev 5 log inf waiting for known weight of d grams calibration weight for int i 5 i 0 i log inf d i k msleep 1000 log inf calculating slope avia hx711 calibrate hx711 dev calibration weight 5 while true k msleep 1000 log inf test measure log inf setting sampling rate to 10hz set rate hx711 rate 10hz measure k msleep 1000 log inf setting sampling rate to 80hz set rate hx711 rate 80hz measure relevant prj conf ini config sensor y config hx711 y config log y | os |
|
alphawave-py | alphawave minor bug fixes os client now correctly handles host port temperature top p max tokens new searchcommand will search the web you will need a google api key see tests searchcommandagenttest py alphawave is a very opinionated client for interfacing with large language models llm it uses promptrix https github com stevenic promptrix for prompt management and has the following features supports calling openai and azure openai hosted models out of the box but a simple plugin model lets you extend alphawave to support any llm supports os llms through an osclient currently assumes a server on port 5004 see details below promptrix integration means that all prompts are universal and work with either chat completion or text completion api s automatic history management alphawave manages a prompts conversation history and all you have todo is tell it where to store it it uses an in memory store by default but a simple plugin interface provided by promptrix lets you store short term memory like conversation history anywhere state of the art response repair logic alphawave lets you provide an optional response validator plugin which it will use to validate every response returned from an llm should a response fail validation alphawave will automatically try to get the model to correct its mistake more below automatic response repair a key goal of alphawave is to be the most reliable mechanisms for talking to an llm on the planet if you lookup the wikipedia definition for alpha waves you see that it s believed that they may be used to help predict mistakes in the human brain one of the key roles of the alphawave library is to help automatically correct for mistakes made by an llm leading to more reliable output it can correct for everything from hallucinations to just malformed output it does this by using a series of techniques first it uses validation to programmatically verify the llm s output this would be the equivalent of a guard in other libraries like langchain when a validation fails alphawave immediately forks the conversation to isolate the mistake this is critical because the last thing you want to do is promote a mistake hallucination to the conversation history as the llm will just double down on the mistake they are primarily pattern matchers once alphawave has isolated the mistake it will attempt to get the model to repair the mistake itself it uses a process called feedback which simply tells the model the mistake it made and asks it to correct it for gpt 4 this works more often then not in 1 turn for the other models it sometimes works but it depends on the type of mistake alphawave will even ask the model to slow down and think step by step on the last try to give it every shot at fixing itself if the llm can correct its mistake alphawave will delete the conversation fork write the corrected response to the conversation history and move forward as if nothing ever happened for gpt 4 you should be able to make several hundred sequential model calls before running into a sequence that can t be repaired in the event that the model isn t able to repair itself a result with a status of invalid response will be returned and the app can either abort the task or give it one more go for well defined prompts and tasks i d recommend given it one more go the reason for that is that if you ve made it hundreds of model calls without it making a mistake the odds of it making a mistake if you simply try again are low you just hit the stochastic nature of talking to llms so 
why even use feedback at all if retrying can work it doesn t always work some mistakes especially hallucinations the llm will make over and over again they need to be confronted with their mistake and then they will happily correct it you need both appproaches feedback retry to build a system that s as reliable as possible installation to get started you ll want to install the latest versions of both alphawave and promptrix pip should pull both if you just install alphawave bash pip install alphawave basic usage you ll need to import a couple of components from alphawave along with the various prompt parts you want to use from promptrix here s a super simple wave that creates a basic chatgpt like bot python import os from pathlib import path import readline from alphawave alphawave import alphawave from alphawave alphawavetypes import promptcompletionoptions from alphawave openaiclient import openaiclient import promptrix from promptrix prompt import prompt from promptrix systemmessage import systemmessage from promptrix conversationhistory import conversationhistory from promptrix usermessage import usermessage import asyncio create an openai or azureopenai client client openaiclient apikey os getenv openai api key create a wave wave alphawave client client prompt prompt systemmessage you are an ai assistant that is friendly kind and helpful 50 conversationhistory history 1 0 usermessage input 450 prompt options promptcompletionoptions completion type chat model gpt 3 5 turbo temperature 0 9 max input tokens 2000 max tokens 1000 define main chat loop async def chat bot message none show the bots message if bot message print f 033 32m bot message 033 0m prompt the user for input user input input user check if the user wants to exit the chat if user input lower exit exit the process exit else route users message to wave result await wave completeprompt user input if result status success print result await chat result message content else if result message print f result status result message else print f a result status of result status was returned exit the process exit start chat session asyncio run chat hello how can i help you one of the key features of promptrix is its ability to proportionally layout prompts so this prompt has an overall budget of 2000 input tokens it will give the systemmessage up to 50 tokens the usermessage up to 450 tokens and then the conversationhistory gets 100 of the remaining tokens once the prompt is formed we just need to call completeprompt on the wave to process the users input the parameter to wave completeprompt is optional and the wave can also take input directly from memory but you don t have to pass prompts input you can see in the example that if the prompt doesn t reference the input via a input template variable it won t use it anyway logging if you want to see the traffic with the server the client constructors osclient and openaiclient take a logrequests parameter false by default set it to true to see prompts and responses on the console osclient the default way to use alphawave py with openai is to use the openai client as in line 49 of the example above if you want to use your own llm you can use instead python client osclient apikey none the current osclient assumes a server exists on localhost port 5004 using my own unique protocol not very useful i know short term plans include 1 allow specification of the host and port in the client constructor 2 allow fastchat like specification of the conversation template user assistant etc support 
for this is already in the osclient just need to bring it out to the constructor 3 implementation of a fastchat compatible api again this was running in a dev version of the code just need to re insert it now that basic port is stable osclient protocol osclient sends json to the server python server message prompt prompt temp temp top p top p max tokens max tokens smj json dumps server message client socket sendall smj encode utf 8 client socket sendall b x00xff where prompt is a string containing the messages python role system content you are an ai assistant that is friendly kind and helpful role user content hi how are you today and the x00xff is the end of send message because i know nothing about sockets osclient expects to receive from the server 1 streaming or all at once the text followed by x00xff 2 that s it no return code no json wrapper no role assistant content str just the response 4 the x00ff signals end of messages | ai |
|
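The OSClient wire format described in the alphawave row above — a JSON request terminated by a byte marker, answered by raw generated text terminated the same way — can be sketched as a tiny socket server. The exact terminator bytes are an assumption (the readme writes it as x00xff, read here as the two bytes \x00\xff), and the stand-in generate function obviously replaces a real LLM backend.

```python
# Sketch of a server speaking the OSClient protocol described above:
# receive JSON {prompt, temp, top_p, max_tokens} terminated by a marker,
# reply with plain generated text followed by the same marker.
# ASSUMPTION: the marker is the two bytes b"\x00\xff" (readme writes "x00xff").
import json
import socket

END = b"\x00\xff"

def generate(request):
    # Stand-in for a real model call; a real server would run inference here.
    print("prompt:", request.get("prompt"), "temp:", request.get("temp"))
    return "placeholder completion"

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
    srv.bind(("127.0.0.1", 5004))  # port 5004, as the readme assumes
    srv.listen(1)
    conn, _ = srv.accept()
    with conn:
        buf = b""
        while END not in buf:
            chunk = conn.recv(4096)
            if not chunk:
                break
            buf += chunk
        if END in buf:
            request = json.loads(buf.split(END)[0].decode("utf-8"))
            conn.sendall(generate(request).encode("utf-8") + END)
```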
obsidian-flashcards-llm | obsidian flashcards llm this plugin allows you to automatically generate flashcards using large language models such as openai s gpt from within obsidian features openai s text davinci 003 and gpt 3 5 turbo integrations through official apis you can add your api key under the plugin settings and choose which language model to use generate flashcards from any open note by running the generate flashcards command from the command palette the generated flashcards will be appended to the current note under a new markdown header you can select which part of the document you want to generate the flashcards from or you can use the entire note customize the separator to use for generating inline flashcards note the plugin works with obsidian spaced repetition https github com st3v3nmw obsidian spaced repetition in mind as of today it can only generate inline flashcards in the format that you choose future updates will make more customizations available including multiline flashcards reversed flashcards and more demo img src https github com crybot obsidian flashcards llm blob master docs flashcards gif | ai |
|
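The core step of the obsidian-flashcards-llm row above — asking a model to turn a note's text into inline flashcards with a chosen separator — looks roughly like the sketch below. This is not the plugin's TypeScript code, just an illustration of the idea over the public chat completions API; the prompt wording and the :: separator are assumptions.

```python
# Illustration only (the plugin itself is TypeScript): ask an OpenAI chat model
# to emit inline flashcards "question :: answer", one per line, for a note.
# ASSUMPTIONS: prompt text and separator are invented; endpoint and payload
# follow the public chat completions API, not the plugin's own code.
import json
import os
import urllib.request

note_text = "The mitochondria is the powerhouse of the cell."
separator = "::"

payload = {
    "model": "gpt-3.5-turbo",
    "messages": [
        {"role": "user",
         "content": f"Create flashcards from this note, one per line, "
                    f"formatted as 'question {separator} answer':\n{note_text}"},
    ],
}
req = urllib.request.Request(
    "https://api.openai.com/v1/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json",
             "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
)
with urllib.request.urlopen(req) as resp:
    flashcards = json.loads(resp.read())["choices"][0]["message"]["content"]
print(flashcards)  # the plugin appends the result under a new markdown header
```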
Web-Developer | br web developer wangfeng br qq 769407183 br date october 31 modify br used to store the front end development file br h2 1 h2 front end standards 1 1 h2 2 h2 h3 2 1 html h3 h4 2 1 1 h4 1 html js bootstrap css 2 aboutus html product html contactus html page html h4 2 1 2 html h4 lt gt head h4 2 1 3 html h4 1 br html5 br lt doctype html gt br lt meta charset utf 8 gt br 2 4 2 2 lt head gt lt head gt javascript 3 javascript lt link rel stylesheet type text css href gt lt style gt lt style gt lt script src gt lt script gt style css script js js jquery 1 9 1 min js jquery cookie js 4 xhtml lt br gt lt hr gt html5 5 html div span em strong optgroup label html data data 6 html h h1 p ul ol 7 lt div class box gt lt div class welcome gt xxx lt div class name gt lt div gt lt div gt lt div gt lt div class box gt lt p gt xxx lt span gt lt span gt lt p gt lt div gt 8 href http chi chi chi chi chi chi url 9 style style 10 checkbox radio label input textarea lt p gt lt input type text id name name name gt lt p gt lt p gt lt label gt lt input type text id name gt lt label gt lt p gt 11 alt title 12 13 html 14 15 class id css h3 2 2 css h3 1 utf 8 2 css reset css common css mainsource css css mainsource css svn 3 class id a id css b or i comment fontred w200 h200 css js 1 2 3 4 c important d z index e header content container footer copyright nav mainnav globalnav topnav subnav menu submenu logo banner title sidebar status vote partner friendlink wrap label leftsidebar rightsidebar menu1content menucontainer icon note search btn loginbar login link manage tab list msg tips joinus guild service hot news download regsiter shop set add del op pw inc btn tx p icon color bg bor suc lost tran gb bor pop title tit menu cont hint nav info pvw index page detailsinfo index head page head index p page p f id class g reset page fl fl db db display block h lt div id nav gt lt div gt div lt div id nav gt lt div class subnav gt lt div gt lt div gt nav subnav lt div id nav gt lt div class nav subnav gt lt div gt lt div gt nav subnav 4 css display list style position top right bottom left float clear visibility overflow width height margin padding border background 5 6 html 7 unicode 8 table table width height cellspacing cellpadding table table thead tr th td tbody tfoot colgroup scope cellspaing cellpadding css table border 0 margin 0 border collapse collapse table th table td padding 0 reset css 9 lt meta http equiv x ua compatible content ie 7 gt ie8 10 css3 png png 24 11 css 12 h4 2 2 2 css h4 a name overview require author xxx gmail com b comment c display none h3 2 3 h3 a images image template b gif png jpg c 1 gif kb 2 png png 24 3 jpg kb 80 60 40 d png css e css sprite http sprite psd images f icon css banner banner aboutus jpg banner 01 jpg banner 02 jpg on off menu on off png button on off png h3 2 3 javascript h3 h4 2 3 1 h4 js h4 2 3 2 h4 ide jslint jshint jshint ul li jshint options li li es3 true es3 li li esnext false es6 li li camelcase true li li eqeqeq false li li expr true console console log 1 li li freeze false li li immed true li li latedef true li li maxparams 4 4 li li noarg false arguments caller arguments callee li li noempty false li li nonew false new li li plusplus false li li quotmark single li li smarttabs false tab li li maxerr false li li strict false es5 li li sub true li li unused false li li multistr false li li undef true myvar global myvar li li forin false jshint for in hasownproperty li li devel true alert console log li li jquery true li li browser true li li 
predef define seajs console require module li li wsh true li li evil false eval li li asi true li li newcap true jshint li li curly true if while li li maxlen 100 li ul 2space ide h4 2 3 3 h4 yuidoc yuidoc h4 2 3 4 h4 1 100 2 i i var name web var obj foo foo bar bar function test todo 3 if if if condition do else if condition todo else todo 4 var var foo foo var bar bar h4 2 3 4 js h4 1 jquery var list list jquery var list document getelementbyid list 2 classdefine person function person name this name name 3 getfunctionname setfunctionname isfunctionname hasfunction 4 var static variable 100 var pi math pi h4 2 3 5 h4 1 var 2 var foo function a console log a function console log hello 3 new new var list new array 1 2 3 var obj new object var list 1 2 3 var obj 4 html var listhtml lt ul class list gt lt li class item gt first item lt li gt lt li class item gt second item lt li gt lt ul gt 5 js 6 eval with 7 for in hasownproperty 8 ie var f function cc on if jscript return 2 3 javascript h2 3 h2 3 1 for mac os for more app detial check iusethis 3 2 mac sublime text 2 textmate 2 vim intellij idea iterm2 chrome firefox firebug mozilla fennec virtualbox win xp ie 8 github for mac versions sourcetree ftp filezilla forklift http post get charles php xampp for mac mamp sql sequel pro mark man xscope frank deloupe sip snip xscope markdown mou livereload codekit css sprite spriteright sass css ruby compass koala colorschemer studio cookie cookie sparkbox pixa h2 4 h2 h3 4 1 hack h3 ie hack table id tiptable tbody tr th th th ie6 th th ie7 th th ie8 th th ff th th opera th th safari th tr tr td gt lt td td td td td td x td td x td td x td td x td tr tr td td td td td x td td x td td x td td x td td x td tr tr td 9 td td td td td td td td x td td x td td x td tr tr td 0 td td x td td x td td td td x td td td td x td tr tr td important td td x td td td td td td td td td td td tr tr td media screen and webkit min device pixel ratio 0 span class red selector span td td x td td x td td x td td x td td x td td td tr tr td span class red selector span x moz any link x default td td x td td td td x td td ff3 5 td td x td td x td tr tr td moz document url prefix span class red selector span td td x td td x td td x td td td td x td td x td tr tr td media all and min width 0px span class red selector span td td x td td x td td x td td td td td td td tr tr td html span class red selector span td td x td td td td x td td x td td x td td x td tr tr td td td trident td td trident td td trident td td gecko td td presto td td webkit td tr tbody table ff ie7 ie6 css hack ff ie7 ie6 background color red 0 ie8 ie9 background color blue 9 0 ie9 h3 hack h3 1 firefox moz document url prefix selector property value firefox firefox ep selector id selector property value moz document url prefix selector property value gecko firefox selector property value 2 webkit chrome and safari media screen and webkit min device pixel ratio 0 selector property value webkit chrome safari ep media screen and webkit min device pixel ratio 0 demo color 666666 3 opera kestrel presto html first child body selector property value media all and min width 0 selector property value media all and webkit min device pixel ratio 10000 not all and webkit min device pixel ratio 0 head body selector property value opera hack ep media all and webkit min device pixel ratio 10000 not all and webkit min device pixel ratio 0 head body demo background green 4 ie9 root selector property value 9 ie9 ep root demo color fff03f 9 5 ie9 ie9 selector property value 9 
ie9 ie9 9 9 7 ep demo background lime 9 6 media screen and max device width 480px iphone or mobile s webkit property value hack hack trick selector child property value for ie 6 selector child property value except ie 6 h3 4 2 reset css h3 html body padding 0 margin 0 font 12px normal sunsin color 666 background ffffff html ms text size adjust 100 webkit text size adjust 100 overflow y scroll webkit overflow scrolling touch h1 h2 h3 h4 h5 h6 form div p i img ul li ol table tr td th fieldset label legend button input margin 0 padding 0 ul li list style none img border none vertical align middle table border collapse collapse hr clear both border width 0 border top 1px solid ccc border bottom 1px solid fff height 2px overflow hidden cl zoom 1 cl after display block clear both height 0 visibility hidden content cl zoom 1 cl after content 20 display block clear both br cl zoom 1 br cl before cl after content display table br cl after clear both br sass link a text decoration none outline none color 666 cursor pointer a link color 333 a visited color 333 a hover color bc2931 a active color bc2931 font f12 font size 12px f14 font size 14px fb font weight 800 fi font style italic dn display none db display block fl float left fr float right dele text decoration line through ful text decoration underline css h2 class h2 h2 br http ganquan info yui br css br http code ciaoca com style css cheat sheet br br http www baidufe com fehelper endecode html br json br http www baidufe com fehelper jsonformat html br css br http jigsaw w3 org css validator br css br http ecd tencent com css3 guide html br css br http css doyoe com br png br https tinypng com br br http jsbin com br css html js | front_end |
|
RepresentationLearning | representation learning basic information representation learning a review and new perspectives arxiv https arxiv org pdf 1206 5538 transformer representation learning rssformer foreground saliency enhancement for remote sensing land cover segmentation tip2023 https ieeexplore ieee org abstract document 10026298 self correspondence distillation for end to end weakly supervised semantic segmentation aaai 2023 https arxiv org pdf 2302 13765 pdf dc net dual context network for 2d medical image segmentation miccai 2021 https link springer com chapter 10 1007 978 3 030 87193 2 48 mtldesc looking wider to describe better aaai 2022 https ojs aaai org index php aaai article view 20138 da net dual branch transformer and adaptive strip upsampling for retinal vessels segmentation miccai 2022 https link springer com chapter 10 1007 978 3 031 16434 7 51 | ai |
|
db-rev-engine | db rev engine database reverse engineering | server |
|
carbon-design-kit | carbon design kit the carbon design kit is a living breathing document that contains all of our visual assets components iconography color palettes grids templates responsive behavior etc this document evolves and changes as we collaborate with partners and service teams all of the assets that live in the design kit can also be found on our a href http carbondesignsystem com carbon design system a website change log contains release notes on current and previous versions of the carbon design kit 8 x x https github ibm com bluemix design kit releases 1 0 4 and older https github ibm com bluemix design kit releases tag 1 0 4 mag accessibility all of the designs within the carbon design kit meet the a href https www w3 org tr wcag20 wcag 2 0 aa accessibility guidelines a and are compliant with a href https www section508 gov section 508 a standards if you choose to customize these designs please make sure they continue to meet these requirements getting started designers the carbon design kit now uses the new typeface ibm plex it is open sourced and free to download from the ibm github repo https github com ibm plex download the latest version of the carbon design kit by clicking on the latest kit file in this repo carbon design kit x x x sketch then on the next page click the download button or view raw note downloading from the main repo page will download the entire repo including previous version the design kit requires having the latest version of a href https www sketchapp com sketch a installed open the design kit file in sketch core visual styles components and templates are broken out into respective pages use the components and core visual styles to build out your design updates to the design kit will be posted in our change log so check back frequently for changes developers our carbon components a href https carbondesignsystem com developing react tutorial overview getting started a guide has all the instructions you need to get up and running our code base for carbon data visualizations please head over to a href https www carbondesignsystem com data visualization getting started this a guide for instructions contributing design follow the steps in our contributing guidelines https github com carbon design system carbon design kit blob master contributing md to contribute to the carbon design system code for instructions on contributing to our component library please read our a href https github com carbon design system carbon blob master github contributing md contributing docs a for carbon components workflow sketch libraries sketch libraries is a new sketch feature that allows designers to share symbols across documents and easily update to the latest symbol changes read the sketch libraries overview wiki https github com carbon design system carbon design kit wiki sketch libraries overview to learn more about how to use and implement sketch libraries in your workflow plugins the carbon design kit now comes with the sketch palettes plugin which allows you to import carbon specific palettes into your sketch files there are four carbon palettes a carbon default palette and three data vis palettes included with the carbon design kit download we also have a wiki https github com carbon design system carbon design kit wiki suggested sketch plugins of suggested sketch plugins to help make your design workflow more efficient copyright 2016 2017 2018 ibm | sketch design carbon-design-system | os |
SQL-Employee-DataBase | your first major task is a research project on employees of a client company from the 1980s and 1990s all that remain of the database of employees from that period are six csv files in this assignment you will design the tables to hold data in the csvs import the csvs into a sql database and answer questions about the data you will perform data modeling inspect the csvs and sketch out an erd of the tables resource for tools that can be used can be found here http www quickdatabasediagrams com data engineering use the information obtained to create a table schema for each of the six csv files specify data types primary keys foreign keys and other constraints import each csv file into the corresponding sql table data analysis once complete database is in place 1 list the following details of each employee employee number last name first name gender and salary 2 list employees who were hired in 1986 3 list the manager of each department with the following information department number department name the manager s employee number last name first name and start and end employment dates 4 list the department of each employee with the following information employee number last name first name and department name 5 list all employees whose first name is hercules and last names begin with b 6 list all employees in the sales department including their employee number last name first name and department name 7 list all employees in the sales and development departments including their employee number last name first name and department name 8 in descending order list the frequency count of employee last names i e how many employees share each last name | database sql | server |
CHP016-Unity-step-by-step- | chp016 unity step by step unity in embedded system design and robotics a step by step guide | os |
|
SQL-Data-Engineering-Analysis | sql data engineering analysis this repo creates a sql database from six csv files data engineering then answers questions about the data data analysis using sql python sqlalchemy pandas numpy and matplotlib pgadmin 4 and postgresql sql png images sql png getting started these instructions will get you a copy of the project up and running on your local machine for developement and testing purposes see deployment deployment goto deployment for notes on how to deploy the project on a live system prerequisites what things you need to install the software and how to install them github account python libraries pandas numpy matplotlib and sqlalchemy git pgadmin 4 additional information here https www pgadmin org installing a step by step series of examples that tell you how to get a development environment running we are using python and jupyter notebook so those two should be installed i recommend using anaconda https docs anaconda com anaconda user guide index html and conda https docs conda io projects conda en latest user guide both are free to create a virtual environment if you currently don t have either one installed follow the instructions below to set these up on a pc for additional information on python virtual environments click here https docs python org 3 tutorial venv html python download and install anaconda https www anaconda com products individual click download and select the latest python version open the installer and follow the instructions to complete the installation verify conda https docs conda io projects conda en latest user guide is installed by entering the following command into your terminal window conda version update conda by running the following command into your terminal window conda update conda from your terminal run the following code to install pip https pip pypa io en stable conda install pip pip is the package installer for python this will come in handy for the python libraries we will be installing python is infamous for its libraries https en wikipedia org wiki category python programming language scientific libraries git download a free version control system called git https git scm com git is a great way to interact with github https github com you are currently viewing this repository on github so i m assuming you already have an account if not sign up now go through the installation until you get to the choosing the default editor used by git select use visual studio code as git s default editor if you don t already have a code editor installed download visual studio code here https code visualstudio com set up your user name in git by following the instructions here https docs github com en get started getting started with git setting your username in git set up your user email address in git by following the instructions here https docs github com en github setting up and managing your github user account managing email preferences setting your commit email address set up your ssh keys in git by following the steps in this video https www youtube com watch v nf2ggt3mwgk python libraries click the library name for instructions on how to download pandas https pandas pydata org docs getting started install html numpy https numpy org install matplotlib https matplotlib org stable users installing html sqlalchemy https docs sqlalchemy org en 14 intro html installation clone the repo 1 in github in this repository repo fork a copy of this repo by selecting fork for additional information click here https docs github 
com en desktop contributing and collaborating using github desktop adding and cloning repositories cloning and forking repositories from github desktop you should now have a fork of this repo in your your repositories on your github account 2 in github under your repositories select the name of this repo find the code dropdown select ssh and copy the entire line of code it should look similar to git github com yourusername reponame git 3 open a git bash terminal window where you would like to clone this repo for now you can open the terminal window on your desktop if you don t have another location in mind 4 type the following code into your git bash terminal git clone 5 paste the ssh key copied from step 2 after git clone then press enter you should now have a clone of this repo on your local machine open visual studio code 1 open your local copy of the repo in visual studio code 2 create a new file in sql data engineering analysis folder and config py 3 in the config py file enter the following lines of code user postgres password your pgadmin 4 password here port 5432 host localhost database your database name here 4 save these changes in visual studio code 5 open a git bash terminal from your repo file and enter the following code git add press enter git commit m your meaningful note about what you did goes here press enter git push origin main press enter you should now see the config py file on your local repo copy on your computer but not on github pgadmin 4 for windows operating system the following installation instructions were created by dominic labella for the data visualization and analytics boot camp after downloading the latest version of postgresql 11 here https www enterprisedb com downloads postgres postgresql downloads and reading the installation instructions here https www enterprisedb com docs supported open source postgresql installer double click on the postgresql 11 x windows x64 exe file note exact file version may be slightly different postgresql 11 1 1 windows x64 png images postgresql 11 1 1 windows x64 png go through the setup wizard and install postgresql keep the default location as library postgresql 11 select the components to be installed un check the option to install stack builder stack builder png images stack builder pc png next add your data directory keep the default location as library postgresql 11 data next enter postgres as the password be sure to record this password for future use keep the default port as 5432 and in the advanced options set locale as as default locale the final screen will be the pre installation summary when you are done the postgres 11 folder can be accessed from the start menu of your computer this folder contains the pgadmin 4 application which will be used throughout this unit to confirm the installation start pgadmin this will open in a new browser window and connect to the default server by clicking on it and entering the password if prompted deployment data modeling inspect the csvs and sketch out an erd of the tables feel free to use a tool like http www quickdatabasediagrams com http www quickdatabasediagrams com erdimage png images erdimage png data engineering use the information you have to create a table schema tables sql for each of the six csv files remember to specify data types primary keys foreign keys and other constraints for the primary keys check to see if the column is unique otherwise create a composite key https en wikipedia org wiki compound key which takes to primary keys in order to uniquely 
identify a row be sure to create tables in the correct order to handle foreign keys import each csv file into the corresponding sql table note be sure to import the data in the same order that the tables were created and account for the headers when importing to avoid errors data analysis once you have a complete database do the following see the project visuals below 1 list employee details goto employee details the following details of each employee employee number last name first name sex and salary 2 list unique year hires goto unique year hires first name last name and hire date for employees who were hired in 1986 3 list manager goto manager the manager of each department with the following information department number department name the manager s employee number last name first name 4 list employee dept goto employee dept the department of each employee with the following information employee number last name first name and department name 5 list hercules goto hercules first name last name and sex for employees whose first name is hercules and last names begin with b 6 list employee sales dept goto employee sales dept all employees in the sales department including their employee number last name first name and department name 7 list sales department goto sales department all employees in the sales and development departments including their employee number last name first name and department name 8 in decending order frequency count goto frequency count list the frequency count of employee last names i e how many employees share each last name open jupyter notebook 1 open a git bash terminal from your repo file 2 in your git bash terminal type the following code source activate pythondata pythondata should now be displayed in your git bash terminal 3 in your git bash terminal type the following code jupyter notebook you should now see a jupyter notebook sqlalchemy ipynb tab open in your web browser acknowledgments i d like to thank dominic labella https www linkedin com in dominiclabella my instructor at the data visualization and analytics boot camp i attended for planting a seed in my brain that was the start of my computer programming knowledge and understanding i work hard every day to grow that seed none of this would be possible for me if not for his passion for teaching and deep understanding of these subjects project visuals employee details screenshot1 images screenshot1 png unique year hires screenshot2 images screenshot2 png manager screenshot3 images screenshot3 png employee dept screenshot4 images screenshot4 png hercules screenshot5 images screenshot5 png employee sales dept screenshot6 images screenshot6 png sales department screenshot7 images screenshot7 png frequency count screenshot8 images screenshot8 png jupyter notebook image screenshot9 images screenshot9 png | server |
|
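A minimal sketch of the SQLAlchemy connection step described in the SQL-Data-Engineering-Analysis notes above, assuming the config.py variables listed there (user, password, host, port, database). The table name queried here is a placeholder for illustration, not necessarily one of the project's real tables.

```python
# Sketch: build a SQLAlchemy engine from the config.py created during setup,
# then pull a table into pandas for the analysis/plotting steps.
import pandas as pd
from sqlalchemy import create_engine

# These names match the variables described above in config.py (an assumption
# that config.py sits next to this script/notebook).
from config import user, password, host, port, database

engine = create_engine(f"postgresql://{user}:{password}@{host}:{port}/{database}")

# "salaries" is a placeholder table name used only to show the pattern.
salaries = pd.read_sql("SELECT * FROM salaries;", engine)
print(salaries.describe())
```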
Deep-Learning-Computer-Vision | 1 pytorch knn the goals of this assignment are as follows develop proficiency with pytorch tensors gain experience using notebooks on google colab understand the basic image classification pipeline and the data driven approach train predict stages understand the train val test splits and the use of validation data for hyperparameter tuning implement and apply a k nearest neighbor knn classifier notebooks pytorch101 ipynb pytorch101 ipynb knn ipynb knn ipynb 2 svm neural network the goals of this assignment are as follows implement and apply a multiclass support vector machine svm classifier implement and apply a two layer neural network classifier understand the differences and tradeoffs between these classifiers practice implementing vectorized gradient code by checking against naive implementations and using numeric gradient checking notebooks linear classifier ipynb linear classifier ipynb two layer net ipynb two layer net ipynb 3 backpropagation bn dropout the goals of this assignment are as follows understand neural networks and how they are arranged in layered architectures understand and be able to implement modular backpropagation implement various update rules used to optimize neural networks implement batch normalization for training deep networks implement dropout to regularize networks understand the architecture of convolutional neural networks and get practice with training these models on data notebooks fully connected networks ipynb fully connected networks ipynb convolutional networks ipynb convolutional networks ipynb 4 pytorch modules rnn image captioning attention style transfer the goals of this assignment are understand how autograd can help automate gradient computation see how to use pytorch modules to build up complex neural network architectures understand and implement recurrent neural networks see how recurrent neural networks can be used for image captioning understand how to augment recurrent neural networks with attention use image gradients to synthesize saliency maps adversarial examples and perform class visualizations combine content and style losses to perform artistic style transfer notebooks pytorch autograd and nn ipynb pytorch autograd and nn ipynb rnn lstm attention captioning ipynb rnn lstm attention captioning ipynb network visualization ipynb network visualization ipynb style transfer ipynb style transfer ipynb 5 single two stage object detection the goals of this assignment are learn about the object detection pipeline understand how to build an anchor based single stage object detectors understand how to build a two stage object detector that combines a region proposal network with a recognition network notebooks single stage detector yolo ipynb single stage detector yolo ipynb two stage detector faster rcnn ipynb two stage detector faster rcnn ipynb 6 generative adversarial networks gans the goals of this assignment are understand generative adversarial networks gans implement vanila gan ls gan dc gan notebooks generative adversarial networks ipynb generative adversarial networks ipynb reference eecs 498 007 598 005 deep learning for computer vision https web eecs umich edu justincj teaching eecs498 fall 2019 | ai |
|
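For the kNN classifier in the first assignment above, the key step is a fully vectorized distance computation. The sketch below is an illustrative PyTorch implementation (not the course's solution code) of pairwise squared Euclidean distances via broadcasting, followed by a top-k majority vote.

```python
# Illustrative kNN sketch in PyTorch: vectorized pairwise distances + majority vote.
import torch

def knn_predict(x_train, y_train, x_test, k=5):
    # ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b, computed without Python loops
    train_sq = (x_train ** 2).sum(dim=1)               # shape (num_train,)
    test_sq = (x_test ** 2).sum(dim=1, keepdim=True)   # shape (num_test, 1)
    dists = test_sq + train_sq - 2.0 * x_test @ x_train.t()
    # indices of the k nearest training points for every test point
    knn_idx = dists.topk(k, dim=1, largest=False).indices
    # majority vote over the neighbours' labels
    return y_train[knn_idx].mode(dim=1).values

# Tiny synthetic check with MNIST-sized vectors (random data, just to run it)
x_train = torch.randn(100, 784)
y_train = torch.randint(0, 10, (100,))
x_test = torch.randn(8, 784)
print(knn_predict(x_train, y_train, x_test, k=3))
```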
awesome-llm-attributions | a survey of attributions for large language models introduction open domain dialogue systems driven by large language models have changed the way we use conversational ai however these systems often produce content that might not be reliable in traditional open domain settings the focus is mostly on the answer s relevance or accuracy rather than evaluating whether the answer is attributed to the retrieved documents a qa model with high accuracy may not necessarily achieve high attribution attribution refers to the capacity of a model such as an llm to generate and provide evidence often in the form of references or citations that substantiates the claims or statements it produces this evidence is derived from identifiable sources ensuring that the claims can be logically inferred from a foundational corpus making them comprehensible and verifiable by a general audience the primary purposes of attribution include enabling users to validate the claims made by the model promoting the generation of text that closely aligns with the cited sources to enhance accuracy and reduce misinformation or hallucination and establishing a structured framework for evaluating the completeness and relevance of the supporting evidence in relation to the presented claims in this repository we focus on unraveling the sources that these systems tap into for attribution or citation we delve into the origins of these facts their utilization by the models the efficacy of these attribution methodologies and grapple with challenges tied to ambiguous knowledge reservoirs inherent biases and the pitfalls of excessive attribution 1 attribution definition position paper 2021 07 rethinking search making domain experts out of dilettantes donald metzler et al arxiv paper https arxiv org pdf 2105 02274 pdf this position paper says for example for question answering tasks our envisioned model is able to synthesize a singleanswer that incorporates information from many documents in the corpus and it will be able to support assertions in the answer by referencing supporting evidence in the corpus much like a properly crafted wikipedia entry supports each assertion of fact with a link to a primary source this is just one of many novel tasks that this type of model has the potential to enable 2023 03 trak attributing model behavior at scale sung min park et al arxiv paper https arxiv org abs 2303 14186 code https github com madrylab trak attributing model trace model predictions back to training data this paper introduces a data attribution method that is both effective and computationally tractable for large scale differentiable models 2021 12 measuring attribution in natural language generation models h rashkin et al cl paper https arxiv org pdf 2112 12870 pdf this paper presents a new evaluation framework entitled attributable to identified sources ais for assessing the output of natural language generation models 2023 07 citation a key to building responsible and accountable large language models jie huang et al arxiv paper https arxiv org pdf 2307 02185 pdf this position paper embarks on an exploratory journey into the potential of integrating a citation mechanism within large language models examining its prospective benefits the inherent technical obstacles and foreseeable pitfalls 2023 10 establishing trustworthiness rethinking tasks and model evaluation robert litschko et al arxiv paper https arxiv org pdf 2310 05442 pdf 2 attribution paper before the era of large language models and related 
task 2 1 fact checking claim verificication natural language inference 2021 11 the fact extraction and verification fever shared task james thorne et al emnlp 18 paper https aclanthology org w18 5501v3 pdf 2021 08 a survey on automated fact checking zhijiang guo et al tacl 22 paper https arxiv org pdf 2108 11896 pdf 2021 10 truthful ai developing and governing ai that does not lie owain evans et al arxiv paper http arxiv org abs 2110 06674 2021 05 evaluating attribution in dialogue systems the begin benchmark nouha dziri et al tacl 22 paper https aclanthology org 2022 tacl 1 62 code https github com google begin dataset 2023 10 explainable claim verification via knowledge grounded reasoning with large language models haoran wang et al findings of emnlp 23 paper https arxiv org abs 2310 05253 2022 07 improving wikipedia verifiability with ai fabio petroni et al arxiv paper https arxiv org pdf 2207 06220 pdf code https github com facebookresearch side 2 2 feature attribution and interpretability of models for nlp 2022 12 foveate attribute and rationalize towards physically safe and trustworthy ai alex mei et al findings of acl 22 paper https aclanthology org 2023 findings acl 701 pdf 2 3 attribution in mutli modal systems 2017 06 a unified view of gradient based attribution methods for deep neural networks marco ancona et al arxiv paper http www interpretable ml org nips2017workshop papers 02 pdf 2021 03 towards multi modal causability with graph neural networks enabling information fusion for explainable ai andreas holzinger et al arxiv paper https featurecloud eu wp content uploads 2021 03 holzinger et al 2021 towards multi model causability pdf 2023 03 retrieving multimodal information for augmented generation a survey ruochen zhao et al arxiv paper https arxiv org pdf 2303 10868 pdf 2023 07 improving explainability of disentangled representations using multipath attribution mappings lukas klein et al arxiv paper https arxiv org pdf 2306 09035 pdf 2023 07 visual explanations of image text representations via mult modal information bottleneck attribution ying wang et al arxiv paper https openreview net pdf id enskaebyte 2023 07 maea multimodal attribution for embodied ai vidhi jain et al arxiv paper https arxiv org pdf 2307 13850 pdf 2023 10 rephrase augment reason visual grounding of questions for vision language models archiki prasad et al arxiv paper https arxiv org abs 2310 05861 code https github com archiki repare 2 4 wiki 2019 11 transforming wikipedia into augmented data for query focused summarization haichao zhu et al taslp paper https arxiv org pdf 1911 03324 pdf 2023 04 webbrain learning to generate factually correct articles for queries by grounding on large web corpus hongjin qian et al arxiv paper https arxiv org pdf 2304 04358 pdf 2 5 model based information retrieval 2022 02 transformer memory as a differentiable search index yi tay et al neurips 22 paper https arxiv org pdf 2202 06991 pdf 3 sources of attribution 3 1 pre training data 2023 02 the roots search tool data transparency for llms aleksandra piktus et al arxiv paper https arxiv org pdf 2302 14035 pdf 2022 05 orca interpreting prompted language models via locating supporting data evidence in the ocean of pretraining data xiaochuang han et al arxiv paper https arxiv org pdf 2205 12600 pdf 2022 05 understanding in context learning via supportive pretraining data xiaochuang han et al arxiv paper https arxiv org pdf 2306 15091 pdf 2022 07 link the fine tuned llm to its pre trained base model matching pairs 
attributing fine tuned models to their pre trained large language models myles foley et al acl 2023 paper https aclanthology org 2023 acl long 410 pdf 3 2 out of model knowledge and retrieval based question answering knowledge grounded dialogue 2021 04 retrieval augmentation reduces hallucination in conversation kurt shuster et al arxiv paper https arxiv org pdf 2104 07567 pdf 2020 07 leveraging passage retrieval with generative models for open domain question answering gautier izacard et al arxiv paper https arxiv org pdf 2007 01282 pdf 2021 12 improving language models by retrieving from trillions of tokens sebastian borgeaud et al arxiv paper https arxiv org pdf 2112 04426 pdf 2022 12 rethinking with retrieval faithful large language model inference hangfeng he et al arxiv paper https arxiv org pdf 2301 00303 pdf 4 datasets for attribution 2022 12 citebench a benchmark for scientific citation text generation martin funkquist et al arxiv paper https arxiv org abs 2212 09577 2023 04 webbrain learning to generate factually correct articles for queries by grounding on large web corpus hongjing qian et al arxiv paper https arxiv org pdf 2304 04358 pdf code https github com qhjqhj00 webbrain 2023 05 enabling large language models to generate text with citations tianyu gao et al arxiv paper https arxiv org pdf 2305 14627 pdf code https github com princeton nlp alce this paper proposes alce dataset which collects a diverse set of questions and retrieval corpora and requires building end to end systems to retrieve supporting evidence and generate answers with citations 2023 07 hagrid a human llm collaborative dataset for generative information seeking with attribution ehsan kamalloo et al arxiv paper https arxiv org pdf 2307 16883 pdf code https github com project miracl hagrid this paper introduces the hagrid dataset for building end to end generative information seeking models that are capable of retrieving candidate quotes and generating attributed explanations 2023 09 expertqa expert curated questions and attributed answers chaitanya malaviya et al arxiv paper https arxiv org pdf 2309 07852 pdf code https github com chaitanyamalaviya expertqa this paper introduces the expertqa a high quality long form qa dataset with 2177 questions spanning 32 fields along with verified answers and attributions for claims in the answers 5 approaches to attribution 5 1 direct generated attribution 2023 05 according to prompting language models improves quoting from pre training data orion weller et al arxiv paper https arxiv org abs 2305 13252 this paper proposes according to prompting to directing llms to ground responses against previously observed text and propose quip score to measure the extent to which model produced answers are directly found in underlying text corpora 2023 07 credible without credit domain experts assess generative language models denis peskoff et al acl 2023 paper https aclanthology org 2023 acl short 37 2023 09 chatgpt hallucinates when attributing answers guido zuccon et al arxiv paper https arxiv org abs 2309 09401 this paper suggests that chatgpt provides correct or partially correct answers in about half of the cases 50 6 of the times but its suggested references only exist 14 of the times in thoses referenced answers the reference often does not support the claims chatgpt attributes to it 2023 09 towards reliable and fluent large language models incorporating feedback learning loops in qa systems dongyub lee et al arxiv paper https arxiv org pdf 2309 06384 pdf 2023 09 
retrieving evidence from ehrs with llms possibilities and challenges hiba ahsan et al arxiv paper https arxiv org pdf 2309 04550 pdf 2023 10 learning to plan and generate text with citations annoymous et al openreview iclr 2024 paper https openreview net forum id 6nej0renzr 5 2 retrieval then answering 2023 04 search in the chain towards the accurate credible and traceable content generation for complex knowledge intensive tasks shicheng xu et al arxiv paper https arxiv org pdf 2304 14732 pdf 2023 05 mitigating language model hallucination with interactive question knowledge alignment shuo zhang et al arxiv paper https arxiv org pdf 2305 13669 pdf 2023 03 smartbook ai assisted situation report generation revanth gangi reddy et al arxiv paper https arxiv org abs 2303 14337 5 3 post generation attribution 2022 10 rarr researching and revising what language models say using language models luyu gao et al arxiv paper https arxiv org abs 2210 08726 2023 04 the internal state of an llm knows when its lying amos azaria et al arxiv paper http arxiv org abs 2304 13734 this paper utilizes the llm s hidden layer activations to determine the veracity of statements by a classifier receiveing as input the activation values from the llm for each of the statements in the dataset 2023 05 do language models know when they re hallucinating references ayush agrawal et al arxiv paper http arxiv org abs 2305 18248 2023 05 complex claim verification with evidence retrieved in the wild jifan chen et al arxiv paper https arxiv org abs 2305 11859 code https github com jifan chen fact checking via raw evidence this paper proposes a pipeline claim decomposition multi granularity evidence retrieval claim focused summarization to improve veracity judgments 2023 06 retrieving supporting evidence for llms generated answers siqing huo et al arxiv paper https arxiv org pdf 2309 11392 pdf this paper proposes a two step verification the llm s answer and the retrieved document queried by question and llm s answer are compared by llm checking whether the llm s answer is hallucinated 5 4 attribution systems end to end attribution models 2022 03 lamda language models for dialog applications romal thoppilan et al arxiv paper https arxiv org pdf 2201 08239 pdf 2022 03 webgpt browser assisted question answering with human feedback reiichiro nakano jacob hilton suchir balaji et al arxiv paper https arxiv org pdf 2112 09332 pdf 2022 03 gophercite teaching language models to support answers with verified quotes jacob menick et al arxiv paper http arxiv org abs 2203 11147 2022 09 improving alignment of dialogue agents via targeted human judgements amelia glaese et al arxiv paper https arxiv org pdf 2209 14375 pdf 2023 05 webcpm interactive web search for chinese long form question answering yujia qin et al arxiv paper https arxiv org pdf 2305 06849 pdf 6 attribution evaluation 2022 07 improving wikipedia verifiability with ai fabio petroni et al arxiv paper https arxiv org pdf 2207 06220 pdf 2022 12 attributed question answering evaluation and modeling for attributed large language models b bohnet et al arxiv paper https arxiv org pdf 2212 08037 pdf code https github com google research datasets attributed qa 2023 04 evaluating verifiability in generative search engines nelson f liu et al arxiv paper http arxiv org abs 2304 09848 annonated data https github com nelson liu evaluating verifiability in generative search engines 2023 05 wice real world entailment for claims in wikipedia ryo kamoi et al arxiv paper https arxiv org pdf 2303 
01432 pdf 2023 05 evaluating and modeling attribution for cross lingual question answering benjamin muller et al arxiv paper https arxiv org abs 2305 14332 2023 05 factscore fine grained atomic evaluation of factual precision in long form text generation sewon min et al arxiv paper https arxiv org abs 2305 14251 code https github com shmsw25 factscore 2023 05 automatic evaluation of attribution by large language models x yue et al arxiv paper https arxiv org pdf 2305 06311 pdf code https github com osu nlp group attrscore this paper investigate the automatic evaluation of attribution by llms attributionscore by providing a definition of attribution and then explore two approaches for automatic evaluation the results highlight both promising signals as well as remaining challenges for the automatic evaluation of attribution 2023 07 factool factuality detection in generative ai a tool augmented framework for multi task and multi domain scenarios i chun chern et al arxiv paper https arxiv org abs 2307 13528v2 code https github com gair nlp factool 2023 09 quantifying and attributing the hallucination of large language models via association analysis li du et al arxiv paper https arxiv org pdf 2309 05217v1 pdf 2023 10 towards verifiable generation a benchmark for knowledge aware language model attribution xinze li et al arxiv paper https arxiv org pdf 2310 05634 pdf this paper defines a new task of knowledge aware language model attribution kalma and builds a dataset in biography domain biokalma via a well designed evolutionary question generation strategy 2023 10 understanding retrieval augmentation for long form question answering hung ting chen et al arxiv paper https arxiv org pdf 2310 12150 pdf 7 limitations future directions and challenges in attribution a hallucination of attribution i e does attribution faithfully to its content b inability to attribute parameter knowledge of model self c validity of the knowledge source source trustworthiness faithfulness factuality d bias in attribution method e over attribution under attribution f knowledge conflict cite misc llmattribution2023 title a survey of attributions for large language models author dongfang li zetian sun xinshuo hu zhenyu liu year 2023 howpublished url https github com hitsz tmg awesome llm attributions for finding survey of hallucination please refer to siren s song in the ai ocean a survey on hallucination in large language models cognitive mirage a review of hallucinations in large language models a survey of hallucination in large foundation models project maintainers contributors dongfang li http crazyofapple github io zetian sun xinshuo hu zhenyu liu | ai |
|
automated-parallel-processing | automate parallel processing data engineering project runs airflow on aws ec2 to apply parallel tasks on aws rds postgres instance database as the database all commands used sudo apt update sudo apt install python3 pip sudo apt install python3 10 venv python3 m venv airflow venv sudo pip install pandas sudo pip install s3fs sudo pip install fsspec sudo pip install apache airflow sudo pip install apache airflow providers postgres psql h rds db test yml 4 cvzpgj7bczqy us west 2 rds amazonaws com p 5432 u postgres w create extension aws s3 cascade aws iam create role role name postgresql s3 role yml 4 assume role policy document version 2012 10 17 statement effect allow principal service rds amazonaws com action sts assumerole aws iam create policy policy name postgress3policy yml 4 policy document version 2012 10 17 statement sid s3import action s3 getobject s3 listbucket effect allow resource arn aws s3 testing ymlo arn aws s3 testing ymlo aws iam attach role policy policy arn arn aws iam 177571188737 policy postgress3policy yml 4 role name postgresql s3 role yml 4 aws rds add role to db instance db instance identifier rds db test yml 4 feature name s3import role arn arn aws iam 177571188737 role postgresql s3 role yml 4 region us west 2 | server |
|
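To make the parallel-task idea in the project above concrete, here is a minimal hypothetical Airflow DAG that fans several independent SQL statements out against the RDS Postgres instance; because none of the tasks depends on another, the scheduler can run them concurrently. The connection id, table names, and SQL statements are placeholders, not the project's actual DAG code.

```python
# Hypothetical sketch of a DAG with independent (hence parallel) Postgres tasks.
from datetime import datetime

from airflow import DAG
from airflow.providers.postgres.operators.postgres import PostgresOperator

with DAG(
    dag_id="parallel_postgres_tasks",
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    # One task per table; "rds_postgres" is an assumed Airflow connection id
    # pointing at the RDS instance, and the table names are placeholders.
    tasks = [
        PostgresOperator(
            task_id=f"refresh_{table}",
            postgres_conn_id="rds_postgres",
            sql=f"ANALYZE {table};",
        )
        for table in ("orders", "customers", "payments")
    ]
```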
janeway | janeway logo http www openlibhums org hosted files janeway logo 05 png janeway janeway is a web based platform for publishing journals preprints conference proceedings and books it is developed and maintained by a team of developers at the open library of humanities part of birkbeck university of london technology janeway is written in python 3 6 and utilises the django framework 3 2 installation instructions developer installation instructions are available in our documentation site https janeway readthedocs io en latest installation html installation guide a guide for installing on the live environment with apache and mod wsgi https github com birkbeckctp janeway wiki janeway 2c apache and wsgi is also available running janeway with docker janeway s development server can be run within a docker container avoiding the need to install and run its dependencies from your machine a docker compose file as well as a makefile can be found at the root of the project wrapping the most common operations docker is compatible with multiple architectures and operating systems if you need help installing docker have a look at the docker documentation https docs docker com install simarly to the native installation janeway can be installed in a docker environment by running make install and following the installation steps described above https github com birkbeckctp janeway wiki installation as a result a database volume will be populated under janeway db postgres data once installation is completed just type make janeway to run janeway with a postgres backend default behaviour if a change to the dependencies for janeway is required the janeway container can be re created with make rebuild the database volume will be preserved in order to run a different rdbms the environment variable db vendor can be set to one of postgres mysql or sqlite e g db vendor mysql make install make uninstalling janeway is as simple as running make uninstall which will delete all docker related containers as well as wipe the database volume janeway design principles 1 no code should appear to work by magic readability is key 2 testing will be applied to security modules and whenever a post launch bugfix is committed we do not aim for total testing but selective regression testing 3 security bugs jump the development queue and are a priority 4 we will never accept commits of or ourselves write paywall features into janeway current development what are we working on right now for a high level view check out our public roadmap https github com orgs birkbeckctp projects 21 you can get more detail by viewing our project boards here on github https github com orgs birkbeckctp projects open a project to see which issues it includes and what their status is the status should be one of these to do we plan to do this and include it in this release in progress someone is working on it at this very moment pr submitted this means one developer has come up with a solution and is waiting for feedback from others done this means at least one other developer has approved the solution and it has been merged into the main codebase in preparation for the release we aim to build releases in 8 week sprints though some development cycles have taken quite a bit longer more on how we develop janeway https github com orgs birkbeckctp projects 21 views 1 pane issue itemid 18253226 licensing janeway is available under the terms of the gnu affero general public license version 3 19 november 2007 contributions we welcome all code contributions 
via pull requests where they can be reviewed and suggestions for enhancements via issues we do not currently have a code of conduct for this repo but expect contributors to be courteous to one another in order to more easily associate changes to their respective github issues please adhere to the following conventions branch names should be prefixed with the issue number they are related to followed by either feature or hotfix depending on the nature of the change e g 66 feature start every commit with a reference to the github issue they are related to e g 66 adds new feature xyz contacts if you wish to get in touch about janeway contact information is provided below andy byers senior publishing technologies developer a byers bbk ac uk major releases major releases are listed below between v1 3 v1 4 there were a large number of minor releases you can find more information on the releases https github com birkbeckctp janeway releases page version released code name v1 0 10 07 2017 kathryn v1 1 01 09 2017 chakotay v1 2 06 11 2017 tuvok v1 3 10 08 2018 doctor v1 4 25 10 2021 kes geolocation janeway includes geolite2 data created by maxmind available from https www maxmind com https www maxmind com | janeway journal publishing preprints | os |
jku-ws20 | special topics cloud computing from an engineering perspective in this repository the content for the special topic cloud computing from an engineering perspective is maintained in this course we will cover the journey from code to a scalable application running in the cloud therefore we will learn how to build applications using continuous integration pipelines after building a state of the art cloud technology to operate applications namely kubernetes is explained then ways to deploy applications into a kubernetes environment are explored finally we will learn how to operate an application in a production environment table of contents this part of the course consists of five topic areas building a microservice 1 20building 20a 20microservice continuous integration 2 20continuous 20integration kubernetes 3 20kubernetes continuous deployment 4 20continuous 20deployment operations 5 20operations maintainers johannes br uer octocat johannes b andreas grimmer octocat agrimmer | cloud |
|
frontend | h1 align center blockscout frontend h1 p align center span frontend application for span a href https github com blockscout blockscout blob master readme md blockscout a span blockchain explorer span p running and configuring the app app is distributed as a docker image here you can find information about the package https github com blockscout frontend pkgs container frontend and its recent releases https github com blockscout frontend releases you can configure your app by passing necessary environment variables when stating the container see full list of envs and their description here docs envs md sh docker run p 3000 3000 env file path to your env file ghcr io blockscout frontend latest alternatively you can build your own docker image and run your app from that please follow this guide docs custom build md for more information on migrating from the previous frontend please see the frontend migration docs https docs blockscout com for developers frontend migration contributing see our contribution guide docs contributing md for pull request protocol we expect contributors to follow our code of conduct code of conduct md when submitting code or comments resources app envs list docs envs md contribution guide docs contributing md making a custom build docs custom build md frontend migration guide https docs blockscout com for developers frontend migration license license gpl v3 0 https img shields io badge license gpl 20v3 blue svg https www gnu org licenses gpl 3 0 this project is licensed under the gnu general public license v3 0 see the license license file for details | blockchain ethereum explorer | front_end |
BlockchainForFederatedLearning | decentralization of artificial intelligence with federated learning on blockchain this repo contains the code and data for doing federated learning on mnist dataset on blockchain ieee paper record and reward federated learning contributions with blockchain https ieeexplore ieee org document 8945913 installation before you do anything else you will first need to install the following python packages absl py 0 5 0 astor 0 7 1 certifi 2018 10 15 chardet 3 0 4 click 7 0 cycler 0 10 0 flask 1 0 2 gast 0 2 0 grpcio 1 15 0 h5py 2 8 0 idna 2 7 jinja2 2 10 keras applications 1 0 6 keras preprocessing 1 0 5 kiwisolver 1 0 1 markdown 3 0 1 markupsafe 1 0 matplotlib 3 0 0 numpy 1 15 2 protobuf 3 6 1 pyparsing 2 2 2 python dateutil 2 7 3 requests 2 20 0 six 1 11 0 tensorboard 1 11 0 termcolor 1 1 0 urllib3 1 24 werkzeug 0 14 1 itsdangerous 1 1 0 tensorflow 1 11 0 these are specified in the src packages to install txt file this project was built using python3 but may work with python2 given a few minor tweaks preprocessing the next step is to build the federated dataset to do federating learning on you can prepare it by running this script python src data federated data extractor py the default split is 2 split dataset dataset 2 which can be changed as per your number of clients training once you ve generated chunks of federated data x d you can begin training for this simply run the following bash script src run blockfl sh assuming you ve installed all dependencies and everything else successfully this should start federated learning on the generated federated datasets on blockchain retrieving the models once you ve finished training you can get the aggregated globally updated model federated modelx block per round from the src blocks folder | blockchain |
|
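As an illustration of what the federated data split above produces (this is not the repository's federated_data_extractor implementation), the sketch below shards a dataset into n client chunks and pickles each chunk; the chunk file naming and the use of pickle are assumptions made only for the example.

```python
# Illustration only: split a dataset into n client shards, one pickle file each.
import pickle
import numpy as np

def split_dataset(images, labels, n_clients=2, out_prefix="federated_data"):
    idx = np.random.permutation(len(images))
    for client, part in enumerate(np.array_split(idx, n_clients)):
        chunk = {"images": images[part], "labels": labels[part]}
        with open(f"{out_prefix}_{client}.pkl", "wb") as f:
            pickle.dump(chunk, f)

# Tiny synthetic stand-in for MNIST, just so the sketch runs end to end.
images = np.random.rand(1000, 28, 28).astype("float32")
labels = np.random.randint(0, 10, size=1000)
split_dataset(images, labels, n_clients=2)
```

Changing n_clients corresponds to changing the default split of 2 mentioned above.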
installer | installer translation status https l10n elementary io widgets installer svg badge svg https l10n elementary io projects installer utm source widget note this is the installer for elementary os 6 and newer for the elementary os 5 1 and older installer see ubiquity https wiki ubuntu com ubiquity screenshot data screenshot progress png raw true an installer for open source operating systems see the wiki https github com elementary installer wiki for goals design spec user flow and details about each step building testing and installation you ll need the following dependencies meson desktop file utils gettext gparted libgnomekbd dev libgranite dev 0 5 libgtk 3 dev libgee 0 8 dev libhandy 1 dev libjson glib dev libpwquality dev libxml2 dev libxml2 utils distinst https github com pop os distinst valac run meson build to configure the build environment change to the build directory and run ninja test to build and run automated tests meson build prefix usr cd build ninja test to install use ninja install then execute with io elementary installer note that listing drives and actually installing requires root sudo ninja install io elementary installer you can also use test mode for development to disable destructive behaviors like installing restarting and shutting down io elementary installer test for debug messages set the g messages debug environment variable e g to all g messages debug all io elementary installer | pantheon oem linux vala meson gtk3 hacktoberfest distinst | front_end |
Bularafa- | bularafa information technology student | server |
|
Labs | ios code samples work 1 stretchableimage blog post http jayrparro com blog 2012 05 17 stretchable images in ios 5 2 dropshadow blog post http jayrparro com blog 2012 07 28 adding drop shadow effect in a view 3 bands core animation sample http jayrparro com blog 2013 01 22 bands core animation sample br you can contact me thru personal blog jayr parro http jayrparro com twitter http twitter com jayrparro http twitter com jayrparro br thanks | front_end |
|
StateOS-STM32F3Discovery | stateos ci https github com stateos stateos stm32f3discovery actions workflows test yml badge svg https github com stateos stateos stm32f3discovery actions workflows test yml free extremely simple amazingly tiny and very fast real time operating system rtos designed for deeply embedded applications template not tested target stm32f3discovery board license this project is licensed under the terms of mit license mit https opensource org licenses mit | stm32f3-discovery | os |
ML-examples | ml examples source code for machine learning tutorials and examples used in arm s ml developer space https developer arm com technologies machine learning on arm developer material projects and tutorials arm nn mobilenet on android deploy a quantized tensorflowlite mobilenet v2 model on android using the arm nn sdk tutorial on arm s developer site coming soon source code on github https github com arm software ml examples tree main armnn mobilenet android readme md arm style transfer on android implement a neural style transfer on android with arm nn apis tutorial on arm s developer site https developer arm com solutions machine learning on arm developer material how to guides implement a neural style transfer on android with arm nn apis source code on github https github com arm software ml examples tree main armnn style transfer android readme md cmsis pack based examples for arm corstone 300 cmsis project showing keyword spotting kws and object detection on arm corstone 300 tutorial on arm s developer site coming soon source code on github https github com arm software ml examples tree main cmsis pack examples readme md ethos u on corstone 300 explore the arm corstone 300 with arm cortex m55 and arm ethos u55 npu tutorial on arm s developer site coming soon source code on github https github com arm software ml examples tree main ethos u corstone 300 readme md multi gesture recognition train a convolutional neural network from scratch to recognize multiple gestures in a wide range of conditions with tensorflow and a raspberry pi 4 model b or pi 3 tutorial on arm s developer site https developer arm com technologies machine learning on arm developer material how to guides teach your pi multi gesture source code on github https github com arm software ml examples tree main multi gesture recognition readme md fire detection on a raspberry pi using pyarmnn deploy a neural network trained to recognize images that include a fire or flames on a raspberry pi using pyarmnn tutorial on arm s developer site https developer arm com solutions machine learning on arm developer material how to guides accelerate ml inference on raspberry pi with pyarmnn source code on github https github com arm software ml examples blob main pyarmnn fire detection readme md pytorch to tensorflow deploy a jupyter notebook that will demonstrate how to convert a model trained in pytorch to tensorflow lite format tutorial on arm s developer site coming soon source code on github https github com arm software ml examples tree main pytorch to tflite readme md rnn unrolling for tf lite deploy a jupyter notebook that will demonstrate how to train a recurrent neural network rnn in tensorflow and then prepare it for exporting to tensorflow lite format by unrolling it tutorial on arm s developer site coming soon source code on github https github com arm software ml examples tree main rnn unrolling tflite readme md image recognition on mbed using cmsis and tflm deply an image recognition demo on a discovery stm32f746g board using tensorflow lite for microcontrollers tflm and cmsis nn tutorial on arm s developer site https developer arm com solutions machine learning on arm developer material how to guides image recognition on arm cortex m with cmsis nn source code on github https github com arm software ml examples tree main tflm cmsisnn mbed image recognition readme md yeah world explore gesture recognition with tensorflow and transfer learning on the raspberry pi 4 model b pi 3 and pi zero tutorial on arm s 
developer site https developer arm com technologies machine learning on arm developer material how to guides teach your raspberry pi yeah world source code on github https github com arm software ml examples tree main yeah world readme md | arm machine-learning python deep-learning deep-neural-networks neural-network ml raspberry-pi raspberry-pi-3 | ai |
ErrorMessagesAPI | welcome to the errormessagesapi web api nowadays many applications are depoloyed on the cloud using the infrastructure of the cloud providers such like aws google ovh etc heroku is a platform as a service paas that facilitates the applications deployment on the cloud the working engineers in this company end up with 12 principles that should be done to deploy a scalable portable and ready cloud native application errormessagesapi is a rest api that satisfies the 12 factor methodology https 12factor net this api exposes the crud create read update delete on the message model the api is built using the aps net core the main domain of this api is to provide a way for others to mange their sw errors get get by id add delete anytime will be explained later in details system requirments all you need to run this api is a microsoft visual studio community 2019 docker desktop local minikube installed mongo database image api methods table tbody tr th http verb th th uri th th action desc th th consumes th th produces th th parameters th th responses th tr tr td align center get td td align center messages td td align center get all messages td td align center td td align center application json td td align center td td align center ul li 200 description get all messages li li 404 description no messages found li ul td tr tr td align center get td td align center messages id td td align center get message by id td td align center td td align center application json td td align center id integer td td align center ul li 200 description successfull operation li li 400 description invalid id supplied li li 404 description message not found li ul td tr tr td align center post td td align center messages td td align center create a new message td td align center application json td td align center application json td td align center td td align center ul li 200 description successfull operation li ul td tr tr td align center put td td align center messages id td td align center update the message with id td td align center application json td td align center application json td td align center id integer td td align center ul li 200 description successfull operation li li 400 description invalid id supplied li li 404 description message not found li ul td tr tr td align center delete td td align center messages id td td align center delete message with id td td align center td td align center td td align center id integer td td align center ul li 200 description successfull operation li li 400 description invalid id supplied li li 404 description message not found li ul td tr tbody table contribution visit the issue tracker https github com swen6305cloudnative sra messagesapi issues to find a list of open issues that need attention fork then clone the repo git clone https github com swen6305cloudnative sra messagesapi git create a new branch git checkout b name of your new branch push the branch on github git push origin name of your new branch 1 codebase one codebase tracked in revision control many deploys br to achieve the first factor we created a repository for the backend api called errormessagesapi in this repository two branches have been created 1 master branch for the production 2 development branch for the development the source code has been tracked using the github platform and every team member contributes using pull requests mechanism 2 dependencies explicitly declare and isolate dependencies br in the twelve factor app we should never relies on the implicit existence of 
system wide package instead we should declare all our dependencies in a dependecy declaration manifest this is to ensure that no implicit dependecies leak in from the surrounding system since we are using microsoft development platform nuget package manager is used the nuget package manager is responsible for declaring and isolating the application dependencies and it allows you to easily install uninstall and update nuget packages in projects and solutions in our api here is the list of installed packages microsoft extensions options 3 1 9 microsoft visualstudio azure containers tools targets 1 10 9 mongodb bson 2 11 3 mongodb driver 2 11 3 swashbuckle aspnetcore 5 6 3 visual studio can restore packages automatically when it builds a project and you can restore packages at any time through visual studio nuget restore package restore first installs the direct dependencies of a project as needed then installs any dependencies of those packages throughout the entire dependency graph 3 configuration store config in the environment br the application configuration is everything that vary among deploys these configurations should be separated from the application code according to our application the database connection string is considered a configuration and should be separated from the code so we store that string in the user environment under the name of connectiostring we also store the name of the database in an environmental variable below are the environment variables of our api environmental variables for the error message api in the docker compose file under the environment four variables are defined host mongo root user mongo root password db name the values of the four varaibles are stored in the user environmental variables errormessage api environment host host username mongo root user password mongo root password database db name the four env variables are read in code to consturct the url of the mongo database in the class name mongodbconfig cs https github com eng aomar errormessagesapi blob master config mongodbconfig cs besides the mongo database and the mongo express credentials are stored in the docker compose file such as the following mongo env varaibles environment mongo initdb root username mongo root user mongo initdb root password mongo root password mongo express env varaibles environment me config mongodb adminusername mongo root user me config mongodb adminpassword mongo root password in k8s the configeration is satisied by using secert file to store the base64 encrypted passwords mongo sercert https github com eng aomar errormessagesapi blob master k8s mongo secret yml 4 backing services treat backing services as attached resources br anything external to a service is treated as an attached resource including other services in general backing services are those that our application consumes over the network as part of its normal process in our case mongodb is considered a backing service that can be accessed using the credentials stored in the config file see factor 3 applying this factor ensures that every service is completely portable and loosely coupled to the other resources in the system strict separation increases flexibility during development developers only need to run the services they are modifying not others a database mongo db should be referenced by a simple endpoint url and credentials if necessary 5 build release run strictly separate build and run stages br build release and run phases must be kept separated in order to transform our codebase to 
a deploy we should pass 3 stages br 1 build stage where the code repo is converted to an executable bundle called build in this stage adll dependencies are fetched and the binaries files are complied br 2 release stage where we combines the build produced in the previous stage with the deploy config therefore ready to be executed note that each release must have a unique id and using the release management tools we can rollback to previous releases br 3 run stage where we run the application in the execution environment br the whole pipline is automated using github actios by buliding a continous integration https github com eng aomar errormessagesapi blob master github workflows ci yml and a continous delivery https github com eng aomar errormessagesapi blob master github workflows cd yml for our api a release is deployed on the execution environment and must be immutable what does that mean for our application we ll use docker in the whole development pipeline we will start by adding a dockerfile that will help define the build phase during which the dependencies are compiled in node modules folder multi stage build the multi stage build feature helps make the container building process more efficient and it also makes containers smaller by letting them contain only the bits your application needs at runtime the multi stage version is used for net core projects see https aka ms containerfastmode to understand how visual studio uses this dockerfile to build your images for faster debugging from mcr microsoft com dotnet core aspnet 3 1 buster slim as base workdir app expose 80 expose 443 from mcr microsoft com dotnet core sdk 3 1 buster as build workdir src copy messagesapi messagesapi csproj messagesapi run dotnet restore messagesapi messagesapi csproj copy workdir src messagesapi run dotnet build messagesapi csproj c release o app build from build as publish run dotnet publish messagesapi csproj c release o app publish from base as final workdir app copy from publish app publish entrypoint dotnet messagesapi dll base stage the lines in the dockerfile begin with the buster image from microsoft container registry mcr microsoft com and create an intermediate image base that exposes ports 80 for http and 443 for https and sets the working directory to app the next stage is build which appears as follows from mcr microsoft com dotnet core sdk 3 1 buster as build workdir src copy messagesapi messagesapi csproj messagesapi run dotnet restore messagesapi messagesapi csproj copy workdir src messagesapi run dotnet build messagesapi csproj c release o app build build stage the build stage starts from a different original image from the registry sdk rather than aspnet rather than continuing from base the sdk image has all the build tools and for that reason it s a lot bigger than the aspnet image which only contains runtime components the reason for using a separate image becomes clear when you look at the rest of the dockerfile final stage from build as publish run dotnet publish messagesapi csproj c release o app publish from base as final workdir app copy from publish app publish entrypoint dotnet messagesapi dll the final stage starts again from base and includes the copy from publish to copy the published output to the final image this process makes it possible for the final image to be a lot smaller since it doesn t need to include all of the build tools that were in the sdk image let s build our application docker build t messageapi and verify the resulting image is in the list of available 
images docker images repository tag image id created size messagesapi dev bccbbeb61b61 9 hours ago 347mb mcr microsoft com dotnet core aspnet 3 1 nanoserver 1903 08f393180806 2 weeks ago 347mb now the image build is available execution environment must be injected to create a release there are several options to inject the configuration in the build among them create a new image based on the build define a compose file we ll go for the second option and define a docker compose file where the mongo url will be set with the value of the execution environment version 3 4 services errormessagesapi image docker registry errormessagesapi container name error messages api build context dockerfile dockerfile environment host host username mongo root user password mongo root password database db name depends on mongo networks error messgeapi network mongo image mongo container name mongo restart always ports 27017 27017 environment mongo initdb root username mongo root user mongo initdb root password mongo root password mongo initdb database db name volumes mongo data data db initmongo js docker entrypoint initdb d initmongo js ro networks error messgeapi network mongo express image mongo express container name mongo express restart always ports 8081 8081 environment me config mongodb adminusername mongo root user me config mongodb adminpassword mongo root password depends on mongo networks error messgeapi network volumes mongo data driver local networks error messgeapi network driver bridge in the dockor compose we connect three images messagapi mongo mongo express docker compose creat the defualt network between the images regarding the mongo image we define the mongo data volume locally inside the conatiner to presist the data the mongo express is used to provide a simple interface to the mongo database mongo express can be accessed using the following url http localhost 8081 by defualt mongo database has alreday three pre defined databases as the picture shows we add messagedb database and message collection to it mongo express https user images githubusercontent com 55650010 100858792 1c092700 3497 11eb 9e8e a9f280afb10c png this file defines a release as it considers a given build and inject the execution environment the run phase can be done manually with compose cli or through an orchestrator docker cloud compose cli enables to run the global application as simple as docker compose up d note when this command is executed docker compose will pull all the images necessary for the setup generate all the services configured in the docker compose file create the network between the containers set the environment variables for the containers and expose the configured ports conatiners https user images githubusercontent com 55650010 100858948 4eb31f80 3497 11eb 85b6 69d243525145 png fork then clone the repo git clone https github com swen6305cloudnative sra messagesapi git create a new branch git checkout b name of your new branch push the branch on github git push origin name of your new branch continuous integration below is the yaml file name net on push branches master pull request branches master jobs build runs on ubuntu latest steps uses actions checkout v2 name setup net uses actions setup dotnet v1 with dotnet version 3 1 402 name restore dependencies run dotnet restore errormessagesapi csproj name build run dotnet build errormessagesapi csproj no restore name test run dotnet test errormessagesapi csproj no build verbosity normal continuous deployment after the code is buit successfuly it 
is then depolyed to the docker hub sraorganaization o the errormessageapi repository name push to docker hub cd on push branches master pull request branches master jobs build runs on ubuntu latest steps uses actions checkout v2 name docker login env docker user secrets docker hub username docker password secrets dockerhub token run docker login u docker user p docker password name build the docker image run docker build file dockerfile tag secrets docker organization errormessageapi name docker push run docker push secrets docker organization errormessageapi deploy to kubernetes loacaly the following steps are followed to setup k8s start a cluster by running minikube start to access kubernetes dashboard of the started cluster we run minikube dashboard now we can interact with our cluster using the kubectl deploy conatiners to k8s k8s https user images githubusercontent com 55650010 106782711 2a202300 6653 11eb 9f08 82f3c54f7bd5 jpg to deploy to k8s we created four deployemts one configmap one secert file to show the yaml files please click here https github com eng aomar errormessagesapi tree master k8s image https user images githubusercontent com 55650010 106785579 32c62880 6656 11eb 8975 a3a526bbc1b7 png to deploy the image to kubernetes the commands found here https github com eng aomar errormessagesapi blob master k8s k8s commands md are executed in order kubectl apply f mongo secret yml kubectl apply f mongo yml kubectl apply f mongo configmap yml kubectl apply f mongo express yml kubectl apply f error message api deployment yml the following are the service deployment and pods of our deployed image kubectl get all name ready status restarts age pod errormessage api deployment 658d79df9f 49c4m 1 1 running 0 5d7h pod errormessage api deployment 658d79df9f npf7r 1 1 running 0 5d7h pod mongo express 78fcf796b8 4cmvz 1 1 running 0 6d5h pod mongodb deployment 8f6675bc5 tnvgn 1 1 running 0 6d5h name type cluster ip external ip port s age service errormessage api service loadbalancer 10 108 119 117 pending 8080 31829 tcp 5d7h service kubernetes clusterip 10 96 0 1 none 443 tcp 6d6h service mongo express service loadbalancer 10 103 189 131 pending 8081 30000 tcp 6d5h service mongodb service clusterip 10 98 52 19 none 27017 tcp 6d5h name ready up to date available age deployment apps errormessage api deployment 2 2 2 2 5d7h deployment apps mongo express 1 1 1 1 6d5h deployment apps mongodb deployment 1 1 1 1 6d5h name desired current ready age replicaset apps errormessage api deployment 658d79df9f 2 2 2 5d7h replicaset apps mongo express 78fcf796b8 1 1 1 6d5h replicaset apps mongodb deployment 8f6675bc5 1 1 1 6d5h 6 processes execute the app as one or more stateless processes br every meaningful business application creates modifies or uses data in it a synonym for data is state an application service that creates or modifies persistent data is called stateful component in general database services are stateful component on the other hand application components that do not create or modify persistent data are called stateless components stateless components are easier and simpler to handle and can be easily scaled up and down compared with stateful components moreover they can easily restarted on a completely different node because they have no persistent data associated with them error messsage api is a statless application as we don t store data inside our api we use backing services mongo database to store our data 7 port binding export services via port binding br this factor talks about 
exposing the application services to the outside world using port binding in general docker containers can connect to the outside world without further configuration but the outside world cannot connect to docker containers by default to allow communication between different containers in the same network we need to publish our container on one or more ports and this is done by the expose command in our dockerfile we put the expose 80 command we can put any port number this tells docker that our container s service can be connected to on port 80 note the ports are configured with the host port container port syntax in our api port binding is satisfied using both docker compose file and k8s yaml deployment service files as the service of each application forwards the traffic to the specified port using target port that points to the container port 8 concurrency scale out via the process model br sometimes the application becomes bigger and can t handle all coming request therefore we need to do a horizontal scaling that means to create multiple and concurrent processes instances and then distribute the load of your application among those processes in our application this can be done using the docker compose file by using the scale to run multiple instances of a service here is the syntax docker compose up scale service name num of services however once we try to run this command an error occures because we are trying to map num of services on the same port on our docker host engine to solve this problem we should remove the port from the docker compose file but using this approach we can t know the ports to access these services until the containers are started to know the different processes we run this command docker compose ps for more details you can click here https pspdfkit com blog 2018 how to use docker compose to run multiple instances of a service in development moreover kubernetes also allows us to scale the stateless application at runtime by defining the replicaset where it automatically scale the number of pods with that number 9 disposability maximize robustness with fast startup and graceful shutdown br the different instances of our application are disposable which means that they can started or stopped at any moment this actually facilitate the scalability and deployment for the application this is achieved in out application when we use volumes storing long term persistent data outside the container in an external to docker database in docker volumes means that we can desdtriy any container without affecting the user experience pods in k8s are considered stateless ephemeral objects they can be stopped started and deleted gracefully without the knowledge of the end users 10 dev prod parity keep development staging and production as similar as possible historically there have been substantial gaps between development a developer making live edits to a local deploy of the app and production a running deploy of the app accessed by end users these gaps manifest in three areas the time gap a developer may work on code that takes days weeks or even months to go into production the personnel gap developers write code ops engineers deploy it the tools gap developers may be using a stack like nginx sqlite and os x while the production deploy uses apache mysql and linux what does that mean for our application we use continuous deployment to minimize the gap between deveoplment and production by pushing the docker image continously into docker hub on push and pull reguest of the 
master branch 11 logs treat logs as event streams br image https user images githubusercontent com 55650010 106784093 a36c4580 6654 11eb 8614 f8efd5ef9aaf png to satisfy this factor we are going to do logging at the node level on kubernetes using fluend agent https docs fluentd org v 0 12 which is collects logging data from containers on that are running on k8s and unify all the logs then forwards them to a preferred destination such as elasticsearch mongo database kafka and others the flunetd github repository https github com fluent fluentd kubernetes daemonset provides many templates we implemented the fluentd daemonset elasticsearch https github com fluent fluentd kubernetes daemonset blob master fluentd daemonset elasticsearch yaml to define the logging mechanism 12 admin processes run admin management tasks as one off processes in our case we stored the taml files for both kubernetes and docker compose ready to deploy our apps in any envionment in the same repostory you can find all needed yaml files in the following folder k8s https github com eng aomar errormessagesapi tree master k8s 13 open api spesification in our api we used the swagger openapi https swagger io for documentation swagger is used together with a set of open source software tools to design build document and use restful web services we integrate our dotnet cor api with swagger using the following code in startup cs https github com eng aomar errormessagesapi blob master startup cs in order to view the documentation of our api you are supposed to run the project and go to this link https localhost 5001 swagger sub note the port number may be different in your run sub swagger https user images githubusercontent com 55650010 100859067 773b1980 3497 11eb 86ee 260f1d881528 png | asp-net-core docker docker-compose kuberentes swagger 12-factor cloud-native | cloud |
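A note on the config factor described in the errormessagesapi entry above: the sketch below illustrates the same store-config-in-the-environment pattern, reading the variables the compose file injects (HOST, USERNAME, PASSWORD, DATABASE) and assembling a MongoDB connection string. The actual API does this in C# in MongoDbConfig.cs; Python is used here purely as an illustration, and 27017 is the port mapped by the mongo service in the compose file.

```python
import os

# Illustration of factor 3 (store config in the environment), assuming the variable
# names used in the docker-compose file: HOST, USERNAME, PASSWORD, DATABASE.
host = os.environ["HOST"]
user = os.environ["USERNAME"]
password = os.environ["PASSWORD"]
database = os.environ["DATABASE"]

# Build the MongoDB URI from the environment instead of hard-coding credentials;
# 27017 is the port exposed by the mongo service in the compose file.
connection_string = f"mongodb://{user}:{password}@{host}:27017/{database}"
print(connection_string)
```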
HandsomeMod handsomemod 21 03 iot freedom for the end user logo and wayland xorg screenshots omitted feature overview opkg package manager and procd init lower memory usage than debian or some systemd based linux distributions support for qt5 gtk3 xorg wayland and lots of graphics stuff support for sound csi camera encoder decoder and drm on the allwinner platform support for libraries commonly used in embedded projects opencv ncnn wiringpi etc support for networkmanager and connman a just enough generic os for embedded devices building firmware to build your own firmware you need a linux bsd or macosx system case sensitive filesystem required cygwin is unsupported you need gcc binutils bzip2 flex python perl make find grep diff unzip gawk getopt subversion libz dev and libc headers installed 1 run scripts feeds update a to obtain all the latest package definitions defined in feeds conf feeds conf default 2 run scripts feeds install a to install symlinks for all obtained packages into package feeds 3 run make menuconfig to select your preferred configuration for the toolchain target system firmware and packages 4 run make to build your firmware this will download all sources build the cross compile toolchain and then cross compile the linux kernel and all chosen applications for your target system mainly supported platforms these socs get better support than others allwinner socs linux mainline qualcomm msm89xx family raspberry pi x86 freescale i mx6ull family wip loongson64 family wip allwinner socs bsp kernel planned thanks handsomemod is based on the openwrt project branch openwrt 21 02 commit fc86176363149493810dc0b424583dd120e7f4c7 https github com openwrt openwrt warning this project is not yet good enough for production environments some packages may be buggy and unusable license handsomemod is licensed under gpl 2 0 | openwrt linux iot firmware allwinner qt5 wayland gtk3 lvgl msm8916 qualcomm | server
Hiphopathy | hiphopathy hiphopathy a socio curricular study of introductory cs anchored with data science using rap lyrics welcome to an innovative way of teaching and sharing the delights that comes when data computational methods and culture collide the goal of this work is to connect cultural relevance to computing by introducing elementary techniques of natural language processing with a corpus of hip hop data art is something that we often believe is separate from the so called hard sciences as a result people who are gifted in the arts or are inclined in that direction shy away from the sciences because of the perceived false choice of choosing between the arts or the sciences i want to change this false dichotomy by bringing the power of computational methods to investigate and explore the riches that are inherent in the arts it is for this reason that i have chosen to use a corpus of hip hop lyrics as the dataset on which we will practice our computational techniques at the end of this series of exercises you will be able to build a visualization of the occurrence of certain words in a rap corpus the unit was implemented on edx mooclet bjc 3x data information and the internet https www edx org course beauty joy computing cs principles part uc berkeleyx bjc 3x the coursed launched in 2015 with over 20 000 students resources chapter 1 snap to python http nbviewer ipython org github omoju hiphopathy blob master snaptopython snaptopython ipynb chapter 2 lyrics exploration http nbviewer ipython org github omoju hiphopathy blob master hiphopdataexploration hiphopdataexploration ipynb chapter 3 lets play with some apis http nbviewer ipython org github omoju hiphopathy blob master rapgeniusapi rapgeniusapi ipynb publication title hiphopathy a socio curricular study of introductory computer science author omoju miller advisor dr alice agogino alice degree doctor of philosophy computer science education date dec 2015 alice http www me berkeley edu people faculty alice m agogino its deeper than rap toward culturally responsive cs http dl acm org citation cfm id 2604994 xrds crossroads the acm magazine for students 20 4 28 30 snap build your own blocks http dl acm org citation cfm id 2539022 in proceedings of the 45th acm technical symposium on computer science education pp 749 749 acm ap cs principles and the beauty and joy of computing curriculum http dl acm org citation cfm id 2539026 in proceedings of the 45th acm technical symposium on computer science education pp 746 746 acm 2014 how to contribute find any typos have another resource you think should be included contributions are welcome first fork this repository fork icon images fork icon png next clone this repository to your desktop to make changes sh git clone your repository clone url cd hiphopathy once you ve pushed changes to your local repository you can issue a pull request by clicking on the green pull request icon pull request icon images pull request icon png license the contents of this repository are covered under the gnu general public license license md | data-science python snap rap-lyrics art | ai |
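The Hiphopathy unit above culminates in visualising how often certain words occur in a rap corpus. As a rough sketch of the counting step that feeds such a visualisation (the file name and the example words below are placeholders, not the course's actual materials), plain Python is enough:

```python
from collections import Counter
import re

# Hypothetical input: a plain-text file of rap lyrics. The course's actual corpus and
# notebooks (Snap!-to-Python, lyrics exploration, Rap Genius API) are linked above;
# this only sketches the word-occurrence counting the unit builds toward.
with open("lyrics.txt", encoding="utf-8") as f:
    text = f.read().lower()

tokens = re.findall(r"[a-z']+", text)      # crude tokenizer: letters and apostrophes
counts = Counter(tokens)

# Occurrence counts for a few illustrative words (placeholders, not the course's word list).
for word in ["love", "money", "city"]:
    print(word, counts[word])

print(counts.most_common(10))              # ten most frequent tokens overall
```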
token-vesting | h1 align center token vesting h1 br p align center img width 250 src https i imgur com nn7lmnv png p p align center a href https twitter com bonfida img src https img shields io twitter url label bonfida style social url https 3a 2f 2ftwitter com 2fbonfida a p br div align center img src https img shields io badge typescript 007acc style for the badge logo typescript logocolor white img src https img shields io badge rust 000000 style for the badge logo rust logocolor white div br h2 align center table of contents h2 br 1 program id program id 2 audit audit 3 ui ui 4 overview overview 5 structure structure br a name program id a h2 align center program id h2 br mainnet cchtq6pthwu82yzkbvea3wdf7s97bwhbk4vx9bmst743 devnet dlxb9dsqta4wj49hwfhxqiqkd9v6m67yfk9voxpxrbs4 br a name audit a h2 align center audit h2 br this code has been audited by kudelski audit report bonfida token vesting report audit bonfida securityassessment vesting final050521 pdf br a name ui a h2 align center ui h2 br the bonfida token vesting ui https vesting bonfida com can be used to unlock tokens the ui only works for vesting accounts using the mainnet deployment cchtq6pthwu82yzkbvea3wdf7s97bwhbk4vx9bmst743 br a name overview a h2 align center overview h2 br simple vesting contract svc that allows you to deposit x spl tokens that are unlocked to a specified public key at a certain block height slot unlocking works by pushing a permissionless crank on the contract that moves the tokens to the pre specified address token address should be derived from https spl solana com associated token account vesting schedule contract a contract containing an array of the svc s that can be used to develop arbitrary vesting schedules tooling to easily setup vesting schedule contracts recipient address should be modifiable by the owner of the current recipient key implementation should be a rust spl compatible program plus client side javascript bindings that include a cli interface rust program should be unit tested and fuzzed br a name structure a h2 align center structure h2 br cli cli tool to interact with on chain token vesting contract js javascript binding to interact with on chain token vesting contract program the bpf compatible token vesting on chain program smart contract diagram assets structure png | react rust solana solana-program solana-token typescript | blockchain |
party app description this is a personal project of mine that i built when i was first learning mobile app development it uses django react native redux and the django rest framework preview animated gifs demonstrate changing the app logo a placeholder profile editing with an image picker a birthday picker and signup | server |
|
learn-aws-iot | learn aws iot learn how to use amazon web services internet of things iot service to build connected applications aws iot mqtt over web sockets we want to enable browser based apps to send and receive data from iot connected devices using websockets aws iot acts as a message broker essentially a pub sub broker service that enables sending and receiving messages to and from aws iot mqtt is lightweight connectivity protocol for pub sub message transport it can be used over the web socket protocol to send messages between a client and server aws introduced support for mqtt over websockets for aws iot in january 2016 https aws amazon com about aws whats new 2016 01 aws iot now supports websockets custom keepalive intervals and enhanced console the index html file has an example of of mqtt websockets html js subscription and publishing of topics messages in chrome console websocket protocol in a web application we want to enable browser based apps to send and receive data from iot connected devices the process is as follows 1 initiate a websocket connection on a client according to the docs this is done by sending a http get request with a url that is signed with aws credentials aws requires aws credentials to be specified in a particular format signature version 4 http docs aws amazon com general latest gr sigv4 add signature to request html sigv4 add signature querystring in the url query string the docs provide several utility functions for constructing a request url these have been used to create the formatrequesturl function in the file src js utils request js this function takes an options object and returns the signed requesturl options object shape js var options regionname secretkey accesskey endpoint get this from the aws iot describe endpoint cli command sessiontoken 2 open the websocket connect to aws iot once the web socket connection has been initiated an mqtt client needs to be created to receive mqtt messages over the web socket protocol the docs recommend using the paho mqtt client this can be included as a script in the index html file or the mqtt npm module could be used an example of using the paho client has been included in the src js components request js file in the initclient function based on the aws iot docs http docs aws amazon com iot latest developerguide protocols html websockets with amazon cognito to securely authenticate end users to your apps and devices instead of specifying the access key and secret access key credentials to initiate the web socket connection you could use aws cognito to provide the aws credentials for both authenticated and unauthenticated users suggested by this example http draw kyleroche com main js this has been implemented in src js components app js try it out run the example by typing npm run dev start in your terminal after cloning this repo and navigate to localhost 8080 dev in your browser todo fix webpack bundling error require is not defined try fixing the http request error for the get request to initiate the web socket connection xmlhttprequest cannot load my endpoint cross origin requests are only supported for protocol schemes http data chrome chrome extension https chrome extension resource try implementing this example https github com awslabs aws iot examples reading http stackoverflow com questions 35345481 aws iot mqtt over websocket protocol what is aws iot aws iot provides two way communication between devices that are connected to the internet and the aws cloud this means that you can create 
applications that your users will be able to control from their phones or tablets very cool stuff aws iot consists of a few components they are message broker a mechanism for things to publish and receive messages from each other the mqtt protocol can be used to publish and subscribe and the http rest interface can be used to publish rules engine allows the integration between iot and other aws services thing registry a k a device registry this organises the resources associated with each thing thing shadow service provides persistent representations of your things in the aws cloud it can hold state information about your thing which can be synchronised when it next connects your things can also publish their current state to a thing shadow for other applications or devices to use thing shadow a k a device shadow it s a json document used to store and retrieve current state information for a thing device gateway enables devices to communicate with aws iot security and identity service your things must keep their credentials safe in order to send data securely to the message broker the message broker and rules engine use aws security features to send data to devices or other aws services get started by creating a thing with aws iot to create a thing you ll need to use the aws cli install it by following the steps here http docs aws amazon com cli latest userguide cli chap getting set up html you ll then need to go to your iam console and create a user take down the access key id and the secret access key you ll need them to configure your aws cli in the command line type the following command aws configure you ll then be prompted to enter your access key id secret access key and your region press enter for the fourth prompt your aws cli should now be ready to be used 1 to create a thing type the following into your terminal aws iot create thing thing name thenameofyourthing in order to see all of the things you have created type aws iot list things into the command line it will then return an object with an array of your things in it things attributes thingname thing1 attributes thingname thing2 2 to enable secure communication between a device and aws iot you ll need to provision a certificate for your thing you can create one yourself or you can have one created for you by aws iot type the following command into your command line to have a certificate provisioned for you aws iot create keys and certificate set as active certificate pem outfile cert pem public key outfile publickey pem private key outfile privatekey pem this will automatically create your certificate and add the key files to your project make a note of the certificate arn that it returns to your terminal 3 create and attach an aws iot policy to your certificate to do so you ll need to create a new json file in your project you can call it whatever you like we ve called it policy json include the following code in the newly created file version 2012 10 17 statement effect allow action iot resource to create a policy you ll then need to type a create policy command into your terminal you ll need to name your policy here and copy the full file path to the policy file you just created aws iot create policy policy name policyname policy document file path to your policy document now let s attach that policy to the certificate reference the certificate arn you made a note of earlier use the folling command aws iot attach principal policy principal certificate arn policy name policyname 4 attach your certificate to your device 
using the following command aws iot attach thing principal thing name thenameofyourthing principal certificate arn 5 verify mqtt subscribe and publish you ll need to download the root certificate from here https www symantec com content en us enterprise verisign roots verisign class 203 public primary certification authority g5 pem and then save it in a file called rootca pem next download mqtt fx from here http mqttfx jfx4ee org index php download click on the link mqtt download https cloud githubusercontent com assets 12450298 13011989 467f8938 d1a1 11e5 9308 efeb0294e793 png download the relevant files for your machine eg windows or mac machine spec https cloud githubusercontent com assets 12450298 13012068 98e2be0c d1a1 11e5 8b71 21eeb75ca95e png type the following command to find out your aws iot endpoint make a note of it aws iot describe endpoint open the downloaded file to install mqtt and then launch it click on the gear at the top to configure the application make sure egress to port 8883 is allowed on your network settings https cloud githubusercontent com assets 12450298 13012350 e21f9ac6 d1a2 11e5 8a4a 29417e29ab95 png give it a profile name and then enter your broker address which is the iot endpoint then click on the ssl tls tab config https cloud githubusercontent com assets 12450298 13012454 325b627c d1a3 11e5 8ea4 eff11487ede5 png click on the self signed certificates radio button and then enter the following information ca file which is the path to your rootca pem file client certificate file which is the path to your cert pem file client key file which is the path to your privatekey pem file ssl tls https cloud githubusercontent com assets 12450298 13012711 4393899c d1a4 11e5 931c aaf072f37bc7 png click ok which will take you back to the mqtt fx dashboard 6 click the connect button at the top of the window to connect to aws iot connect https cloud githubusercontent com assets 12450298 13012844 e8fd4c38 d1a4 11e5 8bfd 105445768591 png 7 subscribe to an mqtt topic by clicking on the subscribe tab entering a topic name and then clicking subscribe make sure the qos 0 option is selected in the dropdown on the right subscribe https cloud githubusercontent com assets 12450298 13012947 576ee19a d1a5 11e5 8f98 0ff25a229e31 png 8 publish to an mqtt topic by clicking on the publish tab and entering the name of the topic you just subscribed to publish https cloud githubusercontent com assets 12450298 13013055 b12be66a d1a5 11e5 93e3 93523dcaa5a5 png then enter a message that you want to publish then hit publish we posted hello world go back to the subscribe tab to see your message message sent https cloud githubusercontent com assets 12450298 13013500 a61b4048 d1a7 11e5 883b 5ab4424e41bb png that s it you ve set up your first pub sub connection using aws iot create an iam role for aws iot in order for iot to interact with other aws services you need to create an iam role so that it has the right permissions copy the following code into a file and call it trust policy json make sure you enter an sid it can be anything you like version 2012 10 17 statement sid 123456789example effect allow principal service iot amazonaws com action sts assumerole then to create the iam role run the create role command giving it the assume role policy document that you just created aws iam create role role name iot actions role assume role policy document file path to file trust policy json make sure you save the arn from the output of this command as you ll need it to create a rule later on rules 
enable you to access other aws servives through iot grant permissions to the role later on we re going to go through a couple of examples for using the aws iot service one to post data to a dynamodb table and then one to invoke a lambda function to do this we have to create a policy and then attach it to the role we created in the previous section copy the following code into a new file and name it whatever you like we ve called it iam policy json version 2012 10 17 statement effect allow action dynamodb lambda invokefunction resource run the create policy and link to the file path you just created aws iam create policy policy name iot actions policy policy document file iam policy document file path make a note of the arn that is returned in the command line and then run the attach policy role with this command aws iam attach role policy role name iot actions role policy arn policy arn you should now be able to interact with dynamodb and lambda create a rule to insert a message into a dynamodb table 1 create a table in the dynamodb console create table https cloud githubusercontent com assets 12450298 13015426 eca45280 d1b0 11e5 9e99 7ffe2e9127af png make sure it has a partition key hash key and sort key range key of type string we ve called ours key and timestamp keys https cloud githubusercontent com assets 12450298 13015450 019e1518 d1b1 11e5 94c8 2e41eb92bcda png select your provisioned capacity and then click create table provisioned capacity https cloud githubusercontent com assets 12450298 13015473 2586f0b2 d1b1 11e5 9b82 eccb1a89f119 png table overview https cloud githubusercontent com assets 12450298 13015481 408941d0 d1b1 11e5 8d5f 10f81fdcb8ef png 2 create a rule to trigger on a topic of your choice and insert an item into dynamodb add the following to a file and call it dynamodb rule json copy the arn from the iot actions role we created earlier for the rolearn sql select from topic test ruledisabled false actions dynamodb tablename iot hashkeyfield key hashkeyvalue topic 2 rangekeyfield timestamp rangekeyvalue timestamp rolearn arn aws iam 123456789012 role iot actions role 3 create a topic rule using the create topic rule command with the path to the dynamodb rule from the previous step aws iot create topic rule rule name savetodynamodb topic rule payload file path to file dynamodb rule json 4 open up mqtt fx and then publish a message to the topic you defined in the rule ours is topic test write the message in the form of an object as opposed to a string otherwise it will get converted into binary msg hello world 5 go back to your dynamodb console and then check the table you created you should now see the entry you just published published entry https cloud githubusercontent com assets 12450298 13015516 7070c2c4 d1b1 11e5 910f 18526c0e56ee png you should now be able to post items to a dynamodb table using aws iot create a rule to invoke a lambda function 1 go to the lambda console and create a new function it can be very basic as we re just testing that it s being invoked give it a name choose your runtime and then write the function function https cloud githubusercontent com assets 12450298 13015801 00e5527e d1b3 11e5 8536 c9d60c8f8fce png leave the handler as index handler and then give it a lambda basic execution role then press next then create function role and handler https cloud githubusercontent com assets 12450298 13015836 2e2e35b6 d1b3 11e5 8991 b4a86f53317e png note when you select the role just click allow on the page it takes you to make a note of the arn on 
the review page for your function you ll need that for the rule 2 create a new file for your lambda rule we ve called ours lambda rule json enter the following code with your arn sql select from topic test ruledisabled false actions lambda functionarn arn aws lambda us east 1 123456789012 function myhelloworld 3 create a topic rule by entering the create topic rule command from iot name it what you like and then link it to the rule we just created aws iot create topic rule rule name invokelambda topic rule payload file path to file lambda rule json 4 provide a resource based policy so that aws iot can invoke the lambda function here is the command aws lambda add permission function name function name region region principal iot amazonaws com source arn arn aws iot us east 1 account id rule rule name source account account id statement id unique id action lambda invokefunction the account id can be found in your aws security credentials https console aws amazon com iam home security credential page click on account identifiers to view it note that you have to take the dashes out so you re left with the 12 digits the statement id is also known as sid which we defined earlier when we created an iam role for iot security credentials https cloud githubusercontent com assets 12450298 13016299 aa144e66 d1b5 11e5 97e8 9cd00a254745 png 5 go back to mqtt fx and publish a message to your topic that you defined in the lambda rule 6 go to the lambda console and then click on your iot function click on the monitoring tab and you should see that the function has been invoked through iot monitoring https cloud githubusercontent com assets 12450298 13016336 02b1096a d1b6 11e5 9879 d69f28bc7627 png that s it now you should be able to invoke a lambda function through aws iot simulate a device with device registry and device shadow 1 the first thing you ll need to do is register the device by using the create thing command we re going to simulate a light bulb here is our example command aws iot create thing thing name lightbulb it should return the name of your device with the arn for that device thingarn arn aws iot eu west 1 123456789 thing lightbulb thingname lightbulb 2 confirm that your device has been created with the following command aws iot list things you should then see a list of your registered devices returned in the form of an object things attributes thingname lightbulb 3 to simulate our new device we re going to use mqtt fx it can also be done with a restful api mqtt will synchronize a thing with its shadow in aws iot to report its state over mqtt the thing publishes on aws things thingname shadow update if there s an error such as version conflict when merging the reported state aws iot will push an error message on topic aws things thingname shadow rejected to receive updates from the shadow the thing should subscribe to topic aws things thingname shadow update accepted subscribe to both the aws things mylightbulb shadow update rejected and aws things mylightbulb shadow update accepted topics subscribe topics https cloud githubusercontent com assets 12450298 13048106 8e99f1c2 d3db 11e5 9cea 9d1265638487 png publish the following message to aws things thingname shadow update state reported color red by doing so this simulates reporting the state of the thing to aws iot 4 next we re going to want to check if the state of our thing has been updated to do this we can enter the following command into our terminal aws iot data get thing shadow thing name thingname output txt cat output txt this 
should return an object like this one state reported color red metadata reported color timestamp 123456789 version 1 timestamp 123456789 5 to request an update set state on the thing we can use the update thing shadow command this is as follows aws iot data update thing shadow thing name thingname payload state desired color green output txt cat output txt it returns the following in your terminal state desired color green metadata desired color timestamp 123456789 version 2 timestamp 123456789 6 now if you run the first command aws iot data get thing shadow thing name thingname output txt cat output txt you should then get a return of state desired color green reported color red delta color green metadata desired color timestamp 123456789 reported color timestamp 123456789 version 2 timestamp 123456789 you can see the desired colour is green the reported colour is red and the delta colour is green this is basically saying that when the device is next connected change the colour of the light bulb from red to green delta is showing the value by which the colour is changing a copy of this is saved to your file tree structure in the specified output txt file 7 let s say we want to delete our thing using mqtt fx all we have to to is publish a state of null to the aws things thingname shadow update topic it will look like this state null to delete it using the cli simply type the following command into your terminal aws iot delete thing thing name thingname note you must detach any attached principals using the detach thing principal cli command before deleting a thing from the thing registry to detach a principal type this command into your command line aws iot detach thing principal thing name thingname principal principalyouwanttoremove use iot to create a websocket connection in the browser this is a complete tutorial that will help you to build a websocket application with aws iot from scratch 1 the first thing we re going to have to do is to create a thing in aws iot to do this we can run the create thing command in our terminal aws iot create thing thing name thingname this will return an object with an arn and your thing name 2 next we ll need to create a couple of certificates for our device run this command in your command line aws iot create keys and certificate set as active certificate pem outfile cert pem public key outfile publickey pem private key outfile privatekey pem this will create 3 certificate files in your project folder make a note of the certificate arn that gets returned to your command line 3 create and attach an aws iot policy to your certificate create a file called policy json and save the following code in it version 2012 10 17 statement effect allow action iot resource this is basically giving complete access to publish and subscribe on any resource next we have to create a policy document that links to the file we just created to do so we run the following command aws iot create policy policy name pubsubtoanytopic policy document file path to your policy document now that we ve created the policy we have to attach it to our certificate use the following command with our certificate arn to do so aws iot attach principal policy principal certificate arn we noted down earlier policy name pubsubtoanytopic note it shouldn t return anything to the command line 4 now that we ve got a certificate that we can use let s attach it to our device use this command to attach it aws iot attach thing principal thing name thing name principal certificate arn we noted down 
earlier again this shouldn t return anything to the command line 5 now let s go to the iam console because we ll need to create a user with the correct permissions click on the users tab and then click on the create new users button new users https cloud githubusercontent com assets 12450298 13117245 f7fdb704 d596 11e5 8e6c 9b7309623038 png give your new user a name and then click the box that says generateand access key for each user then click create create user https cloud githubusercontent com assets 12450298 13117299 39d000b0 d597 11e5 83ac cb9639315343 png make a note of the access key id and secret access key because we ll need them to configure our websocket 6 while in the iam console click on the groups tab on the left we re going to create a new group and attach a policy to it that will give any user within that group access to aws iot we ll then add our user to that group note make sure you only give this group the permission specified as the secret keys will be public create group https cloud githubusercontent com assets 12450298 13117471 fe24f254 d597 11e5 8839 8ac7fd387fbe png give your group a name and then click next step name group https cloud githubusercontent com assets 12450298 13117555 5d74fa6a d598 11e5 8b5a 9c5aa39f1cc1 png now select the awsiotdataaccess policy and attach it policy https cloud githubusercontent com assets 12450298 13118675 3e7b3b92 d59d 11e5 9238 d96d5d71f75c png then click on the users tab and then select add users add users https cloud githubusercontent com assets 12450298 13117634 b41925f8 d598 11e5 9a4c 57a94ba45ee6 png then click add users you should then be able to see your user in the group user in group https cloud githubusercontent com assets 12450298 13117693 ef600dfc d598 11e5 8d19 0f2345915f26 png 7 we re going to have to get our aws endpoint in order to configure the websocket to do so type this command and then make note of the endpoint aws iot describe endpoint note the endpoint must be in lowercase when you include it in your html file 8 now let s create our index html file that we re going to be serving up enter the following code into index html make sure you change the access keys to the ones associated with the user we created earlier when you re creating the endpoint html doctype html html lang en head meta content text html charset utf 8 http equiv content type meta content utf 8 http equiv encoding head body ul id chat li v for m in messages m li ul input type text name say id say placeholder input a message here button id send send button script src https cdnjs cloudflare com ajax libs vue 1 0 16 vue min js type text javascript script script src https cdnjs cloudflare com ajax libs moment js 2 11 2 moment min js type text javascript script script src https cdnjs cloudflare com ajax libs crypto js 3 1 2 components core min js type text javascript script script src https cdnjs cloudflare com ajax libs crypto js 3 1 2 components hmac min js type text javascript script script src https cdnjs cloudflare com ajax libs crypto js 3 1 2 components sha256 min js type text javascript script script src http git eclipse org c paho org eclipse paho mqtt javascript git plain src mqttws31 js type text javascript script script type text javascript var data messages new vue el chat data data document getelementbyid send addeventlistener click function e var say document getelementbyid say send say value say value function sigv4utils sigv4utils sign function key msg var hash cryptojs hmacsha256 msg key return hash tostring cryptojs enc hex 
sigv4utils sha256 function msg var hash cryptojs sha256 msg return hash tostring cryptojs enc hex sigv4utils getsignaturekey function key datestamp regionname servicename var kdate cryptojs hmacsha256 datestamp aws4 key var kregion cryptojs hmacsha256 regionname kdate var kservice cryptojs hmacsha256 servicename kregion var ksigning cryptojs hmacsha256 aws4 request kservice return ksigning function createendpoint regionname awsiotendpoint accesskey secretkey var time moment utc var datestamp time format yyyymmdd var amzdate datestamp t time format hhmmss z var service iotdevicegateway var region regionname var secretkey secretkey var accesskey accesskey var algorithm aws4 hmac sha256 var method get var canonicaluri mqtt var host awsiotendpoint var credentialscope datestamp region service aws4 request var canonicalquerystring x amz algorithm aws4 hmac sha256 canonicalquerystring x amz credential encodeuricomponent accesskey credentialscope canonicalquerystring x amz date amzdate canonicalquerystring x amz signedheaders host var canonicalheaders host host n var payloadhash sigv4utils sha256 var canonicalrequest method n canonicaluri n canonicalquerystring n canonicalheaders nhost n payloadhash var stringtosign algorithm n amzdate n credentialscope n sigv4utils sha256 canonicalrequest var signingkey sigv4utils getsignaturekey secretkey datestamp region service var signature sigv4utils sign signingkey stringtosign canonicalquerystring x amz signature signature return wss host canonicaluri canonicalquerystring var endpoint createendpoint eu west 1 your region lowercasea315z3lphjmasx iot eu west 1 amazonaws com require lowercamelcase hkaefljblkjhfakj your access key id 1234556664smblvmnbxvmbexampleqi5cttu acbci secret access key var clientid math random tostring 36 substring 7 var client new paho mqtt client endpoint clientid var connectoptions usessl true timeout 3 mqttversion 4 onsuccess subscribe client connect connectoptions client onmessagearrived onmessage client onconnectionlost function e console log e function subscribe client subscribe test chat console log subscribed function send content var message new paho mqtt message content message destinationname test chat client send message console log sent function onmessage message data messages push message payloadstring console log message received message payloadstring script body html 9 let s take our code for a spin type the following command into the terminal to start up a simple server python m simplehttpserver then open two windows side by side at http localhost 8000 type something in the left and then press send send https cloud githubusercontent com assets 12450298 13118451 18333d00 d59c 11e5 9fb0 f75507c6094c png the message should appear instantaneously in both windows sent https cloud githubusercontent com assets 12450298 13118502 59639694 d59c 11e5 8b20 2bfaf1a17f7f png do the same from the other window reply https cloud githubusercontent com assets 12450298 13118527 76a46a58 d59c 11e5 932b 7e2fa69ebef3 png and you should see the same thing happen replied https cloud githubusercontent com assets 12450298 13118538 89e06e78 d59c 11e5 9e8a f83e4b4f0ccf png that s it you should now be able to open up a websocket connection in the browser using aws iot credit to yusuke arai for the tutorial http dev classmethod jp cloud aws aws iot mqtt over websocket | server |
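The learn-aws-iot walkthrough above drives everything from the AWS CLI. For readers who would rather script the provisioning steps, here is a hedged boto3 equivalent of the create-thing / create-keys-and-certificate / create-policy / attach-principal-policy / attach-thing-principal sequence (it assumes `pip install boto3` and the same `aws configure` credentials; the thing and policy names are placeholders):

```python
import json
import boto3

# Equivalent of the tutorial's CLI steps, sketched with boto3. Names are placeholders.
iot = boto3.client("iot")

thing = iot.create_thing(thingName="lightbulb")                  # aws iot create-thing

keys = iot.create_keys_and_certificate(setAsActive=True)         # aws iot create-keys-and-certificate
cert_arn = keys["certificateArn"]

# Same wide-open demo policy as in the tutorial (allow iot:* on *) -- not for production.
policy_doc = {
    "Version": "2012-10-17",
    "Statement": [{"Effect": "Allow", "Action": ["iot:*"], "Resource": ["*"]}],
}
iot.create_policy(policyName="PubSubToAnyTopic",
                  policyDocument=json.dumps(policy_doc))          # aws iot create-policy
iot.attach_principal_policy(policyName="PubSubToAnyTopic",
                            principal=cert_arn)                   # aws iot attach-principal-policy
iot.attach_thing_principal(thingName="lightbulb",
                           principal=cert_arn)                    # aws iot attach-thing-principal

print(thing["thingArn"], cert_arn)
```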
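The device-shadow steps can be scripted the same way; the sketch below mirrors the tutorial's `aws iot-data update-thing-shadow` / `get-thing-shadow` calls for the simulated light bulb. One caveat: the data-plane client may need the account-specific endpoint returned by `aws iot describe-endpoint` passed as `endpoint_url`, which is omitted here for brevity.

```python
import json
import boto3

# Data-plane client for shadow reads/writes (may require endpoint_url=... in practice).
iot_data = boto3.client("iot-data")

# Report the current state of the simulated light bulb (same document as in the tutorial).
iot_data.update_thing_shadow(
    thingName="lightbulb",
    payload=json.dumps({"state": {"reported": {"color": "red"}}}),
)

# Request a new desired state; AWS IoT computes the delta between desired and reported.
iot_data.update_thing_shadow(
    thingName="lightbulb",
    payload=json.dumps({"state": {"desired": {"color": "green"}}}),
)

shadow = iot_data.get_thing_shadow(thingName="lightbulb")
print(json.loads(shadow["payload"].read()))   # shows the reported, desired and delta sections
```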
|
nlp-course-notebooks | shai nlp course notebooks this repository contains the notebooks for shai s natural language processing course content notebook chapter section description exercise pytorch basics 1 pytorch basics 1 ipynb 0 3 pytorch tensor creation and basic operations pytorch basics 2 pytorch basics 2 ipynb 0 3 creating neural networks with pytorch pytorch basics exercise pytorch basics exercise ipynb 0 3 practice neural networks with pytorch regex basics regular expressions basics ipynb 0 4 introduction to regular expressions text preprocessing text preprocessing ipynb 1 all apply preprocessing to text text tokenization text vectorization tokenization ipynb 2 1 tokenize text by word or character text encoding text vectorization encoding ipynb 2 2 3 encode text by onehot and tfidf word embeddings word2vec text vectorization word2vec ipynb 2 4 implement word2vec to generate word embeddings text classification text classification rnn ipynb 3 implement sentiment analysis with rnn text translation text translation ipynb 4 1 rnn seq2seq for translation text translation with attention text translation attention ipynb 4 2 add bahdanau attention to seq2seq model text translation with transformer text translation transformer ipynb 4 3 introduction of transformer model for translation text classification with huggingface transfomers text classification huggingface ipynb 6 use huggingface pretrainedtransformers for sentiment analysis instructions 1 create a virtual envrionment bash python3 m venv venv 2 activate the environment and install the required packages bash source venv bin activate pip3 install r requirements txt 3 run jupyter notebook sever bash jupyter notebook | ai |
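One of the notebooks catalogued above implements word2vec to generate word embeddings. As a quick standalone illustration of the same idea (not the course's own implementation, and assuming gensim >= 4.0 is installed, which may not be part of the course requirements), a few toy tokenised sentences are enough to train a tiny model:

```python
from gensim.models import Word2Vec  # assumes gensim >= 4.0 (`pip install gensim`)

# Toy corpus standing in for the course data; each sentence is a list of tokens,
# i.e. the kind of output the tokenization notebook produces before word2vec training.
sentences = [
    ["natural", "language", "processing", "is", "fun"],
    ["word", "embeddings", "capture", "word", "meaning"],
    ["language", "models", "learn", "from", "text"],
]

model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, epochs=50)

print(model.wv["language"][:5])                    # first few dimensions of one embedding
print(model.wv.most_similar("language", topn=3))   # nearest neighbours in the toy space
```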
|
TTK4155 ttk4155 industrial and embedded computer systems design term project this is the software for our ttk4155 term project a ping pong machine the project consists of 2 nodes each running on a separate microcontroller and a can bus is used for communication between them parts of the project part 1 initial assembly of the microcontroller and rs 232 part 2 address decoding and external ram part 3 a d conversion and joystick input part 4 oled display and user interface part 5 spi and can controller part 6 can bus and communication between nodes part 7 controlling the servo and ir part 8 controlling the motor and solenoid part 9 completion of the project and extras as an extra additional steering from the pc was implemented | os
|
nOS nos join the chat at https gitter im jimtremblay nos mplv2 license https www mozilla org mpl 2 0 features preemptive or cooperative scheduling depending on your configuration can be tickless for battery powered applications binary and counting semaphores mutexes with priority ceiling or priority inheritance queues for thread safe communication flags for waiting on multiple events memory blocks for dynamic memory allocation software timers with callback and priority software interrupts signals with callback and priority real time module compatible with unix timestamps software alarms with callbacks no limit on the number of nos objects except your available memory tiny footprint as low as 1kb rom and a few bytes of ram fully configurable rom and ram usage open source royalty free win32 and linux simulators available documentation https github com jimtremblay nos wiki | rtos avr msp430 m16c r8c rx600 arm arm-cortex-m4f arm-cortex-m3 arm-cortex-m0 arm-cortex-m7 posix win32 pic24 gcc iar keil nc30 xc16 arm7-tdmi | os
machine_learning_economics | machine learning economics this repository contains lecture notes for a basic course on machine learning for economics along with some example notebooks written in r and python this repository has been set up for https mybinder org which allows the user to open the notebooks in an executable environment with all dependencies already built the binder for this repository is https mybinder org v2 gh sekhansen machine learning economics master | ai |
|
eos-transit | transit wallet access layer for eosio blockchain networks this library is a small abstraction layer on top of eosjs which aims to assist eos dapp decentralized app developers with wallet communication signature verification and acceptance by providing a simple and intuitive api instead of focusing on supporting specific signature providers one by one developers can support every one that has built a transit plugin allowing the user to use their signature provider of choice this way the best ux for signature providers wins and the developers can focus on building their dapp instead of setting up eosjs and wallet connections please see the quick start and thorough guide in the eos transit package docs packages eos transit features easy to use api managed wallet connection state tracking easily pluggable wallet providers and easy to write your own typescript support small footprint core is just 9kb minified and around 2 7kb gzipped packages this is a monorepo that is managed with lerna https github com lerna lerna package version description eos transit packages eos transit npm version https badge fury io js eos transit svg https badge fury io js eos transit transit core package eos transit scatter provider packages eos transit scatter provider npm version https badge fury io js eos transit scatter provider svg https badge fury io js eos transit scatter provider wallet provider for scatter https get scatter com app eos transit lynx provider packages eos transit lynx provider npm version https badge fury io js eos transit lynx provider svg https badge fury io js eos transit lynx provider wallet provider for lynx https eoslynx com app eos transit ledger provider packages eos transit ledger provider npm version https badge fury io js eos transit ledger provider svg https badge fury io js eos transit ledger provider wallet provider for ledger https www ledger com app eos transit tokenpocket provider packages eos transit tokenpocket provider npm version https badge fury io js eos transit tokenpocket provider svg https badge fury io js eos transit tokenpocket provider wallet provider for token pocket https www tokenpocket pro app eos transit meetone provider packages eos transit meetone provider npm version https badge fury io js eos transit meetone provider svg https badge fury io js eos transit meetone provider wallet provider for meet one https meet one app eos transit portis provider packages eos transit portis provider npm version https badge fury io js eos transit portis provider svg https badge fury io js eos transit portis provider wallet provider for portis https www portis io app eos transit whalevault provider packages eos transit whalevault provider npm version https badge fury io js eos transit whalevault provider svg https badge fury io js eos transit portis provider wallet provider for whalevault https github com alexpmorris whalevault app eos transit keycat provider packages eos transit keycat provider npm version https badge fury io js eos transit keycat provider svg https badge fury io js eos transit keycat provider wallet provider for keycat https github com eosdaq keycat in browser wallet eos transit simpleos provider packages eos transit simpleos provider npm version https badge fury io js eos transit simpleos provider svg https badge fury io js eos transit simpleos provider wallet provider for simpleos https eosrio io simpleos app eos transit stub provider packages eos transit stub provider npm version https badge fury io js eos transit stub provider svg https 
badge fury io js eos transit stub provider stub wallet provider that does nothing for demo and testing only eos transit anchorlink provider packages eos transit anchorlink provider npm version https badge fury io js eos transit anchorlink provider svg https badge fury io js eos transit anchorlink provider anchor esr provider for eos transit contribution the below instructions only apply to developers wishing to build the transit project in most cases developers will want to pull the npm packages if you want to use eos transit and plugins please see the quick start and thorough guide in the eos transit package docs packages eos transit if you are looking to build your own plugin the transit plugin developer kit eosnewyork eos transit tree master plugin dev transit dev simple is a good place to start package development 1 make sure you have yarn https yarnpkg com installed 2 install the dependencies with yarn install note that before eos transit eos transit scatter provider and eos transit stub provider are published they are managed by lerna along with the packages themselves that means that before running the examples lerna should wire up all the dependencies so instead of running yarn install manually from this folder the following commands should be run from the project root 3 bootstrap the dependencies with yarn bootstrap this will make lerna install all the necessary dependencies for managed packages 4 proceed with the development of a certain package each package has its own set of commands to assist the development and build process builds to build all packages simultaneously run the following command from the project root yarn build packages this will perform both typescript compilation into the lib folder and a minified production ready umd build into the umd folder for each managed package publishing 1 make sure the current state of the package folders is consistent and that packages which are about to be published are actually built with yarn build packages see previous section to make sure you can run the yarn pack command to create a tgz tarball of the package and inspect its contents but make sure that doesn t leak to a published package so clean that up before publishing 2 make sure you re logged into the npm registry by running yarn login https yarnpkg com lang en docs cli login please note that this won t ask you for a password it will be asked upon publishing since yarn doesn t maintain authenticated sessions with the npm registry 3 since this monorepo is managed with lerna the latter is responsible for publishing too so run lerna publish possibly providing additional options https github com lerna lerna tree master commands publish if needed normally this should guide you through the version bumping process as well as creating and pushing new git version tags for each package that has been published | blockchain |
|
ExtractGPT | attribute value extraction using large language models this repository contains code and data for experiments on attribute value extraction using large language models requirements we evaluate hosted llms such as gpt 3 5 and gpt 4 as well as open source llms based on llama2 which can be run locally for the hosted llms an openai access token needs to be placed in a env file at the root of the repository to obtain this openai access token users must sign up https platform openai com signup for an openai account to run the open source llms gpu support is required installation the codebase requires python 3 9 to install dependencies we suggest using a conda virtual environment conda create n avellms python 3 9 conda activate avellms pip install r requirements txt pip install pieutils dataset for this work we use subsets of two datasets from related work oa mine https github com xinyangz oamine and ae 110k https github com cubenlp acl19 scaling up open tagging each subset contains product offers from 10 distinct product categories and is split into a small train large train and test set oa mine contains 115 attributes and ae 110k contains 101 attributes further statistics and information about the subsets can be found in the table below and in the paper oa mine small train oa mine large train oa mine test ae 110k small train ae 110k large train ae 110k test attribute value pairs 1 467 7 360 2 451 859 4 360 1 482 unique attribute values 1 120 4 177 1 749 302 978 454 product offers 286 1 452 491 311 1 568 524 the dataset subsets of oa mine and ae 110k are available in the folder data processed datasets if you want to start from scratch you can download the datasets and preprocess them yourself the datasets oa mine and ae 110k have to be added as raw data to the folder data raw for oa mine the annotations can be downloaded here https github com xinyangz oamine tree main data for ae 110k the annotations can be downloaded here https github com cubenlp acl19 scaling up open tagging blob master publish data txt the data is then preprocessed using the script prepare datasets py in the folder preprocessing the script can be run using the following command scripts 00 prepare datasets sh prompts we experiment with zero shot prompts and prompts that use training data for both scenarios the two target schema descriptions a list and b schema are used the following figure shows the prompt structures for the two schema descriptions prompt designs resources zero shot prompt designs png list the list description iterates over the target attribute names for the extraction the prompt looks as follows system you are a world class algorithm for extracting information in structured formats user extract the attribute values from the product title in a json format valid attributes are brand color material if an attribute is not present in the product title the attribute value is supposed to be n a user dr brown s infant to toddler toothbrush set 1 4 ounce blue schema the schema description iterates over the target attribute names descriptions and example values the prompt looks as follows system you are a world class algorithm for extracting information in structured formats name toothbrush description correctly extracted toothbrush with all the required parameters with correct types parameters type object properties brand description the brand of the toothbrush examples philips sonicare sewak al falah gum oral b pro health arm hammer colgate jojo wild stone glister oral b vitality type string color
description the color of the toothbrush examples pink white spiderman multi four colours blue and white sparkle shine lunar blue black soft type string material description the material used in the toothbrush examples miswak bamboo sewak plant based bristles wood nylon tynex bristle silicone type string user split the product title by whitespace extract the valid attribute values from the product title in json format keep the exact surface form of all attribute values all valid attributes are provided in the json schema unknown attribute values should be marked as n a user dr brown s infant to toddler toothbrush set 1 4 ounce blue execution the prompts and the code to execute the prompts are defined in the folder prompts you can run the zero shot prompts and prompts using training with the following scripts scripts 01 run zero shot prompts sh scripts 02 run prompts with training data sh fine tuning the python scripts to prepare the data for the fine tuning of the llms are located in the folder finetuning the fine tuning can be run using the following scripts scripts 03 prepare fine tuning sh you find support for uploading the data to the openai api and to start the finetuning in the folder pieutils open ai scripts further information can be found in openai s guides for fine tuning https platform openai com docs guides fine tuning | ai |
|
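The zero-shot "list" prompt described above maps directly onto a single chat-completion call. The sketch below is illustrative rather than taken from the repository: it assumes the openai Python client (v1+), the model name gpt-3.5-turbo, and a hand-picked attribute list; the repository's own prompt construction and execution code lives in its prompts folder and scripts.

```python
# Minimal sketch of the zero-shot "list" prompt, assuming openai>=1.0.
# Model name and attribute list are illustrative, not the repository's configuration.
import json
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is exported or loaded from the .env file (e.g. via python-dotenv)

def extract_attributes(title, attributes, model="gpt-3.5-turbo"):
    system = "You are a world class algorithm for extracting information in structured formats."
    instruction = (
        "Extract the attribute values from the product title in a JSON format. "
        f"Valid attributes are {', '.join(attributes)}. If an attribute is not present "
        "in the product title, the attribute value is supposed to be 'n/a'."
    )
    resp = client.chat.completions.create(
        model=model,
        temperature=0,  # deterministic output helps exact-match evaluation
        messages=[
            {"role": "system", "content": system},
            {"role": "user", "content": instruction},
            {"role": "user", "content": title},
        ],
    )
    # Will raise if the model answers with prose instead of JSON; fine for a sketch.
    return json.loads(resp.choices[0].message.content)

print(extract_attributes("Dr Brown's Infant-to-Toddler Toothbrush Set, 1.4 Ounce, Blue",
                         ["Brand", "Color", "Material"]))
```

Keeping the temperature at 0 matters here because extraction accuracy is scored by exact surface-form match against the annotated attribute values.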
hotel_management_system | hotel management system a small project developed during the software engineering and databases course | server |
|
tpf-foundations-flanfranco | Final practical assignment, Foundations, ITBA Cloud Data Engineering. Welcome to the final assignment of the Foundations section of module 1 of the ITBA Cloud Data Engineering diploma. In this assignment you will put into practice the knowledge acquired in (1) relational databases, specifically PostgreSQL, (2) bash and the Linux command line, (3) Python 3.7, and (4) Docker. To complete this assignment we will use the GitHub Classrooms platform, where each student has access to a private git repository hosted on GitHub. In each repository, in the issues section https guides github com features issues (tab to the right of "code" in the navigation tabs at the top of the screen) you will see that an issue has been created for each exercise. The goal is to solve each exercise/issue by creating a branch and an associated pull request. Because each exercise builds on the progress made in the previous issue, each new branch must start from the branch of the previous exercise. This can be done from the GitHub web interface, but we recommend doing it with the git command-line application, with GitHub Desktop https desktop github com (visual interface) or with GitHub CLI https cli github com (command-line interface). The idea of using GitHub is to replicate the environment of a real project, where tasks should be defined as issues and each new feature should be created with a corresponding pull request that resolves it https guides github com introduction flow https docs github com en github getting started with github quickstart github flow. Very important: an important part of the assignment is learning to search on Google in order to solve it successfully.
Exercises. Exercise 1, dataset selection and questions: choose a dataset from the PostgreSQL wiki https wiki postgresql org wiki sample databases or another source of interest to the student. Create a pull request with a file in markdown format https guides github com features mastering markdown explaining the chosen dataset and a brief description of at least 4 business questions that could be answered by having that data in a relational database, so that they can be queried with SQL. Other suggested open data sources: https catalog data gov dataset, https datasetsearch research google com, https www kaggle com datasets. Exercise 2, create the DB container: create a docker-compose file https docs docker com compose gettingstarted that creates a Docker container https docs docker com get started with a PostgreSQL 12.7 database. We recommend using the official PostgreSQL image https hub docker com postgres available on Docker Hub. The standard port of the database must be exposed so it can receive connections from the machine where the container is started. Exercise 3, table creation script: create a bash script that runs one or more SQL scripts that create the database tables in the PostgreSQL instance created in the container from the previous exercise. Only the tables, primary keys, foreign keys and other DDL operations https en wikipedia org wiki data definition language should be created, without creating or inserting the data. Exercise 4, populate the database: create a Python script that, once the container is running and all the necessary DDL operations have been executed, populates the database with the chosen dataset. The database must end up ready to receive queries. During the data load you may temporarily remove any constraint that prevents inserting the information, but you must recreate it afterwards. This script must run inside a new Docker container via the docker run command. The generated Docker container must not contain the raw data used to load the database; to pass the data files you can mount a volume (the -v argument of docker run) or download them directly from the internet using a Python library such as requests. Exercise 5, database queries: write a Python script that runs at least 5 SQL queries that can add value to the business and prints a report with the results to the screen. This reporting script must be run through a Docker image with docker run, in the same way as the script from exercise 4. Exercise 6, documentation and end2end execution: add a section to the readme md explaining how you solved the exercises, linking to the file with the dataset description and explaining how to run a bash script that executes the whole end2end process, from container creation through DDL operations, data load and queries; create the corresponding bash file for this.
Resolution of the exercises. Dataset selection and questions: the following link https github com flanfranco tpf foundations flanfranco blob main documentation dataset info md details the selected dataset together with its data model and the analysis questions to answer. About the solution: the technologies used to solve the different parts were (1) Linux Ubuntu 20.04 operating system, (2) Docker and docker-compose, (3) DBeaver, (4) GitHub. Why docker-compose? docker-compose was chosen as the orchestrator over bash scripts mainly because it makes it much faster to deploy a production-ready solution. In addition, docker-compose natively keeps track of everything that happens with the containers it runs, and as fundamental characteristics it offers standardization, readability and portability when defining the orchestration, which makes it easier to maintain. It is worth noting that the exercises could be solved entirely with bash scripts, achieving the same results by using piping (for example: echo sql query docker exec i docker postgres db 1 psql u username database), variables (for example: docker run v pwd csv volume csv volume my app image) and logical comparison and iteration structures to manage the execution of the whole orchestration. It is on this last point that a more robust solution such as docker-compose was chosen to orchestrate the whole execution and, at the same time, to serve as a basis for the future assignments of the next modules.
Execution guide. To run the solution, Docker and docker-compose must be installed on the operating system. The steps to deploy the solution are: (1) download the contents of the repository https github com flanfranco tpf foundations flanfranco git. (2) From the folder where the repository tpf foundations flanfranco was downloaded, you should find the docker compose yml file containing all the specifications for the orchestration process; there, run docker compose up so that the solution starts to deploy. (3) Once the solution has finished deploying, you should see something similar to the following screenshot: image of the deployment https github com flanfranco tpf foundations flanfranco blob main documentation resources images 01 deploy example png. (4) With the solution deployed, to access the Jupyter notebook containing the analysis you only need to open the following link: http 127 0 0 1 8888 token itba jupyter notebook token. (5) Once in the Jupyter file browser, select the notebook itba cde tp foundations flavio lanfranco ipynb, as shown in the screenshot below: image of the deployment https github com flanfranco tpf foundations flanfranco blob main documentation resources images 03 jupyter notebook file png, finally reaching the notebook that contains the whole interactive analysis answering the stated requirements https github com flanfranco tpf foundations flanfranco blob main documentation dataset info md image of the deployment https github com flanfranco tpf foundations flanfranco blob main documentation resources images 04 jupyter notebook snapshot png.
Comments about the development. The following sections describe the main considerations taken into account in the development of the solution. (1) The following screenshot shows and comments the contents of the docker compose yml, detailing the main characteristics defined for the orchestration: image of the deployment https github com flanfranco tpf foundations flanfranco blob main documentation resources images 05 docker compose yml png. (2) Regarding the creation of the DB container required in exercise 2, a screenshot of the dockerfile is shared, which also shows how the creation of the schema and tables needed for exercise 3 is executed: image of the deployment https github com flanfranco tpf foundations flanfranco blob main documentation resources images 06 dockerfile db png. (3) The following screenshots show how exercise 4 was solved, defining the corresponding dockerfile and Python script: image of the deployment https github com flanfranco tpf foundations flanfranco blob main documentation resources images 07 dockerfile app load db png, image of the deployment https github com flanfranco tpf foundations flanfranco blob main documentation resources images 08 python script load db png. (5) Finally, a screenshot of the dockerfile used to solve exercise 5 is shared: image of the deployment https github com flanfranco tpf foundations flanfranco blob main documentation resources images 09 dockerfile jupyter png.
Final notes. The presented solution could be improved much further: for example, the Python script could handle exceptions, retries, logging, etc., and all of this could be developed as a framework to be reused in other projects. Also, regarding parameterization, the schemas, table names, CSVs and credentials could have been defined in an external JSON file, avoiding hardcoding and improving security. In addition, much smaller and more basic Docker images could have been used than the ones presented, considerably reducing the storage space needed to deploy the solution. The idea is for this solution to serve as the starting point, and basis for improvement, for the future assignments of the next modules of the diploma. Thanks for your time, Flavio Lanfranco. | cloud |
|
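Exercise 4 above asks for a Python script that populates the database once the DDL has run. A minimal sketch of such a load step follows; it assumes the psycopg2 driver, a CSV mounted at /csv_volume/data.csv and placeholder schema/table names and credentials (the real values live in the repository's docker-compose environment and scripts).

```python
# Illustrative load step: bulk-copy a mounted CSV into Postgres.
# Host, credentials, schema, table and file path are placeholders, not the repo's real values.
import psycopg2

conn = psycopg2.connect(host="localhost", port=5432, dbname="database",
                        user="username", password="password")
try:
    with conn, conn.cursor() as cur:
        # Constraints could be dropped here and recreated afterwards if the raw
        # CSV violates insertion order, as the exercise statement allows.
        with open("/csv_volume/data.csv", newline="") as f:
            cur.copy_expert(
                "COPY my_schema.my_table FROM STDIN WITH (FORMAT csv, HEADER true)",
                f,
            )
finally:
    conn.close()
```

COPY is preferred over row-by-row INSERTs for this step because it streams the whole file in one statement, which is noticeably faster when loading the full dataset into the container.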
EmbeddedMotorControlSystem | embeddedmotorcontrolsystem a system designed to control a stepper motor through a web interface using a netburner mod5234 embedded ethernet core module | os |
|
Awesome-Medical-Large-Language-Models | awesome medical large language models awesome https awesome re badge svg https awesome re a curated list of popular foundation models in healthcare large language models title institute date code radiology llama2 best in class large language model for radiology https arxiv org pdf 2309 06419 pdf university of georgia br citations https img shields io badge dynamic json url https api semanticscholar org graph v1 paper corpusid 261696494 fields citationcount query 24 citationcount label citations 2023 08 codoc enhancing the reliability and accuracy of ai enabled diagnosis via complementarity driven deferral to clinicians https www nature com articles s41591 023 02437 x deepmind google br citations https img shields io badge dynamic json url https api semanticscholar org graph v1 paper corpusid 259939727 fields citationcount query 24 citationcount label citations 2023 07 github https github com deepmind codoc tree main br star https img shields io github stars deepmind codoc style social label stars med palm 2 towards expert level medical question answering with large language models https arxiv org pdf 2305 09617 pdf google br citations https img shields io badge dynamic json url https api semanticscholar org graph v1 paper corpusid 258715226 fields citationcount query 24 citationcount label citations 2023 05 capabilities of gpt 4 on medical challenge problems https arxiv org pdf 2303 13375 pdf microsoft openai br citations https img shields io badge dynamic json url https api semanticscholar org graph v1 paper corpusid 257687695 fields citationcount query 24 citationcount label citations 2023 03 biomedlm pubmedgpt a purpose built ai model trained to interpret biomedical language https crfm stanford edu 2022 12 15 biomedlm html stanford crfm mosaicml 2022 12 huggingface https huggingface co stanford crfm biomedlm br likes https img shields io badge dynamic json url https huggingface co api models stanford crfm biomedlm query 24 likes label likes med palm large language models encode clinical knowledge https arxiv org pdf 2212 13138 pdf google br citations https img shields io badge dynamic json url https api semanticscholar org graph v1 paper corpusid 255124952 fields citationcount query 24 citationcount label citations 2022 12 github https github com conceptofmind palm br star https img shields io github stars conceptofmind palm style social label stars clinicalt5 a generative language model for clinical text https aclanthology org 2022 findings emnlp 398 pdf university of oregon baidu br citations https img shields io badge dynamic json url https api semanticscholar org graph v1 paper corpusid 256631112 fields citationcount query 24 citationcount label citations 2022 12 huggingface https huggingface co luqh clinicalt5 large br likes https img shields io badge dynamic json url https huggingface co api models luqh clinicalt5 large query 24 likes label likes gatortron a large language model for electronic health records https www nature com articles s41746 022 00742 2 university of florida nvidia br citations https img shields io badge dynamic json url https api semanticscholar org graph v1 paper corpusid 255175535 fields citationcount query 24 citationcount label citations 2022 12 huggingface https huggingface co ufnlp gatortron base br likes https img shields io badge dynamic json url https huggingface co api models ufnlp gatortron base query 24 likes label likes biogpt generative pre trained transformer for biomedical text generation and mining https 
aclanthology org 2020 clinicalnlp 1 17 microsoft research br citations https img shields io badge dynamic json url https api semanticscholar org graph v1 paper corpusid 252542956 fields citationcount query 24 citationcount label citations 2022 09 huggingface https huggingface co microsoft biogpt br likes https img shields io badge dynamic json url https huggingface co api models microsoft biogpt query 24 likes label likes biobart pretraining and evaluation of a biomedical generative language model https arxiv org pdf 2204 03905 pdf tsinghua university br citations https img shields io badge dynamic json url https api semanticscholar org graph v1 paper corpusid 248069469 fields citationcount query 24 citationcount label citations 2022 04 huggingface https huggingface co ganjinzero biobart v2 base br likes https img shields io badge dynamic json url https huggingface co api models ganjinzero biobart v2 base query 24 likes label likes kebiolm improving biomedical pretrained language models with knowledge https aclanthology org 2021 bionlp 1 20 pdf tsinghua alibaba br citations https img shields io badge dynamic json url https api semanticscholar org graph v1 paper corpusid 233324564 fields citationcount query 24 citationcount label citations 2021 04 github https github com ganjinzero kebiolm br star https img shields io github stars ganjinzero kebiolm style social label stars pretrained language models for biomedical and clinical tasks understanding and extending the state of the art https aclanthology org 2020 clinicalnlp 1 17 meta br citations https img shields io badge dynamic json url https api semanticscholar org graph v1 paper corpusid 226283910 fields citationcount query 24 citationcount label citations 2020 11 github https github com facebookresearch bio lm br star https img shields io github stars facebookresearch bio lm style social label stars biomegatron larger biomedical domain language model https aclanthology org 2020 emnlp main 379 pdf nvidia br citations https img shields io badge dynamic json url https api semanticscholar org graph v1 paper corpusid 222310618 fields citationcount query 24 citationcount label citations 2020 10 huggingface https huggingface co embo biomegatron345muncased br likes https img shields io badge dynamic json url https huggingface co api models embo biomegatron345muncased query 24 likes label likes pubmedbert domain specific language model pretraining for biomedical natural language processing https arxiv org pdf 2007 15779 pdf microsoft research br citations https img shields io badge dynamic json url https api semanticscholar org graph v1 paper corpusid 220919723 fields citationcount query 24 citationcount label citations 2020 07 huggingface https huggingface co microsoft biomednlp pubmedbert base uncased abstract fulltext br likes https img shields io badge dynamic json url https huggingface co api models microsoft biomednlp pubmedbert base uncased abstract fulltext query 24 likes label likes publicly available clinical bert embeddings https arxiv org pdf 1904 03323 pdf mit csail br citations https img shields io badge dynamic json url https api semanticscholar org graph v1 paper corpusid 102352093 fields citationcount query 24 citationcount label citations 2019 04 huggingface https huggingface co emilyalsentzer bio clinicalbert br likes https img shields io badge dynamic json url https huggingface co api models emilyalsentzer bio clinicalbert query 24 likes label likes clinicalbert modeling clinical notes and predicting hospital readmission https 
arxiv org pdf 1904 05342 pdf harvard princeton nyu br citations https img shields io badge dynamic json url https api semanticscholar org graph v1 paper corpusid 119308351 fields citationcount query 24 citationcount label citations 2019 04 github https github com kexinhuang12345 clinicalbert br star https img shields io github stars kexinhuang12345 clinicalbert style social label stars biobert a pre trained biomedical language representation model for biomedical text mining https arxiv org pdf 1901 08746 korea university br citations https img shields io badge dynamic json url https api semanticscholar org graph v1 paper corpusid 59291975 fields citationcount query 24 citationcount label citations 2019 01 github https github com dmis lab biobert br star https img shields io github stars dmis lab biobert style social label stars paper name arxiv link institute name br citations https img shields io badge dynamic json url https api semanticscholar org graph v1 paper corpusid 12345678 fields citationcount query 24 citationcount label citations 20xx xx huggingface https github com br likes https img shields io badge dynamic json url https huggingface co api models query 24 likes label likes large vision models title institute date code segment anything model for medical image analysis an experimental study https arxiv org pdf 2304 10517 duke university br citations https img shields io badge dynamic json url https api semanticscholar org graph v1 paper corpusid 258236547 fields citationcount query 24 citationcount label citations 2023 04 github https github com mazurowski lab segment anything medical evaluation br star https img shields io github stars mazurowski lab segment anything medical evaluation style social label stars survey papers title institute date do we still need clinical language models https arxiv org pdf 2302 08091 mit xyla br citations https img shields io badge dynamic json url https api semanticscholar org graph v1 paper corpusid 256900662 fields citationcount query 24 citationcount label citations 2023 02 paper name arxiv link institute name br citations https img shields io badge dynamic json url https api semanticscholar org graph v1 paper corpusid 12345678 fields citationcount query 24 citationcount label citations 20xx xx | large-language-models multimodal-large-language-models large-vision-language-models | ai |
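Most of the open checkpoints listed above are published on the Hugging Face Hub, so they can be smoke-tested with a few lines of transformers code. The snippet below uses microsoft/biogpt from the table as an example; any other hub id listed can be substituted (model size and licensing permitting), and encoder-style entries such as the BERT variants would load with AutoModel or a fill-mask pipeline instead of a causal LM head.

```python
# Quick smoke test for one of the listed checkpoints (BioGPT).
# Requires: pip install transformers torch sacremoses (BioGPT's tokenizer needs sacremoses).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/biogpt"  # swap in another hub id from the table if preferred
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "COVID-19 is"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```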
awesome-attention-mechanism-in-cv | awesome attention mechanism in cv awesome https awesome re badge svg https awesome re doc att jpg table of contents introduction introduction attention mechanism attention mechanism plug and play module plug and play module vision transformer vision transformer contributing contributing introduction this is a list of awesome attention mechanisms used in computer vision as well as a collection of plug and play modules due to limited ability and energy many modules may not be included if you have any suggestions or improvements welcome to submit an issue https github com pprp awesome attention mechanism in cv issues or pr https github com pprp awesome attention mechanism in cv pulls attention mechanism paper publish link blog squeeze and excitation network https arxiv org abs 1709 01507 cvpr18 senet https github com hujie frank senet zhihu https zhuanlan zhihu com p 102035721 global second order pooling convolutional networks https cs jhu edu alanlab pubs20 li2020neural pdf cvpr19 gsopnet https github com zilingao global second order pooling convolutional networks neural architecture search for lightweight non local networks https cs jhu edu alanlab pubs20 li2020neural pdf cvpr20 autonl https github com liyingwei autonl selective kernel network https arxiv org pdf 1903 06586 pdf cvpr19 sknet https github com implus sknet zhihu https zhuanlan zhihu com p 102034839 convolutional block attention module https arxiv org pdf 1807 06521 pdf eccv18 cbam https github com jongchan attention module zhihu https zhuanlan zhihu com p 102035273 bottleneck attention module https arxiv org pdf 1807 06514 pdf bmvc18 bam https github com jongchan attention module zhihu https zhuanlan zhihu com p 102033063 concurrent spatial and channel squeeze excitation in fully convolutional networks http arxiv org pdf 1803 02579v2 pdf miccai18 scse https github com ai med squeeze and excitation zhihu https zhuanlan zhihu com p 102036086 non local neural networks https arxiv org abs 1711 07971 cvpr19 non local nl https github com alexhex7 non local pytorch zhihu https zhuanlan zhihu com p 102984842 gcnet non local networks meet squeeze excitation networks and beyond https arxiv org abs 1904 11492 iccvw19 gcnet https github com xvjiarui gcnet zhihu https zhuanlan zhihu com p 102990363 ccnet criss cross attention for semantic segmentation https arxiv org abs 1811 11721 iccv19 ccnet https github com speedinghzl ccnet sa net shuffle attention for deep convolutional neural networks https arxiv org pdf 2102 00240 pdf icassp 21 sanet https github com wofmanaf sa net zhihu https zhuanlan zhihu com p 350912960 eca net efficient channel attention for deep convolutional neural networks https arxiv org pdf 1910 03151 pdf cvpr20 ecanet https github com bangguwu ecanet spatial group wise enhance improving semantic feature learning in convolutional networks https arxiv org abs 1905 09646 corr19 sgenet https github com implus pytorchinsight fcanet frequency channel attention networks https arxiv org pdf 2012 11879 pdf iccv21 fcanet https github com cfzd fcanet a 2 text nets double attention networks https arxiv org abs 1810 11579 neurips18 danet https github com nguyenvo09 double attention network asymmetric non local neural networks for semantic segmentation https arxiv org pdf 1908 07678 pdf iccv19 apnb https github com mendelxu ann efficient attention attention with linear complexities https arxiv org pdf 1812 01243v7 pdf corr18 efficientattention https github com cmsflash efficient attention image 
restoration via residual non local attention networks https arxiv org pdf 1903 10082 pdf iclr19 rnan https github com yulunzhang rnan exploring self attention for image recognition http vladlen info papers self attention pdf cvpr20 san https github com hszhao san an empirical study of spatial attention mechanisms in deep networks https arxiv org pdf 1904 05873 pdf iccv19 none object contextual representations for semantic segmentation https arxiv org pdf 1909 11065 pdf eccv20 ocrnet https github com hrnet hrnet semantic segmentation tree hrnet ocr v 2 iaunet global text aware feature learning for person re identification https arxiv org pdf 2009 01035 pdf ttnnls20 iaunet https github com blue blue272 imgreid ianet resnest split attention networks https arxiv org pdf 2004 08955 pdf corr20 resnest https github com zhanghang1989 resnest gather excite exploiting feature context in convolutional neural networks https papers nips cc paper 8151 gather excite exploiting feature context in convolutional neural networks pdf neurips18 genet https github com hujie frank genet improving convolutional networks with self calibrated convolutions http mftp mmcheng net papers 20cvprscnet pdf cvpr20 scnet https github com mcg nku scnet rotate to attend convolutional triplet attention module https arxiv org pdf 2010 03045 pdf wacv21 tripletattention https github com landskapeai triplet attention dual attention network for scene segmentation https arxiv org pdf 1809 02983 pdf cvpr19 danet https github com junfu1115 danet relation aware global attention for person re identification https arxiv org pdf 1904 02998v1 pdf cvpr20 rganet https github com microsoft relation aware global attention networks attentional feature fusion https arxiv org abs 2009 14082 wacv21 aff https github com yimiandai open aff an attentive survey of attention models https arxiv org abs 1904 02874 corr19 none stand alone self attention in vision models https arxiv org pdf 1906 05909 pdf neurips19 fullattention https github com leaderj1001 stand alone self attention bisenet bilateral segmentation network for real time semantic segmentation https arxiv org abs 1808 00897 eccv18 bisenet https github com coincheung bisenet zhihu https zhuanlan zhihu com p 105925132 dcanet learning connected attentions for convolutional neural networks https arxiv org pdf 2007 05099 pdf corr20 dcanet https github com 13952522076 dcanet an empirical study of spatial attention mechanisms in deep networks https arxiv org abs 1904 05873 iccv19 none look closer to see better recurrent attention convolutional neural network for fine grained image recognition https openaccess thecvf com content cvpr 2017 papers fu look closer to cvpr 2017 paper pdf cvpr17 oral ra cnn https github com jianlong fu recurrent attention cnn guided attention network for object detection and counting on drones https arxiv org abs 1909 11307v1 acm mm20 ganet https isrc iscas ac cn gitlab research ganet attention augmented convolutional networks https arxiv org abs 1904 09925 iccv19 aanet https github com leaderj1001 attention augmented conv2d global self attention networks for image recognition https arxiv org pdf 2010 03019 pdf iclr21 gsa https github com lucidrains global self attention network attention guided hierarchical structure aggregation for image matting https ieeexplore ieee org document 9156481 cvpr20 hattmatting https github com wukaoliu cvpr2020 hattmatting weight excitation built in attention mechanisms in convolutional neural networks http www ecva net papers eccv 2020 
papers eccv papers 123750086 pdf eccv20 none expectation maximization attention networks for semantic segmentation https arxiv org pdf 1907 13426 pdf iccv19 oral emanet https github com xialipku emanet dense and implicit attention network https arxiv org abs 1905 10671 aaai 20 dianet https github com gbup group dianet coordinate attention for efficient mobile network design https arxiv org abs 2103 02907 cvpr21 coordattention https github com andrew qibin coordattention cross channel communication networks https papers nips cc paper 8411 cross channel communication networks pdf neurlps19 c3net https github com jwyang c3net pytorch gated convolutional networks with hybrid connectivity for image classification https arxiv org pdf 1908 09699 pdf aaai20 hcgnet https github com winycg hcgnet weighted channel dropout for regularization of deep convolutional neural network http home ustc edu cn saihui papers aaai2019 weighted pdf aaai19 none ba 2m a batch aware attention module for image classification https arxiv org pdf 2103 15099 pdf cvpr21 none epsanet an efficient pyramid split attention block on convolutional neural network https arxiv org abs 2105 14447 corr21 epsanet https github com murufeng epsanet stand alone self attention in vision models https arxiv org pdf 1906 05909 pdf neurlps19 sasa https github com leaderj1001 stand alone self attention rest an efficient transformer for visual recognition https arxiv org pdf 2105 13677 pdf corr21 rest https github com wofmanaf rest spanet spatial pyramid attention network for enhanced image recognition https ieeexplore ieee org document 9102906 icme20 spanet https github com 13952522076 spanet space time mixing attention for video transformer https arxiv org pdf 2106 05968 pdf corr21 none dmsanet dual multi scale attention network https arxiv org abs 2106 08382 corr21 none compconv a compact convolution module for efficient feature learning https arxiv org abs 2106 10486 corr21 none volo vision outlooker for visual recognition https arxiv org pdf 2106 13112 pdf corr21 volo https github com sail sg volo interflow aggregating multi layer featrue mappings with attention mechanism https arxiv org abs 2106 14073 corr21 none muse parallel multi scale attention for sequence to sequence learning https arxiv org abs 1911 09483 corr21 none polarized self attention towards high quality pixel wise regression https arxiv org pdf 2107 00782 pdf corr21 psa https github com delightcmu psa ca net comprehensive attention convolutional neural networks for explainable medical image segmentation https arxiv org pdf 2009 10549v2 pdf tmi21 ca net https github com hilab git ca net bam a lightweight and efficient balanced attention mechanism for single image super resolution https arxiv org ftp arxiv papers 2104 2104 07566 pdf corr21 bam https github com dandingbudanding bam attention as activation https arxiv org pdf 2007 07729v2 pdf corr21 atac https github com yimiandai open atac region based non local operation for video classification https arxiv org pdf 2007 09033v5 pdf corr21 rnl https github com guoxih region based non local network msaf multimodal split attention fusion https arxiv org pdf 2012 07175v2 pdf corr21 msaf https github com anita hu msaf all attention layer https arxiv org abs 1907 01470v1 corr19 none compact global descriptor https arxiv org abs 1907 09665v10 corr20 cgd https github com holmesshuan compact global descriptor simam a simple parameter free attention module for convolutional neural networks https ruyuanzhang github io files 2107 icml 
pdf icml21 simam https github com zjjconan simam drop an octave reducing spatial redundancy in convolutional neural networks with octave convolution https export arxiv org pdf 1904 05049 iccv19 octconv https github com facebookresearch octconv contextual transformer networks for visual recognition https arxiv org abs 2107 12292 iccv21 cotnet https github com jdai cv cotnet residual attention a simple but effective method for multi label recognition https arxiv org abs 2108 02456 iccv21 csra self supervised equivariant attention mechanism for weakly supervised semantic segmentation https arxiv org pdf 2004 04581v1 pdf cvpr20 seam https github com yudewang seam an attention module for convolutional neural networks https arxiv org abs 2108 08205 iccv2021 aw conv attentive normalization https arxiv org pdf 1908 01259 pdf arxiv2020 none person re identification via attention pyramid https arxiv org abs 2108 05340 tip21 apnet https github com chengy12 apnet unifying nonlocal blocks for neural networks https arxiv org abs 2108 02451 iccv21 snl https github com zh460045050 snl iccv2021 tiled squeeze and excite channel attention with local spatial context https openaccess thecvf com content iccv2021w neurarch papers vosco tiled squeeze and excite channel attention with local spatial context iccvw 2021 paper pdf iccvw21 none pp nas searching for plug and play blocks on convolutional neural network https openaccess thecvf com content iccv2021w neurarch papers shen pp nas searching for plug and play blocks on convolutional neural network iccvw 2021 paper pdf iccvw21 pp nas https github com sbl1996 pp nas distilling knowledge via knowledge review https arxiv org pdf 2104 09044 pdf cvpr21 reviewkd https github com dvlab research reviewkd dynamic region aware convolution https arxiv org pdf 2003 12243 pdf cvpr21 none encoder fusion network with co attention embedding for referring image segmentation https openaccess thecvf com content cvpr2021 papers feng encoder fusion network with co attention embedding for referring image segmentation cvpr 2021 paper pdf cvpr21 none introvert human trajectory prediction via conditional 3d attention https openaccess thecvf com content cvpr2021 papers shafiee introvert human trajectory prediction via conditional 3d attention cvpr 2021 paper pdf cvpr21 none ssan separable self attention network for video representation learning https openaccess thecvf com content cvpr2021 papers guo ssan separable self attention network for video representation learning cvpr 2021 paper pdf cvpr21 none delving deep into many to many attention for few shot video object segmentation https openaccess thecvf com content cvpr2021 papers chen delving deep into many to many attention for few shot video object segmentation cvpr 2021 paper pdf cvpr21 danet https github com scutpaul danet a2 fpn attention aggregation based feature pyramid network for instance segmentation https openaccess thecvf com content cvpr2021 papers hu a2 fpn attention aggregation based feature pyramid network for instance segmentation cvpr 2021 paper pdf cvpr21 none image super resolution with non local sparse attention https openaccess thecvf com content cvpr2021 papers mei image super resolution with non local sparse attention cvpr 2021 paper pdf cvpr21 none keep your eyes on the lane real time attention guided lane detection https openaccess thecvf com content cvpr2021 papers tabelini keep your eyes on the lane real time attention guided lane detection cvpr 2021 paper pdf cvpr21 laneatt https github com lucastabelini 
laneatt nam normalization based attention module https arxiv org abs 2111 12419 corr21 nam https github com christian lyc nam nas scam neural architecture search based spatial and channel joint attention module for nuclei semantic segmentation and classification https link springer com chapter 10 1007 2f978 3 030 59710 8 26 miccai20 nas scam https github com zuhaoliu nas scam nasabn a neural architecture search framework for attention based networks http vigir missouri edu gdesouza research conference cds ieee wcci 2020 ijcnn papers n 20308 pdf ijcnn20 none att darts differentiable neural architecture search for attention http vigir missouri edu gdesouza research conference cds ieee wcci 2020 ijcnn papers n 21042 pdf ijcnn20 att darts https github com chomin att darts on the integration of self attention and convolution https arxiv org pdf 2111 14556 pdf corr21 acmix https gitee com mindspore models boxer box attention for 2d and 3d transformers https arxiv org pdf 2111 13087 pdf corr21 none coatnet marrying convolution and attention for all data sizes https proceedings neurips cc paper 2021 file 20568692db622456cc42a2e853ca21f8 paper pdf neurlps21 coatnet https github com chinhsuanwu coatnet pytorch pay attention to mlps https proceedings neurips cc paper 2021 file 4cc05b35c2f937c5bd9e7d41d3686fff paper pdf neurlps21 gmlp https github com jaketae g mlp ic conv inception convolution with efficient dilation search https arxiv org pdf 2012 13587 pdf cvpr21 oral ic conv https github com yifan123 ic conv srm a style based recalibration module for convolutional neural networks https openaccess thecvf com content iccv 2019 papers lee srm a style based recalibration module for convolutional neural networks iccv 2019 paper pdf iccv19 srm https github com hyunjaelee410 style based recalibration module spanet spatial pyramid attention network for enhanced image recognition https par nsf gov servlets purl 10206339 icme20 spanet https github com 13952522076 spanet tmm competitive inner imaging squeeze and excitation for residual network https arxiv org pdf 1807 08920v4 pdf corr18 competitive senet https github com scut aitcm competitive inner imaging senet ulsam ultra lightweight subspace attention module for compact convolutional neural networks https arxiv org pdf 2006 15102 pdf wacv20 ulsam https github com nandan91 ulsam augmenting convolutional networks with attention based aggregation https arxiv org pdf 2112 13692 pdf corr21 none context aware attentional pooling cap for fine grained visual classification https arxiv org abs 2101 06635 aaai21 cap https github com ardhendubehera cap instance enhancement batch normalization an adaptive regulator of batch noise https ojs aaai org index php aaai article download 5917 5773 aaai20 iebn https github com gbup group iebn asr attention alike structural re parameterization https arxiv org pdf 2304 06345 pdf corr23 none dynamic networks title publish github dynamic neural networks a survey https arxiv org abs 2102 04906v4 corr21 none condconv conditionally parameterized convolutions for efficient inference https arxiv org abs 1904 04971 neurlps19 condconv https github com d li14 condconv pytorch dynet dynamic convolution for accelerating convolutional neural networks https arxiv org abs 2004 10694 corr20 none dynamic convolution attention over convolution kernels https arxiv org abs 1912 03458 cvpr20 dynamic convolution pytorch https github com kaijieshi7 dynamic convolution pytorch weightnet revisiting the design space of weight network https arxiv org 
pdf 2007 11823 pdf eccv20 weightnet https github com megvii model weightnet dynamic filter networks http papers nips cc paper 6578 dynamic filter networks pdf neurlps20 none dynamic deep neural networks optimizing accuracy efficiency trade offs by selective execution https arxiv org pdf 1701 00299 pdf aaai17 none skipnet learning dynamic routing in convolutional networks https arxiv org pdf 1711 09485 pdf eccv18 skipnet https github com ucbdrive skipnet pay less attention with lightweight and dynamic convolutions https arxiv org pdf 1901 10430 pdf iclr19 fairseq https github com facebookresearch fairseq unified dynamic convolutional network for super resolution with variational degradations https arxiv org pdf 2004 06965 pdf cvpr20 none dynamic group convolution for accelerating convolutional neural networks eccv20 dgc https github com zhuogege1943 dgc plug and play module title publish github acnet strengthening the kernel skeletons for powerful cnn via asymmetric convolution blocks https arxiv org abs 1908 03930 iccv19 acnet https github com dingxiaoh acnet deeplab semantic image segmentation with deep convolutional nets atrous convolution and fully connected crfs https arxiv org pdf 1606 00915v2 pdf tpami18 aspp https github com kazuto1011 deeplab pytorch mixconv mixed depthwise convolutional kernels https bmvc2019 org wp content uploads papers 0583 paper pdf bmcv19 mixedconv https github com tensorflow tpu tree master models official mnasnet mixnet pyramid scene parsing network https arxiv org pdf 1612 01105 pdf cvpr17 psp https github com hszhao pspnet receptive field block net for accurate and fast object detection https www ecva net papers eccv 2018 papers eccv papers songtao liu receptive field block eccv 2018 paper pdf eccv18 rfb https github com goatmessi7 rfbnet strip pooling rethinking spatial pooling for scene parsing https arxiv org pdf 2003 13328 pdf cvpr20 spnet https github com andrew qibin spnet ssh single stage headless face detector https arxiv org pdf 1708 03979 pdf iccv17 ssh https github com mahyarnajibi ssh ghostnet more features from cheap operations https arxiv org pdf 1911 11907 pdf cvpr20 ghostnet slimconv reducing channel redundancy in convolutional neural networks by weights flipping https arxiv org abs 2003 07469 tip21 slimconv https github com jiaxiongq slimconv efficientnet rethinking model scaling for convolutional neural networks https arxiv org abs 1905 11946 icml19 efficientnet https github com lukemelas efficientnet pytorch condconv conditionally parameterized convolutions for efficient inference https arxiv org abs 1904 04971 neurlps19 condconv https github com d li14 condconv pytorch pp nas searching for plug and play blocks on convolutional neural network https ieeexplore ieee org document 9607527 iccvw21 ppnas https github com sbl1996 pp nas dynamic convolution attention over convolution kernels https openaccess thecvf com content cvpr 2020 papers chen dynamic convolution attention over convolution kernels cvpr 2020 paper pdf cvpr20 dynamicconv https github com kaijieshi7 dynamic convolution pytorch psconv squeezing feature pyramid into one compact poly scale convolutional layer https arxiv org abs 2007 06191 eccv20 psconv https github com d li14 psconv dcanet dense context aware network for semantic segmentation https arxiv org pdf 2104 02533 pdf eccv20 dcanet https github com 13952522076 dcanet enhancing feature fusion for human pose estimation https link springer com article 10 1007 s00138 020 01104 2 mva20 seb https github com tongjiangwei 
featurefusion object contextual representation for sematic segmentation https arxiv org abs 1909 11065 eccv2020 hrnet ocr https github com hrnet hrnet semantic segmentation tree hrnet ocr v 2 do conv depthwise over parameterized convolutional layer https arxiv org abs 2006 12030 corr20 do conv https github com yangyanli do conv pyramidal convolution rethinking convolutional neural networks for visual recognition http arxiv org abs 2006 11538 corr20 pyconv https github com iduta pyconv ulsam ultra lightweight subspace attention module for compact convolutional neural networks https arxiv org pdf 2006 15102 pdf wacv20 ulsam https github com nandan91 ulsam dynamic group convolution for accelerating convolutional neural networks https www ecva net papers eccv 2020 papers eccv papers 123510137 pdf eccv20 dgc https github com zhuogege1943 dgc vision transformer an image is worth 16x16 words transformers for image recognition at scale iclr 2021 vit paper https arxiv org abs 2010 11929 github https github com lucidrains vit pytorch title publish github swin transformer hierarchical vision transformer using shifted windows https arxiv org abs 2103 14030 iccv21 swint https github com microsoft swin transformer cpvt conditional positional encodings for vision transformer https arxiv org abs 2102 10882 corr21 cpvt https github com meituan automl cpvt glit neural architecture search for global and local image transformer https arxiv org pdf 2107 02960 pdf corr21 glit https github com bychen515 glit convit improving vision transformers with soft convolutional inductive biases https arxiv org abs 2103 10697 corr21 convit https github com facebookresearch convit ceit incorporating convolution designs into visual transformers https arxiv org abs 2103 11816 corr21 ceit https github com rishikksh20 ceit pytorch botnet bottleneck transformers for visual recognition https openaccess thecvf com content cvpr2021 papers srinivas bottleneck transformers for visual recognition cvpr 2021 paper pdf cvpr21 botnet https github com leaderj1001 bottlenecktransformers cvt introducing convolutions to vision transformers https openaccess thecvf com content iccv2021 papers wu cvt introducing convolutions to vision transformers iccv 2021 paper pdf iccv21 cvt https github com microsoft cvt transcnn transformer in convolutional neural networks https arxiv org abs 2106 03180 corr21 transcnn https github com yun liu transcnn rest an efficient transformer for visual recognition https arxiv org abs 2105 13677 corr21 rest https github com wofmanaf rest coat co scale conv attentional image transformers corr21 coat https github com mlpc ucsd coat contnet why not use convolution and transformer at the same time https arxiv org abs 2104 13497 corr21 contnet https github com yan hao tian contnet dynamicvit efficient vision transformers with dynamic token sparsification https arxiv org abs 2106 02034 neurlps21 dynamicvit https github com raoyongming dynamicvit dvt not all images are worth 16x16 words dynamic transformers for efficient image recognition https arxiv org abs 2105 15075 neurlps21 dvt https github com blackfeather wang dynamic vision transformer coatnet marrying convolution and attention for all data sizes https arxiv org pdf 2106 04803 pdf corr21 coatnet https github com chinhsuanwu coatnet pytorch early convolutions help transformers see better https openreview net pdf id lpfh1bpqfk corr21 none compact transformers escaping the big data paradigm with compact transformers https arxiv org abs 2104 05704 corr21 cct https github 
com shi labs compact transformers mobilevit light weight general purpose and mobile friendly vision transformer https arxiv org abs 2110 02178 context cs lg corr21 mobilevit https github com chinhsuanwu mobilevit pytorch levit a vision transformer in convnet s clothing for faster inference https arxiv org abs 2104 01136 corr21 levit https github com facebookresearch levit shuffle transformer rethinking spatial shuffle for vision transformer https arxiv org abs 2106 03650 corr21 shuffletransformer https github com mulinmeng shuffle transformer vitae vision transformer advanced by exploring intrinsic inductive bias https openreview net pdf id rnhyieu5y5 corr21 vitae https github com annbless vitae localvit bringing locality to vision transformers https arxiv org abs 2104 05707 corr21 localvit https github com ofsoundof localvit deit training data efficient image transformers distillation through attention https arxiv org abs 2012 12877 icml21 deit https github com facebookresearch deit cait going deeper with image transformers https github com facebookresearch deit blob main readme cait md iccv21 cait https github com facebookresearch deit ef cient training of visual transformers with small size datasets https arxiv org abs 2106 03746 neurlps21 none vision transformer with deformable attention https arxiv org pdf 2201 00520 pdf corr22 dat https github com leaplabthu dat maxvit multi axis vision transformer https arxiv org abs 2204 01697 corr22 none conv2former a simple transformer style convnet for visual recognition https arxiv org abs 2211 11943 corr22 conv2former https github com hvision nku conv2former rethinking mobile block for efficient neural models https arxiv org abs 2301 01146 corr23 emo https github com zhangzjn emo wave vit unifying wavelet and transformers for visual representation learning eccv22 wave vit https github com yehli imagenetmodel dual vision transformer corr23 dual vit https github com yehli imagenetmodel cotnet contextual transformer networks for visual recognition contextual transformer networks for visual recognition tpami22 cotnet https github com yehli imagenetmodel convnext v2 co designing and scaling convnets with masked autoencoders https arxiv org abs 2301 00808 corr23 convnext v2 https github com facebookresearch convnext v2 a close look at spatial modeling from attention to convolution https arxiv org abs 2212 12552 corr22 fcvit https github com ma xu fcvit scalable diffusion models with transformers https arxiv org abs 2212 09748 cvpr22 dit https github com facebookresearch dit dynamic grained encoder for vision transformers https proceedings neurips cc paper 2021 file 2d969e2cee8cfa07ce7ca0bb13c7a36d paper pdf neurlps21 vtpack https github com stevengrove vtpack segment anything https arxiv org abs 2304 02643 corr23 sam https segment anything com improved robustness of vision transformers via prelayernorm in patch embedding https www sciencedirect com science article abs pii s0031320323003606 pr23 none title publish github main idea swin transformer hierarchical vision transformer using shifted windows https arxiv org abs 2103 14030 iccv21 swint https github com microsoft swin transformer cpvt conditional positional encodings for vision transformer https arxiv org abs 2102 10882 corr21 cpvt https github com meituan automl cpvt glit neural architecture search for global and local image transformer https arxiv org pdf 2107 02960 pdf corr21 glit https github com bychen515 glit nas convit improving vision transformers with soft convolutional inductive 
biases https arxiv org abs 2103 10697 corr21 convit https github com facebookresearch convit gpsa ceit incorporating convolution designs into visual transformers https arxiv org abs 2103 11816 corr21 ceit https github com rishikksh20 ceit pytorch lca leff botnet bottleneck transformers for visual recognition https openaccess thecvf com content cvpr2021 papers srinivas bottleneck transformers for visual recognition cvpr 2021 paper pdf cvpr21 botnet https github com leaderj1001 bottlenecktransformers nonblock like cvt introducing convolutions to vision transformers https openaccess thecvf com content iccv2021 papers wu cvt introducing convolutions to vision transformers iccv 2021 paper pdf iccv21 cvt https github com microsoft cvt projection transcnn transformer in convolutional neural networks https arxiv org abs 2106 03180 corr21 transcnn https github com yun liu transcnn rest an efficient transformer for visual recognition https arxiv org abs 2105 13677 corr21 rest https github com wofmanaf rest coat co scale conv attentional image transformers corr21 coat https github com mlpc ucsd coat contnet why not use convolution and transformer at the same time https arxiv org abs 2104 13497 corr21 contnet https github com yan hao tian contnet dynamicvit efficient vision transformers with dynamic token sparsification https arxiv org abs 2106 02034 nips21 dynamicvit https github com raoyongming dynamicvit dvt not all images are worth 16x16 words dynamic transformers for efficient image recognition https arxiv org abs 2105 15075 nips21 dvt https github com blackfeather wang dynamic vision transformer coatnet marrying convolution and attention for all data sizes https arxiv org pdf 2106 04803 pdf corr21 coatnet https github com chinhsuanwu coatnet pytorch early convolutions help transformers see better https openreview net pdf id lpfh1bpqfk corr21 none compact transformers escaping the big data paradigm with compact transformers https arxiv org abs 2104 05704 corr21 cct https github com shi labs compact transformers mobilevit light weight general purpose and mobile friendly vision transformer https arxiv org abs 2110 02178 context cs lg corr21 mobilevit https github com chinhsuanwu mobilevit pytorch levit a vision transformer in convnet s clothing for faster inference https arxiv org abs 2104 01136 corr21 levit https github com facebookresearch levit shuffle transformer rethinking spatial shuffle for vision transformer https arxiv org abs 2106 03650 corr21 shuffletransformer https github com mulinmeng shuffle transformer vitae vision transformer advanced by exploring intrinsic inductive bias https openreview net pdf id rnhyieu5y5 corr21 vitae https github com annbless vitae localvit bringing locality to vision transformers https arxiv org abs 2104 05707 corr21 localvit https github com ofsoundof localvit deit training data efficient image transformers distillation through attention https arxiv org abs 2012 12877 icml21 deit https github com facebookresearch deit cait going deeper with image transformers https github com facebookresearch deit blob main readme cait md iccv21 cait https github com facebookresearch deit ef cient training of visual transformers with small size datasets https arxiv org abs 2106 03746 nips21 none vision transformer with deformable attention https arxiv org pdf 2201 00520 pdf corr22 dat https github com leaplabthu dat deformconv sa maxvit multi axis vision transformer https arxiv org abs 2204 01697 corr22 none dilated attention conv2former a simple transformer style convnet 
for visual recognition https arxiv org abs 2211 11943 corr22 conv2former https github com hvision nku conv2former demystify transformers convolutions in modern image deep networks https arxiv org pdf 2211 05781 pdf corr22 stm evaluation https github com opengvlab stm evaluation dai jifeng contributing if you know of any awesome attention mechanism in computer vision resources please add them in the prs or issues additional article papers and corresponding code links are welcome in the issue thanks to dedekinds https github com dedekinds for pointing out the problem in the dianet description | pytorch-attention attention-model attention-mechanisms implementation vision-transformer plugandplay computer-vision | ai |
YouLinkedMe | youlinkedme an iot system | server |
|
gpt2-vs-bert | comparison of gpt 2 and bert this experiment evaluates the performance of 6 language models 2 gpt 2 and 4 bert on a token prediction task performance over 100 positions img perf100 png experiment setup to evaluate the models i sampled 10 000 random sequences from wikitext 2 https paperswithcode com dataset wikitext 2 for bert a random sequence of 100 tokens is selected then for each sequence a random position within that sequence is selected and masked bert will be required to predict this token so accuracy is measured as the percentage of the time which its masked token is predicted correctly for gpt 2 a random sequence of 100 tokens is selected then for each sequence a random position within that sequence is selected because gpt 2 is autoregressive it cannot attend to tokens on the right so the sequence is truncated at the selected position the sequence is then padded appropriately to maintain a fixed sequence length of 100 this experiment can be run from this google colab notebook https colab research google com drive 1foksuyupwch0j76qk3jyelxorday9pva usp sharing blog post additional details including interactive data visualizations can be found on my blog https lukesalamone github io posts bert vs gpt2 | ai |
|
vim-config | vim config for web development img src http i imgur com eloehz3 png features only one file you don t need to run any installation script integration with typescript javascript integration with git integration with grep ack automatic syntax and codestyle checks show code coverage if available smart autocomplet tweeks for easier navigation snippets fully documented spellchecker is enabled by default optimized for web development quick tab navigation recent files list fuzzy search and other see the full list of features hotkeys in vim there is the leader key which is by default but you can change it in any time action hotkey file operations recent files list leader m show current file in nerdtree in a split leader f jshint csslint navigation show error window leader ll go to the next line with error warning go to the previous line with error warning advanced typescript javascript features go to type definition declaration leader td show all references to variable under coursor leader gr show type of variable under cursor leader gt show docs for entity under cursor leader gd smart rename an entity under coursor and all refs to it leader rr code completion next completion item tab previous completion item shift tab undo autocompletion ctrl e expand snippet enter integration with git git blame on the current line or all selected line leader b git status leader gst git add checkout file leader gw git diff leader gd git commit leader gc git commit all leader gca git commit a amend leader gcf spellcheck show suggestions z add word under the cursor as a good word zg splits move between splits leader w move to the top split shift arrow up move to the bottom split shift arrow down move to the the right split shift arrow right move to the the left split shift arrow left make split bigger vertically shift ctrl arrow up make split smaller vertically shift ctrl arrow down make split bigger horizontally shift ctrl arrow right make split smaller horizontally shift ctrl arrow left other toggle comment on the current line gcc toggle comments gc in visual mode or gc motion change surrounding symbols like or cs what to what toggle folding space toggle insert mode leader p yanking history with quick navigation leader h replace leader s esc j k simultaneously quick tab navigation leader color scheme i use color scheme solarized the light version is enabled by default if you want the dark one you have to change the following lines vim setting up light color scheme set background light set highlighting for colorcolumn highlight colorcolumn ctermbg lightgrey to those vim setting up light color scheme set background dark set highlighting for colorcolumn highlight colorcolumn ctermbg darkgrey full features list easy installation you just need to place vimrc in your home directory and that s all all plugins and dependencies will install automatically upon first vim launch folding folding is disabled by default but you may fold any part of js code according to the syntax with just space key remember your last editing sessions when you open file which you used editted last vim will open it on the exact same line vertical ruler there is a vertical line indicating 80 character limit it can be seen on the screenshot above smart search vim s built in search ignores case by default but if you use mixed lower upper case in the search pattern it ll be case sensitive quick plugin install mdash neobundle https github com shougo neobundle vim it s a bundler which helps to install other bundles it s quite smart and 
works better then vundle color scheme mdash solarized https github com altercation vim colors solarized a popular light dark color scheme screen https raw githubusercontent com altercation solarized master img solarized vim png snippets mdash ultisnips https github com sirver ultisnips vim snippets https github com honza vim snippets neosnippet is a snippet engine itself and vim snippets mdash it s default snippets collection this config features snippets which can be autocompleted by tab nd expanded by enter here is a full list of snippets https github com honza vim snippets tree master snippets smart panels mdash unite https github com shougo unite vim provides features like recent files list with help of neomru https github com shougo neomru vim quick tab navigation yank history on the go syntax checker mdash syntastic https github com scrooloose syntastic this plugin integrates many spellchekers and syntax checkers and shows you errors when saving or opening a file by default this config use npm packets jshint http www jshint com and css lint http csslint net to check js and css files on the fly advanced file system navigation mdash nerdtree https github com scrooloose nerdtree imroved file system navigation looks pretty much like the standard one but with some cool features like tree navigation bookmarks and some more improved status line mdash airline https github com bling vim airline nice and good loking status bar for vim nicely integrated with syntastic and fugitive good keyword completion system mdash youcompleteme https github com valloric youcompleteme smart and mighty autocompletion integration with git mdash fugitive https github com tpope vim fugitive provides full integration wit git show code coverage mdash forked version of coverage vim https github com musinux coverage vim if you have code coverage report for you project located in coverage coverage final json or coverage coverage final json the covered lines would be highlighted advanced typescript integration mdash youcompleteme https github com valloric youcompleteme typesript compiler https github com microsoft typescript provides advanced javascript features like contextaware typescript autocompletion immediately show type errors and semantic errors adwanced navigation ability etc advanced javascript integration mdash youcompleteme https github com valloric youcompleteme tern for vim https github com marijnh tern for vim provides advanced javascript features like context aware javascript code completion variable rename find variable references and go to variable improved syntax higlighting typescript mdash leafgarland typescript vim https github com leafgarland typescript vim javascript mdash vim javascript syntax https github com jelera vim javascript syntax jsx mdash vim jsx https github com mxw vim jsx json mdash vim json https github com elzr vim json css3 mdash vim css3 syntax https github com hail2u vim css3 syntax stylus mdash vim stylus https github com wavded vim stylus markdown mdash vim markdown gabrielelana vim markdown mustache handlebars mdash vim mustache handlebars https github com mustache vim mustache handlebars improved editing delimitmate https github com raimondi delimitmate mdash provides automatic closing of quotes parenthesis brackets etc also has some other related features that will make your time in insert mode a little bit easier tcomment https github com tomtom tcomment vim mdash tcomment provides easy to use file type sensible comments for vim it can handle embedded syntax surround 
https github com tpope vim surround is all about surroundings parentheses brackets quotes xml tags and more the plugin provides keystrokes to easily delete change and add such surroundings in pairs matchtag https github com gregsexton matchtag mdash highlights the matching html tag when the cursor is positioned on a tag it works in much the same way as the matchparen plugin installation to install just clone the repo and place symlink to vimrc in your home directory e g bash git clone https github com l0stsoul vim config git ln s vim config vimrc npm http en wikipedia org wiki npm software is required for some features pro tips if you want to make some changes just fork the repo if these changes will be helpfull to others don t forget to share it through a pull request | front_end |
|
iota.legacy.rs | iota rs iota implementation rust awful dusty here don t use me | server |
|
voice-iot-maker-demo | google actions particle photon via dialogflow this maker friendly tutorial is a great starting point for developers students and tinkerers of all types who want to integrate the google home https madeby google com home with their iot prototyping projects you can use this app to control an led by voice thanks to the magic of google assistant https assistant google com and dialogflow https dialogflow com and an internet connected particle photon https www particle io disclaimer this is not an official google product what s included this example ties together multiple technology platforms so there are a few separate components included in this repo dialogflow agent an agent https dialogflow com docs agents for dialogflow dialogflow webhook a web app to parse and react to the dialogflow agent s webhook particle photon a photon app to handle web requests and to turn the light on and off we ve included two separate web app implementations choose and build on the one that best suits your preferences 1 firebase functions a microservices https cloud google com appengine docs standard python microservices on app engine oriented implementation built for deployment to cloud functions for firebase https firebase google com docs functions a serverless on demand platform 2 app engine a server based implementation designed to run on google app engine https cloud google com appengine or your server of choice this should be enough to get you started and on to building great things what you ll need we ll build our web app with node js and will rely on some libraries to make life easier the actions on google node js client library https developers google com actions tools nodejs client library the particle api js sdk https docs particle io reference javascript on the hardware side you will need a google home https madeby google com home or another device running google assistant a particle photon https docs particle io datasheets photon datasheet or a similar web connected microcontroller like the redbear duo https redbear cc product wifi ble redbear duo html it s handy to have a breadboard some hookup wire and a bright led and the examples will show those in action however the photon has an addressable led built in so you can use just the photon itself to test all the code presented here if you prefer you ll also need accounts with dialogflow https dialogflow com for understanding user voice queries google cloud https cloud google com for hosting the webhook webapp service particle cloud https build particle io build for deploying your photon code and communicating with the particle api if you re just starting out or if you re already comfortable with a microservices approach you can use the 1 firebase functions example it s easy to configure and requires no other infrastructure setup if you d prefer to run it on a full server environment or if you plan to build out a larger application from this use the 2 app engine example which can also run on any other server of your choosing if you ve got all those or similar services devices good to go then we re ready to start getting started assuming you have all the required devices and accounts as noted above the first thing you ll want to do is to set up apps on the corresponding services so you can get your devices talking to each other local setup first you ll need to clone this repo and cd into the newly created directory git clone git github com google voice iot maker demo git cd git github com google voice iot maker demo git you should 
see three directories alongside some additional files dialogflow agent the contents of the action to deploy on dialogflow dialogflow webhook a web application to parse the google actions dialogflow webhook with server based and cloud function options particle photon sample code to flash onto the particle photon once you ve taken a look we ll move on dialogflow using the dialogflow account referenced above you ll want to create a dialogflow agent https console dialogflow com api client we ll be setting up a webhook https dialogflow com docs fulfillment webhook example to handle our triggers and send web requests to the particle api 1 create a new agent or click here https console dialogflow com api client newagent to begin you can name it whatever you like 1 select create a new google project as well 1 in the settings section click on the gear icon next to your project name and go to export and import 1 select import from zip and upload the zip provided dialogflow agent voice iot maker demo zip you ve now imported the basic app shell take a look at the new ledcontrol intent viewable from the intents tab you can have a look there now if you re curious or continue on to fill out the app s details 1 head over to the integrations tab and click google assistant 1 scroll down to the bottom and click update draft 1 go back to the general tab in settings and scroll down to the google project details 1 click on the google cloud link and check out the project that s been created for you feel free to customize this however you like 1 click on the actions on google link and go to 2 app information 1 click add and fill in the details of your project there 1 add some sample invocations as well as a pronunciation of your assistant app s name 1 fill out the other required fields description picture contact email etc 1 scroll down to the bottom and click test draft you can now test out the conversational side of the app in one of two ways talk to the google actions simulator https developers google com actions tools simulator https developers google com actions tools simulator test queries by the gui or curl via the try it now interface in the dialogflow gui you can also try talking to your application on any assistant enabled device that you re signed into however if you re following along step by step it won t turn any lights on yet we still have to set up the web service and the photon app onward then google cloud depending on which hosting environment you want to use cd into either dialogflow webhook 1 firebase functions or dialogflow webhook 2 app engine and continue the setup instructions in that directory s readme md file important regardless of what hosting deployment method you choose make sure you return to the dialogflow panel and go into the fulfillment tab to update the url field also check that the domains field is set to enable webhook for all domains without doing these things dialogflow won t be able to talk to your new webhook particle make sure the photon is correctly set up and connected if it s not configured yet follow the steps in the particle docs https docs particle io guide getting started start photon you can upload your code to your photon via the particle web editor https build particle io build the particle desktop ide https www particle io products development tools particle desktop ide based on atom or the particle command line tools https docs particle io guide tools and features cli photon installing we ll be using the cli for this example which you can install thusly sudo 
npm i particle cli g to deploy via the command line first make sure you re logged in particle login you can find out the id of your device by running particle list then upload the code using that id particle flash your device id particle photon particle blink demo ino the photon should blink rapidly while the upload is in process and when it s done and calmly pulsing cyan you re ready to go note make sure you generate a particle access token https docs particle io reference api generate an access token and add that token along with your photon s device id to your config js file you can make sure it all works by running the following from your terminal curl https api particle io v1 devices your device id led d access token your access token d led on if everything is configured properly you should see something like the following json id your device id last app connected true return value 1 you should see the photon s light come on along with an led on the breadboard if you ve wired one up doing the same with led off will return a 0 instead of a 1 and will you guessed it turn the light off note if you ever see a return value 1 that s an error message something has gone wrong somewhere putting it all together once you ve uploaded all the code and each service is configured it s time to give it all a try you can confirm that everything went to plan by going to either your assistant enabled device or the google actions simulator https developers google com actions tools simulator asking to talk to your app talk to app name and typing turn the light on if all goes well your led should turn on further reading this application is just a taste of what s possible how far you take this framework is up to you here are a few resources to help you continue on your journey actions on google invocation and discovery https developers google com actions discovery actions on google dialogflow https developers google com actions dialogflow sample google actions https developers google com actions samples particle google cloud platform integration https docs particle io tutorials integrations google cloud platform acknowledgements this demo was created by the google proto studio team including mike dory https github com mikedory vikram tank amanda mccroskery this has been a collaboration with the google aiy projects https aiyprojects withgoogle com team extra high fives to billy rutledge james mclurkin jess holbrook jordan barber cindy teruya david stalnaker https github com davidstalnaker special thanks to sabah kosoy https github com sabahkosoy jeff nusz https github com customlogic michael cavalea https github com callmecavs and jeff gray https github com j3ffgray | photon google-cloud actions-on-google dialogflow aiyprojects particle particle-photon google-home | server |
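As a quick sanity check outside the Node.js webhook, the same Particle Cloud call shown in the curl example above can be made from Python. This is not part of the project; the device id and access token are placeholders, and the form field name follows the Particle Cloud function-call docs (adjust it if your firmware expects something else).

```python
# Rough Python equivalent of the curl check above: call the Photon's exposed
# "led" function through the Particle Cloud API. DEVICE_ID and ACCESS_TOKEN are
# placeholders; "args" is the form field named in the Particle Cloud docs.
import requests

DEVICE_ID = "your-device-id"
ACCESS_TOKEN = "your-access-token"

def set_led(state):
    """state is 'on' or 'off'; returns the firmware's return_value (1 ok, -1 error)."""
    url = f"https://api.particle.io/v1/devices/{DEVICE_ID}/led"
    resp = requests.post(url, data={"access_token": ACCESS_TOKEN, "args": state})
    resp.raise_for_status()
    return resp.json().get("return_value")

if __name__ == "__main__":
    print(set_led("on"))
```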
iree | iree intermediate representation execution environment iree i ntermediate r epresentation e xecution e nvironment pronounced as eerie is an mlir https mlir llvm org based end to end compiler and runtime that lowers machine learning ml models to a unified ir that scales up to meet the needs of the datacenter and down to satisfy the constraints and special considerations of mobile and edge deployments see our website https openxla github io iree for project details user guides and instructions on building from source ci status https github com openxla iree actions workflows ci yml badge svg query branch 3amain event 3apush https github com openxla iree actions workflows ci yml query branch 3amain event 3apush project status iree is still in its early phase we have settled down on the overarching infrastructure and are actively improving various software components as well as project logistics it is still quite far from ready for everyday use and is made available without any support at the moment with that said we welcome any kind of feedback on any communication channels communication channels communication channels github issues https github com openxla iree issues feature requests bugs and other work tracking iree discord server https discord gg 26p4xw4 daily development discussions with the core team and collaborators iree discuss email list https groups google com forum forum iree discuss announcements general and low priority discussion related project channels mlir topic within llvm discourse https llvm discourse group c llvm project mlir 31 iree is enabled by and heavily relies on mlir https mlir llvm org iree sometimes is referred to in certain mlir discussions useful if you are also interested in mlir evolution architecture overview todo scotttodd switch to picture once better supported https github blog changelog 2022 05 19 specify theme context for images in markdown beta iree architecture docs website docs assets images iree architecture dark svg gh dark mode only iree architecture docs website docs assets images iree architecture svg gh light mode only see our website https openxla github io iree for more information presentations and talks community meeting recordings iree youtube channel https www youtube com iree4356 2021 06 09 iree runtime design tech talk recording https drive google com file d 1p0dcysaig8rc7erkyegutqkojgpfcu3s view and slides https drive google com file d 1ikgodzxnmz1exqwraiuty9exbe3ymwbb view usp sharing 2020 08 20 iree codegen mlir open design meeting presentation recording https drive google com file d 1325zkxnnixgw3cdwrdwj1 bp952wvc6w view usp sharing and slides https docs google com presentation d 1nethjkaoyg49kixy5telqfp6zr2v8 ujgzwz 3xvqc8 edit 2020 03 18 interactive hal ir walkthrough recording https drive google com file d 1 swdgapdfrgqzdxaapsa90ad1jvfhp f view usp sharing 2020 01 31 end to end mlir workflow in iree mlir open design meeting presentation recording https drive google com open id 1os9fapodpi59uj7jji3axntzkuttuvkr and slides https drive google com open id 1rcq4zpqfk9cvgu3ih1e5xbrbcqy7d cez578j84ovyi license iree is licensed under the terms of the apache 2 0 license with llvm exceptions see license license for more information | mlir vulkan tensorflow spirv cuda jax pytorch | ai |
Natural-Language-Processing-Nanodegree | udacity natural language processing nanodegree nlp hang degree png projects of natural language processing 1 part of speech tagger with hidden markove model https github com udacity hmm tagger 2 neural machine translation with recurrent neural networks https github com udacity aind2 nlp capstone 3 end to end automatic speech recognition with convolutional neural networks and recurrent neural networks https github com udacity aind vui capstone | udacity-nanodegree natural-language-processing hmm-viterbi-algorithm hmm-model machine-translation attention-mechanism speech-recognition speech-to-text encoder-decoder nlp recurrent-neural-networks scratch-implementation | ai |
Awesome-Computer-Vision | awesome computer vision awesome https awesome re badge svg https awesome re highlighted topics 1 conditional content generation awesome conditional content generation https github com haofanwang awesome conditional content generation hot topics 1 motion prediction awesome 3d human motion prediction https github com aras62 vision based prediction blob master papers motion papers md 2 3d human reconstruction awesome 3d human reconstruction https github com rlczddl awesome 3d human reconstruction 3 virtual try on a curated list of awesome virtual try on https github com minar09 awesome virtual try on 4 talking face awesome talking face generation https github com yunjinpark awesome talking face generation 5 sketch generation awesome sketch synthesis https github com markmohr awesome sketch synthesis 6 diffusion awesome diffusion models https github com heejkoo awesome diffusion models 7 nerf awesome nerf https github com yenchenlin awesome nerf 8 clip awesome clip https github com yzhuoning awesome clip awesome computer vision 1 graph neural network gnn https github com thunlp gnnpapers gnn https github com nnzhan awesome graph neural networks graph classification https github com benedekrozemberczki awesome graph classification adversarial gnn https github com safe graph graph adversarial learning literature deep gnn https github com mengliu1998 awesome deep gnn 2 video analysis action recognition https github com jinwchoi awesome action recognition temporal action detection https github com rheelt materials temporal action detection temporal action localization https github com alvin zeng awesome temporal action localization 3 adversarial attack adversarial attack https nicholas carlini com writing 2019 all adversarial example papers html adversarial learning https github com nebula beta awesome adversarial deep learning robust ml https github com p2333 papers of robust ml graph attack https github com chandlerbang awesome graph attack papers 4 3d vision point cloud https github com yochengliu awesome point cloud analysis 3d reconstruction https github com openmvg awesome 3dreconstruction list 5 automl automl https github com hibayesian awesome automl papers network pruning https github com he y awesome pruning network compression https github com sun254 awesome model compression and acceleration nas https github com d x y awesome nas 6 reinforcement learning rl https github com aikorea awesome rl rl https github com jgvictores awesome deep reinforcement learning multiagent rl https github com chuangyc awesome multiagent learning 7 transfer learning transfer learning https github com artix41 awesome transfer learning zero shot https github com chichilicious awesome zero shot learning meta learning https github com dragen1860 awesome meta learning 8 gan gan https github com nightrome really awesome gan gan applications https github com nashory gans awesome applications 9 object detection detection https github com hoya012 deep learning object detection detection https github com amusi awesome object detection 10 object tracking multiple object tracking https github com spyderxu multi object tracking paper listn tracking https github com foolwood benchmark results 11 pose estimation human pose estimation https github com wangzheallen awesome human pose estimation hand pose estimation https github com xinghaochen awesome hand pose estimation 12 segmentation semantic segmentation https github com mrgloom awesome semantic segmentation 13 classification image 
classification https github com weiaicunzai awesome image classification 14 vision language navigation vision language navigation https github com daqingliu awesome vln self supervised learning https github com jason718 awesome self supervised learning 15 super resolution super resolution https github com chaofwang awesome super resolution 16 denoising image denoising https github com wenbihan reproducible image denoising state of the art 17 anomaly detection anomaly detection https github com yzhao062 anomaly detection resources 18 interpretability interpretability https github com onetaken awesome deep learning interpretability 19 trajectory prediction trajectory prediction https github com xuehaouwa awesome trajectory prediction interaction aware trajectory prediction https github com jiachenli94 awesome interaction aware trajectory prediction 20 ocr image text localization recognition https github com whitelok image text localization recognition blob master readme zh cn md awesome ocr https github com chanchichoi awesome ocr 21 transformer is all you need awesome visual transformer https github com dk liang awesome visual transformer bert https github com tomohideshibata bert related papers transformer in vision https github com dirtyharrylyl transformer in vision 22 mlp is all you need awesome mlp papers https github com haofanwang awesome mlp papers 23 vision language pre training awesome pretrained chinese nlp models https github com lonepatient awesome pretrained chinese nlp models awesome vision language pretraining papers https github com yuewang cuhk awesome vision language pretraining papers awesome programming language pretraining papers https github com yuewang cuhk awesome programming language pretraining papers pycontrast https github com hobbitlong pycontrast contrastive learning codes https github com leerumor contrastive learning codes 24 prompt promptpapers https github com thunlp promptpapers 25 mim masked image modeling https github com ucasligang awesome mim 21 crowd counting awesome crowd counting https github com gjy3035 awesome crowd counting 22 video analysis temporal action localization https github com alvin zeng awesome temporal action localization mulitple object tracking https github com luanshiyinyang awesome multiple object tracking person reid https github com bismex awesome person re identification video person reid https github com asuradayuci awesome video person reid 23 visual reasoning visual reasoning https github com jokieleung awesome visual question answering 24 visual grounding visual grounding https github com theshadow29 awesome grounding 25 video inpainting awesome image inpainting https github com 1900zyh awesome image inpainting | computer-vision deep-learning awesome-list paper vision-project vision-and-language trajectory-prediction interpretability video-analysis pose-estimation graph-neural-network adversarial-attacks 3d-vision automl gan transfer-learning object-detection super-resolution denoising | ai |
web2web | web2web server less domain less websites updatable via torrents and bitcoin blockchain live demo https elendirx github io web2web why websites get seized by losing control over a webserver or a domain if we replace both the webserver and the domain with torrents https webtorrent io and blockchain https bitcoin org en then there s nothing left to seize how it works this repo contains two html files index html is responsible for loading the webpage from torrent webpage html is the actual webpage when you open index html in the browser live demo https elendirx github io web2web here s what happens 1 bitcoin address 1dhdyqb4xgdwjzzfbygeutqdqbhsf7tgt4 is searched for the latest outgoing transaction containing op return script https en bitcoin it wiki op return inside the script there is a torrent infohash of webpage html 2 webpage html is downloaded from torrent via webtorrent https webtorrent io and displayed how is it updated to perform serverless updates torrent of the updated webpage html is created and its infohash is inserted into new bitcoin transaction sent from 1dhdyqb4xgdwjzzfbygeutqdqbhsf7tgt4 address how is it domainless save the index html to your pc and open it from localhost it will still work and receive updates what next user accounts users will be able to sign up by sending small amount of bitcoin to the 1dhdyqb4xgdwjzzfbygeutqdqbhsf7tgt4 bitcoin address then they can update their content by inserting torrent infohashes into transactions sent from their addresses e commerce it will be possible to build complex serverless anonymous e commerce websites using bitcoin for payments project status proof of concept just for fun works in chrome firefox and opera to create your own distributed webpage take a look at web2web gateway https elendirx github io web2web gateway | blockchain |
|
pokemondatabase | this is a project designed to work as a search engine for pokemon the hope is that you will be able to search for pokemon based on stats types egg groups etc if we get super far into it we would also like to add a simple battle simulator and a gui | server |
|
svyu | svyu survey app built for android final project for mobile development course svyu icon two checkmarks forming the shape of letter s android client res svyu icon png demo composer click to play a video demonstration of composer android client res svyu composer play png https svyuwebsite blob core windows net res svyu composer mp4 survey click to play a video demonstration of survey android client res svyu survey play png https svyuwebsite blob core windows net res svyu survey mp4 collaborators for contributions before 3 25 2020 amber david martin ofrila walter | front_end |
|
aspnet-starter-kit | asp net core starter kit nbsp a href https github com kriasoft aspnet starter kit stargazers img src https img shields io github stars kriasoft aspnet starter kit svg style social label star maxage 3600 alt height 20 a a href https twitter com dotnetreact img src https img shields io twitter follow dotnetreact svg style social label follow maxage 3600 alt height 20 a a href https gitter im kriasoft aspnet starter kit img src https img shields io badge chat online green svg style social logo data 3aimage 2fpng 3bbase64 2civborw0kggoaaaansuheugaaaa4aaaaocaqaaac1qevaaaaabgdbtueaalgpc 2fxhbqaaacbjsfjnaab6jgaagiqaapoaaaca6aaadtaaaopgaaa6maaaf3ccule8aaaaamjlr0qa 2f4epzl8aaaajcehzcwaade4aaaxoax93jcmaaaahdelnrqfgcqegncolipklaaaa00leqvqy023qssuecrja8c 2bpy 2bjwkzt1r 2fepqnuukpdb33dfbjnyffyzzsv 2fgfwkysilxubew7hoopljrz 2fh7erwfp 2fxmzw9t5a1zkbzt8sjrr6grbvw4l3hgzpzbnpucqgr9s2zhpdhrq9ffevp6hkm 2fyik9qfgql5vc9oflbuctkyyzd 2bzwhals8nrh25p2jbqwlgbuerdm3snvm1pi04lgjltxh1rkkqvbvoy5nxouluh1xqtwtrqn6ri05plqu8bcioqpu25c2ak5f45afln7q 2bbul12fzqkdaaaacv0rvh0zgf0ztpjcmvhdguamjaxni0wos0wmvqwnjo1mjo0mi0wndowmockwtgaaaaldevydgrhdgu6bw9kawz5adiwmtytmdktmdfumdy6nti6nditmdq6mdcwv3meaaaagxrfwhrtb2z0d2fyzqb3d3cuaw5rc2nhcguub3jnm 2b48ggaaaabjru5erkjggg 3d 3d maxage 86400 alt height 20 a asp net core starter kit https github com kriasoft aspnet starter kit is a real world boilerplate and tooling for creating single page web applications https en wikipedia org wiki single page application spa oriented towards progressive enhancement https en wikipedia org wiki progressive enhancement design cross platform compatability and component based ui architecture it is built upon best of breed technologies including net core https dot net core kestrel https github com aspnet kestrelhttpserver ef core https ef readthedocs io en latest babel http babeljs io webpack https webpack github io react https facebook github io react redux http redux js org css modules https github com css modules css modules react hot loader http gaearon github io react hot loader and more this boilerplate comes in both c https github com kriasoft aspnet starter kit and f https github com kriasoft fsharp starter kit flavors see demo https aspnet core azurewebsites net docs docs nbsp nbsp follow us on gitter https gitter im kriasoft aspnet starter kit or twitter https twitter com dotnetreact nbsp nbsp learn react js es6 and asp net core learn reactjs es6 and aspnet core nbsp nbsp visit our sponsors p align center align top a href https rollbar com utm source reactstartkit github amp utm medium link amp utm campaign reactstartkit github img src https koistya github io files rollbar 362x72 png height 36 align top a a href https x team com hire react developers utm source reactstarterkit utm medium github link utm campaign reactstarterkit june img src https koistya github io files xteam 255x72 png height 36 align top a sup a href https x team com join utm source reactstarterkit utm medium github link utm campaign reactstarterkit june hiring a sup p features nbsp nbsp component based front end development via webpack https webpack github io css modules https github com css modules css modules and react https facebook github io react see webpack config js webpack config js br nbsp nbsp modern javascript syntax es2015 http babeljs io docs learn es2015 via babel http babeljs io modern css syntax css3 via postcss https github com postcss postcss br nbsp nbsp application state management via redux http redux js org see client store js client store 
js br nbsp nbsp universal cross stack routing and navigation via path to regexp https github com pillarjs path to regexp and history https github com reactjstraining history see client routes json client routes json br nbsp nbsp code splitting https github com webpack docs wiki code splitting and async chunk loading with webpack https webpack github io and es6 system import http www 2ality com 2014 09 es6 modules final html br nbsp nbsp hot module replacement hmr https webpack github io docs hot module replacement html w react hot loader http gaearon github io react hot loader br nbsp nbsp lightweight build automation with plain javascript see run js run js br nbsp nbsp cross device testing with browsersync https browsersync io br nbsp nbsp git based deployment to azure app service https azure microsoft com services app service see run js publish run js br nbsp nbsp 24 7 community support on gitter https gitter im kriasoft aspnet starter kit or stackoverflow http stackoverflow com questions tagged aspnet starter kit consulting and customization requests on codementor https www codementor io koistya br directory layout shell vscode visual studio code settings build the folder for compiled output client client side app frontend components common or shared ui components utils helper functions and utility classes views ui components for web pages screens history js html5 history api wrapper used for navigation main js entry point that bootstraps the app router js lightweight application router routes json the list of application routes store js application state manager redux client test unit and integration tests for the frontend app docs documentation to the project public static files such as favicon ico etc robots txt instructions for search engine crawlers etc server web server and data api backend controllers asp net web api and mvc controllers models entity framework models entities views server side rendered views appsettings json server side application settings startup cs server side application entry point web config web server settings for iis server test unit and integration tests for the backend app jsconfig json visual studio code settings for javascript package json the list of project dependencies and npm scripts run js build automation script similar to gulpfile js webpack config js bundling and optimization settings for webpack prerequisites os x windows or linux node js https nodejs org v6 or newer net core https www microsoft com net core and net core sdk https www microsoft com net core visual studio code https code visualstudio com with c extension https github com omnisharp omnisharp vscode or visual studio 2015 or newer getting started step 1 clone the latest version of asp net core starter kit on your local machine by running shell git clone o aspnet starter kit b master single branch https github com kriasoft aspnet starter kit git myapp cd myapp alternatively scaffold your project with yeoman http yeoman io shell npm install g yo npm install g generator aspnetcore yo aspnetcore step 2 install project dependencies listed in project json server project json and package json package json files shell npm install install both node js and net core dependencies step 3 finally launch your web app shell node run compile and lanch the app same as running npm start the app should become available at http localhost 5000 http localhost 5000 see run js run js for other available commands such as node run build node run publish etc you can also run your app in a release production 
mode by running node run release or without hot module replacement hmr by running node run no hmr how to deploy before you can deploy your app to azure app service https azure microsoft com services app service you need to open web app settings in azure portal https portal azure com go to deployment source select local git repository and hit ok then copy and paste git clone url of your web app into run js publish run js file finally whenever you need to compile your app into a distributable format and upload that to windows azure app service simply run shell node run publish same as running npm run publish how to update we work hard on keeping the project up to date and adding new features down the road after starting a new web application project based on this boilerplate you can always fetch and merge the latest changes from this upstream repo back into your project by running shell git checkout master git fetch aspnet starter kit git merge aspnet starter kit master alternatively pull the latest version of this repository into a separate folder and compare it with your project by using a diff tool such as beyond compare http www scootersoftware com how to contribute anyone and everyone is welcome to contribute contributing md to this project the best way to start is by checking our open issues https github com kriasoft aspnet starter kit issues submit a new issues https github com kriasoft aspnet starter kit issues new labels bug or feature request https github com kriasoft aspnet starter kit issues new labels enhancement participate in discussions upvote or downvote the issues you like or dislike send pull requests contributing md pull requests learn react js es6 and asp net core mortar board nbsp react js training program http www reactjsprogram com asdf 36750 q0pu0tfa by tyler mcginnis br mortar board nbsp react for beginners https reactforbeginners com friend konstantin and es6 training course https es6 io friend konstantin by wes bos br green book nbsp react up running building web applications http amzn to 2bbkzs1 by stoyan stefanov aug 2016 br green book nbsp getting started with react http amzn to 2bevri9 by doel sengupta and manu singhal apr 2016 br green book nbsp you don t know js es6 beyond http amzn to 2bfzlqe by kyle simpson dec 2015 br green book nbsp c 6 and net core 1 0 modern cross platform development http amzn to 2bev5us by mark j price mar 2016 br green book nbsp professional c 6 and net core 1 0 http amzn to 2bhilsn by christian nagel apr 2016 br related projects react app sdk https github com kriasoft react app build react applications with a single dev dependency and no build configuration react starter kit https github com kriasoft react starter kit isomorphic web app boilerplate node js express graphql react babel starter kit https github com kriasoft babel starter kit javascript library boilerplate es2015 babel rollup asp net core starter kit f https github com kriasoft fsharp starter kit web app boilerplate f net core kestrel graphql react universal router https github com kriasoft universal router isomorphic router for web and single page applications spa membership database https github com membership membership db sql database boilerplate for web app users roles and auth tokens get in touch aspnet starter kit https gitter im kriasoft aspnet starter kit on gitter koistya https twitter com koistya on codementor https www codementor io koistya or skype http hatscripts com addskype koistya license copyright 2014 present kriasoft https kriasoft com this source 
code is licensed under the mit license found in the license txt https github com kriasoft react starter kit blob master license txt file the documentation to the project is licensed under the cc by sa 4 0 http creativecommons org licenses by sa 4 0 license made with by konstantin tarkus koistya https twitter com koistya and contributors https github com kriasoft aspnet starter kit graphs contributors | front_end |
|
PythoTech | pythotech information technology company | server |
|
development-index-prediction | development index prediction a webservice built with python backend and machine learning connected to simple jquery frontend tools used sanic k nearest neighbor random forest classifier jquery chart js reason for this app is to see two different predictions of two different ml models in ui and to see how each line in database affects ml algorithms since every time a new prediction is saved both models are retrained to run the app open this folder with cmd command prompt in mac run py m venv env to create virtual environment navigate to cd env cd scripts activate to activatate activate bat file mac cd env cd bin activate navate 2 steps back cd cd run py m pip install sanic pandas sklearn run main py file open frontend with a url that runs in console both models predict different development index results of an area fictional or real every time all input fields are filled and predict button is hit the data is saved in to the database with index taken from predictions let us say that random forest has better accuracy score after it is retrained therefore the index predicted by random forest is saved in the database together with all the other inputs and vice versa | server |
|
Package-Transport-System | package transport system school project subject database software tools june 2021 description informations about the project are given in pdf file in serbian built with erwin data modeler ms sql java jdbc libraries libraries are provided in referenced libraries used libraries sab project 2021 public test jar mssql jdbc 8 2 2 jre11 jar author aleksandra bogicevic alebogi | sql mssql database database-modeling | server |
navstik-quadcopter | navstik quadcopter chibios rtos for quadcopter on navstik has dcm stabilisation algorithms as well as pcm stabilization functions | os |
|
selenium-cucumber-java | selenium cucumber java selenium cucumber automation testing using java selenium cucumber is a behavior driven development bdd approach to write automation test script to test web it enables you to write and execute automated acceptance unit tests it is cross platform open source and free automate your test cases with minimal coding more details http seleniumcucumber info documentation installation doc installation md predefined steps doc canned steps md download a framework maven https github com selenium cucumber selenium cucumber java maven example writing a test the cucumber features goes in the features library and should have the feature extension you can start out by looking at features my first feature you can extend this feature or make your own features using some of the predefined steps doc canned steps md that comes with selenium cucumber predefined steps by using predefined steps you can automate your test cases more quickly more efficiently and without much coding the predefined steps are located here doc canned steps md running test go to your project directory from terminal and hit following commands mvn test defualt will run on local firefox browser mvn test dbrowser chrome to use any other browser mvn test dcloud config saucelab windows chrome52 to run test on cloud test platforms using canned tests in your project in your testrunner class add a glue option package stepdefintions import org junit runner runwith import cucumber api cucumberoptions import cucumber api junit cucumber runwith cucumber class cucumberoptions plugin html target cucumberhtmlreport features classpath features glue info seleniumcucumber stepdefinitions public class runcuketest maven gradle dependency see https jitpack io selenium cucumber selenium cucumber java license the mit license permission is hereby granted free of charge to any person obtaining a copy of this software and associated documentation files the software to deal in the software without restriction including without limitation the rights to use copy modify merge publish distribute sublicense and or sell copies of the software and to permit persons to whom the software is furnished to do so subject to the following conditions the above copyright notice and this permission notice shall be included in all copies or substantial portions of the software the software is provided as is without warranty of any kind express or implied including but not limited to the warranties of merchantability fitness for a particular purpose and noninfringement in no event shall the authors or copyright holders be liable for any claim damages or other liability whether in an action of contract tort or otherwise arising from out of or in connection with the software or the use or other dealings in the software | front_end |
|
ud894-FEF-Server | back end server for udacity s front end frameworks course this server provides support for the front end frameworks coding labs it both serves the app files and provides a rest interface for reading and storing data running the server the binaries binaries directory contains builds for windows mac os x darwin and linux run the appropriate server program to start the server supplying the www flag to point to your front end code e g on mac os x use server darwin amd64 www fef udacimeals backbone use the log flag to see all of the incoming and outgoing traffic from the server server details the server serves web files and provides a rest api the www flag specifies the directory to be served as e g main www web files serves web files index html as index html the server implements a rest api at api items get api items no trailing slash returns an object with a json array of menu items put api items is disallowed you cannot put the whole array at once get api items id e g get api items strawberry shortcake gets the menu item with the specified id and returns it as json put api items id takes a menu item json format in the body and updates the existing item if the id exists or appends a new one if the id doesn t exist yet all of the data is stored in json format in the data directory menu json is the storage file the file at server assets menu json server assets menu json provides the starting values and is compiled into the server binary building and running if you want to modify the code or build for another platform 1 download https golang org dl and install the go programming language 2 download or clone this project 3 change into the server directory 3 compile the server go build 4 run the server program to start the server you can also use the compile sh compile sh script to cross compile for windows mac os x and linux if you want to change the initial data file install the rice tool from https github com geertjohan go rice https github com geertjohan go rice edit server assets menu json server assets menu json and follow the directions for using embed go before re building the binary | front_end |
|
Pulse-Boilerplate | pulse https raw githubusercontent com heartbeatua pulse boilerplate master tmp cover png pulse boilerplate we ve created this react based boilerplate during our research on the design system approach it consists of modern tools and basic atomic design http bradfrost com blog post atomic web design structure features up to date tools and practices for design system creation focused on atomic design http bradfrost com blog post atomic web design methodology and naming convention clear and understandable structure of folders documentation highly customizable themes pages templates easy to work with styles using styled system https styled system com getting started what s included the actual versions of webpack babel react react router hot reloading eslint airbnb config https github com airbnb javascript tree master packages eslint config airbnb code linter prettier https prettier io code formatter styled components https www styled components com css in js styled system https github com jxnblk styled system stylize your components at an advanced level setup install dependencies sh npm install run development server sh npm run dev project will be running at http localhost 3000 http localhost 3000 generate production build sh npm run build will create the dist folder style guide and documentation run a development server sh npm run guide style guide will run at http localhost 6060 http localhost 6060 eslint run and get code review you can pass a fix setting that will try to solve a problem automatically sh npm run eslint theming we use styled components theming https www styled components com docs advanced theming the styled system provides great theme based https github com styled system styled system blob master docs getting started md theming style props https github com styled system styled system blob master docs api md for building responsive https github com jxnblk styled system blob master docs responsive styles md design systems with react a few words about atomic design atomic design is a methodology composed of five distinct stages working together to create interface design systems in a more deliberate and hierarchical manner the five stages of atomic design are atoms molecules organisms templates pages to get more info about methodology check out the original article http atomicdesign bradfrost com chapter 2 todo x styled components x styled system tests jest got questions or suggestions simply reach through our website https heartbeat ua lets talk license mit | react atomic-design styled-components styleguide design-systems figma styled-system | os |
marvel-snap-bot | h1 align center br a href https www marvelsnap com img src https user images githubusercontent com 25803231 210183311 556aafb7 1690 4a17 a958 9c622d8f6f07 png alt markdownify width 200 a br marvel snap bot br h1 h4 align center a computer vision bot made with a href https opencv org target blank opencv a and a href https developer android com studio command line adb target blank adb a h4 p align center a href www python org img src https badgen net badge python 3 8 pink icon terminal alt python a a href https img shields io github repo size adriagual marvel snap bot img src https img shields io github repo size adriagual marvel snap bot a a href https pypi org img src https img shields io pypi v nine a a href https opensource org licenses mit img src https badgen net pypi license pip a p p align center a href key features key features a a href how to use how to use a a href improvements improvements a p animation https user images githubusercontent com 25803231 210187037 1d01c383 95ff 481a 99e3 566fe0e93715 gif capture https user images githubusercontent com 25803231 210183326 e543da5a ef14 44e5 8cd5 770eec453c02 png key features card detection detect player cards on live time automate the decision making mana consumption tracked fields with different priorities and constraints adb support bluestacks any version global utils to apply for other computer vision bots calculate the power of the cards farm missions farm season pass play kazoo decks with applied logic quick usage how to use to clone and run this application you ll need git https git scm com opencv https opencv org bluestacks https www bluestacks com es index html and python 3 https www python org from your command line bash clone this repository git clone https github com adriagual marvel snap bot go into the repository cd marvel snap bot install dependencies pip install r requirements txt edit the config py file adb path absolute path to where you can execute adb project path absolute path to the project images folder absolute path to the images folder fields folder absolute path to the fields images folder mana folder absolute path to the mana images folder turns folder absolute path to the turns folder data folder absolute path to the card images folder tmp path absolute path to a tmp folder make sure that bluestacks has the adb enabled settings advanced android debug bridge adb check if you have the correct resolution the script is made to work on 1600x900 resolution add the images to the each folder you can grab them from the emulator there is an option on the script to get the info in real time just change the false parameter from the functions get my hands cards and get fields image https user images githubusercontent com 25803231 211024589 c4545e4b 523c 40f0 b1c1 3302bf1ed83b png card images structure folder remember that this cards need its values entered manually on the cp list py file image https user images githubusercontent com 25803231 211023744 f587976f be89 41a8 a798 619c8209d29e png images structure folder image https user images githubusercontent com 25803231 211023945 26312bae 9ac6 437e 82f8 18c1d9140d94 png bash run the bot python start py note in order to see the bot start you must install the game marvel snap bot on bluestacks and start it improvements there are plenty of improvements still here s a bunch of fields should be detected automatically the points of each field is not tracked it can be done with pytesseract the cards on the player field don t show the power detect the enemy 
cards on the field keep track of the winrate of a session license mit | adb opencv android automation python3 | ai |
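The marvel-snap-bot row above describes detecting cards with OpenCV on screenshots pulled from BlueStacks over adb. Purely as an illustration of that idea (not code from the marvel-snap-bot repository), the sketch below grabs an emulator frame with adb and runs OpenCV template matching; the template filename, the 0.8 confidence threshold, and the assumption that adb is already on the PATH are placeholders.

```python
# Minimal sketch of the screenshot + template-matching loop described above.
# "card_template.png" and the 0.8 threshold are illustrative assumptions, not
# values taken from the marvel-snap-bot code; adb is assumed to be on the PATH.
import subprocess
import cv2
import numpy as np

def grab_screen():
    # Pull a PNG screenshot from the emulator over adb and decode it with OpenCV.
    raw = subprocess.run(["adb", "exec-out", "screencap", "-p"],
                         capture_output=True, check=True).stdout
    return cv2.imdecode(np.frombuffer(raw, dtype=np.uint8), cv2.IMREAD_COLOR)

def find_card(screen, template_path, threshold=0.8):
    # Normalized cross-correlation template matching; return the best match
    # position if it clears the confidence threshold, otherwise None.
    template = cv2.imread(template_path)
    result = cv2.matchTemplate(screen, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    return max_loc if max_val >= threshold else None

screen = grab_screen()
match = find_card(screen, "card_template.png")
print("card found at", match)
```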
puer | this project is planning to archive server x puer s rebirth next generation dev server although puer is still very easy to use in a simple development there is no doubt that it is out of date after investigating the flaws of puer and its competing products like lite server browser sync our team redesigned a new project named svrx server x https github com svrxjs svrx you can read documentation here https docs svrx io en https docs svrx io zh the biggest feature is the decentralized plug in architecture but also very easy to use happy to use it puer more than a live reload server built for efficient frontend development gitter https badges gitter im joinchat svg https gitter im leeluolee puer utm source badge utm medium badge utm campaign pr badge utm content badge http leeluolee github io 2014 10 24 use puer helpus developer frontend features 1 create static server at target dir default in current dir 2 auto reload editing css will update styles only other files will reload the whole page 3 weinre integrated use i options 4 proxy server mode use it with an existing server 5 http request mock by a addon the addon is also live reloaded 6 connect middleware support install npm g install puer usage command line in most cases bash cd path to your static dir puer puer step 1 http leeluolee github io attach 2014 10 puer step 1 gif puer will launch the browser for you all pages will reload when you edit them full options to list all of puer s options use puer h bash ubuntu 21 19 puer h usage puer options options p port port server s listen port 8000 default f filetype typelist filetype to watch split with default js css html xhtml d dir dir your customer working dir default current dir i inspect start weinre server and debug all puer page x exclude exclude file under watching must be a regexp default a mock file your mock s path t target url remote proxy server no reload close auto reload feature not recommended no launch close the auto launch feature allow cors allow cross origin resource sharing h help help list mock request during development you may need to mock a request use a addon to help you mock a dynamic api shell puer a route js a sample route js looks like javascript use addon to mock http request module exports get get v1 posts id function req res next response json format res send title title changed content tow post hahahah put post delete is the same put v1 posts id function post v1 posts function delete v1 posts id function check the usage record http leeluolee github io attach 2014 10 puer step 2 gif it is just a config for routers you need export an object containing router config the keys join with method and path and the values represent the callback this function is based on express http expressjs com s router you can check its documentation for more help example from above is just equal code in express like javascript app get v1 posts id function req res next response json format res send title title changed content tow post hahahah app put v1 posts id function app post v1 posts function app delete v1 posts id function once route js is changed puer will refresh it there is no need to restart puer the route js style function just like showed before you can use the express s response and request method string if the passin is a string puer will find the file first if file is not exsit will directly response with the origin string javascript get v1 posts id hello html object array will respone a json format javascript get v1 posts id message some message get v1 posts 
message some message proxy support you can use t or target to use puer with an existing server for example say you already have a server running at port 8020 javascript puer t http localhost 8020 check the record for proxy mode http leeluolee github io attach 2014 10 puer step 3 gif you can use an addon with target for more powerful usage puer t http localhost 8020 a route js check the usage record http leeluolee github io attach 2014 10 puer step 4 gif use the built in debugger through weinre type i to bootstrap weinre the client script is injected for you in every page through puer click the nav to weinre terminal button or find the weinre server directly at port 9001 shell puer i check the usage record http leeluolee github io attach 2014 10 puer step 5 gif use as connect express middleware javascript var connect require connect var path require path var http require http var puer require puer var app connect var server http createserver app var options dir path to watch folder ignored node modules ignored file app use puer connect app server options use as puer connect middleware you must use the puer middleware before the route and static middleware before any middleware that may return text html app use connect static dirname server listen 8001 function console log listen on 8001 port you must use the puer middleware before the route and static middleware before any middleware that may return text html other client event puer will inject a namespace puer in global it is an emitter instance with on off and emit you can register an update event to control the reload logic javascript puer on update function ev console log ev path the absolute path of the changed file console log ev css whether a css file is changed if ev path match js ev stop true if you set ev stop true the reload will be stopped the example above means that if a js file is changed reloading won t be activated license mit | front_end |
wave | wave prettier ignore start build status build badge build version version badge package all contributors all contributors badge contributors prettier ignore end documentation our documentation site lives at https wave free now com https wave free now com you ll be able to find detailed documentation on getting started all of the components our theme our principles and more installation sh npm install freenow wave or sh yarn add freenow wave contributing thanks for being willing to contribute please read the contributing md contributing md doc have doubts please don t hesitate to let know us what are your doubts what can be improved or what is difficult for you to grasp from the documentation open issues to start the conversation contributors thanks goes to these wonderful people emoji key https allcontributors org docs en emoji key all contributors list start do not remove or modify this section prettier ignore start markdownlint disable table tbody tr td align center valign top width 14 28 a href https bitbucket org lopinopulos img src https avatars githubusercontent com u 1469636 v 4 s 100 width 100px alt nikolai lopin br sub b nikolai lopin b sub a br a href https github com freenowtech wave commits author nlopin title code a a href https github com freenowtech wave commits author nlopin title documentation a a href https github com freenowtech wave issues q author 3anlopin title bug reports a a href https github com freenowtech wave pulls q is 3apr reviewed by 3anlopin title reviewed pull requests a td td align center valign top width 14 28 a href http jonah ml img src https avatars githubusercontent com u 8927747 v 4 s 100 width 100px alt jonah m ller br sub b jonah m ller b sub a br a href https github com freenowtech wave commits author snapsnapturtle title code a a href https github com freenowtech wave commits author snapsnapturtle title documentation a a href https github com freenowtech wave issues q author 3asnapsnapturtle title bug reports a a href https github com freenowtech wave pulls q is 3apr reviewed by 3asnapsnapturtle title reviewed pull requests a td td align center valign top width 14 28 a href https lukahartwig de img src https avatars githubusercontent com u 7414521 v 4 s 100 width 100px alt luka hartwig br sub b luka hartwig b sub a br a href https github com freenowtech wave commits author lukahartwig title code a a href https github com freenowtech wave commits author lukahartwig title documentation a a href https github com freenowtech wave issues q author 3alukahartwig title bug reports a a href https github com freenowtech wave pulls q is 3apr reviewed by 3alukahartwig title reviewed pull requests a td td align center valign top width 14 28 a href http alexisduran com img src https avatars githubusercontent com u 1425162 v 4 s 100 width 100px alt alexis duran br sub b alexis duran b sub a br a href https github com freenowtech wave commits author duranmla title code a a href https github com freenowtech wave commits author duranmla title documentation a a href https github com freenowtech wave issues q author 3aduranmla title bug reports a a href https github com freenowtech wave pulls q is 3apr reviewed by 3aduranmla title reviewed pull requests a td td align center valign top width 14 28 a href http www leonardodivittorio com img src https avatars githubusercontent com u 12762609 v 4 s 100 width 100px alt leonardo br sub b leonardo b sub a br a href https github com freenowtech wave commits author div leo title code a a href https github com 
freenowtech wave commits author div leo title documentation a a href https github com freenowtech wave issues q author 3adiv leo title bug reports a a href https github com freenowtech wave pulls q is 3apr reviewed by 3adiv leo title reviewed pull requests a td td align center valign top width 14 28 a href http arturmiglio com img src https avatars githubusercontent com u 539801 v 4 s 100 width 100px alt artur miglio br sub b artur miglio b sub a br a href https github com freenowtech wave commits author arturmiglio title documentation a a href https github com freenowtech wave commits author arturmiglio title code a a href https github com freenowtech wave issues q author 3aarturmiglio title bug reports a td td align center valign top width 14 28 a href https github com phllipo img src https avatars githubusercontent com u 9133431 v 4 s 100 width 100px alt phillip barkmann br sub b phillip barkmann b sub a br a href https github com freenowtech wave commits author phllipo title code a td tr tr td align center valign top width 14 28 a href https github com lloydaf img src https avatars githubusercontent com u 5729666 v 4 s 100 width 100px alt lloyd francis br sub b lloyd francis b sub a br a href https github com freenowtech wave commits author lloydaf title documentation a a href https github com freenowtech wave commits author lloydaf title code a td td align center valign top width 14 28 a href https github com janhamara img src https avatars githubusercontent com u 14894844 v 4 s 100 width 100px alt jan hamara br sub b jan hamara b sub a br a href https github com freenowtech wave commits author janhamara title code a a href https github com freenowtech wave commits author janhamara title documentation a a href https github com freenowtech wave pulls q is 3apr reviewed by 3ajanhamara title reviewed pull requests a td td align center valign top width 14 28 a href https github com rafael sepeda img src https avatars githubusercontent com u 13061805 v 4 s 100 width 100px alt rafael sepeda br sub b rafael sepeda b sub a br a href design rafael sepeda title design a a href blog rafael sepeda title blogposts a td td align center valign top width 14 28 a href https github com martimalek img src https avatars githubusercontent com u 46452321 v 4 s 100 width 100px alt martimalek br sub b martimalek b sub a br a href https github com freenowtech wave commits author martimalek title code a a href https github com freenowtech wave commits author martimalek title documentation a a href https github com freenowtech wave issues q author 3amartimalek title bug reports a a href https github com freenowtech wave pulls q is 3apr reviewed by 3amartimalek title reviewed pull requests a td td align center valign top width 14 28 a href https github com alatielle img src https avatars githubusercontent com u 1089670 v 4 s 100 width 100px alt elena rashkovan br sub b elena rashkovan b sub a br a href https github com freenowtech wave commits author alatielle title code a td tr tbody table markdownlint restore prettier ignore end all contributors list end this project follows the all contributors https github com all contributors all contributors specification contributions of any kind welcome prettier ignore start all contributors badge https img shields io github all contributors freenowtech wave main style flat square build badge https img shields io github workflow status freenowtech wave component 20library logo github style flat square build https github com freenowtech wave actions query workflow 
3alibrary version badge https img shields io npm v freenow wave svg style flat square package https www npmjs com package freenow wave prettier ignore end | design-system reactjs component-library react-components hacktoberfest | os |
Team_5_Cloud | h1 align center hi we are team 5 h1 h3 align center we would like to gladly present to you our last project for this course this repository will be a showcase and demo of your aggregated knowledge throughout the cloud engineering program so please hold your applause until the end h3 | cloud |
detecto | detecto logo assets logo words svg documentation status https readthedocs org projects detecto badge version latest https detecto readthedocs io en latest badge latest downloads https pepy tech badge detecto https pepy tech project detecto detecto is a python package that allows you to build fully functioning computer vision and object detection models with just 5 lines of code inference on still images and videos transfer learning on custom datasets and serialization of models to files are just a few of detecto s features detecto is also built on top of pytorch allowing an easy transfer of models between the two libraries the table below shows a few examples of detecto s performance still image video img src assets apple orange png alt detecto still image width 500px video demo of detecto assets demo gif installation to install detecto using pip run the following command pip3 install detecto installing with pip should download all of detecto s dependencies automatically however if an issue arises you can manually download the dependencies listed in the requirements txt requirements txt file usage the power of detecto comes from its simplicity and ease of use creating and running a pre trained faster r cnn resnet 50 fpn https pytorch org docs stable torchvision models html object detection instance segmentation and person keypoint detection from pytorch s model zoo takes 4 lines of code python from detecto core import model from detecto visualize import detect video model model initialize a pre trained model detect video model input video mp4 output avi run inference on a video below are several more examples of things you can do with detecto transfer learning on custom datasets most of the times you want a computer vision model that can detect custom objects with detecto you can train a model on a custom dataset with 5 lines of code python from detecto core import model dataset dataset dataset custom dataset load images and label data from the custom dataset folder model model dog cat rabbit train to predict dogs cats and rabbits model fit dataset model predict start using your trained model inference and visualization when using a model for inference detecto returns predictions in an easy to use format and provides several visualization tools python from detecto core import model from detecto import utils visualize model model image utils read image image jpg helper function to read in images labels boxes scores model predict image get all predictions on an image predictions model predict top image same as above but returns only the top predictions print labels boxes scores print predictions visualize show labeled image image boxes labels plot predictions on a single image images visualize plot prediction grid model images plot predictions on a list of images visualize detect video model input video mp4 output avi run inference on a video visualize detect live model run inference on a live webcam advanced usage if you want more control over how you train your model detecto lets you do just that python from detecto import core utils from torchvision import transforms import matplotlib pyplot as plt convert xml files to csv format utils xml to csv training labels train labels csv utils xml to csv validation labels val labels csv define custom transforms to apply to your dataset custom transforms transforms compose transforms topilimage transforms resize 800 transforms colorjitter saturation 0 3 transforms totensor utils normalize transform pass in a csv file instead of xml files 
for faster dataset initialization speeds dataset core dataset train labels csv images transform custom transforms val dataset core dataset val labels csv val images validation dataset for training create your own dataloader with custom options loader core dataloader dataset batch size 2 shuffle true use mobilenet instead of the default resnet model core model car truck boat plane model name fasterrcnn mobilenet v3 large fpn losses model fit loader val dataset epochs 15 learning rate 0 001 verbose true plt plot losses visualize loss throughout training plt show model save model weights pth save model to a file directly access underlying torchvision model for even more control torch model model get internal model print type torch model for more examples visit the docs https detecto readthedocs io which includes a quickstart https detecto readthedocs io en latest usage quickstart html tutorial alternatively check out the demo on colab https colab research google com drive 1isatv5f 7b4i2qqtjta7todpq2k8qee0 api documentation the full api documentation can be found at detecto readthedocs io https detecto readthedocs io en latest api index html the docs are split into three sections each corresponding to one of detecto s modules core the detecto core https detecto readthedocs io en latest api core html module contains the central classes of the package dataset dataloader and model these are used to read in a labeled dataset and train a functioning object detection model utils the detecto utils https detecto readthedocs io en latest api utils html module contains a variety of useful helper functions with it you can read in images convert xml files into csv files apply standard transforms to images and more visualize the detecto visualize https detecto readthedocs io en latest api visualize html module is used to display labeled images plot predictions and run object detection on videos contributing all issues and pull requests are welcome to run the code locally first fork the repository and then run the following commands on your computer bash git clone https github com your username detecto git cd detecto recommended to create a virtual environment before the next step pip3 install r requirements txt when adding code be sure to write unit tests and docstrings where necessary tests are located in detecto tests and can be run using pytest python3 m pytest note that some tests may fail due to them requiring a pretrained model file this file can be downloaded here https www dropbox com s kjalrfs3la97du8 model pth dl 1 and should be placed at detecto tests static model pth to generate the documentation locally run the following commands bash cd docs make html the documentation can then be viewed at docs build html index html contact detecto was created by alan bi https www alanbi com feel free to reach out on twitter https twitter com alankbi or through email mailto alan bi326 gmail com | object-detection machine-learning computer-vision python pytorch faster-rcnn | ai |
Intelligent-Customer-Help-Desk-with-Smart-Document-Understanding | intelligent customer help desk with smart document understanding category artificial intelligence 1 project summary in this project we use the typical customer care chatbot experience but instead of relying on pre determined responses the dialog provides a hook that can call out to other ibm watson services for additional sources of information in this case its ecobee3 userguide owner manual that has been uploaded to watson discovery 2 project requirements i create a customer care dialog skill in watson assistant ii smart document understanding sdu to build an enhanced watson discovery collection iii create an ibm cloud functions web action that allows watson assistant to post queries to watson discovery iv build a web application with integration to all these services and deploy the same on ibm cloud platform 3 functional requirements step 1 the data source from external is annotated by using watson discovery smart document understanding step 2 the user interacts with the back end server through the application user interface the front end app user interface is a chatbot that engages the user in a conversaton step 3 dialog between the user and back end server is coordinated using a watson assistant dialog skill step 4 if the user asks a question that falls outside of the scope of the pre determined question set then a search query is issued to the watson discovery service through a watson assistant search skill 4 technical requirements i write the code in github account and clone the repository ii create watson services iii configure watson discovery iv configure watson assistant v add watson service credentials to environment file vi run the application 5 software requirements ibm cloud node red flow based development tool for visual programming developed by ibm ibm watson services watson assistance watson discovery ibm cloud functions customer help desk with smart document understanding sdu 6 project deliverables requirement specification document initial briefing report user interface backend development set up of test system user training session project report 7 project team individual project name sumedha rana the link to my node red https nodeamyred eu gb mybluemix net ui the link to my youtube vedio https youtu be bqxfwnzys7u | server |
EE319k-Final-Lab | ee319k final lab the final game design lab for embedded systems created a tilt based maze with progressively harder levels | os |
OSDI | operating system design and implementation notes img src chapter sources titleosdi png align right weight 300 height 400 notes since aug 27 2020 br still on progress br i don t have copyright br pdf book download https github com angold 4 osdi raw master operating 20systems 20design 20and 20implementation 2c 203rd 20edition pdf just for learning br contents 1 introduction br 1 introduction to operating systerm https github com angold 4 osdi blob master chapter chapter1 1os md br 2 the history of oprating systerm early https github com angold 4 osdi blob master chapter chapter1 2hse md br 3 minix https github com angold 4 osdi blob master chapter chapter1 3minix md br 4 the fourth generation 1980 present personal computers https github com angold 4 osdi blob master chapter chapter1 4mcos md br 5 system calls 1 https github com angold 4 osdi blob master chapter chapter1 5syscall 1 md br 6 system calls 2 https github com angold 4 osdi blob master chapter chapter1 6syscall 2 md br 7 system calls 3 https github com angold 4 osdi blob master chapter chapter1 7syscall 3 md br 8 operating system structure https github com angold 4 osdi blob master chapter chapter1 8ostruc md br 9 problems for chapter 1 https github com angold 4 osdi blob master chapter chapter1 9exercises md br topic sigreturn oriented programming attack https github com angold 4 osdi blob master chapter chapter1 srop sropattack md 2 processes br 1 introduction to processes https github com angold 4 osdi blob master chapter chapter2 1introprogress md br 2 interprocess communication https github com angold 4 osdi blob master chapter chapter2 2communication md br 3 semaphore and ipc problems https github com angold 4 osdi blob master chapter chapter2 3semaphore md br 4 inside a hole clock tick https github com angold 4 osdi blob master chapter chapter2 4clocktick md br 5 process scheduler https github com angold 4 osdi blob master chapter chapter2 5scheduler md br 6 interrupt https github com angold 4 osdi blob master chapter chapter2 6interrupt md br 7 system task https github com angold 4 osdi blob master chapter chapter2 7systask md br 8 problems for chapter 2 https github com angold 4 osdi blob master chapter chapter2 8problems md br topic interrupt implementation in minix3 https github com angold 4 osdi blob master chapter chapter2 interruptii md br | operating-system | os |
mobile | git flow this project uses git flow https github com nvie gitflow here are the installation instructions https github com nvie gitflow wiki installation here are usage instruction http jeffkreeftmeijer com 2010 why arent you using git flow here s an explanation of the branching model http nvie com posts a successful git branching model git flow uses also it s best to to use git pull rebase when pulling the develop branch so as to keep commit history clean style conventions all mobile android code should follow the official android style guide http source android com source code style html installation note that the ar activity will not run on the emulator only on a phone with a camera it has only been tested on the school s dell venue running android 2 2 since integrating the ar activity into this app some additional steps are required to get the app to build and run from eclipse 1 you will need to install the android ndk http developer android com tools sdk ndk index html it is required to build the application more on building in step 5 2 eclipse settings if you don t already have the project open in an eclipse work space open a new workspace and import the project as existing android code 3 you need to add a new class path variable to the java build path open the eclipse preferences pane go to java build path classpath variables create a new classpath variable named qcar sdk root and point its path at the root of the repository i e mobile the build folder must be in this directory 4 modify the project properties select java build path from the left menu select the libraries pane click add external jar find qcar jar in the repository at mobile build java qcar qcar jar and click open to add the library now click on the order and export pane and ensure that qcar jar is there and checked the order also seems to matter i m not sure of the details but if you put the qcar jar above the project src and gen folders in this list it seems to work 5 go to the command line and navigate to the project directory unless you renamed something it should be mobile vichar run ndk build note you need to have added the android ndk directory to you environment path variable you can also set up eclipse to run ndk build to do this open the project properties and select builders from the left hand menu click the new button to create a new builder in the next window select program and click ok name the builder ndk build click browse file system to select a location navigate to where you installed the android ndk and select the ndk build program ndk build cmd on windows on a windows machine eclipse will get angry if there is a space in this location string since mine was in program files x86 i used the dos short name see here http en wikipedia org wiki program files to figure out what it is for your configuration 32 or 64 bit click browse workspace or browse file system and select the root project folder vichar as the working directory in the build options set the builder to run during a clean this way you can run project clean everytime you pull in the refresh tab you might want to set the builder to automatically refresh all resources just be sure to save before you pull and or project clean click ok to finish note you only need to re run ndk build if you have made changes to the native resources in the jni folder otherwise i believe that you can modify java code and still run the program 6 back in eclipse clean the project at this point you should be able to run the application on a phone | front_end |
django-soft-ui-design | django soft design https appseed us product soft ui design django open source django app https appseed us apps django crafted on top of soft ui an open source bootstrap 5 design from creative tim the product is designed to deliver the best possible user experience with highly customizable feature rich pages django soft design https appseed us product soft ui design django product page django soft design https django soft ui free appseed srv1 com live demo django soft design pro https appseed us product soft ui design pro django premium version br features up to date dependencies theme django theme soft design https github com app generator django theme soft design designed by creative tim https www creative tim com product soft ui design system affiliate 128200 authentication django contrib auth registration deployment ci cd flow via render br soft ui design full stack starter generated by appseed https user images githubusercontent com 51070104 168812602 e35bad42 823f 4d3e 9d13 87a6c06c5a63 png br manual build download the code bash git clone https github com app generator django soft ui design git cd django soft ui design br install modules via venv bash virtualenv env source env bin activate pip install r requirements txt br set up database bash python manage py makemigrations python manage py migrate br create the superuser bash python manage py createsuperuser br start the app bash python manage py runserver at this point the app runs at http 127 0 0 1 8000 br codebase structure the project is coded using a simple and intuitive structure presented below bash project root core settings py project configuration urls py project routing home views py app views urls py app routing models py app models tests py tests templates theme customisation pages custom index html custom footer requirements txt project dependencies env sample env configuration default values manage py start the app django default start script br how to customize when a template file is loaded django scans all template directories starting from the ones defined by the user and returns the first match or an error in case the template is not found the theme used to style this starter provides the following files bash this exists in env lib theme soft design ui library root templates root templates folder accounts sign in html sign in page sign up html sign up page includes footer html footer component navigation html navigation bar scripts html scripts component layouts base html masterpage pages index html dashboard page author html profile page html all other pages when the project requires customization we need to copy the original file that needs an update from the virtual environment and place it in the template folder using the same path for instance if we want to customize the index html these are the steps step 1 create the templates directory inside the home app step 2 configure the project to use this new template directory core settings py templates section step 3 copy the index html from the original location inside your env and save it to the home templates dir source path your env lib theme soft design template pages index html destination path project root home templates pages index html to speed up all these steps the codebase is already configured steps 1 and 2 and a custom index can be found at this location home templates pages custom index html by default this file is unused because the theme expects index html without the custom prefix in order to use it simply rename it to 
index html like this the default version shipped in the library is ignored by django in a similar way all other files and components can be customized easily br deploy on render https render com create a blueprint instance go to https dashboard render com blueprints this link click new blueprint instance button connect your repo which you want to deploy fill the service group name and click on update existing resources button after that your deployment will start automatically at this point the product should be live br pro version https appseed us product soft ui design pro django material kit 2 is a premium design crafted by the creative tim agency on top of bootstrap 5 framework designed for those who like bold elements and beautiful websites material kit 2 is made of hundreds of elements designed blocks and fully coded pages built with an impressive level of quality django soft design pro https appseed us product soft ui design pro django product page enhanced ui more pages and components priority on support br soft ui design pro starter generated by appseed https user images githubusercontent com 51070104 168812715 52e036b7 582d 4851 9657 6b1f99727619 png br django soft design https appseed us product soft ui design django django starter provided by appseed https appseed us | soft-ui-desing-system django-application django-bootstrap4 python-app soft-design soft-design-system django-and-bootstrap django-bootstrap-design django-bs5-design django-soft-design soft-design-sample | os |
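Step 2 of the customization flow above points Django at the new home/templates directory. The snippet below sketches what that TEMPLATES entry in core/settings.py typically looks like; the starter's actual settings file may differ in its context processors and other options.

```python
# Sketch of the TEMPLATES "DIRS" entry referenced in step 2 (core/settings.py).
# The starter's real settings file may differ; this only illustrates the idea.
from pathlib import Path

BASE_DIR = Path(__file__).resolve().parent.parent

TEMPLATES = [
    {
        "BACKEND": "django.template.backends.django.DjangoTemplates",
        # Directories listed here are searched before the theme package,
        # so home/templates/pages/index.html overrides the theme's index.html.
        "DIRS": [BASE_DIR / "home" / "templates"],
        "APP_DIRS": True,
        "OPTIONS": {
            "context_processors": [
                "django.template.context_processors.request",
                "django.contrib.auth.context_processors.auth",
            ],
        },
    },
]
```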
deeplearning4nlp-tutorial | deep learning for nlp tutorial hands on tutorial on deep learning with a special focus on natural language processing nlp this git repository accompanies the ukp lectures and seminars on deep learning for natural language processing in contrast to other tutorials this tutorial focuses on the usage of deep learning methods deep learning for nlp seminar july 2017 in july 2017 i updated the slides to the most recent python and keras version the slides as well as the source code is available in the folder 2017 07 seminar 2017 07 seminar the code in the folder can be run with either python 2 7 or python 3 6 with keras 2 0 5 and with theano 0 9 0 or tensorflow 1 2 1 as backend four different deep learning models for nlp are covered in the folder 1 feed forward architecture for sequence classification e g pos ner chunking 2 convolutional neural network for sentence text classification e g sentiment classification 3 convolutional neural network for relation extraction e g semantic relation extration 4 long short term memory lstm networks for sequence classificaiton deep learning for nlp seminar nov 2016 in november 2016 i gave a seminar at the university of duisburg essen the slides as well as the source code is available in the folder 2016 11 seminar 2016 11 seminar in the seminar i use python 2 7 theano 0 8 2 and keras 1 1 1 to model four different deep learning models for nlp 1 feed forward architecture for sequence classification e g pos ner chunking 2 convolutional neural network for sentence text classification e g sentiment classification 3 convolutional neural network for relation extraction e g semantic relation extration 4 long short term memory lstm networks for sequence classificaiton deep learning for nlp lecture oct 2015 in october 2015 i gave a lecture for the ukp department at the technical university of darmstadt the lecture is structured in six parts and covers the basics about deep learning in the lecture i use python 2 7 theano 0 6 0 and keras 0 3 0 to model different applications of deep learning for nlp the slides the source code and video recordings are available in the folder 2015 10 lecture 2015 10 lecture contact contact person nils reimers reimers ukp informatik tu darmstadt de http www ukp tu darmstadt de http www tu darmstadt de don t hesitate to send us an e mail or report an issue if something is broken and it shouldn t be or if you have further questions this repository contains experimental software and is published for the sole purpose of supporting the lectures | ai |
Online-Shop | online shop shopaholic is an online shopping application that is developed using dart and flutter on visual studio code it allows you to make an account scroll through tons of clothes in many categories to buy whatever you like by just clicking the add to cart button and even add clothes to the application to put them for sale the database for this application is made using firebase authentication for user s accounts and firebase cloud firestore for storing the products that are available for sale and their details further details can be found in the documentation pdf | dart firebase firebase-database flutter online-shop | server |
Smartbrain_app_backend | smartbrain app backend this is the backend development of smartbrain app use npm start to run the app at port 3001 if the frontend is running at 3000 | server |
Predicting-Visitor-Purchases-Using-ML overview bigquery ml bigquery machine learning is a feature in bigquery where data analysts can create train evaluate and predict with machine learning models with minimal coding the lab uses the google analytics sample ecommerce dataset which has millions of google analytics records for the google merchandise store loaded into bigquery in this lab i will use this data to run some typical queries that businesses would want to know about their customers purchasing habits objectives in this lab the following tasks should be performed loading data into bigquery from a public dataset querying and exploring the e commerce data set creating a training and evaluation data set to be used for batch prediction creating a classification and logistic regression model in bigquery ml evaluating the performance of the created machine learning model and predicting and ranking the probability that a visitor will make a purchase results among the top 6% of first time visitors sorted by decreasing predicted probability more than 6% make a purchase in a later visit these users represent nearly 50% of all first time visitors who make a purchase in a later visit overall only 0.7% of first time visitors make a purchase in a later visit targeting the top 6% of first time visitors increases marketing roi by 9x vs targeting them all | cloud |
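For reference, creating such a logistic-regression model boils down to a single CREATE MODEL statement in BigQuery ML. The sketch below issues it from Python with the BigQuery client library; the dataset, model, and table names and the two-feature SELECT are simplified placeholders, not the lab's full training query.

```python
# Sketch of creating the logistic-regression classifier described above from Python.
# `mydataset.purchase_model`, `mydataset.training_data` and the two-feature SELECT
# are placeholders; the lab builds a richer query on the google_analytics_sample tables.
from google.cloud import bigquery

client = bigquery.Client()

create_model_sql = """
CREATE OR REPLACE MODEL `mydataset.purchase_model`
OPTIONS(
  model_type = 'logistic_reg',
  input_label_cols = ['will_buy_on_return_visit']
) AS
SELECT
  IFNULL(totals.bounces, 0)    AS bounces,
  IFNULL(totals.timeOnSite, 0) AS time_on_site,
  will_buy_on_return_visit
FROM
  `mydataset.training_data`
"""

client.query(create_model_sql).result()  # wait for training to finish
```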
TeamHealthcareSMU_GCP | teamhealthcaresmu gcp code repository for data engineering pipelines for breast cancer biopsy and genomic data for the purposes of michael deepti and lei s cloud computing project the html file for rfclassifier was downloaded directly from google cloud datalab | cloud |
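The rfclassifier notebook referenced above trains a random-forest model on biopsy data. As a stand-in sketch of that kind of classifier (using scikit-learn's bundled Wisconsin breast-cancer dataset rather than the project's own GCP tables):

```python
# Sketch of a random-forest biopsy classifier in the spirit of the rfclassifier notebook.
# It uses scikit-learn's bundled Wisconsin breast-cancer data as a stand-in; the actual
# project pipeline runs on its own biopsy/genomic data in GCP.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

clf = RandomForestClassifier(n_estimators=200, random_state=42)
clf.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```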