| names (stringlengths: 1-98) | readmes (stringlengths: 8-608k) | topics (stringlengths: 0-442) | labels (stringclasses: 6 values) |
| --- | --- | --- | --- |
rtos-mp3-project | RTOS MP3 Project: a real-time embedded system co-design project. | | os |
restful-admin | RESTful Admin. Introduction: this is the front-end project for fastapi-admin (https://github.com/long2ice/fastapi-admin), forked from rest-admin (https://github.com/wxs77577/rest-admin). Live demo: check a live demo at https://fastapi-admin.long2ice.cn (username: admin, password: 123456; data in the database is restored every day). Screenshots: login, list, view, and create (images in the screenshots folder of the repository). Run local (sketched below this row): (1) git clone https://github.com/long2ice/restful-admin.git; (2) copy the env example to .env and update it; (3) yarn, then yarn serve. Log: "DONE. Compiled successfully in 5051ms. App running at: Local: http://localhost:8080, Network: http://192.168.10.23:8080." Note that the development build is not optimized; to create a production build, run yarn build. REST API: see fastapi-admin (https://github.com/long2ice/fastapi-admin) for reference. Deployment: (1) yarn build; (2) copy dist to your server and deploy it with nginx. Thanks to rest-admin (https://github.com/wxs77577/rest-admin), a RESTful admin dashboard based on Vue and Bootstrap 4. License: this project is licensed under the MIT license (https://github.com/long2ice/restful-admin/blob/master/LICENSE). | admin vue dashboard | front_end |
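A minimal shell transcript of the "run local" and deployment steps above. The repository URL and commands come from the README; the exact name of the env example file (`.env.example`) is an assumption reconstructed from the flattened text.

```bash
# Run the restful-admin front end locally (sketch).
git clone https://github.com/long2ice/restful-admin.git
cd restful-admin
cp .env.example .env   # assumed file name; point it at your fastapi-admin backend
yarn                   # install dependencies
yarn serve             # dev server at http://localhost:8080

# Production: build, then serve the static output with nginx.
yarn build             # emits an optimized bundle into dist/
```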
VisualMetaphors | I Spy a Metaphor: Large Language Models and Diffusion Models Co-Create Visual Metaphors. You can read more about our work in the paper, available on arXiv (https://arxiv.org/pdf/2305.14724.pdf). Dataset: our data for this project is available for download from Zenodo (https://zenodo.org/record/8011133). The code to create the visual elaborations is in fewshotprompt.py. Contact: feel free to contact us via email: Tuhin Chakrabarty (tuhin.chakr@cs.columbia.edu), Arkadiy Saakyan (a.saakyan@cs.columbia.edu). | | ai |
ArxivDigest | ArXiv Digest: personalized recommendations for newly published arXiv papers using large language models. This repo aims to provide a better daily digest for newly published arXiv papers, based on your own research interests and natural-language descriptions, using relevancy ratings from GPT. You can try it out on Hugging Face (https://huggingface.co/spaces/AutoLLM/ArxivDigest) using your own OpenAI API key, and you can also create a daily subscription pipeline to email you the results. Contents: what this repo does; some examples; usage (running as a GitHub Action using SendGrid, recommended; running as a GitHub Action with SMTP credentials; running as a GitHub Action without emails; running from the command line; running with a user interface); roadmap; extending and contributing. What this repo does: staying up to date on arXiv (https://arxiv.org) papers can take a considerable amount of time, with on the order of hundreds of new papers each day to filter through. There is an official daily digest service (https://info.arxiv.org/help/subscribe.html); however, large categories like cs.AI (https://arxiv.org/list/cs.AI/recent) still have 50-100 papers a day. Determining whether these papers are relevant and important to you means reading through the title and abstract, which is time-consuming. This repository offers a method to curate a daily digest, sorted by relevance, using large language models conditioned on your personal research interests, which are described in natural language. You modify the configuration file config.yaml with an arXiv subject, some set of categories, and a natural-language statement about the type of papers you are interested in; the code pulls all the abstracts for papers in those categories and ranks how relevant they are to your interest on a scale of 1-10 using gpt-3.5-turbo-16k. The code then emits an HTML digest listing all the relevant papers and optionally emails it to you using SendGrid (https://sendgrid.com); you will need a SendGrid account with an API key for this functionality to work. Testing it out with Hugging Face: we provide a demo at https://huggingface.co/spaces/AutoLLM/ArxivDigest; simply enter your OpenAI API key (https://platform.openai.com/account/api-keys) and then fill in the configuration on the right (note that we do not store your key). You can also send yourself an email of the digest by creating a SendGrid account and API key (https://app.sendgrid.com/settings/api_keys). Some examples of results (digest configurations; result screenshots are in the repository): (1) subject/topic: computer science; categories: artificial intelligence, computation and language; interest: large language model pretraining and finetuning, multimodal machine learning; do not care about specific applications (for example, information extraction, summarization, etc.); not interested in papers that focus on specific languages (e.g., Arabic, Chinese, etc.). (2) subject/topic: quantitative finance; interest: "making lots of money". Usage, running as a GitHub Action using SendGrid (recommended): the recommended way to get started using this repository is to (1) fork the repository; (2) modify config.yaml and merge the changes into your main branch; (3) set the following secrets under Settings > Secrets and variables > Repository secrets (https://docs.github.com/en/actions/security-guides/encrypted-secrets; see advanced_usage.md for more details on how to create and get OpenAI and SendGrid API keys): OPENAI_API_KEY (from OpenAI, https://platform.openai.com/account/api-keys), SENDGRID_API_KEY (from SendGrid, https://app.sendgrid.com/settings/api_keys), FROM_EMAIL (this value must match the email you used to create the SendGrid API key), and TO_EMAIL; (4) manually trigger the action, or wait until the scheduled action takes place. See advanced_usage.md for more details, including step-by-step images, further customization, and alternate usage. Running with a user interface (sketched below this row): to locally run the same UI as the Hugging Face space, (1) install the requirements in src/requirements.txt as well as gradio; (2) run python src/app.py and go to the local URL; from there you will be able to preview the papers from today as well as the generated digests; (3) if you want to use a .env file for your secrets, you can copy .env.template to .env and then set the environment variables in .env. Note: these files may be hidden by default in some operating systems due to the dot prefix; the .env file is listed in .gitignore, so git does not track it and it will not be uploaded to the repository. Warning: do not edit and commit your .env.template with your personal keys or email address, since that file is tracked by git; doing so may expose them to the world. Roadmap: [x] support personalized paper recommendations using LLMs; [x] send emails for the daily digest; [ ] implement a ranking factor to prioritize content from specific authors; [ ] support open-source models (e.g., LLaMA, Vicuna, MPT, etc.); [ ] fine-tune an open-source model to better support paper ranking and stay updated with the latest research concepts. Extending and contributing: you may, and are encouraged to, modify the code in this repository to suit your personal needs; if you think your modifications would be in any way useful to others, please submit a pull request. These types of modifications include changes to the prompt, different language models, or additional ways the digest can be delivered to you. | ai arxiv arxiv-papers chatgpt llm gpt gpt4 ml | ai |
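A hedged shell sketch of the "running with a user interface" steps just described. The file names come from the README; the repository URL is inferred from the Hugging Face space name (AutoLLM/ArxivDigest) and the OPENAI_API_KEY variable name is an assumption.

```bash
# Run the ArxivDigest gradio UI locally (sketch).
git clone https://github.com/AutoLLM/ArxivDigest.git   # assumed URL, inferred from the HF space name
cd ArxivDigest
pip install -r src/requirements.txt gradio
cp .env.template .env     # optional: put OPENAI_API_KEY (and SendGrid secrets) here
python src/app.py         # open the printed local URL to preview papers and digests
```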
RPIPicoFreeRTOSSMPExp | RPIPicoFreeRTOSSMPExp: a FreeRTOS kernel SMP (https://www.freertos.org/symmetric-multiprocessing-introduction.html) example for the Raspberry Pi Pico (https://www.raspberrypi.com/products/raspberry-pi-pico/). This example project is used in my video tutorial on YouTube (https youtu be nd8xewjn 2w). Dependencies: FreeRTOS kernel (https://github.com/FreeRTOS/FreeRTOS-Kernel), version V202110.00-SMP; Pico SDK. Schematic: a schematic for the project and its components can be found in the schematic folder: 1x green LED, 1x blue LED, 4x red LEDs, 6x 75-ohm resistors, RPi Pico, breadboard, micro-USB cable. Cloning the repository: git clone --recurse-submodules https://github.com/jondurrant/RPIPicoFreeRTOSSMPExp. Build process (sketched below this row): cd RPIPicoFreeRTOSSMPExp; mkdir build; cd build; cmake ..; make; copy the binary to the Pico. Further FreeRTOS kernel examples: Udemy course "FreeRTOS on Raspberry Pi Pico" (https://www.udemy.com/course/freertos-on-rpi-pico/?referralCode=c5a9a19c93919a9da294). | | os |
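The clone and build commands above, collected into one runnable sequence. Everything here is from the README except the final copy step, where the .uf2 name and BOOTSEL procedure are standard Pico workflow rather than text from this repo.

```bash
# Clone and build the FreeRTOS SMP example for the Pico.
git clone --recurse-submodules https://github.com/jondurrant/RPIPicoFreeRTOSSMPExp
cd RPIPicoFreeRTOSSMPExp
mkdir build && cd build
cmake ..    # expects the Pico SDK and the bundled FreeRTOS kernel submodule
make
# Copy the built .uf2 binary to the Pico (hold BOOTSEL while plugging it in).
```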
mtcnn_facenet_tensorRT | MTCNN + FaceNet with TensorRT: recognize multiple faces on an NVIDIA Jetson Nano using TensorRT. This is the final project task of the Embedded System Design course at SKKU. GitHub addresses of the reference sources: https://github.com/davidsandberg/facenet and https://github.com/nwesem/mtcnn_facenet_cpp_tensorRT. Project result: (result images in the repository). Face recognition for NVIDIA Jetson Nano using TensorRT: face recognition with the Google FaceNet architecture (https://arxiv.org/abs/1503.03832) and a retrained model by David Sandberg (https://github.com/davidsandberg/facenet), using TensorRT and OpenCV. This project is based on the implementation of L2-norm helper functions, which are needed in the output layer of the FaceNet model (https://github.com/r7vme/tensorrt_l2norm_helper). Moreover, this project uses an adapted version of PKUZHOU's implementation (https://github.com/PKUZHOU/MTCNN_FaceDetection_TensorRT) of MTCNN for face detection (more info below). Hardware: NVIDIA Jetson Nano, Raspberry Pi v2 camera. If you want to use a USB camera instead of the RasPi camera, set the boolean isCSICam to false in src/main.cpp. Dependencies: CUDA 10.2, cuDNN 8.0, TensorRT 7.x, OpenCV 4.1.1, TensorFlow r1.14 (for Python, to convert the model from .pb to .uff). Update: this master branch now uses JetPack 4.4, so dependencies have slightly changed and TensorFlow is not preinstalled anymore; there is an extra step that takes a few minutes more than before. In case you would like to use older versions of JetPack, there is a tag jp4.2.2 that links to the older implementation. Installation: (1) install CUDA, cuDNN, TensorRT, and TensorFlow for Python. You can check the NVIDIA website (https://developer.nvidia.com) for help; installation procedures are very well documented. If you are using an NVIDIA Jetson Nano, TX1/2, or Xavier with JetPack 4.4, most needed packages should be installed if the Jetson was correctly flashed using the SDK Manager or the SD-card image; you will only need to install CMake, OpenBLAS, and TensorFlow: sudo apt install cmake libopenblas-dev. (2) Install TensorFlow. The following shows the steps to install TensorFlow for JetPack 4.4, copied from the official NVIDIA documentation (https://docs.nvidia.com/deeplearning/frameworks/install-tf-jetson-platform/index.html). I'm assuming you don't need to install it in a virtual environment; if you do, please refer to the documentation linked above, and if you are not installing this on a Jetson, please refer to the official TensorFlow documentation. Install the system packages required by TensorFlow: sudo apt update; sudo apt install libhdf5-serial-dev hdf5-tools libhdf5-dev zlib1g-dev zip libjpeg8-dev liblapack-dev libblas-dev gfortran. Install and upgrade pip3: sudo apt install python3-pip; sudo pip3 install -U pip testresources setuptools. Install the Python package dependencies: sudo pip3 install -U numpy==1.16.1 future==0.18.2 mock==3.0.5 h5py==2.10.0 keras_preprocessing==1.1.1 keras_applications==1.0.8 gast==0.2.2 futures protobuf pybind11. Install TensorFlow using pip3 (this command installs the latest version of TensorFlow compatible with JetPack 4.4): sudo pip3 install --pre --extra-index-url https://developer.download.nvidia.com/compute/redist/jp/v44 tensorflow. (3) Prune and freeze the TensorFlow model, or get the frozen model from the link. The inputs to the original model are an input tensor consisting of a single face or multiple faces, and a phase_train tensor telling all batch-normalisation layers that the model is not in train mode. Batch normalisation uses a switch layer to decide whether the model is currently being trained or just used for inference; this switch layer cannot be processed in TensorRT, which is why it needs to be removed. Apparently this can be done using freeze_graph from TensorFlow, but here is a link to a model where the phase_train tensor has already been removed from the saved model: https://github.com/apollo-time/facenet/raw/master/model/resnet/facenet.pb. (4) Convert the frozen protobuf (.pb) model to .uff: use the convert-to-uff tool, which is installed with the TensorFlow installation, to convert the .pb model to .uff. The script will replace unsupported layers with custom layers implemented by tensorrt_l2norm_helper (https://github.com/r7vme/tensorrt_l2norm_helper). Please check the file for the user-defined values and update them if needed; do not worry if there are a few warnings about the TRT l2norm-helper plugin. Run: cd path/to/project; python3 step01_pb_to_uff.py. You should now have a facenet.uff file in the facenetModels folder, which will be used as the input model to TensorRT. Get the MTCNN models: this repo uses an implementation by PKUZHOU (https://github.com/PKUZHOU/MTCNN_FaceDetection_TensorRT) of the Multi-Task Cascaded Convolutional Neural Network (MTCNN, https://arxiv.org/pdf/1604.02878.pdf) for face detection. The original implementation was adapted to return the bounding boxes such that they can be used as input to my FaceNet TensorRT implementation. You will need all models from that repo in the mtCNNModels folder, so please do this to download them: go to one directory above the project; git clone https://github.com/PKUZHOU/MTCNN_FaceDetection_TensorRT; then move the models into the mtCNNModels folder: mv MTCNN_FaceDetection_TensorRT/det* path/to/project/mtCNNModels. After doing so, you should have the following files in your mtCNNModels folder: det1_relu.caffemodel, det1_relu.prototxt, det2_relu.caffemodel, det2_relu.prototxt, det3_relu.caffemodel, det3_relu.prototxt, README.md. Done, you are ready to build the project. (5) Build the project (sketched below this row): mkdir build; cd build; cmake -DCMAKE_BUILD_TYPE=Release ..; make -j$(nproc). If not run on a Jetson platform, set the path to your CUDA and TensorRT installations using -DCUDA_TOOLKIT_ROOTDIR=path/to/cuda and -DTENSORRT_ROOT=path/to/tensorrt. Note: .uff and .engine files are GPU-specific, so if you want to run this project on a different GPU or on another machine, always start over at step 3 above. Usage: put images of people in the imgs folder (please only use images that contain one face). New feature: you can now add faces while the algorithm is running; when you see the OpenCV GUI, press "n" on your keyboard to add a new face. The camera input will stop until you have opened your terminal and typed in the name of the person you want to add. Run: ./mtcnn_facenet_cpp_tensorRT. Press "q" to quit and to show the stats (FPS). Note: this step might take a while when done the first time, as TensorRT parses and serializes the model from .uff to a runtime .engine file. Performance: on an NVIDIA Jetson Nano, 60 ms ± 20 ms for face detection using MTCNN and 22 ms ± 2 ms per face for FaceNet inference, about 15 FPS total; on an NVIDIA Jetson AGX Xavier, 40 ms ± 20 ms for MTCNN and 9 ms ± 1 ms per face for FaceNet inference, about 22 FPS total. License: please respect all licenses of OpenCV and of the data the machine-learning models (MTCNN and Google FaceNet) were trained on. FAQ: sometimes the camera driver doesn't close properly, which means you will have to restart the nvargus daemon: sudo systemctl restart nvargus-daemon. Info: Niclas Wesemann (niclas.wesemann@gmail.com). | | os |
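Step 5 above as a single shell sequence. The cmake flags, binary name, and key bindings are taken from the README text; only the placeholder paths are mine.

```bash
# Build and run the TensorRT face-recognition project (step 5 of the README).
cd /path/to/mtcnn_facenet_cpp_tensorRT
mkdir build && cd build
cmake -DCMAKE_BUILD_TYPE=Release ..
# Off-Jetson builds: point CMake at CUDA and TensorRT explicitly, e.g.
#   cmake -DCUDA_TOOLKIT_ROOTDIR=/path/to/cuda -DTENSORRT_ROOT=/path/to/tensorrt ..
make -j$(nproc)
./mtcnn_facenet_cpp_tensorRT   # 'n' adds a new face, 'q' quits and prints FPS stats
```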
Flatiron-OO-Ruby | Flatiron OO Ruby: Flatiron full-stack web development curriculum & labs, object-oriented Ruby. Overview: we'll introduce the concept of object-oriented programming (OOP). "An object-oriented approach to application development makes programs more intuitive to design, faster to develop, more amenable to modification, and easier to understand." (Object-Oriented Programming with Objective-C, Apple Inc.) It's natural to wonder: how can a string of ones and zeroes be referred to as an "object"? The use of the word "object" is an abstraction of thought; an object in code has no more physical form than does a word in any human language. Sure, words have physical representations: speaking a word causes air to vibrate in a sound wave, ink on a page can be shaped into symbols that represent the word, a meaning can be pointed at or mimed out. But none of these are the word itself. Human language is a system of abstraction: it communicates the idea of a thing, but not the thing itself. As humans, we're constantly faced with myriad facts and impressions that we must make sense of; to do so, we must abstract underlying structure away from surface details and discover the fundamental relations at work. Abstractions reveal causes and effects, expose patterns and frameworks, and separate what's important from what's not. Object orientation provides an abstraction of the data on which you operate; moreover, it provides a concrete grouping between the data and the operations you can perform with the data, in effect giving the data behavior. A code object representing a water pipe (instead of a smoking pipe) might contain values for length, diameter, material, and manufacturer; the bundling of these individual pieces of information together begins to form a larger whole. Object-oriented programming, however, does more than just bundle up individual pieces of data that represent a "thing"; it also bundles customized functions that can be performed on that data. These are called methods: behaviors that an object performs upon its internal data, and even upon other code objects. An object in code is a thing with all the data and all the logic required to complete a task; objects are models and metaphors for the problems we solve in code. Object-oriented programming was born from the trend of making digital lives reflect our real lives. In the 1970s, Adele Goldberg and Alan Kay developed an object-oriented language at Xerox PARC called Smalltalk, which was used in the first personal computer. Ruby comes with a few types of objects to get us started: things like Integer, String, Array, etc. We call these base types of objects "primitives". But what if we wanted to create a new type in our programming universe, a new kind of object for our code? That's what the class keyword and object orientation allow us to do. | object-oriented classes instance methods objects initialize modules inheritance gems scraping | front_end |
Deep_Learning_in_LangTech_course | Deep learning in human language technology: course materials for the University of Turku course TKO_8965 Deep Learning in Human Language Technology (previously named TKO_2101 Natural Language Processing). UTU Moodle page: https://moodle.utu.fi/course/view.php?id=18315. Feed-forward NNs and the BoW model (bag-of-words text classification with neural networks): in the lectures we work our way through basic neural network models, their training, and their application to classification. Notebooks: bag-of-words text classification (bow_classifier.ipynb), classifier word-vector analysis (bow_classifier_features.ipynb), word embeddings (word_embeddings.ipynb), BoW classifier with pretrained word embeddings (bow_classifier_embeddings_simpler.ipynb). Convolutional neural networks and their use in natural language processing: slides (convolutional_neural_networks.pptx); notebooks: sequence-to-label with CNNs (seq2label_conv.ipynb), CNN filter interpretation (cnn_filters.ipynb), PyTorch CNN model (cnn_model_pytorch.ipynb). Recurrent neural networks: introduction to recurrent neural networks and applications to various NLP tasks; slides (recurrent_neural_networks.pdf, long_short_term_memory.pdf); notebooks: text classification with RNNs (rnn_text_classification.ipynb), text generation with RNNs (rnn_text_generation.ipynb), named entity recognition with RNNs (rnn_named_entity_recognition.ipynb). Sequence-to-sequence and attention: encoder-decoder and sequence-to-sequence architectures and an introduction to neural attention; slides (sequence_to_sequence_and_attention.pdf); notebooks: sequence-to-sequence date normalization (seq2seq_dates.ipynb), sequence-to-sequence English-to-katakana translation (seq2seq_katakana.ipynb); see also the TensorFlow tutorial on neural machine translation with attention (https://www.tensorflow.org/tutorials/text/nmt_with_attention). Transformer and transfer learning: self-attention, the Transformer model, and deep transfer learning; slides (transformer_and_transfer_learning.pdf, deep_neural_language_models.pdf); notebooks: text classification with BERT (bert_text_classification.ipynb), sequence labeling with BERT (bert_sequence_labeling.ipynb). Applications and evaluation: NLP applications of neural networks and evaluation of NN models; notebook: sequence-to-sequence applications (seq2seq_applications.ipynb); slides: crosslingual sentence representations (crosslingual_sentence_representations.pdf), inference as benchmark (inference_as_benchmark.pdf); notebook: LASER and BERT embeddings (laser.ipynb); additional information on evaluation and paraphrase datasets (paraphrase_corpora.pdf). | | ai |
modern-web-scala | Modern Web Development with Scala: this is the companion code repository for the "Modern Web Development with Scala" book (https://leanpub.com/modern-web-development-with-scala) published on Leanpub. Starting point: the application built using the denisftw/play-scala-web-starter.g8 giter8 template (https://github.com/denisftw/play-scala-web-starter.g8) marks the initial state of a model application that is used throughout the book for demonstrating different aspects of web development (a bootstrap sketch follows this row). In particular, the book illustrates the following: querying remote services with the WSClient; using MacWire (https://github.com/adamw/macwire) for injecting dependencies; using ScalikeJDBC (http://scalikejdbc.org) for accessing PostgreSQL; integrating Play with React (https://facebook.github.io/react) and Webpack (https://webpack.github.io); working with JSON and using the Play forms API; using Akka (http://akka.io) actors; using HTTP filters and action composition in Play; testing with ScalaTest (http://www.scalatest.org) and ScalaMock (https://scalamock.org). Versions used: the code uses Play 2.8 and Scala 2.13, along with sbt 1.4. | | front_end |
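A hedged sketch of bootstrapping a project from the same giter8 template the book starts from. The template coordinates come from the README; `sbt new` is the standard way to expand a giter8 template, and the project name is a placeholder.

```bash
# Scaffold the book's starter application from the giter8 template (sketch).
sbt new denisftw/play-scala-web-starter.g8   # prompts for a project name
cd <your-project-name>
sbt run   # Play 2.8 / Scala 2.13 / sbt 1.4, per the "versions used" section
```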
Uber-Data-Engineering-Project-with-GCP-Modern-Tools | Uber data analytics project: this project aims to analyze the Uber dataset using various data engineering and analytics techniques. The project utilizes Google Cloud Platform (GCP) services and modern tools to process, transform, and visualize the data. Project steps: (1) create bucket: create a storage bucket in GCP to store the project files. (2) Create instance: set up a GCP compute instance with the following specifications: e2-standard-16 CPU (8-core CPU, 64 GB RAM). (3) SSH connection: establish an SSH connection to the GCP compute instance. (4) Run commands: execute the commands provided in the command.txt file from the GitHub repository to install Python 3 and the required libraries, including pandas, Mage AI, and the Google Cloud SDK (a sketch of this setup follows this row). (5) Start the Mage AI project: launch the Mage AI project on port 6789 and access it through the external IP provided by the GCP instance. (6) Update firewall rule: modify the firewall rule to allow access to the Mage AI dashboard via the external IP and port. (7) Create a data-loader pipeline: set up a data-loading pipeline to import the Uber dataset into the project. (8) Create a data-transformer pipeline: develop a data-transformation pipeline using a generic template, handling any errors or kernel overloads that may occur during the transformation process. (9) Connect to the data exporter: connect the data-transformation pipeline to the data-exporter module to export the transformed data to Google BigQuery. (10) Configure io_config.yaml: access the GCP APIs and services page and create a new service account; download the service-account key in JSON format and copy the JSON data into the io_config.yaml file (this step ensures secure access to GCP services). (11) Complete the data-loader pipeline: after completing the third pipeline, navigate to BigQuery and refresh the data to preview the connected data. (12) Data visualization: utilize Looker or Data Studio to create visualizations and analyze the Uber dataset. Hashnode blog post: check out my detailed blog post on this project on Hashnode (https://sohampatra.hashnode.dev/building-an-uber-data-engineering-project-with-gcp-and-modern-tools). Project dependencies: Python 3, pandas, Mage AI, Google Cloud SDK, google-cloud-bigquery. License: this project is licensed under the MIT license. | | cloud |
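A hedged sketch of steps 4-5 on the compute instance. The package list follows the README's dependencies; the exact contents of command.txt are not in the source, and "uber_project" is a placeholder project name.

```bash
# Install the tooling and start Mage AI on the GCP instance (sketch).
sudo apt-get update && sudo apt-get install -y python3 python3-pip
pip3 install pandas mage-ai google-cloud-bigquery
mage start uber_project   # serves the Mage dashboard on port 6789
# After updating the firewall rule, open http://<instance-external-ip>:6789
```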
WebDev-Projects | WebDev Projects: a repository for contributing web-development projects only, for Hacktoberfest. | hacktoberfest hactoberfest2021 html css javascript webdevelopment hacktoberfest2022 | front_end |
vision | Assignment 1: colony counting (http://cs.colby.edu/courses/F07/CS397/labs/lab02), due 2 October 2007. Overview: for this assignment we will be designing an algorithm for counting colonies on an agar plate. The colonies are of different sizes, mostly round, mostly separated on the plate, and reasonably homogeneous in color. Setup: download the data files from here. Note that some of the images are taken with the agar plate facing down, some with it facing up; you don't need to be able to process both, so pick an orientation and stick with it. On the down-facing plates, the colonies will tend to look flatter than they do on the up-facing plates. You will also want to download the files segment.c, which provides a connected-components algorithm, and vision.h, which provides a few useful data types. Put the .c file in your lib directory and add it to the list of files in the Makefile there; put the .h file in your include directory. You will also want to add an include statement to your programs to include vision.h in them. Feel free to modify vision.h to incorporate any work that you do. Procedure: the colony pictures for this assignment were taken on a red background. Using a suitable thresholding method, separate the colonies from the background; thresholds on the green and blue channels will likely work well enough. The output of this step should be a binary mask; represent the mask as an array (image) of unsigned chars. You can write out the mask to a file using the writePGM function. If you use 0 for the background and 255 for the foreground, then you will be able to visualize the mask easily. Write the following functions; each should take as arguments an unsigned char array (input image), the number of rows, the number of columns, and an unsigned char array (output image) to hold the output. grow: executes either 4- or 8-connected growing (your choice); if you want, you can add a parameter to indicate the number of grow operations to execute. shrink: executes either 4- or 8-connected shrinking (should be the opposite of your choice for growing); if you wish, add a parameter to indicate the number of shrinking operations to execute. median: executes a median filter on the input image. | | ai |
Human-Computer-Interaction | Human-computer interaction project. | | os |
CapybaraServer | CapybaraServer: engineering rodents in the cloud. | | cloud |
manipalplacements | manipalplacements: an open-sourced campus placement catalogue, a web application to keep track of all the companies visiting your campus; it needs to be set up and managed by an admin. Logo designs: (image in the repository). Tech stack and dependencies: this application is made on the following stack: Next.js with TypeScript, Ant Design, Tailwind CSS, SCSS, Figma for designs, Firebase for push notifications, MongoDB with Mongoose, localForage, Google Apps Script and Google Forms for inputting data. Installation (sketched below this row): refer to the example.env.local file (https://github.com/CanaryGrapher/manipalplacements/blob/main/example.env.local) for setting up .env.local for your project; Firebase details are not required if you do not want to include the push-notification feature (this is not required for local setup). (1) Clone the repo on your local system. (2) Install the dependencies using the npm install or yarn install commands in the root of your project; make sure that you are not using yarn version 2.x.x (berry), because as of September 12, 2021, TypeScript does not support plug'n'play. (3) Make sure that you have set up the environment variables properly in the .env.local file in the root directory. (4) Run the project using npm run dev. Guide to contribution: take a look at the issues (https://github.com/CanaryGrapher/manipalplacements/issues); this guide will be updated soon, as I am currently focusing on finishing the designs first. Designing: view the upcoming designs here (mail the creator for edit access to these designs): https www figma com file upok6515alcjwrguk7pt8c manipal placement node id 0 3a1. Preview: manipalplacements.vercel.app (screenshot in the repository). | | server |
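The installation steps above as one shell sequence. The repository URL and the example.env.local file name appear in the README; everything else is standard npm workflow.

```bash
# Local setup for manipalplacements (sketch).
git clone https://github.com/CanaryGrapher/manipalplacements.git
cd manipalplacements
cp example.env.local .env.local   # fill in the env vars; Firebase keys are optional locally
npm install                        # or `yarn install` (but not Yarn 2.x / berry)
npm run dev
```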
MLOPs | Houdini MLOPs 2.0: a free and open-source machine-learning plugin for Houdini, developed by Ambrosiussen Holding and Entagma (https://entagma.com), licensed and distributed by Bismuth Consultancy B.V. (https://www.bismuthconsultancy.com). By downloading or using the plugin or any of its contents, you are agreeing to the license found in this repository and the terms of service of Bismuth Consultancy B.V. Links: Paul Ambrosiussen and Entagma on Twitter, the project Discord, LinkedIn. Promo video: 2.0 release promo on YouTube. Installing for Houdini: to install the plugin for the first time, follow these steps (sketched below this row): (1) clone this repository and make note of the directory you have cloned it to; (2) copy the mlops.json file found in the repository root and paste it in the $HOUDINI_USER_PREF_DIR/packages folder; (3) edit the mlops.json file you just pasted and modify the MLOPS path found inside, setting the path to where you cloned the repository in step one; (4) install git, following the instructions for your OS (https://github.com/git-guides/install-git); (5) launch Houdini, open the MLOPs shelf, click the "Install Dependencies" shelf button, and restart Houdini once complete; (6) after restarting Houdini, open the MLOPs shelf and click the "Download Model" button; optionally change the model-name parameter to a custom model, or just leave it as is and hit download to work with the default Stable Diffusion model; (7) in the MLOPs nodes, use the dropdown on the model parameters to select a downloaded model to use. You can also provide a repo name from the Hugging Face library (https://huggingface.co/models?pipeline_tag=text-to-image&sort=downloads) and the nodes will download it for you, for example runwayml/stable-diffusion-v1-5. Downloading models: by default, MLOPS_SD_MODEL is the path to a single model used by all Stable Diffusion nodes; you can set this to your preferred default model. By default, the plugin will cache all downloaded models to the folder specified by MLOPS_MODELS (notice the "s" at the end), which will make them show up in the dropdowns for the model paths on the nodes. Both of the above variables can be changed in the mlops.json to suit your preference. Troubleshooting: if you get an error saying "Torch not compiled with CUDA enabled", uninstall PyTorch in your system Python, restart your PC, and hit the "Install Dependencies" shelf button again. Metal users should set the compute device on MLOPs nodes to "mps" and set the following environment variable for it to work: PYTORCH_ENABLE_MPS_FALLBACK=1. If you get an error with "could not load library cudnn_cnn_infer64_8.dll" and you have Octane installed as a plugin for Houdini, try disabling it and restarting Houdini. If you get an error similar to "unexpected self size 1 must be divisible by 4 to view byte as float different element sizes but got 2683502 (class RuntimeError)", try deleting the model cache in MLOPS_MODELS/cache for your model and try again; this is likely caused by a corrupt cache. Other plugins we know cause issues installing MLOPs dependencies: RenderMan, Octane. Disable these while installing MLOPs and its dependencies; after installing MLOPs and its dependencies you can re-enable them. If you get strange Python errors and you have tried several things already, make sure you don't have a conflicting PYTHONPATH environment variable set; if that is the case, remove it and restart Houdini (and the launcher, if you use it). Notes: we have provided a basic example file in this repo; you can find it in the hip folder. This plugin installs quite a few dependencies; you can find them in requirements.txt. Digital assets (HDAs) are stored and distributed in the expanded format; you can use hotl (https://www.sidefx.com/docs/houdini/ref/utils/hotl.html) to collapse them if need be. | | ai |
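Steps 1-3 of the install, sketched for a Unix-style shell. The file and folder names come from the README; the JSON key shown for the path is an assumption about mlops.json's structure.

```bash
# First-time MLOPs install, steps 1-3 (sketch).
git clone https://github.com/Bismuth-Consultancy-BV/MLOPs.git
cp MLOPs/mlops.json "$HOUDINI_USER_PREF_DIR/packages/"
# Edit the copied mlops.json so its MLOPs path variable points at the clone,
# e.g. a line like:  "MLOPS": "/path/to/MLOPs"   (key name assumed).
# Then in Houdini: MLOPs shelf -> "Install Dependencies" -> restart -> "Download Model".
```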
Insilico-Drug-Discovery-using-Deep-Learning | In-silico drug discovery using deep learning. Table explaining the different models and the respective architectures involved (updated October 2020): de novo drug synthesis using recurrent neural networks notebook (LSTMs, 3 layers; Keras); QBMG notebook (GRUs, 3 layers; PyTorch); DrugAgentRL notebook (agent and prior have the same structure, a clone of QBMG: GRUs, 3 layers; PyTorch); SMILES Transformer notebook (Transformer, 4 layers of encoder/decoder; PyTorch); molecular fingerprint MAT notebook (Molecule Attention Transformer: a Transformer modified to take adjacency and distance matrices; PyTorch); main notebook (Molecule Attention Transformer, LM using GRUs, molecular ECFP; PyTorch). July 2020, a simple approach to generating drug sequences using RNNs and Transformers: in this project we use LSTMs to generate drug sequences. The model generates SMILES sequences of drugs, much like a language model trained on Shakespeare's poems generates Shakespearean poems. The preprocessing and training have been documented in the notebook, which will be uploaded to GitHub. The idea used in this project is highly influenced by this paper: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5836943. Please go through the notebook on de novo drug synthesis using RNNs; it has an explanation and the reasoning behind every piece of code. Feel free to drop your opinions and any changes you feel would improve its performance. Update, August 2020: there have been considerable new changes made to the project, as follows. (1) SMILES generator: constructed a new model using GRUs with an embedding layer to overcome the shortcomings of the LSTM model. (2) Used reinforcement-learning techniques to generate novel drug sequences (SMILES format) guided by molecular fingerprints: this approach uses two models (the GRU-plus-embedding model from point 1), called the prior and the agent, both initialized with the weights obtained from training the SMILES generator. Using experience replay and an augmented log-likelihood loss, we fine-tune the agent to generate sequences (read: samples) with respect to a molecule (read: client). We perform scoring by comparing client and sample: we use a fingerprint generator (a 1024-dimensional vector), calculate the Tanimoto similarity (Jaccard similarity) between the two, and finally get a score in [0, 1] which we incorporate into our loss function to fine-tune the agent. Finally, we generate sequences (read: candidates) with a certain similarity (a hyperparameter we can tweak) with respect to the client molecule. Update, September 2020: new architectures like Transformers were explored to improve the model; however, owing to the limitations of computational power, I could not continue my research in that direction. To improve performance, I started working on the fingerprint generator. Graph neural networks were a better choice due to the homologous similarity between molecular structure and the architecture's structure; many different models were tried, and D-MPNNs (https://pubs.acs.org/doi/full/10.1021/acs.jcim.9b00237) were experimented with, and the research is still ongoing. (Figures: working of GNNs, and working of GNNs in D-MPNNs; pictures taken from the D-MPNN paper.) Since I needed to extract features from the sequences and compare them with another molecule, I tried applying autoencoders: comparing two latent representations would be similar to fingerprint similarity between two molecules. Transformers were used for this purpose: the SMILES Transformer was used, and its different layers were merged to get a 1024-dimensional vector for the comparison; the Transformer decoder would try to recreate the molecule from the latent vector output by the encoder. This approach performed similarly to the domain-specific molecular fingerprint, and there was no room for improvement until I thought of merging both architectures (graphs plus Transformers) to get the best of both worlds with the least trade-off: graph Transformers were explored. (Figure: SMILES Transformer working.) Transformers using graphs were also an ideal candidate for the job: the Molecule Attention Transformer (MAT, https://arxiv.org/abs/2002.08264) was experimented with, and the architecture was changed to yield a 1024-dimensional feature vector. Thankfully, the authors have provided pretrained weights, which were used to generate the 1024-dimensional feature vector. The research is still ongoing (currently paused due to other commitments); you can read the project report for more details on the project. (Figure: architecture of MAT, https://github.com/gmum/MAT.) Once this architecture is incorporated into the model, I am planning to construct a web API (similar to http://chemprop.csail.mit.edu/predict) to generate suitable drug candidates with respect to a client molecule, with a certain similarity hyperparameter. This project is open to further research; feel free to drop your opinions and changes you feel would improve its performance. Update: the MAT has been incorporated inside the main model. Currently, the two-stage model supports cosine similarity for scoring when you use MAT for fingerprint generation; if you wish to use the domain-specific RDKit Morgan fingerprint, Tanimoto similarity is used. Different similarity measures are being experimented with; MAT will also be changed in future iterations of the project. I have updated the main notebook, which has the code for the complete two-stage model; scripting is underway, and after it is done the notebook will be replaced with the final script. Cheers! Caveat: I have noticed that MAT is underperforming, owing to the lack of training on a relevant dataset corresponding to the client's molecule. Let's say the client is looking for cancer-related drug sequences: in order to give drugs specific to cancer, MAT needs to be trained on a certain downstream task like predicting anti-cancer properties (as a regression/classification problem), and the trained Transformer weights then used in the fingerprint generator to get a 1024-dimensional vector (by making a few changes, of course). MAT initially was made for classification/regression subproblems; using it as a fingerprint generator by modifying a few final layers is the magic here. :D | | ai |
nuttx | apache nuttx introduction community getting help mailing lists issue tracker source code website source code environments installing cygwin ubuntu bash under windows 10 using macos installation download and unpack semi optional apps package installation directories with spaces in the path downloading from repositories related repositories notes about header files configuring nuttx instantiating canned configurations refreshing configurations nuttx configuration tool finding selections in the configuration menus reveal hidden configuration options make sure that you are on the right platform comparing two configurations making defconfig files incompatibilities with older configurations nuttx configuration tool under dos toolchains cross development toolchains nuttx buildroot toolchain shells building nuttx building re building build targets and options native windows build installing gnuwin32 cygwin build problems strange path problems window native toolchain issues documentation introduction apache nuttx is a real time operating system rtos with an emphasis on standards compliance and small footprint scalable from 8 bit to 64 bit microcontroller environments the primary governing standards in nuttx are posix and ansi standards additional standard apis from unix and other common rtoss such as vxworks are adopted for functionality not available under these standards or for functionality that is not appropriate for deeply embedded environments such as fork extensive documentation can be found on the project wiki https cwiki apache org nuttx nuttx for brevity many parts of the documentation will refer to apache nuttx as simply nuttx community every volunteer project obtains its strength from the people involved in it we invite you to participate as much or as little as you choose we encourage you to use our project and provide feedback provide us with use cases report bugs and submit patches contribute code or documentation getting help the best place to get help is the developer s mailing list please see the following section mailing lists get help using nuttx or contribute to the project on our mailing lists dev nuttx apache org is for people who want to contribute code to nuttx to subscribe send an email to dev subscribe nuttx apache org to unsubscribe send an email to dev unsubscribe nuttx apache org view the archives at https www mail archive com dev nuttx apache org commits nuttx apache org is a read only list that notifies subscribers about commit messages and patches to nuttx to subscribe send an email to commits subscribe nuttx apache org to unsubscribe send an email to commits unsubscribe nuttx apache org view the archives at https www mail archive com commits nuttx apache org reporting security issues found a vulnerability see our security policy here github security md issue tracker bug reports found bug send an email to the dev list dev nuttx apache org before submitting an issue please verify that the bug does in fact exist search the mailing list archives to verify there is no existing issue reporting the bug you ve found consider tracking down the bug yourself in the nuttx source code and submitting a patch along with your bug report this is a great time saver for the nuttx developers and helps ensure the bug will be fixed quickly feature requests enhancement requests for new features are also welcome the more concrete and rational the request is the greater the chance it will incorporated into future releases source code the project sources are in two git repositories 
the core os is in nuttx and the apps repository is in nuttx apps these are housed in gitbox on asf servers and also mirrored at github these are kept in sync so you can use whichever option you prefer nuttx core os repository primary https gitbox apache org repos asf p nuttx git github mirror https github com apache nuttx apps repository primary https gitbox apache org repos asf p nuttx apps git github mirror https github com apache nuttx apps website source code the project website sources are accessible via the website source code repository which is also mirrored in github primary https gitbox apache org repos asf p nuttx website git github mirror https github com apache nuttx website environments nuttx requires a posix development environment such as you would find under linux or macos nuttx may also be installed and built on windows system if you also provide such a posix development environment options for a posix development environment under windows include an installation of linux on a virtual machine vm in windows i have not been happy using a vm myself i have had stability problems with open source vms and commercial vms cost more than i want to spend sharing files with linux running in a vm is awkward sharing devices connected to the windows box with linux in a vm is at the very least confusing using windows tools such as segger j link with files built under the linux vm is not a possibility the cygwin environment instructions for installation of cygwin on a windows system are provided in the following paragraph installing cygwin cygwin is a mature well tested and very convenient environment it is especially convenient if you need to integrate with windows tools and files downsides are that the installation time is very long and the compile times are slow ubuntu bash shell under windows 10 this is a new option under windows 10 see the section ubuntu bash under windows 10 below this is an improvement over cygwin if your concern is compile time its build performance is comparable to native linux certainly better than the cygwin build time it also installs in a tiny fraction of the time as cygwin perhaps 20 minutes for the basic ubuntu install vs more than a day for the complete cygwin install there have been even more recent ports of linux environment to windows i need to update this section to include some mention of these alternatives the msys environment msys derives from an older version of cygwin simplified and adapted to work more naturally in the windows environment see http www mingw org wiki msys if you are interested in using msys the advantages of the msys environment is that it is better integrted with the native windows environment and lighter weight it uses only a minimal number of add on posix land tools the download link in that wiki takes you to the sourceforge download site the sourceforge msys project has been stagnant for some time the msys project has more recently moved to http odsn net projects sfnet mingwbundle downloads of current zip files are available there but no instructions for the installation msys2 appears to be a re write of msys based on a newer version of cygwin is it available at https www msys2 org a windows installer is available at that site along with very good installation instructions the download is relatively quick at least compared to cygwin and the pacman package management tool supports supports simple system updates for example pacman s git will install the git command line utilities other posix environments check out unxutils 
https sourceforge net projects unxutils https en wikipedia org wiki unxutils mobaxterm https mobaxterm mobatek net gow https github com bmatzelle gow wiki disclaimer in principle these should work however i have never used any of these environments and cannot guarantee that there is not some less than obvious issues nuttx can also be installed and built on a native windows system but with some potential tool related issues see the discussion native windows build under building nuttx below gnuwin32 is used to provide compatible native windows tools installing cygwin installing cygwin on your windows pc is simple but time consuming see http www cygwin com for installation instructions basically you just need to download a tiny setup exe program and it does the real network installation for you some cygwin installation tips 1 install at c cygwin 2 install everything only the minimal base packages from the cygwin distribution are installed by default clicking on categories and packages in the setup exe package installation screen will provide you with the ability to control what is installed or updated clicking on the default field next to the all category will provide you with the opportunity to install every cygwin package be advised that this will download and install hundreds of megabytes to your computer if you use the default installation you will be missing many of the cygwin utilities that you will need to build nuttx the build will fail in numerous places because of missing packages note the last time i installed everything the download was about 5gib the server i selected was also very slow so it took over a day to do the whole install note you don t really have to install everything but i cannot answer the question then what should i install i don t know the answer to that and so will continue to recommend installing everything you should certainly be able to omit science math and publishing you can try omitting kde gnome gtk and other graphics packages if you don t plan to use them perhaps a minimum set would be those packages listed below for the ubuntu bash under windows 10 installation update sergey frolov had success with the following minimal cygwin configuration 1 after starting the cygwin installer keep the recommended packages that are pre selected in the default configuration 2 using the installation tools add the following packages make gnu make bison libgmp3 dev gcc core byacc libmpfr dev gcc g gperf libmpc dev flex gdb automake 1 15 libncurses dev libgmp dev curl after installing cygwin you will get lots of links for installed tools and shells i use the rxvt native shell it is fast and reliable and does not require you to run the cygwin x server which is neither fast nor reliable unless otherwise noted the rest of these instructions assume that you are at a bash command line prompt in either linux or in cygwin shell using msys msys is an environment the derives from cygwin thus most things said about cygwin apply equally to msys this section will then focus on the differences when using msys specifically msys2 here is it assumed that you have already downloaded and installed msys2 from https www msys2 org using the windows installer available at that location it is also assumed that you have brought in the necessary tools using the pacman package management tool tools needed including pacman s git pacman s make pacman s gcc pacman s gdb and possibly others depending upon your usage then you will need to build and install kconfig frontends per the instructions of the top 
level readme txt file in the tools repository this requires the following additional tools pacman s bison pacman s curl pacman s gperf pacman s ncurses devel pacman s automake wrapper pacman s autoconf pacman s pkg config because of some versioning issues i had to run aclocal prior to running the kconfig frontends configure script see configuring nuttx below for further information unlike cygwin msys does not support symbolic links the ln s command will in fact copy a directory this means that you make defs file will have to include definitions like ifeq config windows msys y dirlink topdir tools copydir sh dirunlink topdir tools unlink sh endif this will force the directory copies to work in a way that can be handled by the nuttx build system note the default link sh script has been updated so that is should now be msys2 compatible the above is preferred but no longer necessary in the make defs file to build the simulator under msys you also need pacman s zlib devel it appears that you cannot use directory names with spaces in them like c program files 86 in the msys path variable i worked around this by create windows junctions like this 1 open the a windows command terminal 2 cd to c msys64 then 3 mklink j programfiles c program files and 4 mklink j programfiles86 c program files x86 they then show up as programfiles and programfiles86 with the msys2 sandbox those paths can then be used with the path variable i had to do something similar for the path to the gnu tools arm embedded toolchain which also has spaces in the path name ubuntu bash under windows 10 a better version of a command line only ubuntu under windows 10 beta has recently been made available from microsoft installation installation instructions abound on the internet complete with screen shots i will attempt to duplicate those instructions in full here here are the simplified installation steps open settings click on update security click on for developers under use developer features select the developer mode option to setup the environment to install bash a message box should pop up click yes to turn on developer mode after the necessary components install you ll need to restart your computer once your computer reboots open control panel click on programs click on turn windows features on or off a list of features will pop up check the windows subsystem for linux beta option click ok once the components installed on your computer click the restart now button to complete the task after your computer restarts you will notice that bash will not appear in the recently added list of apps this is because bash isn t actually installed yet now that you have setup the necessary components use the following steps to complete the installation of bash open start do a search for bash exe and press enter on the command prompt type y and press enter to download and install bash from the windows store this will take awhile then you ll need to create a default unix user account this account doesn t have to be the same as your windows account enter the username in the required field and press enter you can t use the username admin close the bash exe command prompt now that you completed the installation and setup you can open the bash tool from the start menu like you would with any other app accessing windows files from ubuntu file systems will be mounted under mnt so for example c program files appears at mnt c program files this is as opposed to cygwin where the same directory would appear at cygdrive c program files with these 
differences perhaps a few other windows quirks the ubuntu install works just like ubuntu running natively on your pc a good tip for file sharing is to use symbolic links within your ubuntu home directory for example suppose you have your projects directory at c documents projects then you can set up a link to the projects directory in your ubuntu directory like ln s mnt c documents projects projects accessing ubuntu files from windows in ubuntu userspace for windows the ubuntu file system root directory is at localappdata lxss rootfs or c users username appdata local lxss rootfs however i am unable to see my files under the rootfs home directory after some looking around i find the home directory localappdata lxss home with that trick access to the home directory you should actually be able to use windows tools outside of the ubuntu sandbox with versions of nuttx built within the sandbox using that path executing windows tools from ubuntu you can also execute windows tools from within the ubuntu sandbox mnt c program files x86 microchip xc32 v1 43 bin xc32 gcc exe version unable to translate current working directory using c windows system32 xc32 gcc exe microchip technology 4 8 3 mplab xc32 compiler v1 43 build date mar 1 2017 the error message indicates that there are more issues you cannot mix windows tools that use windows style paths in an environment that uses posix paths i think you would have to use linux tools only from within the ubuntu sandbox install ubuntu software use sudo apt get install package name as examples this is how you would get git sudo apt get install git this will get you a compiler for your host pc sudo apt get install gcc this will get you an arm compiler for your target sudo apt get install gcc arm none eabi note that is just an example i am not sure if apt get will give you a current or usable compiler you should carefully select your toolchain for the needs of your project you will also need to get the kconfig frontends configuration as described below under nuttx configuration tool in order to build the kconfig frontends configuration tool you will also need make gperf flex bison and libncurses dev that is enough to do a basic nuttx build integrating with windows tools if you want to integrate with windows native tools then you would need deal with the same kind of craziness as with integrating cygwin with native toolchains see the section cygwin build problems below however there is currently no build support for using windows native tools with ubuntu under windows this tool combination is made to work with cygwin through the use of the cygpath w tool that converts paths from say cydrive c program files to c program files there is however no corresponding tool to convert mnt c program files in the ubuntu environment graphics support the ubuntu version support by microsoft is a command line only version there is no support for linux graphics utilities this limitation is not a limitation of ubuntu however only in what microsoft is willing to support if you install a x server then you can also use basic graphics utilities see for example http www howtogeek com 261575 how to run graphical linux desktop applications from windows 10s bash shell many linux graphics programs would however also require a graphics framework like gtk or qt so this might be a trip down the rabbit hole using macos you need to install at least the following tools specific to macos flock used by appdir build logic a macos port is available at https github com discoteq flock brew tap 
Integrating with Windows Tools

If you want to integrate with Windows native tools, then you would need to deal with the same kind of craziness as with integrating Cygwin with native toolchains; see the section "Cygwin Build Problems" below. However, there is currently no build support for using Windows native tools with Ubuntu under Windows. This tool combination is made to work with Cygwin through the use of the cygpath -w tool that converts paths from, say, /cygdrive/c/Program Files to C:\Program Files. There is, however, no corresponding tool to convert /mnt/c/Program Files in the Ubuntu environment.

Graphics Support

The Ubuntu version supported by Microsoft is a command-line-only version. There is no support for Linux graphics utilities. This limitation is not a limitation of Ubuntu, however, only in what Microsoft is willing to support. If you install an X server, then you can also use basic graphics utilities. See, for example:

  http://www.howtogeek.com/261575/how-to-run-graphical-linux-desktop-applications-from-windows-10s-bash-shell/

Many Linux graphics programs would, however, also require a graphics framework like GTK or Qt, so this might be a trip down the rabbit hole.

Using macOS

You need to install at least the following tools specific to macOS:

  - flock: Used by the appdir build logic. A macOS port is available at https://github.com/discoteq/flock:

      brew tap discoteq/discoteq
      brew install flock

  - If you want to build the sim: Xcode (the native compiler and the rest of the toolchain).

  - ELF toolchain, if you want to build modules for CONFIG_LIBC_MODLIB:

      brew install x86-64-elf-gcc

INSTALLATION

There are two ways to get NuttX: You may download released, stable tarballs from either the project website, or you may get NuttX by cloning the git repositories. Let's consider the released tarballs first.

Download and Unpack

Download and unpack the NuttX tarball. If you are reading this, then you have probably already done that. After unpacking, you will end up with a directory called nuttx-<version>, where <version> is the NuttX version number. You might want to rename that directory to nuttx to match the various instructions in the documentation and some scripts in the source tree.

  Download location:         https://nuttx.apache.org/download/
  Legacy download locations: https://bitbucket.org/nuttx/nuttx/downloads
                             https://sourceforge.net/projects/nuttx/files/nuttx/

Semi-Optional apps/ Package

All NuttX libraries and example code used to be included within the NuttX source tree. As of NuttX 6.0, this application code was moved into a separate tarball: the apps tarball. If you are just beginning with NuttX, then you will want to download the versioned apps tarball along with the NuttX tarball. If you already have your own product application directory, then you may not need the apps tarball.

It is called "semi-optional" because if you don't have some apps/ directory, NuttX will fail to build! You do not necessarily need to use the NuttX apps tarball, but may instead provide your own custom application directory. Such a custom directory would need to include a valid Makefile to support the build and a valid Kconfig file to support the configuration. More about these files later.

Download then unpack the apps tarball in the same directory where you unpacked the NuttX tarball. After you unpack the apps tarball, you will have a new directory called apps-<version>, where the <version> should exactly match the version of the NuttX tarball. Again, you might want to rename the directory to simply apps/ to match what you read in the documentation.

After unpacking (and renaming) the apps tarball, you will have two directories side by side like this:

  nuttx/
  apps/

This is important because the NuttX build will expect to find the apps directory in that (default) location. (That default location can be changed by modifying your NuttX configuration file, but that is another story.)

Installation Directories with Spaces in the Path

The NuttX build directory should reside in a path that contains no spaces in any higher-level directory name. For example, under Cygwin, your home directory might be formed from your first and last names, like /home/First Last. That will cause strange errors when the make system tries to build. (Actually, that problem is probably not too difficult to fix; some Makefiles probably just need some paths within double quotes.)

I work around spaces in the home directory name by creating a new directory that does not contain any spaces, such as /home/nuttx. Then I install NuttX in /home/nuttx and always build from /home/nuttx/nuttx-code.
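For example, the tarball steps above, collected into one shell session (the version number and .tar.gz naming here are purely illustrative):

  tar zxf nuttx-9.1.0.tar.gz
  tar zxf apps-9.1.0.tar.gz
  mv nuttx-9.1.0 nuttx
  mv apps-9.1.0 apps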
Downloading from Repositories

Cloning the Repository

BEFORE cloning repositories on any Windows platform, do the following git command:

  git config --global core.autocrlf false

That will avoid conversions of linefeeds (newlines, \n) to carriage return plus linefeed sequences (\r\n).

The current NuttX du jour is available from a git repository. Here are instructions for cloning the core NuttX RTOS (corresponding to the nuttx tarball discussed above):

  git clone https://gitbox.apache.org/repos/asf/nuttx.git nuttx

or

  git clone https://github.com/apache/nuttx.git nuttx

And the semi-optional apps/ application directory can be cloned like:

  git clone https://gitbox.apache.org/repos/asf/nuttx-apps.git apps

or

  git clone https://github.com/apache/nuttx-apps.git apps

That will give you the same directory structure like this:

  nuttx/
  apps/

Configuring the Clones

The following steps need to be performed for each of the repositories. After changing to the clone directory:

Set your identity:

  git config --global user.name "My Name"
  git config --global user.email my.name@example.com

Colorized diffs are much easier to read:

  git config --global color.branch auto
  git config --global color.diff auto
  git config --global color.interactive auto
  git config --global color.status auto

Check out other settings:

  git config --list

Cloning NuttX inside Cygwin

If you are cloning the NuttX repository, it is recommended to avoid automatic end-of-line conversions by git. These conversions may break some scripts, like configure.sh. Before cloning, do the following:

  git config --global core.autocrlf false

Related Repositories

These are standalone repositories:

  - https://gitbox.apache.org/repos/asf/nuttx-apps or https://github.com/apache/nuttx-apps.git

    This directory holds an optional package of applications and libraries that can be used with the NuttX RTOS. There is a README.txt file there that will provide more information about that package.

  - https://bitbucket.org/nuttx/nxwidgets

    This is the NuttX C++ graphics support. This includes NxWM, the tiny NuttX window manager.

  - https://bitbucket.org/nuttx/uclibc

    This repository contains a version of the uClibc++ C++ library. This code originates from http://cxx.uclibc.org and has been adapted for NuttX by the RGMP team (http://rgmp.sourceforge.net/wiki/index.php/Main_Page).

  - https://bitbucket.org/nuttx/buildroot

    An environment that you can use to build a custom NuttX GNU toolchain.

  - https://bitbucket.org/nuttx/tools

    There are snapshots of some tools here that you will need to work with NuttX: kconfig-frontends, genromfs, and others.
NOTES ABOUT HEADER FILES

Other C-Library Header Files

When a GCC toolchain is built, it must be built against a C library. The compiler together with the contents of the C library completes the C language definition and provides the complete C development environment. NuttX provides its own, built-in C library. So the complete, consistent C language definition for use with NuttX comes from the combination of the compiler and the header files provided by the NuttX C library.

When a GCC toolchain is built, it incorporates the C library header files into the compiler internal directories and, in this way, the C library really becomes a part of the toolchain. If you use the NuttX buildroot toolchain, as described below under "NuttX Buildroot Toolchain", your GCC toolchain will build against the NuttX C library and will incorporate the NuttX C library header files as part of the toolchain.

If you use some other, third-party toolchain, this will not be the case, however. Those toolchains were probably built against some other, incompatible C library distribution (such as newlib). Those tools will have incorporated the incompatible C library header files as part of the toolchain. These incompatible header files must not be used with NuttX because they will conflict with definitions in the NuttX built-in C library. For such toolchains that include header files from a foreign C library, NuttX must be compiled without using the standard header files that are distributed with your toolchain. This prevents including conflicting, incompatible header files (such as stdio.h).

math.h and stdarg.h are probably the two most troublesome header files to deal with. These troublesome header files are discussed in more detail below.

Header Files Provided by Your Toolchain

Certain header files, such as setjmp.h, stdarg.h, and math.h, may still be needed from your toolchain, however, and your compiler may not be able to find these if you compile NuttX without using standard header files (i.e., with -nostdinc). If that is the case, one solution is to copy those header files from your toolchain into the NuttX include directory.

Duplicated Header Files

There are also a few header files that can be found in the nuttx/include directory which are duplicated by the header files from your toolchain. stdint.h and stdbool.h are examples. If you prefer to use the stdint.h and stdbool.h header files from your toolchain, those could be copied into the nuttx/include directory. Using most other header files from your toolchain would probably cause errors.

math.h

Even though you should not use a foreign C library, you may still need to use other, external libraries with NuttX. In particular, you may need to use the math library, libm.a. NuttX supports a generic, built-in math library that can be enabled using CONFIG_LIBM=y. However, you may still want to use a higher-performance external math library that has been tuned for your CPU. Sometimes such tuned math libraries are bundled with your toolchain.

The math library header file, math.h, is then a special case. If you do nothing, the standard math.h header file that is provided with your toolchain will be used.

If you have a custom, architecture-specific math.h header file, then that header file should be placed at arch/<cpu>/include/math.h. There is a stub math.h header file located at include/nuttx/lib/math.h. This stub header file can be used to "redirect" the inclusion to an architecture-specific math.h header file. If you add an architecture-specific math.h header file, then you should also define CONFIG_ARCH_MATH_H=y in your NuttX configuration file. If CONFIG_ARCH_MATH_H is selected, then the top-level Makefile will copy the stub math.h header file from include/nuttx/lib/math.h to include/math.h, where it will become the system math.h header file. The stub math.h header file does nothing other than include that architecture-specific math.h header file as the system math.h header file.

float.h

If you enable the generic, built-in math library, then that math library will expect your toolchain to provide the standard float.h header file. The float.h header file defines the properties of your floating-point implementation. It would always be best to use your toolchain's float.h header file, but if none is available, a default float.h header file will be provided if this option is selected. However, there is no assurance that the settings in this float.h are actually correct for your platform!

stdarg.h

In most cases, the correct version of stdarg.h is the version provided with your toolchain. However, sometimes there are issues with using your toolchain's stdarg.h. For example, it may attempt to draw in header files that do not exist in NuttX, or perhaps the header files that it uses are not compatible with the NuttX header files. In those cases, you can use an architecture-specific stdarg.h header file by defining CONFIG_ARCH_STDARG_H=y. See the discussion above for the math.h header; this setting works exactly the same for the stdarg.h header file.
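For example, assuming architecture-specific math.h and stdarg.h header files have been provided at arch/<arch-name>/include/, the corresponding selections in the .config file would be just:

  CONFIG_ARCH_MATH_H=y
  CONFIG_ARCH_STDARG_H=y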
CONFIGURING NUTTX

Instantiating "Canned" Configurations

configure.sh and configure.bat

"Canned" NuttX configuration files are retained in:

  boards/<arch-name>/<chip-name>/<board-name>/configs/<config-dir>

Where <board-name> is the name of your development board and <config-dir> is the name of the sub-directory containing a specific configuration for that board. <arch-name> and <chip-name> refer to characteristics of the MCU used on the board: <arch-name> is the CPU architecture implemented by the MCU; <chip-name> identifies the MCU chip family.

Only a few steps are required to instantiate a NuttX configuration, but to make the configuration even easier there are scripts available in the tools/ sub-directory that combine those simple steps into one command.

There is one tool for use with any Bash-like shell that does the configuration steps. It is used as follows:

  tools/configure.sh <board-name>/<config-dir>

There is an alternative Windows batch file that can be used in the Windows native environment like:

  tools\configure.bat <board-name>\<config-dir>

And, to make sure that other platforms are supported, there is also a C program at tools/configure.c that can be compiled to establish the board configuration.

See tools/README.txt for more information about these scripts. General information about configuring NuttX can be found in:

  <topdir>/boards/README.txt
  <topdir>/boards/<arch-name>/<chip-name>/<board-name>/README.txt

The Hidden Configuration Scripts

As mentioned above, there are only a few simple steps to instantiating a NuttX configuration. Those steps are hidden by the configuration scripts, but are summarized below:

  1. Copy Files

     Configuring NuttX requires only copying two files from the <config-dir> to the directory where you installed NuttX (<topdir>):

     Copy boards/<arch-name>/<chip-name>/<board-name>/configs/<config-dir>/Make.defs to <topdir>/Make.defs, OR copy boards/<arch-name>/<chip-name>/<board-name>/scripts/Make.defs to <topdir>/Make.defs.

     Make.defs describes the rules needed by your toolchain to compile and link code. You may need to modify this file to match the specific needs of your toolchain. NOTE: A configuration may have its own, unique Make.defs file in its configuration directory, OR it may use a common Make.defs file for the board in the scripts/ directory. The first takes precedence.

     Copy boards/<arch-name>/<chip-name>/<board-name>/configs/<config-dir>/defconfig to <topdir>/.config.

     The defconfig file holds the actual build configuration. This file is included by all other make files to determine what is included in the build and what is not. This file is also used to generate a C configuration header at include/nuttx/config.h.

     Copy other, environment-specific files to <topdir>. This might include files like .gdbinit or IDE configuration files like .project or .cproject.

  2. Refresh the Configuration

     New configuration settings may be added or removed. Existing settings may also change their values or options. This must be handled by refreshing the configuration as described below.

     NOTE: NuttX uses only compressed defconfig files. For the NuttX defconfig files, this refreshing step is not optional; it is also necessary to uncompress and regenerate the full .config file. This is discussed further below.
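For example, instantiating a configuration from a Bash shell might look like this (this particular board/configuration pairing is illustrative):

  cd nuttx
  tools/configure.sh stm32f4discovery/nsh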
Refreshing Configurations

Configurations can get out of date. As new configuration settings are added or removed, or as dependencies between configuration settings change, the contents of a default configuration can become out of synch with the build systems. Hence, it is a good practice to refresh each configuration after configuring and before making. To refresh the configuration, use the NuttX Configuration Tool like this:

  make oldconfig

AFTER you have instantiated the NuttX configuration as described above. The configuration step copied the .config file into place in the top-level NuttX directory; the 'make oldconfig' step will then operate on that .config file to bring it up to date.

If your configuration is out of date, you will be prompted by 'make oldconfig' to resolve the issues detected by the configuration tool, that is, to provide values for the new configuration options in the build system. Doing this can save you a lot of problems down the road due to obsolete settings in the default board configuration file. The NuttX Configuration Tool is discussed in more detail in the following paragraph.

Confused about what the correct value for a new configuration item should be? Enter '?' in response to the 'make oldconfig' prompt and it will show you the help text that goes with the option.

If you don't want to make any decisions and are willing to just accept the recommended default value for each new configuration item, an even easier way is:

  make olddefconfig

The olddefconfig target will simply bring your configuration up to date with the current Kconfig files, setting any new options to the default value. No questions asked.

NuttX Configuration Tool

An automated tool has been incorporated to support re-configuration of NuttX. This tool is based on the kconfig-frontends application available at https://bitbucket.org/nuttx/tools/src/master/kconfig-frontends (this is a snapshot of the old http://ymorin.is-a-geek.org/projects/kconfig-frontends, which is no longer available). This application provides a tool called kconfig-mconf that is used by the NuttX top-level Makefile. The following make target is provided:

  make menuconfig

This make target will bring up the NuttX configuration menus.

WARNING: Never do 'make menuconfig' on a configuration that has not been converted to use the kconfig-frontends tools! This will damage your configuration. See https://cwiki.apache.org/confluence/display/NUTTX/Converting+Legacy+Configurations+to+Use+kconfig-mconf

NuttX also supports kconfiglib (https://github.com/ulfalizer/Kconfiglib) by default, which is a Kconfig tool implemented in Python 2/3. Compared with kconfig-frontends, kconfiglib provides NuttX with the possibility of multi-platform support (configure NuttX in Windows native, Visual Studio), and kconfiglib also has a stronger Kconfig syntax check; this will help developers avoid some Kconfig syntax errors. Install kconfiglib via the following command:

  pip install kconfiglib

If you are working on Windows, you will also need windows-curses support:

  pip install windows-curses

NOTE: It should be noted that kconfiglib does not support "modules" attributes (https://github.com/ulfalizer/Kconfiglib/blob/master/kconfiglib.py#L3239-L3254; the community seems to have stopped updating). If a feature depends on CONFIG_BUILD_LOADABLE, kconfiglib may not be a good choice.

How do we tell a new configuration from an old one? See "Incompatibilities with Older Configurations" below.
The menuconfig make target depends on two things:

  1. The Kconfig configuration data files that appear in almost all NuttX directories. (These data files are the part that is still under development; patches are welcome!) The Kconfig files contain configuration information for the configuration settings relevant to the directory in which the Kconfig file resides.

     NOTE: For a description of the syntax of this configuration file, see kconfig-language.txt in the tools repository at https://bitbucket.org/nuttx/tools

  2. The kconfig-mconf tool. kconfig-mconf is part of the kconfig-frontends package. You can download that package from the snapshot in the tools repository at https://bitbucket.org/nuttx/tools

Building kconfig-frontends under Linux may be as simple as 'configure; make; make install', but there may be some build complexities, especially if you are building under Cygwin. See the more detailed build instructions in the top-level README.txt file of the tools repository at https://bitbucket.org/nuttx/tools

The 'make install' step will, by default, install the kconfig-mconf tool at /usr/local/bin/mconf. Wherever you choose to install kconfig-mconf, make certain that your PATH variable includes a path to that installation directory.
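Under Linux, the typical sequence might therefore look like the following (the exact configure options vary; see the tools repository README.txt for details):

  cd kconfig-frontends
  ./configure
  make
  sudo make install
  export PATH=/usr/local/bin:$PATH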
The kconfig-frontends tools will not build in a native Windows environment directly "out of the box". For the Windows native case, you can use the modified version of kconfig-frontends that can be found at

  http://uvc.de/posts/linux-kernel-configuration-tool-kconfig-under-windows.html

or a more recent port that can be found at

  http://reclonelabs.com/more-kconfig-awesomeness-for-windows/

The basic configuration order is "bottom-up":

  - Select the build environment,
  - Select the processor,
  - Select the board,
  - Select the supported peripherals,
  - Configure the device drivers,
  - Configure the application options on top of this.

This is pretty straightforward for creating new configurations, but may be less intuitive for modifying existing configurations.

Another ncurses-based tool that is an option to kconfig-mconf is kconfig-nconf. The differences are primarily in the aesthetics of the UI. If you have kconfig-nconf built, then you can invoke that front end with:

  make nconfig

If you have an environment that supports the Qt or GTK graphical systems (probably KDE or GNOME, respectively, or Cygwin under Windows with Qt or GTK installed), then you can also build the graphical kconfig-frontends, kconfig-qconf and kconfig-gconf. In these cases, you can start the graphical configurator with either:

  make qconfig

or

  make gconfig

Some keyboard shortcuts are supported by kconfig-mconf, the tool that runs when you do 'make menuconfig':

  - '?' will bring up the mconf help display.
  - '/' can be used to find configuration selections.
  - 'Z' can be used to reveal hidden configuration options.

These last two shortcuts are described further in the following paragraphs.

Finding Selections in the Configuration Menus

The NuttX configuration options have gotten complex, and it can be very difficult to find options in the menu trees if you are not sure where to look. The basic configuration order described above can help to narrow things down. But if you know exactly what configuration setting you want to select, say CONFIG_XYZ, but not where to find it, then the 'make menuconfig' version of the tool offers some help: By pressing the '/' key, the tool will bring up a menu that will allow you to search for a configuration item. Just enter the string CONFIG_XYZ and press ENTER. It will show you not only where to find the configuration item, but also all of the dependencies related to the configuration item.

Reveal Hidden Configuration Options

If you type 'Z', then kconfig-mconf will change what is displayed. Normally, only enabled features that have all of their dependencies met are displayed. That is, of course, not very useful if you would like to discover new options, or if you are looking for an option and do not realize that its dependencies have not yet been selected (and hence it is not displayed). But if you enter 'Z', then every option will be shown, whether or not its dependencies have been met. You can then see everything that could be selected with the right dependency selections. These additional options will be shown with '-' for the selection and for the value (since they cannot be selected and have no value). About all you can do is select the <Help> option to see what the dependencies are.

Make Sure That You Are on the Right Platform

Saved configurations may run on Linux, Cygwin (32- or 64-bit), or other platforms. The platform characteristics can be changed using 'make menuconfig'. Sometimes this can be confusing due to the differences between the platforms. Enter sethost.sh: sethost.sh is a simple script that changes a configuration to your host platform. This can greatly simplify life if you use many different configurations. For example, if you are running on Linux and you configure like this:

  tools/configure.sh <board>/<configuration>

then you can use the following command to both (1) make sure that the configuration is up to date, and (2) set the configuration up correctly for Linux:

  tools/sethost.sh -l

Or, if you are on a Windows/Cygwin 64-bit platform:

  tools/sethost.sh -c

Or, for MSYS/MSYS2:

  tools/sethost.sh -g

Other options are available from the help option built into the script. You can see all options with:

  tools/sethost.sh -h

Recently, the options to the configure.sh and configure.bat scripts have been extended so that you can both set up the configuration, select the host platform that you use, and uncompress and refresh the defconfig file, all in one command, like:

  tools/configure.sh -l <board>/<configuration>

for a Linux host, or, for a Windows/Cygwin host:

  tools/configure.sh -c <board>/<configuration>

Other options are available from the help option built into the script. You can see all options with:

  tools/configure.sh -h

Comparing Two Configurations

If you try to compare two configurations using 'diff', you will probably not be happy with the result. There are superfluous things added to the configuration files that make comparisons with the human eye difficult.

There is a tool at nuttx/tools/cmpconfig.c that can be built to simplify these comparisons. The output from this difference tool will show only the meaningful differences between two configuration files. This tool is built as follows:

  cd nuttx/tools
  make -f Makefile.host

This will create a program called cmpconfig (or cmpconfig.exe on Windows).

Why would you want to compare two configuration files? Here are a few of the reasons why I do this:

  1. When I create a new configuration, I usually base it on an older configuration, and I want to know what are the options that I need to change to add the new feature to the older configuration. For example, suppose that I have a boardA/nsh configuration and I want to create a boardA/nxwm configuration. Suppose I already have boardB/nsh and boardB/nxwm configurations. Then, by comparing the boardB/nsh with the boardB/nxwm, I can see the modifications that I would need to make to my boardA/nsh to create a new boardA/nxwm.

  2. But the most common reason that I use the cmpconfig program is to check the results of refreshing a configuration with 'make oldconfig' (see the paragraph "Refreshing Configurations" above). The 'make oldconfig' command will make changes to my configuration and, using cmpconfig, I can see precisely what those changes were and whether any should be of concern to me.

  3. The cmpconfig tool can also be useful when converting older, legacy, manual configurations to the current configurations based on the kconfig-frontends tools. See the following paragraph.
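For example, to see what 'make oldconfig' changed, you might save a copy of the .config before refreshing and then compare (the file names and the two-argument calling convention shown here are illustrative):

  cp .config config.before
  make oldconfig
  cd tools
  make -f Makefile.host
  ./cmpconfig ../config.before ../.config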
Making defconfig Files

.config Files as defconfig Files

The minimum defconfig file is simply the generated .config file with the CONFIG_APPS_DIR setting removed or commented out. That setting provides the name and location of the apps/ directory relative to the NuttX build directory. The default is ../apps. However, the apps/ directory may be in any other location and may have a different name. For example, the apps/ directory name for versioned NuttX releases is always in the form apps-xx.yy, where xx.yy is the version number.

Finding the apps/ Directory Path

When the default configuration is installed using one of the scripts or programs in the NuttX tools/ directory, there will be an option to provide the path to the apps/ directory. If not provided, then the configure tool will look around and try to make a reasonable decision about where the apps/ directory is located.

Compressed defconfig Files

The Makefile also supports an option to generate very small defconfig files. The .config files are quite large and complex, but most of the settings in the .config file simply have the default settings from the Kconfig files. These .config files can be converted into a small defconfig file:

  make savedefconfig

That make target will generate a defconfig file in the top-level directory. The size reduction is really quite remarkable:

  wc -l .config defconfig
    1085 .config
      82 defconfig
    1167 total

In order to be usable, the .config file installed from the compressed defconfig file must be reconstituted using:

  make olddefconfig

NOTE 1: Only compressed defconfig files are retained in the NuttX repository. All patches and PRs that attempt to add or modify a defconfig file MUST use the compressed defconfig format as created by 'make savedefconfig'.

NOTE 2: When 'make savedefconfig' runs, it will try several things, some of which are expected to fail. In these cases you will see an error message from make followed by "(ignored)". You should also ignore these messages.

CAUTION: This size reduction was accomplished by removing all settings from the .config file that were at the default value. 'make olddefconfig' can regenerate the original .config file by simply restoring those default settings. The underlying assumption here is, of course, that the default settings do not change. If the default settings change (and they often do!), then the original .config may not be reproducible. So, if your project requires 100% reproducibility over a long period of time, you may want to save the complete .config files vs. the standard, compressed defconfig file.

Configuring with Compressed defconfig Files

As described above, all NuttX defconfig files are compressed using 'make savedefconfig'. These compressed defconfig files are generally not fully usable as they are, and may not build the target binaries that you want, because the compression process removed all of the default settings from the defconfig file. To restore the default settings, you should run the following after configuring:

  make olddefconfig

That will restore the missing defaulted values.

Using this command after configuring is generally a good practice anyway: Even if the defconfig files are not compressed in this fashion, the defconfig file may be old, and the only way to assure that the installed .config is up to date is via 'make oldconfig' or 'make olddefconfig'. See the paragraph above entitled "Refreshing Configurations" for additional information.
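A typical round trip when updating a board configuration might then look like this (the destination path follows the boards/ layout described earlier; the names are placeholders):

  make menuconfig        # modify the configuration
  make savedefconfig     # compress it
  cp defconfig boards/<arch-name>/<chip-name>/<board-name>/configs/<config-dir>/defconfig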
Incompatibilities with Older Configurations

WARNING: The current NuttX build system supports only the new, compressed defconfig configuration files generated using the kconfig-frontends tools, as described in the preceding section. Support for the older, legacy, manual configurations was eliminated in NuttX 7.0; support for uncompressed .config files as defconfig files was eliminated after NuttX 7.21. All configurations must now be done using the kconfig-frontends tool.

The older manual configurations and the new kconfig-frontends configurations are not compatible. Old, legacy configurations can NOT be used with the kconfig-frontends tool and, hence, cannot be used with releases of NuttX 7.0 and beyond. If you run 'make menuconfig' with a legacy configuration, the resulting configuration will probably not be functional.

  Q: How can I tell if a configuration is a new kconfig-frontends configuration or an older, manual configuration?
  A: Only old, manual configurations will have an appconfig file.

  Q: How can I convert an older, manual configuration into a new kconfig-frontends configuration?
  A: Refer to https://cwiki.apache.org/confluence/display/NUTTX/Converting+Legacy+Configurations+to+Use+kconfig-mconf

WARNING: As described above, whenever you use a configuration, you really should always refresh the configuration with the following command before you make NuttX:

  make oldconfig

or

  make olddefconfig

This will make sure that the configuration is up to date in the event that it has lapsed behind the current NuttX development (see the paragraph "Refreshing Configurations" above). But this only works with new configuration files created with the kconfig-frontends tools.

Further, this step is NOT optional with the new, compressed defconfig files; it is a necessary step that will also uncompress the defconfig file, regenerating the .config and making it usable for NuttX builds.

Never do 'make oldconfig' (or 'make menuconfig') on a configuration that has not been converted to use the kconfig-frontends tools! This will damage your configuration. See https://cwiki.apache.org/confluence/display/NUTTX/Converting+Legacy+Configurations+to+Use+kconfig-mconf

NuttX Configuration Tool under DOS

Recent versions of NuttX support building NuttX from a native Windows console window (see "Native Windows Build" below). But kconfig-frontends is a Linux tool. At one time this was a problem for Windows users, but now there are two specially modified versions of the kconfig-frontends tools that can be used. One can be found here:

  http://uvc.de/posts/linux-kernel-configuration-tool-kconfig-under-windows.html

The configuration steps of the most recent versions of NuttX require the kconfig-tweak tool that is not available in the above. However, there has been an update to these Kconfig Windows tools that does include kconfig-tweak:

  http://reclonelabs.com/more-kconfig-awesomeness-for-windows/

Source code is available here: https://github.com/reclone/kconfig-frontends-win32 and https://github.com/reclone/kconfig-frontends-win32/releases

It is also possible to use the version of kconfig-frontends built under Cygwin outside of the Cygwin "sandbox" in a native Windows environment:

  1. You can run the configuration tool using Cygwin. However, the Cygwin Win.mk will complain, so to do this you will have to manually edit the .config file:

     a. Delete the line: CONFIG_WINDOWS_NATIVE=y
     b. Change the apps/ directory path, CONFIG_APPS_DIR, to use Unix-style delimiters. For example, change "..\apps" to "../apps".

     And of course, after you use the configuration tool, you need to restore CONFIG_WINDOWS_NATIVE=y and the correct CONFIG_APPS_DIR.

  2. You can, with some effort, run the Cygwin kconfig-mconf tool directly in the Windows console window. In this case, you do not have to modify the .config file, but there are other complexities:

     a. You need to temporarily set the Cygwin directories in the PATH variable, then run kconfig-mconf manually, like:

          kconfig-mconf Kconfig

        There is a Windows batch file at tools/kconfig.bat that automates these steps:

          tools\kconfig.bat menuconfig
     b. There is an issue with accessing DOS environment variables from the Cygwin kconfig-mconf running in the Windows console. The following change to the top-level Kconfig file seems to work around these problems:

          config APPSDIR
                  string
                  option env="APPSDIR"
                  default "../apps"

TOOLCHAINS

Cross-Development Toolchains

In order to build NuttX for your board, you will have to obtain a cross-compiler to generate code for your target CPU. For each board configuration, there is a README.txt file at boards/<arch-name>/<chip-name>/<board-name>/README.txt. That README file contains suggestions and information about appropriate tools and development environments for use with your board.

In any case, the PATH environment variable will need to be updated to include the location where the build can find the toolchain binaries.

NuttX Buildroot Toolchain

For many configurations, a DIY set of tools is available for NuttX. These tools can be downloaded from the NuttX Bitbucket.org file repository. After unpacking the buildroot tarball, you can find instructions for building the tools in the buildroot/boards/README.txt file.

Check the README.txt file in the configuration directory for your board to see if you can use the buildroot toolchain with your board. This README.txt file is located in boards/<arch-name>/<chip-name>/<board-name>/README.txt.

This toolchain is available for both the Linux and Cygwin development environments.

Advantages: (1) NuttX header files are built into the toolchain, and (2) related support tools like the NXFLAT tools, the ROMFS genromfs tool, and the kconfig-frontends tools can be built into your toolchain.

Disadvantages: This toolchain is not as well supported as some other toolchains. GNU tools are not my priority, and so the buildroot tools often get behind. For example, until recently there was no EABI support in the NuttX buildroot toolchain for ARM.

NOTE: For Cortex-M3/4, there are OABI and EABI versions of the buildroot toolchains. If you are using the older OABI toolchain, the prefix for the tools will be arm-nuttx-elf-; for the EABI toolchain, the prefix will be arm-nuttx-eabi-. If you are using the older OABI toolchain with an ARM Cortex-M3/4, you will need to set CONFIG_ARM_TOOLCHAIN_BUILDROOT_OABI in the .config file in order to pick the right tool prefix.

If the make system ever picks the wrong prefix for your toolchain, you can always specify the prefix on the command line to override the default, like:

  make CROSSDEV=arm-nuttx-elf-

Shells

The NuttX build relies on some shell scripts. Some are inline in the Makefiles, and many are executable scripts in the tools/ directory. The scripts were all developed using bash, and many contain bash shell dependencies.

Most of the scripts begin with #!/bin/bash to specifically select the bash shell. Some still have #!/bin/sh, but I haven't heard any complaints, so these must not have bash dependencies.

There are two shell issues that I have heard of:

  1. Linux where /bin/sh refers to an incompatible shell (like ksh or csh).

     In this case, bash is probably available, and the #!/bin/bash at the beginning of the file should do the job. If any scripts with #!/bin/sh fail, try changing that to #!/bin/bash and let me know about the change.

  2. FreeBSD with the Bourne shell and no bash shell.

     The other, reverse case has also been reported on FreeBSD setups that have the Bourne shell, but not bash. In this case, #!/bin/bash fails but #!/bin/sh works okay. My recommendation in this case is to create a symbolic link at /bin/bash that refers to the Bourne shell. There may still be issues, however, with certain bash-centric scripts that will require modifications.
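On such a FreeBSD system, that recommendation amounts to something like the following (the location of the Bourne shell may differ):

  ln -s /bin/sh /bin/bash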
BUILDING NUTTX

Building

NuttX builds in place in the source tree. You do not need to create any special build directories. Assuming that your Make.defs is set up properly for your toolchain and that the PATH environment variable contains the path to where your cross-development tools are installed, the following steps are all that are required to build NuttX:

  cd <topdir>
  make

At least one configuration (eagle100) requires additional command line arguments on the make command. Read <topdir>/boards/<arch-name>/<chip-name>/<board-name>/README.txt to see if that applies to your target.

Re-building

Re-building is normally simple: just type make again.

But there are some things that can "get you" when you use the Cygwin development environment with Windows native tools. The native Windows tools do not understand Cygwin's symbolic links, so the NuttX make system does something weird: It copies the configuration directories instead of linking to them (it could, perhaps, use the NTFS mklink command, but it doesn't).

A consequence of this is that you can easily get confused when you edit a file in one of the "linked" (i.e., copied) directories, re-build NuttX, and then not see your changes when you run the program. That is because the build is still using the version of the file in the copied directory, not your modified file!

Older versions of NuttX did not support dependencies in this configuration. So a simple workaround for this annoying behavior in that case was the following when you re-build:

  make clean_context all

This make command will remove all of the copied directories, re-copy them, then make NuttX.

However, more recent versions of NuttX do support dependencies for the Cygwin build. As a result, the above command will cause everything to be rebuilt (because it removes, and will cause recreating, the include/nuttx/config.h header file). A much less graceful, but still effective, command in this case is the following for the ARM configuration:

  rm -rf arch/arm/src/chip arch/arm/src/board

This "kludge" simply removes the copied directories. These directories will be re-created when you do a normal make, and your edits will then be effective.

BUILD TARGETS AND OPTIONS

Build Targets

Below is a summary of the build targets available in the top-level NuttX Makefile:

  all
    The default target builds the NuttX executable in the selected output formats.

  clean
    Removes derived object files, archives, executables, and temporary files, but retains the configuration and context files and directories.

  distclean
    Does 'clean', then also removes all configuration and context files. This essentially restores the directory structure to its original, unconfigured state.

Application housekeeping targets: The APPDIR variable refers to the user application directory. A sample apps/ directory is included with NuttX; however, this is not treated as part of NuttX and may be replaced with a different application directory. For the most part, the application directory is treated like any other build directory in the Makefile script. However, as a convenience, the following targets are included to support housekeeping functions in the user application directory from the NuttX build directory:

  apps_clean
    Perform the clean operation only in the user application directory.

  apps_distclean
    Perform the distclean operation only in the user application directory. (The apps/.config file is preserved so that this is not a "full" distclean, but more of a configuration "reset" for the application directory.)

  export
    The export target will package the NuttX libraries and header files into an exportable package. Caveats: (1) This needs some extension for the KERNEL build;
    (2) The logic in tools/mkexport.sh only supports GCC and, for example, explicitly assumes that the archiver is 'ar'.

  flash (or download, deprecated)
    This is a helper target that will rebuild NuttX and flash it to the target system in one step. The operation of this target depends completely upon implementation of the FLASH command in the user Make.defs file. It will generate an error if the FLASH command is not defined.

The following targets are used internally by the make logic, but can be invoked from the command line under certain conditions, if necessary:

  depend
    Create build dependencies. (NOTE: There is currently no support for build dependencies under Cygwin using Windows-native toolchains.)

  context
    The context target is invoked on each target build to assure that NuttX is properly configured. The basic configuration steps include the creation of the config.h and version.h header files in the include/nuttx directory and the establishment of symbolic links to configured directories.

  clean_context
    This is part of the distclean target. It removes all of the header files and symbolic links created by the context target.

Build Options

Of course, the value of any make variable can be overridden from the make command line. However, there is one particular variable assignment option that may be useful to you:

  V=1
    This is the build "verbosity flag". If you specify V=1 on the make command line, you will see the exact commands used in the build. This can be very useful when adding new boards or tracking down compile-time errors and warnings. (Contributed by Richard Cochran.)
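For example, to capture a fully verbose build in a log file (the use of tee here is just one convenient way to do that):

  make V=1 2>&1 | tee build.log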
NATIVE WINDOWS BUILD

The beginnings of a Windows native build are in place, but it is still not often used as of this writing. The build was functional, but because of lack of use you may find some issues to be resolved with this build configuration.

The Windows native build logic is initiated if CONFIG_WINDOWS_NATIVE=y is defined in the NuttX configuration file. This build: (1) uses all Windows-style paths, and (2) uses primarily Windows batch commands from cmd.exe, with a few extensions from GNUWin32.

In this build, you cannot use a Cygwin or MSYS shell. Rather, the build must be performed in a Windows console window. A better terminal than the standard-issue cmd.exe terminal is ConEmu, which can be downloaded from https://sourceforge.net/projects/conemu/ or https://conemu.github.io/.

Build Tools

The build still relies on some Unix-like commands. I use the GNUWin32 tools that can be downloaded from http://gnuwin32.sourceforge.net/ using the "Download all" selection. Individual packages can be downloaded instead if you know what you are doing and want a faster download. (No, I can't tell you which packages you should or should not download.)

NOTE: It should be possible to use Cygwin or MSYS2 in place of the GNUWin32 tools. There are, however, complexities in doing that because those tools depend on the shell environment and use DLLs that are not found (at least not without the correct setup).

Host Compiler

I use the MinGW GCC compiler, which can be downloaded from http://www.mingw.org/. If you are using GNUWin32, then it is recommended that you not install the optional MSYS components, as there may be conflicts.

Kconfig-frontends

See the section entitled "NuttX Configuration Tool under DOS" for information about installing the kconfig-frontends tools to run natively under Windows.

This capability should still be considered a work in progress because (1) it has not been verified on all targets and tools, and (2) it still lacks some of the creature comforts of the more mature environments.

Installing GNUWin32

The Windows native build will depend upon a few Unix-like tools that can be provided either by MSYS or GNUWin32. The GNUWin32 tools are available from http://gnuwin32.sourceforge.net/. GNUWin32 provides ports of tools with a GPL or similar open source license to modern MS-Windows (Microsoft Windows 2000/XP/2003/Vista/2008/7). See http://gnuwin32.sourceforge.net/packages.html for a list of all of the tools available in the GNUWin32 package.

The SourceForge project is located here: http://sourceforge.net/projects/gnuwin32/. The project is still being actively supported (although some of the Windows ports have gotten very old).

Some commercial toolchains include a subset of the GNUWin32 tools in the installation. My recommendation is that you download the GNUWin32 tools directly from the SourceForge.net website so that you will know what you are using and can reproduce your build environment.

GNUWin32 Installation Steps

The following steps will download and execute the GNUWin32 installer.

  1. Download GetGnuWin32-x.x.x.exe from http://sourceforge.net/projects/getgnuwin32/files/. This is the installer. The current version as of this writing is 0.6.3.

  2. Run the installer.

  3. Accept the license.

  4. Select the installation directory. My recommendation is the directory that contains this README file (<this-directory>).

  5. After running GetGnuWin32-0.x.x.exe, you will have a new directory <this-directory>/GetGnuWin32.

     Note that the GNUWin32 installer didn't install GNUWin32. Instead, it installed another, smarter downloader. That downloader is the GNUWin32 package management tool developed by the Open SSL project.

The following steps probably should be performed from inside a DOS shell:

  6. Change to the directory created by GetGnuWin32-x.x.x.exe:

       cd GetGnuWin32

  7. Execute the download.bat script. The download.bat script will download about 446 packages! Enough to have a very complete Linux-like environment under the DOS shell. This will take a while. This step only downloads the packages; the next step will install the packages:

       download

  8. This step will install the downloaded packages. The argument of the install.bat script is the installation location. C:\gnuwin32 is the standard install location:

       install C:\gnuwin32

     NOTE: This installation step will install ALL GNUWin32 packages, far more than you will ever need. If disc space is a problem for you, you might need to perform a manual installation of the individual ZIP files that you will find in the <this-directory>/GetGnuWin32/packages directory.

  9. Make sure that you add the GNUWin32 tools to your path variable:

       set PATH=C:\gnuwin32\bin;%PATH%

WARNING: Make sure you have C:\MinGW\bin in your path before any other directory that contains libiconv-2.dll. Apparently, the as.exe in some MinGW distributions is dependent on that DLL, and having an old version of it in the path somewhere (for example, the GNUWin32 tools) will cause as.exe to pick up the older version that doesn't have the entry point it's looking for.
CYGWIN BUILD PROBLEMS

Performance

Build performance under Cygwin is really not so bad (certainly not as good as a Linux build, however). Often, however, you will find that the performance is not just bad but terrible. If you are seeing awful performance, like two or three compilations per second, the culprit is usually your Windows anti-virus protection interfering with the build tool program execution.

I use Cygwin quite often, and I use Windows Defender. In order to get good build performance, I routinely keep the Windows Defender "Virus & Threat Protection Settings" screen up: I disable "Real-time protection" just before entering make, then turn "Real-time protection" back on when the build completes. With this additional nuisance step, I find that build performance under Cygwin is completely acceptable.

Strange Path Problems

If you see strange behavior when building under Cygwin, then you may have a problem with your PATH variable. For example, if you see failures to locate files that are clearly present, that may mean that you are using the wrong version of a tool. For example, you may not be using Cygwin's make program at /usr/bin/make. Try:

  which make
  /usr/bin/make

When you install some toolchains (such as Yagarto or CodeSourcery tools), they may modify your PATH variable to include a path to their binaries. At that location, they may have GNUWin32 versions of the tools. So you might actually be using a version of make that does not understand Cygwin paths.

The solution is either:

  1. Edit your PATH to remove the path to the GNUWin32 tools, or
  2. Put /usr/local/bin, /usr/bin, and /bin at the front of your path:

       export PATH=/usr/local/bin:/usr/bin:/bin:$PATH

Windows Native Toolchain Issues

There are many popular Windows native toolchains that may be used with NuttX. Examples include CodeSourcery (for Windows), devkitARM, and several vendor-provided toolchains. There are several limitations with using a Windows-based toolchain in a Cygwin environment. The three biggest are:

  1. The Windows toolchain cannot follow Cygwin paths. Path conversions are performed automatically in the Cygwin makefiles using the 'cygpath' utility, but you might easily find some new path problems. If so, check out 'cygpath -w'.

  2. Windows toolchains cannot follow Cygwin symbolic links. Many symbolic links are used in NuttX (e.g., include/arch). The make system works around these problems for the Windows tools by copying directories instead of linking them. But this can also cause some confusion for you: For example, you may edit a file in a "linked" directory and find that your changes had no effect. That is because you are building the copy of the file in the "fake" symbolic directory. If you use a Windows toolchain, you should get in the habit of making like this:

       make clean_context all

     An alias in your .bashrc file might make that less painful. The rebuild is not as long as you might think because there is no dependency checking if you are using a native Windows toolchain. That brings us to #3:

  3. General Pre-built Toolchain Issues

To continue with the list of Windows native toolchain issues, we can add the following. These, however, are really just issues that you will have if you use any pre-built toolchain (versus building the NuttX toolchain from the NuttX buildroot package): There may be incompatibilities with header files, libraries, and compiler built-in functions, detailed below. For the most part, these issues are handled in the existing make logic. But if you are breaking new ground, then you may encounter these:

  1. Header Files. Most pre-built toolchains will build with a foreign C library (usually newlib, but maybe uClibc or glibc if you are using a Linux toolchain). This means that the header files from the foreign C library will be built into the toolchain. So, if you include stdio.h, you will get the stdio.h from the incompatible, foreign C library and not the NuttX stdio.h (at nuttx/include/stdio.h) that you wanted. This can cause confusion in the builds, and you must always be sure that -nostdinc is included in the CFLAGS. That will assure that you take the include files only from the NuttX include directories.
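If in doubt, the compiler itself can show you exactly which include directories it will search. A quick check with a generic ARM GCC (the toolchain name here is illustrative) looks like this; the directories are printed in the "#include <...> search starts here:" section of the output:

  echo | arm-none-eabi-gcc -E -Wp,-v - 2>&1 | grep '^ '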
  2. Libraries. What was said above for header files applies to libraries: You do not want to include code from the libraries of any foreign C libraries built into your toolchain. If this happens, you will get perplexing errors about undefined symbols. To avoid these errors, you will need to add -nostdlib to your CFLAGS. That will assure that you only take code from the NuttX libraries.

     This, however, may cause other issues for libraries in the toolchain that you do want (like libgcc.a or libm.a). These are special-cased in most Makefiles, but you could still run into issues of missing libraries.

  3. Built-Ins. Some compilers target a particular operating system. Many people would, for example, like to use the same toolchain to develop Linux and NuttX software. Compilers built for other operating systems may generate incompatible built-in logic and, for this reason, -fno-builtin should also be included in your C flags.

     And, finally, you may not be able to use NXFLAT.

  4. NXFLAT. If you use a pre-built toolchain, you will lose all support for NXFLAT. NXFLAT is a binary format described in Documentation/NuttXNxFlat.html. It may be possible to build standalone versions of the NXFLAT tools; there are a few examples of this in the buildroot repository at https://bitbucket.org/nuttx/buildroot. However, it is possible that there could be interoperability issues with your toolchain, since they will be using different versions of binutils and possibly different ABIs.

Building Original Linux Boards in Cygwin

Some default board configurations are set to build under Linux and others to build under Windows with Cygwin. Various default toolchains may also be used in each configuration. It is possible to change the default setup. Here, for example, is what you must do in order to compile a default Linux configuration in the Cygwin environment using the CodeSourcery for Windows toolchain. After instantiating a "canned" NuttX configuration, run the target 'menuconfig' and set the following items:

  Build Setup -> Build Host Platform -> Windows
  Build Setup -> Windows Build Environment -> Cygwin
  System Type -> Toolchain Selection -> CodeSourcery GNU Toolchain under Windows

In Windows 7, it may be required to open the Cygwin shell as Administrator ("Run As" option, right mouse button) if you find errors like "Permission denied".

Recovering from Bad Configurations

Many people make the mistake of configuring NuttX with the "canned" configuration and then just typing 'make' with disastrous consequences: The build may fail with mysterious, uninterpretable, and irrecoverable build errors. If, for example, you do this with an unmodified Linux configuration in a Windows/Cygwin environment, you will corrupt the build environment. The environment will be corrupted because of POSIX vs. Windows path issues and with issues related to symbolic links.

If you make the mistake of doing this, the easiest way to recover is to just start over: Do 'make distclean' to remove every trace of the corrupted configuration, reconfigure from scratch, and make certain that you set the configuration correctly for your platform before attempting to make again. Just fixing the configuration file after you have instantiated the bad configuration with 'make' is not enough.

DOCUMENTATION

Additional information can be found in the Documentation/ directory and also in README files that are scattered throughout the source tree. The documentation is in HTML and can be accessed by loading the following file into your web browser:

  Documentation/index.html

NuttX documentation is also available online at https://nuttx.apache.org/.

Below is a guide to the available README files in the NuttX source tree:

nuttx arch arm src common readme lwl console txt lpc214x readme txt stm32l4 readme txt renesas include readme txt src readme txt x86 include readme txt src readme txt z80 src z80 readme txt z180 readme txt z180
mmu txt readme txt audio readme txt boards arm a1x pcduino a10 readme txt am335x beaglebone black readme txt c5471 c5471evm readme txt cxd56xx spresense readme txt dm320 ntosd dm320 doc readme txt readme txt efm32 efm32 g8xx stk readme txt efm32gg stk3700 readme txt olimex efm32g880f128 stk readme txt imx6 sabre 6quad readme txt imxrt imxrt1050 evk readme txt imxrt1060 evk readme txt teensy 4 x readme txt kinetis freedom k28f readme txt freedom k64f readme txt freedom k66f readme txt kwikstik k40 readme txt teensy 3 x readme txt twr k60n512 readme txt twr k64f120m readme txt kl freedom kl25z readme txt freedom kl26z readme txt teensy lc readme txt lc823450 lc823450 xgevk readme txt lpc17xx 40xx lincoln60 readme txt lpc4088 devkit readme txt lpc4088 quickstart readme txt lpcxpresso lpc1768 readme txt lx cpu readme txt mbed readme txt mcb1700 readme txt olimex lpc1766stk readme txt open1788 readme txt pnev5180b readme txt u blox c027 readme txt zkit arm 1769 readme txt lpc214x mcu123 lpc214x readme txt zp214xpa readme txt lpc2378 olimex lpc2378 readme txt lpc31xx ea3131 readme txt ea3152 readme txt olimex lpc h3131 readme txt lpc43xx bambino 200e readme txt lpc4330 xplorer readme txt lpc4337 ws readme txt lpc4357 evb readme txt lpc4370 link2 readme txt lpc54xx lpcxpresso lpc54628 readme txt max326xx max32660 evsys readme txt moxart moxa nrf52 nrf52 generic readme txt nuc1xx nutiny nuc120 readme txt s32k1xx s32k118evb readme txt s32k146evb readme txt s32k148evb readme txt sam34 arduino due readme txt flipnclick sam3x readme txt sam3u ek readme txt sam4cmp db readme txt sam4e ek readme txt sam4l xplained readme txt sam4s xplained readme txt sam4s xplained pro readme txt sama5 sama5d2 xult readme txt giant board readme md sama5d3x ek readme txt sama5d3 xplained readme txt sama5d4 ek readme txt samd2l2 arduino m0 readme txt samd20 xplained readme txt samd21 xplained readme txt saml21 xplained readme txt samd5e5 metro m4 readme txt samv7 same70 qmtech readme txt same70 xplained readme txt samv71 xult readme txt stm32 axoloti readme txt b g474e dpow1 readme txt clicker2 stm32 readme txt cloudctrl readme txt emw3162 readme txt fire stm32v2 readme txt hymini stm32v readme txt maple readme txt mikroe stm32f4 readme txt nucleo f103rb readme txt nucleo f207zg readme txt nucleo f302r8 readme txt nucleo f303re readme txt nucleo f303ze readme txt nucleo f334r8 readme txt nucleo f410rb readme txt nucleo f446re readme txt nucleo f4x1re readme txt nucleo l152re readme txt olimexino stm32 olimex stm32 e407 readme txt olimex stm32 h405 readme txt olimex stm32 h407 readme txt olimex stm32 p107 olimex stm32 p207 readme txt olimex stm32 p407 readme txt omnibusf4 readme txt photon readme txt shenzhou readme txt stm32 tiny readme txt stm3210e eval readme txt stm3220g eval readme txt stm3240g eval readme txt stm32butterfly2 stm32f103 minimum readme txt stm32f334 disco readme txt stm32f3discovery readme txt stm32f411e disco readme txt stm32f429i disco readme txt stm32f4discovery readme txt stm32ldiscovery readme txt stm32vldiscovery readme txt viewtool stm32f107 readme txt stm32f0l0g0 b l072z lrwan1 readme txt nucleo f072rb readme txt nucleo f091rc readme txt nucleo g070rb readme txt nucleo g071rb readme txt nucleo l073rz readme txt stm32f051 discovery readme txt stm32f072 discovery readme txt stm32f7 nucleo 144 readme txt stm32f746g disco configs fb readme txt configs nxdemo readme txt configs nxterm readme txt readme txt stm32f746 ws stm32f769i disco readme txt stm32h7 nucleo h743zi readme txt stm32l4 b l475e 
iot01a readme txt nucleo l432kc readme txt nucleo l452re readme txt nucleo l476rg readme txt nucleo l496zg readme txt stm32l476 mdk readme txt stm32l476vg disco readme txt stm32l4r9ai disco readme txt str71x olimex strp711 readme txt tiva dk tm4c129x readme txt eagle100 readme txt ekk lm3s9b96 readme txt launchxl cc1310 readme txt launchxl cc1312r1 readme txt lm3s6432 s2e readme txt lm3s6965 ek readme txt lm3s8962 ek readme txt lm4f120 launchpad readme txt tm4c123g launchpad readme txt tm4c1294 launchpad readme txt tms570 launchxl tms57004 readme txt tms570ls31x usb kit readme txt xmc4 xmc4500 relax readme txt avr at32uc3 avr32dev1 readme txt at90usb micropendous3 readme txt teensy 2 0 readme txt atmega amber readme txt arduino mega2560 readme txt moteino mega readme txt hc m9s12 demo9s12ne64 readme txt ne64badge readme txt mips pic32mx mirtoo readme txt pic32mx7mmb readme txt pic32mx starterkit readme txt sure pic32mx readme txt ubw32 readme txt pic32mz chipkit wifire readme txt flipnclick pic32mz readme txt pic32mz starterkit readme txt misoc lm32 misoc readme txt or1k mor1kx or1k readme txt renesas m16c skp16c26 readme txt sh1 us7032evb1 readme txt risc v sim sim sim include readme txt readme txt x86 qemu qemu i486 readme txt xtensa esp32 esp32 core readme txt z16 z16f z16f2800100zcog configs nsh readme txt configs ostest readme txt configs pashello readme txt readme txt z80 ez80 ez80f910200kitg configs ostest readme txt readme txt ez80f910200zco configs dhcpd readme txt configs httpd readme txt configs nettest readme txt configs nsh readme txt configs poll readme txt readme txt makerlisp configs nsh flash readme txt configs nsh ram readme txt configs sdboot readme txt readme txt z80x configs nsh flash readme txt configs nsh ram readme txt configs sdboot readme txt readme txt z180 p112 readme txt z8 z8encore000zco configs ostest readme txt readme txt z8f64200100kit configs ostest readme txt readme txt z80 z80sim readme txt readme txt drivers eeprom readme txt lcd readme txt pcf8574 lcd backpack readme txt mtd readme txt sensors readme txt syslog readme txt readme txt fs binfs readme txt cromfs readme txt mmap readme txt nxffs readme txt smartfs readme txt procfs readme txt spiffs readme md unionfs readme txt graphics readme txt libs readme txt libc zoneinfo readme txt readme txt libdsp readme txt libnx nxfongs readme txt readme txt libxx readme txt mm shm readme txt readme txt net sixlowpan readme txt readme txt pass1 readme txt syscall readme txt tools readme txt below is a guide to the available readme files in the semi optional apps source tree apps examples bastest readme txt json readme txt pashello readme txt readme txt gpsutils minmea readme txt graphics tiff readme txt traveler tools tcledit readme txt interpreters bas readme txt ficl readme txt readme txt modbus readme txt netutils discover readme txt ftpc readme txt json readme txt telnetd readme txt readme txt nshlib readme txt nxwidgets readme txt system cdcacm readme txt i2c readme txt inifile readme txt install readme txt nsh readme txt nxplayer readme txt psmq readme txt symtab readme txt termcurses readme txt usbmsc readme txt zmodem readme txt wireless bluetooth btsak readme txt ieee802154 i8sak readme txt additional readme txt files in the other related repositories nxwidgets doxygen readme txt tools readme txt unittests readme txt readme txt buildroot readme txt tools readme txt uclibc readme txt | nuttx rtos embedded real-time mcu microcontroller | os |
papers | about this repository contains short summaries of some machine learning papers added 2018 10 01 unsupervised learning eccv 2018 deep clustering for unsupervised learning of visual features neural nets deep clustering for unsupervised learning of visual features md object detection point cloud self driving cars eccv 2018 deep continuous fusion for multi sensor 3d object detection neural nets deep continuous fusion for multi sensor 3d object detection md audio sound source localization action recognition sound source separation self supervised eccv 2018 audio visual scene analysis with self supervised multisensory features neural nets audio visual scene analysis with self supervised multisensory features md uncertainty eccv 2018 towards realistic predictors neural nets towards realistic predictors md object detection eccv 2018 acquisition of localization confidence for accurate object detection neural nets acquisition of localization confidence for accurate od md object detection eccv 2018 cornernet detecting objects as paired keypoints neural nets cornernet md normalization eccv 2018 group normalization neural nets group normalization md architectures attention eccv 2018 convolutional networks with adaptive inference graphs neural nets convolutional networks with adaptive inference graphs md added 2018 03 08 architectures attention spatial transformer networks neural nets stn md thanks alexobednikov https github com alexobednikov added 2018 03 06 loss functions recognition working hard to know your neighbor s margins local descriptor learning loss neural nets hardnet md thanks alexobednikov https github com alexobednikov added 2017 12 13 face recognition faces neural aggregation network for video face recognition neural nets nan for video face recognition md thanks alexobednikov https github com alexobednikov added 2017 12 03 critical learning periods in deep neural networks neural nets critical learning periods in deep neural networks md gan self driving cars high resolution image synthesis and semantic manipulation with conditional gans neural nets high resolution image synthesis with conditional gans md self driving cars computer vision for autonomous vehicles problems datasets and state of the art mixed computer vision for autonomous vehicles overview md added 2017 10 28 gan progressive growing of gans for improved quality stability and variation neural nets progressive growing of gans md added 2017 10 24 self driving cars systematic testing of convolutional neural networks for autonomous driving neural nets systematic testing of cnns for autonomous driving md self driving cars segmentation fast scene understanding for autonomous driving neural nets fast scene understanding for autonomous driving md self driving cars arguing machines perception control system redundancy and edge case discovery in real world autonomous driving neural nets arguing machines md self driving cars gan reinforcement virtual to real reinforcement learning for autonomous driving neural nets virtual to real rl for ad md self driving cars end to end learning for self driving cars neural nets end to end learning for self driving cars md added 2017 10 21 snapshot ensembles train 1 get m for free neural nets snapshot ensembles md image crowd counting using convolutional neural network and markov random field neural nets image crowd counting using cnn and mrf md reinforcement rainbow combining improvements in deep reinforcement learning neural nets rainbow md reinforcement learning to navigate in complex 
environments neural nets learning to navigate in complex environments md gan unsupervised image to image translation networks neural nets unsupervised image to image translation networks md rnn dilated recurrent neural networks neural nets dilated recurrent neural networks md object detection tracking detect to track and track to detect neural nets detect to track and track to detect md architectures dilated residual networks neural nets dilated residual networks md added 2017 09 24 object detection feature pyramid networks for object detection neural nets feature pyramid networks for object detection md object detection ssd single shot multibox detector neural nets ssd md object detection efficient networks mobilenets efficient convolutional neural networks for mobile vision applications neural nets mobilenets md object detection mask r cnn neural nets mask r cnn md added 2017 08 08 faces multi view face detection using deep convolutional neural networks neural nets multi view face detection using deep convolutional neural networks md aka ddfd thanks arnaldog12 https github com arnaldog12 added 2017 06 11 gan on the effects of batch and weight normalization in generative adversarial networks neural nets on the effects of bn and wn in gans md gan began neural nets began md gan stackgan text to photo realistic image synthesis with stacked generative adversarial networks neural nets stackgan md activation functions self normalizing neural networks neural nets self normalizing neural networks md gan wasserstein gan neural nets wgan md aka wgan added 2017 03 15 object detection yolo9000 better faster stronger neural nets yolo9000 md aka yolov2 object detection you only look once unified real time object detection neural nets yolo md aka yolo object detection pvanet deep but lightweight neural networks for real time object detection neural nets pvanet md added 2017 03 14 object detection r fcn object detection via region based fully convolutional networks neural nets r fcn md object detection faster r cnn neural nets faster r cnn md object detection fast r cnn neural nets fast r cnn md object detection rich feature hierarchies for accurate object detection and semantic segmentation neural nets rich feature hierarchies for accurate object detection and semantic segmentation md aka r cnn pedestrians ten years of pedestrian detection what have we learned mixed ten years of pedestrian detection what have we learned md neural style instance normalization the missing ingredient for fast stylization neural nets instance normalization the missing ingredient for fast stylization md added 2016 07 29 human pose estimation stacked hourglass networks for human pose estimation neural nets stacked hourglass networks for human pose estimation md faces deepface closing the gap to human level performance in face verification neural nets deepface md translation character based neural machine translation neural nets character based neural machine translation md added 2016 07 01 human pose estimation convolutional pose machines neural nets convolutional pose machines md faces hyperface a deep multi task learning framework for face detection landmark localization pose estimation and gender recognition neural nets hyperface md faces face attribute prediction using off the shelf cnn features neural nets face attribute prediction using off the shelf cnn features md faces cms rcnn contextual multi scale region based cnn for unconstrained face detection neural nets cms rcnn md conditional image generation with pixelcnn 
decoders neural nets conditional image generation with pixelcnn decoders md gan infogan interpretable representation learning by information maximizing generative adversarial nets neural nets infogan md gan improved techniques for training gans neural nets improved techniques for training gans md synthesizing the preferred inputs for neurons in neural networks via deep generator networks neural nets synthesizing the preferred inputs for neurons in neural networks via deep generator networks md added 2016 06 06 architectures fractalnet ultra deep neural networks without residuals neural nets fractalnet ultra deep networks without residuals md planet photo geolocation with convolutional neural networks neural nets planet md optimizers adam a method for stochastic optimization neural nets adam md gan rnn generating images with recurrent adversarial networks neural nets generating images with recurrent adversarial networks md gan adversarially learned inference neural nets adversarially learned inference md added 2016 06 02 architectures resnet in resnet generalizing residual architectures neural nets resnet in resnet md autoencoders rank ordered autoencoders neural nets rank ordered autoencoders md architectures wide residual networks neural nets wide residual networks md architectures identity mappings in deep residual networks neural nets identity mappings in deep residual networks md regularization swapout learning an ensemble of deep architectures neural nets swapout md multi scale context aggregation by dilated convolutions neural nets multi scale context aggregation by dilated convolutions md texture synthesis through convolutional neural networks and spectrum constraints neural nets texture synthesis through cnns and spectrum constraints md precomputed real time texture synthesis with markovian generative adversarial networks neural nets markovian gans md added 2016 05 15 neural style semantic style transfer and turning two bit doodles into fine artwork neural nets neural doodle md combining markov random fields and convolutional neural networks for image synthesis neural nets combining mrfs and cnns for image synthesis md superresolution accurate image super resolution using very deep convolutional networks neural nets accurate image super resolution md human pose estimation joint training of a convolutional network and a graphical model for human pose estimation neural nets joint training of a convnet and a pgm for hpe md reinforcement hierarchical deep reinforcement learning integrating temporal abstraction and intrinsic motivation neural nets hierarchical deep reinforcement learning md colorization let there be color neural nets let there be color md added 2016 05 08 neural style artistic style transfer for videos neural nets artistic style transfer for videos md added 2016 05 03 reinforcement playing atari with deep reinforcement learning neural nets playing atari with deep reinforcement learning md generative attend infer repeat fast scene understanding with generative models neural nets attend infer repeat md architectures efficient networks squeezenet alexnet level accuracy with 50x fewer parameters and 0 5mb model size neural nets squeezenet md activation functions noisy activation functions neural nets noisy activation functions md object detection image to text densecap fully convolutional localization networks for dense captioning neural nets densecap md added 2016 04 01 regularization deep networks with stochastic depth neural nets deep networks with stochastic depth md 
added 2016 03 31 gan deep generative image models using a laplacian pyramid of adversarial networks neural nets deep generative image models using a laplacian pyramid of adversarial networks md generative rnn attention draw a recurrent neural network for image generation neural nets draw a recurrent neural network for image generation md generating images with perceptual similarity metrics based on deep networks neural nets generating images with perceptual similarity metrics based on deep networks md generative generative moment matching networks neural nets generative moment matching networks md generative rnn pixel recurrent neural networks neural nets pixel recurrent neural networks md gan unsupervised representation learning with deep convolutional generative adversarial networks neural nets unsupervised representation learning with deep convolutional generative adversarial networks md added 2016 03 neural style a neural algorithm for artistic style neural nets a neural algorithm for artistic style md normalization regularization batch normalization accelerating deep network training by reducing internal covariate shift neural nets batch normalization md architectures deep residual learning for image recognition neural nets deep residual learning for image recognition md activation functions fast and accurate deep networks learning by exponential linear units elus neural nets elus md fractional max pooling neural nets fractional max pooling md gan generative adversarial networks neural nets generative adversarial networks md architectures inception v4 inception resnet and the impact of residual connections on learning neural nets inception v4 md normalization weight normalization a simple reparameterization to accelerate training of deep neural networks neural nets weight normalization md | machine-learning deep-learning paper summary computer-vision nlp gan deep-reinforcement-learning | ai |
GreenTunnel | green tunnel p align center img src assets logo png alt green tunnel logo width 200 p p align center img src https img shields io github license sadeghhayeri greentunnel svg color green style for the badge img src https img shields io github repo size sadeghhayeri greentunnel svg color green style for the badge img src https img shields io discord 707464295021019197 color green style for the badge p greentunnel bypasses dpi deep packet inspection systems found in many isps internet service providers which block access to certain websites p align center img src assets demo gif alt green tunnel demo style margin top 20px p how to use graphical user interface gui you can simply choose the suitable installation for your os in the releases https github com sadeghhayeri greentunnel releases releases section command line interface cli you can install greentunnel using npm https www npmjs org npm npm i g green tunnel or using snap https snapcraft io edge version sudo snap install edge green tunnel devmode after installation you can run it using gt or green tunnel commands gt help usage green tunnel options usage gt options options help h show help boolean version v show version number boolean ip ip address to bind proxy server string default 127 0 0 1 https only block insecure http requests boolean default false port port address to bind proxy server number default 8000 dns type string choices https tls default https dns server string default https cloudflare dns com dns query dns ip ip address for unencrypted dns string default 127 0 0 1 dns port port for unencrypted dns number default 53 silent s run in silent mode boolean default false verbose v debug mode string default system proxy automatic set system proxy boolean default true examples gt gt ip 127 0 0 1 port 8000 https only gt dns server https doh securedns eu dns query gt verbose green tunnel proxy issues https github com sadeghhayeri greentunnel issues for debug use verbose option green tunnel verbose green tunnel docker docker run p 8000 8000 sadeghhayeri green tunnel envs port https only verbose silent dns type dns server usage docker run e port 1000 p 8000 1000 sadeghhayeri green tunnel on raspberry pi docker run p 8000 8000 sadeghhayeri green tunnel arm if you want to make container keep running when reboot docker run d restart unless stopped p 8000 8000 sadeghhayeri green tunnel arm please make sure port 8000 is not blocked on raspberry pi firewall sudo ufw allow 8000 comment green tunnel to use it on your other device set http proxy to raspberry pi ip address port port 8000 tested on macos catalina with node 12 ubuntu 18 04 with node 8 windows 10 with node 8 faq how does it work http there are gaps in providers in dpi they happen from what the dpi rules write for ordinary user programs omitting all possible cases that are permissible by standards this is done for simplicity and speed some dpis cannot recognize the http request if it is divided into tcp segments for example a request of the form get http 1 0 host www youtube com we send it in 2 parts first comes get http 1 0 n host www you and second sends as tube com n in this example isp cannot find blocked word youtube in packets and you can bypass it https server name indication sni is an extension to tls transport layer security that indicates the actual destination hostname a client is attempting to access over https for this web filter feature sni hostname information is used for blocking access to specific sites over https for example if the administrator 
chooses to block the hostname youtube using this feature all website access attempts over https that contain youtube like www youtube com in the sni would be blocked however access to the same hostname over http would not be blocked by this feature greentunnel tries to split the first client hello packet into small chunks so isps can t parse the packet and find the sni field and the traffic bypasses the filter dns when you enter a url in a web browser the first thing the web browser does is to ask a dns domain name system server at a known numeric address to look up the domain name referenced in the url and supply the corresponding ip address if the dns server is configured to block access it consults a blacklist of banned domain names when a browser requests the ip address for one of these domain names the dns server gives a wrong answer or no answer at all greentunnel uses dns over https https en wikipedia org wiki dns over https doh dns over https and dns over tls https en wikipedia org wiki dns over tls dns over tls to get the real ip address and bypass dns spoofing development notes greentunnel is an open source app and i really appreciate other developers adding new features and or helping fix bugs if you want to contribute to greentunnel you can fork this repository make the changes and create a pull request however please make sure you follow a few rules listed below to ensure that your changes get merged into the main repo the rules listed below are enforced to make sure the changes made are well documented and can be easily kept track of pull requests and stars are always welcome for bugs and feature requests please create an issue make sure your pull request has an informative title you should use prefixes like add fix etc at the start of the title which describes the changes followed by a one line description of the changes example add added a new feature to greentunnel commits in your fork should be informative as well make sure you don t combine too many changes into a single commit todo list x enable disable proxy on windows httphandler x add cli arguments x catch all exceptions add preferences menu fix close button donation love greentunnel please consider donating to sustain our activities dogecoin dtgjx8kkdcuksebtvhgqx1gyennavvuxla br bitcoin bc1qknjsmsa98lljwxjwl4pmjh48s8su8r8ajkqd8w br ethereum 0x018fbf3fac7165b2c85f856cc90e2d9410415150 br litecoin ltc1q5tfprazpkzjvzf5shgprkpkhnnku3p72feutxt br ripple xrp rt6ztkkdbvyzbee9cpqsdtsewntbaov13 br https img shields io badge buy 20me 20a 20coffee irr 20 20payping red svg style for the badge logo ko fi https payping ir d txts br https img shields io badge buy 20me 20a 20coffee usd 20 20paypal red svg style for the badge logo ko fi https www paypal com cgi bin webscr cmd donations business hj5tbxvyths7n currency code usd source url br donate with bitcoin https en cryptobadges io badge big 3c5sj5bj3n5gyjr27uxowdsggcq2vjdhn5 showbalance true https en cryptobadges io donate bc1qknjsmsa98lljwxjwl4pmjh48s8su8r8ajkqd8w donate with ethereum https en cryptobadges io badge big 0x018fbf3fac7165b2c85f856cc90e2d9410415150 showbalance true https en cryptobadges io donate 0x018fbf3fac7165b2c85f856cc90e2d9410415150 donate with ripple https en cryptobadges io badge big rt6ztkkdbvyzbee9cpqsdtsewntbaov13 showbalance true https en cryptobadges io donate rt6ztkkdbvyzbee9cpqsdtsewntbaov13 license licensed under the mit license see license https github com sadeghhayeri greentunnel blob master license license | dpi filtering isp sni deep-packet-inspection proxy vpn socks
firewall-bypass | os |
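a minimal python sketch of the http splitting trick described above (the project itself is written in javascript, so this is only an illustration of the idea; the host and split point are placeholders):

import socket

# illustrative sketch only: write one http request in two tcp chunks,
# splitting inside the hostname so a dpi box that string-matches
# "youtube" within a single segment misses it.
host = "www.youtube.com"  # placeholder target
request = f"GET / HTTP/1.0\r\nHost: {host}\r\n\r\n".encode()
cut = request.find(b"youtube") + 3  # split "you|tube" as in the example above

with socket.create_connection((host, 80)) as sock:
    # disable nagle so the two writes are more likely to leave as
    # separate segments
    sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)
    sock.sendall(request[:cut])
    sock.sendall(request[cut:])
    print(sock.recv(512))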
peluqueria_canina | peluqueria canina final project for the second year server side programming course based on a records application for a dog grooming salon status https img shields io badge status running green svg colorb 00c106 readme https img shields io badge readme ok green svg colorb 00c106 database https img shields io badge database ok green svg colorb 00c106 commits https img shields io badge commits 26 blue svg tag https img shields io badge tag v0 3 orange svg template https img shields io badge template twig yellow svg techs https img shields io badge techs javascript php css bootstrap yellow svg project structure the project structure is based on mvc model view controller and on a restful api it builds its own api to json contents and features user registration login and logout adding deleting and editing records installation you must rename the env example file to env with the data corresponding to your database loading the database to build the database use the createdb sql script https github com adrydev92 peluqueria canina blob master createdb sql installing dependencies from the terminal use the following command composer update which fetches the dependencies from composer json configuring the start route mamp preferences web server document root left click path to your project public folder technologies used the application is structured using php javascript css bootstrap and twig | php css bootstrap html5 javascript twig | front_end |
module-ballerinax-ai.agent | ballerina ai agent library build https github com ballerina platform module ballerinax ai agent workflows ci badge svg https github com ballerina platform module ballerinax ai agent actions query workflow 3aci github last commit https img shields io github last commit ballerina platform module ballerinax ai agent svg https github com ballerina platform module ballerinax ai agent commits master graalvm check https github com ballerina platform module ballerinax ai agent actions workflows build with bal test native yml badge svg https github com ballerina platform module ballerinax ai agent actions workflows build with bal test native yml license https img shields io badge license apache 202 0 blue svg https opensource org licenses apache 2 0 this library provides functionality required to build react agent using large language models llms for more information go to the module s agent ballerina module md building from the source setting up the prerequisites 1 download and install java se development kit jdk version 11 you can install either openjdk https adoptopenjdk net or oracle jdk https www oracle com java technologies javase jdk11 downloads html note set the java home environment variable to the path name of the directory into which you installed jdk 2 download and install ballerina swan lake https ballerina io building the source execute the commands below to build from the source after installing ballerina swan lake to build the package bal build github to run the tests bal test github contributing to ballerina as an open source project ballerina welcomes contributions from the community for more information go to the contribution guidelines https github com ballerina platform ballerina lang blob master contributing md code of conduct all contributors are encouraged to read the ballerina code of conduct https ballerina io code of conduct useful links discuss code changes of the ballerina project in ballerina dev googlegroups com mailto ballerina dev googlegroups com chat live with us via our discord server https discord gg ballerinalang post all technical questions on stack overflow with the ballerina https stackoverflow com questions tagged ballerina tag | ai |
|
Question-Generation | question generation this project was originally intended for an ai course at sofia university during its execution i was constrained by time and couldn t implement all the ideas i had but i plan to continue working on it and i did pick up the topic for my master s thesis using t5 transformers to generate question answer pairs along with distractors check it out in the question generation transformers https github com kristiyanvachev question generation transformers repository the approach for identifying keywords used as target answers has been accepted in the ranlp2021 conference generating answer candidates for quizzes and answer aware question generators https arxiv org abs 2108 12898v1 general idea the idea is to generate multiple choice answers from text by splitting this complex problem into simpler steps identify keywords from the text and use them as answers to the questions replace the answer from the sentence with blank space and use it as the base for the question transform the sentence with a blank space for answer to a more question like sentence generate distractors words that are similar to the answer as incorrect answers question generation step by step gif https media giphy com media 1n4jpyditd3mgvtzbz giphy gif installation creating a virtual environment optional to avoid any conflicts with python packages from other projects it is a good practice to create a virtual environment https docs python org 3 library venv html in which the packages will be installed if you do not want to do this you can skip the next commands and directly install the requirements txt file create a virtual environment python m venv venv enter the virtual environment windows venv scripts activate linux or macos source venv scripts activate install ipython inside the venv ipython kernel install user name venv install jupyter lab inside the venv pip install jupyterlab installing packages pip install r requirements txt run jupyter jupyter lab execution data exploration before i could do anything i wanted to understand more about how questions are made and what kind of words its answers are i used the squad 1 0 https rajpurkar github io squad explorer dataset which has about 100 000 questions generated from wikipedia articles you can read about the insights i ve found in the data exploration jupyter notebook identifying answers my assumption was that words from the text would be great answers for questions all i needed to do was to decide which words or short phrases are good enough to become answers i decided to do a binary classification on each word from the text spacy https spacy io really helped me with the word tagging feature engineering i pretty much needed to create the entire dataset for the binary classification i extracted each non stop word from the paragraphs of each question in the squad dataset and added some features on it like part of speech is it a named entity are only alpha characters used shape whether it s only alpha characters digits has punctuation xxxx dddd xxx x xxxx word count and the label isanswer whether the word extracted from the paragraph is the same and in the same place as the answer of the squad question some other features like tf idf score and cosine similarity to the title would be great but i didn t have the time to add them other than those it s up to our imagination to create new features maybe whether it s in the start middle or end of a sentence information about the words surrounding it and more though before adding more features it would be nice to have a metric to assess whether the feature is going to be useful or not model training i found the problem similar to spam filtering where a common approach is to tag each word of an email as coming from a spam or not a spam email i used scikit learn s gaussian naive bayes algorithm to classify each word whether it s an answer the results were surprisingly good at a quick glance the algorithm classified most of the words as answers the ones it didn t were in fact unfit the cool thing about naive bayes is that you get the probability for each word in the demo i ve used that to order the words from the most likely answer to the least likely creating questions another assumption i had was that the sentence of an answer could easily be turned to a question just by placing a blank space in the position of the answer in the text i get a cloze question sentence with a blank space for the missing word answer oxygen question is a chemical element with symbol o and atomic number 8 i decided it wasn t worth it to transform the cloze question to a more question looking sentence but i imagine it could be done with a seq2seq neural network similarly to the way text is translated from one language to another generating incorrect answers the part turned out really well for each answer i generate its most similar words using word embeddings and cosine similarity most similar words to oxygen https i gyazo com 175b9f86b3defc0798800cb06169cc3f png most of the words are just fine and could easily be mistaken for the correct answer but there are some which are obviously not appropriate since i didn t have a dataset with incorrect answers i fell back on a more classical approach i removed the words that weren t the same part of speech or the same named entity as the answer and added some more context from the question i would like to find a dataset with multiple choice answers and see if i can create a ml model for generating better incorrect answers results after adding a demo project the generated questions aren t really fit to go into a classroom instantly but they aren t bad either the cool thing is the simplicity and modularity of the approach where you could find where it s doing bad say it s classifying verbs and plug a fix into it having a complex neural network like all the papers on the topics do will probably do better especially in the age we re living in but the great thing i found out about this approach is that it s like a gateway for a software engineer with his software engineering mindset to get into the field of ai and see meaningful results future work updated i find this topic quite interesting and with a lot of potential i would probably continue working in this field i even enrolled in a masters of data mining and will probably do some similar projects i will link anything useful here i ve already put some more time on finishing the project but i would like to transform it more to a tutorial about getting into the field of ai while having the ability to easily extend it with new custom features updates update 29 12 19 the repository has become pretty popular so i added a new notebook demo ipynb that combines all the modules and generates questions for any text i reordered the other notebooks and documented the code a bit better update 09 03 21 added a requirements txt file with instructions to run a virtual environment and fixed a bug with valueerror operands could not be broadcast together with shapes 230 121 83 i have also started working on my master s thesis with a similar topic of question generation update 27 10 21 i have uploaded the code for my master s thesis in the question generation transformers https github com kristiyanvachev question generation transformers repository i highly encourage you to check it out additionally the approach using a classifier to pick the answer candidates has been accepted as a student paper in the ranlp2021 conference paper here https arxiv org abs 2108 12898v1 | question-generation question-generator questions-and-answers quiz machine-learning spacy spacy-nlp nlp ai word-embeddings cosine-similarity naive-bayes | ai |
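a toy sketch of the gaussian naive bayes word classifier described above; the feature columns and values here are invented for illustration, the project derives the real ones from spacy tags:

import numpy as np
from sklearn.naive_bayes import GaussianNB

# toy columns: is_named_entity, is_alpha, word_count_in_sentence
X = np.array([[1, 1, 12], [0, 1, 12], [1, 1, 8], [0, 0, 8]])
y = np.array([1, 0, 1, 0])  # 1 = the word was a squad answer

clf = GaussianNB().fit(X, y)
# predict_proba gives the "how likely is this word an answer" score
# that the demo uses to rank candidate answers
print(clf.predict_proba(np.array([[1, 1, 10]]))[0][1])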
Air-Canvas-project | air canvas project computer vision project implemented with opencv ever wanted to draw your imagination by just waving your finger in air in this post we will learn to build an air canvas which can draw anything on it by just capturing the motion of a coloured marker with a camera here a coloured object at the tip of the finger is used as the marker we will be using the computer vision techniques of opencv to build this project the preferred language is python due to its exhaustive libraries and easy to use syntax but understanding the basics it can be implemented in any opencv supported language here colour detection and tracking are used in order to achieve the objective the colour marker is detected and a mask is produced it includes the further steps of morphological operations on the mask produced which are erosion and dilation erosion reduces the impurities present in the mask and dilation further restores the eroded main mask algorithm 1 start reading the frames and convert the captured frames to hsv colour space easy for colour detection 2 prepare the canvas frame and put the respective ink buttons on it 3 adjust the trackbar values for finding the mask of the coloured marker 4 preprocess the mask with morphological operations erosion and dilation 5 detect the contours find the center coordinates of the largest contour and keep storing them in the array for successive frames arrays for drawing points on canvas 6 finally draw the points stored in the array on the frames and canvas requirements python3 numpy opencv installed on your system img src https raw githubusercontent com infoaryan air canvas project master screenshots sample project img1 png width 800 height 400 | ai |
|
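a condensed sketch of steps 1, 4 and 5 of the algorithm above; the hsv bounds are placeholder values that the project tunes interactively with trackbars (step 3):

import cv2
import numpy as np

cap = cv2.VideoCapture(0)
# assumed marker colour range; in the project these come from trackbars
lower, upper = np.array([100, 120, 120]), np.array([140, 255, 255])
kernel = np.ones((5, 5), np.uint8)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)        # step 1
    mask = cv2.inRange(hsv, lower, upper)
    mask = cv2.dilate(cv2.erode(mask, kernel), kernel)  # step 4
    # opencv 4.x: findContours returns (contours, hierarchy)
    cnts, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if cnts:                                            # step 5
        c = max(cnts, key=cv2.contourArea)
        (x, y), r = cv2.minEnclosingCircle(c)
        cv2.circle(frame, (int(x), int(y)), int(r), (0, 255, 0), 2)
    cv2.imshow("frame", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()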
ml_from_scratch | ml from scratch repository of latest machine learning algorithms in the fields of large language models and diffusion models all implemented from scratch in pytorch hence it serves as an excellent reference for students ml engineers and ml practitioners alike this repository is inspired by the mingpt series https github com karpathy mingpt and adds on top by following industry best practices like clean and well documented code modular and well tested components following standard pytorch apis like datasets and dataloaders the repository also aims to expand beyond pretraining and include other algorithms too like lora finetuning distillation and rlhf alignment algorithms implemented in this repository large language models 1 pretraining of gpt models https arxiv org abs 2005 14165 2 finetuning of gpt models using lora https arxiv org abs 2106 09685 diffusion models 1 latent diffusion models https arxiv org abs 2112 10752 disclaimer the software is provided as is without warranty of any kind express or implied including but not limited to the warranties of merchantability fitness for a particular purpose and noninfringement in no event shall the authors or copyright holders be liable for any claim damages or other liability whether in an action of contract tort or otherwise arising from out of or in connection with the software or the use or other dealings in the software the views and opinions of authors expressed herein do not necessarily state or reflect those of their employers or any agency thereof | ai |
|
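for example, the lora fine-tuning listed above can be written from scratch as a small wrapper around a frozen linear layer; this is a generic sketch of the technique, not code taken from this repository:

import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    # y = W x + (alpha / r) * B A x, with only A and B trainable
    def __init__(self, in_f, out_f, r=8, alpha=16):
        super().__init__()
        self.base = nn.Linear(in_f, out_f)
        for p in self.base.parameters():  # freeze the pretrained weights
            p.requires_grad_(False)
        self.A = nn.Parameter(torch.randn(r, in_f) * 0.01)
        self.B = nn.Parameter(torch.zeros(out_f, r))  # zero init: no-op at start
        self.scale = alpha / r

    def forward(self, x):
        return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)

layer = LoRALinear(768, 768)
print(layer(torch.randn(2, 768)).shape)  # torch.Size([2, 768])

zero-initializing B keeps the adapted layer identical to the frozen base at the start of training, which is the usual lora design choice.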
nlp | nlp circleci https circleci com gh chriscasola nlp svg style svg https circleci com gh chriscasola nlp go report card https goreportcard com badge github com chriscasola nlp https goreportcard com report github com chriscasola nlp godoc https godoc org github com chriscasola nlp status svg https godoc org github com chriscasola nlp nlp is a go package meant to contain implementations of common natural language processing algorithms so far there is a naive implementation of conditional random fields crf the crf implementation draws from the following articles introduction to conditional random fields http blog echen me 2012 01 03 introduction to conditional random fields an introduction to conditional random fields http homepages inf ed ac uk csutton publications crftutv2 pdf an introduction to conditional random fields for relational learning https people cs umass edu mccallum papers crf tutorial pdf | crf nlp natural-language-processing conditional-random-fields golang go | ai |
sefirot | sefirot github actions https github com globalbrain sefirot workflows test badge svg https github com globalbrain sefirot actions codecov https codecov io gh globalbrain sefirot branch main graph badge svg https codecov io gh globalbrain sefirot license https img shields io npm l globalbrain sefirot svg https github com globalbrain sefirot blob main license md sefirot is a collection of vue components for global brain design system components are meant to be clean sophisticated and scalable sefirot is focused on being used within global brain s ecosystem hence the design ui ux of components is relatively fixed and customization capability is limited in exchange for customizability we can create components that are more robust dynamic and clean feel free to leverage any component within this project you may customize components how you see fit and perhaps some features may be valuable to you any suggestions requests or questions are welcome documentation you can check out the documentation for sefirot at https sefirot globalbrains com contribution we re really excited that you are interested in contributing to sefirot before submitting your contribution though please make sure to take a moment and read through the following guidelines code style guide sefirot follows the official vue style guide https v3 vuejs org style guide but always remember to follow the golden rule every line of code should appear to be written by a single person no matter the number of contributors mdo development bash pnpm run serve serve documentation website at http localhost 3000 bash pnpm run lint lint files using a rule of standard js bash pnpm test run the tests bash pnpm run coverage output test coverage in coverage directory license sefirot is open sourced software licensed under the mit license license md | vue | os |
Lab-Project-FreeRTOS-FAT | freertos fat dos compatible embedded fat file system freertos fat is an open source thread aware and scalable fat12 fat16 fat32 dos windows compatible embedded fat file system which was recently acquired by real time engineers ltd for use with and without the rtos freertos fat is already used in commercial products and is the file system used in the ftp https www freertos org freertos plus freertos plus tcp ftp server html and http https www freertos org freertos plus freertos plus tcp http web server html server examples that are documented on the freertos tcp https www freertos org freertos plus freertos plus tcp index html pages the standard c library style api https www freertos org freertos plus freertos plus fat standard file system api html includes a thread local errno value and the lower level native api provides a rich set of detailed error codes for more details please visit freertos fat https www freertos org freertos plus freertos plus fat index html page to consume freertos fat consume with cmake if using cmake it is recommended to use this repository using fetchcontent add the following into your project s main or a subdirectory s cmakelists txt cmake include fetchcontent fetchcontent declare freertos plus fat git repository https github com freertos lab project freertos fat git git tag master note best practice to use specific git hash or tagged version git submodules don t grab any submodules since not latest set freertos plus fat dev support off cache bool force select the native compile port set freertos plus fat port posix cache string force select the cross compile port if cmake crosscompiling eg zynq 2019 3 version of port set freertos plus fat port zynq 2019 3 cache string force endif fetchcontent makeavailable freertos plus fat if you already have freertos in your project you may skip the fetch content by setting freertos plus fat fetch freertos to off consuming stand alone it is recommended to use this repository as a submodule please refer to git tools submodules https git scm com book en v2 git tools submodules notes this project is undergoing optimizations or refactorization to improve memory usage modularity documentation demo usability or test coverage | os |
|
py-blockchain-cli | pyblockchaincli build status https travis ci org simpleapples py blockchain cli svg branch master https travis ci org simpleapples py blockchain cli readme zh md a simple blockchain command line interface without using any 3rd party library screenshot http ww1 sinaimg cn large 6ae0adaely1fpb0l9rznog20fo0d8gqg gif features mining block with data distributed peer to peer network proof of work system blockchain validation installation only supports python 3 6 run bash clone this repository git clone git github com simpleapples py blockchain cli git go into the project folder cd py blockchain cli run main py python3 main py contributing please submit a pull request to contribute license this project is licensed under the mit license | blockchain python3 cryptocurrency command-line-interface | blockchain |
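a standard-library sketch of the proof of work feature listed above; the four-zero difficulty target is an arbitrary assumption for illustration:

import hashlib

def proof_of_work(block_data: str, difficulty: int = 4) -> int:
    # search for a nonce whose sha256(block_data + nonce) starts with
    # "difficulty" zero hex digits
    nonce = 0
    target = "0" * difficulty
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce  # nonce that makes the block hash meet the target
        nonce += 1

print(proof_of_work("demo block"))

each extra zero in the target multiplies the expected search work by 16, which is how such schemes tune mining difficulty.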
konker-platform | konker platform join the chat at https gitter im konker platform lobby https badges gitter im konker platform lobby svg https gitter im konker platform lobby utm source badge utm medium badge utm campaign pr badge utm content badge license img license build https github com konkerlabs konker platform actions workflows ci yaml badge svg https github com konkerlabs konker platform actions workflows ci yaml konker platform is an open source platform for the internet of things iot it is developed by konker and the community the platform allows easy connection and management of devices using http or mqtt protocols pre requisites konker platform runs and compiles on java 8 it has a compile time dependency on lombok see below and runtime dependencies on eclipse jetty mongodb redis and mosquitto minimal hardware requirements to run with maven quadcore cpu 8 gb of memory to run as container with kubernetes 10 vcore 8gb of memory dependencies lombok intellij just install the lombok plugin eclipse 1 java jar maven repository org projectlombok lombok version lombok version jar 2 click on specify location 3 select the eclipse executable 4 click on install update 5 restart eclipse building konker platform is built by using apache maven maven package running run standalone container if you want to run the konker open platform on your own desktop we offer a docker image with all in one resources to help you please visit https hub docker com r konkerlabs konker platform if you need some help please contact us on support konkerlabs com hosted cloud environment konker provides a hosted konker platform please contact us at http www konkerlabs com deploying if you built your package with maven the konker platform can be deployed as a web application on your favorite servlet container we use jetty you will need to customize the application conf file to your needs see application conf example on how to do that ci all branches and pull requests are submitted to github actions before merge ci needs to give you a green light for more info please see https github com konkerlabs konker platform actions license and copyright copyright 2017 konker labs licensed under the apache license version 2 0 the license you may not use this file except in compliance with the license you may obtain a copy of the license at http www apache org licenses license 2 0 unless required by applicable law or agreed to in writing software distributed under the license is distributed on an as is basis without warranties or conditions of any kind either express or implied see the license for the specific language governing permissions and limitations under the license license license license img https img shields io badge license apache 202 blue svg | hacktoberfest hacktoberfest2021 | server |
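a hypothetical sketch of a device publishing telemetry over mqtt as described above; the broker host, credentials and topic layout are placeholders, not necessarily konker s actual endpoint or topic scheme:

import json
import paho.mqtt.client as mqtt

client = mqtt.Client()
client.username_pw_set("device-user", "device-pass")  # assumed credentials
client.connect("mqtt.example.com", 1883)              # assumed broker host/port
# assumed topic layout; a real platform defines its own channel naming
client.publish("data/device-001/temperature", json.dumps({"value": 21.5}))
client.disconnect()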
feeds | feeds the rss feeds i subscribe to mainly about ios maths and software engineering in general this list is generated by freshrss https freshrss org an open source and self hostable aggregator i use to keep up with all of this content while controling my data getting started 1 choose you rss client or service i recommend to use freshrss 2 get the opml file and import it 3 read | os |
|
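a small sketch for step 2 above: listing the feed urls from an exported opml file with the standard library (the file name is a placeholder; opml stores each feed as an outline element with an xmlUrl attribute):

import xml.etree.ElementTree as ET

tree = ET.parse("feeds.opml")  # placeholder file name
for outline in tree.iter("outline"):
    url = outline.get("xmlUrl")
    if url:  # skip grouping outlines that have no feed url
        print(outline.get("title", ""), url)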
AttrScore | attrscore code datasets models for the paper automatic evaluation of attribution by large language models https arxiv org pdf 2305 06311 pdf img png attrscore png what s new june 26 2023 1 evaluation results of more models including gpt 4 2 thorough re examination of the attreval gensearch dataset and correcting some annotation issues updated dataset released 3 training and evaluation code as well as model checkpoints released dataset we release our dataset including training and two evaluation sets attreval simulation and attreval gensearch at huggingface datasets https huggingface co datasets osunlp attrscore more details can be found on the dataset page python loading dataset from datasets import load dataset training attr train load dataset osunlp attrscore combined train test attr eval simulation load dataset osunlp attrscore attreval simulation attr eval gensearch load dataset osunlp attrscore attreval gensearch evaluation we show our results for both prompting llms and fine tuning llms on repurposed data from related tasks simulation gensearch setting model size attr contra extra overall attr contra extra overall zero shot alpaca 7b 50 0 4 0 1 4 33 6 50 7 8 6 3 6 34 3 alpaca 13b 48 3 5 6 2 2 33 5 50 6 6 1 19 3 34 7 vicuna 13b 46 3 8 3 21 6 34 6 54 4 13 3 26 1 41 4 chatgpt 45 7 17 9 52 7 43 2 61 2 20 6 53 3 55 0 gpt 4 58 7 23 2 61 5 55 6 87 3 45 0 89 6 85 1 few shot alpaca 7b 45 4 8 2 9 6 31 9 49 6 5 2 13 5 37 2 alpaca 13b 38 9 20 1 2 2 33 1 50 5 10 3 5 6 34 8 vicuna 13b 35 4 37 2 0 3 32 6 50 6 9 1 8 4 34 1 chatgpt 46 6 27 6 35 8 39 2 62 6 26 8 49 5 53 3 gpt 4 61 1 31 3 68 8 60 0 85 2 53 3 88 9 84 3 fine tuned roberta 330m 62 5 54 6 74 7 65 0 47 2 25 2 62 3 49 8 gpt2 1 5b 63 6 54 6 71 9 63 5 51 1 18 6 60 7 47 4 t5 770m 45 9 57 1 71 6 59 1 58 5 24 3 72 5 61 6 flan t5 770m https huggingface co osunlp attrscore flan t5 large 57 3 50 1 70 5 59 3 64 3 27 6 72 9 64 5 flan t5 3b https huggingface co osunlp attrscore flan t5 xl 48 1 48 7 67 1 55 7 77 7 44 4 80 0 75 2 flan t5 11b https huggingface co osunlp attrscore flan t5 xxl 48 4 49 9 66 5 55 4 81 6 38 9 76 9 72 7 llama 7b https huggingface co osunlp attrscore llama 7b 62 2 50 7 74 6 62 8 77 9 41 1 78 3 72 5 alpaca 7b https huggingface co osunlp attrscore alpaca 7b 66 8 41 1 76 8 64 5 73 0 30 2 80 0 72 5 alpaca 13b https huggingface co osunlp attrscore alpaca 13b 63 6 48 9 75 8 63 6 77 5 34 5 79 4 73 3 vicuna 13b https huggingface co osunlp attrscore vicuna 13b 66 2 49 1 78 6 66 0 69 4 37 7 79 9 72 1 prompt llms zero few shot we can prompt llms such as chatgpt and gpt 4 to evaluate the attribution the input is the evaluation task prompt claim a concatenation of query answer and a reference for example verify whether a given reference can support the claim options attributable extrapolatory or contradictory attributable means the reference fully supports the claim extrapolatory means the reference lacks sufficient information to validate the claim and contradictory means the claim contradicts the information presented in the reference claim who is the current ceo of twitter the current ceo of twitter is elon musk reference elon musk is the ceo of twitter musk took over as ceo in october 2022 following a back and forth affair in which the billionaire proposed to purchase the social media company for 44 billion tried to back out and then ultimately went through with the acquisition after becoming ceo former ceo parag agrawal cfo ned segal and legal affairs and policy chief vijaya gadde were all dismissed from the company to 
replicate the number in the table for chatgpt gpt4 please copy your openai api key in api key txt and then run the notebook example prompt chatgpt gpt4 ipynb to prompt llama alpaca vicuna please see below for how to run inference on these models fine tune lms you can fine tune any lms on our repurposed datasets to evaluate the attribution here we give an example for fine tuning llama alpaca vicuna you can use the model name or path with any llama family models we do full fine tuning of llama alpaca vicuna 7b 13b models with 4 a100 80gb gpus bash torchrun nproc per node 4 train alpaca py model name or path chavinlo alpaca 13b data path osunlp attrscore train subset combined train input has query true num train samples 1 bf16 true output dir tmp alpaca 13b combined train evaluation strategy steps eval steps 500 num train epochs 1 model max length 512 per device train batch size 2 per device eval batch size 2 gradient accumulation steps 8 save strategy steps save steps 5000 save total limit 1 learning rate 2e 5 weight decay 0 warmup ratio 0 03 lr scheduler type cosine logging steps 1 fsdp full shard auto wrap fsdp transformer layer cls to wrap llamadecoderlayer tf32 true you could also load our fine tuned models to evaluate we provide the following checkpoints we trained on the combined train dataset in huggingface models flan t5 large 770m osunlp attrscore flan t5 large https huggingface co osunlp attrscore flan t5 large flan t5 xl 3b osunlp attrscore flan t5 xl https huggingface co osunlp attrscore flan t5 xl flan t5 xxl 11b osunlp attrscore flan t5 xxl https huggingface co osunlp attrscore flan t5 xxl llama 7b osunlp attrscore llama 7b https huggingface co osunlp attrscore llama 7b alpaca 7b osunlp attrscore alpaca 7b https huggingface co osunlp attrscore alpaca 7b alpaca 13b osunlp attrscore alpaca 13b https huggingface co osunlp attrscore alpaca 13b vicuna 13b osunlp attrscore vicuna 13b https huggingface co osunlp attrscore vicuna 13b for example python from transformers import autotokenizer automodelforseq2seqlm tokenizer autotokenizer from pretrained osunlp attrscore flan t5 xl model automodelforseq2seqlm from pretrained osunlp attrscore flan t5 xl input as an attribution validator your task is to verify whether a given reference can support the given claim a claim can be either a plain sentence or a question followed by its answer specifically your response should clearly indicate the relationship attributable contradictory or extrapolatory a contradictory error occurs when you can infer that the answer contradicts the fact presented in the context while an extrapolatory error means that you cannot infer the correctness of the answer based on the information provided in the context n nclaim who is the current ceo of twitter the current ceo of twitter is elon musk n reference elon musk is the ceo of twitter musk took over as ceo in october 2022 following a back and forth affair in which the billionaire proposed to purchase the social media company for 44 billion tried to back out and then ultimately went through with the acquisition after becoming ceo former ceo parag agrawal cfo ned segal and legal affairs and policy chief vijaya gadde were all dismissed from the company input ids tokenizer encode input return tensors pt outputs model generate input ids output tokenizer decode outputs 0 skip special tokens true print output attributable or simply using the pipeline python from transformers import pipeline model pipeline text2text generation osunlp attrscore flan t5 xl input as an 
attribution validator your task is to verify whether a given reference can support the given claim a claim can be either a plain sentence or a question followed by its answer specifically your response should clearly indicate the relationship attributable contradictory or extrapolatory a contradictory error occurs when you can infer that the answer contradicts the fact presented in the context while an extrapolatory error means that you cannot infer the correctness of the answer based on the information provided in the context n nclaim who is the current ceo of twitter the current ceo of twitter is elon musk n reference elon musk is the ceo of twitter musk took over as ceo in october 2022 following a back and forth affair in which the billionaire proposed to purchase the social media company for 44 billion tried to back out and then ultimately went through with the acquisition after becoming ceo former ceo parag agrawal cfo ned segal and legal affairs and policy chief vijaya gadde were all dismissed from the company output model input 0 generated text print output attributable we show an inference and evaluation script for llama based models bash python inference alpaca py model name or path osunlp attrscore vicuna 13b test data path osunlp attrscore subset name attreval simulation model max length 512 acknowledgement limitations all the datasets in this project are intended for research purpose use only we collect and annotate data for evaluation using publicly available information on the web with the assistance of a generative search engine new bing we acknowledge that llms have the potential to reproduce and amplify harmful information present in the data we made an effort to mitigate this risk by carefully selecting our evaluation data and by conducting analyses to identify and mitigate potential risks in the process our annotated evaluation set attreval gensearch is derived from new bing which uses gpt 4 as its backbone it is crucial to note that we also use gpt 4 for evaluating attribution on attreval gensearch which achieves the best performance with around 85 overall accuracy some bias might come from gpt 4 both generating the test examples and evaluating the attribution which could potentially skew our understanding of the model s true performance we therefore caution against over optimism we also acknowledge that the size of attreval gensearch is moderate which may not fully represent the real use setting of attributed llms besides the attreval simulation dataset still has gaps from the real scenario the error patterns in this simulated dataset might be overly simplistic and lack diversity which can limit the models ability to effectively handle more complex and varied real world errors it is also worth noting that this simulated dataset may contain noise and erroneous labels which could further impede the models learning and subsequent performance how to obtain higher quality training data for attribution evaluation at scale can be a major focus for future development citation information if you find this code or dataset useful please consider citing our paper bib article yue2023automatic title automatic evaluation of attribution by large language models author yue xiang and wang boshi and zhang kai and chen ziru and su yu and sun huan journal arxiv preprint arxiv 2305 06311 year 2023 contact feel free to reach out if you have any questions xiang yue mailto yue 149 osu edu yu su mailto su 809 osu edu huan sun mailto sun 397 osu edu | attribution chatgpt gpt-4 
large-language-model large-language-models llms natural-language-processing evaluation-llms | ai |
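the zero shot and few shot rows in the results above come from prompting an llm with the verification prompt; a hedged sketch of such a call with the 2023-era openai sdk (the exact code in the example prompt chatgpt gpt4 notebook may differ):

import openai

openai.api_key = open("api_key.txt").read().strip()

prompt = (
    "Verify whether a given reference can support the claim. "
    "Options: Attributable, Extrapolatory or Contradictory.\n\n"
    "Claim: ...\nReference: ..."
)
resp = openai.ChatCompletion.create(
    model="gpt-4",
    messages=[{"role": "user", "content": prompt}],
    temperature=0,  # deterministic labels for evaluation
)
print(resp["choices"][0]["message"]["content"])  # e.g. "Attributable"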
ChessVisionBot | chessvisionbot the chessvisionbot uses computer vision to detect a 2d chessboard on the screen it detects played moves by the opponent and calculates the best response given a certain time to calculate the chessvisionbot is then able to move the mouse and execute the calculated move thanks to the stockfish engine https github com official stockfish stockfish it is able to play at a very high rating with quick movetimes do not use this chessbot to play against a real human playing chess chessvisionbot vs victoria on chess24 br 3 minute time control 1700 rating chessvisionbot vs chess com computer level 10 br unlimited time 2600 rating computer vision based chess bot wins against victoria on chess24 https github com kochsebastian onlinechessbot blob master images preview2 png https www youtube com watch v z9lsjm55tng computer vision based chess bot wins against victoria on chess24 computer vision based chess bot wins against chess com computer level 10 2600 rating https github com kochsebastian onlinechessbot blob master images preview1 png https www youtube com watch v 1ateesxdodc computer vision based chess bot wins against chess com computer level 10 2600 rating https www youtube com watch v z9lsjm55tng https www youtube com watch v 1ateesxdodc solving tactics chessvisionbot solves unknown tactics on chess com computer vision based chess bot solves 25 random chess tactics in a row https github com kochsebastian onlinechessbot blob master images preview3 png https youtu be l6hsxr5kzo4 computer vision based chess bot solves 25 random chess tactics in a row https youtu be l6hsxr5kzo4 how to set up and use i won t publish how to install instructions because i want to prevent non programmers from just using this project to cheat on chess websites my intent with this project is to combine computer vision and machine learning with chess and to learn something about each of these aspects for a programmer who has the same intentions installation and setting up the environment isn t that hard even though i provide no instructions the project is only tested on a mac environment so there may be some compatibility problems but i guess nothing impossible to fix feel free to edit the code as you like if you want to contribute your changes i happily await your pull request notice tensorflow has moved on from tensorflow 1 0 and has exciting new functionalities with tensorflow 2 0 unfortunately the current tensorflow code is not compatible with tensorflow 2 0 and unfortunately i lack the time to convert the code to the current tensorflow syntax i m looking forward to a pull request from the community to update the old tensorflow code maybe even migrate the code to pytorch | python opencv tensorflow machine-learning computer-vision chess chess-ai keras neural-networks | ai |
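the engine half of the pipeline (board state in, best move out) can be sketched with python-chess; the vision and mouse-control parts are omitted, and a stockfish binary is assumed to be on the path:

import chess
import chess.engine

board = chess.Board()  # in the bot this position would come from board detection
with chess.engine.SimpleEngine.popen_uci("stockfish") as engine:
    result = engine.play(board, chess.engine.Limit(time=0.5))
    print(board.san(result.move))  # best move for the given think time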
FLM | fine tuning large language models work in progress key features 1 full use of computational resources gpu utilization bf16 model parallelism most optimal gpu utilization support parameter efficient fine tuning peft https github com huggingface peft with underlying plm loaded in lower bits int8 2 dynamically construct any kind of training evaluation dataset combination designate any of instances for each training validation dataset allows instruction tuning dynamic validation during training beats t0 by 00 support all pre training objectives fine tuning objectives e g mlm ssm etc 3 evaluate llms even 175b lms on any kind of evaluation datasets mmlu bigbench etc with flexgen https github com fminference flexgen support support inference with both decoder only lms encoder decoder lms supports multiple types of verbalizer techniques calibration etc supports all kinds of generative metrics rouge bleu mauve etc 4 log the training run via wandb 0 install dependencies install torch with the correct cuda version check nvcc version pip install torch extra index url https download pytorch org whl cu116 upgrade install hugging face libraries pip install transformers 4 26 0 datasets 2 9 0 accelerate 0 16 0 evaluate 0 4 0 upgrade install deepspeed and ninja for jit compilation of kernels pip install deepspeed 0 8 0 ninja upgrade install additional dependencies needed for training pip install rouge score nltk py7zr tensorboard scikit learn pip install sentencepiece pip install wandb also get promptsource currently getting the version that supports xp3 cd third party git clone b tr13 https github com muennighoff promptsource git cd promptsource pip install e 1 dataset preparation run the following code for preparing pretraining data python make dataset pt py config dataset configs pretrain ko json run the following code for preparing fine tuning evaluation data python make dataset ft py config dataset configs finetune basic json 2 train or evaluate any lms in huggingface before training or evaluating we will tokenize the train eval dataset before the actual training after filling out the configurations in the run configs directory run the following code for preprocessing the train eval data python preprocess dataset py config run configs train basic json finally run the run py file happy training evaluating localhost num1 num2 to designate the cuda visible devices num1 num2 code will use all gpus as default deepspeed include localhost 0 run py config run configs train basic json | ai |
|
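the peft plus int8 combination mentioned in key feature 1 could look like the sketch below the model id lora hyperparameters and target modules are assumptions not values from this repo and loading in 8 bit needs bitsandbytes and a cuda gpu

python
# hedged sketch: load a seq2seq plm in int8 and attach a lora adapter with peft
# model id and lora hyperparameters are assumptions, not values from this repo
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
from peft import LoraConfig, TaskType, get_peft_model

model_id = "google/t5-v1_1-base"  # stand-in for whichever plm the configs point at
tokenizer = AutoTokenizer.from_pretrained(model_id)
# load_in_8bit requires the bitsandbytes package and a cuda device
model = AutoModelForSeq2SeqLM.from_pretrained(model_id, load_in_8bit=True, device_map="auto")

lora_config = LoraConfig(
    task_type=TaskType.SEQ_2_SEQ_LM,
    r=8,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q", "v"],  # t5 attention projections
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only the adapter weights remain trainable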
crowdfunding_etl | crowdfunding etl a sample etl project to demonstrate mastery of a common data engineering pipeline from raw data files delivered by a client to a functioning postgresql database in this case the data has been delivered by a burgeoning crowdfunding platform with no backend architecture in place procedure this etl project is divided into the following subsections create the category and subcategory dataframes create the campaign dataframe create the contacts dataframe create the crowdfunding database create the category and subcategory dataframes 1 extract and transform the crowdfunding xlsx excel data to create a category dataframe that has the following columns a category id column that has entries going sequentially from cat1 to cat n where n is the number of unique categories a category column that contains only the category titles the following image shows the category dataframe category dataframe images category dataframe png 2 export the category dataframe as category csv and save it to your github repository 3 extract and transform the crowdfunding xlsx excel data to create a subcategory dataframe that has the following columns a subcategory id column that has entries going sequentially from subcat1 to subcatn where n is the number of unique subcategories a subcategory column that contains only the subcategory titles the following image shows this subcategory dataframe subcategory dataframe images subcategory dataframe png 4 export the subcategory dataframe as subcategory csv create the campaign dataframe 1 extract and transform the crowdfunding xlsx excel data to create a campaign dataframe that has the following columns the cf id column the contact id column the company name column the blurb column renamed to description the goal column converted to the float data type the pledged column converted to the float data type the backers count column the country column the currency column the launched at column renamed to launch date and with the utc times converted to the datetime format the deadline column renamed to end date and with the utc times converted to the datetime format the category id column with unique identification numbers matching those in the category id column of the category dataframe the subcategory id column with the unique identification numbers matching those in the subcategory id column of the subcategory dataframe the following image shows the campaign dataframe campaign dataframe images campaign dataframe png 2 export the campaign dataframe as campaign csv create the contacts dataframe 1 the following are two methods for extracting and transforming the data from the contacts xlsx excel data method 1 use python dictionary methods method 2 use regular expressions method 1 import the contacts xlsx file into a dataframe iterate through the dataframe converting each row to a dictionary iterate through each dictionary doing the following extract the dictionary values from the keys by using a python list comprehension add the values for each row to a new list create a new dataframe that contains the extracted data split each name column value into a first and last name and place each in a new column clean and export the dataframe as contacts csv and save it to your github repository method 2 import the contacts xlsx file into a dataframe extract the contact id name and email columns by using regular expressions create a new dataframe with the extracted data convert the contact id column to the integer type split each name column value into a first and a
last name and place each in a new column clean and then export the dataframe as contacts csv and save it to your github repository the following is an image of the final dataframe final contact dataframe images contact dataframe final png create the crowdfunding database 1 inspect the four csv files and then sketch an erd of the tables crowdfunding db erd postgresql files crowdfunding db erd png 2 use the information from the erd to create a table schema for each csv file a pdf containing the table schema is included in the file named crowdfunding db schema pdf 3 save the database schema as a postgres file named crowdfunding db schema sql 4 create a new postgres database named crowdfunding db the create script is included in a file named create crowdfunding db sql 5 using the database schema create the tables in the correct order to handle the foreign keys 6 verify the table creation by running a select statement for each table verification queries are included in the file named full table verification queries sql 7 import each csv file into its corresponding sql table 8 verify that each table has the correct data by running a select statement for each verification queries are included in the file named full table verification queries sql | server |
|
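a hedged pandas sketch of step 1 of the category and subcategory section above the source column name and file paths are assumptions about the raw workbook not values confirmed by this repo

python
# hedged sketch of building category.csv from the raw workbook
# the source column name and file paths are assumptions
import pandas as pd

crowdfunding_df = pd.read_excel("Resources/crowdfunding.xlsx")
# assumption: one column holds "category/subcategory" strings joined by a slash
split_cols = crowdfunding_df["category & sub-category"].str.split("/", n=1, expand=True)
crowdfunding_df["category"] = split_cols[0]
crowdfunding_df["subcategory"] = split_cols[1]

# build cat1..catn ids over the unique category titles
categories = crowdfunding_df["category"].unique()
category_df = pd.DataFrame({
    "category_id": [f"cat{i}" for i in range(1, len(categories) + 1)],
    "category": categories,
})
category_df.to_csv("category.csv", index=False)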
CHP08-Unity-step-by-step- | chp08 unity step by step unity in embedded system design and robotics a step by step guide | os |
|
gulp-boilerplate | gulp boilerplate build status https travis ci com lucaswinkler gulp boilerplate svg token 6xptyyj9yjazumpzepqi branch master https travis ci com lucaswinkler gulp boilerplate a simple boilerplate for front end web development which uses gulp https gulpjs com v4 this is my first time trying gulp so it won t be perfect i just wanted to create a basic template to work off of with a file structure i liked features live reloading cache busting scss converted to css auto prefixed and minified with sourcemaps javascript concatenated into a single file minified with sourcemaps and supports es6 image minifying getting started follow these steps in order to get the website up and running locally on your machine installation npm install to install any dependencies npm start or gulp watch to start a live reload session building npm run build or gulp to build the application extras gulp or gulp build to build the application gulp watch to enable live reload gulp clean to delete the build folder gulp styles to run the style tasks gulp scripts to run the script tasks gulp images to run the image tasks gulp favicon to run the favicon tasks file structure bash app images js main js vendors scss abstracts base pages main scss build images js app min js vendors min js css styles min css tips you can use any file structure for your javascript and scss files the one provided is an example | gulp html5 css3 sass scss browsersync livereload boilerplate | front_end |
Qwen-VL | p align left a href readme cn md a nbsp nbspenglish nbsp nbsp nbsp a href readme ja md a nbsp p br br p align center img src assets logo jpg width 400 p br p align center qwen vl a href https modelscope cn models qwen qwen vl summary a a href https huggingface co qwen qwen vl a nbsp qwen vl chat a href https modelscope cn models qwen qwen vl chat summary a a href https huggingface co qwen qwen vl chat a nbsp qwen vl chat int4 a href https huggingface co qwen qwen vl chat int4 a br a href https qianwen res oss cn beijing aliyuncs com qwen wechat group png wechat a nbsp nbsp nbsp nbsp a href https discord gg z3gaxxz9ce discord a nbsp nbsp nbsp nbsp a href https modelscope cn studios qwen qwen vl chat demo summary demo a nbsp nbsp a href https arxiv org abs 2308 12966 paper a nbsp nbsp nbsp nbsp a href https github com camenduru qwen vl chat colab colab a nbsp nbsp nbsp a href tutorial md tutorial a p br br qwen vl qwen large vision language model is the multimodal version of the large model series qwen abbr tongyi qianwen proposed by alibaba cloud qwen vl accepts image text and bounding box as inputs outputs text and bounding box the features of qwen vl include strong performance it significantly surpasses existing open sourced large vision language models lvlm under similar model scale on multiple english evaluation benchmarks including zero shot captioning vqa docvqa and grounding multi lingual lvlm supporting text recognition qwen vl naturally supports english chinese and multi lingual conversation and it promotes end to end recognition of chinese and english bi lingual text in images multi image interleaved conversations this feature allows for the input and comparison of multiple images as well as the ability to specify questions related to the images and engage in multi image storytelling first generalist model supporting grounding in chinese detecting bounding boxes through open domain language expression in both chinese and english fine grained recognition and understanding compared to the 224 224 resolution currently used by other open sourced lvlm the 448 448 resolution promotes fine grained text recognition document qa and bounding box annotation br p align center img src assets demo vl gif width 400 p br we release two models of the qwen vl series qwen vl the pre trained lvlm model uses qwen 7b as the initialization of the llm and openclip vit bigg https github com mlfoundations open clip as the initialization of the visual encoder and connects them with a randomly initialized cross attention layer qwen vl chat a multimodal llm based ai assistant which is trained with alignment techniques qwen vl chat supports more flexible interaction such as multiple image inputs multi round question answering and creative capabilities br news and updates 2023 9 25 we update qwen vl chat with more robust chinese instruction following ability improved understanding of web pages and table images and better dialogue performance touchstone cn 401 2 481 7 en 645 2 711 6 2023 9 12 we now support finetuning on the qwen vl models including full parameter finetuning lora and q lora 2023 9 8 thanks to camenduru https github com camenduru for contributing the wonderful colab https github com camenduru qwen vl chat colab everyone can use it as a local or online qwen vl chat int4 demo tutorial on one 12g gpu 2023 9 5 qwen vl chat achieves sotas on mme benchmark https github com bradyfu awesome multimodal large language models tree evaluation a comprehensive evaluation benchmark for multimodal 
large language models it measures both perception and cognition abilities on a total of 14 subtasks 2023 9 4 qwen vl series achieve sotas on seed bench https huggingface co spaces ailab cvc seed bench leaderboard a multimodal benchmark of 19k multiple choice questions with accurate human annotations for evaluating multimodal llms including both image and video understanding 2023 9 1 we release the touchstone https github com ofa sys touchstone evaluation which is a comprehensive assessment of multimodal language models encompassing not only basic recognition and comprehension but also extending to literary creation by using strong llms as judges and converting multimodal information into text 2023 8 31 we release the int4 quantized model for qwen vl chat qwen vl chat int4 which requires low memory costs but achieves improved inference speed besides there is no significant performance degradation on the benchmark evaluation 2023 8 22 we release both qwen vl and qwen vl chat on modelscope and hugging face we also provide a paper https arxiv org abs 2308 12966 for more details about the model including training details and model performance br evaluation we evaluated the model s abilities from three perspectives 1 standard benchmarks we evaluate the model s basic task capabilities on four major categories of multimodal tasks zero shot captioning evaluate model s zero shot image captioning ability on unseen datasets general vqa evaluate the general question answering ability of pictures such as the judgment color number category etc text based vqa evaluate the model s ability to recognize text in pictures such as document qa chart qa etc referring expression comprehension evaluate the ability to localize a target object in an image described by a referring expression 2 touchstone to evaluate the overall text image dialogue capability and alignment level with humans we have constructed a benchmark called touchstone https github com ofa sys touchstone which is based on scoring with gpt4 to evaluate the lvlm model the touchstone benchmark covers a total of 300 images 800 questions and 27 categories such as attribute based q a celebrity recognition writing poetry summarizing multiple images product comparison math problem solving etc in order to break the current limitation of gpt4 in terms of direct image input touchstone provides fine grained image annotations by human labeling these detailed annotations along with the questions and the model s output are then presented to gpt4 for scoring the benchmark includes both english and chinese versions 3 other multimodal benchmarks we also evaluated our model s capabilities in other multimodal benchmarks mme benchmark https github com bradyfu awesome multimodal large language models tree evaluation a comprehensive evaluation benchmark for multimodal large language models qwen vl chat achieves sotas on both perception and cognition tracks seed bench https huggingface co spaces ailab cvc seed bench leaderboard a multimodal benchmark of 19k multiple choice questions with accurate human annotations for evaluating multimodal llms qwen series achieves sotas on this benchmark the results of the evaluation are as follows qwen vl outperforms current sota generalist models on multiple vl tasks and has a more comprehensive coverage in terms of capability range p align center img src assets radar png width 600 p zero shot captioning general vqa table thead tr th rowspan 2 model type th th rowspan 2 model th th colspan 2 zero shot captioning th th colspan 5 general 
vqa th tr tr th nocaps th th flickr30k th th vqav2 sup dev sup th th ok vqa th th gqa th th sciqa img br 0 shot th th vizwiz br 0 shot th tr thead tbody align center tr td rowspan 10 generalist br models td td flamingo 9b td td td td 61 5 td td 51 8 td td 44 7 td td td td td td 28 8 td tr tr td flamingo 80b td td td td 67 2 td td 56 3 td td 50 6 td td td td td td 31 6 td tr tr td unified io xl td td 100 0 td td td td 77 9 td td 54 0 td td td td td td td tr tr td kosmos 1 td td td td 67 1 td td 51 0 td td td td td td td td 29 2 td tr tr td kosmos 2 td td td td 80 5 td td 51 1 td td td td td td td td td tr tr td blip 2 vicuna 13b td td 103 9 td td 71 6 td td 65 0 td td 45 9 td td 32 3 td td 61 0 td td 19 6 td tr tr td instructblip vicuna 13b td td strong 121 9 strong td td 82 8 td td td td td td 49 5 td td 63 1 td td 33 4 td tr tr td shikra vicuna 13b td td td td 73 9 td td 77 36 td td 47 16 td td td td td td td tr tr td strong qwen vl qwen 7b strong td td 121 4 td td b 85 8 b td td b 78 8 b td td b 58 6 b td td b 59 3 b td td 67 1 td td 35 2 td tr tr td qwen vl 4 shot td td td td td td td td 63 6 td td td td td td 39 1 td tr tr td qwen vl chat td td 120 2 td td 81 0 td td 78 2 td td 56 6 td td 57 5 td td b 68 2 b td td b 38 9 b td tr tr td qwen vl chat 4 shot td td td td td td td td 60 6 td td td td td td 44 45 td tr tr td previous sota br per task fine tuning td td td td 127 0 br pali 17b td td 84 5 br instructblip br flant5 xl td td 86 1 br pali x br 55b td td 66 1 br pali x br 55b td td 72 1 br cfr td td 92 53 br llava br gpt 4 td td 70 9 br pali x br 55b td tr tbody table for zero shot image captioning qwen vl achieves the sota on flickr30k and competitive results on nocaps with instructblip for general vqa qwen vl achieves the sota under the same generalist lvlm scale settings text oriented vqa focused on text understanding capabilities in images table thead tr th model type th th model th th textvqa th th docvqa th th chartqa th th ai2d th th ocr vqa th tr thead tbody align center tr td rowspan 5 generalist models td td blip 2 vicuna 13b td td 42 4 td td td td td td td td td tr tr td instructblip vicuna 13b td td 50 7 td td td td td td td td td tr tr td mplug docowl llama 7b td td 52 6 td td 62 2 td td 57 4 td td td td td tr tr td pix2struct large 1 3b td td td td b 76 6 b td td 58 6 td td 42 1 td td 71 3 td tr tr td qwen vl qwen 7b td td b 63 8 b td td 65 1 td td b 65 7 b td td b 62 3 b td td b 75 7 b td tr tr td specialist sotas br specialist finetuned td td pali x 55b single task ft br without ocr pipeline td td 71 44 td td 80 0 td td 70 0 td td 81 2 td td 75 0 td tr tbody table in text related recognition qa evaluation qwen vl achieves the sota under the generalist lvlm scale settings resolution is important for several above evaluations while most open sourced lvlm models with 224 resolution are incapable of these evaluations or can only solve these by cutting images qwen vl scales the resolution to 448 so that it can be evaluated end to end qwen vl even outperforms pix2struct large models of 1024 resolution on some tasks referring expression comprehension table thead tr th rowspan 2 model type th th rowspan 2 model th th colspan 3 refcoco th th colspan 3 refcoco th th colspan 2 refcocog th th grit th tr tr th val th th test a th th test b th th val th th test a th th test b th th val u th th test u th th refexp th tr thead tbody align center tr td rowspan 8 generalist models td td gpv 2 td td td td td td td td td td td td td td td td td td 51 50 td tr tr td ofa l td td 79 96 td 
td 83 67 td td 76 39 td td 68 29 td td 76 00 td td 61 75 td td 67 57 td td 67 58 td td 61 70 td tr tr td unified io td td td td td td td td td td td td td td td td td td b 78 61 b td tr tr td visionllm h td td td td 86 70 td td td td td td td td td td td td td td td tr tr td shikra 7b td td 87 01 td td 90 61 td td 80 24 td td 81 60 td td 87 36 td td 72 12 td td 82 27 td td 82 19 td td 69 34 td tr tr td shikra 13b td td 87 83 td td 91 11 td td 81 81 td td 82 89 td td 87 79 td td 74 41 td td 82 64 td td 83 16 td td 69 03 td tr tr td qwen vl 7b td td b 89 36 b td td 92 26 td td b 85 34 b td td b 83 12 b td td 88 25 td td b 77 21 b td td 85 58 td td 85 48 td td 78 22 td tr tr td qwen vl 7b chat td td 88 55 td td b 92 27 b td td 84 51 td td 82 82 td td b 88 59 b td td 76 79 td td b 85 96 b td td b 86 32 b td td td tr td rowspan 3 specialist sotas br specialist finetuned td td g dino l td td 90 56 td td 93 19 td td 88 24 td td 82 75 td td 88 95 td td 75 92 td td 86 13 td td 87 02 td td td tr tr td uninext h td td 92 64 td td 94 33 td td 91 46 td td 85 24 td td 89 63 td td 79 79 td td 88 73 td td 89 37 td td td tr tr td one peace td td 92 58 td td 94 18 td td 89 26 td td 88 77 td td 92 21 td td 83 23 td td 89 22 td td 89 27 td td td tr tbody table qwen vl achieves the sota in all above referring expression comprehension benchmarks qwen vl has not been trained on any chinese grounding data but it can still generalize to the chinese grounding tasks in a zero shot way by training chinese caption data and english grounding data we provide all of the above evaluation scripts for reproducing our experimental results please read eval mm evaluation md eval mm evaluation md for more information chat evaluation touchstone is a benchmark based on scoring with gpt4 to evaluate the abilities of the lvlm model on text image dialogue and alignment levels with humans it covers a total of 300 images 800 questions and 27 categories such as attribute based q a celebrity recognition writing poetry summarizing multiple images product comparison math problem solving etc please read touchstone readme md touchstone readme md for more information english evaluation model score pandagpt 488 5 minigpt4 531 7 instructblip 552 4 llama adapterv2 590 1 llava 602 7 mplug owl 605 4 qwen vl chat 645 2 qwen vl chat 1 1 711 6 chinese evaluation model score visualglm 247 1 qwen vl chat 401 2 qwen vl chat 1 1 481 7 qwen vl chat has achieved the best results in both chinese and english alignment evaluation other benchmarks mme benchmark mme https github com bradyfu awesome multimodal large language models tree evaluation is a comprehensive evaluation benchmark for multimodal large language models it measures both perception and cognition abilities on a total of 14 subtasks including existence count position color poster celebrity scene landmark artwork ocr commonsense reasoning numerical calculation text translation and code reasoning qwen vl chat achieves sotas on both perception and cognition evaluation see more details on here eval mm mme eval mme md p align center img src eval mm mme perception jpg width 600 p p align center img src eval mm mme cognition jpg width 600 p seed bench seed bench https huggingface co spaces ailab cvc seed bench leaderboard is a multimodal benchmark of 19k multiple choice questions with accurate human annotations for evaluating multimodal llms covering 12 evaluation dimensions including both image and video understanding see more details on here eval mm seed bench eval seed md qwen vl and qwen vl chat 
achieve sotas on this benchmark p align center img src eval mm seed bench leaderboard jpg p requirements python 3 8 and above pytorch 1 12 and above 2 0 and above are recommended cuda 11 4 and above are recommended this is for gpu users br quickstart below we provide simple examples to show how to use qwen vl and qwen vl chat with modelscope and transformers before running the code make sure you have setup the environment and installed the required packages make sure you meet the above requirements and then install the dependent libraries bash pip install r requirements txt now you can start with modelscope or transformers more usage aboue vision encoder please refer to the tutorial tutorial md transformers to use qwen vl chat for the inference all you need to do is to input a few lines of codes as demonstrated below however please make sure that you are using the latest code python from transformers import automodelforcausallm autotokenizer from transformers generation import generationconfig import torch torch manual seed 1234 note the default behavior now has injection attack prevention off tokenizer autotokenizer from pretrained qwen qwen vl chat trust remote code true use bf16 model automodelforcausallm from pretrained qwen qwen vl chat device map auto trust remote code true bf16 true eval use fp16 model automodelforcausallm from pretrained qwen qwen vl chat device map auto trust remote code true fp16 true eval use cpu only model automodelforcausallm from pretrained qwen qwen vl chat device map cpu trust remote code true eval use cuda device model automodelforcausallm from pretrained qwen qwen vl chat device map cuda trust remote code true eval specify hyperparameters for generation model generation config generationconfig from pretrained qwen qwen vl chat trust remote code true 1st dialogue turn query tokenizer from list format image https qianwen res oss cn beijing aliyuncs com qwen vl assets demo jpeg either a local path or an url text response history model chat tokenizer query query history none print response 2nd dialogue turn response history model chat tokenizer history history print response ref ref box 536 509 588 602 box image tokenizer draw bbox on latest picture response history if image image save 1 jpg else print no box p align center img src assets demo highfive jpg width 500 p details summary running qwen vl summary running qwen vl pretrained base model is also simple python from transformers import automodelforcausallm autotokenizer from transformers generation import generationconfig import torch torch manual seed 1234 tokenizer autotokenizer from pretrained qwen qwen vl trust remote code true use bf16 model automodelforcausallm from pretrained qwen qwen vl device map auto trust remote code true bf16 true eval use fp16 model automodelforcausallm from pretrained qwen qwen vl device map auto trust remote code true fp16 true eval use cpu only model automodelforcausallm from pretrained qwen qwen vl device map cpu trust remote code true eval use cuda device model automodelforcausallm from pretrained qwen qwen vl device map cuda trust remote code true eval specify hyperparameters for generation no need to do this if you are using transformers 4 32 0 model generation config generationconfig from pretrained qwen qwen vl trust remote code true query tokenizer from list format image https qianwen res oss cn beijing aliyuncs com qwen vl assets demo jpeg either a local path or an url text generate the caption in english with grounding inputs tokenizer query return tensors pt 
inputs inputs to model device pred model generate inputs response tokenizer decode pred cpu 0 skip special tokens false print response img https qianwen res oss cn beijing aliyuncs com qwen vl assets demo jpeg img generate the caption in english with grounding ref woman ref box 451 379 731 806 box and ref her dog ref box 219 424 576 896 box playing on the beach endoftext image tokenizer draw bbox on latest picture response if image image save 2 jpg else print no box p align center img src assets demo spotting caption jpg width 500 p details in the event of a network issue while attempting to download model checkpoints and codes from huggingface an alternative approach is to initially fetch the checkpoint from modelscope and then load it from the local directory as outlined below python from modelscope import snapshot download from transformers import automodelforcausallm autotokenizer downloading model checkpoint to a local dir model dir model dir snapshot download qwen qwen vl model dir snapshot download qwen qwen vl chat loading local checkpoints trust remote code is still set as true since we still load codes from local dir instead of transformers tokenizer autotokenizer from pretrained model dir trust remote code true model automodelforcausallm from pretrained model dir device map cuda trust remote code true eval modelscope modelscope is an opensource platform for model as a service maas which provides flexible and cost effective model service to ai developers similarly you can run the models with modelscope as shown below python from modelscope import snapshot download automodelforcausallm autotokenizer generationconfig import torch model id qwen qwen vl chat revision v1 0 0 model dir snapshot download model id revision revision torch manual seed 1234 tokenizer autotokenizer from pretrained model dir trust remote code true if not hasattr tokenizer model dir tokenizer model dir model dir use bf16 model automodelforcausallm from pretrained model dir device map auto trust remote code true bf16 true eval use fp16 model automodelforcausallm from pretrained model dir device map auto trust remote code true fp16 true eval use cpu model automodelforcausallm from pretrained model dir device map cpu trust remote code true eval use auto model automodelforcausallm from pretrained model dir device map auto trust remote code true eval specify hyperparameters for generation no need to do this if you are using transformers 4 32 0 model generation config generationconfig from pretrained model dir trust remote code true 1st dialogue turn either a local path or an url between img img tags image path https qianwen res oss cn beijing aliyuncs com qwen vl assets demo jpeg response history model chat tokenizer query f img image path img history none print response 2nd dialogue turn response history model chat tokenizer history history print response ref ref box 211 412 577 891 box image tokenizer draw bbox on latest picture response history if image image save output chat jpg else print no box p align center img src assets demo highfive jpg width 500 p br quantization usage we provide a new solution based on autogptq https github com panqiwei autogptq and release an int4 quantized model for qwen vl chat qwen vl chat int4 click here https huggingface co qwen qwen vl chat int4 which achieves nearly lossless model effects but improved performance on both memory costs and inference speed here we demonstrate how to use our provided quantized models for inference before you start make sure you meet the 
requirements e g torch 2 0 and above transformers 4 32 0 and above etc and install the required packages bash pip install optimum git clone https github com justinlin610 autogptq git cd autogptq pip install v if you meet problems installing auto gptq we advise you to check out the official repo https github com panqiwei autogptq to find a wheel then you can load the quantized model easily and run inference as same as usual python model automodelforcausallm from pretrained qwen qwen vl chat int4 device map auto trust remote code true eval either a local path or an url between img img tags image path https qianwen res oss cn beijing aliyuncs com qwen vl assets demo jpeg response history model chat tokenizer query f img image path img history none print response performance we illustrate the model performance of both bf16 and int4 models on the benchmark touchstone https github com ofa sys touchstone and we find that the quantized model does not suffer from significant performance degradation results are shown below quantization zh en bf16 401 2 645 2 int4 386 6 651 4 inference speed we measured the average inference speed tokens s of generating 1792 2048 258 and 7934 8192 258 tokens with the context of an image which takes 258 tokens under bf16 precision and int4 quantization respectively quantization speed 2048 tokens speed 8192 tokens bf16 28 87 24 32 int4 37 79 34 34 the profiling runs on a single a100 sxm4 80g gpu with pytorch 2 0 1 and cuda 11 4 gpu memory usage we also profile the peak gpu memory usage for encoding 1792 2048 258 tokens including an image as context and generating single token and generating 7934 8192 258 tokens with an image as context under bf16 or int4 quantization level respectively the results are shown below quantization peak usage for encoding 2048 tokens peak usage for generating 8192 tokens bf16 22 60gb 28 01gb int4 11 82gb 17 23gb the above speed and memory profiling are conducted using this script https qianwen res oss cn beijing aliyuncs com profile mm py br finetuning now we provide the official training script finetune py for users to finetune the pretrained model for downstream applications in a simple fashion additionally we provide shell scripts to launch finetuning with no worries this script supports the training with deepspeed and fsdp the shell scripts that we provide use deepspeed and thus we advise you to install deepspeed before you start bash pip install deepspeed data preparation to prepare your training data you need to put all the samples into a list and save it to a json file each sample is a dictionary consisting of an id and a list for conversation below is a simple example list with 1 sample json id identity 0 conversations from user value from assistant value qwen vl id identity 1 conversations from user value picture 1 img https qianwen res oss cn beijing aliyuncs com qwen vl assets demo jpeg img n from assistant value from user value from assistant value ref ref box 588 499 725 789 box id identity 2 conversations from user value picture 1 img assets mm tutorial chongqing jpeg img npicture 2 img assets mm tutorial beijing jpeg img n from assistant value for the vl tasks there are special tokens that are used including img img ref ref box box the picture is represented as picture id img img path img n your prompt where id indicates the position of the image in the conversation starting from 1 the img path can be a local file path or a web link the coordinate box is expressed as box x1 y1 x2 y2 box where x1 y1 and x2 y2 are normalized 
values in the range 0 1000 its corresponding text description can be identified by ref text caption ref after data preparation you can use the provided shell scripts to run finetuning remember to specify the path to the data file data the finetuning scripts allow you to perform full parameter finetuning lora q lora full parameter finetuning full parameter parameter finetuning requires updating all parameters of llm in the whole training process in our experiments frozening the parameters of vit during the fine tuning phase achieves better performance to launch your training run the following script bash sh finetune finetune ds sh remember to specify the correct model name or path the data path as well as the output directory in the shell scripts if you want to make changes just remove the argument deepspeed or make changes in the deepspeed configuration json file based on your requirements additionally this script supports mixed precision training and thus you can use bf16 true or fp16 true empirically we advise you to use bf16 to make your training consistent with our pretraining and alignment if your machine supports bf16 and thus we use it by default lora similarly to run lora use another script to run as shown below before you start make sure that you have installed peft also you need to specify your paths to your model data and output we advise you to use absolute path for your pretrained model this is because lora only saves the adapter and the absolute path in the adapter configuration json file is used for finding out the pretrained model to load bash single gpu training sh finetune finetune lora single gpu sh distributed training sh finetune finetune lora ds sh in comparison with full parameter finetuning lora paper https arxiv org abs 2106 09685 only updates the parameters of adapter layers but keeps the original large language model layers frozen this allows much fewer memory costs and thus fewer computation costs note that if you use lora to finetune the base language model e g qwen vl instead of chat models e g qwen vl chat the script automatically switches the embedding and output layer as trainable parameters this is because the base language model has no knowledge of special tokens brought by chatml format thus these layers should be updated for the model to understand and predict the tokens or in another word if your training brings in special tokens in lora you should set the layers to trainable parameters by setting modules to save inside the code additionally we find that there is a significant gap between the memory footprint of lora with and without these trainable parameters therefore if you have trouble with memory we advise you to lora finetune the chat models check the profile below for more information q lora however if you still suffer from insufficient memory you can consider q lora paper https arxiv org abs 2305 14314 which uses the quantized large language model and other techniques such as paged attention to allow even fewer memory costs to run q lora directly run the following script bash single gpu training sh finetune finetune qlora single gpu sh distributed training sh finetune finetune qlora ds sh for q lora we advise you to load our provided quantized model e g qwen vl chat int4 you should not use the bf16 models different from full parameter finetuning and lora only fp16 is supported for q lora besides for q lora the troubles with the special tokens in lora still exist however as we only provide the int4 models for chat models which means the language 
model has learned the special tokens of chatml format you have no worry about the layers note that the layers of the int4 model should not be trainable and thus if you introduce special tokens in your training q lora might not work different from full parameter finetuning the training of both lora and q lora only saves the adapter parameters you can load the finetuned model for inference as shown below python from peft import autopeftmodelforcausallm model autopeftmodelforcausallm from pretrained path to adapter path to the output directory device map auto trust remote code true eval if you want to merge the adapters and save the finetuned model as a standalone model you can only do this with lora and you cannot merge the parameters from q lora you can run the following codes python from peft import autopeftmodelforcausallm model autopeftmodelforcausallm from pretrained path to adapter path to the output directory device map auto trust remote code true eval merged model model merge and unload max shard size and safe serialization are not necessary they respectively work for sharding checkpoint and save the model to safetensors merged model save pretrained new model directory max shard size 2048mb safe serialization true note for multi gpu training you need to specify the proper hyperparameters for distributed training based on your machine besides we advise you to specify your maximum sequence length with the argument model max length based on your consideration of data memory footprint and training speed profiling of memory and speed we profile the gpu memory and training speed of both lora base refers to training the embedding and output layer while lora chat has no trainable embedding and output layer and q lora in the setup of single gpu training in this test we experiment on a single a100 sxm4 80g gpu and we use cuda 11 8 and pytorch 2 0 we uniformly use a batch size of 1 and gradient accumulation of 8 each sample contains an image we profile the memory gb and speed s iter of inputs of different lengths namely 384 512 1024 and 2048 the statistics are listed below table tr th rowspan 2 method th th colspan 4 align center sequence length th tr tr th align center 384 th th align center 512 th th align center 1024 th th align center 2048 th tr tr td lora base td td align center 37 1g 2 3s it td td align center 37 3g 2 4s it td td align center 38 7g 3 6s it td td align center 38 7g 6 1s it td tr tr td lora chat td td align center 23 3g 2 2s it td td align center 23 6g 2 3s it td td align center 25 1g 3 5s it td td align center 27 3g 5 9s it td tr tr td q lora td td align center 17 0g 4 2s it td td align center 17 2g 4 5s it td td align center 18 2g 5 5s it td td align center 19 3g 7 9s it td tr table br demo web ui we provide code for users to build a web ui demo before you start make sure you install the following packages pip install r requirements web demo txt then run the command below and click on the generated link python web demo mm py br faq if you meet problems please refer to faq faq md and the issues first to search a solution before you launch a new issue br license agreement researchers and developers are free to use the codes and model weights of both qwen vl and qwen vl chat we also allow their commercial use check our license at license license for more details br citation if you find our paper and code useful in your research please consider giving a star star and citation pencil bibtex article qwen vl title qwen vl a versatile vision language model for understanding 
localization text reading and beyond author bai jinze and bai shuai and yang shusheng and wang shijie and tan sinan and wang peng and lin junyang and zhou chang and zhou jingren journal arxiv preprint arxiv 2308 12966 year 2023 br contact us if you are interested to leave a message to either our research team or product team feel free to send an email to qianwen opensource alibabacloud com | large-language-models vision-language-model | ai |
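a hedged sketch that writes one training sample in the finetuning json format described in the data preparation section above the file path conversation text and box coordinates are placeholders and the coordinate formatting inside the tags follows the readme description rather than repo code

python
# hedged sketch: emit one training sample in the finetuning json format described above
# path, conversation text and box coordinates are placeholders
import json

sample = {
    "id": "identity_0",
    "conversations": [
        {
            "from": "user",
            "value": "Picture 1: <img>assets/mm_tutorial/Chongqing.jpeg</img>\nWhat is in the image?",
        },
        {
            "from": "assistant",
            # coordinates are normalized to the 0-1000 range, as the readme specifies
            "value": "<ref>a city skyline over a river</ref><box>(120,340),(880,760)</box>",
        },
    ],
}

with open("train.json", "w", encoding="utf-8") as f:
    json.dump([sample], f, ensure_ascii=False, indent=2)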
Augmentor | augmentorlogo https github com mdbloice augmentorfiles blob master misc augmentorlogo png augmentor is an image augmentation library in python for machine learning it aims to be a standalone library that is platform and framework independent which is more convenient allows for finer grained control over augmentation and implements the most real world relevant augmentation techniques it employs a stochastic approach using building blocks that allow for operations to be pieced together in a pipeline pypi https img shields io badge augmentor v0 2 10 blue svg maxage 2592000 https pypi python org pypi augmentor supported python versions https img shields io badge python 2 7 20 7c 203 5 20 7c 203 6 20 7c 203 7 20 7c 203 8 20 7c 203 9 blue svg https pypi python org pypi augmentor pypi install https github com mdbloice augmentor actions workflows pypi yml badge svg https github com mdbloice augmentor actions workflows pypi yml pytest https github com mdbloice augmentor actions workflows package tests yml badge svg https github com mdbloice augmentor actions workflows package tests yml documentation status https readthedocs org projects augmentor badge version master https augmentor readthedocs io en master badge master license http img shields io badge license mit brightgreen svg style flat license md project status active the project has reached a stable usable state and is being actively developed http www repostatus org badges latest active svg http www repostatus org active binder https mybinder org badge svg https mybinder org v2 gh 4quantoss augmentor master installation augmentor is written in python a julia version of the package is also being developed as a sister project and is available here https github com evizero augmentor jl install using pip from the command line python pip install augmentor see the documentation for building from source to upgrade from a previous version use pip install augmentor upgrade documentation complete documentation can be found on read the docs https augmentor readthedocs io https augmentor readthedocs io en stable quick start guide and usage the purpose of augmentor is to automate image augmentation artificial data generation in order to expand datasets as input for machine learning algorithms especially neural networks and deep learning the package works by building an augmentation pipeline where you define a series of operations to perform on a set of images operations such as rotations or transforms are added one by one to create an augmentation pipeline when complete the pipeline can be executed and an augmented dataset is created to begin instantiate a pipeline object that points to a directory on your file system python import augmentor p augmentor pipeline path to images you can then add operations to the pipeline object p as follows python p rotate probability 0 7 max left rotation 10 max right rotation 10 p zoom probability 0 5 min factor 1 1 max factor 1 5 every function requires you to specify a probability which is used to decide if an operation is applied to an image as it is passed through the augmentation pipeline once you have created a pipeline you can sample from it like so python p sample 10000 which will generate 10 000 augmented images based on your specifications by default these will be written to the disk in a directory named output relative to the path specified when initialising the p pipeline object above if you wish to process each image in the pipeline exactly once use process python p process this function might 
be useful for resizing a dataset for example it would make sense to create a pipeline where all of its operations have their probability set to 1 when using the process method multi threading augmentor version 0 2 1 now uses multi threading to increase the speed of generating images this may slow down some pipelines if the original images are very small set multi threaded to false if slowdown is experienced python p sample 100 multi threaded false however by default the sample function uses multi threading this is currently only implemented when saving to disk generators will use multi threading in the next version update ground truth data images can be passed through the pipeline in groups of two or more so that ground truth data can be identically augmented original image and mask sup 3 sup augmented original and mask images originalmask https raw githubusercontent com mdbloice augmentorfiles master usageguide original with mask png augmentedmask https raw githubusercontent com mdbloice augmentorfiles master usageguide ground truth gif to augment ground truth data in parallel to any original data add a ground truth directory to a pipeline using the ground truth https augmentor readthedocs io en master code html augmentor pipeline pipeline ground truth function python p augmentor pipeline path to images point to a directory containing ground truth data images with the same file names will be added as ground truth data and augmented in parallel to the original data p ground truth path to ground truth images add operations to the pipeline as normal p rotate probability 1 max left rotation 5 max right rotation 5 p flip left right probability 0 5 p zoom random probability 0 5 percentage area 0 8 p flip top bottom probability 0 5 p sample 50 multiple mask image augmentation using the datapipeline class augmentor version 0 2 3 images that have multiple associated masks can be augmented multiple mask augmentation multiplemask https github com mdbloice augmentorfiles blob master usageguide merged multi mask gif arbitrarily long lists of images can be passed through the pipeline in groups and augmented identically using the datapipeline class this is useful for ground truth images that have several masks for example in the example below the images and their masks are contained in the images data structure as lists of lists while their labels are contained in y python p augmentor datapipeline images y p rotate 1 max left rotation 5 max right rotation 5 p flip top bottom 0 5 p zoom random 1 percentage area 0 5 augmented images labels p sample 100 the datapipeline returns images directly augmented images above and does not save them to disk nor does it read data from the disk images are passed directly to datapipeline during initialisation for details of the images data structure and how to create it see the multiple mask augmentation ipynb https github com mdbloice augmentor blob master notebooks multiple mask augmentation ipynb jupyter notebook generators for keras and pytorch if you do not wish to save to disk you can use a generator in this case with keras python g p keras generator batch size 128 images labels next g which returns a batch of images of size 128 and their corresponding labels generators return data indefinitely and can be used to train neural networks with augmented data on the fly alternatively you can integrate it with pytorch python import torchvision transforms torchvision transforms compose p torch transform torchvision transforms totensor main features elastic distortions 
using elastic distortions one image can be used to generate many images that are real world feasible and label preserving input image augmented images eight hand drawn border https cloud githubusercontent com assets 16042756 23697279 79850d52 03e7 11e7 9445 475316b702a3 png eights border https cloud githubusercontent com assets 16042756 23697283 802698a6 03e7 11e7 94b7 f0b61977ef33 gif the input image has a 1 pixel black border to emphasise that you are getting distortions without changing the size or aspect ratio of the original image and without any black transparent padding around the newly generated images the functionality can be more clearly seen here original image sup 1 sup random distortions applied original https raw githubusercontent com mdbloice augmentorfiles master usageguide orig png distorted https raw githubusercontent com mdbloice augmentorfiles master usageguide distort gif perspective transforms there are a total of 12 different types of perspective transform available four of the most common are shown below tilt left tilt right tilt forward tilt backward tiltleft https raw githubusercontent com mdbloice augmentorfiles master usageguide tiltleft s png original https raw githubusercontent com mdbloice augmentorfiles master usageguide tiltright s png original https raw githubusercontent com mdbloice augmentorfiles master usageguide tiltforward s png original https raw githubusercontent com mdbloice augmentorfiles master usageguide tiltbackward s png the remaining eight types of transform are as follows skew type 0 skew type 1 skew type 2 skew type 3 skew0 https raw githubusercontent com mdbloice augmentorfiles master usageguide corner0 s png skew1 https raw githubusercontent com mdbloice augmentorfiles master usageguide corner1 s png skew2 https raw githubusercontent com mdbloice augmentorfiles master usageguide corner2 s png skew3 https raw githubusercontent com mdbloice augmentorfiles master usageguide corner3 s png skew type 4 skew type 5 skew type 6 skew type 7 skew4 https raw githubusercontent com mdbloice augmentorfiles master usageguide corner4 s png skew5 https raw githubusercontent com mdbloice augmentorfiles master usageguide corner5 s png skew6 https raw githubusercontent com mdbloice augmentorfiles master usageguide corner6 s png skew7 https raw githubusercontent com mdbloice augmentorfiles master usageguide corner7 s png size preserving rotations rotations by default preserve the file size of the original images original image rotated 10 degrees automatically cropped original https raw githubusercontent com mdbloice augmentorfiles master usageguide orig png rotate https raw githubusercontent com mdbloice augmentorfiles master usageguide rotate aug b png compared to rotations by other software original image rotated 10 degrees original https raw githubusercontent com mdbloice augmentorfiles master usageguide orig png rotate https raw githubusercontent com mdbloice augmentorfiles master usageguide rotate png size preserving shearing shearing will also automatically crop the correct area from the sheared image so that you have an image with no black space or padding original image shear x axis 20 degrees shear y axis 20 degrees original https raw githubusercontent com mdbloice augmentorfiles master usageguide orig png shearx https raw githubusercontent com mdbloice augmentorfiles master usageguide shear x aug png sheary https raw githubusercontent com mdbloice augmentorfiles master usageguide shear y aug png compare this to how this is normally done original 
image shear x axis 20 degrees shear y axis 20 degrees original https raw githubusercontent com mdbloice augmentorfiles master usageguide orig png shearx https raw githubusercontent com mdbloice augmentorfiles master usageguide shear x png sheary https raw githubusercontent com mdbloice augmentorfiles master usageguide shear y png cropping cropping can also be handled in a manner more suitable for machine learning image augmentation original image random crops resize operation original https raw githubusercontent com mdbloice augmentorfiles master usageguide orig png original https raw githubusercontent com mdbloice augmentorfiles master usageguide crop resize gif random erasing random erasing is a technique used to make models robust to occlusion this may be useful for training neural networks used in object detection in navigation scenarios for example original image sup 2 sup random erasing original https raw githubusercontent com mdbloice augmentorfiles master usageguide city road street italy scaled jpg original https raw githubusercontent com mdbloice augmentorfiles master usageguide city road street italy animation gif see the pipeline random erasing https augmentor readthedocs io en stable code html augmentor pipeline pipeline random erasing documentation for usage chaining operations in a pipeline with only a few operations a single image can be augmented to produce large numbers of new label preserving samples original image distortions mirroring original https raw githubusercontent com mdbloice augmentorfiles master usageguide eight 200px png distortflipflop https raw githubusercontent com mdbloice augmentorfiles master usageguide flip distort gif in the example above we have applied three operations first we randomly distort the image then we flip it horizontally with a probability of 0 5 and then vertically with a probability of 0 5 we then sample from this pipeline 100 times to create 100 new data python p random distortion probability 1 grid width 4 grid height 4 magnitude 8 p flip left right probability 0 5 p flip top bottom probability 0 5 p sample 100 tutorial notebooks integration with keras using generators augmentor can be used as a replacement for keras augmentation functionality augmentor can create a generator which produces augmented data indefinitely according to the pipeline you have defined see the following notebooks for details reading images from a local directory augmenting them at run time and using a generator to pass the augmented stream of images to a keras convolutional neural network see augmentor keras ipynb https github com mdbloice augmentor blob master notebooks augmentor keras ipynb augmenting data in memory in array format and using a generator to pass these new images to the keras neural network see augmentor keras array data ipynb https github com mdbloice augmentor blob master notebooks augmentor keras array data ipynb per class augmentation strategies augmentor allows for pipelines to be defined per class that is you can define different augmentation strategies on a class by class basis for a given classification problem see an example of this in the following jupyter notebook per class augmentation strategy ipynb https github com mdbloice augmentor blob master notebooks per class augmentation strategy ipynb complete example let s perform an augmentation task on a single image demonstrating the pipeline and several features of augmentor first import the package and initialise a pipeline object by pointing it to a directory containing your 
images python import augmentor p augmentor pipeline home user augmentor data tests now you can begin adding operations to the pipeline object python p rotate90 probability 0 5 p rotate270 probability 0 5 p flip left right probability 0 8 p flip top bottom probability 0 3 p crop random probability 1 percentage area 0 5 p resize probability 1 0 width 120 height 120 once you have added the operations you require you can sample images from this pipeline python p sample 100 some sample output input image sup 3 sup augmented images original https cloud githubusercontent com assets 16042756 23019262 b696e3a6 f441 11e6 958d 17f18f2cd35e jpg augmented https cloud githubusercontent com assets 16042756 23018832 cda6967e f43f 11e6 9082 765c291f1fd6 gif the augmented images may be useful for a boundary detection task for example licence and acknowledgements augmentor is made available under the terms of the mit licence see licence md https github com mdbloice augmentor blob master license md 1 checkerboard image obtained from wikimedia commons and is in the public domain https commons wikimedia org wiki file checkerboard pattern svg 2 street view image is in the public domain http stokpic com project italian city street with shoppers 3 skin lesion image obtained from the isic archive image id 5436e3abbae478396759f0cf download https isic archive com 443 api v1 image 5436e3abbae478396759f0cf download you can use urllib to obtain the skin lesion image in order to reproduce the augmented images above python from urllib import urlretrieve im url https isic archive com 443 api v1 image 5436e3abbae478396759f0cf download urlretrieve im url isic 0000000 jpg isic 0000000 jpg httplib httpmessage instance at 0x7f7bd949a950 note for python 3 use from urllib request import urlretrieve logo created at logomakr com https logomakr com tests to run the automated tests clone the repository and run bash py test v from the command line to view the ci tests that are run after each commit see https travis ci org mdbloice augmentor asciicast click the preview below to view a video demonstration of augmentor in use asciicast https asciinema org a 105368 png https asciinema org a 105368 autoplay 1 speed 3 | augmentation machine-learning deep-learning neural-networks | ai |
BlockchainDevReport | blockchain development report source code and full methodology for outlier ventures blockchain development reports latest one can be found here https outlierventures io research blockchain developer trends 2021 setup install requires python sh pip3 install r requirements txt add github pats for all large data pulling operations from github github personal access tokens pat https help github com en github authenticating to github creating a personal access token for the command line are required as user to github server requests are rate limited at 5000 requests per hour per authenticated user no scope access is required for the tokens ps if you have private repos be sure to use a token that only has the public repo scope create a env refer to env sample to store all the github pats in a single space seperated list these pats will be used in round robin to access the various github organisations and repositories update config optional in the config ini file there are three categories of protocols projects namely blockchain defi nft metaverse each category contains the protocols projects analysed for the blockchain development trends 2021 report https outlierventures io research blockchain developer trends 2021 to run for a particular category uncomment the corresponding section and run script s for blockchian defi nft protocols projects you can also add protocols projects you want the scripts to analyse update protocols optional the analysis is based on core repositories for each protocol with the electric capital s crowdsourced crypto ecosystems https github com electric capital crypto ecosystems index being used as the base where we have manually curated relevant organisations per ecosystem based on thorough research therefore we would advise against updating protocol toml as it would overwrite the manual curation of organisations all of the ecosystems are specified in toml configuration files to update toml files of the protocols projects added for comparision by you to the config you can follow either of the two steps automated comment all categories of protocols projects in the config ini create a new variable called chains in the config ini containing their names in a single space seperated list ensure that their names are the same as toml file names of the corresponding electric capital crypto ecosytems then run the following command sh python3 updateprotocols py manual create a file in the protocols sub folder with the same name as that of the toml files corresponding to the protocols projects in the electric capital crypto ecosytems and copy and paste the contents in it usage protocol core development sh python3 dev py protocol name this analyses historical commits code changes and statistics for the each of the github organisations belonging to the protocol summed across repositories for the default branch main master results are written to 2 files protocol name stats json latest stats such as star count and code churn in the last month protocol name history json historical commits and code churn additions and deletions on a week by week basis protocol core contributing developers sh python3 contr py protcocols protocol name toml the total number active in the past year is printed and the usernames written to protocol name contributors json it saves all the seen repositories in the protocol name repos seen txt if an error occurs rerunning this script will start analysing from the point where it crashed ignoring all seen repos visualizing results once you have 
run both of the above run for all the protocols projects you can visualize results using the following command sh python3 vis py results are written to files commits csv commits png commits change png churn csv churn png churn change png devs csv devs png and devs change png note that churn refers to the number of code changes one stop shell script methodology we have based our analsysis on core repositories for each protocol using electric capital s crowdsourced crypto ecosystems https github com electric capital crypto ecosystems index as the base with manual curation of relevant organisations per ecosystem all the core repositories of each of the github organizations of a protocol were taken and the forked repositories when marked as such on github were ignored forking repositories is very common practice and leads to the development activity of one ecosystem being included in another including all forks in the analysis adds a lot more noise than signal for similar reasons only activity for the default branch main or master of each repository was included in these unforked repositories all commits to the default branch were indexed and analyzed we attribute the development activity for each organization on github to a single protocol and don t include individual repositories outside of those organizations to most accurately show development activity to the core development of protocols for the blockchain development trends 2021 report github data was pulled for the duration of 27 january 31 december 2020 the toml configuration files used for organisations and repositories analysed for the core development and developer count are in the protocols folder core protocol development historical commits and code changes commits and code changes are pulled directly from the github api these are pulled per repository and then summed for all repositories in a given organisation for their default branch main master the data points used are the total number of commits and total number of code changes additions deletions each week across all branches in the visualisation a 4 week moving average is taken to smooth the data the data collection is in dev py and the visualization is in vis py core developer contributing to a protocol all commits are pulled from each repo and the date as well as the author github username returned any commits with a date from more than one year in the past are filtered out the process is repeated for all repos in the toml file with the resulting list of contributors combined and de duplicated the data collection is in contr py and the visualization is in vis py github statistics a measurement of the sum total of stars forks and releases of each of the core repositories of the protocols github organization indicating in some way its popularity and activity visualization including growth calculation commit and churn charts are visualised using a 4 week moving average to smooth data therefore the curve lags by approximately 2 weeks developer activity charts display the raw data growth charts percentage change take an average of the last 8 weeks of the year and compare this figure to the first 8 weeks of the year rounding to the nearest whole percentage point this applies to all growth charts commit churn and developer activity | blockchain |
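As a concrete illustration of the smoothing and growth calculations described in this methodology (this is not the repository's actual code, and the file and column names are assumptions), the 4-week moving average and the first-vs-last-8-weeks growth figure can be computed like this:

```python
# Illustrative only: 4-week moving average and growth calculation as
# described above. File and column names are hypothetical.
import pandas as pd

df = pd.read_csv("commits.csv")  # weekly commit counts for one protocol
df["commits_ma4"] = df["commits"].rolling(window=4).mean()

first8 = df["commits"].head(8).mean()  # first 8 weeks of the year
last8 = df["commits"].tail(8).mean()   # last 8 weeks of the year
growth = round((last8 / first8 - 1) * 100)
print(f"{growth}% change")
```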
|
tinyhttp | div align center img src https raw githubusercontent com deno libs tinyhttp master logo svg h1 align center tinyhttp h1 nest badge nest badge https nest land package tinyhttp github workflow status gh actions img github actions codecov codecov badge codecov docs badge docs code quality img code quality dependency count deps div this is a deno https deno land port of tinyhttp https github com talentlessguy tinyhttp 0 legacy tiny amp fast web framework as a replacement of express example ts import app from https deno land x tinyhttp app ts const app new app app get name async req res await res send hello on req url from deno v deno version deno and tinyhttp app listen 3000 console log started on 3000 docs badge https img shields io github v release deno libs tinyhttp label docs logo deno style for the badge color b06892 docs https doc deno land https deno land x tinyhttp mod ts gh actions img https img shields io github actions workflow status deno libs tinyhttp main yml branch master style for the badge logo github label color b06892 codecov https coveralls io github deno libs tinyhttp github actions https github com deno libs tinyhttp actions codecov badge https img shields io coveralls github deno libs tinyhttp style for the badge color b06892 nest badge https img shields io badge publushed 20on nest land b06892 style for the badge code quality img https img shields io codefactor grade github deno libs tinyhttp style for the badge color b06892 code quality https www codefactor io repository github deno libs tinyhttp deps https img shields io endpoint url https 3a 2f 2fdeno visualizer danopia net 2fshields 2fdep count 2fhttps 2fx nest land 2ftinyhttp 400 1 23 2fmod ts style for the badge color b06892 | deno tinyhttp web-framework backend http http-server | front_end |
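Beyond the hello-world example above, tinyhttp's Express-like API also supports middleware; the sketch below assumes the Deno port mirrors tinyhttp's app.use() behaviour (the logger itself is illustrative, not part of the README):

```ts
// Assumes the Deno port keeps tinyhttp's Express-style middleware API.
import { App } from 'https://deno.land/x/tinyhttp/app.ts'

const app = new App()

// Illustrative logging middleware: runs before every route handler.
app.use((req, _res, next) => {
  console.log(`${req.method} ${req.url}`)
  next()
})

app.get('/', async (_req, res) => void (await res.send('Hello tinyhttp!')))

app.listen(3000, () => console.log('Started on 3000'))
```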
algorithms-and-data-structure div img src https github com roscibely algorithms and data structure blob main root ufersa png width 70 height 100 div algorithms and data structures. professor: rosana rego https github com roscibely. pex1241 algorithms and data structures i, interdisciplinary bachelor's degree in information technology, ufersa. git and version control (free course) https www dataquest io course git and vcs. git cheat sheet https github com roscibely algorithms and data structure blob develop root github git cheat sheet pdf octocat. part i: 1 pointers https github com roscibely algorithms and data structure tree main root pointers 2 dynamic allocation https github com roscibely algorithms and data structure blob main root pointers alocdinamic md 2 1 vectors https github com roscibely algorithms and data structure tree main root vectors 2 2 matrices https github com roscibely algorithms and data structure tree develop root matrices 3 structured types https github com roscibely algorithms and data structure tree main root estruturas 3 1 arrays of structs https github com roscibely algorithms and data structure tree main root estruturas vetores estruturados 3 2 arrays of pointers to structs https github com roscibely algorithms and data structure tree develop root estruturas vetores de ponteiros de struct 3 3 unions https github com roscibely algorithms and data structure tree main root estruturas union 3 4 enumerations https github com roscibely algorithms and data structure tree main root estruturas enum exercises https github com roscibely algorithms and data structure blob develop root questoes revisao md. part ii: 1 files https github com roscibely algorithms and data structure tree develop root arquivos 2 abstract data types https github com roscibely algorithms and data structure tree main root tad 3 algorithm complexity https github com roscibely algorithms and data structure blob develop root algoritmos de busca time md 3 1 measuring execution time https github com roscibely algorithms and data structure tree develop root execution time 4 search algorithms https github com roscibely algorithms and data structure tree main root algoritmos de busca 5 sorting algorithms https github com roscibely algorithms and data structure tree main root sort algorithms. part iii: 1 the list data structure https github com roscibely algorithms and data structure tree main root listas 2 the stack data structure https github com roscibely algorithms and data structure tree main root pilha 3 the queue data structure https github com roscibely algorithms and data structure tree main root filas. extra: 1 menu-driven interaction https github com roscibely algorithms and data structure tree develop root interecao 20por 20menus. want to learn python? start by clicking here https github com roscibely data structure with python. special. i. compiling and running c programs in the linux terminal: with the terminal open, compile the program with gcc program_name.c -o program_name; run it with ./program_name; or compile and run in a single command with gcc program_name.c -o program_name && ./program_name. ii. compiling and running c programs in the windows terminal: with the terminal open, compile the program with gcc program_name.c -o program_name; run it with program_name; or compile and run in a single command with gcc program_name.c -o program_name && program_name. gcc is the c compiler: it compiles the source code and produces an executable. the command above compiles the file program_name.c and generates an executable named program_name; the -o program_name parameter sets the name of the generated executable; the && operator runs the next command only if the previous one succeeded; ./program_name then runs the executable. debugger: compile with debug symbols using gcc -g program_name.c -o program_name, run it under the debugger with gdb program_name, or do both in a single command with gcc -g program_name.c -o program_name && gdb program_name. div img src https github com roscibely algorithms and data structure blob develop root ufersa jpg width 700 height 250 div ufersa, campus pau dos ferros | algorithm complexity-algorithm analysis-complexity | server |
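Since the course opens with pointers and builds up to the gcc/gdb commands above, here is a minimal self-contained example to try them on; the file name and printed values are illustrative only:

```c
/* pointers.c - build and run: gcc pointers.c -o pointers && ./pointers */
#include <stdio.h>

int main(void) {
    int value = 42;
    int *ptr = &value;                 /* ptr holds the address of value */

    printf("value   = %d\n", value);   /* 42 */
    printf("address = %p\n", (void *)ptr);
    printf("*ptr    = %d\n", *ptr);    /* dereference: reads 42 */

    *ptr = 7;                          /* write through the pointer */
    printf("value   = %d\n", value);   /* now 7 */
    return 0;
}
```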
embedde-systems | embedde systems embedded systems are computing systems designed to perform specific tasks within a larger system | os |
|
fullstack-docker-template development environment setup: 1 download and install docker desktop (2m) https www docker com products docker desktop; tip: you can speed up docker desktop by configuring it to use more cpu and memory 2 get the env file from the ops team (5m): ask the ops team for the env local file that sets up all environment variables 3 run the makefile (10m): the build is fully dockerized, make docker local initial setup 4 visit the app: visit http localhost 80. tooling setup. linting and formatting: we use flake8 for linting and black https github com psf black for formatting. installing black: curl https://bootstrap.pypa.io/get-pip.py -o get-pip.py && python3 get-pip.py && pip install black. common development commands. linting: lint the front end with make lint js; lint the backend with make lint python. other: view logs from the backend with docker compose logs -f backend; compile requirements txt with pip-compile requirements.in --output-file requirements.txt; check types with mypy --config-file mypy.ini | server |
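The backend service referenced by the logs command suggests a compose file along the following lines; this is a hypothetical sketch, and the service names, build contexts and ports are assumptions rather than the template's actual file:

```yaml
# Hypothetical docker-compose.yml sketch implied by the commands above.
services:
  backend:
    build: ./backend
    env_file: .env.local   # assumed name for the env file from the ops team
    ports:
      - "8000:8000"        # assumed backend port
  frontend:
    build: ./frontend
    ports:
      - "80:80"            # the app is visited at http://localhost:80
```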
|
w5500-lwip-freertos w5500 lwip binding. the w5500 is used in mac raw mode with hardware address filtering switched off; this is important if you intend to use it as one of the interfaces of a bridge. interrupts are used only for the recv signal. uses freertos; tested with stm32. how to use: 1 patch your lwip tree with the files from the external libs lwip port freertos folder 2 implement your low level init (spi, gpio, nvic, clocks, etc.) and low level io functions in device wizchip wizchip port c 3 check the interrupt handler example in app interrupts c and change it in accordance with your platform and gpio setup (i.e. external interrupt number and so on) 4 call lwipinit in your main function | w5500 wizchip lwip freertos tcpip c embedded | os |
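For step 3, the usual FreeRTOS pattern is to have the external-interrupt handler signal a task through a semaphore; the sketch below is illustrative only, and the IRQ handler name, EXTI line and flag-clearing step are platform-specific placeholders rather than this repository's code:

```c
/* Illustrative interrupt-to-task handoff for the W5500 RECV signal.
 * EXTI4_IRQHandler and the flag-clearing step are placeholders; only
 * the FreeRTOS calls are the point of this sketch. */
#include "FreeRTOS.h"
#include "semphr.h"

static SemaphoreHandle_t recv_sem;  /* created with xSemaphoreCreateBinary() */

void EXTI4_IRQHandler(void)
{
    BaseType_t woken = pdFALSE;
    /* ...clear the pending EXTI flag for the W5500 INTn pin here... */
    xSemaphoreGiveFromISR(recv_sem, &woken);
    portYIELD_FROM_ISR(woken);
}

static void w5500_input_task(void *arg)
{
    (void)arg;
    for (;;) {
        xSemaphoreTake(recv_sem, portMAX_DELAY);
        /* read received frames from the W5500 and hand them to lwIP */
    }
}
```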
Travel-Buddy travel buddy: a mobile application development module app project. members: sahan chandrabahu (it19072784), udesh wijesekara, pasan wadhasinghe (it19026244) | front_end |
|
esm.sh | esm sh server embed assets og image svg esm sh a fast smart global content delivery network cdn for modern es2015 web development release https img shields io github v release esm dev esm sh label display name tag sort semver style flat colora 232323 colorb 232323 logo hackthebox logocolor eeeeee https github com esm dev esm sh releases discord https img shields io discord 1097820016893763684 style flat colora 232323 colorb 232323 label logo discord logocolor eeeeee https discord gg xdbjmeb7pb twitter https img shields io twitter follow jexia label 40jexia style flat colora 232323 colorb 232323 logo twitter logocolor eeeeee https twitter com jexia github sponsors https img shields io github sponsors ije label style flat colora 232323 colorb 232323 logo githubsponsors logocolor eeeeee https github com sponsors ije open collective https img shields io opencollective all esm label style flat colora 232323 colorb 232323 logo opencollective logocolor eeeeee https opencollective com esm how to use esm sh is a modern cdn that allows you to import es6 modules https developer mozilla org en us docs web javascript guide modules from a url js import module from https esm sh pkg semver path or build a module with custom input code js import esm from https esm sh build const sayhi await esm import chalk from chalk export const sayhi chalk blue hi console log sayhi prints hi message in blue color more usage check out here building module with custom inputcode you may want to use bare specifier instead of url with import maps https github com wicg import maps html script type importmap imports react https esm sh react 18 2 0 script script type module import react from react alias to https esm sh react 18 2 0 script importing from npm js import react from https esm sh react 18 2 0 you may also use a semver https docs npmjs com cli v6 using npm semver or a dist tag https docs npmjs com cli v8 commands npm dist tag instead of a fixed version number or omit the version tag entirely to use the latest tag js import react from https esm sh react 18 2 0 latest import react from https esm sh react 17 17 0 2 import react from https esm sh react canary 18 3 0 canary e1ad4aa36 20230601 you can import submodules of a package js import rendertostring from https esm sh react dom 18 2 0 server or import fetch non module js as following js import https esm sh react 18 2 0 package json assert type json importing from github esm sh supports to import modules assets from a github repo gh owner repo tag path for example js import tslib from https esm sh gh microsoft tslib 2 6 0 or load a svg image from a github repo https esm sh gh microsoft fluentui emoji assets party 20popper color party popper color svg specifying dependencies by default esm sh rewrites import specifiers based on the package dependencies to specify the version of these dependencies you can add the deps package version query to specify multiple dependencies separate them with a comma like this deps react 17 0 2 react dom 17 0 2 js import react from https esm sh react 17 0 2 import useswr from https esm sh swr deps react 17 0 2 aliasing dependencies js import useswr from https esm sh swr alias react preact compat in combination with deps js import useswr from https esm sh swr alias react preact compat deps preact 10 5 14 the original idea came from lucacasonato https github com lucacasonato tree shaking by default esm sh exports a module with all its exported members however if you want to import only a specific set of members you can specify them 
by adding a exports foo bar query to the import statement js import await rest from https esm sh tslib 7 3kb import await rest from https esm sh tslib exports await rest 489b by using this feature you can take advantage of tree shaking with esbuild and achieve a smaller bundle size note that this feature is only supported for esm modules and not cjs modules bundle mode js import button from https esm sh antd bundle in bundle mode all dependencies are bundled into a single js file except the peer dependencies development mode js import react from https esm sh react dev with the dev option esm sh builds a module with process env node env set to development or based on the condition development in the exports field of package json this is useful for libraries that have different behavior in development and production for example react will use a different warning message in development mode esbuild options by default esm sh checks the user agent header to determine the build target you can also specify the target by adding target available targets are es2015 es2022 esnext deno denonext node and bun js import react from https esm sh react target es2020 other supported options of esbuild conditions https esbuild github io api conditions js import foo from https esm sh foo conditions custom1 custom2 keep names https esbuild github io api keep names js import foo from https esm sh foo keep names ignore annotations https esbuild github io api ignore annotations js import foo from https esm sh foo ignore annotations web worker esm sh supports worker query to load the module as a web worker js import workerfactory from https esm sh monaco editor esm vs editor editor worker worker const worker workerfactory you can pass some custom code snippet to the worker when calling the factory function js const workeraddon self onmessage function e console log e data const worker workerfactory workeraddon package css html link rel stylesheet href https esm sh monaco editor css this only works when the package imports css files in js directly importing wasm modules esm sh supports importing wasm modules in js directly to do that you need to add module query to the import url js import wasm from https esm sh dqbd tiktoken 1 0 3 tiktoken bg wasm module const exports new webassembly instance wasm imports fixing named exports if you get an error like not provide an export named that means esm sh can t resolve named exports of the module correctly you can add exports foo bar query to specify the named exports js import render from https esm sh react dom 18 2 0 exports render using import maps import maps https github com wicg import maps has been supported by most modern browsers and deno natively this allows bare import specifiers such as import react from react to work esm sh supports external foo bar query to specify external dependencies with this query esm sh will not rewrite the import specifiers of the specified dependencies for example json imports preact https esm sh preact 10 7 2 preact render to string https esm sh preact render to string 5 2 0 external preact alternatively you can mark all dependencies as external by adding a prefix before the package name json imports preact https esm sh preact 10 7 2 preact render to string https esm sh preact render to string 5 2 0 swr https esm sh swr 1 3 0 react https esm sh preact 10 7 2 compat import maps supports trailing slash https github com wicg import maps packages via trailing slashes that can not work with url search params friendly to fix this issue esm sh 
provides a special format for import url that allows you to use query params with trailing slash change the query prefix to and put it after the package version json imports react dom https esm sh react dom 18 2 0 pin v133 dev react dom https esm sh react dom 18 2 0 pin v133 dev esm sh also provides a cli script using cli script in deno to generate and update the import maps that resolves dependencies automatically escape hatch raw source files in rare cases you may want to request js source files from packages as is without transformation into es modules to do so you need to add a raw query to the request url for example you might need to register a package s source script as a service worker in a browser that does not yet support https caniuse com mdn api serviceworker ecmascript modules the type module option js await navigator serviceworker register new url https esm sh playground elements 0 18 1 playground service worker js raw import meta url href scope you may alternatively specify an raw extra query after the package version html playground project sandbox base url https esm sh playground elements 0 18 1 raw playground project so that transitive references in the raw assets will also be raw requests deno compatibility esm sh is a deno friendly cdn that resolves node s built in modules such as fs os net etc making it compatible with deno js import express from https esm sh express const app express app get req res res send hello world app listen 3000 for users using deno 1 33 2 esm sh uses deno land std 0 177 1 node https deno land std 0 177 1 node as the node compatibility layer you can specify a different version by adding the deno std ver query js import postcss from https esm sh express deno std 0 128 0 deno supports type definitions for modules with a types field in their package json file through the x typescript types header this makes it possible to have type checking and auto completion when using those modules in deno link https deno land manual typescript types using x typescript types header figure 1 server embed assets sceenshot deno types png in case the type definitions provided by the x typescript types header are incorrect you can disable it by adding the no dts query to the module import url js import unescape from https esm sh lodash unescape no dts this will prevent the x typescript types header from being included in the network request and you can manually specify the types for the imported module supporting nodejs bun nodejs 18 supports http importing under the experimental network imports flag bun doesn t support http modules yet we highly recommend reejs https ree js org as the runtime with esm sh that works both in nodejs and bun using cli script esm sh provides a cli script for managing imports with import maps in deno https deno land and node bun via reejs https ree js org this cli script automatically resolves dependencies and uses a pinned build version for stability to use the esm sh cli script you first need to run the init command in your project s root directory bash deno run a r https esm sh init once you ve initialized the script you can use the following commands to manage your imports bash adding packages deno task esm add react react dom add multiple packages deno task esm add react 17 0 2 specify version deno task esm add react preact compat using alias updating packages deno task esm update react react dom update specific packages deno task esm update update all packages removing packages deno task esm remove react react dom the cli script 
works with node bun via reejs https ree js org bash initializing reejs x https esm sh init using reejs tasks like deno tasks above reejs task esm add react reejs task esm update react reejs task esm remove react building module with custom input code this is an experimental api that allows you to build a module with custom input code imports npm gh packages supports ts jsx syntaxes bundle mulitple modules into a single js file js import build from https esm sh build const ret await build dependencies preact 10 13 2 preact render to string 6 0 2 code jsx h import h from preact import rendertostring from preact render to string export function render string return rendertostring h1 hello world h1 for types checking and lsp completion types export function render string import module const render await import ret url import bundled module const render await import ret bundleurl render h1 hello world h1 or use the esm tag function to build and import js ts snippet quickly in browser with npm packages js import esm from https esm sh build const mod await esm jsx h import h from preact 10 13 2 import rendertostring from preact render to string 6 0 2 export const html rendertostring h1 hello world h1 console log mod html h1 hello world h1 pinning build version to ensure stable and consistent behavior you may want to pin the build version of a module you re using from esm sh this helps you avoid potential breaking changes in the module caused by updates to the esm sh server the pin query allows you to specify a specific build version of a module which is an immutable cached version stored on the esm sh cdn js import react from https esm sh react dom pin v133 or use version prefix import react from https esm sh v133 react dom by using the pin query in the import statement you can rest assured that the version of the module you re using will not change even if updates are pushed to the esm sh server this helps ensure the stability and reliability of your application for ui libraries like react and vue esm sh uses a special build version stable to ensure single version of the library is used in the whole application global cdn img width 150 align right src server embed assets cf svg the global cdn of esm sh is provided by cloudflare https cloudflare com one of the world s largest and fastest cloud network platforms self hosting to host esm sh by yourself check the hosting hosting md documentation | esm cdn npm deno js esbuild infrastructure es2015 es6 | front_end |
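Putting two of the features documented above together, import maps and a pinned build version, a reproducible browser setup can look like the following; the package choices are just examples:

```html
<script type="importmap">
{
  "imports": {
    "react": "https://esm.sh/react@18.2.0?pin=v133",
    "react-dom": "https://esm.sh/react-dom@18.2.0?pin=v133"
  }
}
</script>
<script type="module">
  import React from "react"; // resolves to the pinned, immutable build
</script>
```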
gst-torch | simbotic torch real time computer vision pipeline for gstreamer a pytorch powered gstreamer plugin developed with rust includes custom 3d engine accelerated ml models accelerated capture and transform pipelines libtorch plugin for gstreamer in rust assets monodepth semseg fusion png assets teaser 02 gif assets motion transfer gif assets rozgo 3ddfa jpg assets rozgo depth png assets rozgo pose jpg assets 3ddfa png assets salient png not only are the networks cuda enabled but the pipeline has also been accelerated with cuda tensors source for monocular depth src monodepth rs semantic segmentation src semseg rs motion transfer src motiontransfer rs dependencies simbotictorch has been tested on ubuntu 18 20 rust works with latest stable rust https rustup rs cuda 11 4 cudnn 8 2 2 make sure you have cuda 11 4 installed in your system with cudnn 8 2 2 download cudnn v8 2 2 july 6th 2021 for cuda 11 4 cudnn runtime library for ubuntu20 04 deb cudnn developer library for ubuntu20 04 deb note on ubuntu 20 there might be an issue with missing libnvrtc builtins so 11 1 a symlink solves it vertex vx pc usr local cuda targets x86 64 linux lib ll libnvrtc builtins lrwxrwxrwx 1 root root 25 jul 15 12 10 libnvrtc builtins so libnvrtc builtins so 11 4 lrwxrwxrwx 1 root root 25 aug 12 00 01 libnvrtc builtins so 11 1 libnvrtc builtins so 11 4 lrwxrwxrwx 1 root root 29 jul 15 12 10 libnvrtc builtins so 11 4 libnvrtc builtins so 11 4 100 rw r r 1 root root 6883208 jul 15 12 10 libnvrtc builtins so 11 4 100 sudo ln s usr local cuda targets x86 64 linux lib libnvrtc builtins so 11 4 usr local cuda targets x86 64 linux lib libnvrtc builtins so 11 1 libtorch 1 11 0 depends on cuda enabled works with cuda 11 4 libtorch get libtorch from the pytorch website download section https pytorch org get started locally specifically cxxx11 abi https download pytorch org libtorch cu113 libtorch cxx11 abi shared with deps 1 11 0 2bcu113 zip set env libtorch gstreamer depends on gstreamer development libraries apt install libgstreamer1 0 dev libgstreamer plugins base1 0 dev gstreamer1 0 plugins base gstreamer1 0 plugins good gstreamer1 0 plugins bad gstreamer1 0 plugins ugly gstreamer1 0 libav libgstrtspserver 1 0 dev misc dependencies simbotictorch now includes a 3d rendering engine and has the following dependencies apt install glslang tools others apt install libssl dev apt install libx11 dev apt install gnome video effects frei0r build git lfs this repo uses git lfs for models and assets make sure git lfs command is properly installed https git lfs github com environment variable an environment variable needs to be set for all scripts and tools to be able to find this plugin export simbotic torch full path to this repo building to build the rust gst plugin just type build sh test monodepth and segmentation with any of the following test dashboard preview sh test dashboard webcam sh test dashboard file sh test monodepth preview sh test monodepth webcam sh test semseg preview sh test semseg webcam sh test motion transfer test motiontransfer preview sh test motiontransfer webcam sh test motiontransfer file sh citations article monodepth2 title digging into self supervised monocular depth prediction author cl e ment godard and oisin mac aodha and michael firman and gabriel j brostow booktitle the international conference on computer vision iccv month october year 2019 inproceedings semantic cvpr19 author yi zhu karan sapra fitsum a reda kevin j shih shawn newsam andrew tao bryan catanzaro title improving semantic 
segmentation via video propagation and label relaxation booktitle ieee conference on computer vision and pattern recognition cvpr month june year 2019 url https nv adlr github io publication 2018 segmentation indicates equal contribution inproceedings reda2018sdc title sdc net video prediction using spatially displaced convolution author reda fitsum a and liu guilin and shih kevin j and kirby robert and barker jon and tarjan david and tao andrew and catanzaro bryan booktitle proceedings of the european conference on computer vision eccv pages 718 733 year 2018 inproceedings siarohin 2019 neurips author siarohin aliaksandr and lathuili re st phane and tulyakov sergey and ricci elisa and sebe nicu title first order motion model for image animation booktitle conference on neural information processing systems neurips month december year 2019 misc 3ddfa cleardusk author guo jianzhu and zhu xiangyu and lei zhen title 3ddfa howpublished url https github com cleardusk 3ddfa year 2018 article zhu2017face title face alignment in full pose range a 3d total solution author zhu xiangyu and liu xiaoming and lei zhen and li stan z journal ieee transactions on pattern analysis and machine intelligence year 2017 publisher ieee inproceedings qin 2020 pr title u2 net going deeper with nested u structure for salient object detection author qin xuebin and zhang zichen and huang chenyang and dehghan masood and zaiane osmar and jagersand martin journal pattern recognition volume 106 pages 107404 year 2020 tch rs https github com laurentmazare tch rs rust bindings for pytorch monodepth2 https github com nianticlabs monodepth2 monocular depth estimation from a single image semantic segmentation https github com nvidia semantic segmentation improving semantic segmentation via video propagation and label relaxation first order model for image animation https github com aliaksandrsiarohin first order model first order motion model for image animation 3ddfa https github com cleardusk 3ddfa face alignment in full pose range a 3d total solution u 2 net https github com nathanua u 2 net object detection based on a visual attention | ai |
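The repository drives everything through its test scripts; for readers who want a single command line, an invocation would look roughly like the sketch below. Note that "monodepth" is a placeholder element name (the real element names live in the test_*.sh scripts), and the plugin path is an assumption about the cargo build layout:

```bash
# Hypothetical invocation only: "monodepth" stands in for whatever element
# name the plugin actually registers; check the repo's test scripts.
export SIMBOTIC_TORCH=/full/path/to/this/repo
export GST_PLUGIN_PATH="$SIMBOTIC_TORCH/target/release"
gst-launch-1.0 v4l2src ! videoconvert ! monodepth ! videoconvert ! autovideosink
```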
|
CS112_Team08-Design-and-Analysis-of-Algorithm | cs112 l23 khcl design and analysis of algorithm lecturer msc nguyen thanh son msc huynh thi thanh thuong group id n008 team member no full name student id team role github 1 dzung bui tri mailto 19521386 gm uit edu vn 19521386 https img shields io badge leader blue btrdung https github com btrdung 2 tan ngoc pham mailto 19520925 gm uit edu vn 19520925 https img shields io badge member blue ngctnnnn https github com ngctnnnn 3 an khanh vo mailto 19520007 gm uit edu vn 19520007 https img shields io badge member blue vokhanhan25 https github com vokhanhan25 process x week 1 x 1 1 introduction x week 2 x 2 1 computational thinking x 2 2 abstraction taobien x week 3 x 3 1 complexity algorithm https github com btrdung cs112 team08 tree main assignments week03 x 3 2 assignments x 3 2 1 vt50 seaweed https khmt uit edu vn wecode cs112 2021 assignment 2 3 x 3 2 2 vr06 bot https khmt uit edu vn wecode cs112 2021 assignment 2 1 x 3 3 analysis complexity of heap sort https github com btrdung cs112 team08 tree main assignments week04 x week 4 x 4 1 complexity recursive algorithm https github com btrdung cs112 team08 tree main assignments week04 x 4 2 analysis complexity of merge sort https github com btrdung cs112 team08 tree main assignments week04 x week 5 x 5 1 program correctness https github com btrdung cs112 team08 tree main assignments week05 x 5 2 assignments x 5 2 1 vv27 s index https khmt uit edu vn wecode cs112 2021 assignment 3 4 x 5 2 2 vu33 maxstr https khmt uit edu vn wecode cs112 2021 assignment 3 9 x 5 2 3 vs05 ceramic https khmt uit edu vn wecode cs112 2021 assignment 3 17 x week 6 x 6 1 brute force https github com btrdung cs112 team08 tree main assignments week06 x week 7 x 7 1 back tracking https github com btrdung cs112 team08 tree main assignments week07 x 7 2 assignments x 7 2 1 vq30 renewed https khmt uit edu vn wecode cs112 2021 assignment 5 8 x 7 2 2 vs02 newnum https khmt uit edu vn wecode cs112 2021 assignment 5 15 x 7 3 special assignments x 7 3 1 peg soliatire https github com btrdung peg solitaire x week 8 x 8 1 divide and conquer https github com btrdung cs112 team08 tree main assignments week08 x 8 2 assignments x week 9 x 9 1 branch and bound https github com btrdung cs112 team08 tree main assignments week09 x 9 2 assignments x 9 2 1 vq21 wall https khmt uit edu vn wecode cs112 2021 assignment 6 21 x 9 2 2 vu20 fraction https khmt uit edu vn wecode cs112 2021 assignment 6 13 x week 10 x 9 1 geometry https github com btrdung cs112 team08 tree main assignments week10 x 9 2 presentation https github com btrdung cs112 team08 tree main presentation x self practice x atcoder https github com btrdung cs112 team08 tree main self practice atcoder x beginner 205 https atcoder jp contests abc205 x beginner 206 https atcoder jp contests abc206 x beginner 207 https atcoder jp contests abc207 x grand 057 https atcoder jp contests agc057 x codeforces https github com btrdung cs112 team08 tree main self practice codeforces x round 725 div 3 https codeforces com contest 1538 x e olymp https github com btrdung cs112 team08 tree main self practice e olymp x stl intro https github com btrdung cs112 team08 design and analysis of algorithm tree main self practice e olymp stl 20intro x sort search https github com btrdung cs112 team08 design and analysis of algorithm tree main self practice e olymp sort 20 26 20search | algorithm complexity-algorithm analysis-complexity | server |
webpack-starter-basic | img alt webpack starter basic loo src https github com lifenautjoe webpack starter basic blob master src assets logo on dark bg png raw true width 250 webpack starter basic forthebadge http forthebadge com images badges fo real svg http forthebadge com forthebadge http forthebadge com images badges built with love svg http forthebadge com dependencies https david dm org lifenautjoe webpack starter basic svg https david dm org lifenautjoe webpack starter basic a simple webpack 4 starter project for your basic web development needs read more on the demo website https lifenautjoe github io webpack starter basic or continue reading below table of contents motivation motivation features features requirements requirements usage usage faq faq when should i use this starter when should i use this starter where s the common webpack config wheres the common webpack config how to load fonts how to load fonts how to load images how to load images in javascript in javascript in index html in indexhtml how to install bootstrap 4 how to install bootstrap 4 websites using this starter kit on the wild websites using this starter kit on the wild motivation i needed to make a plain ol drop your mail to stay updated of ongoing developments page i did not need anything fancy no frontend framework no unit testing simply a starter project that would let me use sass es6 load assets add vendor prefixes start a dev server generate sourcemaps and optimize everything for production i looked around and all i found were heavily specialized and complicated webpack starter projects webpack angular starter webpack react starter etc that are so intertwined with plugins that stripping undesired functionality is almost impossible so i did this features separated development and production webpack settings you can understand sass es6 asset loading css vendor prefixing development server sourcemaps favicons generation production optimizations mobile browser header color requirements node https nodejs org 7 6 usage substitute project name for your project name clone the repository sh git clone https github com lifenautjoe webpack starter basic project name cd project name install npm dependencies sh npm install run the kickstart command sh npm run kickstart after the project has been kickstarted to start the development server sh npm start to build for production sh npm run build to preview the production build sh npm run preview faq when should i use this starter you should use this starter if any of the following are true you want to make a static page e g splash screen onboarding screen phaser game threejs visualization countdown you found no good starter kit for whatever you want to do and need a solid place to start from please note if you are going to use a frontend framework like angular or react you can of course add the required plugins and configuration but it s normally complicated and quirky enough that it s highly recommended to use one of the existing starter projects such as react webpack babel https github com alicoding react webpack babel or for angular projects the angular cli https github com angular angular cli where s the common webpack config there is none and that is good thing the pattern creates unnecessary confusion over the setup at the end the config will always be different across environments people just put booleans everywhere on the common config to switch between these differing configuration options which is just awful to see and confusing for someone who s just starting 
on webpack the only truly shared config between these files are the entry js point and the main html template how to load fonts if you don t support opera mini browsers support the woff format its newer version woff2 is widely supported by modern browsers and can be a good alternative if you decide to use only this format you can load the fonts in a similar manner to images in your webpack dev js and webpack prod js add the following js module exports module rules test woff loader url loader options limit at 50k above that it emits separate files limit 50000 url loader sets mimetype if it s passed without this it derives it from the file extension mimetype application font woff output below fonts directory name fonts name ext and let s say your font is in the folder assets with the name pixel woff you can add it and use it in index scss as scss font face font family pixel src url assets pixel woff format woff body font family pixel sans serif if you would like to support all kinds of font types remove the woff rule we previously added to webpack dev js and webpack prod js and add the following js module exports module rules test ttf eot woff woff2 loader file loader options name fonts name ext and assuming you have your fonts in the directory assets with names pixel woff pixel ttf pixel eot etc you can add it and use it in index scss as scss font face font family pixel src url assets pixel woff2 format woff2 url assets pixel woff format woff url assets pixel eot format embedded opentype url assets pixel ttf format truetype add other formats as you see fit how to load images in javascript you can require an image from javascript like js const myimage require assets icon png if the image size in bytes is smaller than 8192 you myimage will be a string with the encoded image path such as data image svg xml base64 bw9kdwxllmv4cg9ydhmgpsbfx3dlynbhy2tfchvibgljx3bhdghfxyaricjhc3nldhmvaw1hz2vzl3rpy2stq3lydkhsdi5zdmciow if the image size is larger than 8192 it will be a string with the url to the image such as src assets icon png hash 5b1f36bc41ab31f5b801 this limit is set so images like icons are not loaded through a request but you can force the loader to give you image urls always by doing the following but should not be necessary the limit works 90 of the time js const myimage require url assets icon png in index html if you would like to include an image on your index html file place the path of the image in a webpack require statement require imagepath html img class splash title img src require src assets logo on dark bg png alt webpack logo a how to install bootstrap 4 after the project has been kickstarted install bootstrap sh npm install bootstrap 4 save install bootstrap dependencies sh npm install popper js save npm install jquery save replace the project index scss with scss import bootstrap scss bootstrap and replace the project index js with js require styles index scss import popperjs from popper js import jquery from jquery jquery console log hello jquery bootstrap 4 to see it all come together replace the index html body tag with html body nav class navbar navbar expand md navbar dark bg dark fixed top a class navbar brand href navbar a button class navbar toggler type button data toggle collapse data target navbarsexampledefault aria controls navbarsexampledefault aria expanded false aria label toggle navigation span class navbar toggler icon span button div class collapse navbar collapse id navbarsexampledefault ul class navbar nav mr auto li class nav item active a class nav 
link href home span class sr only current span a li li class nav item a class nav link href link a li li class nav item a class nav link disabled href disabled a li li class nav item dropdown a class nav link dropdown toggle href https example com id dropdown01 data toggle dropdown aria haspopup true aria expanded false dropdown a div class dropdown menu aria labelledby dropdown01 a class dropdown item href action a a class dropdown item href another action a a class dropdown item href something else here a div li ul form class form inline my 2 my lg 0 input class form control mr sm 2 type text placeholder search aria label search button class btn btn outline success my 2 my sm 0 type submit search button form div nav main role main class container div class starter template h1 bootstrap starter template h1 p class lead use this document as a way to quickly start any new project br all you get is this text and a mostly barebones html document p div main container body start the development server and voil sh npm start to build for production sh npm run build to preview the production build sh npm run preview please remember to remove the google analytics tag in the index html file as soon as you make the template yours html global site tag gtag js google analytics script async src https www googletagmanager com gtag js id ua 101423651 2 script script window datalayer window datalayer function gtag datalayer push arguments gtag js new date gtag config ua 101423651 2 script websites using this starter kit on the wild droppable library https github com lifenautjoe droppable noel event emitter https github com lifenautjoe noel chooseit wishbot http voeux2018 choosit com webpack starter basic https lifenautjoe github io webpack starter basic okuna https www okuna io have a website online built with this starter kit and would like to add it to the list open an issue author joel hernandez www lifenautjoe com | webpack seed starter webpack-starter webpack-seed starter-kit es6-starter es6 sass seed-project ecmascript6 simple webpack-simple basic webpack4 phaser phaserjs spa | front_end |
graph_database_epilepsy graph database epilepsy. prerequisites. libraries: to install all the necessary libraries listed in requirements txt, execute the command pip install -r requirements.txt. database: ensure that an instance of the neo4j graph database is running and available on the localhost server; check the config ini file and update it with the correct bolt port and address for the database, as well as the credentials (default credentials are provided as an example). input files: ensure that the appropriate input files are placed in the input folder. the patient csv file contains fields pertaining to the patient, the protocols csv file contains information about the studies each patient has consented to, the events csv file includes information about the procedures performed on a patient, and the summary csv file consists of the diagnoses established during the surgical conference. initializing the database: to initialize the database with data, simply execute the main function of the main py file using the command python main.py | server |
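As an illustration of what such a configuration might hold, a Neo4j connection block typically looks like the sketch below; the section and key names are assumptions, not the repository's actual config.ini:

```ini
; Hypothetical config.ini layout; real section/key names may differ.
[neo4j]
uri = bolt://localhost:7687   ; default Bolt port
user = neo4j
password = neo4j              ; replace the default credentials
```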
|
AI_Composer overview: a project that trains an lstm recurrent neural network over a dataset of midi files. more information can be found in the writeup about this project http yoavz com music rnn. this is the code for "build an ai composer" on youtube https youtu be s f2qv2 u00. dependencies: numpy http www numpy org, tensorflow https github com tensorflow tensorflow, python midi https github com vishnubob python midi git, mingus https github com bspaans python mingus. use pip https pypi python org pypi pip to install any missing dependencies. installation (tested on ubuntu 16 04). step 1: tensorflow version 0 8 0 must be used. on tensorflow's download page here https www tensorflow org versions r0 10 get started os setup html, scroll down to pip installation and follow the first step normally. you will see export TF_BINARY_URL followed by a url; modify the part of the url that has tensorflow 0 10 0 so that it will download version 0 8 0, not version 0 10 0. example of the modified url for the python 2 7 cpu version of tensorflow: export TF_BINARY_URL=https storage googleapis com tensorflow linux cpu tensorflow 0 8 0 cp27 none linux x86 64 whl, followed by sudo pip install --upgrade $TF_BINARY_URL. follow the third step normally to install tensorflow. step 2: after installing tensorflow you will have to install the missing dependencies: pip install matplotlib, sudo apt-get install python-tk, pip install numpy. step 3: cd to your home directory and run git clone https github com vishnubob python midi, cd python midi, python setup.py install; then cd back and run git clone https github com bspaans python mingus, cd python mingus, python setup.py install. basic usage: 1 mkdir data, mkdir models 2 run python main.py: this will collect the data, create the chord mapping file in data nottingham pickle, and train the model 3 run python rnn_sample.py --config_file new_config_file.config to generate a new midi song. give it 1 to 2 hours to train on your local machine, then generate the new song. you don't have to wait for it to finish; just wait until you see the saving model message in the terminal. in a future video i'll talk about how to easily set up cloud gpu training, likely using www fomoro com. credits: credit for the vast majority of the code here goes to yoav zimmerman https github com yoavz; i've merely created a wrapper around all of the important functions to get people started | ai |
|
vip-guide | vip guide the veteran focused integration process vip is a lean agile framework services the interest of veterans through the efficient streamlining of information technology it activities that occur within the department of veterans affairs it enterprise any va it project effort that touches the va network must utilize vip regardless of whether it spends government funding from va s congressional it appropriation or any other appropriation the current version of the vip policy in use at the department is version 3 1 you can find a pdf copy of it here docs vip3 1 pdf you can access a markdown version of the document within the guide directory a full table of contents is available here guide table of contents md or you can navigate using the below links introduction guide introduction md table of contents guide table of contents md 1 overview guide overview md 2 lifecycle guide lifecycle md 3 reporting and tooling guide reporting and tools md 4 managing a project guide managing a project md appendix a terminology guide appendixa terminology md appendix b applicability guide appendixb applicability md appendix c contacts guide appendixc contacts md generating a markdown copy of the full guide you can generate a single file markdown document using the following script included in this project bin make vip full guide sh developing on the site locally this site uses jekyll http jekyllrb com sass http sass lang com bourbon http bourbon io neat http neat bourbon io and requires ruby 2 x install dependencies with bundler http bundler io shell bundle install and run the site with jekyll shell bundle exec jekyll serve watch if all goes well visit the site at http localhost 4000 vip is maintained by the vip business office vbo under the enterprise program management office epmo | server |
|
ece_ntua_db_2018 ece ntua db 2018: a car rental service using php, mysql and javascript for the databases course of the electrical and computer engineering school of ntua, athens. this application supports basic crud (create, read, update, delete) operations for employees, vehicles and customers, as well as some complex mysql queries for statistics: 1 vehicles per company branch 2 driver employees working on different branches 3 overall number of vehicles recorded in our database 4 overall number of customers per city 5 vehicles in ascending km order 6 bookings per month 7 months with more than 15% of overall yearly bookings 8 vehicles booked more than n times 9 active bookings 10 pending vehicle services ordered by days left until service. demo: you can view a demo hosted at 000webhost here https ecedb2 000webhostapp com final. databases screenshot https github com arvchristos ece ntua db 2018 raw master docs databases screenshot png. built with php, mysql, javascript, jquery, bootstrap 3, angularjs. developers: arvanitis christos arvchristos https github com arvchristos, manos bagakis manosbagakis https github com manosbagakis | server |
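For flavour, query (6) above could be written along these lines; the table and column names are assumptions, since the project's actual schema is not shown here:

```sql
-- Illustrative version of query (6), "bookings per month".
-- Table and column names are hypothetical, not the project's schema.
SELECT MONTH(start_date) AS booking_month,
       COUNT(*)          AS total_bookings
FROM booking
GROUP BY MONTH(start_date)
ORDER BY booking_month;
```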
|
mobidgroup | mobidgroup mobile development | front_end |
|
chaincrafter | chaincrafter seamless integration and composability for large language model apps features composable prompts and chains use multiple models to run one chain and then use that as input for a different chain and model customizable prompt and response formatting add modifiers to prompts to change the style length and format of the response extract data from the response to use in the next prompt add custom functions to process the response add custom functions to process the input variables integration with openai api llama cpp in progress async calls to models load prompts and chains from yaml using catalogs makes it easier to share prompts and chains between projects build up a prompts library docs index start python installation bash pip install chaincrafter usage 1 define your prompts and the variables that they expect the input variables can be of any type and can be processed by a function the prompt message is treated as an f string 2 define your chain of prompts the chain is a list of tuples where each tuple contains a prompt and the output key to store the response in the output key is used to access the response in the next prompt 3 set up the models that you want to use 4 run the chain using the models python from chaincrafter import chain prompt from chaincrafter models import openaichat chat model openaichat temperature 0 65 model name gpt 3 5 turbo system prompt prompt you are a helpful assistant who responds to questions about the world hello prompt prompt hello what is the capital of france answer only with the city name followup prompt prompt city sounds like a nice place to visit what is the population of city chain chain system prompt hello prompt city followup prompt followup response messages chain run chat model for message in messages print f message role message content running the examples bash source venv bin activate export openai api key python m examples interesting facts python m examples interesting facts catalog javascript typescript work in progress installation bash npm install chaincrafter usage docs index end | ai chatgpt gpt-4 llm ml openai-api | ai |
learn-RTOS learn rtos: these tutorials are my notes on how an rtos for embedded devices is crafted. the main target is the arm cortex-m4f architecture. the tools required to run any of them are 1 cubemx ide and 2 the arm gcc toolchain https developer arm com tools and software open source software developer tools gnu toolchain. to clone the tutorials: git clone --recurse-submodules -j2 <url> | freertos arm gcc-arm-toolchain makefile c | os |
mdepx | introduction mdepx also known as mdx or machdep x is an operating system for embedded applications key features real time priority based time sliced round robin scheduling fully preemptible cooperative scheduling optional tickless operation static dynamic memory allocation timed mutexes semaphores symmetric multiprocessing smp bsd libc included flat address space isa supported arm cortex m family armv7 m armv8 m cheri128 hybrid and pure abi capability system models mips32 mips64 risc v machine or supervisor modes rv32 rv64 smp platforms supported raspberry pi pico nordic semiconductor nrf5 nrf9 stmicroelectronics stm32 microchip pic32 sifive some others emulators support see emul qemu riscv64c128 cheri pure capability mode qemu riscv64 smp qemu riscv32 smp qemu mips64c128 cheri hybrid and pure capability modes qemu mips64 featured applications external raspberry pi pico https github com machdep raspberrypi pico lte 4g link with nrf9160 https github com machdep nrf9160 ctsrd cheri device model https github com ctsrd cheri device model and device model riscv https github com ctsrd cheri device model riscv other example apps https github com machdep getting started see documentation https machdep uk contributing please submit pull requests on github or send patches to br machdep com note that mdepx uses freebsd style 9 https www freebsd org cgi man cgi query style sektion 9 guide license this project is licensed under two clause bsd license https en wikipedia org wiki bsd licenses 2 clause license 22simplified bsd license 22 or 22freebsd license 22 | rtos risc-v arm riscv smp qemu cheri embedded microcontroller bare-metal cortex-m raspberry-pi | os |
Embedded-Spatial-Measurement-System | embedded spatial measurement system designed and built an embedded spatial measurement system using a time of flight sensor to acquire information of the surrounding environment img src final3dmodel jpg alt the final 3d model after collecting 10 planes of measurements https drive google com file d 1zcbt4e3ytiusqzmn9cd7omf5mzziiz5s view usp sharing https drive google com file d 1oyejrt4ilwtp zjen6tvv6ooqkcs79 8 view usp sharing | os |
ml-datasets | ml datasets curated list of machine learning datasets from nepalese researchers audio devanagiri numbers spoken audio https drive google com drive folders 15g57qa1tqa4ix6 mic6v1wieouqp0xal nepali asr training data set http www openslr org 54 nepali asr training data set containing 157k utterances nepali text to speech dataset 1 https github com meamit nepali text to speech tree master speechdb dataset 2 https github com anuragregmi speak nepali tree master sounds dataset 3 https github com hcoebct069 nepali asr tree master recordings devanagiri characters speech https github com tsumansapkota devanagari characters speech disaster earthquake building damage levels https www drivendata org competitions 57 nepal earthquake page 136 finance nepal rastra bank forex rate api https www nrb org np exportforexjson php yy 2019 mm 08 dd 01 yy1 2019 mm1 08 dd1 02 nepali stock market dataset 2012 2020 https www kaggle com sagyamthapa nepali stock market form 2012 to 2020 till march 2019 01 01 csv kaggle nepal stock exchange data till 2019 https www kaggle com qramkrishna nepal stock exchange data geography metadata from open street maps https github com sharad461 nepal openstreetmap extract nepal travel distance between cities km https data world hdx d1d0c217 8c6b 4747 ab1e 1069e2ff3e6b pokhara weather data from 2009 to 2023 https www kaggle com datasets gauravneupane pokhara weather data from 2009 to 2023 health health diseases in nepali https github com sanjaalcorps nepalidataclassifiers blob master healthclassifiers txt real time sensor data air pollution epa air pollution data https github com hbvj99 epaairpollution nepal government air pollution data https github com hbvj99 npgovairpollution dristhi air pollution data https github com hbvj99 dristhiairpollution river level data http www hydrology gov np daily vegetable fruit price information http kalimatimarket gov np daily price information location of mahanagar yatayat in realtime https github com theonlynischal track mahanagar yatayat tribhuwan international airport realtime flight arrival list http tiairport com np flight details realtime flight departure list http tiairport com np flight details 2 image corn leaf infection dataset https www kaggle com qramkrishna corn leaf infection dataset voting ballot paper dataset https github com rajshreeee image classification for voting system using cnn nepalese currency nepali currency notes https drive google com file d 1pdf0hx6pvgx4djtchl4eeddct4wlfngw view usp sharing cash dataset https drive google com drive folders 1gxitxrk13ehkmemebpi8mrsfsr4lur55 images of 10 50 100 rupee notes https github com mmanishh nrscurrencyrecognizer tree master data train faces of famous people from nepal https www thefamouspeople com nepal php dhcd dataset https github com prasanna1991 dhcd dataset a dataset of devnagari nepali handwritten characters license plate recognition lpr dataset https github com prasanna1991 lpr nepali motorbike license plate dataset nepali characters dataset https github com inspiringlab ncd nepali fonts ocr dataset https github com basantachaulagain nepscan tree master resources nepali handwritten digits https github com kcnishan nepali handwritten digits recognition tree master dataset nepali potraits https www kaggle com sumansid nepali portraits dataset vehicles dataset https github com sdevkota007 vehicles nepal dataset 4800 images of two wheeler and four wheeler vehicles from nepal text 16nepalinews corpus https github com sndsabin nepali news classifier 14 364 nepali 
language news documents a large scale nepali text corpus https ieee dataport org open access large scale nepali text corpus 65k nepali sentences https github com sanjaalcorps nepalidatasets blob master raw sentences np 65k csv 350k nepali sentences https github com team naya nlp doko 39k nepali wikipedia articles https www kaggle com disisbig nepali wikipedia articles nepal brihat sabdakosh json https github com bikashpadhikari nepali brihat sabdakosh json a structured json dump of all 122 000 words of the nepali brihat sabdakosh 1000 sport news https github com aryal007 nepali text generation blob master data sports news nepali 1000 txt nepali translation parallel corpus https drive google com file d 1uthfjkjfvdgtu263dnbz wpnlqoarz 0 view nepali english machine translation corpus https github com facebookresearch flores nepali abstractive summarization corpus 286k article title pairs from news https drive google com file d 1l56k0zonmk6xpelkaxpm45wcmt 9ps3x view nepal earthquake tweets https crisisnlp qcri org lrec2016 content 2015 nepal eq html nepali chat corpus https github com itsmeashutosh43 create a open source nepali chat corpus nagarik news corpus https github com ashmitbhattarai nepali language modeling using lstm tree master nepali corpus nagarik setopati news corpus https github com ashmitbhattarai nepali language modeling using lstm tree master nepali corpus setopati nepali news in english corpus https github com sharad461 english corpus nepal nepali news dataset https github com kamalacharya2044 nepalinewsdataset laxmi prasad devkota poems https github com devkotasawal1 poem generator blob master lspd txt collection of poems of laxmi prasad devkota and contains 119161 characters nepali names https github com datafiction oya nepali nlp blob master data names nepali txt dummy nepali people information https github com bibhuticoder dummydata blob master data csv nepali news classification dataset https drive google com drive folders 1vm0uj3ffwp 3gusan3fzsov4q7ryujig nepali ngram https github com virtualanup nepalingram nepali stopwords https github com sanjaalcorps nepalistopwords blob master nepalistopwords txt nepali wikipedia articles dataset https drive google com open id 1yh8blj5bydbvzaoqemrpltedzjiioayn nepali word list https github com tesseract ocr langdata blob master nep nep wordlist nepali transliteration https github com achilleskarki nepalilipi nepali textbooks https ecommons cornell edu handle 1813 24179 collection of school textbooks from nepal assembled by professor of anthropology kathryn march over the last 30 years nepali textbooks from grade 1 to 12 http lib moecdc gov np catalog opac css index php lvl cmspage pageid 6 id rubrique 105 nepali word2vec https github com rabindralamsal word2vec embeddings for nepali language nepali spelling correction dataset https github com tnagorra nspell tree master data nepali contemporary dictionary http ltk org np nepalisabdakos dict np dictionary db sql gz 80 00 000 nepali wordlist https github com prabinzz nepali wordlist english to nepali dictionary https github com nirooj56 nepdict blob master database data csv nepali movies on imdb https github com nish1001 nepalimdb blob master data nepali movies json sentiwordnet https github com wannamit nep sentiword py misspelling correction dictionary https github com sarojdhakal bhasha nepali lemmatizer https github com dpakpdl nepalilemmatizer cc100 http data statmt org cc 100 ne txt xz lince https ritual uh edu lince nepali english code switching dataset wordlists in selected 
languages of nepal https github com lexibank halenepal languages resources for nepal https language resources nepal github io nepali national corpus https www sketchengine eu nepali national corpus | dataset machine-learning-datasets nepalese-researchers | ai |
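Most entries in the ml-datasets row above are plain CSV or text dumps; a minimal sketch of loading one with pandas (the filename below is a placeholder for whichever dataset you download from the links above):

```python
import pandas as pd

# Placeholder filename: substitute whichever CSV you downloaded from the links
# above, e.g. the Nepali stock market dump or the 65k raw-sentences file.
df = pd.read_csv("nepali_dataset.csv")

print(df.shape)         # rows x columns
print(df.head())        # first few records
print(df.isna().sum())  # missing values per column, worth checking before modelling
```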
plunker | plunker gitter https badges gitter im join 20chat svg https gitter im filearts plunker utm source badge utm medium badge utm campaign pr badge utm content badge the next generation of lightweight collaborative online editing warning this repository does not contain the code for what you see running on http plnkr co the current code for plunker is in the repositories listed below originally plunker was coded in a single repository with different sub servers existing in the servers path the entire application was run on a single server however with increasing popularity reality decided to come hang out and make everyone s lives difficult the solution was simple since the components of plunker were designed as sub servers it should be easy to split them out and run them separately however having different logical entities with different functions in the same repository doesn t make sense i decided to create separate repositories for each of the plunker servers that are currently deployed on nodejitsu they are as follows plunker component repositories plunker api github com filearts plunker api the server that connects to a mongodb database and serves requests over a restful api plunker www github com filearts plunker www the server that is responsible for hosting and running the front end that users see and touch everyday plunker run plugin github com ggoodman plunker run plugin the server that allows for previewing of plunks and temporary previews and also does the dynamic transpilation plunker collab github com filearts plunker collab the server that serves the code necessary for collaborative coding as well as doing the actual operational transformation over a browserchannel connection plunker embed github com filearts plunker embed the server that hosts the embedded views of plunks plunker config files each server once cloned locally requires one or two config json files to run servers that use the environment specific config files config development json and config production json plunker api plunker www plunker run plunker collab only plunker embed uses a single config json file sample configuration file not all fields are required by each server but if all are present no harm should come to any small animals javascript host hostname com url www http hostname com collab http collab hostname com api http api hostname com embed http embed hostname com run http run hostname com carbonadsh oops this is pretty specific to my current deploy carbonadsv oops this is pretty specific to my current deploy port 8080 oauth github id series of random chars secret longer series of random chars everything below this point is out of date or incorrect and there be dragons usage git clone git github com filearts plunker git git submodule update init npm install node server js editor api post edit you can send a post request to edit to bootstrap the editor with the basic structure of a plunk the json format for this is described below javascript description description of plunk tags array of tags files filename index html content html script src script js script html filename script js content alert hello world license copyright filearts https github com filearts | front_end |
bestlearn | bestlearn information technology | server |
Blockchain-Course-Patterns | solidity patterns part of the course on ethereum blockchain decentralized apps development design http acloudfan com learn blockchain http www bcmentors com solidity is changing at a very rapid pace you may see some warnings raj | blockchain |
myersBriggsNLPAnalysis | myers briggs personality type natural language processing project introduction my name is nathan fritter and i am obsessed with data i have a b s in applied statistics from ucsb and was an active member of the data science club ucsb this is one of various projects i have completed to various degrees feel free to take a look at any and all other projects and make suggestions the project this project is based on the kaggle competition here https www kaggle com datasnaek mbti type tweets from various accounts on twitter all with an accompanying personality type were gathered together each account has a personality type along with the last 50 tweets for each user at first my goal was to tokenize the data turn into individual words or parts remove all the noise stop words hyperlinks hashtags etc and use the tokenized inputs try and predict the personality type of the user however i realized that i was leaving a lot to be desired with respect to exploratory analysis instead of simply just getting word and type frequency i realized i could be looking for so much more hashtag mention retweet url frequency n gram occurrences etc thus i have revamped the project to include a lot more on the exploratory analysis side i also have learned a lot about data cleanup manipulation storing data in different ways and more that i hope becomes clear in my code i utilized three different machine learning models known for success in natural language processing nlp multinomial naive bayes linear support vector machine neural network the project is split up into different sections that you may look at individually data extraction cleanup exploratory analysis naive bayes linear support vector machine neural network currently the only version i have this for is python versions in r and pyspark are currently in the works note data extraction cleanup py and helper functions py are used to abstract functions for the other scripts for readability i e using a function called tokenize data from helper functions py to tokenize data creating different versions of data for the other scripts to import etc since the other scripts use these scripts heavily if you would like to make changes to this project give them a look what is the myers briggs personality type the myers briggs personality type is based on psychological theory about how people perceive their world and make accompanying judgements about these perceptions it is a simplistic view that is prone to overgeneralizing but one that can have surprising effectiveness at predicting people s behaviors in general when taking into account the limitations of the theory there are four categories extroverted introverted e i rather than the mainstream view that this distinction means talkative versus antisocial this difference stems from where one gets their energy extroverts gain energy from being around other people talking conversing being noticed etc they can be alone but will get tired without contact introverts gain energy from being alone and clearing their thoughts being alone and allowed to let their thoughts flow is very energizing coming from personal experience opposite to extroverts introverts have the capability to socialize quite effectively but after a while even five minutes can do wonders intuitive sensory n s here the differences lie in how the individual perceives their world the two domains here are either through the five senses immediate environment or within their mind intuitives n are better at perceiving the world through their mind
and imagining abstract possibilities in the world sensories s are better at perceiving the world through the five senses thinking feeling t f this domain deals with how the individual judges the information they have perceived either the individual makes judgments in a thinking t way or a feeling f way thinkers make conclusions about their world through logic and reasoning feelers make conclusions about their world through emotion and gut feeling judgmental perceiving j p lastly and a little more complicated this domain basically states whether the perceiving trait or the judging trait is the dominant trait of the individual judgers j have their judging trait be their dominant overall trait perceivers p have their perceiving trait be their dominant overall trait here http www myersbriggs org my mbti personality type mbti basics is more in depth detail on the theory kaggle site has info as well reproducing results on your local machine 1 clone the repo onto your machine instructions here https help github com articles cloning a repository if you are not familiar 2 optional download the package virtualenv i prefer using virtual environments to run code since there will be no package conflicts with anything in requirements txt here we will use pip3 install virtualenv python3 compatible feel free to use another package manager such as brew 3 run the following commands in your terminal compatible with linux and mac i creating a virtual environment called your virtual env virtualenv your virtual env replace your virtual env with what you d like to name your environment ii turn on the virtual environment and place you inside source your virtual env bin activate iii install necessary packages from requirements txt pip3 install r requirements txt iv create necessary files for exploratory analysis and model building python3 scripts data extraction cleanup py this will create the following files mbti tokenized csv tokenized data with stopwords mbti cleaned csv tokenized data without stopwords important to make sure these scripts run properly run the code from the main directory after cloning i e do not change directories before running scripts i have added code that grabs the current working directory and makes sure it ends with myersbriggsnlpanalysis if it does not the scripts will not run example for running scripts python3 scripts any script py 4 run any of the other scripts and watch the magic general analysis findings exploratory analysis here i will delve into the various different insights i have mined through this data word frequencies top 25 no change after stop words are removed img src https raw githubusercontent com njfritter myersbriggsnlpanalysis master images wordfrequencylabeled png style width 150px wordcloud based on frequencies above this can also be found in the images folder img src https raw githubusercontent com njfritter myersbriggsnlpanalysis master images wordcloud png style width 150px label personality type frequencies img src https raw githubusercontent com njfritter myersbriggsnlpanalysis master images typefrequencylabeled png style width 150px the variance in the frequency of personality types may be an issue down the line infp infj intp and intj show up the most and disproportionally so because of this there will likely be something called class imbalance this is where some classes are represented much more highly than others also complexity does not always make models better the fact that there are sixteen different classes would impact any model s performance as a 
next step i will alter the types to look at specific type combination differences which may include e vs i n vs s t vs f j vs p nt vs nf vs sf vs st nj vs np vs sj vs sp whichever method i choose should reduce error and increase accuracy due to increased simplicity of the model however any simplification will lead to information loss so there is a trade off model metrics using the original four letter types 16 classes here are the model results model accuracy test error rate cross validation score hyperparameter optimization optimized accuracy multinomial naive bayes 0 2169 0 7831 accuracy 0 21 0 00 vect ngram range 1 1 tfidf use idf false clf alpha 1 0e 10 clf fit prior false 0 3210 linear support vector machine 0 6615 0 3385 accuracy 0 67 0 03 clf alpha 0 001 clf eta0 0 25 clf l1 ratio 0 clf learning rate optimal clf penalty l2 tfidf use idf true vect ngram range 1 1 0 6716 multi layer perceptron 0 5777 0 3423 accuracy 0 66 0 02 blank blank the accuracy and test error rate are based on one train test split with model fitting and default parameters simplest method the optimized accuracy is the accuracy of the model chosen with the best parameters after optimization as well as the cross validation results as we can see the accuracy of these methods will be fairly limited due to the large number of classes and the shortcomings of using tweets as data where slang hyperlinks and more contribute to making the data noisy personality type prediction results here are the success rates of each model predicting each personality type these are the tuned model results personality type multinomial naive bayes linear support vector machine multi layer perceptron enfj 0 000000 0 129032 0 370968 enfp 0 051887 0 561321 0 584906 entj 0 000000 0 342466 0 561644 entp 0 027273 0 613636 0 586364 esfj 0 000000 0 100000 0 100000 esfp 0 000000 0 000000 0 000000 estj 0 000000 0 000000 0 000000 estp 0 000000 0 222222 0 296296 infj 0 475789 0 751579 0 698947 infp 0 676898 0 886914 0 754443 intj 0 230088 0 657817 0 678466 intp 0 396896 0 815965 0 778271 isfj 0 000000 0 415385 0 538462 isfp 0 000000 0 225806 0 494624 istj 0 000000 0 272727 0 480519 istp 0 000000 0 583333 0 703704 we can deduce a couple of things here naive bayes is not a good choice for this data as it consistently has trouble predicting the underrepresented classes the model was only able to give six different classes in its predictions all of the inxx types plus enfp and entp these six types happened to be the top 6 types in frequency this is a case where the model s simplicity ends up hurting it it is highly reliant on seeing many examples of a class to predict it accurately no matter what model the four inxx personality types seem to be predicted pretty well within that the infp type is predicted very well this is consistent with the fact that those four have the most tweets as well as infp having the most overall the enxx types have the second most and they have reasonable performance including actual predictions from naive bayes next are isxx types then esxx types which did the worst overall also having the lowest amount of tweets in fact there seems to be a noticeable relationship between the number of tweets data points per type and the model s predictive success with them there are some exceptions entj isfj and istp but for the most part this is to be expected the more data you have of a certain class the better a model will be able to correctly predict it on new data contributing if you would like to contribute create a new branch to work on
instructions here https github com kunena kunena forum wiki create a new branch with git and manage branches by default the branch you will receive after cloning is the master branch this is where the main changes are deployed so it should not contain any of your code unless the changes have been agreed upon by all parties which is the purpose of creating the new branch this way you must submit a request to push changes to the main branch submit changes as a pull request with any relevant information commented reaching out in some way would also be great to help speed up the process if you do contribute i will add your name as a contributor i am always iterating on this project and any feedback would be greatly welcomed sources cited classification with neural nets using mlp classifier http sdsawtelle github io blog output week4 andrew ng machine learning with python html machine learning models with scikit learn http scikit learn org stable modules classes html module sklearn base mining twitter data with marco bonzanini https marcobonzanini com 2015 03 09 mining twitter data with python part 2 6 easy steps to learn naive bayes https www analyticsvidhya com blog 2017 09 naive bayes explained a practical explanation of a naive bayes classifier https monkeylearn com blog practical explanation naive bayes classifier | natural-language-processing myers-briggs-personality tweets kaggle python3 jupyter-notebook machine-learning | ai |
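A minimal sketch of the kind of scikit-learn pipeline the row above describes, using the reported best Naive Bayes hyperparameters (vect ngram_range (1,1), tfidf use_idf False, clf alpha 1e-10, fit_prior False); the two-tweet dataset below is a placeholder for the tokenized Kaggle data produced by the cleanup script.

```python
from sklearn.feature_extraction.text import CountVectorizer, TfidfTransformer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import Pipeline

# Placeholder stand-ins for the cleaned tweet text and MBTI labels
# produced by the project's data extraction / cleanup script.
posts = [
    "spent the evening alone reading about statistics and loved it",
    "big party tonight come say hi everyone is invited",
]
types = ["INFP", "ENFP"]

pipeline = Pipeline([
    ("vect", CountVectorizer(ngram_range=(1, 1))),
    ("tfidf", TfidfTransformer(use_idf=False)),
    ("clf", MultinomialNB(alpha=1e-10, fit_prior=False)),
])
pipeline.fit(posts, types)
print(pipeline.predict(["quiet night in with a good book"]))
```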
polkadot | dear contributors and users we would like to inform you that we have recently made significant changes to our repository structure in order to streamline our development process and foster better contributions we have merged three separate repositories cumulus substrate and polkadot into a single new repository the polkadot sdk https github com paritytech polkadot sdk go ahead and make sure to support us by giving a star to the new repo by consolidating our codebase we aim to enhance collaboration and provide a more efficient platform for future development if you currently have an open pull request in any of the merged repositories we kindly request that you resubmit your pr in the new repository this will ensure that your contributions are considered within the updated context and enable us to review and merge them more effectively we appreciate your understanding and ongoing support throughout this transition should you have any questions or require further assistance please don t hesitate to reach out to us https forum polkadot network t psa parity is currently working on merging the polkadot stack repositories into one single repository 2883 best regards parity technologies | parity polkadot blockchain rust client node | blockchain |
sql-challenge | employee database data modeling data engineering and data analysis using postgresql design the tables to hold data in the csvs import the csvs into a sql database and answer questions about the data data modeling inspect the csvs and sketch out an erd of the tables data engineering use the information you have to create a table schema for each of the six csv files specify data types primary keys foreign keys and other constraints import each csv file into the corresponding sql table data analysis 1 list the following details of each employee employee number last name first name gender and salary 2 list employees who were hired in 1986 3 list the manager of each department with the following information department number department name the manager s employee number last name first name and start and end employment dates 4 list the department of each employee with the following information employee number last name first name and department name 5 list all employees whose first name is hercules and last names begin with b 6 list all employees in the sales department including their employee number last name first name and department name 7 list all employees in the sales and development departments including their employee number last name first name and department name 8 in descending order list the frequency count of employee last names i e how many employees share each last name | postgresql postgresql-database sql data-modeling data-engineering data-analysis data-science big-data | server |
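A minimal sketch of running one of the listed questions (employees hired in 1986) from Python; the connection parameters, table name, and column names below are assumptions inferred from the question text and CSV descriptions, not taken from the repo's actual schema.

```python
import psycopg2  # assumes a local PostgreSQL instance loaded with the employee CSVs

conn = psycopg2.connect(dbname="employees_db", user="postgres", password="postgres")  # placeholders
with conn, conn.cursor() as cur:
    # Question 2: list employees who were hired in 1986.
    cur.execute(
        """
        SELECT emp_no, last_name, first_name, hire_date
        FROM employees
        WHERE EXTRACT(YEAR FROM hire_date) = 1986
        """
    )
    for row in cur.fetchmany(5):
        print(row)
conn.close()
```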
energy-temperature | energy temperature data engineering project using google cloud begin tf docs requirements name version a name requirement terraform a terraform requirement terraform 0 13 0 a name requirement google a google requirement google 4 35 providers name version a name provider random a random provider random 3 5 1 modules name source version a name module bigquery a bigquery module bigquery bigquery n a a name module buckets a buckets module buckets buckets n a resources name type random id bucket prefix https registry terraform io providers hashicorp random latest docs resources id resource random integer random https registry terraform io providers hashicorp random latest docs resources integer resource inputs name description type default required a name input gcp project a gcp project input gcp project project to use for this config string cr lab aruzza 2706230354 no a name input gcp region a gcp region input gcp region region to use for gcp provider string us central1 no outputs name description a name output data flow bucket a data flow bucket output data flow bucket n a a name output dataset name a dataset name output dataset name n a a name output default label a default label output default label using the output from the local variable to call the label on all modules and resources a name output raw data bucket a raw data bucket output raw data bucket bucket names end tf docs | cloud |
Recommendation-systems | recommendation systems recommendation systems this is a workshop on using machine learning and deep learning techniques to build recommendation systems theory ml and dl formulation prediction vs ranking similarity biased vs unbiased paradigms content based collaborative filtering knowledge based hybrid and ensembles data tabular images text sequences models deep matrix factorisation auto encoders wide and deep rank learning sequence modelling methods explicit vs implicit feedback user item matrix embeddings convolution recurrent domain signals location time context social process setup encode and embed design train and select serve and scale measure test and improve tools python data stack numpy pandas scikit learn keras spacy implicit lightfm python libraries deep recommender libraries 1 tensorrec built on tensorflow 2 spotlight built on pytorch 3 tfranking built on tensorflow learning to rank matrix factorisation based libraries 1 implicit implicit matrix factorisation 2 qmf implicit matrix factorisation 3 lightfm for hybrid recommendations 4 surprise scikit learn type api for traditional algorithms similarity search libraries 1 annoy approximate nearest neighbour 2 nmslib knn methods 3 faiss similarity search and clustering algorithms and approaches collaborative filtering for implicit feedback datasets bayesian personalised ranking for implicit data logistic matrix factorisation neural network matrix factorisation neural collaborative filtering variational autoencoders for collaborative filtering evaluations evaluating recommendation systems about me piyush pathak portfolio https anirudhrapathak3 wixsite com piyush github https github com piyushpathak03 blog https medium com piyushpathak03 follow me linkedin badge https img shields io badge piyushpathak blue style flat square logo linkedin logocolor white link https www linkedin com in piyushpathak03 https www linkedin com in piyushpathak03 p align right img height 100 src https media giphy com media l3urdstnijbny7rwlb giphy gif p | os |
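A self-contained sketch of the matrix-factorisation idea listed under algorithms in the row above, written in plain NumPy rather than the implicit/LightFM libraries the row links to:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy user-item rating matrix; zeros mark unobserved interactions.
R = np.array([[5, 3, 0],
              [4, 0, 1],
              [0, 2, 5]], dtype=float)

k, lr, reg = 2, 0.01, 0.02          # latent factors, learning rate, L2 penalty
P = rng.random((R.shape[0], k))     # user factors
Q = rng.random((R.shape[1], k))     # item factors

for _ in range(2000):               # SGD over the observed cells only
    for u, i in zip(*R.nonzero()):
        err = R[u, i] - P[u] @ Q[i]
        pu = P[u].copy()
        P[u] += lr * (err * Q[i] - reg * P[u])
        Q[i] += lr * (err * pu - reg * Q[i])

print(np.round(P @ Q.T, 2))         # predicted scores, including the unobserved cells
```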
cis411_lab3_uiux | cis 411 lab 3 user experience user interface this is the user experience ux user interface ui lab for cis 411 systems analysis and design for messiah university http messiah edu the purpose of this lab is to engage students to consider how to assess ux ui considerations from their own perspective how to solicit similar input from others and apply findings into their own product designs doing the lab 1 pre requisites 1 github account 2 git is installed on your development machine 3 text editor or other integrated development environment ide for modifying code 2 lab description systems products and solutions are more than a series of feature functions and met requirements eventually all computer systems produce a human interaction or a personal artifact thus growing our appreciation for user experience and the user interface even if we re not personally artistically gifted is foundational to our proficiency as technology professionals this lab has 2 parts 1 evaluate two online job search sites use steve krug s principles to assess two websites apps 2 conduct competitive usability test assess how well a product competitor performs in a usability test and determine the implications for your product design detailed instructions are here lab instructions md and you are expected to compile your findings into a labreport following this template labreports lab template md 3 submissions you are expected to create a lab report as a markdown file under the labreports directory using the lab github handle md naming convention in your forked repository after you have reviewed your work then you should submit a pull request to this repository with your lab report and any accompanying images files e g required diagrams add the pull request url into the courseroom lms canvas for grading resources lab specific help since this may be your first github experience or writing requirements we have some additional resources for you detailed instructions lab instructions md lab template labreports lab template md krug s 2006 don t make me think a common sense approach to web usability 2nd edition retrieved from https www amazon com dont make me think usability dp 0321344758 https www amazon com dont make me think usability dp 0321344758 understanding markdown syntax markdown guide https www markdownguide org github flavored markdown https github github com gfm table to markdown tool https tabletomarkdown com convert spreadsheet to markdown license this content is provided under the mit license license credits special thanks to joel worrall aka tangollama https github com tangollama for co developing this course and this lab | os |
GPT-gouv | avis finetuning open source llms with data of french public service project organization license makefile makefile with commands like make data or make train readme md the top level readme for developers using this project data external data from third party sources interim intermediate data that has been transformed processed the final canonical data sets for modeling raw the original immutable data dump docs a default sphinx project see sphinx doc org for details models trained and serialized models model predictions or model summaries notebooks jupyter notebooks naming convention is a number for ordering the creator s initials and a short delimited description e g 1 0 jqp initial data exploration references data dictionaries manuals and all other explanatory materials reports generated analysis as html pdf latex etc figures generated graphics and figures to be used in reporting requirements txt the requirements file for reproducing the analysis environment e g generated with pip freeze requirements txt setup py makes project pip installable pip install e so src can be imported src source code for use in this project init py makes src a python module data scripts to download or generate data make dataset py features scripts to turn raw data into features for modeling build features py models scripts to train models and then use trained models to make predictions predict model py train model py visualization scripts to create exploratory and results oriented visualizations visualize py tox ini tox file with settings for running tox see tox readthedocs io p small project based on the a target blank href https drivendata github io cookiecutter data science cookiecutter data science project template a cookiecutterdatascience small p | ai |
GPT-RAG | components 1 data ingestion https github com azure gpt rag ingestion 2 orchestrator https github com azure gpt rag orchestrator 3 app front end https github com azure gpt rag frontend video gpt rag prompt engineering finetuning train spanish alt text https img youtube com vi icsf4yiriea 0 jpg https www youtube com watch v icsf4yiriea what is a rag pattern img src media rag2 png alt retrieval augmented generation rag pattern width 1024 reference implementation of the retrieval augmented generation rag pattern why to start with rag pattern img src media rag1 png alt why rag width 1024 gpt rag architecture overview img src architecture diagram architecture diagram gpt rag png alt architecture overview width 1024 connectivity components azure virtual network vnet to secure data flow isolated internal inbound outbound connections azure front door lb l7 web application firewall waf to secure internet facing components bastion rdp ssh over tls secure remote desktop access solution for vms in the virtual network jumpbox a secure jump host to access vms in private subnets ai workloads azure open ai a managed ai service for running advanced language models like gpt 4 private dns zones for name resolution within the virtual network and between vnets cosmos db a globally distributed multi model database service to support ai applications web applications in azure web app azure ai services for building intelligent applications high availability disaster recovery ready solution audit logs monitoring observability app insight continuous operational improvement architecture deep dive img src media rag3 png alt architecture deep dive width 1024 1 data ingestion https github com azure gpt rag ingestion optimizes data preparation for azure openai 2 orchestrator https github com azure gpt rag orchestrator the system s dynamic backbone ensuring scalability and a consistent user experience 3 app front end https github com azure gpt rag frontend built with azure app services and the backend for front end pattern offers a smooth and scalable user interface prerequisites azure developer cli https aka ms azure dev install how to deploy gpt rag to deploy this solution you just need to execute the next steps 1 provision required azure services you can do it by clicking on the following button deploy to azure https aka ms deploytoazurebutton https portal azure com create microsoft template uri https 3a 2f 2fraw githubusercontent com 2fazure 2fgpt rag 2fmain 2finfra 2fmain json or by using azure developer cli azd https aka ms azure dev install executing the following lines in terminal azd auth login azd init t azure gpt rag azd up important when selecting the target location check here https learn microsoft com en us azure cognitive services openai concepts models the regions that currently support the azure openai models you want to use 2 ingestion component use data ingestion https github com azure gpt rag ingestion repo template to create your data ingestion git repo and execute the steps in its deploy section 3 orchestrator component use orchestrator https github com azure gpt rag orchestrator repo template to create your orchestrator git repo and execute the steps in its deploy section 4 front end component use app front end https github com azure gpt rag frontend repo template to create your own frontend git repo and execute the steps in its deploy section main components 1 data ingestion https github com azure gpt rag ingestion 2 orchestrator https github com azure gpt rag orchestrator 3 app front end 
https github com azure gpt rag frontend built with azure app services and the backend for front end pattern offers a smooth and scalable user interface project addons pricing estimation pricing model https github com azure gpt rag wiki gpt e2 80 90rag e2 80 90 pricing model governance governance model https share mindmanager com publish 9ogrdwqzmazzb6ilgurohv4lj1lrikjowc0w u2u technical references get started with the cloud adoption framework https learn microsoft com en us azure cloud adoption framework get started index what is an azure landing zone https learn microsoft com en us azure cloud adoption framework ready landing zone index azure openai service https learn microsoft com azure cognitive services openai overview azure cognitive search https learn microsoft com azure search search what is azure search retrieval augmented generation rag paper https arxiv org abs 2005 11401 retrieval augmented generation rag in azure cognitive search https learn microsoft com en us azure search retrieval augmented generation overview build and maintain your company copilot with azure ml and gpt 4 https www youtube com watch si b2tjsq4z4r7rksew v 2meevuwayxs revolutionize your enterprise data with chatgpt next gen apps w azure openai and cognitive search https aka ms entgptsearchblog introducing azure openai service on your data in public preview https techcommunity microsoft com t5 ai cognitive services blog introducing azure openai service on your data in public preview ba p 3847000 grounding llms https techcommunity microsoft com t5 fasttrack for azure grounding llms ba p 3843857 text what 20is 20grounding 3f relevance 20of 20the 20generated 20output check your facts and try again improving large language models with external knowledge and automated feedback https www microsoft com en us research group deep learning group articles check your facts and try again improving large language models with external knowledge and automated feedback microsoft guidance validation and robustness of responses https lnkd in ggesqmsv rag vs finetuning https towardsdatascience com rag vs finetuning which is the best tool to boost your llm application 94654b1eaba7 contributing this project welcomes contributions and suggestions most contributions require you to agree to a contributor license agreement cla declaring that you have the right to and actually do grant us the rights to use your contribution for details visit https cla opensource microsoft com when you submit a pull request a cla bot will automatically determine whether you need to provide a cla and decorate the pr appropriately e g status check comment simply follow the instructions provided by the bot you will only need to do this once across all repos using our cla this project has adopted the microsoft open source code of conduct https opensource microsoft com codeofconduct for more information see the code of conduct faq https opensource microsoft com codeofconduct faq or contact opencode microsoft com mailto opencode microsoft com with any additional questions or comments trademarks this project may contain trademarks or logos for projects products or services authorized use of microsoft trademarks or logos is subject to and must follow microsoft s trademark brand guidelines https www microsoft com en us legal intellectualproperty trademarks usage general use of microsoft trademarks or logos in modified versions of this project must not cause confusion or imply microsoft sponsorship any use of third party trademarks or logos are subject to those 
third party s policies | azure gpt-3 gpt-4 openai | ai |
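A toy, dependency-free sketch of the retrieve-then-generate flow the GPT-RAG diagrams above describe; the deployed solution uses Azure Cognitive Search and Azure OpenAI for these steps, so everything below is illustrative only and none of it reflects the actual Azure SDK calls.

```python
# Toy in-memory "index" standing in for Azure Cognitive Search.
docs = [
    "the orchestrator routes each question through retrieval before calling the llm",
    "the data ingestion component chunks source documents and indexes them",
    "the front end is an azure app service using the backend-for-frontend pattern",
]

def retrieve(query: str, k: int = 2) -> list[str]:
    # Crude keyword-overlap scoring in place of a real vector/keyword search.
    q = set(query.lower().split())
    return sorted(docs, key=lambda d: -len(q & set(d.split())))[:k]

def build_grounded_prompt(query: str) -> str:
    # Grounding: the model is told to answer only from retrieved sources.
    context = "\n".join(f"- {d}" for d in retrieve(query))
    return f"Answer using only these sources:\n{context}\n\nQuestion: {query}"

print(build_grounded_prompt("what does the ingestion component do?"))
```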
DIY-RTOS-Learning-tinyOS- | diy rtos learning tinyos learn to develop a tiny rtos step by step it s the my exercise for the online course by lecturer lisutong more information about the course http 01ketang cc | os |
techmoon | techmoon information technology related article | server |
blockchain-in-c | noincoin block chain in c major a rough cryptocurrency implementation written in c uses only c std library and nanomsg for communication coin design coin png nodes can mine noins notecoins which can be used to buy musical notes on the blockchain each block contains eight notes one octave and together they comprise an everlasting piece of music anyone can contribute to client program can transfer noins to other accounts or post the musical notes i wrote this just to learn how cryptocurrencies work it lacks the full robustness to support an absolutely massive scale potential improvements dynamic difficulty of proof generation sequencer that requests blocks as needed | blockchain-technology music | blockchain |
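The mining the row above describes is a standard proof-of-work loop; since the repo itself is C, the following is only a language-agnostic illustration of the concept in Python, not the project's code. The fixed `difficulty` parameter is also where the "dynamic difficulty" improvement the row mentions would hook in.

```python
import hashlib

def mine(block_data: bytes, difficulty: int = 4):
    """Search for a nonce whose SHA-256 digest starts with `difficulty` hex zeros."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + str(nonce).encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

# A block here would carry the eight notes described above; this payload is a placeholder.
nonce, digest = mine(b"one octave: c d e f g a b c")
print(nonce, digest)
```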
ipython-notebook-nltk | ipython notebook nltk an introduction to nltk with python run the notebook and see preview in github render notebook https github com luchux ipython notebook nltk blob master nlp 20 20melbdjango ipynb run it locally with the env and libs needed clone the repo create a virtual environment and install the requirements you can clean them a bit if not using some of the libs run ipython notebook in the root directory and enjoy the notebook feel free to change test play with nltk datasets in this notebook sources many of the exercises and content is stuff i learned and used from http nltk org book natural language processing with python http shop oreilly com product 9780596516499 do mit license copyright c 2009 luciano m guasco luchux permission is hereby granted free of charge to any person obtaining a copy of this software and associated documentation files the software to deal in the software without restriction including without limitation the rights to use copy modify merge publish distribute sublicense and or sell copies of the software and to permit persons to whom the software is furnished to do so subject to the following conditions the above copyright notice and this permission notice shall be included in all copies or substantial portions of the software the software is provided as is without warranty of any kind express or implied including but not limited to the warranties of merchantability fitness for a particular purpose and noninfringement in no event shall the authors or copyright holders be liable for any claim damages or other liability whether in an action of contract tort or otherwise arising from out of or in connection with the software or the use or other dealings in the software | natural-language-processing nltk-library ipython-notebook python nltk | ai |
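The notebook above follows the NLTK book; a minimal sketch of the typical first steps such an introduction covers (tokenization and stopword removal):

```python
import nltk
from nltk.corpus import stopwords
from nltk.tokenize import word_tokenize

nltk.download("punkt", quiet=True)       # tokenizer models (newer NLTK may also need "punkt_tab")
nltk.download("stopwords", quiet=True)   # stopword lists

text = "NLTK makes it easy to tokenize text and filter out common stopwords."
tokens = [t.lower() for t in word_tokenize(text) if t.isalpha()]
content_words = [t for t in tokens if t not in stopwords.words("english")]
print(content_words)
```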
GrimoireJS | grimoire js github logo png slack status https grimoire slackin herokuapp com badge svg https grimoire slackin herokuapp com circle ci https circleci com gh grimoiregl grimoirejs svg style svg https circleci com gh grimoiregl grimoirejs license https img shields io badge license mit blue svg https github com jthreejs jthree blob develop license dependency status https david dm org grimoiregl grimoirejs svg https david dm org grimoiregl grimoirejs devdependency status https david dm org grimoiregl grimoirejs dev status svg https david dm org grimoiregl grimoirejs info devdependencies greenkeeper badge https badges greenkeeper io grimoiregl grimoirejs svg https greenkeeper io overview grimoire js provide a bridge between web engineers and cg engineers there were big gap between the development flows used for each web engineers have typically used event driven javascript programs for daily works and they mutate dom apis to make web pages dynamic however cg engineers has typically used loop based programs for daily works these are mostly build with a programming language with strong type like c or c and recently cg engineers more like to use strongly structured engines like unity this is why these 2 engineers have so much different flow for workings this is why it is hard to learn cg stuff by web engineers and cg engineers are also hard to make suitable apis for web engineers working with grimoire js is a javascript typescript framework to solve this problem with strong architecture features you can see several feature of grimoire js providing in this section we strongly recommend to see our top page http grimoire gl to learn these features most of written things are same as this readme md but our samples on the top pages are working html like markup we provides a syntax like xml to compose webgl canvas this is kind of html for web engineers you can create 360 degree image viewer on browser only by writing the code below see official page to see working example xml goml scene camera camera mesh geometry sphere cull front texture 360 jpg mesh components rotate speed 0 1 mesh components mesh scene goml dom operation api web engineers typically write javascript to mutate dom structures or attributes all that kinds things are same at grimoire web engineers can use query based operation api to changing attributes modifying structures of dom or registering event handlers these are codes to co work webgl canvas and web uis that made with ordinal web development way you can see working example on our official top page xml goml scene camera camera mesh texture logo png geometry cube mesh components rotate speed 0 0 1 0 mesh components mesh scene goml js gr function var mesh gr simple canvas mesh simple red on click function mesh setattribute color red simple blue on click function mesh setattribute color blue mesh on mouseenter function mesh setattribute scale 2 0 simple bigger addclass bold label simple smaller removeclass bold label mesh on mouseleave function mesh setattribute scale 1 0 simple smaller addclass bold label simple bigger removeclass bold label simple and powerful architecture typescript ready if you really want to make webgl stuff on your page it is hard to make only by web engineers if that contents requires highly customized representation in this situation web engineers and cg engineers need to co work cg engineers can write a component and these are reusable and these are able to be written by typescript safe and effective environment for development this is a sample 
to make objects waving movement you can see full comprehensive this sample at our top page ts import component from grimoirejs ref node component import isceneupdateargument from grimoirejs fundamental ref scenerenderer isceneupdateargument import transformcomponent from grimoirejs fundamental ref components transformcomponent import vector3 from grimoirejs math ref vector3 import gr from grimoirejs class wave extends component public static attributes amp default 1 0 converter number speed default 1 0 converter number public amp number public speed number private transform transformcomponent public mount void this transform this node getcomponent transformcomponent this bindattributes bind component attributes to fields public update t isceneupdateargument void this transfrom position new vector3 this transform position x math sin this speed t timer timeinsecound this amp this transform position z gr registercomponent wave wave download please see official site and download page https grimoire gl guide 1 essentials 01 installation html useful links official site http grimoire gl api reference see here https api grimoire gl core this document is automatically generated document stamp make sure the api reference is only containing core stuff mutating goml stuff operating attributes methods being available on component instance and so on if you want to see webgl related feature of api you should see renderer plugin page https api grimoire gl grimoirejs fundamental license mit license see the license file for more detail | webgl canvas grimoire | front_end |
nlp-tasks | natural language processing tasks and examples with the advancement of a i technology in recent years natural language processing technology has been able to solve so many problems while working as an nlp engineer i encountered various tasks and i thought it would be nice to gather and organize the natural language processing tasks i have dealt with in one place borrowing kyubyong s project https github com kyubyong nlp tasks format i organized natural language processing tasks with references and example code automated essay scoring wiki automated essay scoring https en wikipedia org wiki automated essay scoring data the hewlett foundation automated essay scoring https www kaggle com c asap aes data model bert https arxiv org pdf 1810 04805 pdf model roberta https arxiv org abs 1907 11692 model electra https arxiv org abs 2003 10555 off the shelf pororo s aes https kakaobrain github io pororo text cls aes html automatic speech recognition wiki speech recognition https en wikipedia org wiki speech recognition data librispeech https www openslr org 12 data aishell 1 https arxiv org abs 1709 05522 data ksponspeech https aihub or kr aidata 105 model deep speech2 https arxiv org abs 1512 02595 model listen attend and spell https arxiv org abs 1508 01211 model wav2vec 2 0 https arxiv org abs 2006 11477 off the shelf pororo s asr https kakaobrain github io pororo miscs asr html code example with ksponspeech https github com sooftware nlp tasks tree main automatic speech recognition dialogue generation wiki dialogue system https en wikipedia org wiki dialogue system data persona chat https github com facebookresearch parlai tree main projects personachat data korean sns corpus https aihub or kr aidata 30718 model dialogue gpt https arxiv org abs 1911 00536 code example with korean sns corpus https github com sooftware nlp tasks tree main dialogue generation dialogue retrieval wiki dialogue system https en wikipedia org wiki dialogue system data persona chat https github com facebookresearch parlai tree main projects personachat data ubuntu dialogue corpus https arxiv org abs 1506 08909 data korean sns corpus https aihub or kr aidata 30718 model poly encoder https arxiv org abs 1905 01969 code example with ubuntu dialogue corpus https github com sooftware nlp tasks tree main dialogue retrieval fill in the blank wiki cloze test https en wikipedia org wiki cloze test info masked language modeling with bert https towardsdatascience com masked language modelling with bert 7d49793e5d2c model bert https arxiv org pdf 1810 04805 pdf model roberta https arxiv org abs 1907 11692 off the shelf pororo s fill in the blank https kakaobrain github io pororo tagging fill html code example with wikicorpus https github com sooftware nlp tasks tree main fill in the blank grammatical error correction wiki autocorrection https en wikipedia org wiki autocorrection data nus non commercial research trial corpus license https www comp nus edu sg nlp conll14st nucle license pdf data cornell movie dialogs corpus https www cs cornell edu cristian cornell movie dialogs corpus html off the shelf pororo s gec https kakaobrain github io pororo seq2seq gec html grapheme to phoneme wiki grapheme https en wikipedia org wiki grapheme wiki phoneme https en wikipedia org wiki phoneme representative data multilingual pronunciation data https drive google com drive folders 0b7r gatfzj2awkpswhpxuklwumm off the shelf model pororo s g2p https kakaobrain github io pororo seq2seq g2p html knowledge powered conversation paper 
wizard of wikipedia knowledge powered conversational agents https arxiv org abs 1811 01241 data wizard of wikipedia https huggingface co datasets maximedb wow code example with wizard of wikipedia https github com sooftware nlp tasks tree main knowledge powered conversation language modeling wiki language model https en wikipedia org wiki language model info a beginner s guide to language models https towardsdatascience com the beginners guide to language models aa47165b57f9 model gpt3 https arxiv org abs 2005 14165 model gpt2 https d4mucfpksywv cloudfront net better language models language models are unsupervised multitask learners pdf model ken lm https github com kpu kenlm model rnn lm https www fit vutbr cz research groups speech publi 2010 mikolov interspeech2010 is100722 pdf code example with openwebtext https github com sooftware nlp tasks tree main language modeling machine reading comprehension wiki reading comprehension https en wikipedia org wiki reading comprehension info machine reading comprehension with bert https mccormickml com 2020 03 10 question answering with a fine tuned bert data squad https rajpurkar github io squad explorer data korquad https korquad github io korquad 201 0 model bert https arxiv org pdf 1810 04805 pdf model roberta https arxiv org abs 1907 11692 model electra https arxiv org abs 2003 10555 off the shelf pororo s mrc https kakaobrain github io pororo tagging mrc html code example with squad korquad https github com sooftware nlp tasks tree main machine reading comprehension machine translation wiki translation https en wikipedia org wiki translation data wmt 2014 english to french https www statmt org wmt14 translation task html data korean english translation corpus https aihub or kr aidata 87 model transformer https arxiv org abs 1706 03762 off the shelf pororo s translation https kakaobrain github io pororo seq2seq mt html code example with korean english translation corpus https github com sooftware nlp tasks tree main machine tranlsation math word problem solving paper with code math word problem solving https paperswithcode com task math word problem solving data deepmind mathmatics dataset https github com deepmind mathematics dataset data kmwp korean math word problems https github com tunib ai kmwp code example with kmwp https github com sooftware nlp tasks tree main math word problem natural language inference wiki textual entailment https en wikipedia org wiki textual entailment data glue mnli https arxiv org abs 1804 07461 data kornli https github com kakaobrain kornludatasets model bert https arxiv org pdf 1810 04805 pdf model roberta https arxiv org abs 1907 11692 model electra https arxiv org abs 2003 10555 off the shelf pororo s nli https kakaobrain github io pororo text cls nli html code example with glue mnli https github com sooftware nlp tasks tree main natural language inference named entity recognition wiki named entity recognition https en wikipedia org wiki named entity recognition data conll 2002 ner corpus https github com teropa nlp tree master resources corpora conll2002 data conll 2003 ner corpus https github com synalp ner tree master corpus conll 2003 data naver ner https github com naver nlp challenge model bert https arxiv org pdf 1810 04805 pdf model roberta https arxiv org abs 1907 11692 model electra https arxiv org abs 2003 10555 off the shelf pororo s ner https kakaobrain github io pororo tagging ner html code example with naver ner https github com sooftware nlp tasks tree main named entity recognition 
paraphrase generation wiki paraphrase https en wikipedia org wiki paraphrase off the shelf pororo s paraphrase generation https kakaobrain github io pororo seq2seq para gen html phoneme to grapheme off the shelf pororo s p2g https kakaobrain github io pororo seq2seq p2g html sentiment analysis wiki sentiment analysis https en wikipedia org wiki sentiment analysis data glue sst https arxiv org abs 1804 07461 data nsmc https github com e9t nsmc model bert https arxiv org pdf 1810 04805 pdf model roberta https arxiv org abs 1907 11692 model electra https arxiv org abs 2003 10555 off the shelf pororo s sentiment analysis https kakaobrain github io pororo text cls sentiment html code example with nsmc https github com sooftware nlp tasks tree main sentiment classification semantic textual similarity wiki semantic similarity https en wikipedia org wiki semantic similarity data glue sts https arxiv org abs 1804 07461 data korsts https github com kakaobrain kornludatasets model bert https arxiv org pdf 1810 04805 pdf model roberta https arxiv org abs 1907 11692 model electra https arxiv org abs 2003 10555 off the shelf pororo s sts https kakaobrain github io pororo text cls sts html code example with squad https github com sooftware nlp tasks tree main semantic textual similarity speech synthesis wiki speech synthesis https en wikipedia org wiki speech synthesis data lj speech https keithito com lj speech dataset data css10 https github com kyubyong css10 data kss https www kaggle com bryanpark korean single speaker speech dataset model tacotron2 https arxiv org abs 1712 05884 model fastspeech2 https arxiv org abs 2006 04558 model wavenet https deepmind com blog article wavenet generative model raw audio model hifi gan https arxiv org abs 2010 05646 off the shelf pororo s tts https kakaobrain github io pororo miscs tts html code example with lj speech https github com nvidia tacotron2 code example with kss https github com sooftware takotron2 summarization wiki automatic summarization https en wikipedia org wiki automatic summarization data xsum https arxiv org abs 1808 08745 data korean summarization corpus https aihub or kr aidata 8054 model bart https arxiv org abs 1910 13461 off the shelf pororo s summarization https kakaobrain github io pororo seq2seq summary html code example with xsum https github com sooftware nlp tasks tree main summarization author soohwan kim https github com sooftware sooftware contacts sh951011 gmail com | ai |
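Many of the tasks catalogued in the row above now have one-line off-the-shelf entry points; a minimal sketch using the transformers pipeline API for two of them (sentiment analysis and fill-in-the-blank), with the default sentiment checkpoint left implicit:

```python
from transformers import pipeline  # assumes the transformers library is installed

sentiment = pipeline("sentiment-analysis")
print(sentiment("this curated list of nlp tasks is genuinely useful"))

fill_mask = pipeline("fill-mask", model="bert-base-uncased")
print(fill_mask("machine translation maps one [MASK] to another.")[0]["token_str"])
```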
|
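The catalog above links out to per-task training code; for a quick taste of one listed task (sentiment analysis) here is a minimal sketch using the Hugging Face transformers library that several of the listed models (BERT, RoBERTa, ELECTRA) ship through. It is an illustrative snippet, not code from the catalog, and it downloads a default English model rather than the Korean datasets (e.g. NSMC) mentioned above.

```python
# pip install transformers torch
from transformers import pipeline

# Off-the-shelf sentiment analysis; fetches a default English model on first use.
classifier = pipeline("sentiment-analysis")
print(classifier("This task catalog saved me a lot of searching."))
# -> [{'label': 'POSITIVE', 'score': 0.99...}]
```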
ESP-RTOS-MQTT | mqtt library for esp8266 for espressif rtos sdk v1 4 0 based on eclipse paho https eclipse org paho clients c embedded this project aims to provide an independent mqtt library which can be seamlessly added to any existing esp8266 project it is inspired by esp mqtt https github com tuanpmt esp mqtt by tuanpmt and esp rtos paho https github com baoshi esp rtos paho by baoshi despite these being excellent projects it takes some effort to include mqtt in your own project starting from them prerequisites 1 xtensa lx106 architecture toolchain to build the library if you are using the espressif lubuntu vm https espressif com en support explore get started esp8266 getting started guide this is already installed and added to your path if you wish to set up your own use esp open sdk https github com pfalcon esp open sdk 2 esp8266 rtos sdk v1 4 0 github https github com espressif esp8266 rtos sdk build 1 make sure the xtensa toolchain is added to path 2 export sdk path with the path to the espressif sdk installation sh export sdk path path to espressif rtos sdk dir 3 build the library and install it in the sdk sh make make install usage 1 in your project makefile add lmqtt to linkflags eagle app v6 linkflags eagle app v6 l sdk path lib lmqtt lcirom lcrypto and include headers towards the end of the makefile includes includes i pdir include includes i sdk path include mqtt sinclude sdk path makefile todos ssl support esp open rtos support add examples mit license | os |
|
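A sketch of a desktop-side check for a device running the library above: subscribe to the broker the ESP8266 publishes to and print whatever arrives. This is not part of the repo; the broker address and topic are placeholders, and the snippet uses the paho-mqtt 1.x Python client, which is a separate project from the embedded Paho C library the firmware links against.

```python
# pip install "paho-mqtt<2"
import paho.mqtt.client as mqtt

BROKER = "192.168.1.10"   # placeholder: the broker your firmware connects to
TOPIC = "esp8266/#"       # placeholder topic

def on_connect(client, userdata, flags, rc):
    print("connected, rc =", rc)
    client.subscribe(TOPIC)

def on_message(client, userdata, msg):
    print(msg.topic, msg.payload.decode(errors="replace"))

client = mqtt.Client()    # paho-mqtt 1.x callback signatures
client.on_connect = on_connect
client.on_message = on_message
client.connect(BROKER, 1883, keepalive=60)
client.loop_forever()
```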
Email-Phishing-Attempts-Detection-using-NLP | email phishing attempts detection using nlp email phishing attempts detection from the text of email bodies using natural language processing and machine learning | ai |
|
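The row above names the task but not the method. A minimal, hypothetical baseline for it is TF-IDF features over email bodies plus logistic regression; the CSV name and column names below are invented for illustration.

```python
# pip install pandas scikit-learn
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

df = pd.read_csv("emails.csv")  # hypothetical: columns "body" and "is_phishing"
X_train, X_test, y_train, y_test = train_test_split(
    df["body"], df["is_phishing"],
    test_size=0.2, random_state=0, stratify=df["is_phishing"])

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=2),  # word and bigram features
    LogisticRegression(max_iter=1000),
)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```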
resources | go data resources please see the following sub sections for specific types of go based data science resources community resources rarr community events conferences blogs etc tooling resources rarr tooling packages libraries tools etc | ai |
|
school-management-backend | school management backend backend system overview backend system overview docs images slide1 jpg rest api api docs docs api yaml backend default port 4000 base address http host name 4000 route categories login login related routes class class management related routes homework homework related routes df definition related routes m master routes rel relationship related routes login routes login routes docs images tsd 20presentation page 2 jpg admin routes login routes docs images tsd 20presentation page 3 jpg parent routes login routes docs images tsd 20presentation page 4 jpg teacher routes login routes docs images tsd 20presentation page 5 20 1 jpg class routes login routes docs images tsd 20presentation page 6 jpg homework routes login routes docs images tsd 20presentation page 7 jpg all other routes login routes docs images tsd 20presentation page 8 jpg the following is the list of routes following the above architecture similar routes docs images listofsimilarroutes png chat module architecture login routes docs images tsd 20presentation page 9 20 1 jpg project report introduction project background the education sector is a fast growing industry in sri lanka due to the rapid development of the private school system there is huge competition among schools and parents department of census statistics 2019 alt text docs images image1 png image tooltip schools are expected to provide many sporting activities and subjects to their students due to the higher workload schools and teachers forget to deliver the required information within a limited period there is a direct impact on children s education due to this delay most of the children like to play or do anything other than homework if parents or teachers didn t give any homework many children would not do any work related to education alt text docs images image2 png image tooltip less work will lead them to lower performances if parents identify the reasons they can find solutions parent teacher meetings home work past papers and term tests can be used to identify and correct the weakness of students fast and efficient communication and resource sharing between teachers and parents can be used to identify and correct the weaknesses of the children school attendance is also very important to generate higher performances attendanceworks org 2018 the team was named as hexatech as a motivation factor all team members are possed with keen to learn and overcome challenges members of the team are table tr td ol li thissa gunarathne li ol td td rowspan 6 as td td p style text align right project manager p td tr tr td ol li amila dissanayaka li ol td td p style text align right server administrator p td tr tr td ol li preshan visad silva li ol td td p style text align right business analyst p td tr tr td ol li niruhan viswarupan li ol td td p style text align right system developer p td tr tr td ol li mohan prakashini li ol td td p style text align right qa executive p td tr tr td ol li lankitha gallage li ol td td p style text align right system developer p td tr table aims and objectives alt text docs images image3 png image tooltip as a team we expect to reduce the communication gap between teachers and parents shilpa ys am is a progressive web application developed using mern stack which is to create a user friendly personalized environment for schools to maintain their communication between teachers parents and students alt text docs images image4 png image tooltip we aim to develop the efficiency of teachers 
and school management for better results development objectives can be categorized into five areas app messaging table tr td rowspan 5 strong em message to em strong td td ol li em all the members em li ol td tr tr td ol li em an entire grade em li ol td tr tr td ol li em an entire class em li ol td tr tr td ol li em a particular parent em li ol td tr tr td ol li em a teacher from a parent em li ol td tr table notifications table tr td rowspan 5 strong em send notifications to em strong td td ol li em student unauthorized absence em li ol td tr tr td ol li em fees reminders em li ol td tr tr td ol li em parent teacher meetings em li ol td tr tr td ol li em homework em li ol td tr tr td ol li em annual student profile update by parents em li ol td tr table time tables table tr td rowspan 3 strong em record em strong td td ol li em class time table em li ol td tr tr td ol li em exam time table em li ol td tr tr td ol li em extra activities em li ol td tr table downloads table tr td rowspan 3 strong em links to download em strong td td ol li em exam report card em li ol td tr tr td ol li em exam past papers em li ol td tr tr td ol li em academic reports em li ol td tr table file update table tr td rowspan 5 strong em master transaction em strong td td ol li em student profile em li ol td tr tr td ol li em teachers profile em li ol td tr tr td ol li em school calendar em li ol td tr tr td ol li em contact information em li ol td tr tr td ol li em attendance em li ol td tr table project planning table tr td img src docs images image5 png width alt alt text title image tooltip td td em here shows the gantt chart used in the life cycle of software development of the project em ul li there are five milestones to check the progress with the time using milestones project deviation can be identified ul li parallel activities are visible in the same colours then identification of such tasks is easier than other tasks li ul li ul td tr table ask allocation task allocation is done according to the selected role of the members there are three members allocated for programming and development related tasks other three members are assigned for quality assurance business analysis and project management task allocation sheet is an effective way to identify members assigned for each task table tr td colspan 4 strong em project name em strong p strong em shilpa parents teacher communication application em strong td td colspan 3 strong em project start date em strong p strong em 15 01 2020 em strong td tr tr td em brainstorming em td td 15 12 2019 td td high td td 15 12 2019 td td done td td every team member td td discussion about the project topics td tr tr td em requirement gathering em td td 30 1 2020 td td normal td td 28 01 2020 td td done td td preshan td td completed successfully td tr tr td em business feasibility analysis em td td 4 2 2020 td td normal td td 3 2 2020 td td done td td preshan amila thissa td td completed successfully td tr tr td em technical feasibility analysis em td td 10 2 2020 td td normal td td 10 2 2020 td td done td td lankitha niruhan amila td td completed successfully td tr tr td em software design phase 1 em td td 15 02 2020 td td high td td 15 02 2020 td td done td td lankitha preshan amila td td completed successfully td tr tr td em software design phase 2 em td td 18 02 2020 td td high td td 18 02 2020 td td done td td preshan niruhan lankitha td td completed successfully td tr tr td em user interface design em td td 14 02 2020 td td normal td td 27 03 2020 td td pending 
td td lankitha td td pending due to the workload of the programmer td tr tr td em quality assurance of the design em td td 20 02 2020 td td normal td td 27 03 2020 td td pending td td prakashini td td partially completed td tr tr td em back end programming em td td 8 3 2020 td td high td td 10 3 2020 td td done td td niruhan td td most of the development process done td tr tr td em front end programming em td td 15 03 2020 td td high td td 27 03 2020 td td pending td td lankitha td td pending due to the workload of the programmer td tr tr td em integration of the modules em td td 17 03 2020 td td high td td 27 03 2020 td td pending td td lankitha amila niruhan td td pending due to defendant work td tr tr td em testing em td td 18 03 2020 td td normal td td 27 03 2020 td td pending td td prakashini preshan thissa td td pending due to defendant work td tr tr td em quality assurance of the software em td td 20 03 2020 td td normal td td 27 03 2020 td td pending td td prakashini td td pending due to defendant work td tr tr td em finalize the development process em td td 20 03 2020 td td normal td td 27 03 2020 td td pending td td lankitha niruhan preshan td td pending due to defendant work td tr table risk management risk management is an important part of the successful completion of the project software risk management requires a greater depth of knowledge than other risk management processes identified risks table tr td strong em id em strong td td strong em identified risk em strong td td strong em risk management method em strong td tr tr td strong em r sub 1 sub em strong td td em not being able to complete the project on time em td td assign half of the team members for development related tasks frequently ask programmers to present their work td tr tr td strong em r sub 2 sub em strong td td em less user satisfaction em td td instructions were given to the developers and business analysis to tally the gathered requirements and development output td tr tr td strong em r sub 3 sub em strong td td em cost overrun em td td encourage members to develop most of the modules within the team td tr tr td strong em r sub 4 sub em strong td td em complete failure of the project em td td due to coronavirus infections there is a possibility that the project fails advice has given to the members to maintain their health td tr tr td strong em r sub 5 sub em strong td td em software quality problems em td td rigorous quality assurance testing and validation td tr tr td strong em r sub 6 sub em strong td td em knowledge gaps of the members em td td self learning the identified gaps online and knowledge sharing among group members td tr tr td strong em r sub 7 sub em strong td td em less commitment of the team members em td td always motivate team member to finish the work as soon as possible td tr table system analysis system analysis is used to identify requirements and convert them into a software reality first required information should be collected by analysing gathered requirements the framework of the software can be designed fact gathering techniques we used the following techniques to gather the required information 1 interviews top down approach used frequently 2 observation school work process observed 3 document analysis analysis student reports school letters and documents published on the notice board analysis of gathered facts according to the gathered facts most of the schools expect unique applications rather than a common application they need to overcome the weakness of the whatsapp 
app their budget is also limited schools expect to reduce the workload of the teachers up to some extent they mainly expect attendance absence tracking homework distribution attendance to extracurricular activities payment reminders and individual directed communication using the mobile application better security of the application and over the phone production support also among the expectations of the school management software requirement specification introduction the purpose of this document is to improve communication systems in school document convention database distributed database and the entity relationships feasibility study economic feasibility 1 shilpa reduced a lot of manual work in school 2 using shilpa app school s work efficiency will be improved 3 student s sensitive information eg medical updates can be updated safely and no unauthorized access therefore shilps app is economically feasible technical feasibility 1 mongodb is used as a database which is easy to install and setup and easy to scale 2 the app is developed under the reactjs environment where it provides real and significant mobile applications and it supports both android and ios mobile devices it will save developers time because developers do not need to develop two applications for android and ios therefore shilpa app can be developed in a reasonable time operational feasibility the shilpa app develops to enhance communication and relationship between schools and families since the app can run on both android and ios mobile devices anyone can log in to the app at any time anywhere school computer laptops or tablets can be used for backend purposes since the server is hosted on the cloud no need to keep the server administrator to look after the server and take backups therefore shilpa app is operationally easy to handle intended audience suggestions the system is restricted within the school premises with admin students parents and teachers this was implemented under the requirements gathered from table tr td strong em interview with em strong td td strong em position em strong td td strong em school em strong td tr tr td rev fr indunil sampath td td rector primary principle td td st sebastian s college mortuwa td tr tr td mrs aruni fernando td td class teacher td td st sebastian s college mortuwa td tr tr td mrs shalika perera td td parent td td st sebastian s college mortuwa td tr tr td mas nimira cooray td td student td td st sebastian s college mortuwa td tr tr td mrs sharmila fernando td td sectional head td td holy family convent bambalapitiya td tr tr td mrs rozan td td english teacher td td holy family convent bambalapitiya td tr table project scope 1 access to students parents teachers and admins to use the new application 2 direct messaging between teachers and parents 3 storing student details in the database 4 attendance tracking 5 fee status tracking and reminders 6 parent teacher meeting scheduling 7 homework distribution out of scope 1 online fee payment through gateway 2 timetable for subjects extracurricular activities 3 advanced level stream recommendation 4 school transport gps tracking 5 school calendar view other equivalent systems we refer to a few other systems to identify the process we aim to identify the process to build our system with the higher compatible ability to the market here are a few systems with their functions parentsquare alt text docs images image6 png image tooltip features table tr td ol li sending individual messages li ol td tr tr td ol li sending group 
messages li ol td tr tr td ol li newsletters li ol td tr tr td ol li sending emergency alerts notification li ol td tr tr td ol li students attendance management li ol td tr tr td ol li school directory li ol td tr tr td ol li payments update li ol td tr tr td ol li school calendar li ol td tr table 10 tass web alt text docs images image7 png image tooltip features table tr td ol li student profile attendance assignment li ol td tr tr td ol li update medical information of the student li ol td tr tr td ol li behavioural management li ol td tr tr td ol li the internet payment gateway for pay fees li ol td tr tr td ol li test time tables li ol td tr tr td ol li test results li ol td tr tr td ol li assessment results li ol td tr tr td ol li view teachers comments li ol td tr table 2 product description 14 product perspective there are three major users of the system table tr td strong em student em strong p basic student information p class details p attendance p marks p extracurricular activities td td strong em parent em strong p basic parent information p occupation details td td strong em teacher em strong p basic teacher information p education qualifications p teaching subjects p extracurricular td tr table alt text docs images image8 png image tooltip apart from the users mentioned above there are administrative users who have all the rights to maintain the system 15 user class and characteristics alt text docs images image9 png image tooltip 11 functions features are shown below functions 1 hostel administration future development 2 learning administration 3 education dashboard 4 class marks and registration update 5 disciplinary management etc future development features 1 centralized app for schools 2 maintain system users 3 sending messages popup notifications to parents 4 finalizing the exam marks average ranks 5 informing the parent s meetings calendar reminders 6 downloads alt text docs images image10 png image tooltip 16 tools environment table tr td img src docs images image11 png width alt alt text title image tooltip td td img src docs images image12 png width alt alt text title image tooltip td td img src docs images image13 jpg width alt alt text title image tooltip td td img src docs images image14 jpg width alt alt text title image tooltip td tr tr td img src docs images image15 png width alt alt text title image tooltip td td img src docs images image16 png width alt alt text title image tooltip td td img src docs images image17 png width alt alt text title image tooltip td td td tr table the operating environment for this school application is used are 1 hosting server centos 7 8 2 firewall iptables 3 frontend platform reactjs 4 backend platform nodejs express 5 database mongodb 6 push notifications firebase 17 database distribution drawing https docs google com drawings d 12345 export png 18 client server system 1 distributed with client server side 2 all data resides on the server side 3 the application executed on the client side 19 hardware the system can be used with the following hardware 1 tablet 2 desktop 3 mobile phone 20 software quality attributes 1 availability including all detailed information competition venue time transport facility 2 correctness passing an accurate information 3 maintainability admin will manage the database and the app 4 usability satisfy the school team and parents 3 system design 21 3 tier architecture three tier architecture is used to compose the layers of logical computing as per to our application we used as a specific type of 
server side and client side this supported many benefits for the development environment and modularizing the user s interface data storage of a school as well it was very useful to update a specific new feature to an application independently of other parts with flexible improvement in the overall development process this is used because it can help the application to redevelop and modernize without affecting other functional business data access it contains a presentation layer and data tier and well supportive architecture to speed up the development of an application and to check it s performances and to modularize it was used for end application from the database and selected according to school needs specific parts of the application without affecting other features these were developed as layers it showed in below image alt text docs images image18 png image tooltip 22 agile methodology this methodology is used to develop school application and it is used to determine the enhancement needed on the development methods and terms with five those are as follows requirement analysis the stage where the needed requirements are gathered in some schools and describe the problems and motivation for building the school application planning targeting the end users and deciding the time duration and frequent versions hence feedback is received regularly debugging the errors and fixing the problems on time designing an approach using xp and coding style with simple designs to log and view by delivering the domain knowledge to users tasks are allocated according to the respective developers and keeping the task in the form of a board of to do and in progress testing mapping for the process and testing is created in every iteration and it has been started and an artefact is constructed after the iteration levels the whole system is tested and results demonstrated immediately xp uses tdd techniques to ensure all implemented features are tested release in this phase we review the results then assess the current performance of the application identifying whether the system is completed without any errors and failures checked whether it is supported to the end users when release is planned for retirement maintenance once the feedback is received from schools at each stage the results are discussed with the team and the features are further improved and the factors are decided according to their perspective this model is used to get high quality applications and it is a more adaptable alternative with a unique modification due to the above features and advantages we decided to use an agile methodology to develop our application 4 system development 23 development of the prototype two different prototypes were made the first was made for the interim presentation based on the feedback from the lecturer we decided to develop a new prototype from scratch in the prototype both the frontend and backend was developed within the same project and 3 tire architecture was not fully used the frontend was developed using reactjs and the backend using node js express the frontend was basic and not responsive it behaved more like a static web page rather than a responsive react application based on suggested data model changes and feature changes we moved onto the next version which is described in the next section 24 used technologies algorithm as discussed earlier a 3 tire architecture was applied in development model and controller resided in the backend and the view was made in the front end the backend was built 
in node js with express server socket io was used for developing messaging systems the model was designed as mongoose schemas which can be stored in mongodb the backend and frontend communicate using rest api with http requests the frontend was developed in reactjs and use axios to connect with the backend the ui was developed with material ui bootstrap alt text docs images image19 png image tooltip once a user is logged in a key is sent to the frontend this will be used for all the transactions and along with the username password the system is sending the service key using for the push notifications to the server to map the users for the notifications this is used with the firebase for the notifications alt text docs images image20 png image tooltip sample of a sent token is shown below alt text docs images image21 png image tooltip users of the system are derived as administrators teachers parents students each of them has different credentials of the system administrators have all the authentication alt text docs images image22 png image tooltip administrators can maintain the core details of the organization the option for the privileges is limited alt text docs images image23 png image tooltip once a notification needs to be sent to a specific group an administrator or an authorized teacher and push the notifications to the firebase server and the users will get them instantly a sample notification is shown below alt text docs images image24 png image tooltip system users can also use the internal messaging service to send and receive messages when a user logged in and chat using socket io a port is reserving for s particular message thread system users can be assigned to theses threads by the administrator alt text docs images image25 png image tooltip github was used for version controlling and collaborating among team members 5 system qa testing report testing is combined as manual testing and automated testing for the front end according to our school application i used manual testing for the front end by creating the test cases in excel sheets which allows us to cover all functional nonfunctional scenarios in our system back end was tested using mongodb node js and postman while testing the back end the first time it was successful and again when testing other parts overall back end testing got stuck i restarted and tried to get results but errors continuously occurred therefore that unable to give the results because of time constraints for front end presentation i used the excel sheet to update the process of application whether the application passes or fails with actual and expected results test cases are created along with the development as per our team development we followed the agile methodology in which it can repeat the process in the development stage and from time to time new features are implemented to test the function i used some process procedures which the specific function got pass without any failure while testing front end errors and bugs are identified and reported to our developer then all bugs errors are corrected by developers once bugs are fixed again execute the failing test case to verify whether the specific function is passed alt text docs images image26 png image tooltip qa testing methods process procedures as a qa the testing more importantly concerned to give a good quality of product to the specific school therefore that to cover all functional requirements without any failure in the application i followed some testing process and procedures 
those are as below 1 qa testing methods analyzed image docs images image27 png 2 test design manual testing with excel test cases 3 testing methodology agile testing strategy the diagram below shows the procedures and the process by which the application was tested according to the agile testing strategy image docs images image28 png 4 retest error test strategy errors are tracked in the ui when errors are identified in the application they are reported to the developer who fixes the bugs and hands the build back so that the specific functions can be retested and confirmed to pass image docs images image29 png | school-management backend-server node-js | server |
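To show what calling this backend could look like, here is a hypothetical Python client. The readme only fixes the port (4000) and the route categories (login, class, homework, df, m, rel); the exact paths, payloads, and auth scheme below are assumptions, so check docs/api.yaml for the real contract.

```python
# pip install requests
import requests

BASE = "http://localhost:4000"  # "backend default port 4000" per the readme above

session = requests.Session()

# Hypothetical endpoints: the readme names the route categories but the
# concrete paths live in docs/api.yaml.
resp = session.post(f"{BASE}/login",
                    json={"username": "teacher01", "password": "secret"})
resp.raise_for_status()
token = resp.json().get("token")  # assumes token-based auth; adjust to the real schema

headers = {"Authorization": f"Bearer {token}"}
print(session.get(f"{BASE}/homework", headers=headers).json())
```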
HIL-MT | human in the loop machine translation with large language model mt summit 2023 paper https files sciconf cn upload file 20230827 20230827195133 32318 pdf 1 overview p align center img src image hil png p in this study we propose a human in the loop pipeline that guides llms to produce customized outputs with revision instructions the pipeline initiates by prompting the llm to produce a draft translation followed by the utilization of automatic retrieval or human feedback as supervision signals to enhance the llm s translation through in context learning the humanmachine interactions generated in this pipeline are also stored in an external database to expand the in context retrieval database enabling us to leverage human supervision in an offline setting we evaluate the proposed pipeline using the gpt 3 5 turbo api on five domain specific benchmarks for german english translation the results demonstrate the effectiveness of the pipeline in tailoring in domain translations and improving translation performance compared to direct translation instructions this work was featured in mt summit 2023 2 feedback collection initial translation results are obtained via translation base py get ter based generated feedback via sacrebleu patch sacrebleu py sacrebleu ref m ter ter trace file op json hypo get in context demonstrations by running retrieval py data store path test set path 3 in context refinement translation pipeline stage1 run translation hil py stage2 run compare hil py all results about the experiment are stored in data citation bibtex inproceedings yang etal 2023 hilmt title human in the loop machine translation with large language model author yang xinyi and zhan runzhe and wong derek f and wu junchao and chao lidia s booktitle proceedings of machine translation summit xix vol 2 users track month sep year 2023 address macau sar china publisher machine translation summit url https files sciconf cn upload file 20230827 20230827195133 32318 pdf pages 88 98 | ai |
|
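A schematic of the two-stage loop the readme describes: draft first, then refine in context using retrieved demonstrations or human feedback. The prompt wording and the llm() helper are invented for illustration; the repo's actual logic lives in translation_base.py, retrieval.py, and translation_hil.py.

```python
def llm(prompt: str) -> str:
    """Placeholder for a chat-completion call (the paper uses gpt-3.5-turbo)."""
    raise NotImplementedError

def translate_with_feedback(source: str,
                            demonstrations: list[tuple[str, str, str]]) -> str:
    # Stage 1: draft translation.
    draft = llm(f"Translate this German sentence into English:\n{source}")
    # Stage 2: refine in context; each demonstration pairs a past draft with
    # the feedback it received and the corrected translation.
    shots = "\n\n".join(
        f"Draft: {d}\nFeedback: {f}\nRevised: {r}" for d, f, r in demonstrations
    )
    return llm(f"{shots}\n\nDraft: {draft}\nRevise the draft in the same way:")
```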
front-end-handbook | available now front end developer handbook 2019 https frontendmasters com books front end handbook 2019 front end developer handbook 2016 written by cody lindley http codylindley com sponsored by frontend masters https frontendmasters com this is a guide that anyone could use to learn about the practice of front end development it broadly outlines and discusses the practice of front end engineering how to learn it and what tools are used when practicing it in 2016 it is specifically written with the intention of being a professional resource for potential and currently practicing front end developers to equip themselves with learning materials and development tools secondarily it can be used by managers ctos instructors and head hunters to gain insights into the practice of front end development the content of the handbook favors web technologies html css dom and javascript and those solutions that are directly built on top of these open technologies the materials referenced and discussed in the book are either best in class or the current offering to a problem the book should not be considered a comprehensive outline of all resources available to a front end developer the value of the book is tied up in a terse focused and timely curation of just enough categorical information so as not to overwhelm anyone on any one particular subject matter the intention is to release an update to the content yearly the handbook is divided into three parts part i the front end practice part one broadly describes the practice of front end engineering part ii learning front end development part two identifies self directed and direct resources for learning to become a front end developer part iii front end development tools part three briefly explains and identifies tools of the trade read online at frontendhandbook com http www frontendhandbook com download a pdf epub or mobi file from https www gitbook com book frontendmasters front end handbook details https www gitbook com book frontendmasters front end handbook details a rel license href http creativecommons org licenses by nc nd 3 0 img alt creative commons license style border width 0 src https i creativecommons org l by nc nd 3 0 88x31 png a br this work is licensed under a a rel license href http creativecommons org licenses by nc nd 3 0 creative commons attribution noncommercial noderivs 3 0 unported license a | front_end |
|
H2O | h2o heavy hitter oracle for efficient generative inference of large language models license mit https img shields io badge license mit green svg https opensource org licenses mit code for the paper h2o heavy hitter oracle for efficient generative inference of large language models zhenyu zhang ying sheng tianyi zhou tianlong chen lianmin zheng ruisi cai zhao song yuandong tian christopher r clark barrett zhangyang wang beidi chen overview large language models llms despite their recent impressive accomplishments are notably cost prohibitive to deploy particularly for applications involving long content generation such as dialogue systems and story writing often a large amount of transient state information referred to as the kv cache is stored in gpu memory in addition to model parameters scaling linearly with the sequence length and batch size in this paper we introduce a novel approach for implementing the kv cache which significantly reduces its memory footprint our approach is based on the noteworthy observation that a small portion of tokens contributes most of the value when computing attention scores we call these tokens heavy hitters h2 through a comprehensive investigation we find that i the emergence of h2 is natural and strongly correlates with the frequent co occurrence of tokens in the text and ii removing them results in significant performance degradation based on these insights we propose heavy hitter oracle h2o a kv cache eviction policy that dynamically retains a balance of recent and h2 tokens we formulate the kv cache eviction as a dynamic submodular problem and prove under mild assumptions a theoretical guarantee for our novel eviction algorithm which could help guide future work we validate the accuracy of our algorithm with opt llama and gpt neox across a wide range of tasks our implementation of h2o with 20 heavy hitters improves the throughput over three leading inference systems deepspeed zero inference hugging face accelerate and flexgen by up to 29 29 and 3 on opt 6 7b and opt 30b with the same batch size h2o can reduce the latency by up to 1 9 img src figs h2o jpg align center width 100 hight 100 content we provide two code to implement heavy hitter oracle for efficient generative inference of large language models h2o flexgen h2o flexgen readme md achieving higher throughput for llm generation the code is based on flexgen https github com fminference flexgen h2o hf h2o hf testing the performance on different benchmarks the code is based on hugging face https github com huggingface transformers and we plan to work on the real throughput improvement based on hugging face framework | gpt-3 heavy-hitters high-throughput kv-cache large-language-models sparsity | ai |
Origin | origin project a solution for product counterfeiting using blockchain technology to store the product s informations producer origin etc in a manner that forbids any falsification this project was the winner of the leapfrog hackathon algiers 2018 this is the android part of the origin project here s the devpost link https devpost com software originproject ethereum the ethereum part is https github com faouziamrouche originproject ethereum | android ethereum blockchain qrcode-scanner | server |
CenoOS-IOT | p align center a href http www cenocloud com img width 200 src https raw githubusercontent com cenoos cenoos iot master docs docs img logo png a p h1 align center cenoos iot h1 div align center cenoos is a real time operating system for iot devices from cenocloud open issues https img shields io github issues badges shields svg style flat square https github com cenoos cenoos iot issues closed issues https img shields io github issues closed badges shields svg style flat square https github com cenoos cenoos iot issues discord https img shields io discord 534285557157855232 svg style flat square https discordapp com channels 534285557157855232 534285557157855234 div docuemnt architecture overview p align center img width 600 src https raw githubusercontent com cenoos ceno rtos master docs assets arch2 png p prepare conf makefile conf makefile arch arm32 board ek tm4c123gxl mcu tm4c123gh6pm link file link ld ocd cfg file ek tm4c123gxl cfg board openocd makefile openocd script dir usr local share openocd scripts board makefile base dir users neroyang project ceno rtos openocd bash cd user make openocd connect bash make all bash make flash have a good trial community discord twitch license cenoos is released under the apache 2 0 license let s fund issues in this repository https issuehunt io static embed issuehunt button v1 svg https issuehunt io repos 157975898 cenoos https raw githubusercontent com cenoos ceno rtos master docs assets twitter header photo 1 png | rtos kernel operate-system embedded-systems iot-device openocd | os |
svrx | p align center img width 320 src https svrx io assets images banner png p p align center a href https www npmjs com package svrx img src https img shields io npm v svrx svrx style flat square alt svrx a a href https nodejs org en img src https img shields io node v svrx svrx style flat square alt node a a href https travis ci org svrxjs svrx rel nofollow img src https img shields io travis svrxjs svrx master style flat square logo travis alt build status a a href https codecov io gh svrxjs svrx img src https img shields io codecov c gh svrxjs svrx style flat square logo codecov alt codecov a a href https david dm org svrxjs svrx path packages 2fsvrx img src https img shields io david svrxjs svrx path packages 2fsvrx style flat square alt dependencies a a href https david dm org svrxjs svrx path packages 2fsvrx type dev img src https img shields io david dev svrxjs svrx path packages 2fsvrx style flat square alt devdependencies a a href https gitter im svrxjs svrx utm source badge utm medium badge utm campaign pr badge utm content badge img src https badges gitter im svrxjs svrx svg alt gitter a p english readme zh cn md a pluggable frontend server it just works server x svrx is a platform built for efficient front end development motivation as a front end developer to meet different kind of development requirements usually we will have one or more set of fixed development environment in which may include a local dev server and many other debug tools it s difficult to maintain a development environment you need to install and configure every tool separately besides you may also need to enable or disable a tool when switching among projects to solve the problem we plan to integrate all the development services and tools into a pluggable platform and name it server x svrx with server x you can freely pick and combine any services plugins you want like static serve proxy remote debugging and etc without concerning about plugin installation now server x makes it possible for us to easily customize the development environment for each project and instead of downloading many other packages all you need to do is just install server x features serve a static site or spa in current directory easy to proxy everything auto refresh the page on sources change inline reload on stylesheets change powerful plugins use without installation routing with hot reload never restart your server toolkit for quick custom plugin development https svrx io assets images demo gif here s an example showing how to start a devserver with server x only with a simple command bash svrx p qrcode after code change just save the files to make sure livereload works and here s also a tiny plugin named qrcode to display a qrcode of this page remember you don t need to install any plugins just declare it quick start install bash npm install g svrx cli usage before we start you need to cd into the root of your project first let s say you ve already got an index html in your project bash cd your project ls index html and without any other config just run svrx command to start the dev server bash svrx then visit http localhost 8000 to see the content of index html https svrx io assets demo png command line options you can pass options to change the default behavior through command line bash svrx port 3000 https no livereload check out the full option reference doc here https docs svrx io en guide option html svrxrc js and also you can write down all your options by creating a file named svrxrc js or svrx config js in the root 
path of your project javascript svrxrc js module exports port 3000 https true livereload false and then run svrx command svrx will read your options from the config file automatically feature plugins again you don t need to install any plugins just use it server x will handle everything such as install update for you you can use plugins through command line options eg bash svrx plugin markdown p qrcode p is alias of plugin svrx markdown qrcode set a pluginname to true to start a plugin quickly and also you can enable and config a plugin through plugins in svrxrc js file eg javascript svrxrc js module exports plugins markdown name qrcode options ui false see all plugins https svrx io plugin query svrx plugin write your own plugin if unluckily you didn t find a proper plugin you need you can try write one with our plugin dev tool https github com svrxjs svrx create plugin as a pure plugin platform server x encapsulates a lot of basic logic for you which makes it rather easy to write a new plugin by the way in general you can easily write a plugin with code less than 50 lines just like most of our published plugins so what can we do through the plugins we can inject code script styles into the front end page eg vconsole plugin https github com svrxjs svrx plugin vconsole qrcode plugin https github com svrxjs svrx plugin qrcode intercept the backend requests edit and proxy those data eg mock js plugin https github com svrxjs svrx plugin mock json server plugin https github com svrxjs svrx plugin json server anyway server x provides a powerful ability to inject both frontend and backend logic all you need to do is use it to create your own magic plugins you can read more about plugin development here https docs svrx io en plugin contribution html feature routing you can try the following commands to start server x routing quickly bash touch route js create empty routing file svrx route route js in your route js get blog to json title svrx then open blog you ll see the json output title svrx features of routing support hot reloading check it out by editing your route js now easy writing clear reading support expanding https docs svrx io en guide route html plugin through plugin besides return of json you can also get handle to handle ctx ctx body handle get html to send html haha html get rewrite path to rewrite query path get redirect path to redirect localhost 9002 proxy path get api to proxy http mock server com to learn more about the grammar and usage of routing click here https docs svrx io en guide route html documentation you can read more detail about the usage api reference blogs here https docs svrx io en support feel free to raise an issue https github com svrxjs svrx issues new choose contributing please see the contributing guidelines https docs svrx io en contribution html | svrx | front_end |
cloud-platform-iot-starterkit | https img shields io badge status not 20currently 20maintained red svg longcache true style flat important notice this public repository is read only and no longer maintained for sap iot samples refer to the sap iot samples https github com sap samples sap iot samples repository on github com sap samples starter kit for the sap cloud platform internet of things the documentation and source code aim to provide building blocks and complete end to end examples for how to use the sap cloud platform internet of things supported iot service versions sap cloud platform internet of things for the neo environment as covered in the iot starter kit since may 2015 neo sap cloud platform internet of things for the cloud foundry environment initial coverage in the iot starter kit since june 2017 cf the programs included in this starter kit reference third party open source or other free download components the third party licensors of these components may provide additional license rights terms and conditions and or require certain notices the exact dependencies for each program can be established according to the rules of its programming language and the build system used pom files for java maven import statements for python etc how to obtain support this project is provided as is there is no guarantee that raised issues will be answered or addressed in future releases license copyright c 2015 sap se or an sap affiliate company all rights reserved this file is licensed under the sap sample code license agreement except as noted otherwise in the license license file | sample sample-code sap-cloud-platform iot scp-iot | server |
NL2Code.github.io | large language models meet nl2code a survey https arxiv org pdf 2212 09420 pdf we build a real time updated websit for nl2code powered by llms click here to visit the website https nl2code github io we hope that our work can facilitate the research work of researchers in llms for nl2code furthermore we also hope everyone can contribute to the website by adding up to date nl2code papers citation inproceedings zan2023large title large language models meet nl2code a survey author daoguang zan and bei chen and fengji zhang and dianjie lu and bingchao wu and bei guan and yongji wang and jian guang lou booktitle proceedings of the 61th annual meeting of the association for computational linguistics year 2023 | ai |
|
autoEdit_2 | autoedit 2 autoedit 2 is a fast text based video editing desktop app for mac linux and windows built with node and electron and backbone front end for making video production faster easier and more accessible ready to use release see releases section https github com opennewslabs autoedit 2 releases to download latest packaged version ready to use and view user manual https pietropassarelli gitbooks io autoedit2 user manual content for overview of the app all you need to get started is to decide what speech to text service you want to use and get some credentials to get going check out the user manual for more details https pietropassarelli gitbooks io autoedit2 user manual content setup stt apis html setup bash git clone git github com opennewslabs autoedit 2 git cd autoedit 2 bash npm install usage development npm start compiles the js client side files with browserify and starts electron note you d also need to get speech to text keys or have the gentle open source app running locally check out the user manual for more details https pietropassarelli gitbooks io autoedit2 user manual content setup stt apis html unless you are choosing pocketsphinx as speech to text option background autoedit is a text based video editing software that creates a digital paper editing workflow for more background see this write up on source introducing autoedit https source opennews org articles video editing made better introducing autoedit as well as for more in depth on the underlying workflow see this gitbook how to tell compelling stories out of video interviews https pietropassarelli gitbooks io how to tell compelling stories out of video inter content and especially this section focused on digital paper editing https pietropassarelli gitbooks io how to tell compelling stories out of video inter content digital paper editing autoedit 2 digital paper editing software html overview diagram https github com opennewslabs autoedit 2 raw master assets autoedit overview diagram 1 0 7 png system architecture high level overview of system architecture see architecture overview https autoedit gitbook io documentation overview architecture in developer s documentation https autoedit gitbook io documentation development env how to run the development environment coding style convention ref optional eg which linter to use linting github pre push hook optional node 8 or greater npm 5 or greater speech to text credentials https autoedit gitbook io user manual setup stt apis linting use eslintrc json eslintrc json in your code editor of choice to keep style consistency build how to run build see documentation section for build deployment in os x linux windows https autoedit gitbook io documentation overview deploymentbuild travis ci deployment see documentation section for travis ci continuous build https autoedit gitbook io documentation overview deploymentbuild travis ci continuous build deployment as adobe panel see documentation section for more info https autoedit gitbook io documentation adobe panel autoedit adobe cep panel dev setup tests how to carry out tests at the moment some unit tests are in the spec folder using jasmine and you can run them using npm run test however they still need to be fixed up a bit and perhaps move to jest pr and help on improving test coverage welcome build project page demo see documentation section for building project page https autoedit gitbook io documentation project page build project page building demo front end https autoedit gitbook io documentation project 
page build update demo front end page open source this is an open source project in it s current version it was originally created as part of a open news knight mozilla fellowship https opennews org what fellowships by pietro passarelli http pietropassarelli com with the vox media product team http product voxmedia com you can contribute https github com opennewslabs autoedit 2 and or a href mailto site email subject autoedit 202 20question propose ideas a you have for this project this tool is under development and you may find some bugs in that case we will appreciate if you can fill an issue https github com opennewslabs autoedit 2 issues or a href mailto pietro autoedit io subject hello target top get in touch a if you are curious about whatever happened to autoedit 1 check this out http pietropassarelli com autoedit html contributing feel free to get in touch with any questions via email pietro autoedit io or twitter pietropassarell https twitter com pietropassarell github issues https help github com articles about issues to suggest ideas and report bugs welcome and or fork the project and send me a pull request https help github com articles about pull requests sign up to the mailing list http eepurl com cmzwsx follow on twitter http twitter com autoedit2 and or facebook https www facebook com autoedit io to keep up to date with the latest releases check out the issues section https github com opennewslabs autoedit 2 issues and waffle io dashboard https waffle io opennewslabs autoedit 2 stories in ready https badge waffle io opennewslabs autoedit 2 png label ready title ready https waffle io opennewslabs autoedit 2 contributors list of contributors that have helped shaped autoedit by contributing and or advising on this or previous versions in no particular order andrea baldo https twitter com and baldo dan zajdband https twitter com impronunciable rosario rascuna https twitter com sarhus daniele bottillo https twitter com dbottillo sanette tanaka https twitter com ssktanaka ryan mark https twitter com ryanmark katie o dowd bernhard fasenfest https github com bfasenfest pietro passarelli http github com pietrop active contributors pietro passarelli http github com pietrop support the project sign up to the mailing list http eepurl com cmzwsx follow on twitter http twitter com autoedit2 and or facebook https www facebook com autoedit io to keep up to date with the latest releases say hi at a href mailto pietro autoedit io subject hello target top pietro autoedit io a always curious to hear what autoedit is helping you with autoedit io http www autoedit io it s free and open source free as in free speech as well as in free beer help support the autoedit project to keep it that way https donorbox org c9762eef 0e08 468e 90cb 2d00643697f8 recurring true support will go towards fixing bugs adding features provide support for users etc | video-editing dmg edl watson speech-to-text stt gentle gentle-stt osx electron backbone ibm-watson ibm-watson-speech mac speechmatics video-sequences autoedit transcription desktop backbonejs | front_end |
CHP02-Unity-step-by-step | chp02 unity step by step unity in embedded system design and robotics a step by step guide | os |
|
locomotive-boilerplate | p align center a href https github com locomotivemtl locomotive boilerplate img src https user images githubusercontent com 4596862 54868065 c2aea200 4d5e 11e9 9ce3 e0013c15f48c png height 140 a p h1 align center locomotive boilerplate h1 p align center front end boilerplate for projects by a href https locomotive ca locomotive a p features uses a custom task runner docs development md for handling assets uses browsersync for fast development and testing in browsers uses sass for a feature rich superset of css uses esbuild for extremely fast processing of js es modules uses svg mixer for processing svg files and generating spritesheets uses itcss for a sane and scalable css architecture uses locomotive scroll for smooth scrolling with parallax effects uses a custom grid system docs grid md for layout creation learn more about languages and technologies docs technologies md getting started make sure you have the following installed node at least 17 9 the latest lts is recommended npm at least 8 0 the latest lts is recommended you can use nvm to install and use different versions of node via the command line sh clone the repository git clone https github com locomotivemtl locomotive boilerplate git my new project enter the newly cloned directory cd my new project then replace the original remote repository with your project s repository then update the following files to suit your project readme md readme md the file you are currently reading package json package json package name locomotivemtl boilerplate package title locomotive boilerplate package lock json package lock json package name locomotivemtl boilerplate loconfig json loconfig json browsersync proxy url locomotive boilerplate test remove paths url to use browsersync s built in server which uses paths dest view path views boilerplate template environment js assets scripts utils environment js application name boilerplate site webmanifest www site webmanifest manifest name locomotive boilerplate manifest short name boilerplate html files page title locomotive boilerplate installation sh switch to recommended node version from nvmrc nvm use install dependencies from package json npm install development sh start development server watch for changes and compile assets npm start compile and minify assets npm run build learn more about development and building docs development md documentation development and building docs development md languages and technologies docs technologies md grid system docs grid md browsersync https npmjs com package browser sync esbuild https npmjs com package esbuild itcss https itcss io locomotive scroll https npmjs com package locomotive scroll modularjs https npmjs com package modujs modularload https npmjs com package modularload sass https sass lang com svg mixer https npmjs com package svg mixer node https nodejs org npm https npmjs com nvm https github com nvm sh nvm | front_end |
|
material-properties-interchange | material properties interchange build status https travis ci org mvernacc material properties interchange svg branch master https travis ci org mvernacc material properties interchange codecov https codecov io gh mvernacc material properties interchange branch master graph badge svg https codecov io gh mvernacc material properties interchange do for material properties what step files do for 3d geometry i ve only made a rough demo so far if you would find it useful please leave a comment or message me current status check out tutorials xplane airframes ipynb for a demonstration of how to use the package and why it s useful for engineers so far i ve implemented the following a prototype of a material database record in yaml materials data al 6061 yaml so far this is only a limited subset of properties for a single material the core datastructures logic for representing a material property and doing interpolation for state dependent properties materials property py the core logic for loading material data from a yaml file into a python object materials material py some unit tests for the above project proposal provide for the interchange of information on material properties between cad software fea software and custom analysis scripts i spend a lot of time manually transcribing material properties from matweb or mmpds into solidworks and my own python scripts this process is tedious and error prone i d like to make it better and i imagine other engineers would appreciate this too it would be valuable for a project to have a single verified database of material properties which all of the project s computational tools cad fea analysis scripts reference core components database format for recording properties of a material use a standard syntax yaml interchange programs import from matweb convert to from formats used by major cad fea packages api access material properties from python matlab excel similar products projects granta mi https www grantadesign com products mi and the material data management consortium http www grantadesign com download pdf mdmc datasheet pdf | material analysis-script python engineering material-properties | server |
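The row above mentions a YAML record for Al 6061 and interpolation of state-dependent properties; a minimal sketch of reading such a record directly could look like the following. The file path is reconstructed from the readme's flattened text and the schema keys are invented, so match them to the actual YAML (or use the repo's own loader in materials/material.py) before relying on this.

```python
# pip install pyyaml numpy
import numpy as np
import yaml

with open("materials/data/al_6061.yaml") as f:   # record mentioned in the readme
    mat = yaml.safe_load(f)

# Hypothetical schema: a temperature-dependent property stored as parallel arrays.
prop = mat["properties"]["youngs_modulus"]
temps = np.asarray(prop["temperature"])
values = np.asarray(prop["value"])
print("E at 400 K:", np.interp(400.0, temps, values), prop.get("units", ""))
```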
ABC-Engineering_DB | abc engineering db database system for abc engineering | server |
|
EECE6017C | eece6017c embedded systems design code yay | os |
|
DE-1-csv-to-relational-db | de 1 csv to relational db first project of data engineering moving data from csv files to a relational database postgresql this repo contains various scripts one jupyter notebook per table imported for importing csv file data into relational database tables the datasets directory contains the data to be imported which are random data taken from the internet the create table queries directory contains the create table definitions for each of the csv files one table per file finally the ipynb files contain the scripts to import data from the csv files into their respective tables | server |
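The per-table notebooks are not shown here, but a minimal sketch of one table's import flow might look like the following; the file name, table name, and connection string are hypothetical, and this assumes pandas, SQLAlchemy, and a PostgreSQL driver such as psycopg2 are installed. Appending to an existing table (rather than letting pandas create one) would preserve the schema defined in the create table queries directory.

```python
# Hypothetical single-table import: read one csv, append it to its table.
import pandas as pd
from sqlalchemy import create_engine

# Connection string is a placeholder; adjust credentials and db name.
engine = create_engine("postgresql://user:password@localhost:5432/mydb")

df = pd.read_csv("datasets/customers.csv")  # hypothetical file from datasets/
df.to_sql("customers", engine, if_exists="append", index=False)
```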
|
lingo | lingo package lingo provides the data structures and algorithms required for natural language processing specifically it provides a pos tagger lingo pos a dependency parser lingo dep and a basic tokenizer lingo lexer for english it also provides data structures for holding corpuses lingo corpus and treebanks lingo treebank the aim of this package is to provide a production quality pipeline for natural language processing install the package is go gettable go get u github com chewxy lingo this package and its subpackages depend on very few external packages here they are package used for vitality notes licence gorgonia https github com chewxy gorgonia machine learning vital it won t be hard to rewrite them but why same author gorgonia licence https github com chewxy gorgonia blob master license apache 2 0 like gographviz https github com awalterschulze gographviz visualization of annotations and other graph related visualizations vital for visualizations which are a nice to have feature api last changed 12th april 2017 gographviz licence https github com awalterschulze gographviz blob master license apache 2 0 errors https github com pkg errors errors the package won t die without it but it s a very nice to have stable api for the past year errors licence https github com pkg errors blob master license mit bsd like set https github com xtgo set set operations can be easily replaced stable api for the past year set licence https github com xtgo set blob master license mit bsd like usage see the individual packages for usage there is also a bunch of executables in the cmd directory they re meant to be examples of how a natural language processing pipeline can be set up a natural language pipeline with this package is heavily channels driven here s an example for dependency parsing go func main inputstring the cat sat on the mat lx lexer new dummy strings newreader inputstring lexer required to break a sentence up into words pt pos new pos withmodel posmodel pos tagger required to tag the words with a part of speech tag dp dep new depmodel creates a new parser set up a pipeline pt input lx output dp input pt output run all go lx run go pt run go dp run wait to receive for select case d dp output do something case err dp error handle error how it works for specific tasks pos tagging parsing named entity recognition etc refer to the readme of each subpackage this package on its own mainly provides the data structures that the subpackages will use perhaps the most important data structure is the annotation structure it basically holds a word and the associated metadata for the word for dependency parses the graph takes three forms dependency dependencytree and annotation all three forms are convertible from one to another todo explain rationale behind each data type quirks very oddly specific pos tags and dependency rel types a particular quirk you may have noticed is that the postag and dependencytype are hard coded in as constants this package does in fact provide two variations of each one from stanford penn treebank and one from universaldependencies http universaldependencies org the main reason for hardcoding these is performance knowing ahead of time how much to allocate reduces a lot of additional work the program has to do it also reduces the chances of mutating a global variable of course this comes as a tradeoff programs are limited to these two options thankfully there are only a limited number of pos tag and dependency relation types two of the most popular ones stanford ptb and universal dependencies have been implemented the following build tags are supported stanfordtags universaltags stanfordrel universalrel to use a specific tagset or relset build your program thusly go build tags stanfordtags the default tag and dependency rel types are the universal dependencies version lexer you should also note that the tokenizer lingo lexer is not your usual run of the mill nlp tokenizer it s a tokenizer that tokenizes by space with some specific rules for english it was inspired by rob pike s talk on lexers i thought it d be cool to write something like that for nlp the test cases in package lingo lexer showcase how it handles unicode and other pathological english contributing see contributing md for more info licence this package is licenced under the mit licence | natural-language-processing nlp nlp-library nlp-parsing nlp-dependency-parsing nlp-machine-learning language-model golang go part-of-speech-tagger part-of-speech inflection conll-u | ai |
PostAR | what does your app do mobile app allows the user to log in and out using firebase username and password authentication footer navigation tabs for navigating to a list view that displays all data points a camera view and a settings view the settings view demonstrates gps and gyroscope access and allows for user sign out the camera view displays markers of posts within a certain radius of the user these posts are clickable and will show the full message once clicked backend it sets up the api flow that allows the app to authenticate with the server post location messages to save in the database and retrieve messages in a 200 meter radius it is deployed on heroku uses mongodb for the database and jwt for authentication the landing page is hosted at the heroku deployment who worked on it ernesto l cortez tomas hernandez michael fernandez jessica vega what were you able to complete for this handin mobile app implemented facebook login a form was created to allow the user to input a message to send messages sent using the app post to the database posts within a certain radius are represented by a marker on the camera view not done yet but hopefully before class a message pops up when the user submits a post backend various bug fixes and cleaned up code what are known problems if any with your project mobile app the camera view still stretches beyond the header and footer and is scrollable this is undesired behavior backend jwt is unencrypted raising security concerns how would you improve it if you had more time mobile app for this milestone we would have liked to include picture and video messages it would have been nice to implement a cleaner more simplified design backend give more consideration to security as far as encryption is concerned finish up the landing page with app images and team information pictures links | os |
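The backend code is not included here, but for illustration, a 200 meter radius lookup of this kind is typically done in MongoDB with a geospatial index and a $nearSphere query. The sketch below uses Python/pymongo with hypothetical database, collection, and coordinate values; the actual PostAR backend is Node.js.

```python
# Illustrative only: how a 200 m radius post lookup might work in mongodb.
from pymongo import MongoClient, GEOSPHERE

client = MongoClient("mongodb://localhost:27017")
posts = client.postar.posts  # hypothetical db/collection names

# A 2dsphere index is required for spherical distance queries.
posts.create_index([("location", GEOSPHERE)])

nearby = posts.find({
    "location": {
        "$nearSphere": {
            # GeoJSON points are [longitude, latitude]; sample coordinates.
            "$geometry": {"type": "Point", "coordinates": [-80.37, 25.76]},
            "$maxDistance": 200,  # meters
        }
    }
})
for post in nearby:
    print(post)
```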
|
PrintablesKit | printableskit reverse engineering the printables com graphql schema to write an ios app for it most unofficial and certainly not endorsed by printables status currently only searching and getting details of a print are implemented example see https github com m4p printableskitexample | os |
|
hookify | getting started hookify is a serious opinionated alternative to logify https github com theos logos wiki logify pl like logify hookify outputs a hooked version of each method contained in a header file provided by the classdump utility but with the following differences hookify works with a custom logs method logtool logdatafromnsstring this method allows you to take control over your logged data you can choose either to log your data into a file in the console and or simply change the log format hookify logs data at the beginning of the hooked method and also just before it returns hookify logs classname methodname paramvalue1 paramvalue2 when a hooked method is called and classname methodname return value if any before it returns this format is especially helpful when you need to understand an app workflow example output example output png usage hook an entire directory of dumped headers the option d or dir allows you to hook an entire directory of dumped headers because huge apps can contain more than 20k header files hookify works with pattern matching as follows using a pre made pattern you can use a pre made regex option p or pattern to filter classes responsible for either encryption or networking and more to come using the p crypto or p network options sh python hookify py d mydirpath p network or python hookify py d mydirpath p crypto using a custom regex pattern the r or regex option allows you to filter classes using your own custom pattern sh python hookify py d mydirpath r a z 0 4 hook an entire class using a single header file the f or file option allows you to give a single file path to hookify sh python hookify py f myfilepath getting started 1 preparing the log class and methods in order to get a similar result to the output above you ll have to copy paste the following snippet in your tweak file let s say your tweak xm this is a small interface which allows your tweak to log each call with the right indentation at the beginning and the end of each logified method objective c interface logtool nsobject property int numberoftabs logtool sharedinstance void logdatafromnsstring nsstring logcontent nsstring prepend int count string nsstring toprepend tostring nsstring originalstring end implementation logtool logtool sharedinstance static logtool sharedinstance nil static dispatch once t oncetoken dispatch once oncetoken sharedinstance logtool alloc init do any other initialisation stuff here sharedinstance numberoftabs 1 return sharedinstance nsstring prepend int count string nsstring toprepend tostring nsstring originalstring nsstring newstring nsstring stringwithformat originalstring while count 0 newstring nsstring stringwithformat toprepend newstring count return newstring void logdatafromnsstring nsstring logcontent logtool logtool logtool sharedinstance bool isbegin logcontent containsstring bool isend logcontent containsstring nsstring toprint nsstring stringwithformat logcontent if isbegin logtool numberoftabs logtool numberoftabs 1 toprint logtool prepend logtool numberoftabs string tostring logcontent else if isend toprint logtool prepend logtool numberoftabs string tostring logcontent logtool numberoftabs logtool numberoftabs 1 else toprint logtool prepend logtool numberoftabs string tostring logcontent nslog toprint end 2 generating the hooked methods once it s done you can use hookify to generate hooked methods sh python hookify py path to your header generated with classdump h your tweak xm sample output in your tweak xm objective c void logmediadownloadeventwithrequest id arg1 response id arg2 parameters id arg3 logtool logdatafromnsstring begin scapiclient logmediadownloadeventwithrequest orig logtool logdatafromnsstring end scapiclient logmediadownloadeventwithrequest void observevalueforkeypath id arg1 ofobject id arg2 change id arg3 context void arg4 logtool logdatafromnsstring begin scapiclient observevalueforkeypath orig logtool logdatafromnsstring end scapiclient observevalueforkeypath void enqueuehttprequestoperation id arg1 logtool logdatafromnsstring begin scapiclient enqueuehttprequestoperation orig logtool logdatafromnsstring end scapiclient enqueuehttprequestoperation you re all set now go to xcode devices yourdevice open console and all your logs will appear there other solution output logs into a file sometimes we prefer to output logs into a file here is an example of an interface which outputs logs directly in a file located at tmp filelog txt note this interface will not handle indentation but you can mix both examples for that objective c your tweak xm interface logtool nsobject void logdatafromnsstring nsstring logcontent end implementation logtool void logdatafromnsstring nsstring logcontent nsfilehandle file nsdata data file nsfilehandle filehandleforupdatingatpath tmp filelog txt file seektoendoffile logcontent logcontent stringbyappendingstring n n data logcontent datausingencoding nsutf8stringencoding file writedata data file closefile end | os |
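Hookify's own source is not reproduced here, but the directory-plus-regex options described above amount to filtering class-dump headers by name before hooking them. Below is a rough, hypothetical Python sketch of that filtering step, not hookify's actual implementation; the directory name and preset pattern are assumptions.

```python
# A guess at the kind of filtering -d/-r perform; not hookify's real code.
import os
import re

def matching_headers(directory, pattern):
    """Yield paths of dumped headers whose file name matches the regex."""
    regex = re.compile(pattern)
    for name in os.listdir(directory):
        if name.endswith(".h") and regex.search(name):
            yield os.path.join(directory, name)

# e.g. a "network"-style preset might expand to something like this:
for path in matching_headers("Headers", r"(?i)(network|http|url|socket)"):
    print(path)  # each matching header would then get hooked methods
```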
Subsets and Splits
No saved queries yet
Save your SQL queries to embed, download, and access them later. Queries will appear here once saved.