Dataset columns:
- names: string (length 1–98)
- readmes: string (length 8–608k)
- topics: string (length 0–442)
- labels: string (6 classes)
Data-Engineering
Data Engineering: I learned the data engineering pipeline, that is, ETL (extract, transform, load). I also loaded data both locally and on the cloud using AWS S3 and a Lambda function. Below is the link to my Medium article on data engineering: https://medium.com/@saideshmukh16/data-engineering-etl-262484b3a12d
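A minimal sketch of the kind of ETL flow described above, assuming a local CSV source and an S3 target; the file names, bucket, and transformation step are hypothetical placeholders, not taken from the article:

```python
import pandas as pd
import boto3

# Extract: read raw data from a local CSV (hypothetical file name).
df = pd.read_csv("raw_sales.csv")

# Transform: a simple cleanup step standing in for real business logic.
df = df.dropna(subset=["price"])
df["price"] = df["price"].astype(float)

# Load: write locally, then upload to S3 (hypothetical bucket and key).
df.to_csv("clean_sales.csv", index=False)
s3 = boto3.client("s3")
s3.upload_file("clean_sales.csv", "my-etl-bucket", "clean/clean_sales.csv")
```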
cloud
Savant
Savant: Supercharged Computer Vision and Video Analytics Framework on DeepStream

Savant is an open-source, high-level framework for building real-time, streaming, highly efficient multimedia AI applications on the NVIDIA stack. It helps to develop dynamic, fault-tolerant inference pipelines that utilize the best NVIDIA approaches for data center and edge accelerators.

Savant is built on DeepStream and provides a high-level abstraction layer for building inference pipelines. It is designed to be easy to use, flexible, and scalable. It is a great choice for building smart CV and video analytics applications for cities, retail, manufacturing, and more.

⭐ Star us on GitHub: it motivates us a lot and helps the project become more visible to developers.

Badges: GitHub release (with filter), [build status](https://github.com/insight-platform/savant/actions/workflows/main.yml) (branch develop), [Twitter @savantframework](https://twitter.com/savantframework), the Inside In-Sight Blog (https://b.savant-ai.io/), and [Discord](https://discord.gg/kvafgbszgd).

Savant is a member of the NVIDIA Inception Program.

[Savant Inception Member badge: https://github.com/insight-platform/savant/assets/15047882/12928291-05cc-4639-b43c-6b13d36a01fd]

## Chat With Us

The best way to approach us is [Discord](https://discord.gg/kvafgbszgd). We are always happy to help you with any questions you may have.

## Quick Links

- [Blog](https://b.savant-ai.io/)
- [Getting Started Tutorial](https://docs.savant-ai.io/v0.2.5/getting_started/2_module_devguide.html)
- [Pipeline Samples](https://github.com/insight-platform/savant/tree/develop/samples)
- [Documentation](https://docs.savant-ai.io/)
- [Performance Regression Tracking Dashboard](docs/performance.md)

## Quick Start

The [runtime configuration guide](https://docs.savant-ai.io/getting_started/0_configure_prod_env.html) helps to configure the runtime to run Savant pipelines.

The [demo](https://github.com/insight-platform/savant/tree/develop/samples/peoplenet_detector) shows a pipeline featuring person detection, facial detection, tracking, facial blurring (OpenCV CUDA), and a real-time analytics dashboard:

![PeopleNet demo](samples/peoplenet_detector/assets/peoplenet-blur-demo-loop-400.webp)

```bash
git clone https://github.com/insight-platform/savant.git
cd savant/samples/peoplenet_detector
git lfs pull

# if x86
../../utils/check-environment-compatible && docker compose -f docker-compose.x86.yml up

# if Jetson
../../utils/check-environment-compatible && docker compose -f docker-compose.l4t.yml up

# open rtsp://127.0.0.1:554/stream in your player,
# or visit http://127.0.0.1:888/stream (LL-HLS)

# Ctrl+C to stop running the compose bundle

# to get back to the project root
cd ../..
```

## What Savant Is Not

Savant is not for AI model training; it is for building fast streaming inference applications working on edge and core NVIDIA equipment. We use PyTorch to train our models and recommend sticking with it.

## Who Would Be Interested in Savant?

If your task is to implement high-performance, production-ready computer vision and video analytics applications, Savant is for you. It helps to:

- get the maximum performance on NVIDIA equipment on edge and in the core;
- decrease time-to-market when building dynamic pipelines with DeepStream technology but without low-level programming;
- develop easily maintainable and testable applications with a well-established framework API.

## Runs on NVIDIA Hardware

- NVIDIA Jetson NX/AGX, Orin Nano/NX/AGX
- NVIDIA Turing GPU
- NVIDIA Ampere GPU
- NVIDIA Hopper, hopefully (we did not have a chance to try it yet)

## About NVIDIA DeepStream

[NVIDIA DeepStream](https://developer.nvidia.com/deepstream-sdk) is today's most advanced toolkit for developing high-performance, real-time computer vision AI applications that run magnitudes faster than conventional AI applications executed within runtimes like PyTorch, TensorFlow, and similar.

[NVIDIA DeepStream picture: https://developer.nvidia.com/sites/default/files/akamai/deepstream/metropolis-and-iva-deepstreadm-sdk-block-diagrams-2009801-r1-1.png]

The top-notch performance is achieved by specially designed software using the best NVIDIA accelerator features, including hardware encoding and decoding for video streams and moving the frames through inference blocks solely in GPU RAM, without data transfers into CPU RAM and back. The inference blocks utilize the highly efficient low-level [TensorRT](https://developer.nvidia.com/tensorrt) software stack, optimizing inference operations to get the best of the hardware used.

## Why We Developed Savant

Why do we develop Savant if DeepStream solves the problem? Because DeepStream is a challenging technology to use: it does not define a software architecture, just a bunch of plug-ins for GStreamer, the open-source multimedia framework for building highly efficient streaming applications. This makes developing more or less sophisticated DeepStream applications very painful, because the developer must understand how GStreamer processes the data, making the learning curve steep and almost unreachable for ML engineers focused on model training.

Savant is a very high-level framework on DeepStream, hiding low-level internals from the developer and providing practical tools for quickly implementing real-life streaming AI applications. You implement your inference pipeline as a set of declarative YAML blocks with several user-defined functions in Python (or C/C++ if you would like to utilize most of the CUDA runtime features).

## Features

Savant is packed with several killer features which skyrocket the development of DeepStream applications.

### All You Need for Building Real-Life Applications

Savant supports everything you need for developing advanced pipelines: detection, classification, segmentation, tracking, and custom pre- and post-processing for metadata and images. We have implemented samples demonstrating pipelines you can build with Savant. Visit the [samples](samples) folder to learn more.

### High Performance

Savant is designed to be fast: it works on top of DeepStream, the fastest SDK for video analytics. Even heavyweight segmentation models can run in real time on Savant. See the [Performance Regression Tracking Dashboard](docs/performance.md) for the latest performance results.

### Works on Edge and Data Center Equipment

The framework supports running the pipelines on both NVIDIA's edge devices (the Jetson family) and data center devices (Tesla, Quadro, etc.) with minor or zero changes.

### Cloud-Ready

Savant pipelines run in Docker containers. We provide images for x86+dGPU and Jetson hardware.

### Low Latency and High-Capacity Processing

Savant can be configured to execute a pipeline in real time, skipping data when running out of capacity, or in high-capacity mode, which guarantees the processing of all the data, maximizing the utilization of the available resources.

### Ready-to-Use API

A pipeline is a self-sufficient service communicating with the world via a high-performance streaming API. Whether developers use the provided adapters or the Client SDK, both approaches use the API.

### Advanced Data Protocol

The framework universally uses a common protocol for both video and metadata delivery. The protocol is highly flexible, allowing video-related information alongside arbitrary structures useful for IoT and third-party integrations.

### OpenTelemetry Support

In Savant, you can precisely instrument pipelines with OpenTelemetry, a unified monitoring solution. You can use sampled or complete traces to balance performance and precision. The traces can span from edge to core to business logic through network and storage, because their propagation is supported by the Savant protocol.

### Client SDK

We provide a Python-based SDK to interact with Savant pipelines (ingest and receive data). It enables simple integration with third-party services. The Client SDK is integrated with OpenTelemetry, providing programmatic access to the pipeline traces and logs.

### Development Server

Software development for vanilla DeepStream is a pain. Savant provides a Development Server tool, which enables dynamic reloading of changed code without pipeline restarts. It helps to develop and debug pipelines much faster. Together with the Client SDK, it makes the development of DeepStream-enabled applications really simple. With the Development Server, you can develop remotely on a Jetson device or server right from your IDE.

### Dynamic Sources Management

In Savant, you can dynamically attach and detach sources and sinks to the pipeline without reloading. The framework resiliently handles situations related to source/sink outages.

### Handy Source and Sink Adapters

The communication interface is not limited to the Client SDK: we provide several ready-to-use adapters, which you can use as is or modify for your needs.

The following source adapters are available:

- [Local video file](https://docs.savant-ai.io/savant_101/10_adapters.html#video-file-source-adapter)
- [Local directory of video files](https://docs.savant-ai.io/savant_101/10_adapters.html#video-file-source-adapter)
- [Video URL](https://docs.savant-ai.io/savant_101/10_adapters.html#video-file-source-adapter)
- [Local image file](https://docs.savant-ai.io/savant_101/10_adapters.html#image-file-source-adapter)
- [Local directory of image files](https://docs.savant-ai.io/savant_101/10_adapters.html#image-file-source-adapter)
- [Image URL](https://docs.savant-ai.io/savant_101/10_adapters.html#image-file-source-adapter)
- [RTSP stream](https://docs.savant-ai.io/savant_101/10_adapters.html#rtsp-source-adapter)
- [USB/CSI camera](https://docs.savant-ai.io/savant_101/10_adapters.html#usb-cam-source-adapter)
- [GigE (GenICam) industrial cam](https://docs.savant-ai.io/savant_101/10_adapters.html#gige-vision-source-adapter)
- [Kafka-Redis](https://docs.savant-ai.io/savant_101/10_adapters.html#kafka-redis-source-adapter)
- [Video loop URL](https://docs.savant-ai.io/savant_101/10_adapters.html#video-loop-source-adapter)
- [Multi-stream source](https://docs.savant-ai.io/savant_101/10_adapters.html#multi-stream-source-adapter)

Several sink adapters are implemented:

- [Inference results placed into a JSON file stream](https://docs.savant-ai.io/savant_101/10_adapters.html#json-metadata-sink-adapter)
- [Resulting video overlay displayed on a screen (per source)](https://docs.savant-ai.io/savant_101/10_adapters.html#display-sink-adapter)
- [MP4 file (per source)](https://docs.savant-ai.io/savant_101/10_adapters.html#video-file-sink-adapter)
- [Image directory (per source)](https://docs.savant-ai.io/savant_101/10_adapters.html#image-file-sink-adapter)
- [Always-on RTSP stream sink](https://docs.savant-ai.io/savant_101/10_adapters.html#always-on-rtsp-sink-adapter)
- [Kafka-Redis](https://docs.savant-ai.io/savant_101/10_adapters.html#kafka-redis-sink-adapter)

### Dynamic Parameters Ingestion

Advanced ML pipelines may require information from the external environment for their work. The framework enables dynamic configuration of the pipeline with:

- ingested frame attributes passed in per-frame metadata;
- Etcd attributes watched and instantly applied;
- third-party attributes, which are received through user-defined functions.

### OpenCV CUDA Support

Savant supports custom OpenCV CUDA bindings, enabling operations on DeepStream's in-GPU frames with a broad range of OpenCV CUDA functions. The feature helps in implementing highly efficient video transformations, including but not limited to blurring, cropping, clipping, and applying banners and graphical elements over the frame. The feature is available from Python.

### Rotated Detection Models Support

We frequently deal with models that produce bounding boxes rotated relative to the video frame (oriented bounding boxes). For example, it is often the case with bird's-eye cameras observing the underlying area from a high point. Such cases may require detecting the objects with minimal overlap. To achieve that, special models are used which generate bounding boxes that are not orthogonal to the frame axis. Take a look at [RAPiD](https://vip.bu.edu/projects/vsns/cossy/fisheye/rapid/) to find out more.

[Image: https://user-images.githubusercontent.com/15047882/167245173-aa0a18cd-06c9-4517-8817-253d120c0e07.png]

### Parallelization

Savant supports processing parallelization; it helps to utilize the available resources to the maximum. The parallelization is achieved by running the pipeline stages in separate threads. Although flow-control-related Python code is not parallel, the developer can utilize GIL-releasing mechanisms to achieve the desired parallelization with NumPy, Numba, or custom native code in C or Rust.

## What's Next

- [Getting Started Tutorial](https://docs.savant-ai.io/v0.2.5/getting_started/2_module_devguide.html)
- [Publications and Samples](https://github.com/insight-platform/savant/tree/develop/samples)
- [Documentation](https://docs.savant-ai.io/)

## Contribution

We welcome anyone who wishes to contribute, report, and learn.

## About Us

The In-Sight team is an ML/AI department of Bitworks Software. We develop custom high-performance CV applications for various industries, providing a full-cycle process which includes, but is not limited to, data labeling, model evaluation, training, pruning, quantization, validation and verification, pipeline development, and CI/CD. We are mostly focused on NVIDIA hardware, both data center and edge.

Contact us: info@bw-sw.com
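As a rough illustration of the OpenCV CUDA support described above, a minimal sketch using OpenCV's own CUDA Python bindings (this assumes an OpenCV build compiled with CUDA modules; it shows plain OpenCV CUDA usage such as blurring a frame in GPU memory, not Savant's internal API):

```python
import cv2
import numpy as np

# A stand-in frame; in a pipeline this would already live in GPU memory.
frame = np.random.randint(0, 255, (720, 1280, 3), dtype=np.uint8)

gpu_frame = cv2.cuda_GpuMat()
gpu_frame.upload(frame)  # host -> device copy

# Build a Gaussian filter once and apply it to the GPU-resident frame,
# e.g. to blur a detected face region without leaving GPU memory.
blur = cv2.cuda.createGaussianFilter(cv2.CV_8UC3, cv2.CV_8UC3, (31, 31), 0)
blurred = blur.apply(gpu_frame)

result = blurred.download()  # device -> host copy only when needed
```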
computer-vision deepstream edge-computing inference-engine machine-learning nvidia-deepstream-sdk deep-learning nvidia object-detection video cuda opencv tensorrt instance-segmentation peoplenet yolo yolov5-face yolov8 yolov8-face
ai
cloud_devops_eng
This is the repo for my projects related to the Udacity Cloud DevOps Nanodegree program.

Project 01: Deploy a Static Website on AWS

- S3 bucket URL: https://s3.console.aws.amazon.com/s3/home?region=us-east-2
- Static website URL: https://dn7ewd2ed2yfg.cloudfront.net/index.html

Project 02: Deploy a High-Availability Website on AWS

There are no live assets to refer to for this project.

- Project YAML file: https://github.com/icarlosmendez/cloud_devops_eng/blob/master/project_02_deploy_ha_webapp/ha_webapp_combined.yml
- Infrastructure diagram: https://github.com/icarlosmendez/cloud_devops_eng/blob/master/project_02_deploy_ha_webapp/ha_webapp_diagram.pdf
- Screenshot of the live site when it was running: https://github.com/icarlosmendez/cloud_devops_eng/blob/master/project_02_deploy_ha_webapp/ha_webapp_indexpage.png

This was a bear of a project and required substantial troubleshooting and reading through the docs, along with much time spent in the Student Hub forums as well as on our old friend Stack Overflow. This project was much harder than it needed to be. Had there been a little more detailed guidance from the content, the learning experience could have been just as thorough but with less head-banging.
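A minimal sketch of launching a CloudFormation stack like Project 02's from Python with boto3; the stack name and region are hypothetical, and the template file name follows the repo link above:

```python
import boto3

cf = boto3.client("cloudformation", region_name="us-east-2")

with open("ha_webapp_combined.yml") as f:
    template_body = f.read()

# create_stack validates the template and starts provisioning the
# high-availability web app resources described in it.
cf.create_stack(
    StackName="ha-webapp",  # hypothetical stack name
    TemplateBody=template_body,
    Capabilities=["CAPABILITY_NAMED_IAM"],
)
```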
cloud
Reza-Arzani
Reza Arzani. Information Technology and Services.
server
LLM_Open_server
Easy to deploy your LLM (large language model) server with no public address (GPU machine). [Chinese version](https://github.com/youkpan/LLM_Open_server/blob/main/readme_cn.md)

If you don't want to build a web server, and it is hard to expose your LLM server to the public (such as on vast.ai, autodl.com, etc.), you can have your own AI website link in only two steps:

1. Register your machine:

   `websocket.WebSocketApp("wss://v.stylee.top:8883/ws/ai_server_system/" + your_server_system_id)`

   Your server system name (server ID) can be any string, like "abcdefg". Your server will receive OpenAI-like data (JSON, see https://platform.openai.com/docs/api-reference/chat/create) over the websocket:

   ```json
   {"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "Hello!"}]}
   ```

   Use the websocket to send your result word by word or sentence by sentence, and finally send the message "all finished"; the system will finish the current dialog and be ready for the next.

2. Then open the URL: https://ai.zyinfo.pro/ai/system/<your-server-system>

Have fun!

Note: you can register more machines with no limit and serve the public. We have a plugin service and connect with our knowledge database. Powered by https://ai.zyinfo.pro
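A minimal sketch of the registration flow described above, using the `websocket-client` package. The exact URL layout, message framing, and the "all finished" sentinel are read off the README's flattened text, so treat them as assumptions rather than the project's confirmed protocol:

```python
import json
import websocket  # pip install websocket-client

SERVER_ID = "abcdefg"  # your server system name; any string
URL = f"wss://v.stylee.top:8883/ws/ai_server_system/{SERVER_ID}"  # layout assumed

def on_message(ws, message):
    # The system sends OpenAI-like chat-completion requests as JSON.
    request = json.loads(message)
    prompt = request["messages"][-1]["content"]

    # Run your local LLM here; echoing tokens stands in for real inference.
    for token in ("Hello", " from", " my", " local", " LLM"):
        ws.send(token)  # stream the result word by word

    ws.send("all finished")  # sentinel assumed from the README wording

ws = websocket.WebSocketApp(URL, on_message=on_message)
ws.run_forever()  # reconnect handling omitted for brevity
```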
ai
300Days__MachineLearningDeepLearning
# Journey of 300DaysOfData in Machine Learning and Deep Learning

![MachineLearning](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/ml.jpg)

**Books and Resources | Status of Completion**

1. [Machine Learning From Scratch](https://dafriedman97.github.io/mlbook/content/introduction.html) ✅
2. A Comprehensive Guide to Machine Learning ✅
3. Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow ✅
4. [Speech and Language Processing](https://web.stanford.edu/~jurafsky/slp3/)
5. [Machine Learning Crash Course](https://developers.google.com/machine-learning/crash-course) ✅
6. [Deep Learning with PyTorch: Part I](https://www.manning.com/books/deep-learning-with-pytorch) ✅
7. [Dive into Deep Learning](https://d2l.ai/) ✅
8. [Logistic Regression Documentation](https://ml-cheatsheet.readthedocs.io/en/latest/logistic_regression.html) ✅
9. Deep Learning for Coders with fastai and PyTorch ✅
10. Approaching (Almost) Any Machine Learning Problem
11. [PyImageSearch](https://www.pyimagesearch.com/)

**Research Papers**

1. [Practical Recommendations for Gradient-Based Training of Deep Architectures](https://arxiv.org/pdf/1206.5533.pdf)

**Projects and Notebooks**

1. [California Housing Prices](https://github.com/thinamxx/californiahousing-prices.git)
2. [Logistic Regression from Scratch](https://github.com/thinamxx/machinelearning-algorithms/blob/main/logisticregression/logisticregression.ipynb)
3. [Implementation of LeNet Architecture](https://github.com/thinamxx/machinelearning-algorithms/blob/main/lenetarchitecture/lenetarchitecture.ipynb)
4. [Neural Networks Style Transfer](https://github.com/thinamxx/neural-style-transfer)
5. [Object Recognition on Images: CIFAR10](https://github.com/thinamxx/cifar10-recognition)
6. [Dog Breed Identification: ImageNet](https://github.com/thinamxx/dogbreedclassification)
7. [Sentiment Analysis Dataset Notebook](https://github.com/thinamxx/neuralnetworks-sentimentanalysis/blob/master/pytorch/sentiment%20analysis%20dataset.ipynb)
8. [Sentiment Analysis with RNN](https://github.com/thinamxx/neuralnetworks-sentimentanalysis/blob/master/pytorch/sentiment%20analysis%20rnn.ipynb)
9. [Sentiment Analysis with CNN](https://github.com/thinamxx/neuralnetworks-sentimentanalysis/blob/master/pytorch/sentiment%20analysis%20cnn.ipynb)
10. [Natural Language Inference: Dataset](https://github.com/thinamxx/natural-language-inference/blob/main/naturallanguage%20inference%20data.ipynb)
11. [Natural Language Inference: Attention](https://github.com/thinamxx/natural-language-inference/blob/main/nl%20inference%20attention.ipynb)
12. [Natural Language Inference: BERT](https://github.com/thinamxx/natural-language-inference/blob/main/nl%20inference%20bert.ipynb)
13. [Deep Convolutional GAN](https://github.com/thinamxx/gan/blob/main/deep%20gan.ipynb)
14. [Fastai: Introduction Notebook](https://github.com/thinamxx/fastai/blob/main/1%20introduction.ipynb)
15. [Fastai: Image Detection](https://github.com/thinamxx/fastai/blob/main/2%20model%20production/beardetector.ipynb)
16. [Fastai: Training a Classifier](https://github.com/thinamxx/fastai/blob/main/3%20training%20a%20classifier/digitclassifier.ipynb)
17. [Fastai: Image Classification](https://github.com/thinamxx/fastai/blob/main/4%20image%20classification/imageclassification.ipynb)
18. [Fastai: Multilabel Classification and Regression](https://github.com/thinamxx/fastai/blob/main/5%20multilabelclassification%20regression/multilabelclassification.ipynb)
19. [Fastai: Image Regression](https://github.com/thinamxx/fastai/blob/main/5%20multilabelclassification%20regression/regression.ipynb)
20. [Fastai: Advanced Classification](https://github.com/thinamxx/fastai/blob/main/6%20advanced%20classification/imagenetteclassification.ipynb)
21. [Fastai: Collaborative Filtering](https://github.com/thinamxx/fastai/blob/main/7%20collaborative%20filtering/collaborativefiltering.ipynb)
22. [Fastai: Tabular Modeling](https://github.com/thinamxx/fastai/blob/main/8%20tabular%20modeling/tabularmodel.ipynb)
23. [Fastai: Natural Language Processing](https://github.com/thinamxx/fastai/blob/main/9%20natural%20language%20processing/nlp.ipynb)
24. [Fastai: Data Munging](https://github.com/thinamxx/fastai/blob/main/10%20data%20munging/datamunging.ipynb)
25. [Fastai: Language Model from Scratch](https://github.com/thinamxx/fastai/blob/main/11%20language%20model/languagemodel.ipynb)
26. [Fastai: Convolutional Neural Networks](https://github.com/thinamxx/fastai/blob/main/12%20convolutional%20neural%20networks/cnn.ipynb)
27. [Fastai: Residual Networks](https://github.com/thinamxx/fastai/blob/main/13%20resnets/resnets.ipynb)
28. [Fastai: Architecture Details](https://github.com/thinamxx/fastai/blob/main/14%20architecture%20details/architectures.ipynb)
29. [Fastai: Training Process](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%20259.png)
30. [Fastai: Neural Network Foundations](https://github.com/thinamxx/fastai/blob/main/16%20neural%20network%20foundations/neuralfoundations.ipynb)
31. [Fastai: CNN Interpretation with CAM](https://github.com/thinamxx/fastai/blob/main/17%20cnn%20interpretation/cnn%20interpretation.ipynb)
32. [Fastai: Fastai Learner from Scratch](https://github.com/thinamxx/fastai/blob/main/18%20fastai%20learner/fastai%20learner.ipynb)
33. [Fastai: Chest X-Rays Classification](https://github.com/thinamxx/fastai/blob/main/19%20chest%20xrays%20classification/xrays%20classification.ipynb)
34. [Supervised and Unsupervised Learning](https://github.com/thinamxx/approachinganymachinelearning/blob/main/01%20supervised%20unsupervised%20learning/supervised%20unsupervised.ipynb)
35. [Evaluation Metrics](https://github.com/thinamxx/approachinganymachinelearning/blob/main/02%20evaluation%20metrics/evaluation%20metrics.ipynb)
36. [OpenCV Notebook](https://github.com/thinamxx/computervision/blob/main/01%20opencv/opencv.ipynb)
37. [OpenCV Project I](https://github.com/thinamxx/computervision/blob/main/01%20opencv/ocv%20project%20i.ipynb)
38. [OpenCV Project II](https://github.com/thinamxx/computervision/blob/main/01%20opencv/ocv%20project%20ii.ipynb)
39. [Convolution](https://github.com/thinamxx/computervision/blob/main/02%20convolutionalneuralnetwork/convolutions.ipynb)
40. [Convolutional Layers](https://github.com/thinamxx/computervision/blob/main/02%20convolutionalneuralnetworks/convolutional%20layers.ipynb)
41. [Fastai: Transformers](https://github.com/thinamxx/fastai/blob/main/20%20transformers/transformers.ipynb)

**Day 1 of 300DaysOfData**

Gradient Descent and Cross Validation: Gradient descent is an iterative approach to approximating the parameters that minimize a differentiable loss function. Cross validation is a resampling procedure used to evaluate machine learning models on a limited data sample; it has a parameter that splits the data into a number of groups.

On my journey of machine learning and deep learning, today I read briefly about fundamental topics such as calculus, matrices, matrix calculus, random variables, density functions, distributions, independence, maximum likelihood estimation, and conditional probability. I also read about and implemented gradient descent and cross validation. I am starting this journey from scratch, following the book Machine Learning From Scratch. I have presented the implementation of gradient descent and cross validation here in the snapshots. I hope you will also spend some time reading the topics from the book mentioned above. I am excited about the days to come!

- Book: [Machine Learning From Scratch](https://dafriedman97.github.io/mlbook/content/introduction.html)
- ![Day 1](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%201.png)

**Day 2 of 300DaysOfData**

Ordinary Linear Regression: Linear regression is a linear approach to modelling the relationship between a scalar response (dependent variable) and one or more explanatory (independent) variables.

Today I read about and implemented ordinary linear regression: parameter estimation, minimizing loss, and maximizing likelihood, along with the construction and implementation of linear regression, from the book Machine Learning From Scratch. I have also started reading A Comprehensive Guide to Machine Learning, which focuses on the mathematics and theory behind the topics; from it I read about regression, ordinary least squares, vector calculus, orthogonal projection, ridge regression, feature engineering, fitting ellipses, polynomial features, hyperparameters and validation errors, and cross validation. I have presented the implementation of linear regression along with visualizations using Python here in the snapshots. I hope you will also spend some time reading the topics and books mentioned above. Excited about the days ahead!

- Books: [Machine Learning From Scratch](https://dafriedman97.github.io/mlbook/content/introduction.html); A Comprehensive Guide to Machine Learning
- ![Day 2a](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%202a.png)
- ![Day 2b](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%202b.png)

**Day 3 of 300DaysOfData**

Today I read about and implemented regularized regression, such as ridge regression and lasso regression, Bayesian regression, GLMs, and Poisson regression, along with their construction and implementation, from the book Machine Learning From Scratch. From A Comprehensive Guide to Machine Learning I read about maximum likelihood estimation (MLE) and maximum a posteriori (MAP) estimation for regression, the probabilistic model, the bias-variance tradeoff, metrics, the bias-variance decomposition, an alternative decomposition, multivariate Gaussians, estimating Gaussians from data, weighted least squares, ridge regression, and generalized least squares. I have presented the implementation of ridge regression and lasso regression along with cross validation, Bayesian regression, and Poisson regression using Python here in the snapshot. I hope you will also spend some time reading the topics and books mentioned above. Excited about the days ahead!

- Books: [Machine Learning From Scratch](https://dafriedman97.github.io/mlbook/content/introduction.html); A Comprehensive Guide to Machine Learning
- ![Day 3](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%203.png)
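As a quick illustration of the Day 1 and Day 3 ideas (gradient descent, cross validation, ridge regression), a minimal sketch; the toy data, learning rate, and iteration count are my own choices, not the book's snapshots:

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

# Toy data: y = 3x + 2 plus noise.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = 3 * X[:, 0] + 2 + rng.normal(0, 0.1, size=200)

# Batch gradient descent on the squared loss for (w, b).
w, b, lr = 0.0, 0.0, 0.1
for _ in range(500):
    pred = w * X[:, 0] + b
    grad_w = 2 * np.mean((pred - y) * X[:, 0])
    grad_b = 2 * np.mean(pred - y)
    w, b = w - lr * grad_w, b - lr * grad_b

# 5-fold cross validation of a ridge model on the same data.
scores = cross_val_score(Ridge(alpha=1.0), X, y, cv=5, scoring="r2")
print(w, b, scores.mean())
```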
**Day 4 of 300DaysOfData**

Today I read about and implemented discriminative classifiers, such as binary and multiclass logistic regression, the perceptron algorithm, parameter estimation, and Fisher's linear discriminant with the Fisher criterion, along with their construction and implementation, from the book Machine Learning From Scratch. From A Comprehensive Guide to Machine Learning I read about kernels and ridge regression, linear algebra derivation, computational analysis, sparse least squares, orthogonal matching pursuit, total least squares, the low-rank formulation, dimensionality reduction, principal component analysis, projection, changing coordinates, minimizing reconstruction errors, and probabilistic PCA. I have presented the implementation of binary and multiclass logistic regression, the perceptron algorithm, and Fisher's linear discriminant using Python here in the snapshot. I hope you will also spend some time reading the topics and books mentioned above. Excited about the days ahead!

- Books: [Machine Learning From Scratch](https://dafriedman97.github.io/mlbook/content/introduction.html); A Comprehensive Guide to Machine Learning
- ![Day 4](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%204.png)

**Day 5 of 300DaysOfData**

Today I read about and implemented generative classifiers, such as linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), and naive Bayes, along with parameter estimation and data likelihood, plus their construction and implementation, from the book Machine Learning From Scratch. From A Comprehensive Guide to Machine Learning I read about generative and discriminative classification, the Bayes decision rule, least squares support vector machines, feature extension, neural network extension, binary and multiclass logistic regression, loss functions, training, the multiclass extension, Gaussian discriminant analysis, QDA and LDA classification, and support vector machines. I have presented the implementation of LDA, QDA, and naive Bayes along with visualizations using Python here in the snapshot. I hope you will also spend some time reading the topics and books mentioned above. Excited about the days ahead!

- Books: [Machine Learning From Scratch](https://dafriedman97.github.io/mlbook/content/introduction.html); A Comprehensive Guide to Machine Learning
- ![Day 5](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%205.png)

**Day 6 of 300DaysOfData**

Decision Trees: A decision tree is an interpretable machine learning model for regression and classification. It is a flow-chart-like structure in which each internal node represents a test on an attribute and each branch represents an outcome of the test.

Today I read about decision trees, such as regression trees and classification trees, building trees, making splits and predictions, hyperparameters, pruning, and regularization, along with their construction and implementation, from the book Machine Learning From Scratch. From A Comprehensive Guide to Machine Learning I read about decision tree learning, entropy and information, Gini impurity, stopping criteria, random forests, boosting and AdaBoost, gradient boosting, and k-means clustering. I have presented the implementation of regression trees and classification trees using Python here in the snapshot. I hope you will also spend some time reading the topics and books mentioned above. Excited about the days ahead!

- Books: [Machine Learning From Scratch](https://dafriedman97.github.io/mlbook/content/introduction.html); A Comprehensive Guide to Machine Learning
- ![Day 6](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%206.png)
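A minimal classification-tree sketch in the spirit of Day 6, using scikit-learn; the dataset and depth are my own choices, not the book's snapshot:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Fit a small classification tree and check its test accuracy;
# max_depth acts as the pruning/regularization hyperparameter discussed above.
X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
print(tree.score(X_te, y_te))
```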
**Day 7 of 300DaysOfData**

Tree Ensemble Methods: Ensemble methods combine the outputs of multiple simple models, often called learners, in order to create a final model with low variance. Due to their high variance, decision trees often fail to reach a level of precision comparable to other predictive algorithms, and ensemble methods minimize that variance.

Today I read about and implemented tree ensemble methods, such as bagging for decision trees, bootstrapping, random forests and their procedure, boosting, AdaBoost for binary classification, weighted classification trees, the discrete AdaBoost algorithm, and AdaBoost for regression, along with their construction and implementation, from the book Machine Learning From Scratch. I have presented the implementation of bagging, random forests, and AdaBoost with different base estimators using Python here in the snapshot. I hope you will also spend some time reading the topics and book mentioned above. Excited about the days ahead!

- Book: [Machine Learning From Scratch](https://dafriedman97.github.io/mlbook/content/introduction.html)
- ![Day 7](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%207.png)

**Day 8 of 300DaysOfData**

Today I read about and implemented neural networks from the book Machine Learning From Scratch: model structure, communication between layers, activation functions (ReLU, sigmoid, and the linear activation function), optimization, backpropagation, calculating gradients, the chain rule and observations, and loss functions, along with construction using both the loop approach and the matrix approach. From A Comprehensive Guide to Machine Learning I read about convolutional neural networks and layers, pooling layers, backpropagation for CNNs, ResNet, and the visual understanding of CNNs. Besides that, I watched a couple of videos on neural networks and deep learning. I have presented a simple implementation of neural networks with the Functional API and the Sequential API using TensorFlow here in the snapshot. I hope you will also spend some time reading the topics and books mentioned above. Excited about the days ahead!

- Books: [Machine Learning From Scratch](https://dafriedman97.github.io/mlbook/content/introduction.html); A Comprehensive Guide to Machine Learning
- ![Day 8](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%208.png)

**Day 9 of 300DaysOfData**

Reinforcement Learning: In reinforcement learning, the learning system, called an agent, can observe its environment, select and perform actions, and get rewards in return (or penalties in the form of negative rewards). It must learn by itself what the best policy is to get the most reward over time.

Today I started reading and implementing from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I read briefly about the machine learning landscape, namely types of machine learning systems such as supervised and unsupervised learning, semi-supervised learning, reinforcement learning, batch learning and online learning, and instance-based and model-based learning. I have presented a simple implementation of linear regression and k-nearest neighbors along with a simple plot using Python here in the snapshot. I hope you will also spend some time reading the topics and book mentioned above. Excited about the days ahead!

- Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow
- ![Day 9a](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%209a.png)
- ![Day 9b](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%209b.png)
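A compact sketch comparing the Day 7 ensemble methods with cross validation; the synthetic dataset and default hyperparameters are my own choices:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              RandomForestClassifier)
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, random_state=0)

# Compare bagging, random forests, and AdaBoost with 5-fold cross validation.
for model in (BaggingClassifier(random_state=0),
              RandomForestClassifier(random_state=0),
              AdaBoostClassifier(random_state=0)):
    print(type(model).__name__, cross_val_score(model, X, y, cv=5).mean())
```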
**Day 10 of 300DaysOfData**

Today I read about the main challenges of machine learning, such as an insufficient quantity of training data, non-representative training data, poor-quality data, irrelevant features, overfitting and underfitting the training data, testing and validating, hyperparameter tuning and model selection, and data mismatch, from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have started working on the California Housing Prices dataset, which is included in this book; I will build a model of housing prices in California in this project. I have presented a simple implementation of data processing and a few EDA techniques using Python here in the snapshot. I have also presented the implementation of the Sweetviz library for analysis; I really appreciate Chanin Nantasenamat for sharing this library in one of his videos. I hope you will also spend some time reading the topics and book mentioned above. Excited about the days ahead!

- Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow
- Chanin Nantasenamat's video on Sweetviz: https://www.youtube.com/watch?v=ur_ok8vbpey
- [California Housing Prices](https://github.com/thinamxx/californiahousing-prices.git)
- ![Day 10](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2010.png)
- ![Day 10b](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2010b.png)
- ![Day 10a](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2010a.png)

**Day 11 of 300DaysOfData**

Today I learned about and implemented creating categories from attributes, stratified sampling, visualizing data to gain insights, scatter plots, correlations, the scatter matrix, and attribute combinations, from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have continued working with the California Housing Prices dataset, which is included in this book and based on data from the 1990 California census; I am still building a model of housing prices in California. I have presented the implementation of stratified sampling, correlations using the scatter matrix, and attribute combinations using Python here in the snapshots, along with snapshots of correlations using scatter plots. I hope you will spend some time working on the same and reading the topics and book mentioned above. Excited about the days ahead!

- Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow
- [California Housing Prices](https://github.com/thinamxx/californiahousing-prices.git)
- ![Day 11a](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2011a.png)
- ![Day 11b](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2011b.png)
- ![Day 11c](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2011c.png)
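A minimal sketch of Day 11's income-category stratified split; the column names and bin edges mirror the book's housing example but the data here is synthetic:

```python
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split

# Synthetic frame standing in for the California housing data.
df = pd.DataFrame({"median_income": np.random.lognormal(1.0, 0.4, 1000)})

# Create income categories, then split so both sets share their distribution.
df["income_cat"] = pd.cut(df["median_income"],
                          bins=[0, 1.5, 3, 4.5, 6, np.inf],
                          labels=[1, 2, 3, 4, 5])
train, test = train_test_split(df, test_size=0.2,
                               stratify=df["income_cat"], random_state=42)
print(test["income_cat"].value_counts(normalize=True))
```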
**Day 12 of 300DaysOfData**

Today I learned about and implemented preparing the data for machine learning algorithms: data cleaning, SimpleImputer, OrdinalEncoder, OneHotEncoder, feature scaling, the transformation pipeline, StandardScaler, ColumnTransformer, linear regression, the decision tree regressor, and cross validation, from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have continued working with the California Housing Prices dataset (based on the 1990 California census); the notebook covers almost every topic mentioned above. I have presented the implementation of data preparation, handling missing values, OneHotEncoder, ColumnTransformer, linear regression, and the decision tree regressor along with cross validation using Python here in the snapshots. I hope you will spend some time working on the same and reading the topics and book mentioned above. Excited about the days ahead!

- Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow
- [California Housing Prices](https://github.com/thinamxx/californiahousing-prices.git)
- ![Day 12a](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2012a.png)
- ![Day 12b](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2012b.png)

**Day 13 of 300DaysOfData**

Today I learned about and implemented the random forest regressor, ensemble learning, tuning the model with grid search and randomized search, analyzing the best models and their errors, model evaluation, and cross validation, along with a few more topics related to the same, from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have completed the California Housing Prices project: I built a model using a random forest regressor to predict the price of houses in California. I have presented the implementation of the random forest regressor and tuning the model with grid search and randomized search along with cross validation using Python here in the snapshot. I hope you will spend some time working on the same and reading the topics and book mentioned above. Excited about the days ahead!

- Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow
- [California Housing Prices](https://github.com/thinamxx/californiahousing-prices.git)
- ![Day 13](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2013.png)
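A sketch tying the Day 12 preprocessing pipeline to the Day 13 grid search; the column names are placeholders in the spirit of the housing example, and the fit call is commented out because the frame here is hypothetical:

```python
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import RandomForestRegressor
from sklearn.impute import SimpleImputer
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

# Numeric features: impute medians then scale; categorical: one-hot encode.
preprocess = ColumnTransformer([
    ("num", Pipeline([("impute", SimpleImputer(strategy="median")),
                      ("scale", StandardScaler())]),
     ["median_income", "total_rooms"]),
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["ocean_proximity"]),
])

model = Pipeline([("prep", preprocess),
                  ("forest", RandomForestRegressor(random_state=42))])
search = GridSearchCV(model,
                      {"forest__n_estimators": [50, 100],
                       "forest__max_features": [2, 4]},
                      cv=5, scoring="neg_root_mean_squared_error")
# search.fit(housing_df, housing_labels)  # a frame with the columns above
```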
**Day 14 of 300DaysOfData**

Confusion Matrix: The confusion matrix is a better way to evaluate the performance of a classifier. The general idea is to count the number of times instances of class A are classified as class B. This approach requires a set of predictions so that they can be compared to the actual targets.

Today I read about and implemented classification: training a binary classifier using stochastic gradient descent, measuring accuracy using cross validation, the implementation of cross validation, the confusion matrix, and precision and recall and their curves, along with a few more topics related to the same, from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have presented the implementation of the SGD classifier on the MNIST dataset along with precision and recall using Python here in the snapshots, together with the precision and recall curves. I hope you will spend some time working on the same and reading the topics and book mentioned above. I am excited about the days ahead!

- Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow
- ![Day 14a](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2014a.png)
- ![Day 14b](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2014b.png)
- ![Day 14c](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2014c.png)

**Day 15 of 300DaysOfData**

Today I read about and implemented the ROC curve, the random forest classifier, the SGD classifier, multiclass classification with the one-vs-one and one-vs-all strategies, cross validation, error analysis using the confusion matrix, the KNeighbors classifier, multi-output classification, noise, and the precision-recall tradeoff, along with a few more topics related to the same, from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have completed the classification topic from this book. I have presented the implementation of the ROC curve, the random forest classifier in multiclass classification, the one-vs-one strategy, StandardScaler, error analysis, multilabel classification, and multi-output classification using Scikit-Learn here in the snapshots. I hope you will also work on the same and spend some time reading the topics and book mentioned above. I am excited about the days ahead!

- Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow
- ![Day 15a](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2015a.png)
- ![Day 15b](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2015b.png)

**Day 16 of 300DaysOfData**

Ridge Regression: Ridge regression is a regularized linear regression: a regularization term is added to the cost function, which forces the learning algorithm to not only fit the data but also keep the model weights as small as possible.

Today I read about and implemented training the models: linear regression, the normal equations and computational complexity, the cost function, and gradient descent (batch gradient descent and its convergence rate, stochastic gradient descent, and mini-batch gradient descent), polynomial regression and polynomial features, learning curves, the bias-variance tradeoff, and regularized linear models such as ridge regression, along with a few more topics related to the same, from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have presented the implementation of polynomial regression, learning curves, and ridge regression along with visualizations using Python here in the snapshots. I hope you will spend some time working on the same and reading the topics and book mentioned above. Excited about the days ahead!

- Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow
- ![Day 16](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2016.png)
- ![Day 16b](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2016b.png)
- ![Day 16a](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2016a.png)
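A minimal sketch of the Day 14 and Day 15 evaluation workflow; scikit-learn's bundled digits dataset stands in for MNIST, and the "detect a 5" framing follows the book's binary-classifier example:

```python
from sklearn.datasets import load_digits
from sklearn.linear_model import SGDClassifier
from sklearn.metrics import confusion_matrix, precision_score, recall_score
from sklearn.model_selection import cross_val_predict

# Binary task: is the digit a 5?
X, y = load_digits(return_X_y=True)
y_is_5 = (y == 5)

# Out-of-fold predictions so the metrics reflect unseen data.
clf = SGDClassifier(random_state=42)
y_pred = cross_val_predict(clf, X, y_is_5, cv=3)

print(confusion_matrix(y_is_5, y_pred))
print(precision_score(y_is_5, y_pred), recall_score(y_is_5, y_pred))
```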
**Day 17 of 300DaysOfData**

Elastic Net: Elastic net is a middle ground between ridge regression and lasso regression. The regularization term is a simple mix of both ridge's and lasso's regularization terms: when r equals 0, it is equivalent to ridge regression, and when r equals 1, it is equivalent to lasso regression.

Today I read about and implemented lasso regression, elastic net, early stopping, the SGD regressor, logistic regression, estimating probabilities, the training and cost function, the sigmoid function, decision boundaries, softmax regression (multinomial logistic regression), and cross entropy, along with a few more topics related to the same, from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have just started reading the topic of support vector machines. I have presented a simple implementation of lasso regression, elastic net, early stopping, logistic regression, and softmax regression using Scikit-Learn here in the snapshots. I hope you will spend some time working on the same and reading the topics and book mentioned above. Excited about the days ahead!

- Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow
- ![Day 17a](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2017a.png)
- ![Day 17](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2017.png)

**Day 18 of 300DaysOfData**

Support Vector Machines: A support vector machine (SVM) is a very powerful and versatile machine learning model, capable of performing linear and nonlinear classification, regression, and even outlier detection. SVMs are particularly well suited for the classification of complex but medium-sized datasets.

Today I read about and implemented support vector machines: linear SVM classification, soft margin classification, nonlinear SVM classification, polynomial features and the polynomial kernel, adding similarity features, the Gaussian RBF kernel, computational complexity, and SVM regression (both linear and nonlinear), along with a few more topics related to the same, from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have presented the implementation of nonlinear SVM classification using SVC and LinearSVC along with visualizations using Python here in the snapshots. I hope you will spend some time working on the same and reading the topics and book mentioned above. Excited about the days ahead!

- Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow
- ![Day 18a](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2018a.png)
- ![Day 18b](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2018b.png)
- ![Day 18c](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2018c.png)

**Day 19 of 300DaysOfData**

Voting Classifiers: Voting classifiers aggregate the predictions of different classifiers and predict the class that gets the most votes. The majority-vote classifier is called a hard voting classifier.

Today I read about and implemented ensemble learning and random forests: voting classifiers, such as hard voting and soft voting classifiers, along with a few more topics related to the same. I have also started working on a research project with an amazing team. I have presented the implementation of hard voting and soft voting classifiers using Scikit-Learn here in the snapshots. I hope you will spend some time working on the same and reading the topics mentioned above. Excited about the days ahead!

- Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow
- ![Day 19a](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2019a.png)
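A minimal voting-classifier sketch in the spirit of Day 19; the component models and synthetic dataset are my own choices:

```python
from sklearn.datasets import make_moons
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

X, y = make_moons(n_samples=500, noise=0.3, random_state=42)

# Hard voting: each classifier gets one vote; switch voting="soft" to
# average predicted probabilities instead (SVC needs probability=True).
voting_clf = VotingClassifier(
    estimators=[("lr", LogisticRegression()),
                ("rf", RandomForestClassifier(random_state=42)),
                ("svc", SVC(probability=True, random_state=42))],
    voting="hard")
voting_clf.fit(X, y)
print(voting_clf.score(X, y))
```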
**Day 20 of 300DaysOfData**

The CART Training Algorithm: Scikit-Learn's implementation of decision tree training uses the Classification and Regression Tree (CART) algorithm, also called "growing" trees. Its working principle is splitting the training set into two subsets using a feature and a threshold.

Today I read about and implemented decision functions and predictions, decision trees, the decision tree classifier, making predictions, Gini impurity, white-box versus black-box models, estimating class probabilities, the CART training algorithm, computational complexities, entropy, regularization hyperparameters, the decision tree regressor, the cost function, and instability, from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have presented a simple implementation of the decision tree classifier and the decision tree regressor, along with visualizations, using Python here in the snapshots. I hope you will spend some time working on the same and reading the topics and book mentioned above. Excited about the days ahead!

- Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow
- ![Day 20b](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2020b.png)
- ![Day 20a](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2020a.png)

**Day 21 of 300DaysOfData**

Bagging and Pasting: These approaches use the same training algorithm for every predictor but train each one on a different random subset of the training set. When sampling is performed with replacement, it is called bagging; when sampling is performed without replacement, it is called pasting.

Today I read about and implemented ensemble learning and random forests: voting classifiers, bagging and pasting in Scikit-Learn, out-of-bag evaluation, random patches and random subspaces, random forests, extremely randomized trees (extra-trees) ensembles, feature importance, boosting, AdaBoost, and gradient boosting, along with a few more topics related to the same, from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have presented the implementation of bagging ensembles, decision trees, the random forest classifier, feature importance, the AdaBoost classifier, and gradient boosting using Python here in the snapshots; see the bagging sketch below for the out-of-bag idea. I hope you will spend some time working on the same and reading the topics and book mentioned above. Excited about the days ahead!

- Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow
- ![Day 21aa](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2021aa.png)
- ![Day 21a](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2021a.png)
- ![Day 21b](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2021b.png)
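A minimal bagging sketch for Day 21 with out-of-bag evaluation; the dataset and ensemble size are my own choices:

```python
from sklearn.datasets import make_moons
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = make_moons(n_samples=500, noise=0.3, random_state=42)

# Bagging (sampling with replacement) with out-of-bag evaluation;
# set bootstrap=False to get pasting instead, as described above.
bag = BaggingClassifier(DecisionTreeClassifier(), n_estimators=200,
                        bootstrap=True, oob_score=True, random_state=42)
bag.fit(X, y)
print(bag.oob_score_)  # accuracy estimated from instances each tree never saw
```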
**Day 22 of 300DaysOfData**

Manifold Learning: Manifold learning refers to dimensionality reduction algorithms that work by modeling the manifold on which the training instances lie. It relies on the manifold hypothesis, which holds that most real-world high-dimensional datasets lie close to a much lower-dimensional manifold.

Today I read about and implemented gradient boosting, early stopping, stochastic gradient boosting, extreme gradient boosting (XGBoost), stacking and blending, dimensionality reduction, the curse of dimensionality, and the approaches for dimensionality reduction (projection and manifold learning), along with a few more topics related to the same, from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have presented the implementation of gradient boosting with early stopping along with visualizations using Scikit-Learn here in the snapshot. I hope you will spend some time working on the same and reading the topics and book mentioned above. Excited about the days ahead!

- Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow
- ![Day 22a](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2022a.png)

**Day 23 of 300DaysOfData**

Incremental PCA: Incremental PCA (IPCA) algorithms let you split the training set into mini-batches and feed the algorithm one mini-batch at a time. This is useful for large training sets and also for applying PCA online.

Today I read about and implemented principal component analysis (PCA): preserving the variance, principal components, projecting down the dimensions, the explained variance ratio, choosing the right number of dimensions, PCA for compression and decompression, the reconstruction error, randomized PCA, SVD, and incremental PCA, along with a few more topics related to the same, from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have presented the implementation of PCA, randomized PCA, and incremental PCA along with visualizations using Scikit-Learn here in the snapshot. I hope you will spend some time working on the same and reading the topics and book mentioned above. Excited about the days ahead!

- Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow
- ![Day 23b](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2023b.png)

**Day 24 of 300DaysOfData**

Clustering: Clustering algorithms aim to group similar instances together into clusters. Clustering is a great tool for data analysis, customer segmentation, recommender systems, search engines, image segmentation, dimensionality reduction, and many more tasks.

Today I read about and implemented kernel principal component analysis, selecting a kernel and tuning hyperparameters, pipelines and grid search, locally linear embedding, dimensionality reduction techniques such as multi-dimensional scaling, Isomap, and linear discriminant analysis, and unsupervised learning such as clustering and the k-means clustering algorithm, along with a few more topics related to the same, from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have presented the implementation of kernel PCA with GridSearchCV and the k-means clustering algorithm along with a visualization using Python here in the snapshots. I hope you will spend some time working on the same and reading the topics and book mentioned above. Excited about the days ahead!

- Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow
- ![Day 24a](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2024a.png)
- ![Day 24b](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2024b.png)
- ![Day 24c](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2024c.png)
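A minimal incremental-PCA sketch for Day 23; the matrix shape and batch count are arbitrary:

```python
import numpy as np
from sklearn.decomposition import IncrementalPCA

X = np.random.rand(10_000, 50)

# Feed mini-batches to Incremental PCA so the full matrix never has to
# sit in memory at once.
ipca = IncrementalPCA(n_components=10)
for batch in np.array_split(X, 20):
    ipca.partial_fit(batch)

X_reduced = ipca.transform(X)
print(X_reduced.shape, ipca.explained_variance_ratio_.sum())
```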
**Day 25 of 300DaysOfData**

Image Segmentation: Image segmentation is the task of partitioning an image into multiple segments. In semantic segmentation, all the pixels that are part of the same object type get assigned to the same segment; in instance segmentation, all pixels that are part of the same individual object are assigned to the same segment.

Today I read about and implemented k-means algorithms: centroid initialization, accelerated k-means and mini-batch k-means, finding the optimal number of clusters (the elbow rule and the silhouette coefficient score), the limitations of k-means, and using clustering for image segmentation and for preprocessing such as dimensionality reduction, along with a few more topics related to the same, from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have presented the implementation of clustering algorithms for image segmentation and preprocessing along with visualizations using Python here in the snapshots. I hope you will spend some time working on the same and reading the topics and book mentioned above. Excited about the days ahead!

- Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow
- ![Day 25a](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2025a.png)
- ![Day 25c](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2025c.png)
- ![Day 25b](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2025b.png)

**Day 26 of 300DaysOfData**

Gaussian Mixture Model: A Gaussian mixture model (GMM) is a probabilistic model that assumes the instances were generated from a mixture of several Gaussian distributions whose parameters are unknown. All the instances generated from a single Gaussian distribution form a cluster that typically looks like an ellipsoid.

Today I read about and implemented using clustering algorithms for semi-supervised learning, active learning and uncertainty sampling, DBSCAN, agglomerative clustering, the BIRCH algorithm, mean shift and affinity propagation, spectral clustering, Gaussian mixture models, and the expectation-maximization algorithm, along with a few more topics related to the same, from the book Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow. I have presented the implementation of clustering algorithms for semi-supervised learning and DBSCAN along with visualizations using Python here in the snapshots. I hope you will spend some time working on the same and reading the topics and book mentioned above. Excited about the days ahead!

- Book: Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow
- ![Day 26a](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2026a.png)
- ![Day 26b](https://github.com/thinamxx/300days__machinelearningdeeplearning/blob/main/images/day%2026b.png)
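A minimal color-quantization sketch of Day 25's k-means image segmentation; a random array stands in for a real image:

```python
import numpy as np
from sklearn.cluster import KMeans

# Cluster pixel colors, then repaint each pixel with its cluster center.
image = np.random.rand(64, 64, 3)  # stand-in for a real RGB image
pixels = image.reshape(-1, 3)

kmeans = KMeans(n_clusters=8, n_init=10, random_state=42).fit(pixels)
segmented = kmeans.cluster_centers_[kmeans.labels_].reshape(image.shape)
print(segmented.shape)
```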
**Day27 of 300DaysOfData!**
- **Anomaly Detection**: Anomaly detection, also called outlier detection, is the task of detecting instances that deviate strongly from the norm. These instances are called anomalies or outliers, while the normal instances are called inliers. It is useful in fraud detection and more. On my journey of Machine Learning and Deep Learning, today I have read and implemented about Gaussian Mixture Models, anomaly detection using Gaussian Mixtures, novelty detection, selecting the number of clusters, the Bayesian Information Criterion, the Akaike Information Criterion, the likelihood function, Bayesian Gaussian Mixture Models, Fast-MCD, Isolation Forest, Local Outlier Factor, One-Class SVM, and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have just started Neural Networks and Deep Learning from this book. I have presented the implementation of the Gaussian Mixture Model along with visualizations using Python here in the snapshots. I hope you will spend some time working on the same and reading the topics mentioned above. Excited about the days ahead!
- Book: **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2027a.PNG)

**Day28 of 300DaysOfData!**
- **Rectified Linear Unit Function (ReLU)**: It is continuous but not differentiable at 0, where the slope changes abruptly and makes Gradient Descent bounce around. It works very well and has the advantage of being fast to compute. On my journey of Machine Learning and Deep Learning, today I have read and implemented about the introduction to Artificial Neural Networks with Keras, biological neurons, logical computations with neurons, the Perceptron, Hebbian learning, the Multilayer Perceptron and backpropagation, Gradient Descent, the hyperbolic tangent function and the rectified linear unit function, regression MLPs, classification MLPs, softmax activation, and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have presented the implementation of building an image classifier using the Sequential API along with visualization using Keras here in the snapshots. I hope you will spend some time working on the same and reading the topics mentioned above. I am excited about the days ahead!
- Book: **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2028a.PNG)
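A minimal sketch of the Day28-style Sequential image classifier, following the book's Fashion MNIST recipe (the layer sizes and optimizer are the book's defaults, not tuned values):

```python
# A hedged sketch of an image classifier built with the Keras Sequential API.
from tensorflow import keras

(X_train, y_train), (X_test, y_test) = keras.datasets.fashion_mnist.load_data()
X_train, X_test = X_train / 255.0, X_test / 255.0   # scale pixels to [0, 1]

model = keras.models.Sequential([
    keras.layers.Flatten(input_shape=[28, 28]),     # 28x28 image -> 784 vector
    keras.layers.Dense(300, activation="relu"),
    keras.layers.Dense(100, activation="relu"),
    keras.layers.Dense(10, activation="softmax"),   # 10 class probabilities
])
model.compile(loss="sparse_categorical_crossentropy",
              optimizer="sgd", metrics=["accuracy"])
history = model.fit(X_train, y_train, epochs=5, validation_split=0.1)
```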
**Day29 of 300DaysOfData!**
- On my journey of Machine Learning and Deep Learning, today I have read and implemented about creating the model using the Sequential API, compiling the model, loss functions and activation functions, training and evaluating the model, learning curves, using the model to make predictions, building a regression MLP using the Sequential API, building complex models using the Functional API, deep neural networks, and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have presented the implementation of building a regression MLP using the Sequential API and the Functional API here in the snapshots. I hope you will gain some insights and spend some time working on the same and reading the topics from the book mentioned above. I am excited about the days ahead!
- Book: **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2029a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2029b.PNG)

**Day30 of 300DaysOfData!**
- On my journey of Machine Learning and Deep Learning, today I have read and implemented about building complex models using the Functional API, deep neural network architecture, the ReLU activation function, handling multiple inputs in the model, the mean squared error loss function and the stochastic gradient descent optimizer, handling multiple outputs or an auxiliary output for regularization, and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have presented the implementation of handling multiple inputs using the Keras Functional API, along with the implementation of handling multiple outputs or an auxiliary output for regularization using the same, here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2030aa.PNG)

**Day31 of 300DaysOfData!**
- **Callbacks and Early Stopping**: Early stopping is a method that allows you to specify an arbitrarily large number of training epochs and stop once the model stops improving on the validation dataset. On my journey of Machine Learning and Deep Learning, today I have read and implemented about building dynamic models using the Subclassing API, the Sequential API and the Functional API, saving and restoring the model, using callbacks, model checkpoints, early stopping, weights and biases, and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have presented the implementation of building dynamic models using the Subclassing API, along with the implementation of using callbacks and early stopping, here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2031a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2031b.PNG)
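A minimal sketch of the Day31 callbacks workflow, assuming a compiled `model` and training arrays like the ones in the earlier sketch:

```python
# A hedged sketch of checkpointing and early stopping with Keras callbacks.
from tensorflow import keras

# Save the best model seen so far on the validation set.
checkpoint_cb = keras.callbacks.ModelCheckpoint("my_model.h5",
                                                save_best_only=True)
# Interrupt training when validation loss stops improving for 10 epochs,
# then roll back to the best weights.
early_stopping_cb = keras.callbacks.EarlyStopping(patience=10,
                                                  restore_best_weights=True)

history = model.fit(X_train, y_train, epochs=100, validation_split=0.1,
                    callbacks=[checkpoint_cb, early_stopping_cb])
```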
**Day32 of 300DaysOfData!**
- On my journey of Machine Learning and Deep Learning, today I have read and implemented about visualization using TensorBoard, learning curves, fine-tuning neural network hyperparameters, RandomizedSearchCV, regressors, libraries to optimize hyperparameters such as Hyperopt and Talos, the number of hidden layers, the number of neurons per hidden layer, the learning rate, batch size and other hyperparameters, and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have also spent some time reading the paper named Practical Recommendations for Gradient-Based Training of Deep Architectures. Here I have read about deep learning and greedy layer-wise pretraining, online learning and optimization of generalization error, and a few more topics related to the same. I have presented the implementation of tuning hyperparameters, Keras regressors and RandomizedSearchCV here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**
- Paper: [**Practical Recommendations for Gradient-Based Training of Deep Architectures**](https://arxiv.org/pdf/1206.5533.pdf)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2032a.PNG)

**Day33 of 300DaysOfData!**
- **Vanishing Gradient**: During backpropagation, the gradients often get smaller and smaller as the algorithm progresses down to the lower layers, which prevents training from converging to a good solution. This is the vanishing gradient problem. On my journey of Machine Learning and Deep Learning, today I have read and implemented about training deep neural networks, the vanishing and exploding gradient problems, Glorot and He initialization, non-saturating activation functions, Batch Normalization and its implementation, the logistic and sigmoid activation functions, the SELU activation function, the ReLU activation function and its variants, Leaky ReLU and Parametric Leaky ReLU, and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have presented the implementation of Leaky ReLU and Batch Normalization here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2033a.PNG)
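A minimal sketch combining He initialization, Leaky ReLU, and Batch Normalization as discussed in Day33; the layer sizes are illustrative assumptions:

```python
# A hedged sketch of Leaky ReLU + Batch Normalization in Keras.
from tensorflow import keras

model = keras.models.Sequential([
    keras.layers.Flatten(input_shape=[28, 28]),
    keras.layers.BatchNormalization(),
    keras.layers.Dense(300, kernel_initializer="he_normal"),
    keras.layers.LeakyReLU(alpha=0.2),   # non-saturating activation
    keras.layers.BatchNormalization(),
    keras.layers.Dense(100, kernel_initializer="he_normal"),
    keras.layers.LeakyReLU(alpha=0.2),
    keras.layers.BatchNormalization(),
    keras.layers.Dense(10, activation="softmax"),
])
```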
**Day34 of 300DaysOfData!**
- **Gradient Clipping**: Gradient clipping is a technique to lessen the exploding gradients problem: it simply clips the gradients during backpropagation so that they never exceed some threshold. It is mostly used in recurrent neural networks. On my journey of Machine Learning and Deep Learning, today I have read and implemented about gradient clipping, Batch Normalization, reusing pretrained layers, deep neural networks and transfer learning, unsupervised pretraining, Restricted Boltzmann Machines, pretraining on an auxiliary task, self-supervised learning, faster optimizers, the Gradient Descent optimizer, Momentum optimization, Nesterov Accelerated Gradient, and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have presented a simple implementation of transfer learning using Keras and the Sequential API here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2034a.PNG)

**Day35 of 300DaysOfData!**
- **Adam Optimization**: Adam, which stands for Adaptive Moment Estimation, combines the ideas of momentum optimization and RMSProp: momentum optimization keeps track of an exponentially decaying average of past gradients, and RMSProp keeps track of an exponentially decaying average of past squared gradients. On my journey of Machine Learning and Deep Learning, today I have read and implemented about the AdaGrad algorithm, Gradient Descent, the RMSProp algorithm, Adaptive Moment Estimation or Adam optimization, AdaMax, Nadam optimization, training sparse models, dual averaging, learning rate scheduling, power scheduling, exponential scheduling, piecewise constant scheduling, performance scheduling, and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have presented the implementation of exponential scheduling and piecewise constant scheduling here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2035a.PNG)

**Day36 of 300DaysOfData!**
- **Deep Neural Networks**: The default DNN configuration that will work fine in most cases, without requiring much hyperparameter tuning, is: kernel initializer = LeCun initialization, activation function = SELU, normalization = none, regularization = early stopping, optimizer = Nadam, learning rate schedule = performance scheduling. On my journey of Machine Learning and Deep Learning, today I have read and implemented about avoiding overfitting through regularization, L1 and L2 regularization, dropout regularization, self-normalization, Batch Normalization, Monte Carlo Dropout, max-norm regularization, activation functions like SELU and Leaky ReLU, Nadam optimization, and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have presented the implementation of L2 regularization and dropout regularization using Keras here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2036a.PNG)
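A minimal sketch of the Day36 L2 and dropout regularization in Keras; the rates and layer sizes are illustrative assumptions:

```python
# A hedged sketch of L2 (weight decay) and dropout regularization in Keras.
from tensorflow import keras

model = keras.models.Sequential([
    keras.layers.Flatten(input_shape=[28, 28]),
    keras.layers.Dropout(rate=0.2),                 # randomly drop 20% of inputs
    keras.layers.Dense(300, activation="elu",
                       kernel_initializer="he_normal",
                       kernel_regularizer=keras.regularizers.l2(0.01)),
    keras.layers.Dropout(rate=0.2),
    keras.layers.Dense(10, activation="softmax",
                       kernel_regularizer=keras.regularizers.l2(0.01)),
])
```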
**Day37 of 300DaysOfData!**
- On my journey of Machine Learning and Deep Learning, today I have read and implemented about custom models and training with TensorFlow, high-level deep learning APIs, IO and preprocessing, lower-level deep learning APIs, deployment and optimization, TensorFlow architecture, tensors and operations, the Keras low-level API, tensors and NumPy, sparse tensors, arrays, string tensors, custom loss functions, saving and loading models containing custom components, and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have also started reading the book **Speech and Language Processing**. Here I have read about regular expressions, text normalization, tokenization, lemmatization, stemming, sentence segmentation, edit distance, and a few more topics related to the same. I have presented a simple implementation of a custom loss function here in the snapshot. Excited about the days ahead!
- Books:
  - **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**
  - [**Speech and Language Processing**](https://web.stanford.edu/~jurafsky/slp3/)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2037a.PNG)

**Day38 of 300DaysOfData!**
- On my journey of Machine Learning and Deep Learning, today I have read and implemented about custom activation functions, initializers, regularizers and constraints, custom metrics, MAE and MSE, streaming metrics, custom layers, custom models, and losses and metrics based on model internals, and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have also continued reading the book **Speech and Language Processing**. Here I have read about regular expressions, basic regular expression patterns, disjunction, range, Kleene star, wildcard expressions, grouping and precedence, operator hierarchy, greedy and non-greedy matching, sequences and anchors, counters, and a few more topics related to the same. I have presented the implementation of custom activation functions, initializers, regularizers, constraints and custom metrics here in the snapshots. Excited about the days ahead!
- Books:
  - **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**
  - [**Speech and Language Processing**](https://web.stanford.edu/~jurafsky/slp3/)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2038a.PNG)

**Day39 of 300DaysOfData!**
- **Prefetching and the Data API**: Prefetching is the loading of a resource before it is required, to decrease the time spent waiting for that resource. In other words, while the training algorithm is working on one batch, the dataset will already be working in parallel on getting the next batch ready, which improves performance dramatically. On my journey of Machine Learning and Deep Learning, today I have read and implemented about loading and preprocessing data using TensorFlow, the Data API, chaining transformations, shuffling the dataset, Gradient Descent, interleaving lines from multiple files, parallelism, preprocessing the dataset, decoding, prefetching, multithreading, and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have presented a simple implementation of the Data API using TensorFlow here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2039a.PNG)
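A minimal sketch of a Day39-style tf.data pipeline with shuffling, batching and prefetching; `X_train` and `y_train` are assumed to be NumPy arrays like those in the earlier sketches:

```python
# A hedged sketch of a tf.data input pipeline.
import tensorflow as tf

dataset = tf.data.Dataset.from_tensor_slices((X_train, y_train))
dataset = dataset.shuffle(buffer_size=10_000, seed=42)  # randomize sample order
dataset = dataset.batch(32)
dataset = dataset.prefetch(1)   # prepare the next batch while training on this one

for X_batch, y_batch in dataset.take(2):
    print(X_batch.shape, y_batch.shape)
```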
**Day40 of 300DaysOfData!**
- **Embedding and Representation Learning**: An embedding is a trainable dense vector that represents a category. The better the representation of the categories, the easier it will be for the neural network to make accurate predictions, so embeddings must make useful representations of the categories. This is called representation learning. On my journey of Machine Learning and Deep Learning, today I have read and implemented about the Features API, the column transformer, numerical and categorical features, crossed categorical features, encoding categorical features using one-hot vectors and embeddings, representation learning, word embeddings, using feature columns for parsing, using feature columns in models, and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have presented a simple implementation of the Features API on numerical and categorical columns, along with parsing, here in the snapshots. Excited about the days ahead!
- Book: **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2040a.PNG)

**Day41 of 300DaysOfData!**
- **Convolutional Layer**: The most important building block of a CNN is the convolutional layer. Neurons in the first convolutional layer are not connected to every single pixel in the input image, but only to pixels in their receptive fields. Similarly, each neuron in the second convolutional layer is connected only to neurons located within a small rectangle in the first layer. On my journey of Machine Learning and Deep Learning, today I have read and implemented about deep computer vision using convolutional neural networks, the architecture of the visual cortex, the convolutional layer, zero padding, filters, stacking multiple feature maps, padding, memory requirements, the pooling layer, invariance, convolutional neural network architectures, and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have presented a simple implementation of a convolutional neural network architecture here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2041a.PNG)
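A minimal sketch of a small convolutional network in the spirit of Day41; the 28x28 grayscale input shape and filter counts are illustrative assumptions:

```python
# A hedged sketch of a small CNN in Keras: conv -> pool blocks, then dense head.
from tensorflow import keras

model = keras.models.Sequential([
    keras.layers.Conv2D(64, 7, activation="relu", padding="same",
                        input_shape=[28, 28, 1]),   # large first kernel
    keras.layers.MaxPooling2D(2),                   # downsample by 2
    keras.layers.Conv2D(128, 3, activation="relu", padding="same"),
    keras.layers.MaxPooling2D(2),
    keras.layers.Flatten(),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(10, activation="softmax"),
])
```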
**Day42 of 300DaysOfData!**
- **ResNet Model**: The Residual Network, or ResNet, developed by Kaiming He et al., won the ILSVRC 2015 challenge using an extremely deep CNN composed of 152 layers. This network uses skip connections, also called shortcut connections: the signal feeding into a layer is also added to the output of a layer located a bit higher up the stack. On my journey of Machine Learning and Deep Learning, today I have read and implemented about the LeNet-5 architecture, the AlexNet CNN architecture, data augmentation, local response normalization, the GoogLeNet architecture, the Inception module, VGGNet, Residual Network or ResNet, residual learning, Xception or Extreme Inception, Squeeze-and-Excitation Network or SENet, and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have presented the implementation of a ResNet-34 CNN using Keras here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2042a.PNG)

**Day43 of 300DaysOfData!**
- **Xception Model**: Xception, which stands for Extreme Inception, is a variant of the GoogLeNet architecture proposed in 2016 by François Chollet. It merges the ideas of the GoogLeNet and ResNet architectures, but replaces the Inception modules with a special type of layer called a depthwise separable convolution. On my journey of Machine Learning and Deep Learning, today I have read and implemented about using pretrained models from Keras, GoogLeNet and Residual Network or ResNet, ImageNet, pretrained models for transfer learning, the Xception model, convolutional neural networks, batching, prefetching, global average pooling, and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have presented the implementation of pretrained models such as ResNet and Xception for transfer learning here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2043a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2043b.PNG)
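A minimal sketch of Day43-style transfer learning from a pretrained Xception model; `n_classes` and the learning rate are illustrative placeholders:

```python
# A hedged sketch of transfer learning with a pretrained Xception base.
from tensorflow import keras

n_classes = 5   # illustrative; depends on the target dataset

base_model = keras.applications.Xception(weights="imagenet",
                                         include_top=False)
avg = keras.layers.GlobalAveragePooling2D()(base_model.output)
output = keras.layers.Dense(n_classes, activation="softmax")(avg)
model = keras.Model(inputs=base_model.input, outputs=output)

for layer in base_model.layers:     # freeze the pretrained layers first
    layer.trainable = False

model.compile(loss="sparse_categorical_crossentropy",
              optimizer=keras.optimizers.SGD(learning_rate=0.2),
              metrics=["accuracy"])
```

After the new top layers have converged, the base layers can be unfrozen and training continued with a much lower learning rate.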
**Day44 of 300DaysOfData!**
- **Semantic Segmentation**: In semantic segmentation, each pixel is classified according to the class of the object it belongs to, but different objects of the same class are not distinguished. On my journey of Machine Learning and Deep Learning, today I have read and implemented about classification and localization, crowdsourcing in computer vision, the Intersection over Union metric, object detection, Fully Convolutional Networks or FCNs, valid padding, the You Only Look Once or YOLO architecture, mean Average Precision or mAP, convolutional neural networks, semantic segmentation, and a few more topics related to the same from the book **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**. I have just completed learning from this book. I have presented the implementation of classification and localization along with the visualization here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: **Hands-On Machine Learning with Scikit-Learn, Keras and TensorFlow**

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2044a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2044b.PNG)

**Day45 of 300DaysOfData!**
- **Empirical Risk Minimization**: Training a model means learning good values for all the weights and biases from labeled examples. In supervised learning, a machine learning algorithm builds a model by examining many examples and attempting to find a model that minimizes loss; this is called empirical risk minimization. On my journey of Machine Learning and Deep Learning, today I have started learning from the Machine Learning Crash Course of Google. Here I have learned about machine learning philosophy, the fundamentals of machine learning and its uses, labels and features, labeled and unlabeled examples, models and inference, regression and classification, linear regression, weights and bias, training and loss, empirical risk minimization, mean squared error or MSE, reducing loss, gradient descent, and a few more topics related to the same. I have presented a simple implementation of a basic recurrent neural network here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Course: [**Machine Learning Crash Course**](https://developers.google.com/machine-learning/crash-course)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2045.PNG)

**Day46 of 300DaysOfData!**
- On my journey of Machine Learning and Deep Learning, today I have learned from the Machine Learning Crash Course of Google. Here I have learned and implemented about learning rate or step size, hyperparameters in machine learning algorithms, regression, gradient descent, optimizing the learning rate, stochastic gradient descent or SGD, batch and batch size, minibatch stochastic gradient descent, convergence, the hierarchy of TensorFlow toolkits, and a few more topics related to the same. I have also spent some time reading the book **Speech and Language Processing**. Here I have read about regular expressions and patterns, precision and recall, Kleene star, aliases for common characters, RE operators for counting, and a few more topics related to the same. I have presented a simple implementation of a recurrent neural network and a deep RNN using Keras here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Course: [**Machine Learning Crash Course**](https://developers.google.com/machine-learning/crash-course)
- Book: [**Speech and Language Processing**](https://web.stanford.edu/~jurafsky/slp3/)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2046.PNG)

**Day47 of 300DaysOfData!**
- **Feature Vector and Feature Engineering**: Feature engineering means transforming raw data into a feature vector, the set of floating-point values comprising the examples of the dataset. On my journey of Machine Learning and Deep Learning, today I have learned from the Machine Learning Crash Course of Google. Here I have learned and implemented about generalization of a model, overfitting, gradient descent and loss, statistical and computational learning theories, stationarity of data, splitting of data and the validation set, representation and feature engineering, feature vectors, categorical features and vocabulary, one-hot encoding and sparse representation, qualities of good features, and a few more topics related to the same. I have presented a simple implementation of an RNN along with a GRU here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Course: [**Machine Learning Crash Course**](https://developers.google.com/machine-learning/crash-course)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2047.PNG)
**Day48 of 300DaysOfData!**
- **Scaling Features**: Scaling means converting floating-point feature values from their natural range into a standard range such as 0 to 1. If the feature set contains multiple features, feature scaling helps gradient descent to converge more quickly. On my journey of Machine Learning and Deep Learning, today I have learned from the Machine Learning Crash Course of Google. Here I have learned and implemented about scaling feature values, handling extreme outliers, binning, scrubbing the data, standard deviation, feature crosses and synthetic features, encoding nonlinearity, stochastic gradient descent, cross products, crossing one-hot vectors, regularization for simplicity, the generalization curve, L2 regularization, early stopping, lambda and the learning rate, and a few more topics related to the same. I have presented a simple implementation of a linear regression model using the Sequential API here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Course: [**Machine Learning Crash Course**](https://developers.google.com/machine-learning/crash-course)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2048.PNG)

**Day49 of 300DaysOfData!**
- **Prediction Bias**: Prediction bias is a quantity that measures how far apart the average of predictions is from the average of labels in the dataset. Prediction bias is a completely different quantity than bias. On my journey of Machine Learning and Deep Learning, today I have learned from the Machine Learning Crash Course of Google. Here I have learned and implemented about logistic regression and calculating probability, the sigmoid function, binary classification, log loss and regularization, early stopping, L1 and L2 regularization, classification and thresholding, the confusion matrix, class imbalance and accuracy, precision and recall, the ROC curve, Area Under Curve or AUC, prediction bias, the calibration layer, bucketing, sparsity, feature crosses and one-hot encoding, and a few more topics related to the same. I have presented a simple implementation of normalization and binary classification using Keras here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Course: [**Machine Learning Crash Course**](https://developers.google.com/machine-learning/crash-course)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2049.PNG)

**Day50 of 300DaysOfData!**
- **Categorical Data and Sparse Tensors**: Categorical data refers to input features that represent one or more discrete items from a finite set of choices. Sparse tensors are tensors with very few non-zero elements. On my journey of Machine Learning and Deep Learning, today I have learned from the Machine Learning Crash Course of Google. Here I have learned and implemented about neural networks, hidden layers and activation functions, nonlinear classification and feature crosses, the sigmoid function, the Rectified Linear Unit or ReLU, backpropagation, vanishing and exploding gradients, dropout regularization, multi-class neural networks, softmax, logistic regression, embeddings, collaborative filtering, sparse features, principal component analysis, Word2Vec, and a few more topics related to the same. I have presented a simple implementation of deep neural networks for multi-class classification here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Course: [**Machine Learning Crash Course**](https://developers.google.com/machine-learning/crash-course)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2050.PNG)
**Day51 of 300DaysOfData!**
- **Deep Learning**: Deep learning is a general class of algorithms which falls under artificial intelligence and deals with training mathematical entities named deep neural networks by presenting instructive examples. It uses large amounts of data to approximate complex functions. On my journey of Machine Learning and Deep Learning, today I have started reading and implementing from the book **Deep Learning with PyTorch**. Here I have learned about core PyTorch, the introduction to deep learning and its revolution, tensors and arrays, the deep learning competitive landscape, utility libraries, a pretrained neural network that recognizes the subject of an image, ImageNet, image recognition, AlexNet and ResNet, the torchvision module, and a few more topics related to the same. I have presented the implementation of obtaining pretrained neural networks for image recognition using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Deep Learning with PyTorch**](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2051.PNG)

**Day52 of 300DaysOfData!**
- **The GAN Game**: GAN stands for Generative Adversarial Network, where generative means something being created, adversarial means the two neural networks are competing to outsmart each other, and, well, network means neural networks. A CycleGAN can turn images of one domain into images of another domain without the need for us to explicitly provide matching pairs in the training set. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here I have learned about pretrained models, the Generative Adversarial Network or GAN, ResNet generator and discriminator models, the CycleGAN architecture, the torchvision module, deep fakes, a neural network that turns horses into zebras, and a few more topics related to the same. I have presented the implementation of a CycleGAN that turns horses into zebras using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Deep Learning with PyTorch**](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2052a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2052b.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2052c.PNG)
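A minimal sketch of the Day51 workflow, running a pretrained ResNet-101 from torchvision on one image; `"dog.jpg"` is a placeholder path, and newer torchvision versions prefer the `weights=` argument over `pretrained=True`:

```python
# A hedged sketch of pretrained-network inference with torchvision.
import torch
from torchvision import models, transforms
from PIL import Image

resnet = models.resnet101(pretrained=True)
resnet.eval()   # inference mode: fixes BatchNorm stats, disables dropout

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],   # ImageNet statistics
                         std=[0.229, 0.224, 0.225]),
])

img = Image.open("dog.jpg")               # placeholder input image
batch = preprocess(img).unsqueeze(0)      # add the batch dimension
with torch.no_grad():
    out = resnet(batch)
print(out.argmax(dim=1))                  # index of the predicted ImageNet class
```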
**Day53 of 300DaysOfData!**
- **Tensors and Multidimensional Arrays**: Tensors are the fundamental data structure in PyTorch. A tensor is an array, that is, a data structure which stores a collection of numbers that are accessible individually using an index and that can be indexed with multiple indices. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here I have learned about a pretrained neural network that describes scenes, the NeuralTalk2 model, recurrent neural networks, Torch Hub, the fundamental building block of tensors, the world as floating-point numbers, multidimensional arrays and tensors, lists and indexing tensors, named tensors, einsum, broadcasting, and a few more topics related to the same. I have presented a simple implementation of indexing tensors and named tensors using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Deep Learning with PyTorch**](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2053.PNG)

**Day54 of 300DaysOfData!**
- **Tensors and Multidimensional Arrays**: Tensors are the fundamental data structure in PyTorch. A tensor is an array, that is, a data structure which stores a collection of numbers that are accessible individually using an index and that can be indexed with multiple indices. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here I have learned about named tensors, changing the names of named tensors, broadcasting tensors, unnamed dimensions, tensor element types, specifying the numeric data type, the tensor API, creation operations, indexing, random sampling, serialization, parallelism, tensor storage, referencing storage, indexing into storage, and a few more topics related to the same. I have presented a simple implementation of named tensors, tensor datatype attributes and the tensor API using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Deep Learning with PyTorch**](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2054.PNG)

**Day55 of 300DaysOfData!**
- **Encoding Color Channels**: The most common way to encode colors into numbers is RGB, where a color is defined by three numbers representing the intensity of red, green and blue. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here I have learned about tensor metadata such as size, offset and stride, transposing tensors without copying, transposing in higher dimensions, contiguous tensors, managing a tensor's device attribute such as moving to GPU and CPU, NumPy interoperability, generalized tensors, serializing tensors, data representation using tensors, working with images, adding color channels, changing the layout, and a few more topics related to the same. I have presented the implementation of working with images, such as changing the layout with the permute method, along with contiguous tensors, using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Deep Learning with PyTorch**](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2055.PNG)
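A minimal sketch of the Day55 layout change from H x W x C (the NumPy/PIL convention) to C x H x W (the PyTorch convention) with `permute` and `contiguous`; the random image is an illustrative stand-in:

```python
# A hedged sketch of changing an image tensor's layout without copying data.
import torch

img = torch.randint(0, 256, (64, 64, 3), dtype=torch.uint8)  # H x W x C
chw = img.permute(2, 0, 1)            # C x H x W view; no data is copied
print(chw.shape, chw.is_contiguous())  # torch.Size([3, 64, 64]) False
chw = chw.contiguous()                 # copy into contiguous storage if needed
print(chw.is_contiguous())             # True
```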
**Day56 of 300DaysOfData!**
- **Continuous, Ordinal and Categorical Values**: Continuous values are values which can be counted and measured, and come with units. Ordinal values have a strict ordering, but no fixed relationship between the values. Categorical values are enumerations of possibilities. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here I have learned about normalizing the image data, working with 3D images or volumetric image data, representing tabular data, loading the data tensors using NumPy, continuous values, ordinal values, categorical values, ratio scale and interval scale, nominal scale, one-hot encoding and embeddings, singleton dimensions, and a few more topics related to the same. I have presented the implementation of normalizing the image data, volumetric data, tabular data and one-hot encoding using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Deep Learning with PyTorch**](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2056.PNG)

**Day57 of 300DaysOfData!**
- **Continuous, Ordinal and Categorical Values**: Continuous values are values which can be counted and measured, and come with units. Ordinal values have a strict ordering, but no fixed relationship between the values. Categorical values are enumerations of possibilities. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here I have learned about continuous and categorical data, the PyTorch tensor API, finding thresholds in tabular data, advanced indexing, working with time series data, adding a time dimension to the data, shaping the data by time period, tensors and arrays, and a few more topics related to the same. I have presented the implementation of working with categorical data, time series data and finding thresholds using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Deep Learning with PyTorch**](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2057.PNG)

**Day58 of 300DaysOfData!**
- **Encoding and ASCII**: Every written character is represented by a code, a sequence of bits of appropriate length, so that each character can be uniquely identified; this is called encoding. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here I have learned about working with time series data, ordinal variables, one-hot encoding and concatenation, unsqueeze and singleton dimensions, mean, standard deviation and rescaling variables, text representation, natural language processing and recurrent neural networks, converting text into numbers, the Project Gutenberg corpus, one-hot encoding of characters, encoding and ASCII, embeddings and processing the text, and a few more topics related to the same. I have presented the implementation of time series data and text representation using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Deep Learning with PyTorch**](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2058.PNG)
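A minimal sketch of the Day58 one-hot encoding of characters, assuming plain ASCII text in the style of the book's Project Gutenberg example:

```python
# A hedged sketch of one-hot encoding each character of a line of text.
import torch

line = "Impossible, Mr. Bennet, impossible."   # illustrative sample line
letter_t = torch.zeros(len(line), 128)         # one row per character, 128 ASCII codes

for i, letter in enumerate(line.lower().strip()):
    # Map non-ASCII characters to index 0 so every character fits the table.
    index = ord(letter) if ord(letter) < 128 else 0
    letter_t[i][index] = 1

print(letter_t.shape)   # torch.Size([35, 128])
```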
**Day59 of 300DaysOfData!**
- **Loss Function**: A loss function computes a single numerical value that the learning process will attempt to minimize. The calculation of loss typically involves taking the difference between the desired outputs for some training samples and the outputs the model actually produces for those samples. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here I have learned about one-hot encoding and vectors, data representation using tensors, text embeddings, natural language processing, the mechanics of learning, Johannes Kepler's lesson in modeling, eccentricity, parameter estimation, weights, biases and gradients, a simple linear model, the loss function or cost function, mean square loss, broadcasting, and a few more topics related to the same. I have presented a simple implementation of representing text, the mechanics of learning and a simple linear model using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Deep Learning with PyTorch**](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2059.PNG)

**Day60 of 300DaysOfData!**
- **Gradient Descent**: Gradient descent is a first-order iterative optimization algorithm for finding a local minimum of a differentiable function. Simply put, the gradient is the derivative of the function with respect to each parameter. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here I have learned about the cost function or loss function, optimizing parameters using gradient descent, decreasing the loss function, parameter estimation, the mechanics of learning, the scaling factor and learning rate, evaluation of the model, computing the derivative of the loss function and the linear function, defining the gradient function, partial derivatives and iterating the model, the training loop, and a few more topics related to the same. I have presented the implementation of the loss function, computing derivatives, the gradient function and the training loop here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Deep Learning with PyTorch**](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2060.PNG)
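A minimal sketch of the training loop described above, in the book's style of an autograd-enabled loop for a linear temperature model; the data points follow the book's Celsius/unknown-units example:

```python
# A hedged sketch of a gradient descent training loop with autograd.
import torch

t_c = torch.tensor([0.5, 14.0, 15.0, 28.0, 11.0, 8.0])     # known Celsius values
t_u = torch.tensor([35.7, 55.9, 58.2, 81.9, 56.3, 48.9])   # unknown-unit readings

def model(t_u, w, b):
    return w * t_u + b                     # simple linear model

def loss_fn(t_p, t_c):
    return ((t_p - t_c) ** 2).mean()       # mean squared error

params = torch.tensor([1.0, 0.0], requires_grad=True)
learning_rate = 1e-4

for epoch in range(1, 501):
    if params.grad is not None:
        params.grad.zero_()                # gradients accumulate; zero them first
    loss = loss_fn(model(t_u, *params), t_c)
    loss.backward()                        # autograd computes d(loss)/d(params)
    with torch.no_grad():                  # update outside the autograd graph
        params -= learning_rate * params.grad

print(params, loss.item())
```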
**Day61 of 300DaysOfData!**
- **Hyperparameter Tuning**: Training is about optimizing a model's parameters, whereas hyperparameters are the values that control how the training itself goes; hyperparameters are generally set manually. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here I have learned about gradient descent, optimizing the training loop, overtraining, convergence and divergence, the learning rate, hyperparameter tuning, normalizing the inputs, visualization or plotting the data, argument unpacking, PyTorch's autograd and backpropagation, the chain rule, the linear model, and a few more topics related to the same. I have presented a simple implementation of the training loop and gradient descent along with visualization using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Deep Learning with PyTorch**](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2061.PNG)

**Day62 of 300DaysOfData!**
- On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here I have learned about gradient descent, PyTorch's autograd and backpropagation, the chain rule and tensors, the grad attribute and parameters, a simple linear function and a simple loss function, accumulating grad functions, zeroing the gradients, an autograd-enabled training loop, optimizers and vanilla gradient descent, the optim submodule of torch, and a few more topics related to the same. I have presented a simple implementation of a linear model and loss function, and an autograd-enabled training loop, using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Deep Learning with PyTorch**](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2062.PNG)

**Day63 of 300DaysOfData!**
- **Stochastic Gradient Descent**: The name Stochastic Gradient Descent, or SGD, comes from the fact that the gradient is typically obtained by averaging over a random subset of all input samples. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here I have learned about optimizers, vanilla gradient descent optimization, stochastic gradient descent, the momentum argument, minibatches, the learning rate and params, the optim module, neural network models, the Adam optimizer, backpropagation, optimizing weights, training, validation and overfitting, evaluating the training loss, generalizing to the validation set, overfitting and penalization terms, and a few more topics related to the same. I have presented the implementation of the SGD and Adam optimizers along with the training loop here in the snapshots; it is the continuation of the previous snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Deep Learning with PyTorch**](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2063.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2063a.PNG)
**Day64 of 300DaysOfData!**
- **Activation Functions**: Activation functions are nonlinear, which allows the overall network to approximate more complex functions, and they are differentiable, so that gradients can be computed through them. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here I am learning to use a neural network to fit the data: artificial neurons, the learning process and loss function, nonlinear activation functions, weights and biases, composing a multilayer network, understanding the error function, capping and compressing the output range, Tanh and ReLU activations, choosing the activation functions, the PyTorch nn module, and a few more topics related to the same. I have presented a simple implementation of a linear model and training loop using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Deep Learning with PyTorch**](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2064.PNG)

**Day65 of 300DaysOfData!**
- **Activation Functions**: Activation functions are nonlinear, which allows the overall network to approximate more complex functions, and they are differentiable, so that gradients can be computed through them. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here I have learned about the PyTorch nn module, a simple linear model, batching input data, optimizing batches, the mean square error loss function, the training loop, neural networks, the Sequential model, the Tanh activation function, inspecting parameters, weights and biases, the OrderedDict module, comparing to the linear model, overfitting, and a few more topics related to the same. I have presented a simple implementation of a Sequential model and the OrderedDict submodule using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Deep Learning with PyTorch**](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2065.PNG)

**Day66 of 300DaysOfData!**
- **Computer Vision**: Computer vision is an interdisciplinary scientific field that deals with how computers can gain high-level understanding from digital images or videos. It seeks to understand and automate tasks that the human visual system can do. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here I have started the new topic, learning from images. I have learned about simple image recognition, CIFAR-10, which is a dataset of tiny images, the torchvision module, the Dataset class, iterable datasets, the Python Imaging Library or PIL package, dataset transforms, arrays and tensors, the permute function, and a few more topics related to the same. I have presented a simple implementation of the torchvision module along with the CIFAR-10 dataset using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Deep Learning with PyTorch**](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2066.PNG)
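A minimal sketch of the Day66 CIFAR-10 loading with torchvision; the per-channel normalization statistics are the values the book computes over the training set:

```python
# A hedged sketch of loading CIFAR-10 as normalized tensors with torchvision.
from torchvision import datasets, transforms

transform = transforms.Compose([
    transforms.ToTensor(),                          # PIL image -> C x H x W float tensor
    transforms.Normalize((0.4915, 0.4823, 0.4468),  # per-channel mean
                         (0.2470, 0.2435, 0.2616)), # per-channel std
])
cifar10 = datasets.CIFAR10(root="data/", train=True, download=True,
                           transform=transform)

img_t, label = cifar10[0]
print(img_t.shape, label)   # torch.Size([3, 32, 32]) and a class index
```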
**Day67 of 300DaysOfData!**
- **Computer Vision**: Computer vision is an interdisciplinary scientific field that deals with how computers can gain high-level understanding from digital images or videos. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here I have learned about the permutation function, normalizing the data, stacking, mean and standard deviation, the torchvision module and submodules, the CIFAR-10 dataset, the PIL package, image recognition, building the dataset, building a fully connected neural network model, the Sequential model, a simple linear model, classification and regression problems, one-hot encoding and softmax, and a few more topics related to the same. I have presented the implementation of normalizing the data, building the dataset and a neural network model using torchvision modules here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Deep Learning with PyTorch**](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2067.PNG)

**Day68 of 300DaysOfData!**
- **Softmax Function**: The softmax function takes a vector of values and produces another vector of the same dimension, where the values satisfy the constraints needed to represent probabilities. Softmax is a monotone function, in that lower values in the input correspond to lower values in the output. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here I have learned about representing the output as probabilities and the softmax function, PyTorch's nn module, backpropagation, a loss for classification, MSE loss, negative log likelihood or NLL loss, the log softmax function, training the classifier, stochastic gradient descent, hyperparameters, minibatches, and a few more topics related to the same. I have presented the implementation of the softmax function, building a neural network model and the training loop using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Deep Learning with PyTorch**](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2068.PNG)

**Day69 of 300DaysOfData!**
- **Cross Entropy Loss**: Cross entropy loss is the negative log likelihood of the predicted distribution under the target distribution as an outcome. The combination of the log softmax function and the NLL loss function is equivalent to using the cross entropy loss. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here I have learned about gradient descent, minibatches and the DataLoader, stochastic gradient descent, neural network models, the log softmax function, the NLL loss function, the cross entropy loss function, trainable parameters, weights and biases, translation invariance, data augmentation, the torchvision and nn modules, and a few more topics related to the same. I have presented the implementation of building a deep neural network, the training loop and model evaluation using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Deep Learning with PyTorch**](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2069.PNG)
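A minimal sketch verifying the Day69 point that log softmax plus NLL loss matches cross entropy loss on raw logits; the logits and targets are illustrative random values:

```python
# A hedged sketch: LogSoftmax + NLLLoss is equivalent to CrossEntropyLoss.
import torch
import torch.nn as nn

logits = torch.randn(4, 10)             # batch of 4 samples, 10 classes
targets = torch.tensor([1, 0, 7, 3])    # true class indices

nll = nn.NLLLoss()(nn.LogSoftmax(dim=1)(logits), targets)
ce = nn.CrossEntropyLoss()(logits, targets)

print(torch.isclose(nll, ce))           # tensor(True)
```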
**Day70 of 300DaysOfData!**
- **Translation Invariance**: Translation invariance makes the convolutional neural network invariant to translation, which means that if we translate the inputs, the CNN will still be able to detect the class to which the input belongs. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here I have started reading the topic of using convolutions to generalize. I have learned about convolutional neural networks, translation invariance, weights and biases, discrete cross-correlations, locality or local operations on neighborhood data, model parameters, multichannel images, padding the boundary, kernel size, detecting features with convolutions, and a few more topics related to the same. I have presented a simple implementation of a CNN and building the data using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Deep Learning with PyTorch**](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2070.PNG)

**Day71 of 300DaysOfData!**
- **Downsampling**: Downsampling is the scaling of an image by half, which is equivalent to taking four neighboring pixels as input and producing one pixel as output. The downsampling principle can be implemented in different ways. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here I have learned about kernel size, padding the image, an edge detection kernel, locality and translation invariance, the learning rate and weight updates, the max pooling layer and downsampling, stride, convolutional neural networks, the receptive field, the Tanh activation function, a simple linear model, the Sequential model, the parameters of the model, and a few more topics related to the same. I have presented the implementation of a convolutional neural network, plotting the image and inspecting the parameters of the model using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Deep Learning with PyTorch**](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2071.PNG)

**Day72 of 300DaysOfData!**
- **Downsampling**: Downsampling is the scaling of an image by half, which is equivalent to taking four neighboring pixels as input and producing one pixel as output. The downsampling principle can be implemented in different ways. On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here I have learned about sub-classing the nn module, the Sequential or modular API, the forward function, the linear model, the max pooling layer, padding the data, convolutional neural network architecture, ResNet, kernel size and attributes, the Tanh activation function, model parameters, the functional API, stateless modules, and a few more topics related to the same. I have presented the implementation of sub-classing the nn module using the Sequential API and the functional API using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!
- Book: [**Deep Learning with PyTorch**](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2072.PNG)
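A minimal sketch of sub-classing `nn.Module` with the functional API inside `forward`, in the spirit of the Day72 network; the layer sizes assume 32x32 RGB inputs like CIFAR-10:

```python
# A hedged sketch of an nn.Module subclass using the stateless functional API.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 16, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(16, 8, kernel_size=3, padding=1)
        self.fc1 = nn.Linear(8 * 8 * 8, 32)   # 8 channels of 8x8 after two pools
        self.fc2 = nn.Linear(32, 2)

    def forward(self, x):
        out = F.max_pool2d(torch.tanh(self.conv1(x)), 2)  # 32x32 -> 16x16
        out = F.max_pool2d(torch.tanh(self.conv2(out)), 2)  # 16x16 -> 8x8
        out = out.view(-1, 8 * 8 * 8)          # flatten for the linear head
        out = torch.tanh(self.fc1(out))
        return self.fc2(out)

net = Net()
print(net(torch.randn(1, 3, 32, 32)).shape)   # torch.Size([1, 2])
```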
**Day75 of 300DaysOfData!**
- **L2 Regularization**: L2 regularization is the sum of the squares of all the weights in the model, whereas L1 regularization is the sum of the absolute values of all the weights in the model. L2 regularization is also referred to as weight decay.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here, I have learned about convolutional neural networks, L2 regularization and L1 regularization, optimization and generalization, weight decay, the PyTorch nn module and submodules, the stochastic gradient descent optimizer, overfitting and dropout, deep neural networks, randomization, and a few more topics related to the same. I have presented the implementation of L2 regularization and a dropout layer using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2075.PNG)

**Day76 of 300DaysOfData!**
- **L2 Regularization**: L2 regularization is the sum of the squares of all the weights in the model, whereas L1 regularization is the sum of the absolute values of all the weights in the model. L2 regularization is also referred to as weight decay.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here, I have learned about the dropout module, batch normalization and nonlinear activation functions, regularization and principled augmentation, convolutional neural networks, minibatches and standard deviation, deep neural networks and depth modules, the skip connections mechanism, the ReLU activation function, implementation with the functional API, and a few more topics related to the same. I have presented the implementation of batch normalization and of deep neural networks with a depth module using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2076.PNG)

**Day77 of 300DaysOfData!**
- **Identity Mapping**: When the output of the first activations is used as the input of the last, in addition to the standard feed-forward path, it is called identity mapping. Identity mappings alleviate the problem of vanishing gradients.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here, I have learned about convolutional neural networks, skip connections, the ResNet architecture, simple linear layers, the max pooling layer, identity mapping, highway networks, the U-Net model, dense networks and very deep neural networks, the sequential and functional APIs, forward propagation and backpropagation, the torchvision module and submodules, the batch normalization layer, custom initializations, and a few more topics related to the same. I have presented the implementation of the ResNet architecture and very deep neural networks using PyTorch here in the snapshots (see the residual-block sketch after this entry). I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2077a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2077b.PNG)
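As a small illustration of the skip-connection and identity-mapping idea from Day77, here is a single residual block. The channel count and the placement of batch normalization are illustrative assumptions, not the book's exact block.

```python
import torch
import torch.nn as nn

class ResBlock(nn.Module):
    """A residual block: the input skips over the conv layer (identity mapping)."""
    def __init__(self, n_channels):
        super().__init__()
        self.conv = nn.Conv2d(n_channels, n_channels, kernel_size=3, padding=1)
        self.batch_norm = nn.BatchNorm2d(n_channels)

    def forward(self, x):
        out = torch.relu(self.batch_norm(self.conv(x)))
        return out + x  # the skip connection helps gradients flow through depth

block = ResBlock(16)
print(block(torch.randn(1, 16, 32, 32)).shape)  # torch.Size([1, 16, 32, 32])
```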
**Day78 of 300DaysOfData!**
- **Voxel**: A voxel is the 3D equivalent of the familiar 2D pixel; it encloses a volume of space rather than an area.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here, I have learned about the CT scan dataset, voxels, segmentation, grouping and classification, nodules, 3D convolutional neural networks, downloading the LUNA dataset, data loading, parsing the data, training and validation sets, and a few more topics related to the same. From here, I have started working with the LUNA dataset, which stands for LUng Nodule Analysis 2016. The LUNA Grand Challenge is the combination of an open dataset with high-quality labels of patient CT scans, many with lung nodules, and a public ranking of classifiers against the data. I have presented the implementation of preparing the data using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2078.PNG)

**Day79 of 300DaysOfData!**
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here, I have learned about data loading and parsing the data, the CT scan dataset, the data pipeline, and a few more topics related to the same. Besides, I have also learned about autoencoders, recurrent neural networks and long short-term memory (LSTM), data processing, one-hot encoding, random splitting of training and validation datasets, and a few more. I have continued working with the LUNA dataset, which stands for LUng Nodule Analysis 2016. The LUNA Grand Challenge is the combination of an open dataset with high-quality labels of patient CT scans, many with lung nodules, and a public ranking of classifiers against the data. I have presented a simple implementation of data preparation using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2079.PNG)

**Day80 of 300DaysOfData!**
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here, I have learned about loading individual CT scans from the dataset, 3D nodule density data, the SimpleITK library, Hounsfield units, voxels, batch normalization, loading a nodule using the patient coordinate system, converting between millimeters and voxel addresses, array coordinates, matrix multiplication, and a few more topics related to the same. Besides, I have also learned about autoencoders using LSTMs, stateful decoder models, and data visualization. I have continued working with the LUNA dataset, which stands for LUng Nodule Analysis 2016. I have presented the implementation of conversion between patient coordinates and array coordinates on the CT scan dataset using PyTorch here in the snapshot (see the sketch after this entry). I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2080.PNG)
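The millimeter-to-voxel conversion from Day80 boils down to an affine transform built from a scan's origin, voxel spacing, and direction matrix, the metadata SimpleITK exposes. The sketch below uses hypothetical metadata values and function names; it mirrors the idea, not the book's exact helpers.

```python
import numpy as np

# Hypothetical metadata of the kind SimpleITK reports for a CT volume.
origin_xyz = np.array([-198.1, -195.0, -335.2])  # scanner origin, millimeters
spacing_xyz = np.array([0.76, 0.76, 2.5])        # voxel size, millimeters
direction = np.eye(3)                            # axis orientation matrix

def irc_to_xyz(coord_irc):
    """Index/row/column voxel address -> patient coordinates in millimeters."""
    cri = np.array(coord_irc)[::-1]              # flip to column/row/index order
    return direction @ (cri * spacing_xyz) + origin_xyz

def xyz_to_irc(coord_xyz):
    """Patient coordinates in millimeters -> nearest voxel address."""
    cri = np.linalg.inv(direction) @ (np.array(coord_xyz) - origin_xyz) / spacing_xyz
    return np.round(cri)[::-1].astype(int)

center_irc = xyz_to_irc([-100.0, -50.0, -200.0])
print(center_irc, irc_to_xyz(center_irc))  # round-trips up to voxel rounding
```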
**Day81 of 300DaysOfData!**
- **Voxel and Nodules**: A voxel is the 3D equivalent of the familiar 2D pixel; it encloses a volume of space rather than an area. A mass of tissue made of proliferating cells in the lung is called a tumor. A small tumor, just a few millimeters wide, is called a nodule.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Deep Learning with PyTorch**. Here, I have learned about the PyTorch Dataset instance implementation, the LUNA Dataset class, cross-entropy loss, positive and negative nodules, arrays and tensors, caching candidate arrays, training and validation datasets, data visualization, and a few more topics related to the same. Besides, I have also learned about normalization of data, variance thresholds, the RDKit library, and a few more topics related to the same. I have presented the implementation of preparing the LUNA dataset using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Deep Learning with PyTorch](https://www.manning.com/books/deep-learning-with-pytorch)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2081.PNG)

**Day82 of 300DaysOfData!**
- **Tagging Algorithms**: The problem of learning to predict classes that are not mutually exclusive is called multilabel classification. Auto-tagging problems are best described as multilabel classification problems.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about a motivating example of machine learning, learning algorithms, the training process, data, features, models, objective functions, optimization algorithms, supervised learning, regression, binary, multiclass, and hierarchical classification, the cross-entropy and mean squared error loss functions, gradient descent, tagging algorithms, and a few more topics related to the same. I have presented the implementation of preparing the data, normalization, removing low-variance features, and data loaders using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2082.PNG)

**Day83 of 300DaysOfData!**
- **Reinforcement Learning**: Reinforcement learning gives a very general statement of a problem in which an agent interacts with an environment over a series of time steps, receives some observations, and must choose actions.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about search algorithms, recommender systems, sequence learning, tagging and parsing, machine translation, unsupervised learning, interacting with an environment and reinforcement learning, data manipulation, mathematical operations, broadcasting mechanisms, indexing and slicing, saving memory in tensors, conversion to other datatypes, and a few more topics related to the same. I have presented the implementation of mathematical operations, tensor concatenation, broadcasting mechanisms, and datatype conversion using PyTorch here in the snapshot (see the sketch after this entry). I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2083.PNG)
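Here is a compact, runnable tour of the tensor-manipulation points from Day83: elementwise operations, concatenation, broadcasting, in-place updates that save memory, and datatype conversion. The values are toy examples.

```python
import torch

x = torch.arange(12, dtype=torch.float32).reshape(3, 4)
y = torch.ones(3, 4)

# Elementwise operations and concatenation along each axis.
print(x + y)
print(torch.cat((x, y), dim=0).shape, torch.cat((x, y), dim=1).shape)

# Broadcasting: a (3, 1) and a (1, 2) tensor combine into a (3, 2) result.
a = torch.arange(3).reshape(3, 1)
b = torch.arange(2).reshape(1, 2)
print(a + b)

# Saving memory: an in-place update keeps the same storage.
before = id(y)
y += x
print(id(y) == before)  # True

# Conversion to other datatypes and to plain Python values.
print(x.numpy().dtype, x.to(torch.int64).dtype, float(x[0, 0]))
```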
**Day84 of 300DaysOfData!**
- **Tensors**: Tensors are algebraic objects describing n-dimensional arrays with an arbitrary number of axes. Vectors are first-order tensors and matrices are second-order tensors.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about data preprocessing, reading the dataset, handling missing data, categorical data, conversion to the tensor format, linear algebra such as scalars, vectors, length, dimensionality, and shape, matrices, symmetric matrices, tensors, basic properties of tensor arithmetic, reduction and non-reduction sums, dot products, matrix-vector products, and a few more topics related to the same. I have presented the implementation of data processing, handling missing data, scalars, vectors, matrices, and dot products using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2084.PNG)

**Day85 of 300DaysOfData!**
- **Method of Exhaustion**: The ancient process of finding the area of curved shapes, such as a circle, by inscribing polygons that better and better approximate the shape is called the method of exhaustion.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about matrix multiplication, the L1 and L2 norms, the Frobenius norm, calculus, the method of exhaustion, derivatives and differentiation, partial derivatives, gradient descent, the chain rule, automatic differentiation, backward for non-scalar variables, detaching computation, backpropagation, computing the gradient with control flow, and a few more topics related to the same. I have presented the implementation of matrix multiplication, the L1, L2, and Frobenius norms, derivatives and differentiation, automatic differentiation, and computing the gradient using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2085.PNG)

**Day86 of 300DaysOfData!**
- **Method of Exhaustion**: The ancient process of finding the area of curved shapes, such as a circle, by inscribing polygons that better and better approximate the shape is called the method of exhaustion.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about probabilities, basic probability theory, sampling, the multinomial distribution, axioms of probability theory, random variables, dealing with multiple random variables, joint probability, conditional probability, Bayes' theorem, marginalization, independence and dependence, expectation and variance, finding classes and functions in a module, and a few more topics related to the same. I have presented the implementation of the multinomial distribution, visualization of probabilities, and derivatives and differentiation using PyTorch here in the snapshot (see the sampling sketch after this entry). I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2086.PNG)
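A quick sketch of the multinomial sampling idea from Day86: simulate rolls of a fair die and watch the relative frequencies approach the true probability of 1/6. The group sizes are arbitrary choices for illustration.

```python
import torch
from torch.distributions import multinomial

# Rolling a fair die: six outcomes with equal probability.
fair_probs = torch.ones(6) / 6

# Draw 1000 groups of 10 rolls each; counts has shape (1000, 6).
counts = multinomial.Multinomial(10, fair_probs).sample((1000,))

# Relative frequencies converge toward 1/6 (law of large numbers).
estimates = counts.sum(dim=0) / counts.sum()
print(estimates)  # each entry should be close to 0.1667
```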
**Day87 of 300DaysOfData!**
- **Hyperparameters**: The parameters that are tunable but not updated in the training loop are called hyperparameters. Hyperparameter tuning is the process by which hyperparameters are chosen, and it typically requires adjusting them based on the results of the training loop.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about linear regression, basic elements of linear regression, linear models and transformations, loss functions, the analytic solution, minibatch stochastic gradient descent, making predictions with the learned model, vectorization for speed, the normal distribution and squared loss, from linear regression to deep networks, biological interpretation, hyperparameter tuning, and a few more topics related to the same. I have presented the implementation of vectorization for speed and normal distributions using Python here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2087.PNG)

**Day88 of 300DaysOfData!**
- **Hyperparameters**: The parameters that are tunable but not updated in the training loop are called hyperparameters.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about implementing linear regression from scratch, data pipelines, deep learning frameworks, generating an artificial dataset, scatter plots and correlation, reading the dataset, minibatches, features and labels, parallel computing, initializing the model parameters, minibatch stochastic gradient descent, defining a simple linear regression model, the broadcasting mechanism, vectors and scalars, and a few more topics related to the same. I have presented the implementation of generating the synthetic dataset, generating the scatter plot, reading the dataset, initializing the model parameters, and defining the linear regression model using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2088.PNG)

**Day89 of 300DaysOfData!**
- **Linear Regression**: Linear regression is a linear approach to modelling the relationship between a scalar response and one or more explanatory variables, also known as dependent and independent variables.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about linear regression, defining the loss function, defining the optimization algorithm, minibatch stochastic gradient descent, training the model, tensors and differentiation, the concise implementation of linear regression, generating the synthetic dataset, model evaluation, and a few more topics related to the same. I have presented the implementation of defining the loss function, minibatch stochastic gradient descent, training and evaluating the model, the concise implementation of linear regression, and reading the dataset using PyTorch here in the snapshot (see the sketch after this entry). I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2089.PNG)
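Here is a minimal end-to-end sketch of linear regression from scratch in the spirit of Days 88 and 89: synthetic data with known parameters, squared loss, and gradient descent. For brevity this uses full-batch updates where the chapter iterates over minibatches; the learning rate and epoch count are illustrative.

```python
import torch

# Synthetic data: y = Xw + b + noise, with known true parameters.
true_w, true_b = torch.tensor([2.0, -3.4]), 4.2
X = torch.normal(0, 1, (1000, 2))
y = X @ true_w + true_b + torch.normal(0, 0.01, (1000,))

# Parameters to learn.
w = torch.normal(0, 0.01, (2,), requires_grad=True)
b = torch.zeros(1, requires_grad=True)

lr = 0.03
for epoch in range(300):
    loss = ((X @ w + b - y) ** 2 / 2).mean()  # squared loss over the full batch
    loss.backward()
    with torch.no_grad():                     # gradient descent update
        w -= lr * w.grad
        b -= lr * b.grad
        w.grad.zero_()
        b.grad.zero_()
    if (epoch + 1) % 100 == 0:
        print(f"epoch {epoch + 1}, loss {loss.item():.6f}")

print(w.detach(), b.detach())  # close to true_w and true_b
```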
**Day90 of 300DaysOfData!**
- **Linear Regression**: Linear regression is a linear approach to modelling the relationship between a scalar response and one or more explanatory variables, also known as dependent and independent variables.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about softmax regression, the classification problem, network architectures, the parameterization cost of fully connected layers, the softmax operation, vectorization for minibatches, loss functions, log-likelihood, softmax and derivatives, the cross-entropy loss, information theory basics, entropy and surprisal, model prediction and evaluation, the image classification dataset, and a few more topics related to the same. I have presented the implementation of the image classification dataset, visualization, and softmax regression and the softmax operation along with model parameters using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2090a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2090b.PNG)

**Day91 of 300DaysOfData!**
- **Activation Functions**: Activation functions decide whether a neuron should be activated or not by calculating the weighted sum and further adding bias to it. They are differentiable operators.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the cross-entropy loss function, classification accuracy, training softmax regression, model parameters, optimization algorithms, multilayer perceptrons, hidden layers, linear models, the problems of moving from linear to nonlinear models, universal approximators, activation functions like the ReLU, sigmoid, and tanh functions, derivatives and gradients, and a few more topics related to the same. I have presented the implementation of the softmax regression model, classification accuracy, and the ReLU, sigmoid, and tanh functions along with visualizations using PyTorch here in the snapshots (see the sketch after this entry). I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2091a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2091b.PNG)
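A compact sketch of the softmax operation and classification accuracy from Days 90 and 91, plus a one-liner for the three activation functions. The logits and labels are toy values.

```python
import torch

def softmax(X):
    """Row-wise softmax: exponentiate, then normalize each row to sum to 1.
    (Numerically naive; real implementations subtract the row max first.)"""
    X_exp = torch.exp(X)
    return X_exp / X_exp.sum(dim=1, keepdim=True)

def accuracy(y_hat, y):
    """Fraction of rows whose highest-scoring class matches the label."""
    return (y_hat.argmax(dim=1) == y).float().mean().item()

logits = torch.tensor([[0.1, 2.0, 0.3], [1.5, 0.2, 0.1]])
labels = torch.tensor([1, 0])
probs = softmax(logits)
print(probs.sum(dim=1))          # each row sums to 1
print(accuracy(probs, labels))   # 1.0 on this toy batch

# The activation functions compared in the chapter.
x = torch.linspace(-5, 5, 11)
print(torch.relu(x), torch.sigmoid(x), torch.tanh(x))
```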
**Day92 of 300DaysOfData!**
- **Activation Functions**: Activation functions decide whether a neuron should be activated or not by calculating the weighted sum and further adding bias to it. They are differentiable operators.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the implementation of multilayer perceptrons, initializing model parameters, the ReLU activation function, the cross-entropy loss function, training the model, fully connected layers, simple linear layers, softmax regression and the softmax function, stochastic gradient descent, the sequential API, high-level APIs, learning rate, weights and biases, tensors, hyperparameters, and a few more topics related to the same. I have presented the implementation of multilayer perceptrons, the ReLU activation function, training the model, and model evaluation using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2092.PNG)

**Day93 of 300DaysOfData!**
- **Multilayer Perceptrons**: The simplest deep neural networks are called multilayer perceptrons. They consist of multiple layers of neurons.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about model selection, underfitting, overfitting, training error and generalization error, statistical learning theory, model complexity, early stopping, training, testing, and validation datasets, K-fold cross-validation, dataset size, polynomial regression, generating the dataset, training and testing the model, third-order polynomial function fitting, linear function fitting, higher-order polynomial function fitting, weight decay, normalization, and a few more topics related to the same. I have presented the implementation of generating the dataset, defining the training function, and polynomial function fitting using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2093a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2093b.PNG)

**Day94 of 300DaysOfData!**
- **Multilayer Perceptrons**: The simplest deep neural networks are called multilayer perceptrons. They consist of multiple layers of neurons, each fully connected to those in the layer below, from which they receive input, and those above, which they in turn influence.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about high-dimensional linear regression, model parameters, defining the L2 norm penalty, defining the training loop, regularization and weight decay, dropout and overfitting, the bias-variance tradeoff, Gaussian distributions, stochastic gradient descent, training error and test error, and a few more topics related to the same. I have presented the implementation of high-dimensional linear regression, model parameters, the L2 norm penalty, and regularization and weight decay using PyTorch here in the snapshots (see the sketch after this entry). I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2094a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2094b.PNG)
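The concise way to get the Day94 weight-decay behavior is through the optimizer itself: `weight_decay` adds the L2 penalty's gradient to each parameter group it is applied to. The layer sizes, decay strength, and data below are illustrative.

```python
import torch
import torch.nn as nn

net = nn.Linear(200, 1)  # high-dimensional linear regression, as in the chapter
loss_fn = nn.MSELoss()

# L2 regularization via the optimizer. The weight is decayed; the bias is
# deliberately left unregularized, as is common practice.
optimizer = torch.optim.SGD([
    {"params": net.weight, "weight_decay": 3.0},
    {"params": net.bias},
], lr=0.003)

X, y = torch.randn(20, 200), torch.randn(20, 1)  # tiny synthetic batch
loss = loss_fn(net(X), y)
optimizer.zero_grad()
loss.backward()
optimizer.step()
print(net.weight.norm().item())  # the penalty pulls this norm down over training
```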
**Day95 of 300DaysOfData!**
- **Dropout and Co-adaptation**: Dropout is the process of injecting noise while computing each internal layer during forward propagation. Co-adaptation is a condition in a neural network characterized by a state in which each layer relies on a specific pattern of activations from the previous layer.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about dropout, overfitting, generalization error, the bias-variance tradeoff, robustness through perturbations, L2 regularization and weight decay, co-adaptation, dropout probability, the dropout layer, the Fashion-MNIST dataset, activation functions, stochastic gradient descent, the sequential and functional APIs, and a few more topics related to the same. I have presented the implementation of the dropout layer and training and testing the model using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2095a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2095b.PNG)

**Day96 of 300DaysOfData!**
- **Dropout and Co-adaptation**: Dropout is the process of injecting noise while computing each internal layer during forward propagation. Co-adaptation is a condition in a neural network characterized by a state in which each layer relies on a specific pattern of activations from the previous layer.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about forward propagation, backward propagation and computational graphs, numerical stability, vanishing and exploding gradients, breaking the symmetry, parameter initialization, environment and distribution shift, covariate shift, label shift, concept shift, non-stationary distributions, empirical risk and true risk, batch learning, online learning, reinforcement learning, and a few more topics related to the same. I have presented the implementation of data preprocessing and data preparation using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)
- [Predicting Housing Prices](https://github.com/ThinamXx/CaliforniaHousing__Prices/blob/main/PredictingHousePrices.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2096.PNG)

**Day97 of 300DaysOfData!**
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about training and building deep networks, downloading and caching datasets, data preprocessing, regression problems, accessing and reading the dataset, numerical and discrete categorical features, optimization and variance, arrays and tensors, a simple linear model, the sequential API, root mean squared error, the Adam optimizer, hyperparameter tuning, K-fold cross-validation, training and validation error, model selection, and overfitting and regularization, and a few more topics related to the same. I have presented the implementation of a simple linear model, root mean squared error, the training function, and K-fold cross-validation using PyTorch here in the snapshots (see the sketch after this entry). I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)
- [Predicting Housing Prices](https://github.com/ThinamXx/CaliforniaHousing__Prices/blob/main/PredictingHousePrices.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2097.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2097a.PNG)
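A sketch of the K-fold data splitting from Day97, in the spirit of the chapter's helper: fold `i` becomes the validation slice and the remaining folds are concatenated for training. The data here is a synthetic stand-in.

```python
import torch

def get_k_fold_data(k, i, X, y):
    """Slice fold i out as validation; concatenate the rest for training."""
    fold_size = X.shape[0] // k
    X_train, y_train = None, None
    for j in range(k):
        idx = slice(j * fold_size, (j + 1) * fold_size)
        X_part, y_part = X[idx], y[idx]
        if j == i:
            X_valid, y_valid = X_part, y_part
        elif X_train is None:
            X_train, y_train = X_part, y_part
        else:
            X_train = torch.cat([X_train, X_part], dim=0)
            y_train = torch.cat([y_train, y_part], dim=0)
    return X_train, y_train, X_valid, y_valid

X, y = torch.randn(100, 8), torch.randn(100, 1)
for i in range(5):
    X_tr, y_tr, X_va, y_va = get_k_fold_data(5, i, X, y)
    print(i, X_tr.shape, X_va.shape)  # (80, 8) train / (20, 8) validation
```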
**Day98 of 300DaysOfData!**
- **Constant Parameters**: Constant parameters are terms that are neither the result of previous layers nor updatable parameters of the neural network.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about K-fold cross-validation, training and predictions, hyperparameter optimization, deep learning computation, layers and blocks, softmax regression, multilayer perceptrons, the ResNet architecture, the forward and backward propagation functions, the ReLU activation function, the sequential block implementation, the MLP implementation, constant parameters, and a few more topics related to the same. I have presented the implementation of an MLP, the sequential API class, and the forward propagation function using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)
- [Predicting Housing Prices](https://github.com/ThinamXx/CaliforniaHousing__Prices/blob/main/PredictingHousePrices.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2098.PNG)

**Day99 of 300DaysOfData!**
- **Constant Parameters**: Constant parameters are terms that are neither the result of previous layers nor updatable parameters of the neural network.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about parameter management, parameter access, targeted parameters, collecting parameters from nested blocks, parameter initialization, custom initialization, tied parameters, deferred initialization, multilayer perceptrons, input dimensions, defining custom layers, layers without parameters, the forward propagation function, constant parameters, the Xavier initializer, weights and biases, and a few more topics related to the same. I have presented the implementation of parameter access, parameter initialization, tied parameters, and layers without parameters using PyTorch here in the snapshots (see the sketch after this entry). I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2099a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%2099b.PNG)
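A runnable sketch of the Day99 parameter-management ideas: accessing parameters by index and by name, tying parameters by registering the same module twice, and applying a Xavier initialization to every linear layer. The network shape is an arbitrary example.

```python
import torch
import torch.nn as nn

# A shared layer: registering the same module twice ties its parameters.
shared = nn.Linear(8, 8)
net = nn.Sequential(nn.Linear(4, 8), nn.ReLU(),
                    shared, nn.ReLU(),
                    shared, nn.ReLU(),
                    nn.Linear(8, 1))

# Parameter access: a targeted parameter, then all parameters by name.
print(net[6].bias.data)
print([(name, p.shape) for name, p in net.named_parameters()])

# Tied parameters share the exact same underlying storage.
print(net[2].weight.data_ptr() == net[4].weight.data_ptr())  # True

# Custom (Xavier) initialization applied to every linear layer.
def init_xavier(module):
    if isinstance(module, nn.Linear):
        nn.init.xavier_uniform_(module.weight)

net.apply(init_xavier)
```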
**Day100 of 300DaysOfData!**
- **Invariance and Locality Principles**: The translation invariance principle states that our network should respond similarly to the same patch, regardless of where it appears in the image. The locality principle states that the network should focus on local regions, without regard to the contents of the image in distant regions.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about going from fully connected layers to convolutions, translation invariance, the locality principle, constraining the MLP, convolutional neural networks, cross-correlation, images and channels, file I/O, loading and saving tensors, loading and saving model parameters, custom layers, layers with parameters, and a few more topics related to the same. I have presented the implementation of layers with parameters and loading and saving tensors and model parameters using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20100a.PNG)

**Day101 of 300DaysOfData!**
- **Invariance and Locality Principles**: The translation invariance principle states that our network should respond similarly to the same patch, regardless of where it appears in the image. The locality principle states that the network should focus on local regions, without regard to the contents of the image in distant regions.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about convolutional neural networks, convolutions for images, the cross-correlation operation, convolutional layers, the constructor and forward propagation function, weight and bias objects, edge detection in images, learning a kernel, backpropagation, feature maps and receptive fields, kernel parameters, and a few more topics related to the same. I have presented the implementation of the cross-correlation operation, convolutional layers, and learning a kernel using PyTorch here in the snapshot (a from-scratch sketch of the cross-correlation operation follows after the Day102 entry). I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20101.PNG)

**Day102 of 300DaysOfData!**
- **Maximum Pooling**: Pooling operators consist of a fixed-shape window that is slid over all regions of the input according to its stride, computing a single output for each location, which is either the maximum or the average value of the elements in the pooling window.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about padding and stride, strided convolutions, cross-correlations, multiple input and multiple output channels, convolutional layers, maximum pooling and average pooling layers, pooling windows and operators, convolutional neural networks, the LeNet architecture, supervised learning, convolutional encoders, the sigmoid activation function, and a few more topics related to the same. I have presented the implementation of a CNN, padding, stride, and pooling layers, and multiple channels using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20102.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20102a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20102b.PNG)
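The cross-correlation operation from Day101 written out explicitly, with the edge-detection example the chapter uses: a `[1, -1]` kernel lights up only where a horizontal run of pixels changes value.

```python
import torch

def corr2d(X, K):
    """2D cross-correlation: slide kernel K over X, summing elementwise products."""
    h, w = K.shape
    Y = torch.zeros((X.shape[0] - h + 1, X.shape[1] - w + 1))
    for i in range(Y.shape[0]):
        for j in range(Y.shape[1]):
            Y[i, j] = (X[i:i + h, j:j + w] * K).sum()
    return Y

# Edge detection: a [1, -1] kernel responds at vertical black/white boundaries.
X = torch.ones((6, 8))
X[:, 2:6] = 0
K = torch.tensor([[1.0, -1.0]])
print(corr2d(X, K))  # nonzero only at the two vertical edges
```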
**Day103 of 300DaysOfData!**
- **VGG Networks**: VGG networks construct a network using reusable convolutional blocks. VGG models are defined by the number of convolutional layers and output channels in each block.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about convolutional neural networks, supervised learning, deep CNNs and AlexNet, support vector machines and features, learning representations, data and hardware accelerators, the architectures of LeNet and AlexNet, activation functions such as ReLU, networks using CNN blocks, the VGG network architecture, padding and pooling, convolutional layers, dropout, and dense and linear layers, and a few more topics related to the same. I have presented the implementation of the AlexNet architecture and the VGG network architecture along with CNN blocks using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20103a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20103b.PNG)

**Day104 of 300DaysOfData!**
- **VGG Networks**: VGG networks construct a network using reusable convolutional blocks. VGG models are defined by the number of convolutional layers and output channels in each block.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the network-in-network (NiN) architecture, NiN blocks and the NiN model, convolutional layers, the ReLU activation function, the sequential and functional APIs, the global average pooling layer, networks with parallel concatenations (GoogLeNet), Inception blocks, the GoogLeNet model and architecture, the maximum pooling layer, training the model, and a few more topics related to the same. I have presented the implementation of the NiN block and model and the Inception block and GoogLeNet model using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20104a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20104b.PNG)

**Day105 of 300DaysOfData!**
- **Batch Normalization**: Batch normalization continuously adjusts the intermediate output of the neural network by utilizing the mean and standard deviation of the minibatch, so that the values of the intermediate output are more stable.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about batch normalization, training deep neural networks, the scale and shift parameters, batch normalization layers, fully connected layers, convolutional layers, batch normalization during prediction, tensors, mean and variance, applying batch normalization in LeNet, the concise implementation of batch normalization using the high-level API, internal covariate shift, residual networks (ResNet), function classes, residual blocks, and a few more topics related to the same. I have presented the implementation of the batch normalization architecture using PyTorch here in the snapshots (see the sketch after this entry). I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20105a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20105b.PNG)
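A concise version of the Day105 idea using the high-level API: a LeNet-style stack with a BatchNorm layer after each convolutional and fully connected layer. The 28x28 single-channel input matches Fashion-MNIST; at prediction time `eval()` switches batch normalization to its running statistics.

```python
import torch
import torch.nn as nn

net = nn.Sequential(
    nn.Conv2d(1, 6, kernel_size=5), nn.BatchNorm2d(6), nn.Sigmoid(),
    nn.AvgPool2d(kernel_size=2, stride=2),
    nn.Conv2d(6, 16, kernel_size=5), nn.BatchNorm2d(16), nn.Sigmoid(),
    nn.AvgPool2d(kernel_size=2, stride=2), nn.Flatten(),
    nn.Linear(16 * 4 * 4, 120), nn.BatchNorm1d(120), nn.Sigmoid(),
    nn.Linear(120, 10),
)

X = torch.randn(2, 1, 28, 28)   # a small batch of Fashion-MNIST-sized images
print(net(X).shape)             # torch.Size([2, 10])

net.eval()  # prediction mode: BN uses its running mean/variance, not the batch's
```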
**Day106 of 300DaysOfData!**
- **Batch Normalization**: Batch normalization continuously adjusts the intermediate output of the neural network by utilizing the mean and standard deviation of the minibatch, so that the values of the intermediate output are more stable.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about densely connected networks (DenseNet), dense blocks, batch normalization, activation functions and convolutional layers, transition layers, residual networks (ResNet), function classes, residual blocks, residual mapping, residual connections, the ResNet model, maximum and average pooling layers, training the model, and a few more topics related to the same. I have presented the implementation of the ResNet architecture and the ResNet model using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20106a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20106b.PNG)

**Day107 of 300DaysOfData!**
- **Sequence Models**: Prediction beyond the known observations is called extrapolation. Estimating between existing observations is called interpolation. Sequence models require specialized statistical tools for estimation, such as autoregressive models.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the DenseNet model, convolutional layers, recurrent neural networks, sequence models, interpolation and extrapolation, statistical tools, autoregressive models, latent autoregressive models, Markov models, reinforcement learning algorithms, causality, conditional probability distributions, training the MLP, one-step-ahead prediction, and a few more topics related to the same. I have presented the implementation of DenseNet architectures and a simple implementation of RNNs using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20107a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20107b.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20107c.PNG)

**Day108 of 300DaysOfData!**
- **Tokenization and Vocabulary**: Tokenization is the splitting of a string or text into a list of tokens. A vocabulary is a dictionary that maps string tokens into numerical indices.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about text preprocessing, a corpus of text, the tokenization function, sequence models and datasets, the vocabulary dictionary, multilayer perceptrons, one-step-ahead prediction, multi-step-ahead prediction, tensors, recurrent neural networks, and a few more topics related to the same. I have presented the implementation of reading the dataset, tokenization, and vocabulary using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20108.PNG)

**Day109 of 300DaysOfData!**
- **Sequential Partitioning**: Sequential partitioning is a strategy that preserves the order of split subsequences when iterating over minibatches. It ensures that the subsequences from two adjacent minibatches during iteration are adjacent in the original sequence.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about language models and the sequence dataset, conditional probability, Laplace smoothing, Markov models and n-grams, unigram, bigram, and trigram models, natural language statistics, stop words, word frequencies, Zipf's law, reading long sequence data, minibatches, random sampling, sequential partitioning, and a few more topics related to the same. I have presented the implementation of unigram, bigram, and trigram model frequencies, random sampling, and sequential partitioning using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20109a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20109b.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20109c.PNG)

**Day110 of 300DaysOfData!**
- **Recurrent Neural Networks**: Recurrent neural networks are networks that use recurrent computation for hidden states. The hidden state of an RNN can capture the historical information of the sequence up to the current time step.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about recurrent neural networks (RNNs), hidden states, neural networks without hidden states, RNNs with hidden states, RNN layers, RNN-based character-level language models, perplexity, the implementation of an RNN from scratch, one-hot encoding, vocabulary, initializing the model parameters, the RNN model, minibatches and the tanh activation function, prediction and the warm-up period, gradient clipping, backpropagation, and a few more topics related to the same. I have presented the implementation of the RNN model, gradient clipping, and training the model using PyTorch here in the snapshots (see the clipping sketch after this entry). I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20110.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20110a.PNG)
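Gradient clipping, as used on Day110 to keep RNN training stable, rescales all gradients so their global L2 norm never exceeds a threshold. The tiny linear model below is only a stand-in to show where the call sits in a training step.

```python
import torch

def grad_clipping(net, theta):
    """Rescale all gradients so their global L2 norm is at most theta."""
    params = [p for p in net.parameters() if p.requires_grad]
    norm = torch.sqrt(sum(torch.sum(p.grad ** 2) for p in params))
    if norm > theta:
        for p in params:
            p.grad[:] *= theta / norm

# Usage inside a training step, between backward() and the optimizer update:
net = torch.nn.Linear(4, 3)
loss = net(torch.randn(8, 4)).pow(2).mean()
loss.backward()
grad_clipping(net, theta=1.0)
```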
**Day111 of 300DaysOfData!**
- **Recurrent Neural Networks**: Recurrent neural networks are networks that use recurrent computation for hidden states. The hidden state of an RNN can capture the historical information of the sequence up to the current time step.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the implementation of recurrent neural networks, defining the RNN model, training and prediction, backpropagation through time, exploding gradients, vanishing gradients, the analysis of gradients in RNNs, full computation, truncating time steps, randomized truncation, gradient computing strategies in RNNs, activation functions, regular truncation, and a few more topics related to the same. I have presented the implementation of recurrent neural networks and training and prediction using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20111a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20111b.PNG)

**Day112 of 300DaysOfData!**
- **Gated Recurrent Units**: Gated recurrent units (GRUs) are a gating mechanism in recurrent neural networks that governs when the hidden state should be updated and when it should be reset. They aim to solve the vanishing gradient problem that comes with standard RNNs.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about modern recurrent neural networks, gradient clipping, gated recurrent units (GRUs), memory cells, the gated hidden state, the reset gate and update gate, broadcasting, the candidate hidden state, the Hadamard product operator, the hidden state, initializing model parameters, defining the GRU model, training and prediction, and a few more topics related to the same. I have presented the implementation of the gated recurrent unit (GRU) model and training and prediction using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20112a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20112b.PNG)

**Day113 of 300DaysOfData!**
- **Long Short-Term Memory**: Long short-term memory (LSTM) is a type of recurrent neural network capable of learning order dependence in sequence prediction problems. An LSTM has input gates, forget gates, and output gates that control the flow of information.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about long short-term memory (LSTM), the gated memory cell, the input gate, forget gate, and output gate, the candidate memory cell, the tanh activation function, the sigmoid activation function, the memory cell, the hidden state, initializing model parameters, defining the LSTM model, training and prediction, gated recurrent units (GRUs), the Gaussian distribution, and a few more topics related to the same. I have presented the implementation of the long short-term memory (LSTM) model and training and prediction using PyTorch here in the snapshots (see the sketch after this entry). I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20113a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20113b.PNG)
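The concise, high-level counterpart to the Day113 from-scratch LSTM: `nn.LSTM` manages the gates internally, and the state is a (hidden state, memory cell) pair. The vocabulary size and hidden width below are arbitrary illustration values.

```python
import torch
import torch.nn as nn

vocab_size, num_hiddens, batch_size, num_steps = 28, 256, 2, 5

# Inputs are (num_steps, batch_size, vocab_size) when tokens are fed
# as one-hot vectors, as in the chapter.
lstm = nn.LSTM(vocab_size, num_hiddens)
linear = nn.Linear(num_hiddens, vocab_size)  # maps hidden state to token logits

X = torch.randn(num_steps, batch_size, vocab_size)
state = (torch.zeros(1, batch_size, num_hiddens),   # initial hidden state
         torch.zeros(1, batch_size, num_hiddens))   # initial memory cell
Y, state = lstm(X, state)
print(Y.shape, linear(Y).shape)  # (5, 2, 256) and (5, 2, 28)
```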
**Day114 of 300DaysOfData!**
- **Long Short-Term Memory**: Long short-term memory (LSTM) is a type of recurrent neural network capable of learning order dependence in sequence prediction problems. An LSTM has input gates, forget gates, and output gates that control the flow of information.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about deep recurrent neural networks, functional dependencies, bidirectional recurrent neural networks, dynamic programming in hidden Markov models, the bidirectional model, computational cost and applications, machine translation and its dataset, preprocessing the dataset, tokenization, vocabulary, padding text sequences, and a few more topics related to the same. I have presented the implementation of downloading the dataset, preprocessing, tokenization, and vocabulary using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20114a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20114b.PNG)

**Day115 of 300DaysOfData!**
- **Encoder-Decoder Architecture**: An encoder takes a variable-length sequence as input and transforms it into a state with a fixed shape. A decoder maps the encoded state of fixed shape to a variable-length sequence.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about encoder-decoder architectures, machine translation models, sequence transduction models, the forward propagation function, sequence-to-sequence learning, recurrent neural networks, the embedding layer, gated recurrent unit (GRU) layers, hidden states and units, the RNN encoder-decoder architecture, vocabulary, and a few more topics related to the same. I have presented the implementation of encoder-decoder architectures and an RNN encoder-decoder for sequence-to-sequence learning using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20115a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20115b.PNG)

**Day116 of 300DaysOfData!**
- **Sequence Search**: Greedy search selects, at each time step, the token with the highest conditional probability when generating an output sequence from the input sequence. Beam search is an improved version of greedy search with a hyperparameter named beam size.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the softmax cross-entropy loss function, sequence masking, teacher forcing, training and prediction, evaluation of predicted sequences, BLEU (BiLingual Evaluation Understudy), the RNN encoder-decoder, beam search, greedy search, exhaustive search, attention mechanisms, attention cues, nonvolitional and volitional cues, queries, keys, and values, attention pooling, and a few more topics related to the same. I have presented the implementation of sequence masking, the softmax cross-entropy loss, training the RNN encoder-decoder model, and BLEU using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20116a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20116b.PNG)

**Day117 of 300DaysOfData!**
- **Attention Pooling**: Attention pooling selectively aggregates values or sensory inputs to produce the output. It implies the interaction between queries and keys.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about attention pooling (Nadaraya-Watson kernel regression), queries (volitional cues) and keys (nonvolitional cues), generating the dataset, average pooling, nonparametric attention pooling, attention weights, the Gaussian kernel, parametric attention pooling, batch matrix multiplication, defining the model, training the model, stochastic gradient descent, the MSE loss function, and a few more topics related to the same. I have presented the implementation of attention mechanisms, nonparametric attention pooling, batch matrix multiplication, and the NW kernel regression model with training and prediction using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20117a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20117b.PNG)

**Day118 of 300DaysOfData!**
- **Attention Pooling**: Attention pooling selectively aggregates values or sensory inputs to produce the output. It implies the interaction between queries (volitional cues) and keys (nonvolitional cues). Attention pooling is a weighted average of the training outputs; it can be parametric or nonparametric.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about attention scoring functions, the Gaussian kernel, attention weights, the softmax activation function, the masked softmax operation, text sequences, probability distributions, additive attention, queries, keys, and values, the tanh activation function, dropout and linear layers, attention pooling, and a few more topics related to the same. I have presented the implementation of the masked softmax operation and additive attention using PyTorch here in the snapshot (see the sketch after this entry). I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20118.PNG)
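A sketch of the Day118 masked softmax: positions beyond each sequence's valid length get a huge negative score, so their attention weight collapses to approximately zero. The helper here is a simplified variant that assumes one query per sequence; the chapter's version also broadcasts valid lengths across multiple queries.

```python
import torch

def masked_softmax(X, valid_lens):
    """Softmax over the last axis, ignoring positions beyond each valid length."""
    shape = X.shape
    mask = torch.arange(shape[-1])[None, :] < valid_lens.reshape(-1, 1)
    X = X.reshape(-1, shape[-1]).masked_fill(~mask, -1e6)  # huge negative -> ~0
    return torch.softmax(X.reshape(shape), dim=-1)

# Two score rows padded to length 4, but only 2 and 3 positions are real.
scores = torch.randn(2, 1, 4)
valid_lens = torch.tensor([2, 3])
print(masked_softmax(scores, valid_lens))  # weights sum to 1 over valid slots only
```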
**Day119 of 300DaysOfData!**
- **Attention Pooling**: Attention pooling selectively aggregates values or sensory inputs to produce the output. It implies the interaction between queries (volitional cues) and keys (nonvolitional cues). Attention pooling is a weighted average of the training outputs; it can be parametric or nonparametric.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about scaled dot-product attention, queries, keys, and values, additive attention, attention pooling, Bahdanau attention, the RNN encoder-decoder architecture, hidden states, embedding, defining the decoder with attention, the sequence-to-sequence attention decoder, and a few more topics related to the same. I have presented the implementation of scaled dot-product attention and the sequence-to-sequence attention decoder model using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20119a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20119b.PNG)

**Day120 of 300DaysOfData!**
- **Multi-Head Attention**: Multi-head attention is a design for attention mechanisms that runs an attention mechanism several times in parallel instead of performing a single attention pooling. Queries, keys, and values can be transformed with learned linear projections, which are fed into attention pooling in parallel.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about Bahdanau attention, recurrent neural networks, the encoder-decoder architecture, training the sequence-to-sequence model, the embedding layer, attention weights, GRUs, heatmaps, multi-head attention, queries, keys, and values, attention pooling, additive attention and scaled dot-product attention, transpose functions, and a few more topics related to the same. I have presented the implementation of multi-head attention using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20120.PNG)

**Day121 of 300DaysOfData!**
- **Multi-Head Attention**: Multi-head attention is a design for attention mechanisms that runs an attention mechanism several times in parallel instead of performing a single attention pooling. Queries, keys, and values can be transformed with learned linear projections, which are fed into attention pooling in parallel.
- On my journey of machine learning and deep learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about multi-head attention, queries, keys, and values, attention pooling, scaled dot-product attention, self-attention and positional encoding, recurrent neural networks, intra-attention, comparing CNNs, RNNs, and self-attention, padding tokens, absolute positional information, relative positional information, and a few more topics related to the same. I have presented the implementation of positional encoding using PyTorch here in the snapshot (see the sketch after this entry). I hope you will gain some insights and work on the same. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!
- Book:
  - [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20121.PNG)
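The sinusoidal positional encoding from Day121, written as a small helper: even columns carry sines and odd columns cosines at geometrically spaced frequencies, so each position gets a unique, smoothly varying code that is simply added to the token embeddings. The sequence length and model width are arbitrary.

```python
import torch

def positional_encoding(max_len, d_model):
    """Sinusoidal positional encoding, shape (max_len, d_model)."""
    P = torch.zeros(max_len, d_model)
    position = torch.arange(max_len, dtype=torch.float32).unsqueeze(1)
    div = torch.pow(10000, torch.arange(0, d_model, 2, dtype=torch.float32) / d_model)
    P[:, 0::2] = torch.sin(position / div)  # even columns: sine
    P[:, 1::2] = torch.cos(position / div)  # odd columns: cosine
    return P

P = positional_encoding(max_len=60, d_model=32)
X = torch.randn(1, 60, 32)   # a batch of embedded tokens
X = X + P.unsqueeze(0)       # inject absolute position information
print(P.shape)               # torch.Size([60, 32])
```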
**Day121 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about multi-head attention, queries, keys and values, attention pooling, scaled dot-product attention, self-attention and positional encoding, recurrent neural networks, intra-attention, comparing CNNs, RNNs, and self-attention, padding tokens, absolute positional information, relative positional information, and a few more topics related to the same. I have presented the implementation of positional encoding using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Book**: [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20121.PNG)

**Day122 of 300DaysOfData**

**Transformer Architecture**: The transformer is an architecture for transforming one sequence into another with the help of two parts, an encoder and a decoder. It makes use of self-attention mechanisms.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the transformer, self-attention, the encoder-decoder architecture, sequence embeddings, positional encoding, position-wise feed-forward networks, residual connections and layer normalization, the encoder block and multi-head self-attention, the transformer decoder, queries, keys and values, scaled dot-product attention, and a few more topics related to the same. I have presented the implementation of position-wise feed-forward networks, residual connections and layer normalization, the encoder and decoder blocks, and the transformer decoder using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Book**: [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20122a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20122b.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20122c.PNG)
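As a small companion to the positional encoding mentioned above, here is a minimal sketch of the standard sinusoidal scheme; the class name and the max length are illustrative choices, not the book's exact code.

```python
import torch

class PositionalEncoding(torch.nn.Module):
    """Sinusoidal positional encoding: sin on even columns, cos on odd ones."""
    def __init__(self, num_hiddens, max_len=1000):
        super().__init__()
        self.P = torch.zeros((1, max_len, num_hiddens))
        position = torch.arange(max_len, dtype=torch.float32).reshape(-1, 1)
        div = torch.pow(10000, torch.arange(0, num_hiddens, 2,
                                            dtype=torch.float32) / num_hiddens)
        self.P[:, :, 0::2] = torch.sin(position / div)
        self.P[:, :, 1::2] = torch.cos(position / div)

    def forward(self, X):
        # Add the fixed positional encodings to the input embeddings.
        return X + self.P[:, :X.shape[1], :].to(X.device)

X = torch.zeros((2, 60, 32))
print(PositionalEncoding(32)(X).shape)  # torch.Size([2, 60, 32])
```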
**Day123 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the decoder architecture, self-attention, encoder-decoder attention, position-wise feed-forward networks, residual connections, the transformer decoder, the embedding layer, sequential blocks, training the transformer architecture, and a few more topics related to the same. I have also read about logistic regression: the sigmoid activation function, weight initialization, gradient descent, the cost function, and more. I have presented the implementation of logistic regression from scratch using NumPy, and of the transformer decoder and its training using PyTorch, here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Books and Links**:
- [Dive into Deep Learning](https://d2l.ai/index.html)
- [Logistic Regression Docs](https://ml-cheatsheet.readthedocs.io/en/latest/logistic_regression.html)
- [Implementation of Logistic Regression](https://github.com/ThinamXx/MachineLearning__Algorithms/tree/main/LogisticRegression)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20123a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20123b.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20123c.PNG)

**Day124 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about optimization algorithms and deep learning, the objective function and the goal of minimization, generalization error, training error, the risk function and the empirical risk function, optimization challenges, local and global minima, saddle points, the Hessian matrix and its eigenvalues, vanishing gradients, convexity, convex sets and functions, Jensen's inequality, and a few more topics related to the same. I have presented the implementation of local minima, saddle points, vanishing gradients, and convex functions using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Book**: [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20124.PNG)

**Day125 of 300DaysOfData**

**Gradient Descent**: Gradient descent is an optimization algorithm used to minimize a differentiable function by iteratively moving in the direction of steepest descent, as defined by the negative of the gradient.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about convexity and second derivatives, constrained optimization, the Lagrangian function and multipliers, penalties, projections, gradient clipping, stochastic gradient descent, one-dimensional gradient descent, the objective function, the learning rate, local and global minima, multivariate gradient descent, and a few more topics related to the same. I have presented the implementation of one-dimensional gradient descent, local minima, and multivariate gradient descent using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Book**: [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20125a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20125b.PNG)
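To make the one-dimensional case concrete, here is a minimal sketch of gradient descent on f(x) = x², assuming a fixed learning rate; 2x is the analytic gradient.

```python
# Gradient descent on f(x) = x^2, whose derivative is f'(x) = 2x.
def gradient_descent(lr=0.2, steps=10):
    x = 10.0
    trajectory = [x]
    for _ in range(steps):
        x -= lr * 2 * x      # x <- x - lr * f'(x)
        trajectory.append(x)
    return trajectory

print(gradient_descent()[-1])  # close to the global minimum at x = 0
```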
**Day126 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about multivariate gradient descent, adaptive methods, the learning rate, Newton's method, Taylor expansion, the Hessian, gradients and backpropagation, nonconvex functions, convergence analysis, linear convergence, preconditioning, gradient descent with line search, stochastic gradient descent, loss functions, and a few more topics related to the same. I have presented the implementation of Newton's method, nonconvex functions, and stochastic gradient descent using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Book**: [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20126.PNG)

**Day127 of 300DaysOfData**

**Stochastic Gradient Descent**: Stochastic gradient descent is an iterative method for optimizing an objective function with suitable differentiability properties. It is a variation of the gradient descent algorithm that calculates the error and updates the model on individual samples rather than the full dataset.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about stochastic gradient descent, dynamic learning rates, exponential and polynomial decay, convergence analysis for convex objectives, stochastic gradients and finite samples, minibatch stochastic gradient descent, vectorization and caches, matrix multiplications, minibatches, variance, the implementation of gradients, and a few more topics related to the same. I have presented the implementation of stochastic gradient descent and minibatch stochastic gradient descent using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Book**: [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20127a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20127b.PNG)

**Day128 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the momentum method, stochastic gradient descent, leaky averages, variance, accelerated gradients, an ill-conditioned problem and its convergence, the effective sample weight, practical experiments, the implementation of momentum with SGD, theoretical analysis, quadratic convex functions, scalar functions, and a few more topics related to the same. I have presented the implementation of the momentum method, the effective sample weight, and scalar functions using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Book**: [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20128a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20128b.PNG)
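The momentum method keeps a leaky average of past gradients and steps along it. Here is a minimal sketch of one momentum update, assuming `params` holds leaf tensors whose `.grad` fields are already populated; the function name is illustrative.

```python
import torch

def sgd_momentum(params, velocities, lr=0.02, momentum=0.9):
    """One minibatch SGD step with momentum (leaky average of gradients)."""
    with torch.no_grad():
        for p, v in zip(params, velocities):
            v.mul_(momentum).add_(p.grad)  # v <- momentum * v + grad
            p.sub_(lr * v)                 # p <- p - lr * v
            p.grad.zero_()

w = torch.randn(3, requires_grad=True)
velocities = [torch.zeros_like(w)]
(w ** 2).sum().backward()
sgd_momentum([w], velocities)
```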
**Day129 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the AdaGrad optimization algorithm, sparse features and learning rates, preconditioning, the stochastic gradient descent algorithm, the algorithm itself and its implementation from scratch, deep learning and computational constraints, learning rates, and a few more topics related to the same. I have presented the implementation of the AdaGrad optimization algorithm from scratch using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Book**: [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20129.PNG)

**Day130 of 300DaysOfData**

**RMSProp Optimization Algorithm**: RMSProp is a gradient-based optimization algorithm that uses the magnitude of recent gradients to normalize the current gradient. It addresses AdaGrad's radically diminishing learning rates by dividing the learning rate by an exponentially decaying average of squared gradients.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the RMSProp optimization algorithm, the learning rate, leaky averages and the momentum method, the implementation of RMSProp from scratch, the gradient descent algorithm, preconditioning, and a few more topics related to the same. I have presented the implementation of the RMSProp optimization algorithm from scratch using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Book**: [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20130.PNG)

**Day131 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the AdaDelta optimization algorithm, learning rates, leaky averages, momentum, gradient descent, the concise implementation of AdaDelta, the Adam optimization algorithm, vectorization and minibatch SGD, weighting parameters, normalization, the concise implementation of Adam, and a few more topics related to the same. I have presented the implementation of the AdaDelta and Adam optimization algorithms from scratch using PyTorch here in the snapshot. I hope you will also spend some time learning the topics from the book mentioned below. Excited about the days ahead!

**Book**: [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20131.PNG)
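To make the RMSProp update rule from Day130 concrete, here is a minimal sketch of one from-scratch step, assuming populated `.grad` fields; hyperparameter values are illustrative.

```python
import torch

def rmsprop(params, states, lr=0.01, gamma=0.9, eps=1e-6):
    """One RMSProp step: s is the leaky average of squared gradients that
    normalizes the learning rate per parameter."""
    with torch.no_grad():
        for p, s in zip(params, states):
            # s <- gamma * s + (1 - gamma) * grad^2
            s.mul_(gamma).addcmul_(p.grad, p.grad, value=1 - gamma)
            p.sub_(lr * p.grad / (s + eps).sqrt())
            p.grad.zero_()

w = torch.randn(3, requires_grad=True)
states = [torch.zeros_like(w)]
(w ** 2).sum().backward()
rmsprop([w], states)
```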
**Day132 of 300DaysOfData**

**Adam Optimizer**: Adam uses exponentially weighted moving averages (also known as leaky averaging) to obtain an estimate of both the momentum and the second moment of the gradient. It combines the features of many optimization algorithms and applies EWMA on top of minibatch stochastic gradient descent.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the Adam and Yogi optimization algorithms, variance, minibatch SGD, learning rate scheduling, weight vectors, the convolutional layer, the linear layer, the max pooling layer, the Sequential API, ReLU, the cross-entropy loss, schedulers, overfitting, and a few more topics related to the same. I have presented the implementation of the LeNet architecture and the Yogi optimization algorithm using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Book**: [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20132a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20132b.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20132c.PNG)

**Day133 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about learning rate scheduling, the square root scheduler, the factor scheduler, the learning rate and polynomial decay, the multi-factor scheduler, piecewise constant schedules, optimization and local minima, the cosine scheduler, and a few more topics related to the same. I have presented the implementation of the multi-factor scheduler and the cosine scheduler using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Book**: [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20133.PNG)
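A cosine scheduler of the kind mentioned above decays the learning rate from a base value to a final value along a half cosine curve. A minimal sketch, with illustrative rates:

```python
import math

class CosineScheduler:
    """Learning rate follows a half cosine from base_lr down to final_lr."""
    def __init__(self, max_steps, base_lr=0.3, final_lr=0.01):
        self.max_steps, self.base_lr, self.final_lr = max_steps, base_lr, final_lr

    def __call__(self, step):
        if step >= self.max_steps:
            return self.final_lr
        cos = (1 + math.cos(math.pi * step / self.max_steps)) / 2
        return self.final_lr + (self.base_lr - self.final_lr) * cos

sched = CosineScheduler(max_steps=20)
print([round(sched(s), 3) for s in (0, 10, 20)])  # [0.3, 0.155, 0.01]
```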
**Day134 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about a model's computational performance, compilers and interpreters, symbolic versus imperative programming, hybrid programming, dynamic computation graphs, hybrid sequential models, acceleration by hybridization, multi-layer perceptrons, asynchronous computation, and a few more topics related to the same. I have presented the implementation of hybrid sequential acceleration by hybridization and asynchronous computation using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Book**: [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20134.PNG)

**Day135 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about asynchronous computation, barriers and blockers, improving computation and the memory footprint, automatic parallelism, parallel computation and communication, training on multiple GPUs, splitting the problem, data parallelism, network partitioning, layer-wise partitioning, data-parallel partitioning, and a few more topics related to the same. I have presented the implementation of initializing model parameters and defining the LeNet model using PyTorch here in the snapshot. I am still working on the implementation of the LeNet model. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Books and Links**:
- [Dive into Deep Learning](https://d2l.ai/index.html)
- [Implementation of LeNet Architecture](https://github.com/ThinamXx/MachineLearning__Algorithms/blob/main/LeNetArchitecture/LeNetArchitecture.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20135.PNG)

**Day136 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about training on multiple GPUs, the LeNet architecture, data synchronization, model parallelism, data broadcasting, data distribution, the implementation of optimization algorithms, backpropagation, the cross-entropy loss function, the convolutional layer, the ReLU activation function, matrix multiplication, the average pooling layer, and a few more topics related to the same. I have presented the implementation of data distribution, data synchronization, and the training function using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Books and Links**:
- [Dive into Deep Learning](https://d2l.ai/index.html)
- [Implementation of LeNet Architecture](https://github.com/ThinamXx/MachineLearning__Algorithms/blob/main/LeNetArchitecture/LeNetArchitecture.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20136.PNG)
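The data-parallel idea from Days 135-136 can be sketched on a single machine: split a minibatch across model replicas, compute per-replica gradients, then average (all-reduce) them so every replica applies the same update. This is a simplified illustration on CPU tensors, not the book's multi-GPU code; real code would place each shard on its own device.

```python
import torch
from torch import nn

net_a = nn.Linear(4, 2)
net_b = nn.Linear(4, 2)
net_b.load_state_dict(net_a.state_dict())   # replicas start in sync

X, y = torch.randn(8, 4), torch.randn(8, 2)
shards = ((X[:4], y[:4]), (X[4:], y[4:]))   # split the minibatch

for net, (xs, ys) in zip((net_a, net_b), shards):
    nn.functional.mse_loss(net(xs), ys).backward()

with torch.no_grad():
    for p_a, p_b in zip(net_a.parameters(), net_b.parameters()):
        g = (p_a.grad + p_b.grad) / 2       # "all-reduce": average gradients
        for p in (p_a, p_b):
            p -= 0.1 * g                    # identical SGD step on each replica
```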
**Day137 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about optimization and synchronization, the ResNet neural network architecture, the convolutional layer, the batch normalization layer, strides and padding, the Sequential API, parameter initialization, minibatch gradient descent, training the ResNet model, the stochastic gradient descent optimizer, the cross-entropy loss function, backpropagation, parallelization, and a few more topics related to the same. I have presented the implementation of the ResNet architecture, its initialization, and training the model using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Books and Links**:
- [Dive into Deep Learning](https://d2l.ai/index.html)
- [Implementation of LeNet Architecture](https://github.com/ThinamXx/MachineLearning__Algorithms/blob/main/LeNetArchitecture/LeNetArchitecture.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20137a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20137b.PNG)

**Day138 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about computer vision applications, image augmentation, deep neural networks, common image augmentation methods such as flipping and cropping, horizontal and vertical flipping, changing the colors of images, combining multiple image augmentation methods, the CIFAR10 dataset, the torchvision module, the ColorJitter instance, and a few more topics related to the same. I have presented the implementation of flipping and cropping images and changing the colors of images using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Books and Links**:
- [Dive into Deep Learning](https://d2l.ai/index.html)
- [Implementation of LeNet Architecture](https://github.com/ThinamXx/MachineLearning__Algorithms/blob/main/LeNetArchitecture/LeNetArchitecture.ipynb)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20138.PNG)

**Day139 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about image augmentation, the CIFAR10 dataset, training a model with multiple GPUs, fine-tuning the model, overfitting, pretrained neural networks, target model initialization, the ResNet model, the ImageNet dataset, normalization of RGB images with their mean and standard deviation, the torchvision module, flipping and cropping images, Adam optimization, the cross-entropy loss function, and a few more topics related to the same. I have presented the implementation of training the model with image augmentation and normalization of images using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Book**: [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20139a.PNG)
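An augmentation pipeline of the kind described in Days 138-139 composes random flips, crops, and color jitter before normalization. A minimal sketch using torchvision; the mean/std values are the commonly used CIFAR10 statistics, included here as illustrative numbers rather than the book's exact ones.

```python
import torchvision
from torchvision import transforms

train_augs = transforms.Compose([
    transforms.RandomHorizontalFlip(),
    transforms.RandomResizedCrop(32, scale=(0.64, 1.0), ratio=(1.0, 1.0)),
    transforms.ColorJitter(brightness=0.4, contrast=0.4, saturation=0.4),
    transforms.ToTensor(),
    # Per-channel RGB mean and std, commonly quoted for CIFAR10.
    transforms.Normalize(mean=[0.4914, 0.4822, 0.4465],
                         std=[0.2023, 0.1994, 0.2010]),
])

# The transforms are applied lazily each time an example is fetched.
train_set = torchvision.datasets.CIFAR10(
    root="data", train=True, download=True, transform=train_augs)
```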
**Day140 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about fine-tuning the model, pretrained neural networks, normalization of images with their mean and standard deviation, defining and initializing the model, the cross-entropy loss function, the DataLoader class, the learning rate and stochastic gradient descent, model parameters, transfer learning, the source model and the target model, weights and biases, and a few more topics related to the same. I have presented the implementation of normalization of images, flipping and cropping images, and training the pretrained model using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Book**: [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20140.PNG)

**Day141 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about object detection and object recognition, image classification and computer vision, images and bounding boxes, target locations and axis coordinates, and a few more topics related to the same. I have also spent some time reading the book **Speech and Language Processing**. Here, I have learned about regular expressions, disjunction, grouping and precedence, precision and recall, substitution and capture groups, lookahead assertions, words, corpora, and a few more topics related to the same. I have presented a simple implementation of object detection and bounding boxes using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Books**:
- [Dive into Deep Learning](https://d2l.ai/index.html)
- [Speech and Language Processing](https://web.stanford.edu/~jurafsky/slp3/)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20141.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20141a.PNG)

**Day142 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about computer vision, anchor boxes, object detection algorithms, bounding boxes, generating multiple anchor boxes, computational complexity, sizes and ratios, and a few more topics related to the same. I have also spent some time reading the book **Speech and Language Processing**. Here, I have learned about text normalization, Unix tools for crude tokenization and normalization, word tokenization, named entity detection, Penn Treebank tokenization, and a few more topics related to the same. I have presented the implementation of generating anchor boxes, object detection, and bounding boxes using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Books**:
- [Dive into Deep Learning](https://d2l.ai/index.html)
- [Speech and Language Processing](https://web.stanford.edu/~jurafsky/slp3/)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20142.PNG)
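The fine-tuning recipe from Day140 amounts to loading a pretrained source model, replacing its output head for the target task, and training the new head with a larger learning rate than the backbone. A minimal sketch, assuming a recent torchvision with the `weights` enum API; the two-class head and the learning rates are illustrative.

```python
import torch
from torch import nn
from torchvision import models

# Source model: ResNet-18 pretrained on ImageNet.
finetune_net = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
# Target model: replace the final fully connected layer with a fresh head.
finetune_net.fc = nn.Linear(finetune_net.fc.in_features, 2)
nn.init.xavier_uniform_(finetune_net.fc.weight)

# Pretrained backbone gets a small learning rate; the new head a larger one.
backbone = [p for name, p in finetune_net.named_parameters()
            if not name.startswith("fc.")]
optimizer = torch.optim.SGD(
    [{"params": backbone, "lr": 5e-5},
     {"params": finetune_net.fc.parameters(), "lr": 5e-4}],
    weight_decay=1e-3)
```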
**Day143 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about computer vision, generating multiple anchor boxes, batch sizes, coordinate values, the intersection over union algorithm (Jaccard index), computational complexity, sizes and ratios, and a few more topics related to the same. I have also spent some time reading the book **Speech and Language Processing**. Here, I have learned about the byte pair encoding algorithm for tokenization, subword tokens, WordPiece and greedy tokenization algorithms, the maximum matching algorithm, word normalization, lemmatization and stemming, the Porter stemmer, and a few more topics related to the same. I have presented the implementation of generating anchor boxes and the intersection over union algorithm using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Books**:
- [Dive into Deep Learning](https://d2l.ai/index.html)
- [Speech and Language Processing](https://web.stanford.edu/~jurafsky/slp3/)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20143.PNG)

**Day144 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about labeling training set anchor boxes, object detection and image recognition, ground-truth bounding boxes, indexing anchor boxes and offset boxes, intersection over union and the Jaccard algorithm, and a few more topics related to the same. I have also spent some time reading the book **Speech and Language Processing**. Here, I have learned about sentence segmentation, the minimum edit distance algorithm, the Viterbi algorithm, n-gram language models, probability, spelling correction and grammatical error correction, and a few more topics related to the same. I have presented the implementation of labeling training set anchor boxes and initializing offset boxes using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Books**:
- [Dive into Deep Learning](https://d2l.ai/index.html)
- [Speech and Language Processing](https://web.stanford.edu/~jurafsky/slp3/)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20144.PNG)

**Day145 of 300DaysOfData**

**Image Segmentation**: Image segmentation is the process of partitioning a digital image into multiple segments, or sets of pixels. The goal of segmentation is to simplify the representation of the image into something more meaningful and easier to analyze.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the non-maximum suppression algorithm, prediction bounding boxes, ground-truth bounding boxes, confidence levels, batch sizes, the intersection over union algorithm (Jaccard index), aspect ratios, bounding boxes for prediction, the multibox target function, anchor boxes, and a few more topics related to the same. I have presented the implementation of initializing multibox anchor boxes and initializing prediction bounding boxes using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Book**: [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20145.PNG)
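Intersection over union, which anchors both the labeling on Day144 and the non-maximum suppression on Day145, is a short computation. A minimal sketch for boxes given as (x1, y1, x2, y2) corner coordinates:

```python
import torch

def box_iou(boxes1, boxes2):
    """Pairwise IoU (Jaccard index) between two sets of corner-format boxes."""
    area1 = (boxes1[:, 2] - boxes1[:, 0]) * (boxes1[:, 3] - boxes1[:, 1])
    area2 = (boxes2[:, 2] - boxes2[:, 0]) * (boxes2[:, 3] - boxes2[:, 1])
    lt = torch.max(boxes1[:, None, :2], boxes2[None, :, :2])  # overlap top-left
    rb = torch.min(boxes1[:, None, 2:], boxes2[None, :, 2:])  # overlap bottom-right
    wh = (rb - lt).clamp(min=0)                               # zero if no overlap
    inter = wh[..., 0] * wh[..., 1]
    return inter / (area1[:, None] + area2[None, :] - inter)

a = torch.tensor([[0., 0., 2., 2.]])
b = torch.tensor([[1., 1., 3., 3.]])
print(box_iou(a, b))  # tensor([[0.1429]]) -> intersection 1 / union 7
```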
**Day146 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about multiscale object detection, generating multiple anchor boxes, the single shot multibox detection (SSD) algorithm, the category prediction layer, the bounding box prediction layer, concatenating predictions for multiple scales, the height and width downsample block, CNN layers, ReLU and max pooling layers, and a few more topics related to the same. I have also spent some time reading the book **Speech and Language Processing**. Here, I have read about part-of-speech tagging, information extraction, named entity recognition, regular expressions, and a few more topics related to the same. I have presented the implementation of initializing the category prediction layer and the height-width downsample block using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Books**:
- [Dive into Deep Learning](https://d2l.ai/index.html)
- [Speech and Language Processing](https://web.stanford.edu/~jurafsky/slp3/)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20146.PNG)

**Day147 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the single shot multibox detection algorithm, the base neural network, the height-width downsample block, the category prediction layer, the bounding box prediction layer, multiscale feature blocks, the Sequential API, and a few more topics related to the same. I have also spent some time reading the book **Speech and Language Processing**. Here, I have learned about n-gram language models, the chain rule of probability, Markov models, maximum likelihood estimation, relative frequency, evaluating language models, log probabilities, perplexity, generalization, zeros, sparsity, and a few more topics related to the same. I have presented the implementation of the base SSD network and the complete SSD model using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Books**:
- [Dive into Deep Learning](https://d2l.ai/index.html)
- [Speech and Language Processing](https://web.stanford.edu/~jurafsky/slp3/)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20147.PNG)

**Day148 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the single shot multibox detection model, the implementation of the tiny SSD model, the forward propagation function, data reading and initialization, object detection, the multiscale feature block, the global max pooling layer, and a few more topics related to the same. I have also spent some time reading the book **Speech and Language Processing**. Here, I have learned about unknown (out-of-vocabulary) words and the OOV rate, smoothing, Laplace smoothing, text classification, add-one smoothing, MLE, add-k smoothing, and a few more topics related to the same. I have presented the implementation of the single shot multibox detection model and dataset initialization using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Books**:
- [Dive into Deep Learning](https://d2l.ai/index.html)
- [Speech and Language Processing](https://web.stanford.edu/~jurafsky/slp3/)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20148.PNG)
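Two of the SSD building blocks from Days 146-148 are easy to sketch: a category prediction layer that emits one score per anchor per class (plus background), and a block that halves the feature map's height and width. The channel counts below are illustrative, not the book's exact configuration.

```python
import torch
from torch import nn

def class_predictor(in_channels, num_anchors, num_classes):
    """One score per anchor per class (+1 for background) at each pixel."""
    return nn.Conv2d(in_channels, num_anchors * (num_classes + 1),
                     kernel_size=3, padding=1)

def downsample_block(in_channels, out_channels):
    """Two conv-BN-ReLU stages followed by max pooling that halves H and W."""
    layers = []
    for _ in range(2):
        layers += [nn.Conv2d(in_channels, out_channels, kernel_size=3, padding=1),
                   nn.BatchNorm2d(out_channels), nn.ReLU()]
        in_channels = out_channels
    layers.append(nn.MaxPool2d(2))
    return nn.Sequential(*layers)

Y = downsample_block(3, 10)(torch.zeros(2, 3, 20, 20))
print(Y.shape)                               # torch.Size([2, 10, 10, 10])
print(class_predictor(10, 5, 10)(Y).shape)   # torch.Size([2, 55, 10, 10])
```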
**Day149 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the softmax activation function, the convolutional layer, training the single shot multibox detection model, multiscale anchor boxes, the cross-entropy loss function, the L1 norm loss function, average absolute error, accuracy rate, category and offset losses, and a few more topics related to the same. I have also spent some time reading the book **Speech and Language Processing**. Here, I have learned about backoff and interpolation, Katz backoff, Kneser-Ney smoothing, absolute discounting, the web and stupid backoff, perplexity's relation to entropy, and a few more topics related to the same. I have presented the implementation of training the single shot multibox detection model and its loss and evaluation functions using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Books**:
- [Dive into Deep Learning](https://d2l.ai/index.html)
- [Speech and Language Processing](https://web.stanford.edu/~jurafsky/slp3/)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20149.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20149a.PNG)

**Day150 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about region-based convolutional neural networks, Fast R-CNN, Faster R-CNN, Mask R-CNN, the category prediction layer, the bounding box prediction layer, support vector machines, the RoI pooling layer and the RoI alignment layer, pixel-level semantics, image segmentation and instance segmentation, Pascal VOC2012 semantic segmentation, RGB data preprocessing, and a few more topics related to the same. I have presented the implementation of semantic segmentation and data preprocessing using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Book**: [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20150a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20150b.PNG)
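The SSD training objective from Day149 combines a cross-entropy loss over anchor categories with an L1 loss over predicted offsets, masked so background and padded anchors do not contribute to the offset term. A minimal sketch along those lines (shapes noted in comments are assumptions for illustration):

```python
import torch
from torch import nn

cls_loss = nn.CrossEntropyLoss(reduction="none")
bbox_loss = nn.L1Loss(reduction="none")

def calc_loss(cls_preds, cls_labels, bbox_preds, bbox_labels, bbox_masks):
    # cls_preds: (batch, anchors, classes+1); cls_labels: (batch, anchors)
    # bbox_preds/labels/masks: (batch, anchors * 4)
    batch, anchors, num_classes = cls_preds.shape
    cls = cls_loss(cls_preds.reshape(-1, num_classes),
                   cls_labels.reshape(-1)).reshape(batch, -1).mean(dim=1)
    bbox = bbox_loss(bbox_preds * bbox_masks,
                     bbox_labels * bbox_masks).mean(dim=1)
    return cls + bbox  # category loss + offset loss, per example
```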
**Day151 of 300DaysOfData**

**Sequence to Sequence Model**: Sequence-to-sequence neural networks can be built with a modular and reusable encoder and decoder architecture. The encoder model generates a thought vector, a dense, fixed-dimension vector representation of the data. The decoder model uses thought vectors to generate output sequences.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about dataset classes for custom semantic segmentation, RGB channels, normalization of images, the random cropping operation, sequence-to-sequence recurrent neural networks, the label encoder, the one-hot encoder, encoding and vectorization, long short-term memory (LSTM), and a few more topics related to the same. I have presented the implementation of dataset classes for custom semantic segmentation using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Book**: [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20151.PNG)

**Day152 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the transposed convolutional layer, CNNs, basic 2D transposed convolution, broadcasting matrices, kernel size, padding, strides and channels, the analogy to matrix transposition, matrix multiplication and matrix-vector multiplication, and a few more topics related to the same. I have also spent some time reading the book **Speech and Language Processing**. Here, I have learned about naive Bayes and sentiment classification, text categorization, spam detection, probabilistic classifiers, the multinomial naive Bayes classifier, the bag of words, unknown and stop words, and a few more topics related to the same. I have presented the implementation of transposed convolution, padding, strides, and matrix multiplication using PyTorch here in the snapshots. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Books**:
- [Dive into Deep Learning](https://d2l.ai/index.html)
- [Speech and Language Processing](https://web.stanford.edu/~jurafsky/slp3/)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20152.PNG)

**Day153 of 300DaysOfData**

**Transposed Convolution**: In a transposed convolution, stride and padding do not correspond to the number of zeros added around the image and the amount of shift in the kernel when sliding it across the input, as they would in a standard convolution operation; they are specified for the equivalent standard convolution whose input and output shapes are swapped.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about fully convolutional neural networks, the principles of semantic segmentation, the transposed convolutional layer, constructing a model from a pretrained neural network, the global average pooling layer, the flattening layer, image processing and upsampling, the bilinear interpolation kernel function, and a few more topics related to the same. I have presented the implementation of the fully convolutional network, pretrained neural networks, the bilinear interpolation kernel function, and the transposed convolutional layer using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Book**: [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20153.PNG)
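Transposed convolution is easiest to see through shapes: with a 2x2 kernel and stride 2 it doubles the height and width, roughly inverting the shape change of a stride-2 convolution. A minimal sketch with untrained, illustrative layers:

```python
import torch
from torch import nn

X = torch.arange(16, dtype=torch.float32).reshape(1, 1, 4, 4)

# Transposed convolution upsamples: (1, 1, 4, 4) -> (1, 1, 8, 8).
tconv = nn.ConvTranspose2d(1, 1, kernel_size=2, stride=2, bias=False)
print(tconv(X).shape)        # torch.Size([1, 1, 8, 8])

# A matching standard convolution undoes the shape change.
conv = nn.Conv2d(1, 1, kernel_size=2, stride=2, bias=False)
print(conv(tconv(X)).shape)  # torch.Size([1, 1, 4, 4])
```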
**Day154 of 300DaysOfData**

**Neural Style Transfer**: Neural style transfer is the task of changing the style of an image in one domain to the style of an image in another domain. It manipulates images or videos so that they adopt the appearance of another image.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the softmax cross-entropy loss function, stochastic gradient descent, CNNs, neural style transfer, composite images, RGB channels, normalization, and a few more topics related to the same. I have also spent some time reading the book **Speech and Language Processing**. Here, I have learned about optimizing naive Bayes for sentiment analysis, sentiment lexicons, naive Bayes as a language model, precision, recall and F-measure, multi-label and multinomial classification, and a few more topics related to the same. I have started working on style transfer using neural networks; the notebook is mentioned below, though I am still working on it.

**Books and Links**:
- [Dive into Deep Learning](https://d2l.ai/index.html)
- [Speech and Language Processing](https://web.stanford.edu/~jurafsky/slp3/)
- [Neural Networks Style Transfer](https://github.com/ThinamXx/Neural__Style__Transfer)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20154.PNG)

**Day155 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about neural style transfer, convolutional neural networks, reading the content and style images, preprocessing and postprocessing the images, extracting image features, composite images, VGG neural networks, the squared error loss function, the total variation loss function, normalization of the RGB channels of images, and a few more topics related to the same. I am still working on the style transfer notebook mentioned below. I have presented the implementation of the function for extracting features and the squared error loss function using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Books and Links**:
- [Dive into Deep Learning](https://d2l.ai/index.html)
- [Speech and Language Processing](https://web.stanford.edu/~jurafsky/slp3/)
- [Neural Networks Style Transfer](https://github.com/ThinamXx/Neural__Style__Transfer)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20155.PNG)

**Day156 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about creating and initializing the composite images, synchronization functions, the Adam optimizer, the gram matrix, convolutional neural networks, neural style transfer, loss functions, and a few more topics related to the same. I have also spent some time reading the book **Speech and Language Processing**. Here, I have learned about test sets and cross-validation, statistical significance testing, naive Bayes classifiers, bootstrapping, logistic regression, generative and discriminative classifiers, feature representation, the sigmoid, classification, the weight and bias terms, and a few more topics related to the same. I have completed working on the style transfer notebook mentioned below, but I am still updating it.

**Books and Links**:
- [Dive into Deep Learning](https://d2l.ai/index.html)
- [Speech and Language Processing](https://web.stanford.edu/~jurafsky/slp3/)
- [Neural Networks Style Transfer](https://github.com/ThinamXx/Neural__Style__Transfer)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20156a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20156b.PNG)
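The gram matrix mentioned on Day156 is the core of the style loss: it captures channel-wise feature correlations, and the loss matches the composite image's gram matrix to the style image's. A minimal sketch; the feature map shape is illustrative.

```python
import torch

def gram(X):
    """Channel-wise feature correlations, normalized by the element count."""
    channels, n = X.shape[1], X.numel() // X.shape[1]
    X = X.reshape(channels, n)
    return torch.matmul(X, X.T) / (channels * n)

def style_loss(Y_hat, gram_Y):
    # Match the composite image's gram matrix to the style image's.
    return torch.square(gram(Y_hat) - gram_Y).mean()

features = torch.rand(1, 64, 32, 32)   # stand-in for a VGG feature map
print(gram(features).shape)            # torch.Size([64, 64])
```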
**Day157 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about computer vision, image classification, the CIFAR10 dataset, obtaining and organizing the dataset, augmentation, and a few more topics related to the same. Apart from that, I have learned about data scraping and Scrapy, named entity recognition and spaCy, a trained transformer model using spaCy, geocoding, and a few more topics related to the same. I have completed working on the style transfer notebook and have started working on the Object Recognition on Images (CIFAR10) notebook; all the notebooks are mentioned below. I have presented the implementation of obtaining and organizing the CIFAR10 dataset here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Books and Links**:
- [Dive into Deep Learning](https://d2l.ai/index.html)
- [Neural Networks Style Transfer](https://github.com/ThinamXx/Neural__Style__Transfer)
- [Object Recognition on Images: CIFAR10](https://github.com/ThinamXx/CIFAR10__Recognition)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20157.PNG)

**Day158 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about computer vision, image classification, image augmentation and overfitting, normalization of RGB channels, the DataLoader and the validation set, and a few more topics related to the same. Apart from that, I have learned about the Stanford NER algorithm, NLTK named entity recognition, and a few more topics related to the same. I am continuing to work on the Object Recognition on Images (CIFAR10) notebook; all the notebooks are mentioned below. I have presented the implementation of obtaining and organizing the dataset, image augmentation, and normalization using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Books and Links**:
- [Dive into Deep Learning](https://d2l.ai/index.html)
- [Neural Networks Style Transfer](https://github.com/ThinamXx/Neural__Style__Transfer)
- [Object Recognition on Images: CIFAR10](https://github.com/ThinamXx/CIFAR10__Recognition)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20158.PNG)

**Day159 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about computer vision, the ResNet model and residual blocks, Xavier random initialization, the cross-entropy loss function, defining training functions, stochastic gradient descent, the learning rate scheduler, evaluation metrics, and a few more topics related to the same. I have also spent some time reading the book **Speech and Language Processing**. Here, I have learned about sentiment classification, learning in logistic regression, conditional MLE, the cost function, and a few more topics related to the same. I am working on the Object Recognition on Images (CIFAR10) notebook mentioned below. I have presented the implementation of defining a training function using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Books and Links**:
- [Dive into Deep Learning](https://d2l.ai/index.html)
- [Object Recognition on Images: CIFAR10](https://github.com/ThinamXx/CIFAR10__Recognition)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20159.PNG)
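Xavier initialization, mentioned on Day159, is typically applied to every convolutional and linear layer before training. A minimal sketch with an illustrative toy network:

```python
from torch import nn

def init_weights(m):
    """Apply Xavier (Glorot) uniform initialization to conv and linear layers."""
    if isinstance(m, (nn.Conv2d, nn.Linear)):
        nn.init.xavier_uniform_(m.weight)

net = nn.Sequential(nn.Conv2d(3, 16, 3), nn.ReLU(),
                    nn.Flatten(), nn.Linear(16 * 30 * 30, 10))
net.apply(init_weights)  # recursively visits every submodule
```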
**Day160 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about the ImageNet dataset, obtaining and organizing the dataset, image augmentation such as flipping and resizing, changing the brightness and contrast of images, transfer learning and features, normalization of images, and a few more topics related to the same. I have completed working on the Object Recognition on Images (CIFAR10) notebook and have started working on the Dog Breed Identification (ImageNet) notebook; all the notebooks are mentioned below. I have presented the implementation of image augmentation and normalization and of defining the neural network model and loss function using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Books and Links**:
- [Dive into Deep Learning](https://d2l.ai/index.html)
- [Object Recognition on Images: CIFAR10](https://github.com/ThinamXx/CIFAR10__Recognition)
- [Dog Breed Identification: ImageNet](https://github.com/ThinamXx/DogBreedClassification)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20160.PNG)

**Day161 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about defining training functions, computer vision, hyperparameters, the stochastic gradient descent optimization function, the learning rate scheduler and optimization, training loss and validation loss, and a few more topics related to the same. I have also spent some time reading the book **Speech and Language Processing**. Here, I have learned about the gradient for logistic regression, the SGD algorithm, minibatch training, and a few more topics related to the same. I am working on the Dog Breed Identification (ImageNet) notebook mentioned below. I have presented the implementation of defining the training function using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Books and Links**:
- [Dive into Deep Learning](https://d2l.ai/index.html)
- [Speech and Language Processing](https://web.stanford.edu/~jurafsky/slp3/)
- [Object Recognition on Images: CIFAR10](https://github.com/ThinamXx/CIFAR10__Recognition)
- [Dog Breed Identification: ImageNet](https://github.com/ThinamXx/DogBreedClassification)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20161.PNG)
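The logistic regression gradient from the SLP reading on Day161 has a closed form: for sigmoid(w·x + b) with cross-entropy loss, the gradient with respect to w is (sigmoid(w·x + b) − y)·x. A minimal sketch of that update in a tiny SGD loop, with a single illustrative example:

```python
import torch

w, b, lr = torch.zeros(3), 0.0, 0.1
x = torch.tensor([1.0, 2.0, 0.5])   # one training example (illustrative)
y = 1.0                              # gold label

for _ in range(100):
    y_hat = torch.sigmoid(w @ x + b)
    w -= lr * (y_hat - y) * x        # gradient step for the weights
    b -= lr * (y_hat - y).item()     # and for the bias
print(torch.sigmoid(w @ x + b))      # approaches 1 as the loss decreases
```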
**Day162 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about pretrained text representations, word embedding and word2vec, one-hot vectors, the skip-gram model and its training, the continuous bag of words model and its training, approximate training, negative sampling, hierarchical softmax, reading and processing the dataset, subsampling, the vocabulary, and a few more topics related to the same. Apart from that, I have also read about improving chemical autoencoders' latent space and molecular diversity with heteroencoders. I am working on the Dog Breed Identification (ImageNet) notebook mentioned below. I have presented the implementation of reading and preprocessing the dataset, subsampling, and comparison using PyTorch here in the snapshot. I hope you will gain some insights and work on the same. Excited about the days ahead!

**Books and Links**:
- [Dive into Deep Learning](https://d2l.ai/index.html)
- [Object Recognition on Images: CIFAR10](https://github.com/ThinamXx/CIFAR10__Recognition)
- [Dog Breed Identification: ImageNet](https://github.com/ThinamXx/DogBreedClassification)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20162.PNG)

**Day163 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about subsampling, extracting central target words and context words, the maximum context window size, the Penn Treebank dataset, pretraining word embeddings, and a few more topics related to the same. I have also spent some time reading the book **Speech and Language Processing**. Here, I have learned about regularization and overfitting, Manhattan distance, lasso and ridge regression, multinomial logistic regression, features in MLR, learning in MLR, interpreting models, deriving the gradient equation, and a few more topics related to the same. I have completed working on the Dog Breed Identification (ImageNet) notebook. I have presented the implementation of extracting central target words and context words using PyTorch here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the books mentioned below. Excited about the days ahead!

**Books and Links**:
- [Dive into Deep Learning](https://d2l.ai/index.html)
- [Speech and Language Processing](https://web.stanford.edu/~jurafsky/slp3/)
- [Object Recognition on Images: CIFAR10](https://github.com/ThinamXx/CIFAR10__Recognition)
- [Dog Breed Identification: ImageNet](https://github.com/ThinamXx/DogBreedClassification)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20163.PNG)

**Day164 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about subsampling and negative sampling, word embedding and word2vec, probability, reading into batches, concatenation and padding, random minibatches, and a few more topics related to the same. I have also spent some time reading the book **Speech and Language Processing**. Here, I have learned about vector semantics and embeddings, lexical semantics, lemmas and senses, word sense disambiguation, word similarity, the principle of contrast, representation learning, synonymy, and a few more topics related to the same. I have presented the implementation of negative sampling using PyTorch here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the books mentioned below. Excited about the days ahead!

**Books**:
- [Dive into Deep Learning](https://d2l.ai/index.html)
- [Speech and Language Processing](https://web.stanford.edu/~jurafsky/slp3/)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20164.PNG)
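The subsampling discussed on Days 162-163 keeps a token with probability min(1, sqrt(t / f(w))), where f(w) is the word's relative frequency, so very frequent words are discarded more often. A minimal sketch on toy data; the sentences and the threshold t are illustrative.

```python
import collections
import random

sentences = [["the", "cat", "sat", "on", "the", "mat"]] * 1000
counter = collections.Counter(tok for line in sentences for tok in line)
num_tokens = sum(counter.values())
t = 1e-4  # subsampling threshold (illustrative)

def keep(token):
    """Keep a token with probability min(1, sqrt(t / relative frequency))."""
    freq = counter[token] / num_tokens
    return random.random() < min(1.0, (t / freq) ** 0.5)

subsampled = [[tok for tok in line if keep(tok)] for line in sentences]
print(sum(tok == "the" for line in subsampled for tok in line))  # far fewer "the"
```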
**Day165 of 300DaysOfData**

**Subsampling**: Subsampling is a method that reduces the data size by selecting a subset of the original data, where the subset is specified by choosing a parameter. Subsampling attempts to minimize the impact of high-frequency words on the training of a word embedding model.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about word embeddings, batches, the loss function and padding, center and context words, negative sampling, the DataLoader instance, the vocabulary, subsampling, data iterations, mask variables, and a few more topics related to the same. I have presented the implementation of reading batches and a function for loading the PTB dataset using PyTorch here in the snapshots. I hope you will gain some insights and spend some time learning the topics from the book mentioned below. Excited about the days ahead!

**Book**: [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20165a.PNG)
![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20165b.PNG)

**Day166 of 300DaysOfData**

**Word Embedding**: Word embedding is a term used for the representation of words for text analysis, typically in the form of a real-valued vector that encodes the meaning of a word such that words closer in the vector space are expected to be similar in meaning.

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about word embeddings, word2vec, the skip-gram model, the embedding layer, word vectors, the skip-gram forward calculation, batch matrix multiplication, the binary cross-entropy loss function, negative sampling, mask variables and padding, initializing model parameters, and a few more topics related to the same. I have presented the implementation of the embedding layer, the skip-gram forward calculation, and the binary cross-entropy loss function using PyTorch here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the book mentioned below. Excited about the days ahead!

**Book**: [Dive into Deep Learning](https://d2l.ai/index.html)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20166.PNG)

**Day167 of 300DaysOfData**

On my journey of Machine Learning and Deep Learning, today I have read and implemented from the book **Dive into Deep Learning**. Here, I have learned about training the skip-gram model, the loss function, applying the word embedding model, negative sampling, word embedding with Global Vectors (GloVe), conditional probability, the GloVe model, the cross-entropy loss function, and a few more topics related to the same. I have also spent some time reading the book **Speech and Language Processing**. Here, I have learned about word relatedness, semantic fields, semantic frames and roles, connotation and sentiment, vector semantics, embeddings, and a few more topics related to the same. I have presented the implementation of training the word embedding model using PyTorch here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the books mentioned below. Excited about the days ahead!

**Books**:
- [Dive into Deep Learning](https://d2l.ai/index.html)
- [Speech and Language Processing](https://web.stanford.edu/~jurafsky/slp3/)

![Image](https://github.com/ThinamXx/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20167.PNG)
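The skip-gram forward calculation from Day166 embeds the center words and the (context + negative) words separately, then scores each pair with a batched dot product via `bmm`. A minimal sketch; the vocabulary and embedding sizes are illustrative.

```python
import torch
from torch import nn

vocab_size, embed_size = 20, 4
embed_v = nn.Embedding(vocab_size, embed_size)  # center-word embeddings
embed_u = nn.Embedding(vocab_size, embed_size)  # context-word embeddings

def skip_gram(center, contexts_negatives):
    v = embed_v(center)                        # (batch, 1, embed_size)
    u = embed_u(contexts_negatives)            # (batch, max_len, embed_size)
    return torch.bmm(v, u.permute(0, 2, 1))    # (batch, 1, max_len) scores

center = torch.tensor([[1], [2]])
ctx_neg = torch.tensor([[2, 3, 4], [5, 6, 7]])
print(skip_gram(center, ctx_neg).shape)        # torch.Size([2, 1, 3])
```

These scores are then fed, together with mask variables for padding, into the binary cross-entropy loss with the negative-sampling labels.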
day168 of 300daysofdata on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about subword embedding fast text and byte pair encoding finding synonyms and analogies pretrained word vectors token embedding central words and context words and a few more topics related to the same from here i have also spent some time reading the book speech and language processing here i have learned about words and vectors vectors and documents term document matrices information retrieval row vector and context matrix and a few more topics related to the same from here i have presented the implementation of defining the token embedding class using pytorch here in the snapshot i hope you will gain some insights i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book dive into deep learning https d2l ai index html
speech and language processing https web stanford edu jurafsky slp3
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20168 png

day169 of 300daysofdata on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about finding synonyms and analogies word embedding model and word2vec applying pretrained word vectors cosine similarity and a few more topics related to the same from here i have also spent some time reading the book speech and language processing here i have read about cosine for measuring similarity dot and inner products weighing terms in the vector term frequency inverse document frequency or tfidf collection frequency applications of tfidf vector model and a few more topics related to the same from here i have presented the implementation of cosine similarity and finding synonyms and analogies using pytorch here in the snapshot i hope you will gain some insights i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book dive into deep learning https d2l ai index html
speech and language processing https web stanford edu jurafsky slp3
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20169 png

day170 of 300daysofdata bidirectional encoder representations from transformers elmo encodes context bidirectionally but uses task specific architectures and gpt is task agnostic but encodes context left to right bert encodes context bidirectionally and requires minimal architecture changes for a wide range of nlp tasks on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about bert architecture from context independent to context sensitive word embedding model and word2vec from task specific to task agnostic embeddings from language models or elmo architecture input representations token segment and positional embedding and learnable positional embedding and a few more topics related to the same from here i have presented the implementation of bert input representations and bert encoder class using pytorch here in the snapshot i hope you will gain some insights i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book dive into deep learning https d2l ai index html
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20170 png
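the bert input representation described above is just the sum of three embeddings a hedged sketch with made up sizes is given below the positional embedding is a learnable parameter rather than a fixed encoding

```python
import torch
from torch import nn

# token + segment + learnable positional embeddings summed together;
# vocab_size, num_hiddens and max_len are illustrative values only
vocab_size, num_hiddens, max_len = 10000, 768, 512
token_embedding = nn.Embedding(vocab_size, num_hiddens)
segment_embedding = nn.Embedding(2, num_hiddens)   # sentence a vs sentence b
pos_embedding = nn.Parameter(torch.randn(1, max_len, num_hiddens))

tokens = torch.randint(0, vocab_size, (2, 8))      # (batch, seq_len)
segments = torch.tensor([[0] * 4 + [1] * 4] * 2)   # first half a, second half b
x = token_embedding(tokens) + segment_embedding(segments)
x = x + pos_embedding[:, :x.shape[1], :]
print(x.shape)  # torch.Size([2, 8, 768])
```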
day171 of 300daysofdata on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about bert encoder class pretraining tasks masked language modeling multi layer perceptron forward inference bert input sequences bidirectional context encoding and a few more topics related to the same from here i have also spent some time reading the book speech and language processing here i have learned about pointwise mutual information or pmi laplace smoothing word2vec skip gram with negative sampling or sgns the classifier logistic and sigmoid function cosine similarity and dot product and a few more topics related to the same from here i have presented the implementation of masked language modeling and bert encoder using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book dive into deep learning https d2l ai index html
speech and language processing https web stanford edu jurafsky slp3
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20171 png

day172 of 300daysofdata bidirectional encoder representations from transformers elmo encodes context bidirectionally but uses task specific architectures and gpt is task agnostic but encodes context left to right bert encodes context bidirectionally and requires minimal architecture changes for a wide range of nlp tasks the embeddings are the sum of the token segment and positional embeddings on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about bidirectional encoder representations from transformers or bert architecture next sentence prediction model cross entropy loss function mlp bert model masked language modeling bert encoder pretraining bert model and a few more topics related to the same from here i have presented the implementation of next sentence prediction and bert model using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book dive into deep learning https d2l ai index html
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20172 png
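the next sentence prediction model mentioned above is only a linear classifier over the encoded cls token trained with cross entropy a minimal sketch with dummy inputs follows

```python
import torch
from torch import nn

# next sentence prediction head: scores is-next vs is-not-next from the
# hidden state of the '<cls>' token
class NextSentencePred(nn.Module):
    def __init__(self, num_inputs):
        super().__init__()
        self.output = nn.Linear(num_inputs, 2)

    def forward(self, x):
        return self.output(x)

nsp = NextSentencePred(768)
cls_states = torch.randn(2, 768)        # dummy '<cls>' representations
logits = nsp(cls_states)
labels = torch.tensor([1, 0])           # dummy is-next labels
print(nn.CrossEntropyLoss()(logits, labels).item())
```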
day173 of 300daysofdata on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about pretraining bert model and dataset defining helper functions for pretraining tasks generating the next sentence prediction task generating the masked language modeling task sequence tokens and a few more topics related to the same from here i have also spent some time reading the book speech and language processing here i have read about learning skip gram embeddings binary classifier target and context embedding visualizing embeddings semantic properties of embeddings and a few more topics related to the same from here i have presented the implementation of generating the next sentence prediction task and generating the masked language modeling task using pytorch here in the snapshots i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book dive into deep learning https d2l ai index html
speech and language processing https web stanford edu jurafsky slp3
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20173a png
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20173b png

day174 of 300daysofdata on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about pretraining bert model next sentence prediction task and masked language modeling task transforming text into the pretraining dataset and a few more topics related to the same from here i have also learned about scorer and example instances of the spacy model long short term memory neural networks smiles vectorizer feed forward neural networks and a few more topics related to the same i have presented the implementation of transforming text into the pretraining dataset using pytorch here in the snapshots i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book dive into deep learning https d2l ai index html
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20174a png
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20174b png

day175 of 300daysofdata on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about pretraining bert model cross entropy loss function adam optimization function zeroing gradients backpropagation and optimization masked language modeling loss and next sentence prediction loss and a few more topics related to the same from here i have presented the implementation of pretraining the bert model getting the loss from the bert model and training a neural network model using pytorch here in the snapshots i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book dive into deep learning https d2l ai index html
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20175a png
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20175b png
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20175c png

day176 of 300daysofdata on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about natural language processing applications nlp architecture and pretraining sentiment analysis and dataset text classification tokenization and vocabulary padding tokens to the same length and a few more topics related to the same from here apart from that i have also learned about named entity recognition frequency distribution nltk extending lists and a few more topics related to the same from here i have presented the implementation of reading the dataset tokenization and vocabulary and padding to a fixed length using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book dive into deep learning https d2l ai index html
sentiment analysis dataset notebook https github com thinamxx neuralnetworks sentimentanalysis blob master pytorch sentiment 20analysis 20dataset ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20176 png
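the padding tokens to the same length step described above can be sketched as a small helper that truncates long sequences and pads short ones the token ids below are dummies

```python
import torch

# cut every sequence to 'num_steps' tokens or pad it with 'padding_token'
def truncate_pad(line, num_steps, padding_token):
    if len(line) > num_steps:
        return line[:num_steps]                               # truncate
    return line + [padding_token] * (num_steps - len(line))   # pad

tokens = [[1, 2, 3], [4, 5, 6, 7, 8, 9]]
features = torch.tensor([truncate_pad(t, 5, 0) for t in tokens])
print(features)  # tensor([[1, 2, 3, 0, 0], [4, 5, 6, 7, 8]])
```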
day177 of 300daysofdata sentiment analysis sentiment analysis is the use of natural language processing text analysis computational linguistics and biometrics to systematically identify extract quantify and study affective states and subjective information it is widely applied to voice of the customer materials such as reviews and survey responses online and social media and healthcare materials for applications that range from marketing to customer service to clinical medicine on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about creating data iterations tokenization and vocabulary truncating and padding recurrent neural network model and sentiment analysis pretrained word vectors and glove bidirectional lstm and embedding layer linear layer and decoding encoding and sequence data xavier initialization and a few more topics related to the same from here i have presented the implementation of the bidirectional recurrent neural network model using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book dive into deep learning https d2l ai index html
sentiment analysis dataset notebook https github com thinamxx neuralnetworks sentimentanalysis blob master pytorch sentiment 20analysis 20dataset ipynb
sentiment analysis with rnn https github com thinamxx neuralnetworks sentimentanalysis blob master pytorch sentiment 20analysis 20rnn ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20177 png

day178 of 300daysofdata sentiment analysis sentiment analysis is the use of natural language processing text analysis computational linguistics and biometrics to systematically identify extract quantify and study affective states and subjective information it is widely applied to voice of the customer materials such as reviews and survey responses online and social media and healthcare materials for applications that range from marketing to customer service to clinical medicine on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about word vectors and vocabulary training and evaluating the bidirectional rnn model sentiment analysis and one dimensional convolutional neural networks one dimensional cross correlation operation max over time pooling layer the text cnn model relu activation function and dropout layer and a few more topics related to the same from here i have presented the implementation of text convolutional neural networks using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book dive into deep learning https d2l ai index html
sentiment analysis dataset notebook https github com thinamxx neuralnetworks sentimentanalysis blob master pytorch sentiment 20analysis 20dataset ipynb
sentiment analysis with rnn https github com thinamxx neuralnetworks sentimentanalysis blob master pytorch sentiment 20analysis 20rnn ipynb
sentiment analysis with cnn https github com thinamxx neuralnetworks sentimentanalysis blob master pytorch sentiment 20analysis 20cnn ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20178 png
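the text cnn model described above embeds the tokens runs one dimensional convolutions of several kernel sizes and applies max over time pooling before a linear decoder the tiny class below is my own simplified sketch of that idea not the book s exact model

```python
import torch
from torch import nn

class TinyTextCNN(nn.Module):
    def __init__(self, vocab_size, embed_size, kernel_sizes, num_channels):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_size)
        # one conv1d per kernel size, as in the textcnn architecture
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_size, c, k)
            for c, k in zip(num_channels, kernel_sizes))
        self.pool = nn.AdaptiveMaxPool1d(1)      # max-over-time pooling
        self.dropout = nn.Dropout(0.5)
        self.decoder = nn.Linear(sum(num_channels), 2)

    def forward(self, inputs):
        x = self.embedding(inputs).permute(0, 2, 1)   # (batch, embed, seq)
        x = torch.cat([self.pool(torch.relu(conv(x))).squeeze(-1)
                       for conv in self.convs], dim=1)
        return self.decoder(self.dropout(x))

net = TinyTextCNN(vocab_size=100, embed_size=8,
                  kernel_sizes=[3, 4, 5], num_channels=[4, 4, 4])
print(net(torch.randint(0, 100, (2, 10))).shape)  # torch.Size([2, 2])
```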
day179 of 300daysofdata natural language inference natural language inference studies whether a hypothesis can be inferred from a premise where both are text sequences it determines the logical relationship between a pair of text sequences on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about natural language inference and dataset premises and hypotheses entailment contradiction and neutral the stanford natural language inference dataset reading the snli dataset and a few more topics related to the same from here i have presented the implementation of reading the snli dataset using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book dive into deep learning https d2l ai index html
sentiment analysis dataset notebook https github com thinamxx neuralnetworks sentimentanalysis blob master pytorch sentiment 20analysis 20dataset ipynb
sentiment analysis with rnn https github com thinamxx neuralnetworks sentimentanalysis blob master pytorch sentiment 20analysis 20rnn ipynb
sentiment analysis with cnn https github com thinamxx neuralnetworks sentimentanalysis blob master pytorch sentiment 20analysis 20cnn ipynb
natural language inference dataset https github com thinamxx natural language inference blob main naturallanguage 20inference 20data ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20179 png

day180 of 300daysofdata natural language inference natural language inference studies whether a hypothesis can be inferred from a premise where both are text sequences it determines the logical relationship between a pair of text sequences on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about natural language inference and the snli dataset premises hypotheses and labels vocabulary padding and truncation of sequences dataset and dataloader module and a few more topics related to the same from here apart from that i have also read about confusion matrix and classification reports frequency distribution and word cloud of text data i have presented the implementation of loading the snli dataset using pytorch here in the snapshots i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book dive into deep learning https d2l ai index html
sentiment analysis dataset notebook https github com thinamxx neuralnetworks sentimentanalysis blob master pytorch sentiment 20analysis 20dataset ipynb
sentiment analysis with rnn https github com thinamxx neuralnetworks sentimentanalysis blob master pytorch sentiment 20analysis 20rnn ipynb
sentiment analysis with cnn https github com thinamxx neuralnetworks sentimentanalysis blob master pytorch sentiment 20analysis 20cnn ipynb
natural language inference dataset https github com thinamxx natural language inference blob main naturallanguage 20inference 20data ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20180 png
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20180a png

day181 of 300daysofdata natural language inference natural language inference studies whether a hypothesis can be inferred from a premise where both are text sequences it determines the logical
relationship between a pair of text sequences on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about natural language inference using the attention model multi layer perceptron or mlp with attention mechanisms alignment of premises and hypotheses word embeddings and attention weights and a few more topics related to the same from here i have presented the implementation of the mlp and attention mechanism using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book dive into deep learning https d2l ai index html
natural language inference dataset https github com thinamxx natural language inference blob main naturallanguage 20inference 20data ipynb
natural language inference https github com thinamxx natural language inference blob main nl 20inference 20attention ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20181 png

day182 of 300daysofdata comparing and aggregating class the comparing class compares a word in one sequence with the other sequence that is softly aligned with the word the aggregating class aggregates the two sets of comparison vectors to infer the logical relationship it feeds the concatenation of both summarization results into an mlp to obtain the classification result of the logical relationship on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about comparing word sequences soft alignment multi layer perceptron or mlp classifier aggregating comparison vectors linear layer and concatenation decomposable attention model embedding layer and a few more topics related to the same from here i have presented the implementation of the comparing class aggregating class and decomposable attention model using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book dive into deep learning https d2l ai index html
natural language inference dataset https github com thinamxx natural language inference blob main naturallanguage 20inference 20data ipynb
natural language inference https github com thinamxx natural language inference blob main nl 20inference 20attention ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20182 png
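the soft alignment mentioned above can be sketched directly attention weights come from the dot products of mlp transformed tokens and each sequence is summarized as a weighted average of the other the toy mlp f and the sizes below are illustrative

```python
import torch
from torch import nn

def soft_align(a, b, f):
    # a: (batch, len_a, dim) premises, b: (batch, len_b, dim) hypotheses
    e = torch.bmm(f(a), f(b).permute(0, 2, 1))      # (batch, len_a, len_b)
    beta = torch.bmm(torch.softmax(e, dim=-1), b)   # b softly aligned to a
    alpha = torch.bmm(torch.softmax(e, dim=1).permute(0, 2, 1), a)
    return beta, alpha

f = nn.Sequential(nn.Linear(8, 16), nn.ReLU())      # toy mlp
a, b = torch.randn(2, 5, 8), torch.randn(2, 7, 8)
beta, alpha = soft_align(a, b, f)
print(beta.shape, alpha.shape)  # [2, 5, 8] and [2, 7, 8]
```

the comparing and aggregating steps then concatenate each token with its aligned summary feed the result through further mlps and classify the concatenated summaries with a linear layer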
day183 of 300daysofdata comparing and aggregating class the comparing class compares a word in one sequence with the other sequence that is softly aligned with the word the aggregating class aggregates the two sets of comparison vectors to infer the logical relationship it feeds the concatenation of both summarization results into an mlp to obtain the classification result of the logical relationship on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about the decomposable attention model embedding layer and linear layer training and evaluating the attention model natural language inference entailment contradiction and neutral pretrained glove embedding snli dataset adam optimizer and cross entropy loss function premises and hypotheses and a few more topics related to the same from here i have presented the implementation of training and evaluating the attention model using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book dive into deep learning https d2l ai index html
natural language inference dataset https github com thinamxx natural language inference blob main naturallanguage 20inference 20data ipynb
natural language inference attention https github com thinamxx natural language inference blob main nl 20inference 20attention ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20183 png

day184 of 300daysofdata bert model notes bert requires minimal architecture changes for sequence level and token level nlp applications such as single text classification text pair classification or regression and text tagging on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about fine tuning bert for sequence level and token level applications single text classification text pair classification or regression text tagging question answering natural language inference and pretrained bert model loading the pretrained bert model and parameters semantic textual similarity pos tagging and a few more topics related to the same from here i have presented the implementation of loading the pretrained bert model and parameters using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book dive into deep learning https d2l ai index html
natural language inference attention https github com thinamxx natural language inference blob main nl 20inference 20attention ipynb
natural language inference bert https github com thinamxx natural language inference blob main nl 20inference 20bert ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20184 png

day185 of 300daysofdata bert model notes bert requires minimal architecture changes for sequence level and token level nlp applications such as single text classification text pair classification or regression and text tagging on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about loading the pretrained bert model and parameters the dataset for fine tuning the bert model premise hypothesis and input sequence tokenization and vocabulary truncating and padding tokens natural language inference and a few more topics related to the same from here i have presented the implementation of the dataset for fine tuning the bert model and generating training and test examples using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book dive into deep learning https d2l ai index html
natural language inference attention https github com thinamxx natural language inference blob main nl 20inference 20attention ipynb
natural language inference bert https github com thinamxx natural language inference blob main nl 20inference 20bert ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20185a png
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20185b png
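the premise hypothesis and input sequence construction described above can be sketched as follows the helper name is my own and the special tokens follow the usual bert convention of a cls token segment 0 for the first sequence and segment 1 for the second

```python
# build '<cls> premise <sep> hypothesis <sep>' plus its segment ids;
# helper name is hypothetical, tokens are plain strings for illustration
def pair_to_bert_input(premise_tokens, hypothesis_tokens):
    tokens = ['<cls>'] + premise_tokens + ['<sep>'] \
             + hypothesis_tokens + ['<sep>']
    segments = [0] * (len(premise_tokens) + 2) \
               + [1] * (len(hypothesis_tokens) + 1)
    return tokens, segments

print(pair_to_bert_input(['a', 'man', 'is', 'eating'], ['a', 'person', 'eats']))
```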
day186 of 300daysofdata generative adversarial networks generative adversarial networks consist of two deep networks a generator and a discriminator the generator generates images as close to the true images as possible to fool the discriminator by maximizing the cross entropy loss while the discriminator tries to distinguish the generated images from the true images by minimizing the cross entropy loss on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about generative adversarial networks generator and discriminator networks updating the discriminator and a few more topics related to the same from here i have also read about recommender systems collaborative filtering explicit and implicit feedback recommendation tasks and a few more topics related to the same i have presented a simple implementation of generator and discriminator networks and optimization using pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book dive into deep learning https d2l ai index html
natural language inference attention https github com thinamxx natural language inference blob main nl 20inference 20attention ipynb
natural language inference bert https github com thinamxx natural language inference blob main nl 20inference 20bert ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20186 png

day187 of 300daysofdata generative adversarial networks generative adversarial networks consist of two deep networks a generator and a discriminator the generator generates images as close to the true images as possible to fool the discriminator by maximizing the cross entropy loss while the discriminator tries to distinguish the generated images from the true images by minimizing the cross entropy loss on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about generator and discriminator networks binary cross entropy loss function adam optimizer and normalized tensors gaussian distribution real and generated data and a few more topics related to the same from here i have presented a simple implementation of updating the generator and the training function using pytorch here in the snapshots i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book dive into deep learning https d2l ai index html
natural language inference attention https github com thinamxx natural language inference blob main nl 20inference 20attention ipynb
natural language inference bert https github com thinamxx natural language inference blob main nl 20inference 20bert ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20187a png
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20187b png
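the two update steps described above can be written as a pair of small functions the toy networks optimizers and data below are placeholders just to make the sketch runnable

```python
import torch
from torch import nn

net_G = nn.Sequential(nn.Linear(2, 8), nn.Tanh(), nn.Linear(8, 2))
net_D = nn.Sequential(nn.Linear(2, 8), nn.Tanh(), nn.Linear(8, 1))
trainer_G = torch.optim.Adam(net_G.parameters(), lr=0.005)
trainer_D = torch.optim.Adam(net_D.parameters(), lr=0.05)
loss = nn.BCEWithLogitsLoss()

def update_D(X, Z):
    # discriminator: push real data towards 1 and generated data towards 0
    ones, zeros = torch.ones(X.shape[0], 1), torch.zeros(Z.shape[0], 1)
    trainer_D.zero_grad()
    l = loss(net_D(X), ones) + loss(net_D(net_G(Z).detach()), zeros)
    l.backward()
    trainer_D.step()
    return l

def update_G(Z):
    # generator: make the discriminator output 1 on generated data
    ones = torch.ones(Z.shape[0], 1)
    trainer_G.zero_grad()
    l = loss(net_D(net_G(Z)), ones)
    l.backward()
    trainer_G.step()
    return l

X, Z = torch.randn(16, 2), torch.randn(16, 2)
print(update_D(X, Z).item(), update_G(Z).item())
```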
day188 of 300daysofdata generative adversarial networks generative adversarial networks consist of two deep networks a generator and a discriminator the generator generates images as close to the true images as possible to fool the discriminator by maximizing the cross entropy loss while the discriminator tries to distinguish the generated images from the true images by minimizing the cross entropy loss on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about deep convolutional generative adversarial networks the pokemon dataset resizing and normalization dataloader the generator block module transposed convolution layer batch normalization layer relu activation function and a few more topics related to the same from here i have also read about inter quartile range mean absolute deviation box plots density plots frequency tables and a few more topics related to the same i have presented the implementation of the generator block and pokemon dataset using pytorch here in the snapshots i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book dive into deep learning https d2l ai index html
deep convolutional gan https github com thinamxx gan blob main deep 20gan ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20188a png
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20188b png

day189 of 300daysofdata generative adversarial networks generative adversarial networks consist of two deep networks a generator and a discriminator the generator generates images as close to the true images as possible to fool the discriminator by maximizing the cross entropy loss while the discriminator tries to distinguish the generated images from the true images by minimizing the cross entropy loss on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about deep convolutional generative adversarial networks the generator and the discriminator networks leaky relu activation function and the dying relu problem batch normalization convolutional layer stride and padding and a few more topics related to the same from here i have presented the implementation of the discriminator block and the generator block using pytorch here in the snapshots i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book dive into deep learning https d2l ai index html
deep convolutional gan https github com thinamxx gan blob main deep 20gan ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20189 png
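the generator and discriminator blocks described above pair a transposed convolution with batch normalization and relu for upsampling and a strided convolution with batch normalization and leaky relu for downsampling a sketch close to the book s blocks follows

```python
import torch
from torch import nn

class G_block(nn.Module):
    # upsamples: the transposed convolution doubles the spatial size here
    def __init__(self, out_channels, in_channels=3,
                 kernel_size=4, strides=2, padding=1):
        super().__init__()
        self.conv2d_trans = nn.ConvTranspose2d(
            in_channels, out_channels, kernel_size, strides, padding,
            bias=False)
        self.batch_norm = nn.BatchNorm2d(out_channels)
        self.activation = nn.ReLU()

    def forward(self, x):
        return self.activation(self.batch_norm(self.conv2d_trans(x)))

class D_block(nn.Module):
    # downsamples: the strided convolution halves the spatial size here
    def __init__(self, out_channels, in_channels=3,
                 kernel_size=4, strides=2, padding=1, alpha=0.2):
        super().__init__()
        self.conv2d = nn.Conv2d(in_channels, out_channels, kernel_size,
                                strides, padding, bias=False)
        self.batch_norm = nn.BatchNorm2d(out_channels)
        self.activation = nn.LeakyReLU(alpha)  # avoids the dying relu problem

    def forward(self, x):
        return self.activation(self.batch_norm(self.conv2d(x)))

x = torch.zeros(1, 3, 16, 16)
print(G_block(20)(x).shape)  # torch.Size([1, 20, 32, 32])
print(D_block(20)(x).shape)  # torch.Size([1, 20, 8, 8])
```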
day190 of 300daysofdata generative adversarial networks generative adversarial networks consist of two deep networks a generator and a discriminator the generator generates images as close to the true images as possible to fool the discriminator by maximizing the cross entropy loss while the discriminator tries to distinguish the generated images from the true images by minimizing the cross entropy loss on my journey of machine learning and deep learning today i have read and implemented from the book dive into deep learning here i have learned about deep convolutional generative adversarial networks the generator and the discriminator blocks cross entropy loss function adam optimization function and a few more topics related to the same from here i have presented the implementation of training the generator and discriminator networks using pytorch here in the snapshots i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book dive into deep learning https d2l ai index html
deep convolutional gan https github com thinamxx gan blob main deep 20gan ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20190a png
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20190b png

day191 of 300daysofdata on my journey of machine learning and deep learning today i have started reading and implementing from the book deep learning for coders with fastai and pytorch here i have read about deep learning in practice areas of deep learning a brief history of neural networks fastai and jupyter notebooks cat and dog classification image loaders pretrained models resnet and cnns error rate and a few more topics related to the same from here i have presented the implementation of cat and dog classification using fastai here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book deep learning for coders with fastai and pytorch
fastai introduction notebook https github com thinamxx fastai blob main 1 20introduction ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20191 png

day192 of 300daysofdata transfer learning transfer learning is defined as the process of using a pretrained model for a task different from what it was originally trained for fine tuning is a transfer learning technique that updates the parameters of a pretrained model by training for additional epochs using a different task from that used for pretraining on my journey of machine learning and deep learning today i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about machine learning and weight assignment neural networks and stochastic gradient descent limitations inherent to ml image recognition classification and regression overfitting and validation set transfer learning semantic segmentation sentiment classification data loaders and a few more topics related to the same from here i have presented the implementation of semantic segmentation and sentiment classification using fastai here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book deep learning for coders with fastai and pytorch
fastai introduction notebook https github com thinamxx fastai blob main 1 20introduction ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20192 png
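the fine tuning workflow described above fits in a few lines of fastai the snippet below follows the book s cat versus dog example and downloads the pets dataset on first run

```python
from fastai.vision.all import *

path = untar_data(URLs.PETS) / 'images'

def is_cat(x):
    return x[0].isupper()   # in this dataset cat filenames are capitalized

dls = ImageDataLoaders.from_name_func(
    path, get_image_files(path), valid_pct=0.2, seed=42,
    label_func=is_cat, item_tfms=Resize(224))
learn = cnn_learner(dls, resnet34, metrics=error_rate)
learn.fine_tune(1)   # train the new head, then unfreeze and train further
```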
day193 of 300daysofdata transfer learning transfer learning is defined as the process of using a pretrained model for a task different from what it was originally trained for fine tuning is a transfer learning technique that updates the parameters of a pretrained model by training for additional epochs using a different task from that used for pretraining on my journey of machine learning and deep learning today i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about tabular data and classification tabular data loaders categorical and continuous data recommendation system and collaborative filtering datasets for models validation sets and test sets judgement in test sets and a few more topics related to the same from here i have presented the implementation of tabular classification and the recommendation system model using fastai here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book deep learning for coders with fastai and pytorch
fastai introduction notebook https github com thinamxx fastai blob main 1 20introduction ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20193 png

day194 of 300daysofdata the drivetrain approach it can be stated as start with considering your objective then think about what actions you can take to meet that objective and what data you have or can acquire that can help and then build a model that you can use to determine the best actions to take to get the best results in terms of your objective on my journey of machine learning and deep learning today i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about the practice of deep learning the state of dl computer vision text and nlp combining text and images tabular data and recommendation systems the drivetrain approach gathering data and duck duck go questionnaire and a few more topics related to the same from here i have presented the implementation of gathering data for object detection using duck duck go and fastai here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book deep learning for coders with fastai and pytorch
fastai image detection https github com thinamxx fastai blob main 2 20model 20production beardetector ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20194 png

day195 of 300daysofdata the drivetrain approach it can be stated as start with considering your objective then think about what actions you can take to meet that objective and what data you have or can acquire that can help and then build a model that you can use to determine the best actions to take to get the best results in terms of your objective on my journey of machine learning and deep learning today i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about fastai dependencies and functions biased datasets data to data loaders the data block api dependent and independent variables random splitting image transformations and a few more topics related to the same from here i have presented the implementation of gathering data and initializing data loaders using duck duck go and fastai here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book deep learning for coders with fastai and pytorch
fastai image detection https github com thinamxx fastai blob main 2 20model 20production beardetector ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20195 png
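the data block api mentioned above declares the independent and dependent variables how to get and split the items and how to label them the sketch below follows the book s bear detector and assumes path points at a folder of downloaded images with one subfolder per class

```python
from fastai.vision.all import *

bears = DataBlock(
    blocks=(ImageBlock, CategoryBlock),   # independent / dependent variables
    get_items=get_image_files,
    splitter=RandomSplitter(valid_pct=0.2, seed=42),   # random 80/20 split
    get_y=parent_label,                   # label = name of the parent folder
    item_tfms=Resize(128))
dls = bears.dataloaders(path)             # 'path' is assumed to exist
```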
day196 of 300daysofdata data augmentation data augmentation refers to creating random variations of the input data such that they appear different but do not change the meaning of the data randomresizedcrop is a specific example of data augmentation on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about data loaders image block resizing squishing and stretching images padding images data augmentation image transformations training the model and error rate random resizing and cropping and a few more topics related to the same from here i have presented the implementation of data loaders data augmentation and training the model using fastai here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book deep learning for coders with fastai and pytorch
fastai image detection https github com thinamxx fastai blob main 2 20model 20production beardetector ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20196 png

day197 of 300daysofdata data augmentation data augmentation refers to creating random variations of the input data such that they appear different but do not change the meaning of the data randomresizedcrop is a specific example of data augmentation on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about training a pretrained model data augmentation and transformations classification interpretation and confusion matrix cleaning the dataset inference model and parameters notebooks and widgets and a few more topics related to the same from here i have presented the implementation of classification interpretation cleaning the dataset inference model and parameters using fastai here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book deep learning for coders with fastai and pytorch
fastai image detection https github com thinamxx fastai blob main 2 20model 20production beardetector ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20197 png

day198 of 300daysofdata data ethics ethics refers to well founded standards of right and wrong that prescribe what humans should do it is the study and development of one s ethical standards recourse process feedback loops and bias are key examples for data ethics on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about data ethics bugs and recourse feedback loops bias integrating ml with product design training a digit classifier pixels and computer vision tenacity and deep learning pixel similarity list comprehensions and a few more topics related to the same from here i have presented the simple implementation of pixels and computer vision using fastai here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book deep learning for coders with fastai and pytorch
fastai training classifier https github com thinamxx fastai blob main 3 20training 20a 20classifier digitclassifier ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20198 png

day199 of 300daysofdata l1 and l2 norm taking the mean of the absolute value of differences is called the mean absolute difference or l1 norm taking the mean of the square of differences and then taking the square root is called the root mean squared error or l2 norm on my journey of machine learning and deep learning i have
read and implemented from the book deep learning for coders with fastai and pytorch here i have read about rank of tensors mean absolute difference or l1 norm and root mean squared error or l2 norm numpy arrays and pytorch tensors computing metrics using broadcasting and a few more topics related to the same from here i have presented the simple implementation of arrays and tensors l1 and l2 norm using fastai here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book deep learning for coders with fastai and pytorch
fastai training classifier https github com thinamxx fastai blob main 3 20training 20a 20classifier digitclassifier ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20199 png

day200 of 300daysofdata l1 and l2 norm taking the mean of the absolute value of differences is called the mean absolute difference or l1 norm taking the mean of the square of differences and then taking the square root is called the root mean squared error or l2 norm on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about computing metrics using broadcasting mean absolute error stochastic gradient descent initializing parameters loss function calculating gradients backpropagation and derivatives learning rate optimization and a few more topics related to the same from here i have presented the simple implementation of stochastic gradient descent using fastai here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book deep learning for coders with fastai and pytorch
fastai training classifier https github com thinamxx fastai blob main 3 20training 20a 20classifier digitclassifier ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20200 png

day201 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about the gradient descent process initializing parameters calculating predictions and inspecting calculating loss and mse calculating gradients and backpropagation stepping the weights and updating parameters repeating the process stopping the process and a few more topics related to the same from here i have presented the implementation of the gradient descent process using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book deep learning for coders with fastai and pytorch
fastai training classifier https github com thinamxx fastai blob main 3 20training 20a 20classifier digitclassifier ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20201 png
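the gradient descent process described above initialize predict measure the loss compute gradients step and repeat can be sketched end to end the quadratic time versus speed example below follows the book

```python
import torch

time = torch.arange(0, 20).float()
speed = torch.randn(20) * 3 + 0.75 * (time - 9.5) ** 2 + 1

def f(t, params):
    a, b, c = params
    return a * (t ** 2) + b * t + c

def mse(preds, targets):
    return ((preds - targets) ** 2).mean()

params = torch.randn(3).requires_grad_()   # 1. initialize the parameters
lr = 1e-5
for _ in range(10):
    preds = f(time, params)                # 2. calculate the predictions
    loss = mse(preds, speed)               # 3. calculate the loss
    loss.backward()                        # 4. calculate the gradients
    params.data -= lr * params.grad.data   # 5. step the weights
    params.grad = None                     #    zero gradients for next pass
print(loss.item())
```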
day202 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about the mnist loss function matrices and vectors independent variables weights and biases parameters matrix multiplication and dataset class gradient descent process and learning rate activation function and a few more topics related to the same from here i have presented the implementation of the dataset class and matrix multiplication using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book deep learning for coders with fastai and pytorch
fastai training classifier https github com thinamxx fastai blob main 3 20training 20a 20classifier digitclassifier ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20202 png

day203 of 300daysofdata accuracy and loss function the key difference between a metric such as accuracy and the loss function is that the loss is to drive automated learning and the metric is to drive human understanding the loss must be a function with a meaningful derivative and metrics focus on the performance of the model on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about matrix multiplication activation function loss function gradients and slope sigmoid function accuracy metrics and understanding and a few more topics related to the same from here i have presented the implementation of the loss function and sigmoid using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book deep learning for coders with fastai and pytorch
fastai training classifier https github com thinamxx fastai blob main 3 20training 20a 20classifier digitclassifier ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20203 png

day204 of 300daysofdata sgd and minibatches the process to change or update the weights based on the gradients in order to consider some of the details involved in the next phase of the learning process is called an optimization step the calculation of the average loss for a few data items at a time is called a minibatch the number of data items in the minibatch is called the batch size a larger batch size means a more accurate and stable estimate of the dataset gradients from the loss function whereas a batch size of one results in an imprecise and unstable gradient on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about stochastic gradient descent and minibatches optimization step batch size dataloader and dataset initializing parameters weights and bias backpropagation and gradients loss function and a few more topics related to the same from here i have presented the implementation of the dataloader and gradients using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book deep learning for coders with fastai and pytorch
fastai training classifier https github com thinamxx fastai blob main 3 20training 20a 20classifier digitclassifier ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20204 png
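the minibatch idea described above is what a dataloader provides shuffled fixed size batches drawn from a dataset the sketch below uses plain pytorch utilities with toy data

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

x = torch.arange(20).float().unsqueeze(1)
y = 2 * x + 1                       # toy linear relationship
ds = TensorDataset(x, y)
dl = DataLoader(ds, batch_size=5, shuffle=True)
for xb, yb in dl:                   # one optimization step per minibatch
    print(xb.shape, yb.shape)       # torch.Size([5, 1]) each
```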
day205 of 300daysofdata sgd and minibatches the process to change or update the weights based on the gradients in order to consider some of the details involved in the next phase of the learning process is called an optimization step the calculation of the average loss for a few data items at a time is called a minibatch the number of data items in the minibatch is called the batch size a larger batch size means a more accurate and stable estimate of the dataset gradients from the loss function whereas a batch size of one results in an imprecise and unstable gradient on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about calculating gradients and backpropagation weights bias and parameters zeroing gradients training loop and learning rate accuracy and evaluation creating an optimizer and a few more topics related to the same from here i have presented the implementation of calculating gradients accuracy and training using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book deep learning for coders with fastai and pytorch
fastai training classifier https github com thinamxx fastai blob main 3 20training 20a 20classifier digitclassifier ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20205a png
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20205b png

day206 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about creating an optimizer linear module weights and biases model parameters optimization and zeroing gradients sgd class data loaders and the learner class of fastai and a few more topics related to the same from here i have presented the implementation of creating the optimizer and learner class using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book deep learning for coders with fastai and pytorch
fastai training classifier https github com thinamxx fastai blob main 3 20training 20a 20classifier digitclassifier ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20206a png
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20206b png

day207 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about adding a nonlinearity simple linear classifiers basic neural networks weight and bias tensors rectified linear unit or relu activation function universal approximation theorem sequential module and a few more topics related to the same from here i have presented the implementation of creating simple neural networks using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book deep learning for coders with fastai and pytorch
fastai training classifier https github com thinamxx fastai blob main 3 20training 20a 20classifier digitclassifier ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20207 png
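the basic neural network described above is two linear layers with a relu between them the sketch below mirrors the book s simple net for 28 by 28 pixel images

```python
import torch
from torch import nn

simple_net = nn.Sequential(
    nn.Linear(28 * 28, 30),   # first linear layer
    nn.ReLU(),                # nonlinearity between the linear layers
    nn.Linear(30, 1))         # second linear layer
print(simple_net(torch.randn(64, 28 * 28)).shape)  # torch.Size([64, 1])
```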
day208 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about image classification localization regular expressions data block and data loaders regex labeller data augmentation presizing checking and debugging data block item and batch transformations and a few more topics related to the same from here i have presented the implementation of creating and debugging the data block and data loaders using fastai and pytorch here in the snapshot i have used resize as an item transform with a large size and randomresizedcrop as a batch transform with a smaller size randomresizedcrop will be added if the min scale parameter is passed in the aug transforms function as was done in the datablock call shown in the snapshot below i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book deep learning for coders with fastai and pytorch
fastai image classification https github com thinamxx fastai blob main 4 20image 20classification imageclassification ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20208 png

day209 of 300daysofdata exponential function the exponential function is defined as e x where e is a special number approximately equal to 2 718 it is the inverse of the natural logarithm function the exponential function is always positive and increases very rapidly on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about cross entropy loss function viewing activations and labels softmax activation function sigmoid function exponential function negative log likelihood binary classification and a few more topics related to the same from here i have presented the implementation of the softmax function and negative log likelihood using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book deep learning for coders with fastai and pytorch
fastai image classification https github com thinamxx fastai blob main 4 20image 20classification imageclassification ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20209 png
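softmax followed by the negative log likelihood described above is exactly the cross entropy loss the sketch below checks the equivalence on random activations

```python
import torch
import torch.nn.functional as F

acts = torch.randn(4, 3)                 # raw activations for 3 classes
targets = torch.tensor([0, 2, 1, 0])
sm = torch.softmax(acts, dim=1)          # rows now sum to 1
nll = -sm[range(4), targets].log().mean()
print(nll, F.cross_entropy(acts, targets))  # the two values match
```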
day210 of 300daysofdata exponential function the exponential function is defined as e x where e is a special number approximately equal to 2 718 it is the inverse of the natural logarithm function the exponential function is always positive and increases very rapidly on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about the logarithmic function negative log likelihood cross entropy loss function softmax function model interpretation confusion matrix improving the model the learning rate finder logarithmic scale and a few more topics related to the same from here i have presented the implementation of cross entropy loss confusion matrix and learning rate finder using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book deep learning for coders with fastai and pytorch
fastai image classification https github com thinamxx fastai blob main 4 20image 20classification imageclassification ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20210 png

day211 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about unfreezing and transfer learning freezing trained layers discriminative learning rates selecting the number of epochs deeper architectures and a few more topics related to the same from here i have presented the implementation of unfreezing and transfer learning and discriminative learning rates using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book deep learning for coders with fastai and pytorch
fastai image classification https github com thinamxx fastai blob main 4 20image 20classification imageclassification ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20211 png

day212 of 300daysofdata multilabel classification multilabel classification refers to the problem of identifying the categories of objects in images that may not contain exactly one type of object on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about the questionnaire of image classification multilabel classification and regression pascal dataset pandas and dataframes constructing the datablock datasets and dataloaders lambda functions and a few more topics related to the same from here i have presented the implementation of creating the datablock and dataloaders using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book deep learning for coders with fastai and pytorch
fastai multilabel classification regression https github com thinamxx fastai blob main 5 20multilabelclassification 20regression multilabelclassification ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20212 png

day213 of 300daysofdata multilabel classification multilabel classification refers to the problem of identifying the categories of objects in images that may not contain exactly one type of object on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about lambda functions transformation blocks such as image block and multi category block one hot encoding data splitting dataloaders datasets and datablock resizing and cropping and a few more topics related to the same from here i have presented the implementation of creating the datablock and dataloaders using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead
book deep learning for coders with fastai and pytorch
fastai multilabel classification regression https github com thinamxx fastai blob main 5 20multilabelclassification 20regression multilabelclassification ipynb
image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20213 png
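the one hot encoding mentioned above turns a variable length list of labels into a fixed length vector of zeros and ones the vocabulary and labels below are dummies

```python
import torch

vocab = ['bicycle', 'car', 'dog', 'person', 'sofa']   # dummy label space
labels = ['car', 'person']
one_hot = torch.zeros(len(vocab))
one_hot[[vocab.index(l) for l in labels]] = 1.        # 1 at each label index
print(one_hot)  # tensor([0., 1., 0., 1., 0.])
```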
and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about the Binary Cross Entropy Loss Function, DataLoaders and Learner, Getting Model Activations, Sigmoid and Softmax Functions, One-Hot Encoding, Getting Accuracy, the `partial` function, and a few more topics related to the same. From here: `F.binary_cross_entropy`, and its module equivalent `nn.BCELoss`, calculate cross entropy on a one-hot-encoded target but don't include the initial sigmoid. Normally, `F.binary_cross_entropy_with_logits` or `nn.BCEWithLogitsLoss` do both the sigmoid and the binary cross entropy in a single function. Similarly, for a single-label dataset, there are `F.nll_loss` or `nn.NLLLoss` for the version without the initial softmax, and `F.cross_entropy` or `nn.CrossEntropyLoss` for the version with the initial softmax. I have presented the implementation of Cross Entropy Loss Functions and Accuracy using fastai and PyTorch here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with fastai and PyTorch**
- [Fastai: Multilabel Classification & Regression](https://github.com/ThinamXX/Fastai/blob/main/5.%20MultilabelClassification%20Regression/MultilabelClassification.ipynb)
- ![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20214.PNG)

**Day215 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about Multilabel Classification and Thresholds, Sigmoid Activation, Overfitting, Image Regression, Validation Loss and Metrics, the `partial` function, the same family of loss functions summarized above, and a few more topics related to the same. I have presented the implementation of Training the Convolutions with Accuracy and a Threshold using fastai and PyTorch here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with fastai and PyTorch**
- [Fastai: Multilabel Classification & Regression](https://github.com/ThinamXX/Fastai/blob/main/5.%20MultilabelClassification%20Regression/MultilabelClassification.ipynb)
- [Fastai: Image Regression](https://github.com/ThinamXX/Fastai/blob/main/5.%20MultilabelClassification%20Regression/Regression.ipynb)
- ![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20215.PNG)
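The loss-function pairings above map directly onto a few lines of PyTorch. A minimal sketch, with random activations standing in for model outputs, verifying that the fused functions match their two-step versions:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(42)
activations = torch.randn(4, 3)  # raw model outputs (logits) for 4 items, 3 labels

# Multi-label case: one-hot style targets, sigmoid followed by binary cross entropy.
targets = torch.tensor([[1., 0., 1.], [0., 1., 0.], [1., 1., 0.], [0., 0., 1.]])
two_step = F.binary_cross_entropy(torch.sigmoid(activations), targets)  # sigmoid by hand
fused = F.binary_cross_entropy_with_logits(activations, targets)        # sigmoid included
print(torch.isclose(two_step, fused))  # tensor(True)

# Single-label case: integer targets, log softmax followed by negative log likelihood.
labels = torch.tensor([0, 2, 1, 2])
two_step = F.nll_loss(F.log_softmax(activations, dim=1), labels)  # nll_loss expects log-probs
fused = F.cross_entropy(activations, labels)                      # log softmax included
print(torch.isclose(two_step, fused))  # tensor(True)
```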
**Day216 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about Image Regression and Localization, Assembling the Dataset, Initializing DataBlock and DataLoaders, Points and Data Augmentation, Training the Model, Sigmoid Range, the MSE Loss Function, Transfer Learning, and a few more topics related to the same. I have presented the implementation of Initializing DataBlock and DataLoaders and Training Image Regression using fastai and PyTorch here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with fastai and PyTorch**
- [Fastai: Multilabel Classification & Regression](https://github.com/ThinamXX/Fastai/blob/main/5.%20MultilabelClassification%20Regression/MultilabelClassification.ipynb)
- [Fastai: Image Regression](https://github.com/ThinamXX/Fastai/blob/main/5.%20MultilabelClassification%20Regression/Regression.ipynb)
- ![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20216a.PNG)
- ![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20216b.PNG)

**Day217 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about Imagenette Classification, DataBlock and DataLoaders, Data Normalization and the Normalize function, Progressive Resizing and Data Augmentation, Transfer Learning, Mean and Standard Deviation, and a few more topics related to the same. From here: Progressive Resizing is the process of gradually using larger and larger images as training progresses. I have presented the implementation of Initializing DataBlock and DataLoaders, Normalization, and Progressive Resizing using fastai and PyTorch here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with fastai and PyTorch**
- [Advanced Classification](https://github.com/ThinamXX/Fastai/blob/main/6.%20Advanced%20Classification/ImagenetteClassification.ipynb)
- ![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20217a.PNG)
- ![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20217b.PNG)

**Day218 of 300DaysOfData!**

**Label Smoothing**: Label Smoothing is a process which replaces all the labels for training: the 1s with a number a bit less than 1, and the 0s with a number a bit more than 0. It makes training more robust even if there is mislabeled data, resulting in a model that generalizes better at inference. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about Progressive Resizing, Test Time Augmentation, Mixup Augmentation, Linear Combinations, Callbacks, Label Smoothing and the Cross Entropy Loss Function, and a few more topics related to the same. From here: during inference or validation, creating multiple versions of each image using data augmentation, and then taking the average or maximum of the predictions for each augmented version, is called Test Time Augmentation. I have presented the implementation of Progressive Resizing, Test Time Augmentation, Mixup Augmentation, and Label Smoothing using fastai and PyTorch here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with fastai and PyTorch**
- [Advanced Classification](https://github.com/ThinamXX/Fastai/blob/main/6.%20Advanced%20Classification/ImagenetteClassification.ipynb)
- ![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20218.PNG)
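A rough sketch tying the last few days together (progressive resizing, label smoothing, mixup, and test time augmentation), assuming an Imagenette-style dataset of labeled images; the sizes and epoch counts here are illustrative, not the notebook's exact values:

```python
from fastai.vision.all import *

path = untar_data(URLs.IMAGENETTE)  # 10-class subset of ImageNet

def get_dls(size, bs=64):
    # Rebuild the DataLoaders at a given image size, so training can start on
    # small images and switch to larger ones later (progressive resizing).
    block = DataBlock(blocks=(ImageBlock, CategoryBlock),
                      get_items=get_image_files, get_y=parent_label,
                      item_tfms=Resize(460),
                      batch_tfms=[*aug_transforms(size=size, min_scale=0.75),
                                  Normalize.from_stats(*imagenet_stats)])
    return block.dataloaders(path, bs=bs)

dls = get_dls(128)
learn = Learner(dls, xresnet50(n_out=dls.c), metrics=accuracy,
                loss_func=LabelSmoothingCrossEntropy(),  # label smoothing
                cbs=MixUp())                             # mixup augmentation
learn.fit_one_cycle(4, 3e-3)

learn.dls = get_dls(224)    # progressive resizing: continue on larger images
learn.fit_one_cycle(4, 1e-3)

preds, targs = learn.tta()  # test time augmentation on the validation set
print(accuracy(preds, targs))
```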
**Day219 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about Collaborative Filtering, Learning the Latent Factors, the Loss Function and Stochastic Gradient Descent, Creating DataLoaders, Batches, Dot Product and Matrix Multiplication, and a few more topics related to the same. From here: the mathematical operation of multiplying the elements of two vectors together and then summing up the result is called a dot product. I have presented the implementation of Initializing the Dataset and Creating DataLoaders using fastai and PyTorch here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with fastai and PyTorch**
- [Collaborative Filtering](https://github.com/ThinamXX/Fastai/blob/main/7.%20Collaborative%20Filtering/CollaborativeFiltering.ipynb)
- ![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20219.PNG)

**Day220 of 300DaysOfData!**

**Embedding**: the special layer that indexes into a vector using an integer, but has its derivative calculated in such a way that it is identical to what it would have been if it had done a matrix multiplication with a one-hot-encoded vector, is called an embedding. Multiplying by a one-hot-encoded matrix can be implemented with the computational shortcut of simply indexing directly; the thing that we multiply the one-hot-encoded matrix by is called the embedding matrix. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about Creating DataLoaders, the Embedding Matrix, Collaborative Filtering, Object-Oriented Programming with Python, Inheritance, Module and the forward propagation function, Batches and Learner, Sigmoid Range, and a few more topics related to the same. I have presented the implementation of an Embedding, a Dot Product class, and Sigmoid Range using fastai and PyTorch here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with fastai and PyTorch**
- [Collaborative Filtering](https://github.com/ThinamXX/Fastai/blob/main/7.%20Collaborative%20Filtering/CollaborativeFiltering.ipynb)
- ![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20220a.PNG)
- ![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20220b.PNG)

**Day221 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about Collaborative Filtering, Weight Decay or L2 Regularization, Overfitting, Creating Embeddings and Weight Matrices, the Parameter module, and a few more topics related to the same. From here: Weight Decay consists of adding the sum of the squared
weights to the loss function the idea is that the larger the coefficients are the sharper the canyons will be in the loss function i have presented the implementation of biases and weight decay and matrices using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch collaborative filtering https github com thinamxx fastai blob main 7 20collaborative 20filtering collaborativefiltering ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20221a png image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20221b png day222 of 300daysofdata embedding the special layer that indexes into a vector using an integer but has its derivative calculated in such a way that it is identical to what it would have been if it had done a matrix multiplication with a one hot encoded vector is called embedding multiplying by a one hot encoded matrix using the computational shortcut that it can be implemented by simply indexing directly the thing that multiply the one hot encoded matrix is called the embedding matrix on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about interpreting embedding and biases principal component analysis or pca collab learner embedding distance and cosine similarity bootstrapping a collaborative filtering model probabilistic matrix factorization or dot product model and few more topics related to the same from here i have presented the implementation interpreting biases collab learner model and embedding distance using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch collaborative filtering https github com thinamxx fastai blob main 7 20collaborative 20filtering collaborativefiltering ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20222 png day223 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about deep learning and collaborative filtering embedding matrices linear function relu and nonlinear functions sigmoid range forward propagation function tabular model and embedding neural networks and few more topics related to the same from here in python kwargs in a parameter list means put any additional keyword arguments into a dict called kwargs and kwargs in an argument list means insert all key and value pairs in the kwargs dict as named arguments here i have presented the implementation deep learning for collaborative filtering and neural networks using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch collaborative filtering https github com thinamxx fastai blob main 7 20collaborative 20filtering collaborativefiltering ipynb image https github com thinamxx 300days machinelearningdeeplearning 
blob main images day 20223 png day224 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about tabular modeling categorical embeddings continuous and categorical variables recommendation system the tabular dataset ordinal columns decision trees handling dates tabular pandas and tabular proc object and few more topics related to the same from here i have presented the implementation of handling dates tabular pandas and tabular proc using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch tabular modeling https github com thinamxx fastai blob main 8 20tabular 20modeling tabularmodel ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20224 png day225 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about tabular modeling creating the decision tree leaf nodes root mean squared error dtreeviz library stopping criterion overfitting and few more topics related to the same from here i have presented the implementation of creating decision tree and leaf nodes using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch tabular modeling https github com thinamxx fastai blob main 8 20tabular 20modeling tabularmodel ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20225a png image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20225b png day226 of 300daysofdata random forest a random forest is a model that averages the predictions of a large number of decision trees which are generated by randomly varying various parameters that specify what data is used to train the tree and other tree parameters bagging is a particular approach to ensembling or combining the results of multiple models together on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about categorical variables random forests and bagging predictors ensembling optimal parameters out of bag error tree variance for prediction confidence and standard deviation model interpretation and few more topics related to the same from here the out of bag error or oob error is a way of measuring prediction error in the training dataset by including in the calculation of a rows error trees only where that row was not included in the training i have presented the implementation of creating random forest and model interpretation using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch tabular modeling https github com thinamxx fastai blob main 8 20tabular 20modeling tabularmodel ipynb image https github com thinamxx 300days 
machinelearningdeeplearning blob main images day 20226 png day227 of 300daysofdata random forest a random forest is a model that averages the predictions of a large number of decision trees which are generated by randomly varying various parameters that specify what data is used to train the tree and other tree parameters bagging is a particular approach to ensembling or combining the results of multiple models together on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about random forest feature importance removing low importance variables removing redundant features determining similarity of features rank correlation oob score and few more topics related to the same from here i have presented the implementation of random forest and feature importance using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch tabular modeling https github com thinamxx fastai blob main 8 20tabular 20modeling tabularmodel ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20227 png day228 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about removing redundant features determining similarity oob score partial dependence plots data leakage root mean squared error and few more topics related to the same from here standard deviation of predictions across the trees presents the relative confidence of predictions the model is more consistent when the standard deviation is lower i have presented the implementation of removing redundant features and partial dependence plots using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch tabular modeling https github com thinamxx fastai blob main 8 20tabular 20modeling tabularmodel ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20228 png day229 of 300daysofdata random forest model just averages the predictions of a number of trees and therefore it can never predict values outside the range of the training data random forests are not able to extrapolate outside the types of data i e out of domain data here prediction is simply the prediction that the random forest makes here bias is the prediction based on taking the mean of the dependent variable similarly contributions tells us the total change in prediction due to each of the independent variables on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about tree interpreter redundant features waterfall charts or plots random forest prediction bias and contributions the extrapolation problem unsqueeze method out of domain data and few more topics related to the same from here i have presented the implementation of tree interpreter waterfall plots extrapolation problem using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i 
hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch tabular modeling https github com thinamxx fastai blob main 8 20tabular 20modeling tabularmodel ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20229 png day230 of 300daysofdata random forest model just averages the predictions of a number of trees and therefore it can never predict values outside the range of the training data random forests are not able to extrapolate outside the types of data i e out of domain data here prediction is simply the prediction that the random forest makes here bias is the prediction based on taking the mean of the dependent variable similarly contributions tells us the total change in prediction due to each of the independent variables on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about the extrapolation problem and random forest finding out of domain data root mean squared error and feature importance histograms and few more topics related to the same from here i have presented the implementation of finding out of domain data and rmse using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch tabular modeling https github com thinamxx fastai blob main 8 20tabular 20modeling tabularmodel ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20230 png day231 of 300daysofdata random forest random forest model just averages the predictions of a number of trees and therefore it can never predict values outside the range of the training data random forests are not able to extrapolate outside the types of data i e out of domain data on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about tabular modeling and neural networks continuous and categorical features embedding matrix mean squared error and regression tabular learner and learning rate ensembling bagging and boosting combining embeddings and few more topics related to the same from here ensembling is the generalization technique in which the average of the predictions of several models are used i have presented the implementation of tabular modeling and neural networks and ensembling using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch tabular modeling https github com thinamxx fastai blob main 8 20tabular 20modeling tabularmodel ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20231a png image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20231b png day232 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about nlp and language model self supervised learning text preprocessing tokenization numericalization 
and embedding matrix subword and characters tokens and few more topics related to the same from here token is a element of a list created by the tokenization process which could be a word a part of a word or subword or a single character i have presented the implementation of loading the data and word tokenization using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch natural language processing https github com thinamxx fastai blob main 9 20natural 20language 20processing nlp ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20232 png day233 of 300daysofdata tokenization subword tokenization splits words into smaller parts based on the most commonly occurring sub strings word tokenization splits a sentence on spaces as well as applying language specific rules to try to separate parts of meaning even when there are no spaces subword tokenization provides a way to easily scale between character tokenization i e using a small subword vocab and word tokenization i e using a large subword vocab and handles every human language without needing language specific algorithms to be developed on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about word tokenization subword tokenization setup method vocabulary numericalization with fastai embedding matrices and few more topics related to the same from here i have presented the implementation of subword tokenization and numericalization using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch natural language processing https github com thinamxx fastai blob main 9 20natural 20language 20processing nlp ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20233 png day234 of 300daysofdata tokenization subword tokenization splits words into smaller parts based on the most commonly occurring sub strings word tokenization splits a sentence on spaces as well as applying language specific rules to try to separate parts of meaning even when there are no spaces subword tokenization provides a way to easily scale between character tokenization i e using a small subword vocab and word tokenization i e using a large subword vocab and handles every human language without needing language specific algorithms to be developed on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about numericalization with fastai embedding matrices creating batches for language model tokenization training a text classifier language model using datablock data loaders fine tuning language model and transfer learning and few more topics related to the same from here i have presented the implementation of creating data loaders and data block for language model using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the 
days ahead book deep learning for coders with fastai and pytorch natural language processing https github com thinamxx fastai blob main 9 20natural 20language 20processing nlp ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20234 png day235 of 300daysofdata encoder encoder is defined as the model which doesn t contain task specific final layers the term encoder means much the same thing as body when applied to vision cnn but encoder tends to be more used for nlp and generative models on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about encoder model text generation and classification creating the classifier data loaders embeddings data augmentation fine tuning the classifier discriminative learning rates and gradual unfreezing disinformation and language models and few more topics related to the same from here i have presented the implementation of training text classifier model using discriminative learning rates and gradual unfreezing using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch natural language processing https github com thinamxx fastai blob main 9 20natural 20language 20processing nlp ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20235a png image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20235b png day236 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about data munging with fastai tokenization and numericalization creating data loaders and data block mid level api transforms decode method data augmentation cropping and padding and few more topics related to the same from here i have presented the implementation of creating data loaders tokenization and numericalization using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch data munging https github com thinamxx fastai blob main 10 20data 20munging datamunging ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20236 png day237 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about data munging decorator pipeline method transformed collections training and validation set data loaders object categorize method transformations and few more topics related to the same from here i have presented the implementation of pipeline class and transformed collections using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch data munging https github com thinamxx fastai blob main 10 20data 20munging datamunging ipynb image https github com thinamxx 300days 
machinelearningdeeplearning blob main images day 20237 png

**Day238 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about the Datasets class, Transformed Collections, Pipelines, the Categorize method, DataLoaders and DataBlock, TextBlock, the `partial` function, CategoryBlock, and a few more topics related to the same. I have presented the implementation of the Datasets class, Transformed Collections, and DataLoaders using fastai and PyTorch here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with fastai and PyTorch**
- [Data Munging](https://github.com/ThinamXX/Fastai/blob/main/10.%20Data%20Munging/DataMunging.ipynb)
- ![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20238.PNG)

**Day239 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about Applying the Mid-Level Data API for a Siamese Pair and Computer Vision, DataLoaders, Transforms and Resizing Images, Data Augmentation, Subclasses, Transformed Collections, and a few more topics related to the same. From here: a Datasets class will apply two or more pipelines in parallel to the same raw object and build a tuple with the result; it will automatically do the setup, and we can index into a Datasets object to get the tuple back. I have presented the implementation of a Siamese Image object and Data Augmentation using fastai and PyTorch here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with fastai and PyTorch**
- [Data Munging](https://github.com/ThinamXX/Fastai/blob/main/10.%20Data%20Munging/DataMunging.ipynb)
- ![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20239.PNG)

**Day240 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about the Siamese Transform object, Random Splitting, Transformed Collections and the Datasets class, DataLoaders, the ToTensor and IntToFloatTensor methods, Data and Batch Normalization, and a few more topics related to the same. From here: the ToTensor method converts images to tensors, and the IntToFloatTensor method converts the tensor of images containing integers from 0 to 255 into a tensor of floats, dividing by 255 so that the values lie between 0 and 1. I have presented the implementation of the Siamese Transform object and Data Augmentation using fastai and PyTorch here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with fastai and PyTorch**
- [Data Munging](https://github.com/ThinamXX/Fastai/blob/main/10.%20Data%20Munging/DataMunging.ipynb)
- ![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20240.PNG)
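A small illustration of what those two transforms do, along these lines; the random array here just stands in for a real image file:

```python
from fastai.vision.all import *

# A fake 64x64 RGB image; PILImage.create also accepts file paths.
img = PILImage.create(np.random.randint(0, 255, (64, 64, 3), dtype=np.uint8))

t = ToTensor()(img)             # PILImage -> TensorImage of ints in 0..255
print(t.dtype)                  # torch.uint8

f = IntToFloatTensor()(t)       # divides by 255 -> floats between 0 and 1
print(f.dtype, float(f.max()))  # torch.float32, at most 1.0
```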
**Day241 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about a Language Model from Scratch, Data Concatenation and Tokenization, Vocabulary and Numericalization, Neural Networks, Independent Variables and the Dependent Variable, Sequences of Tensors, and a few more topics related to the same. I have presented the implementation of Preparing Sequences of Tensors for a Language Model using fastai and PyTorch here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with fastai and PyTorch**
- [Language Model from Scratch](https://github.com/ThinamXX/Fastai/blob/main/11.%20Language%20Model/LanguageModel.ipynb)
- ![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20241.PNG)

**Day242 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about a Language Model from Scratch using PyTorch, Sequence Tensors, Creating DataLoaders and Batch Size, Neural Network Architecture and Linear Layers, Word Embeddings and Activations, the Weight Matrix, Creating a Learner and Training, and a few more topics related to the same. From here: I will create a neural network architecture that takes three words as input and returns predictions of the probability of each possible next word in the vocab. I will use three standard linear layers: the first linear layer will use only the first word's embedding as activations; the second layer will use the second word's embedding plus the first layer's output activations; and the third layer will use the third word's embedding plus the second layer's output activations. The key effect is that every word is interpreted in the information context of any words preceding it. Each of these three layers will use the same weight matrix. I have presented the implementation of Creating DataLoaders, a Language Model from Scratch, and Training using fastai and PyTorch here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with fastai and PyTorch**
- [Language Model from Scratch](https://github.com/ThinamXX/Fastai/blob/main/11.%20Language%20Model/LanguageModel.ipynb)
- ![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20242.PNG)
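The three-layer architecture described above looks roughly like this in plain PyTorch (close to the book's version; `vocab_sz` and `n_hidden` are free choices here):

```python
import torch
import torch.nn.functional as F
from torch import nn

class LMModel(nn.Module):
    def __init__(self, vocab_sz, n_hidden):
        super().__init__()
        self.i_h = nn.Embedding(vocab_sz, n_hidden)  # input -> hidden: word embeddings
        self.h_h = nn.Linear(n_hidden, n_hidden)     # hidden -> hidden: shared weight matrix
        self.h_o = nn.Linear(n_hidden, vocab_sz)     # hidden -> output: next-word scores

    def forward(self, x):
        # Every word is interpreted in the context of the words preceding it,
        # and the same h_h weight matrix is reused at each step.
        h = F.relu(self.h_h(self.i_h(x[:, 0])))
        h = h + self.i_h(x[:, 1])
        h = F.relu(self.h_h(h))
        h = h + self.i_h(x[:, 2])
        h = F.relu(self.h_h(h))
        return self.h_o(h)

model = LMModel(vocab_sz=2000, n_hidden=64)
xb = torch.randint(0, 2000, (8, 3))  # a batch of 8 three-word sequences
print(model(xb).shape)               # torch.Size([8, 2000])
```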
**Day243 of 300DaysOfData!**

**Backpropagation Through Time**: Backpropagation Through Time (BPTT) is the process of treating a neural network with effectively one layer per time step as one big model and calculating gradients on it in the usual way. To avoid running out of memory and time, the truncated BPTT technique detaches the history of computation steps in the hidden state every few time steps. The hidden state is the set of activations that are updated at each step of a recurrent neural network. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about Recurrent Neural Networks, the Hidden State of a NN, Improving the RNN, Maintaining the State of an RNN, the Unrolled Representation, Backpropagation and Derivatives, the detach method, Stateful RNNs, Backpropagation Through Time, and a few more topics related to the same. I have presented the implementation of Recurrent Neural Networks and a Language Model using fastai and PyTorch here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with fastai and PyTorch**
- [Language Model from Scratch](https://github.com/ThinamXX/Fastai/blob/main/11.%20Language%20Model/LanguageModel.ipynb)
- ![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20243.PNG)

**Day244 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about Backpropagation Through Time, the LMDataLoader object and Arranging the Dataset, Creating DataLoaders, Callbacks and the reset method, Creating More Signal, and a few more topics related to the same. I have presented the implementation of Arranging the Dataset, Creating DataLoaders, Callbacks, and the reset method using fastai and PyTorch here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with fastai and PyTorch**
- [Language Model from Scratch](https://github.com/ThinamXX/Fastai/blob/main/11.%20Language%20Model/LanguageModel.ipynb)
- ![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20244.PNG)
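A sketch of the stateful RNN these two days describe: the hidden state is kept across batches, `detach` truncates backpropagation through time by dropping the gradient history, and `reset` is what a callback calls at epoch boundaries. The batch size and sequence length here are assumptions:

```python
import torch
import torch.nn.functional as F
from torch import nn

class StatefulRNN(nn.Module):
    def __init__(self, vocab_sz, n_hidden, bs, sl):
        super().__init__()
        self.i_h = nn.Embedding(vocab_sz, n_hidden)
        self.h_h = nn.Linear(n_hidden, n_hidden)
        self.h_o = nn.Linear(n_hidden, vocab_sz)
        self.h = torch.zeros(bs, n_hidden)  # hidden state maintained across batches
        self.sl = sl                        # sequence length

    def forward(self, x):
        outs = []
        for i in range(self.sl):            # predict after every token: more signal
            self.h = self.h + self.i_h(x[:, i])
            self.h = F.relu(self.h_h(self.h))
            outs.append(self.h_o(self.h))
        self.h = self.h.detach()            # keep the values, drop the history (BPTT)
        return torch.stack(outs, dim=1)

    def reset(self):
        self.h = torch.zeros_like(self.h)   # called at the start of each epoch

model = StatefulRNN(vocab_sz=2000, n_hidden=64, bs=8, sl=16)
xb = torch.randint(0, 2000, (8, 16))
print(model(xb).shape)                      # torch.Size([8, 16, 2000])
```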
**Day245 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about Creating More Signal per Sequence, the Cross Entropy Loss Function and the Flatten method, Multilayer Recurrent Neural Networks and Activations, the Unrolled Representation, Stack, and a few more topics related to the same. From here: the single-layer recurrent neural network performed better than the multilayer recurrent neural network, because a deeper model leads to exploding and vanishing activations. I have presented the implementation of Creating More Signal and a Multilayer Recurrent Neural Network using fastai and PyTorch here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with fastai and PyTorch**
- [Language Model from Scratch](https://github.com/ThinamXX/Fastai/blob/main/11.%20Language%20Model/LanguageModel.ipynb)
- ![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20245a.PNG)
- ![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20245b.PNG)

**Day246 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about Exploding and Disappearing Activations, Matrix Multiplication, the Architecture of Long Short-Term Memory and RNNs, the Sigmoid and Tanh Functions, the Hidden State and Cell State, the Forget Gate, Input Gate, Cell Gate, and Output Gate, the chunk method, and a few more topics related to the same. I have presented the implementation of Long Short-Term Memory using fastai and PyTorch here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with fastai and PyTorch**
- [Language Model from Scratch](https://github.com/ThinamXX/Fastai/blob/main/11.%20Language%20Model/LanguageModel.ipynb)
- ![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20246.PNG)
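The four gates listed above, written out as a from-scratch LSTM cell with the chunk method (close to the book's refactored version):

```python
import torch
from torch import nn

class LSTMCell(nn.Module):
    def __init__(self, ni, nh):
        super().__init__()
        self.ih = nn.Linear(ni, 4 * nh)  # input projections for the four gates
        self.hh = nn.Linear(nh, 4 * nh)  # hidden-state projections for the four gates

    def forward(self, inp, state):
        h, c = state                     # hidden state and cell state
        gates = (self.ih(inp) + self.hh(h)).chunk(4, 1)
        ingate = torch.sigmoid(gates[0])      # input gate: what to write to the cell
        forgetgate = torch.sigmoid(gates[1])  # forget gate: what to keep in the cell
        cellgate = torch.tanh(gates[2])       # cell gate: candidate values
        outgate = torch.sigmoid(gates[3])     # output gate: what to expose as h
        c = forgetgate * c + ingate * cellgate
        h = outgate * torch.tanh(c)
        return h, (h, c)

cell = LSTMCell(64, 64)
h, (h2, c2) = cell(torch.randn(8, 64), (torch.zeros(8, 64), torch.zeros(8, 64)))
print(h.shape, c2.shape)  # torch.Size([8, 64]) torch.Size([8, 64])
```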
**Day247 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about Training a Language Model using LSTM, the Embedding Layer, the Linear Layer, Overfitting and Regularization of LSTMs, Dropout Regularization, Training versus Inference, the Bernoulli method, and a few more topics related to the same. From here: Dropout is a regularization technique which randomly changes some activations to zero at training time. I have presented the implementation of a Language Model using Long Short-Term Memory and Dropout using fastai and PyTorch here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with fastai and PyTorch**
- [Language Model from Scratch](https://github.com/ThinamXX/Fastai/blob/main/11.%20Language%20Model/LanguageModel.ipynb)
- ![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20247.PNG)
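A minimal sketch of that dropout layer using the Bernoulli method: it only fires in training mode, and it rescales the surviving activations so their expected value is unchanged:

```python
import torch
from torch import nn

class Dropout(nn.Module):
    def __init__(self, p=0.5):
        super().__init__()
        self.p = p

    def forward(self, x):
        if not self.training:
            return x                     # inference: pass activations through untouched
        # Each mask entry is 1 with probability (1 - p), otherwise 0.
        mask = x.new_empty(*x.shape).bernoulli_(1 - self.p)
        return x * mask / (1 - self.p)   # zero some activations, rescale the rest

layer = Dropout(p=0.4)
layer.train()
print(layer(torch.ones(2, 6)))           # some zeros, the rest scaled to 1/0.6
```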
**Day248 of 300DaysOfData!**

**Activation Regularization**: Activation Regularization is the process of adding a small penalty to the final activations produced by the LSTM, to make them as small as possible. It is a regularization method very similar to weight decay. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about Activation Regularization and Temporal Activation Regularization, a Language Model using Long Short-Term Memory, Weight Decay, Training a Weight-Tied Regularized LSTM, Weight Tying and Input Embeddings, Text Learner, the Cross Entropy Loss Function, and a few more topics related to the same. I have presented the implementation of a Language Model using a Regularized Long Short-Term Memory with Dropout and Activation Regularization using fastai and PyTorch here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with fastai and PyTorch**
- [Language Model from Scratch](https://github.com/ThinamXX/Fastai/blob/main/11.%20Language%20Model/LanguageModel.ipynb)
- ![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20248.PNG)

**Day249 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about Convolutional Neural Networks, the Magic of Convolutions, Feature Engineering, Kernels and Matrices, Mapping a Convolutional Kernel, Nested List Comprehensions, Matrix Multiplications, and a few more topics related to the same. From here: Feature Engineering is the process of creating new transformations of the input data in order to make it easier to model. I have presented the implementation of Feature Engineering and Mapping a Convolutional Kernel using fastai and PyTorch here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with fastai and PyTorch**
- [Convolutional Neural Networks](https://github.com/ThinamXX/Fastai/blob/main/12.%20Convolutional%20Neural%20Networks/CNN.ipynb)
- ![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20249.PNG)

**Day250 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about Convolutions with PyTorch, Rank Tensors, Creating DataBlock and DataLoaders, the Channels of Images, the unsqueeze method and the Unit Axis, Strides and Padding, Understanding the Convolution Equations, Matrix Multiplication, Shared Weights, and a few more topics related to the same. From here: a channel is a single basic color in an image; for regular full-color images there are three channels: red, green, and blue. Kernels passed to convolutions need to be rank-4 tensors. I have presented the implementation of Convolutions and DataLoaders using fastai and PyTorch here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with fastai and PyTorch**
- [Convolutional Neural Networks](https://github.com/ThinamXX/Fastai/blob/main/12.%20Convolutional%20Neural%20Networks/CNN.ipynb)
- ![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20250.PNG)
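A tiny example of mapping a convolutional kernel with `F.conv2d`, which expects exactly those rank-4 kernels; a random tensor stands in for a batch of one single-channel image:

```python
import torch
import torch.nn.functional as F

top_edge = torch.tensor([[-1., -1., -1.],
                         [ 0.,  0.,  0.],
                         [ 1.,  1.,  1.]])   # a 3x3 top-edge-detecting kernel

image = torch.rand(1, 1, 28, 28)             # (batch, channels, height, width)
kernel = top_edge.view(1, 1, 3, 3)           # reshape to rank 4: (out_ch, in_ch, h, w)
out = F.conv2d(image, kernel, padding=1)     # padding=1 keeps the 28x28 spatial size
print(out.shape)                             # torch.Size([1, 1, 28, 28])
```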
**Day251 of 300DaysOfData!**

**Channels and Features**: Channels and features are largely used interchangeably and refer to the size of the second axis of a weight matrix, which is the number of activations per grid cell after a convolution. Channels refer to the input data (i.e., colors) or activations inside the network. Using a stride-2 convolution often increases the number of features at the same time, because the number of activations in the activation map decreases by a factor of 4. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about Convolutional Neural Networks, Refactoring, Channels and Features, Understanding Convolution Arithmetic, Biases, Receptive Fields, Convolution over an RGB Image, Stochastic Gradient Descent, and a few more topics related to the same. I have presented the implementation of a Convolutional Neural Network and Training the Learner using fastai and PyTorch here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with fastai and PyTorch**
- [Convolutional Neural Networks](https://github.com/ThinamXX/Fastai/blob/main/12.%20Convolutional%20Neural%20Networks/CNN.ipynb)
- ![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20251.PNG)

**Day252 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about Improving the Training Stability of Convolutional Neural Networks, Batch Size and Splitting the Dataset, a Simple Baseline Network, Activations and Kernel Size, Activation Stats Callbacks, the Learning Rate, Creating a Learner and Training, and a few more topics related to the same. I have presented the implementation of a Convolutional Neural Network and Training the Learner using fastai and PyTorch here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with fastai and PyTorch**
- [Convolutional Neural Networks](https://github.com/ThinamXX/Fastai/blob/main/12.%20Convolutional%20Neural%20Networks/CNN.ipynb)
- ![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20252.PNG)

**Day253 of 300DaysOfData!**

**1cycle Training**: 1cycle training is a combination of warmup and annealing: warmup is the phase where the learning rate grows from the minimum value to the maximum value, and annealing is the phase where it decreases back to the minimum value. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about Activation Stats Callbacks, Increasing the Batch Size, Activations, 1cycle Training, Warmup and Annealing, Super-Convergence, Learning Rate and Momentum, the Colorful Dimension and Histograms, and a few more topics related to the same. I have presented the implementation of Increasing the Batch Size, 1cycle Training, and Inspecting Momentum and Activations using fastai and PyTorch here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with fastai and PyTorch**
- [Convolutional Neural Networks](https://github.com/ThinamXX/Fastai/blob/main/12.%20Convolutional%20Neural%20Networks/CNN.ipynb)
- ![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20253.PNG)

**Day254 of 300DaysOfData!**

**Fully Convolutional Networks**: the idea in fully convolutional networks is to take the average of activations across a convolutional grid. A fully convolutional network has a number of convolutional layers, some of which will be stride-2 convolutions, at the end of which are an adaptive average pooling layer, a flatten layer to remove the unit axis, and finally a linear layer. Larger batches have gradients that are more accurate, since they are calculated from more data; but a larger batch size means fewer batches per epoch, which means fewer opportunities for the model to update weights. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about Residual Networks or ResNets, Convolutional Neural Networks, Strides and Padding, Fully Convolutional Networks, the Adaptive Average Pooling Layer, the Flatten Layer, Activations and Matrix Multiplications, and a few more topics related to the same. I have presented the implementation of Preparing the Data and Fully Convolutional Networks using fastai and PyTorch here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with fastai and PyTorch**
- [Residual Networks](https://github.com/ThinamXX/Fastai/blob/main/13.%20Resnets/Resnets.ipynb)
- ![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20254.PNG)

**Day255 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about Fully Convolutional Neural Networks, Building a ResNet, Skip Connections, Identity Mapping, SGD, the Batch Normalization Layer, Trainable Parameters, the True Identity Path, Convolutional Neural Networks, the Average Pooling Layer, and a few more topics related to the same. I have presented the implementation of the ResNet Architecture and Skip Connections using fastai and PyTorch here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with fastai and PyTorch**
- [Residual Networks](https://github.com/ThinamXX/Fastai/blob/main/13.%20Resnets/Resnets.ipynb)
- ![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20255.PNG)

**Day256 of 300DaysOfData!**

On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about Residual Networks, the ReLU Activation Function, Skip Connections, Training Deeper Models, the Loss Landscape of Neural Networks, the Stem of the Network, Convolutional Layers, the Max Pooling Layer, and a few more topics related to the same. From here: the stem is defined as the first few layers of a CNN; it has a different structure than the main body of the CNN. I have presented the implementation of Training Deeper Models and the Stem of a Network using fastai and PyTorch here in the snapshot. I hope you will gain some insights and spend some time learning the topics from the book mentioned below. Excited about the days ahead!

- Book: **Deep Learning for Coders with fastai and PyTorch**
- [Residual Networks](https://github.com/ThinamXX/Fastai/blob/main/13.%20Resnets/Resnets.ipynb)
- ![Image](https://github.com/ThinamXX/300Days__MachineLearningDeepLearning/blob/main/Images/Day%20256.PNG)

**Day257 of 300DaysOfData!**

**Bottleneck Layers**: Bottleneck layers use three convolutions: two 1x1, at the beginning and the end, and one 3x3. The 1x1 convolutions are much faster, which makes it practical to use a higher number of filters in and out; they diminish and then restore the number of channels, hence the name bottleneck. The overall impact is to allow the use of more filters in the same amount of time. On my journey of Machine Learning and Deep Learning, I have read and implemented from the book **Deep Learning for Coders with fastai and PyTorch**. Here, I have read about Stem of the
network residual network architecture bottleneck layers convolutional neural networks progressive resizing and few more topics related to the same from here i have presented the implementation of training deeper networks and bottleneck layers using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch residual networks https github com thinamxx fastai blob main 13 20resnets resnets ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20257 png day258 of 300daysofdata splitter function a splitter is a function that tells the fastai library how to split the model into parameter groups which are used to train only the head of the model during transfer learning the params is just a function that returns all parameters of a given module on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about body and head of networks batch normalization layer unet learner and architecture generative vision models nearest neighbor interpolation transposed convolutions siamese network loss function and splitter function and few more topics related to the same from here i have presented the implementation of siamese network model loss function and splitter function using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch architecture details https github com thinamxx fastai blob main 14 20architecture 20details architectures ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20258 png day259 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about stochastic gradient descent loss function updating weights optimization function creating data block and data loaders resnet model and learner training process and few more topics related to the same from here i have presented the implementation of preparing dataset and baseline model using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch training process https github com thinamxx fastai blob main 15 20training 20process training ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20259 png day260 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about training process stochastic gradient descent optimization function learning rate finder momentum optimizer callbacks zeroing gradients partial function and few more topics related to the same from here i have presented the implementation of functions for optimizer and sgd here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from 
day260 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about training process stochastic gradient descent optimization function learning rate finder momentum optimizer callbacks zeroing gradients partial function and few more topics related to the same from here i have presented the implementation of functions for optimizer and sgd here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch training process https github com thinamxx fastai blob main 15 20training 20process training ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20260 png
day261 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about stochastic gradient descent and optimization function momentum exponentially weighted moving average gradient averages callbacks rms prop adaptive learning rate divergence and epsilon and few more topics related to the same from here i have presented the implementation of momentum and rms prop using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch training process https github com thinamxx fastai blob main 15 20training 20process training ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20261 png
day262 of 300daysofdata adam optimizer adam mixes the idea of sgd with momentum and rmsprop together where it uses the moving average of the gradients as a direction and divides by the square root of the moving average of the gradients squared to give an adaptive learning rate to each parameter it takes the unbiased moving average of the gradients which corrects for their bias toward zero in the first steps of training on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about rmsprop optimizer sgd adam optimizer unbiased moving average of gradients momentum parameter decoupled weight decay l1 and l2 regularization callbacks and few more topics related to the same from here i have presented the implementation of rms prop and adam optimizer using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch training process https github com thinamxx fastai blob main 15 20training 20process training ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20262 png
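the adam description in the day262 entry can be written out in a few lines of plain pytorch; this is an illustrative sketch of the update rule under common default hyperparameters, not the exact code from the linked notebook

```python
import torch

def adam_step(p, grad, state, lr=1e-3, beta1=0.9, beta2=0.99, eps=1e-5):
    # moving average of the gradients (the direction) and of the squared gradients
    state["avg_grad"] = beta1 * state["avg_grad"] + (1 - beta1) * grad
    state["sqr_grad"] = beta2 * state["sqr_grad"] + (1 - beta2) * grad ** 2
    state["step"] += 1
    # unbiased averages correct the zero initialization during the first steps
    unbias_avg = state["avg_grad"] / (1 - beta1 ** state["step"])
    unbias_sqr = state["sqr_grad"] / (1 - beta2 ** state["step"])
    # divide by the square root of the squared-gradient average: adaptive step size
    return p - lr * unbias_avg / (unbias_sqr.sqrt() + eps)

p = torch.randn(3)
state = {"avg_grad": torch.zeros_like(p), "sqr_grad": torch.zeros_like(p), "step": 0}
p = adam_step(p, torch.randn(3), state)  # stand-in gradient, for demonstration only
```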
day263 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about creating callbacks loss functions model resetter callbacks rnn regularization callback ordering and exceptions stochastic gradient descent and few more topics related to the same from here i have presented the implementation of model resetter callback and rnn regularization callback using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch training process https github com thinamxx fastai blob main 15 20training 20process training ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20263 png
day264 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about neural networks building a neural network from scratch modeling a neuron nonlinear activation functions hidden size fully connected layer and dense layer linear layer matrix multiplication from scratch elementwise arithmetic and few more topics related to the same from here i have presented the implementation of matrix multiplication from scratch and elementwise arithmetic using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch neural network foundations https github com thinamxx fastai blob main 16 20neural 20network 20foundations neuralfoundations ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20264 png
day265 of 300daysofdata forward and backward passes computing all the gradients of a given loss with respect to its parameters is known as backward pass similarly computing the output of the model on a given input based on the matrix products is known as forward pass on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about broadcasting with scalar broadcasting vector and matrix unsqueeze method einstein summation matrix multiplication the forward and backward passes defining and initializing layer activation function linear layer weights and biases and few more topics related to the same from here i have presented the implementation of einstein summation and defining and initializing linear layer using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch neural network foundations https github com thinamxx fastai blob main 16 20neural 20network 20foundations neuralfoundations ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20265 png
day266 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about mean and standard deviation matrix multiplications xavier initialization relu activation kaiming initialization weights and activations and few more topics related to the same from here i have presented the implementation of xavier initialization relu activation and matrix multiplications using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch neural network foundations https github com thinamxx fastai blob main 16 20neural 20network 20foundations neuralfoundations ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20266 png
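a small sketch, in the spirit of the day266 entry above, of why kaiming initialization keeps activations stable through relu layers; purely illustrative, not the book's exact notebook code

```python
import torch

x = torch.randn(512, 512)
for _ in range(50):
    w = torch.randn(512, 512) * (2 / 512) ** 0.5  # kaiming scaling sqrt(2 / fan_in)
    x = torch.clamp(w @ x, min=0.0)               # linear layer followed by relu
print(x.mean().item(), x.std().item())            # stays in a sane range over 50 layers
```

with unscaled torch.randn weights the same loop explodes to inf within a handful of layers, which is exactly the vanishing and exploding activation problem the initialization schemes address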
day267 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about kaiming initialization forward pass mean squared error loss function gradients and backward pass linear layers and relu activation function chain rule backpropagation and few more topics related to the same from here i have presented the implementation of kaiming initialization mse loss function and gradients using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch neural network foundations https github com thinamxx fastai blob main 16 20neural 20network 20foundations neuralfoundations ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20267 png
day268 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about gradients of matrix multiplication symbolic computation forward and backward propagation function model parameters weights and biases refactoring the model callable module and few more topics related to the same from here i have presented the implementation of relu module linear module and mean squared error module using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch neural network foundations https github com thinamxx fastai blob main 16 20neural 20network 20foundations neuralfoundations ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20268 png
day269 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about initializing model architecture callable function forward and backward propagation function linear function mean squared error loss function relu activation function back propagation function and gradients squeeze function and few more topics related to the same from here i have also read about perturbations and neural networks vanishing gradients and convolutional neural networks i have presented the implementation of defining model architecture layer function and relu using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch neural network foundations https github com thinamxx fastai blob main 16 20neural 20network 20foundations neuralfoundations ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20269 png
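the refactoring into callable modules mentioned in the day268 and day269 entries looks roughly like this; a condensed sketch in the spirit of the book's chapter, where each object stores its input and output during the forward pass and gradients are kept on a g attribute

```python
import torch

class Relu:
    def __call__(self, inp):
        self.inp = inp
        self.out = inp.clamp_min(0.0)
        return self.out
    def backward(self):
        # gradient passes only where the input was positive
        self.inp.g = (self.inp > 0).float() * self.out.g

class Lin:
    def __init__(self, w, b):
        self.w, self.b = w, b
    def __call__(self, inp):
        self.inp = inp
        self.out = inp @ self.w + self.b
        return self.out
    def backward(self):
        # chain rule for a linear layer: gradients for input, weights and bias
        self.inp.g = self.out.g @ self.w.t()
        self.w.g = self.inp.t() @ self.out.g
        self.b.g = self.out.g.sum(0)
```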
day270 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about defining base class and sub classes linear layer relu activation function and non linearities mean squared error function super class initializer kaiming initialization elementwise arithmetic and broadcasting and few more topics related to the same from here i have presented the implementation of defining linear layer and linear model using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch neural network foundations https github com thinamxx fastai blob main 16 20neural 20network 20foundations neuralfoundations ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20270 png
day271 of 300daysofdata class activation map the class activation map uses the output of the last convolutional layer which is just before the average pooling layer together with predictions to give a heatmap visualization of model decision on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about cnn interpretation class activation map hooks heatmap visualization activations and convolutional layer dot product feature map data loaders and few more topics related to the same from here i have presented the implementation of defining hook function and decoding images using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch cnn interpretation with cam https github com thinamxx fastai blob main 17 20cnn 20interpretation cnn 20interpretation ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20271 png
day272 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about hook class and context manager gradient class activation map heatmap visualization activations and weights gradients and back propagation model interpretation and few more topics related to the same from here i have presented the implementation of defining hook function activations gradients and heatmap visualization using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch cnn interpretation with cam https github com thinamxx fastai blob main 17 20cnn 20interpretation cnn 20interpretation ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20272a png image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20272b png
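the hook idea from the day271 and day272 entries can be condensed as follows; a sketch only, and the usage names (model, conv body, class index) are placeholders rather than the notebook's actual variables

```python
import torch

class Hook:
    # store a module's forward output; used on the last convolutional block
    def __init__(self, module):
        self.handle = module.register_forward_hook(self.store)
    def store(self, module, inputs, output):
        self.stored = output.detach().clone()
    def remove(self):
        self.handle.remove()

# usage sketch, assuming a model whose conv body is model[0], an input batch x
# and a target class index cls:
#   hook = Hook(model[0])
#   scores = model(x)
#   scores[0, cls].backward()   # gradients of the chosen class score
#   act = hook.stored[0]        # activations just before average pooling
# grad-cam then weights each activation map by its pooled gradient and sums
# over channels to obtain the heatmap; call hook.remove() when done
```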
day273 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about fastai learner from scratch dependent and independent variable vocabulary dataset and indexing and few more topics related to the same i have also read about convolutional neural networks perturbations and loss functions i have presented the implementation of preparing training and validation dataset using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch fastai learner from scratch https github com thinamxx fastai blob main 18 20fastai 20learner fastai 20learner ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20273 png
day274 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about creating collation function parallel preprocessing decoding images data loader class normalization and image statistics permuting axis order precision and few more topics related to the same from here i have presented the implementation of initializing data loader and normalization using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch fastai learner from scratch https github com thinamxx fastai blob main 18 20fastai 20learner fastai 20learner ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20274 png
day275 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about module and parameter forward propagation function convolutional layer training attributes kaiming normalization and xavier normalization initializer transformation function weights and biases linear model tensors and few more topics related to the same from here i have presented the implementation of defining module convolutional layer and linear model using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch fastai learner from scratch https github com thinamxx fastai blob main 18 20fastai 20learner fastai 20learner ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20275 png
day276 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about convolutional neural networks linear model testing module sequential module parameters adaptive pooling layer and mean stride hook function pipeline and few more topics related to the same from here i have presented the implementation of testing module sequential module and convolutional neural network using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch fastai learner from scratch https github com thinamxx fastai blob main 18 20fastai 20learner fastai 20learner ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20276 png
day277 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about loss function negative log likelihood function log softmax function log of sum of exponentials stochastic gradient descent optimizer function data loaders training and validation sets and few more topics related to the same from here i have presented the implementation of negative log likelihood function cross entropy loss function sgd optimizer and data loaders using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch fastai learner from scratch https github com thinamxx fastai blob main 18 20fastai 20learner fastai 20learner ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20277 png
day278 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about data convolutional neural net model loss function stochastic gradient descent and optimization function learner callbacks parameters training and epochs and few more topics related to the same from here i have presented the implementation of learner and callbacks using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch fastai learner from scratch https github com thinamxx fastai blob main 18 20fastai 20learner fastai 20learner ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20278 png
day279 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about binary classification chest x rays dicom or digital imaging and communications in medicine plotting the dicom data random splitter function medical imaging pixel data and few more topics related to the same from here i have presented the implementation of getting dicom files and inspection using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch chest x rays classification https github com thinamxx fastai blob main 19 20chest 20xrays 20classification xrays 20classification ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20279 png
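the dicom inspection mentioned in the day279 entry boils down to a couple of calls; a minimal sketch using the pydicom package, which the fastai medical imaging utilities build on, with a placeholder file name

```python
import pydicom

# "sample.dcm" is a placeholder for one file from the chest x-ray dataset
ds = pydicom.dcmread("sample.dcm")
print(ds.Modality, ds.BitsStored)  # a couple of standard dicom metadata elements
pixels = ds.pixel_array            # the raw pixel data as a numpy array
print(pixels.shape, pixels.dtype)
```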
day280 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about binary classification initializing data block and data loaders image block and category block batch transformations training pretrained model learning rate finder tensors and probabilities model interpretation and few more topics related to the same from here i have presented the implementation of initializing data block and data loaders training pretrained model and interpretation using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch chest x rays classification https github com thinamxx fastai blob main 19 20chest 20xrays 20classification xrays 20classification ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20280 png
day281 of 300daysofdata sensitivity specificity sensitivity is true positive divided by the sum of true positive and false negative the false negatives it misses are type ii errors specificity is true negative divided by the sum of true negative and false positive the false positives it misses are type i errors on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about sensitivity and specificity positive predictive value and negative predictive value confusion matrix and model interpretation type i ii error accuracy and prevalence and few more topics related to the same from here i have presented the implementation of confusion matrix sensitivity and specificity accuracy using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch chest x rays classification https github com thinamxx fastai blob main 19 20chest 20xrays 20classification xrays 20classification ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20281 png
day282 of 300daysofdata cross validation cross validation is a step in the process of building a machine learning model which helps us to ensure that our models fit the data accurately and also ensures that we do not overfit on my journey of machine learning and deep learning i have read and implemented from the book approaching almost any machine learning problem here i have read about supervised and unsupervised learning features samples and targets classification and regression clustering t distributed stochastic neighbour embedding 2d arrays cross validation overfitting and few more topics related to the same from here i have presented the implementation of tsne decomposition and preparing dataset here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book approaching almost any machine learning problem supervised and unsupervised learning https github com thinamxx approachinganymachinelearning blob main 01 20supervised 20unsupervised 20learning supervised 20unsupervised ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20282 png
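the corrected definitions in the day281 entry above map directly onto a confusion matrix; the counts below are invented purely for illustration

```python
import numpy as np

# toy binary confusion matrix: rows are actual, columns are predicted
cm = np.array([[50,  5],    # actual negative: tn, fp
               [ 8, 37]])   # actual positive: fn, tp
tn, fp, fn, tp = cm.ravel()

sensitivity = tp / (tp + fn)  # 1 - sensitivity is the type ii (false negative) rate
specificity = tn / (tn + fp)  # 1 - specificity is the type i (false positive) rate
accuracy = (tp + tn) / cm.sum()
print(sensitivity, specificity, accuracy)
```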
day283 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book approaching almost any machine learning problem here i have read about decision trees and classification features and parameters accuracy and model predictions overfitting and model generalization training loss and validation loss cross validation and few more topics related to the same from here i have presented the implementation of decision tree classifier and model evaluation here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book approaching almost any machine learning problem supervised and unsupervised learning https github com thinamxx approachinganymachinelearning blob main 01 20supervised 20unsupervised 20learning supervised 20unsupervised ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20283 png
day284 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book approaching almost any machine learning problem here i have read about stratified kfold cross validation skewed dataset and classification data distribution hold out cross validation time series data regression and sturge s rule probabilities evaluation metrics and accuracy and few more topics related to the same from here i have presented the implementation of distribution of labels and stratified kfold here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book approaching almost any machine learning problem supervised and unsupervised learning https github com thinamxx approachinganymachinelearning blob main 01 20supervised 20unsupervised 20learning supervised 20unsupervised ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20284 png
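the stratified kfold idea from the day284 entry, sketched with scikit learn and close in spirit to the snippet in approaching almost any machine learning problem; the dataframe here is made up

```python
import pandas as pd
from sklearn.model_selection import StratifiedKFold

# toy dataframe with a skewed binary target, standing in for a real training csv
df = pd.DataFrame({"feature": range(100), "target": [0] * 90 + [1] * 10})
df["kfold"] = -1

skf = StratifiedKFold(n_splits=5, shuffle=True, random_state=42)
for fold, (_, valid_idx) in enumerate(skf.split(X=df, y=df["target"])):
    df.loc[valid_idx, "kfold"] = fold  # every fold keeps the 90/10 label ratio

print(df.groupby("kfold")["target"].mean())  # roughly 0.1 in each fold
```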
day285 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book approaching almost any machine learning problem here i have read about evaluation metrics and accuracy score training and validation set precision and recall true positive and true negative false positive and false negative binary classification and few more topics related to the same from here i have presented the implementation of true negative false negative false positive and accuracy score here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book approaching almost any machine learning problem evaluation metrics https github com thinamxx approachinganymachinelearning blob main 02 20evaluation 20metrics evaluation 20metrics ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20285 png
day286 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book approaching almost any machine learning problem here i have read about true positive rate recall and sensitivity false positive rate and specificity area under roc curve prediction probability and thresholds log loss function multiclass classification and macro averaged precision and few more topics related to the same from here i have presented the implementation of true negative rate false positive rate log loss function and macro averaged precision here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book approaching almost any machine learning problem evaluation metrics https github com thinamxx approachinganymachinelearning blob main 02 20evaluation 20metrics evaluation 20metrics ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20286 png
day287 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book approaching almost any machine learning problem here i have read about multiclass classification macro averaged precision micro averaged precision weighted precision recall metrics random forest regressor mean squared error root mean squared error and few more topics related to the same from here i have presented the implementation of micro averaged precision and weighted precision here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book approaching almost any machine learning problem evaluation metrics https github com thinamxx approachinganymachinelearning blob main 02 20evaluation 20metrics evaluation 20metrics ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20287 png
day288 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book approaching almost any machine learning problem here i have read about recall metrics for multiclass classification weighted f1 score confusion matrix type i error and type ii error auc curve multilabel classification and average precision and few more topics related to the same from here i have presented the implementation of weighted f1 score and average precision here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book approaching almost any machine learning problem evaluation metrics https github com thinamxx approachinganymachinelearning blob main 02 20evaluation 20metrics evaluation 20metrics ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20288a png image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20288b png
day289 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book approaching almost any machine learning problem here i have read about regression metrics such as mean absolute error and mean squared error root mean squared error squared logarithmic error mean absolute percentage error r squared and coefficient of determination cohen s kappa score mcc score and few more topics related to the same from here i have presented the implementation of mean absolute error mean squared error squared logarithmic error mean absolute percentage error r squared and mcc score here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book approaching almost any machine learning problem evaluation metrics https github com thinamxx approachinganymachinelearning blob main 02 20evaluation 20metrics evaluation 20metrics ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20289a png image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20289b png image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20289c png
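a short sketch of the regression metrics listed in the day289 entry, computed with scikit learn on made up values; note that mean_absolute_percentage_error requires a reasonably recent scikit learn

```python
import numpy as np
from sklearn.metrics import (mean_absolute_error, mean_squared_error,
                             mean_absolute_percentage_error, r2_score)

y_true = np.array([3.0, 5.0, 2.5, 7.0])
y_pred = np.array([2.5, 5.0, 3.0, 8.0])

mae = mean_absolute_error(y_true, y_pred)
rmse = np.sqrt(mean_squared_error(y_true, y_pred))     # root mean squared error
mape = mean_absolute_percentage_error(y_true, y_pred)  # relative error measure
r2 = r2_score(y_true, y_pred)                          # coefficient of determination
print(mae, rmse, mape, r2)
```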
day290 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented about object detection and fine tuning image segmentation tensors and aspect ratio arrays dataset and data loaders i have also started the machine learning engineering for production specialization from coursera here i have read about steps of ml project and case study ml project lifecycle and few more topics related to the same from here i have presented the implementation of dataset class here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the resource mentioned below excited about the days ahead resource machine learning engineering for production image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20290 png
day291 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from pyimagesearch blogs here i have read about opencv loading and displaying an image accessing pixels array slicing and cropping resizing images rotating image smoothing image drawing on an image and few more topics related to the same i have also read about ml project lifecycle deployment patterns and pipeline monitoring from machine learning engineering for production specialization of coursera i have presented the implementation of opencv in resizing and rotating and image smoothing and drawing on an image here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the resources mentioned below excited about the days ahead resources machine learning engineering for production pyimagesearch https www pyimagesearch com opencv notebook https github com thinamxx computervision blob main 01 20opencv opencv ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20291 png
day292 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from pyimagesearch blogs here i have read about opencv counting objects converting image to grayscale edge detection thresholding detecting and drawing contours erosions and dilations masking and bitwise operations and few more topics related to the same from here i have also read about modeling overview key challenges and low average error from machine learning engineering for production specialization of coursera i have presented the implementation of opencv in converting image to grayscale edge detection thresholding detecting and drawing contours erosions and dilations here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the resources mentioned below excited about the days ahead resources machine learning engineering for production pyimagesearch https www pyimagesearch com opencv notebook https github com thinamxx computervision blob main 01 20opencv opencv ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20292 png
day293 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from pyimagesearch blogs here i have read about opencv rotating images image preprocessing rotation matrix and center coordinates image parsing edge detection and contour detection masking and blurring images and few more topics related to the same from here i have also read about baseline model selecting and training model error analysis and prioritization from machine learning engineering for production specialization of coursera i have presented the implementation of opencv in rotating images and getting roi of images here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the resources mentioned below excited about the days ahead resources machine learning engineering for production pyimagesearch https www pyimagesearch com opencv project i https github com thinamxx computervision blob main 01 20opencv ocv 20project 20i ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20293 png
day294 of 300daysofdata histogram matching histogram matching can be used as a normalization technique in an image processing pipeline as a form of color correction and color matching which allows us to obtain a consistent normalized representation of images even if lighting conditions change on my journey of machine learning and deep learning i have read and implemented from pyimagesearch blogs here i have read about opencv color detection rgb colorspace histogram matching pixel distribution cumulative distribution resizing image and few more topics related to the same from here i have also read about skewed datasets performance auditing data centric ai development and data augmentation from machine learning engineering for production specialization of coursera i have presented the implementation of opencv in histogram matching here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the resources mentioned below excited about the days ahead resources machine learning engineering for production pyimagesearch https www pyimagesearch com opencv project ii https github com thinamxx computervision blob main 01 20opencv ocv 20project 20ii ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20294 png
day295 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from pyimagesearch blogs here i have read about convolutional neural networks convolutional matrix kernels spatial dimensions padding roi of image elementwise multiplication and addition rescaling intensity laplacian kernel detecting blur and smoothing and few more topics related to the same from here i have presented the implementation of convolution method and constructing kernels here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the resources mentioned below excited about the days ahead resources machine learning engineering for production pyimagesearch https www pyimagesearch com convolution https github com thinamxx computervision blob main 02 20convolutionalneuralnetwork convolutions ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20295 png
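a condensed sketch of the convolution method discussed in the day295 entry: slide a kernel over the image, take the elementwise product with each region of interest and sum it; this follows the pyimagesearch approach in spirit rather than reproducing its code

```python
import numpy as np

def convolve(image, kernel):
    # pad with edge values so the output keeps the input's height and width
    ph, pw = kernel.shape[0] // 2, kernel.shape[1] // 2
    padded = np.pad(image, ((ph, ph), (pw, pw)), mode="edge")
    out = np.zeros(image.shape, dtype="float32")
    for y in range(image.shape[0]):
        for x in range(image.shape[1]):
            roi = padded[y:y + kernel.shape[0], x:x + kernel.shape[1]]
            out[y, x] = (roi * kernel).sum()  # elementwise multiply then sum
    return out

laplacian = np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]])  # simple edge kernel
img = np.random.rand(8, 8).astype("float32")
print(convolve(img, laplacian).shape)
```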
day296 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from pyimagesearch blogs here i have read about convolutional layers filters and kernel size strides padding input data format dilation rate activation function weights and biases kernel and bias initializer and regularizer generalization and overfitting kernel and bias constraint caltech dataset strided net and few more topics related to the same from here i have presented the implementation of strided net here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the resources mentioned below excited about the days ahead resources machine learning engineering for production pyimagesearch https www pyimagesearch com convolutional layer https github com thinamxx computervision blob main 02 20convolutionalneuralnetworks convolutional 20layers ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20296 png
day297 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from pyimagesearch blogs here i have read about cnn architecture strided net label binarizer and one hot encoding image data generator and data augmentation loading and resizing images and few more topics related to the same from here i have presented the implementation of label binarizer and preparing dataset here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the resources mentioned below excited about the days ahead resources machine learning engineering for production pyimagesearch https www pyimagesearch com convolutional layer https github com thinamxx computervision blob main 02 20convolutionalneuralnetworks convolutional 20layers ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20297 png
day298 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from pyimagesearch blogs here i have read about convolutional neural networks adam optimization function compiling and training strided net model data augmentation and image data generator classification report plotting training loss and accuracy overfitting and generalization and few more topics related to the same from here i have presented the implementation of compiling and training model classification report training loss and accuracy here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the resources mentioned below excited about the days ahead resources machine learning engineering for production pyimagesearch https www pyimagesearch com convolutional layer https github com thinamxx computervision blob main 02 20convolutionalneuralnetworks convolutional 20layers ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20298 png
day299 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about transformers model gpt2 pretrained model and tokenizer encodes and decodes methods preparing dataset transform method data loaders and few more topics related to the same from here i have presented the implementation of pretrained gpt2 model and tokenizer and transformed dataloaders using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch transformers https github com thinamxx fastai blob main 20 20transformers transformers ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20299a png image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20299b png
day300 of 300daysofdata on my journey of machine learning and deep learning i have read and implemented from the book deep learning for coders with fastai and pytorch here i have read about transformers model data loaders batch size and sequence length language model fine tuning gpt2 model callback learner perplexity and cross entropy loss function learning rate finder training and generating predictions and few more topics related to the same from here i have presented the implementation of initializing dataloaders fine tuning gpt2 model and lr finder using fastai and pytorch here in the snapshot i hope you will gain some insights and work on the same i hope you will also spend some time learning the topics from the book mentioned below excited about the days ahead book deep learning for coders with fastai and pytorch transformers https github com thinamxx fastai blob main 20 20transformers transformers ipynb image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20300a png image https github com thinamxx 300days machinelearningdeeplearning blob main images day 20300b png
machine-learning deep-learning python
ai
Real-Time-Operating-Systems-Design-And-Programming-Education-Kit
real time operating systems design and programming education kit welcome to our real time operating systems design and programming education kit download the edkit here https github com arm university real time operating systems design and programming education kit archive refs heads main zip our flagship offering to universities worldwide is the arm university program education kit series these self contained educational materials are offered exclusively and at no cost to academics and teaching staff worldwide they re designed to support your day to day teaching on core electronic engineering and computer science subjects you have the freedom to choose which modules to teach you can use all the modules in the education kit or only those that are most appropriate to your teaching outcomes our real time operating systems design and programming education kit teaches your students how operating systems control individual devices and how to enable the efficient functioning of device networks especially in real time environments given the complex tasks facing today s computing devices operating systems must be stable fast and efficient particularly given the interaction between devices that recent technologies such as the internet of things will bring a full description of the education kit can be found here https www arm com resources education education kits real time operating systems kit specification a full set of lecture slides ready for use in a typical 10 12 week undergraduate course full syllabus below lab manual with solutions for faculty labs use low cost powerful hardware boards donated by partners and subject to availability license for royalty free keil rtx real time os bundled with keil mdk development software prerequisites basics of programming course aim to produce students who can design and program real time operating systems on arm based platforms and use them to improve their application performance syllabus 1 introduction 2 os overview 3 process task and thread 4 scheduling 5 concurrency 6 memory 7 virtual memory 8 file system and i o 9 rtos and rtx 10 rtx task and simple time management 11 sharing data on rtx 12 performance evaluation and os aware debugging license you are free to fork or clone this material see license md https github com arm university real time operating systems design and programming education kit blob main license license md for the complete license inclusive language commitment arm is committed to making the language we use inclusive meaningful and respectful our goal is to remove and replace non inclusive language from our vocabulary to reflect our values and represent our global ecosystem arm is working actively with our partners standards bodies and the wider ecosystem to adopt a consistent approach to the use of inclusive language and to eradicate and replace offensive terms we recognise that this will take time this course has been updated to replace references to non inclusive language we recognise that some of you will be accustomed to using the previous terms and may not immediately recognise their replacements please refer to the following example when introducing the amba axi protocols we will use the term manager instead of master and subordinate instead of slave this course may still contain other references to non inclusive language it will be updated with newer terms as those terms are agreed and ratified with the wider community contact us at education arm com with questions or comments about this course you can also report non inclusive
and offensive terminology usage in arm content at terms arm com
rtos arm stm32f4-discovery real-time-operating-system keil-mdk
os
kictdigital
hi there kictdigital kictdigital is a special repository because its readme md this file appears on your github profile here are some ideas to get you started i m currently working on i m currently learning i m looking to collaborate on i m looking for help with ask me about how to reach me pronouns fun fact
server
deepxde
deepxde build status https github com lululxvi deepxde actions workflows build yml badge svg https github com lululxvi deepxde actions workflows build yml documentation status https readthedocs org projects deepxde badge version latest https deepxde readthedocs io en latest badge latest codacy badge https app codacy com project badge grade 5c67adbfeabd4ccc9b84d2212c50a342 https app codacy com gh lululxvi deepxde dashboard utm source gh utm medium referral utm content utm campaign badge grade pypi version https badge fury io py deepxde svg https badge fury io py deepxde pypi downloads https static pepy tech badge deepxde https www pepy tech projects deepxde conda version https anaconda org conda forge deepxde badges version svg https anaconda org conda forge deepxde conda downloads https img shields io conda dn conda forge deepxde svg https anaconda org conda forge deepxde license https img shields io github license lululxvi deepxde https github com lululxvi deepxde blob master license deepxde is a library for scientific machine learning and physics informed learning deepxde includes the following algorithms physics informed neural network pinn solving different problems solving forward inverse ordinary partial differential equations odes pdes siam rev https doi org 10 1137 19m1274067 solving forward inverse integro differential equations ides siam rev https doi org 10 1137 19m1274067 fpinn solving forward inverse fractional pdes fpdes siam j sci comput https doi org 10 1137 18m1229845 nn arbitrary polynomial chaos nn apc solving forward inverse stochastic pdes spdes j comput phys https doi org 10 1016 j jcp 2019 07 048 pinn with hard constraints hpinn solving inverse design topology optimization siam j sci comput https doi org 10 1137 21m1397908 improving pinn accuracy residual based adaptive sampling siam rev https doi org 10 1137 19m1274067 comput methods appl mech eng https doi org 10 1016 j cma 2022 115671 gradient enhanced pinn gpinn comput methods appl mech eng https doi org 10 1016 j cma 2022 114823 pinn with multi scale fourier features comput methods appl mech eng https doi org 10 1016 j cma 2021 113938 slides https github com lululxvi tutorials blob master 20211210 pinn pinn pdf video https www youtube com watch v wfgr1pma9fy list pl1e3jic2 dwwjq528agjymepa0omadsa9 index 13 video in chinese http tianyuan xmu edu cn cn minicourses 637 html physics informed deep operator network deeponet deeponet learning operators nat mach intell https doi org 10 1038 s42256 021 00302 5 deeponet extensions e g pod deeponet comput methods appl mech eng https doi org 10 1016 j cma 2022 114778 mionet learning multiple input operators siam j sci comput https doi org 10 1137 22m1477751 fourier deeponet comput methods appl mech eng https doi org 10 1016 j cma 2023 116300 fourier mionet arxiv https arxiv org abs 2303 04778 physics informed deeponet sci adv https doi org 10 1126 sciadv abi8605 multifidelity deeponet phys rev research https doi org 10 1103 physrevresearch 4 023210 deepm mnet solving multiphysics and multiscale problems j comput phys https doi org 10 1016 j jcp 2021 110296 j comput phys https doi org 10 1016 j jcp 2021 110698 reliable extrapolation comput methods appl mech eng https doi org 10 1016 j cma 2023 116064 multifidelity neural network mfnn learning from multifidelity data j comput phys https doi org 10 1016 j jcp 2019 109020 pnas https doi org 10 1073 pnas 1922210117 deepxde supports five tensor libraries as backends tensorflow 1 x tensorflow compat v1 in tensorflow 2 x tensorflow 
2 x pytorch jax and paddlepaddle for how to select one see working with different backends https deepxde readthedocs io en latest user installation html working with different backends documentation readthedocs https deepxde readthedocs io docs images pinn png docs images deeponet png docs images mfnn png docs images backend png features deepxde has implemented many algorithms as shown above and supports many features enables the user code to be compact resembling closely the mathematical formulation complex domain geometries without tyranny mesh generation the primitive geometries are interval triangle rectangle polygon disk ellipse star shaped cuboid sphere hypercube and hypersphere other geometries can be constructed as constructive solid geometry csg using three boolean operations union difference and intersection deepxde also supports a geometry represented by a point cloud 5 types of boundary conditions bcs dirichlet neumann robin periodic and a general bc which can be defined on an arbitrary domain or on a point set and approximate distance functions for hard constraints different neural networks fully connected neural network fnn stacked fnn residual neural network spatio temporal multi scale fourier feature networks etc many sampling methods uniform pseudorandom latin hypercube sampling halton sequence hammersley sequence and sobol sequence the training points can be kept the same during training or be resampled adaptively every certain iterations 4 function spaces power series chebyshev polynomial gaussian random field 1d 2d data parallel training on multiple gpus different optimizers adam l bfgs etc conveniently save the model during training and load a trained model callbacks to monitor the internal states and statistics of the model during training early stopping etc uncertainty quantification using dropout float16 float32 and float64 many other useful features different weighted losses learning rate schedules metrics etc all the components of deepxde are loosely coupled and thus deepxde is well structured and highly configurable it is easy to customize deepxde to meet new demands installation deepxde requires one of the following backend specific dependencies to be installed tensorflow 1 x tensorflow https www tensorflow org 2 7 0 tensorflow 2 x tensorflow https www tensorflow org 2 2 0 tensorflow probability https www tensorflow org probability 0 10 0 pytorch pytorch https pytorch org 1 9 0 jax jax https jax readthedocs io flax https flax readthedocs io optax https optax readthedocs io paddlepaddle paddlepaddle https www paddlepaddle org cn en develop version https www paddlepaddle org cn en install quick docurl documentation docs en develop install pip linux pip en html then you can install deepxde itself install the stable version with pip pip install deepxde install the stable version with conda conda install c conda forge deepxde for developers you should clone the folder to your local machine and put it along with your project scripts git clone https github com lululxvi deepxde git
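after installing, a minimal sanity check in the style of the documented demos is sketched below: a pinn solving the ode u'(t) = u(t) with u(0) = 1, whose exact solution is exp(t); the api names used here (dde.geometry.TimeDomain, dde.icbc.IC, dde.data.PDE, dde.nn.FNN) follow the current docs but should be verified against the version you installed

```python
import numpy as np
import deepxde as dde

def ode(t, u):
    du_dt = dde.grad.jacobian(u, t)
    return du_dt - u  # residual of u' = u

geom = dde.geometry.TimeDomain(0, 2)
ic = dde.icbc.IC(geom, lambda t: 1.0, lambda t, on_initial: on_initial)
data = dde.data.PDE(geom, ode, ic, num_domain=30, num_boundary=2,
                    solution=lambda t: np.exp(t), num_test=100)
net = dde.nn.FNN([1] + [32] * 3 + [1], "tanh", "Glorot uniform")
model = dde.Model(data, net)
model.compile("adam", lr=1e-3, metrics=["l2 relative error"])
model.train(iterations=10000)
```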
explore more install and setup https deepxde readthedocs io en latest user installation html demos of function approximation https deepxde readthedocs io en latest demos function html demos of forward problems https deepxde readthedocs io en latest demos pinn forward html demos of inverse problems https deepxde readthedocs io en latest demos pinn inverse html demos of operator learning https deepxde readthedocs io en latest demos operator html faq https deepxde readthedocs io en latest user faq html research papers used deepxde https deepxde readthedocs io en latest user research html api https deepxde readthedocs io en latest modules deepxde html cite deepxde if you use deepxde for academic research you are encouraged to cite the following paper article lu2021deepxde author lu lu and meng xuhui and mao zhiping and karniadakis george em title deepxde a deep learning library for solving differential equations journal siam review volume 63 number 1 pages 208 228 year 2021 doi 10 1137 19m1274067 contributing to deepxde first off thanks for taking the time to contribute reporting bugs to report a bug simply open an issue in the github issues https github com lululxvi deepxde issues suggesting enhancements to submit an enhancement suggestion for deepxde including completely new features and minor improvements to existing functionality let us know by opening an issue in the github issues https github com lululxvi deepxde issues pull requests if you made improvements to deepxde fixed a bug or had a new example feel free to send us a pull request asking questions to get help on how to use deepxde or its functionalities you can open a discussion in the github discussions https github com lululxvi deepxde discussions answering questions if you know the answer to any question in the discussions https github com lululxvi deepxde discussions you are welcome to answer slack the deepxde slack hosts a primary audience of moderate to experienced deepxde users and developers for general chat online discussions collaboration etc if you need a slack invite please send me an email the team deepxde was developed by lu lu https lu seas upenn edu under the supervision of prof george karniadakis https www brown edu research projects crunch george karniadakis at brown university https www brown edu from the summer of 2018 to 2020 supported by philms https www pnnl gov computing philms deepxde was originally self hosted in subversion at brown university under the name sciconet scientific computing neural networks on feb 7 2019 sciconet was moved from subversion to github renamed to deepxde deepxde is currently maintained by lu lu https lu seas upenn edu at university of pennsylvania https www upenn edu with major contributions coming from several talented individuals in various forms and means a non exhaustive but growing list needs to mention zongren zou https github com zongrenzou zhongyi jiang https github com jerry jzy shunyuan mao https github com smao astro paul escapil inchausp https github com pescap license lgpl 2 1 license https github com lululxvi deepxde blob master license
neural-network deep-learning scientific-machine-learning pinn multi-fidelity-data operator pytorch physics-informed-learning jax deeponet paddle pde tensorflow
ai
Mobile-UXSDK-iOS
dji ux sdk for ios what is this the ux sdk is a suite of product agnostic ui objects that fast tracks the development of ios applications using the dji mobile sdk http developer dji com mobile sdk get started immediately ux sdk installation with cocoapods since this project has been integrated with dji ios ux sdk cocoapods https cocoapods org pods dji uxsdk ios now please check the following steps to install djisdk framework using cocoapods after downloading this project 1 install cocoapods open terminal and change to the downloaded project s directory enter the following command to install it sudo gem install cocoapods the process may take a long time please wait for further installation instructions please check this guide https guides cocoapods org using getting started html getting started 2 install ux sdk and djiwidget with cocoapods in the project run the following command in the objcsamplecode and swiftsamplecode paths pod install if you install it successfully you should get messages similar to the following analyzing dependencies downloading dependencies installing dji sdk ios 4 16 installing dji uxsdk ios 4 16 installing djiwidget 1 6 6 installing djiflysafedatabaseresource 01 00 01 18 generating pods project integrating client project please close any current xcode sessions and use uxsdkocsample xcworkspace for this project from now on pod installation complete there is 1 dependency from the podfile and 1 total pod installed note if you see the unable to satisfy the following requirements issue during pod install please run the following commands to update your pod repo and install the pod again pod repo update pod install run sample code developers will need to set up the app key by editing the sample code s info plist after generating their unique app key https developer dji com mobile sdk documentation quick start index html generate an app key for the objective c sample app the key value djisdkappkey should be added to info plist with your unique app key as a string for the swift sample app the djisdkappkey is present in the info plist developers just need to add their unique key in both cases developers will still need to update the bundle identifier http developer dji com user mobile sdk ios configuration one of dji s aircraft or handheld cameras will be required to run the sample application djiwidget integration starting from dji ios sdk 4 7 we have replaced the videopreviewer with djiwidget for video decoding please add the following line to your podfile to install it to your xcode project pod djiwidget 1 6 6 note remember to add the use frameworks in the pod file learn more about dji ux sdk please visit ux sdk introduction http developer dji com mobile sdk documentation introduction ux sdk introduction html for more details development workflow from registering as a developer to deploying an application the following will take you through the full mobile sdk application development process prerequisites https developer dji com mobile sdk documentation application development workflow workflow prerequisits html register as dji developer download sdk https developer dji com mobile sdk documentation application development workflow workflow register html integrate sdk into application https developer dji com mobile sdk documentation application development workflow workflow integrate html run application https developer dji com mobile sdk documentation application development workflow workflow run html testing profiling debugging https developer dji com mobile sdk
documentation application development workflow workflow testing html deploy https developer dji com mobile sdk documentation application development workflow workflow deploy html feedback we d love to have your feedback as soon as possible reach out to us when you hit roadblocks or want to talk through something at a minimum please let us know what improvements would you like to see what is hard to use or inconsistent with your expectations what is good any bugs you come across support you can get support from dji with the following methods post questions keep up to date on dji developer news and contribute to the community by visiting dji s developer forum here https forum dji com forum 139 1 html from developer dev dji com join us dji is looking for all kinds of software engineers to continue building the future of possible available positions in shenzhen china and around the world if you are interested please send your resume to software sz dji com for more details and a list of all our global offices please check https we dji com jobs en html dji based dji software sz dji com https we dji com zh cn recruitment
front_end
SOS4NLP
sos4nlp sos4nlp a survey list of surveys for natural language processing mainly contributed and maintained by yuan zang https github com zangy17 reading the surveys is an efficient way to learn about an academic field this repository provides a paperlist of surveys for different areas of natural language processing thanks to all great contributors acknowledgements everyone on github is welcome to contribute to this repository contents 0 surveys of natural language processing 0 surveys of natural language processing 1 language parsing 1 language parsing 1 1 chinese word segmentation 11 chinese word segmentation 1 2 syntactic parsing 12 syntactic parsing 1 3 dependency parsing 13 dependency parsing 1 4 semantic parsing 14 semantic parsing 1 5 part of speech tagging 15 part of speech tagging 1 6 word sense disambiguation 16 word sense disambiguation 1 7 named entity recognition 17 named entity recognition 1 8 coreference resolution 18 coreference resolution 2 natural language understanding and generation 2 natural language understanding and generation 2 1 text classification 21 text classification 2 2 sentiment analysis 22 sentiment analysis 2 3 natural language inference 23 natural language inference 2 4 reading comprehension 24 reading comprehension 2 5 text generation 25 text generation 2 6 machine translation 26 machine translation 2 7 text summarization 27 text summarization 3 information extraction 3 information extraction 3 1 relation extraction 31 relation extraction 3 2 event extraction 32 event extraction 3 3 open information extraction 33 open information extraction 4 information retrieval 4 information retrieval 5 dialogue and question answering 5 dialogue and question answering 5 1 dialogue 51 dialogue 5 2 question answering 52 question answering 6 representation learning 6 representation learning 6 1 representation learning 61 representation learning 6 2 word representation learning 62 word representation learning 6 3 network representation learning 63 network representation learning 7 knowledge graph 7 knowledge graph 7 1 knowledge graph 71 knowledge graph 7 2 common sense knowledge graph 72 common sense knowledge graph 8 machine learning for natural language processing 8 machine learning for natural language processing 8 1 deep learning for natural language processing 81 deep learning for natural language processing 8 2 transformers and pre trained language models 82 transformers and pre trained language models 8 3 graph neural networks 83 graph neural networks 8 4 reinforcement learning 84 reinforcement learning 8 5 data augmentation 85 data augmentation 8 6 few and zero shot learning 86 few and zero shot learning 8 7 meta learning 87 meta learning 8 8 continual learning 88 continual learning 8 9 contrastive learning 89 contrastive learning 8 10 multi task learning 810 multi task learning 8 11 interpretability and analysis 811 interpretability and analysis 8 12 security threats and defense 812 security threats and defense 9 natural language processing applications 9 natural language processing applications 9 1 legal intelligence 91 legal intelligence 9 2 bioinformatics 92 bioinformatics 9 3 financial intelligence 93 financial intelligence 9 4 recommendation 94 recommendation 9 5 computational social science 95 computational social science acknowledgements acknowledgements 0 surveys of natural language processing 1 neural network methods for natural language processing yoav goldberg slhlt 2017 paper 1 advances in natural language processing julia hirschberg
christopher d manning science 2015 paper https nlp stanford edu manning xyzzy hirschberg manning science 2015 pdf 1 jumping nlp curves a review of natural language processing research erik cambria bebo white cim 2014 paper https www gwern net docs ai 2014 cambria pdf 1 natural language processing an introduction prakash m nadkarni lucila ohno machado wendy w chapman jamia 2011 paper https watermark silverchair com 18 5 544 pdf 1 language parsing 1 1 chinese word segmentation 1 chinese word segmentation a decade review changning huang hai zhao jcip 2007 paper https en cnki com cn article en cjfdtotal mess200703001 htm 1 2 syntactic parsing 1 syntactic parsing a survey alton f sanders and ruth h sanders computers and the humanities 1989 paper https www academia edu download 46281305 bf0005876620160606 13840 7rn8pc pdf 1 3 dependency parsing 1 dependency parsing sandra kubler ryan mcdonald joakim nivre slhlt 2009 paper https www linguisticsociety org sites default files e learning dependencies pdf 1 4 semantic parsing 1 a survey on semantic parsing aishwarya kamath rajarshi das akbc 2018 paper https arxiv org pdf 1812 00978 1 5 part of speech tagging 1 part of speech tagging angel r martinez wires comp stats 2012 paper https wires onlinelibrary wiley com doi epdf 10 1002 wics 195 1 6 word sense disambiguation 1 word sense disambiguation a survey alok ranjan pal arxiv 2015 paper https arxiv org pdf 1508 01346 1 word sense disambiguation a survey roberto navigli csur 2009 paper http citeseerx ist psu edu viewdoc download doi 10 1 1 153 8457 rep rep1 type pdf 1 7 named entity recognition 1 a survey on deep learning for named entity recognition jing li aixin sun jianglei han chenliang li tkde 2020 paper https arxiv org pdf 1812 09449 1 a survey of named entity recognition and classification david nadeau satoshi sekine lingvisticae investigationes 2007 paper https www time mk trajkovski thesis li07 pdf 1 8 coreference resolution 1 coreference resolution a survey pradheep elango university of wisconsin madison wi 2005 paper https citeseerx ist psu edu viewdoc download doi 10 1 1 102 1565 rep rep1 type pdf 2 natural language understanding and generation 2 1 text classification 1 text classification algorithms a survey kamran kowsari kiana jafari meimandi mojtaba heidarysafa sanjana mendu laura barnes donald brown information 2019 paper https www mdpi com 2078 2489 10 4 150 pdf 1 semantic text classification a survey of past and recent advances berna altinel murat can ganiz ip m 2018 paper https www
sciencedirect com science article abs pii s0306457317305757 2 2 sentiment analysis 1 a survey of sentiment analysis in social media lin yue weitong chen xue li wanli zuo minghao yin kais 2019 paper http cse iitkgp ac in saptarshi courses socomp2020a sentiment analysis survey yue2019 pdf 1 sentiment analysis algorithms and applications a survey walaa medhat ahmed hassan hoda korashy asej 2014 paper https www sciencedirect com science article pii s2090447914000550 2 3 natural language inference 1 recent advances in natural language inference a survey of benchmarks resources and approaches shane storks qiaozi gao joyce y chai arxiv 2019 paper https arxiv org pdf 1904 01172 2 4 reading comprehension 1 a survey on machine reading comprehension tasks evaluation metrics and benchmark datasets changchang zeng shaobo li qin li jie hu jianjun hu as 2020 paper https www mdpi com 2076 3417 10 21 7640 pdf 1 neural machine reading comprehension methods and trends shanshan liu xin zhang sheng zhang hui wang weiming zhang as 2019 paper https www mdpi com 2076 3417 9 18 3698 pdf 2 5 text generation 1 pretrained language models for text generation a survey junyi li tianyi tang wayne xin zhao ji rong wen arxiv 2021 paper https arxiv org pdf 2105 10311 pdf 1 survey of the state of the art in natural language generation core tasks applications and evaluation albert gatt emiel krahmer jair 2018 paper https www jair org index php jair article download 11173 26378 2 6 machine translation 1 neural machine translation a review of methods resources and tools zhixing tan shuo wang zonghan yang gang chen xuancheng huang maosong sun yang liu ai open 2020 paper https www sciencedirect com science article pii s2666651020300024 2 7 text summarization 1 a survey on dialogue summarization recent
advances and new frontiers xiachong feng xiaocheng feng bing qin arxiv 2021 paper https arxiv org pdf 2107 03175 1 the factual inconsistency problem in abstractive text summarization a survey yichong huang xiachong feng xiaocheng feng bing qin arxiv 2021 paper https arxiv org pdf 2104 14839 1 what have we achieved on text summarization dandan huang leyang cui sen yang guangsheng bao kun wang jun xie yue zhang emnlp 2020 paper https aclanthology org 2020 emnlp main 33 pdf 1 recent automatic text summarization techniques a survey mahak gambhir vishal gupta air 2017 paper https link springer com content pdf 10 1007 s10462 016 9475 9 pdf 3 information extraction 3 1 relation extraction 1 more data more relations more context and more openness a review and outlook for relation extraction xu han tianyu gao yankai lin hao peng yaoliang yang chaojun xiao zhiyuan liu peng li maosong sun jie zhou aacl 2020 paper https aclanthology org 2020 aacl main 75 pdf 1 relation extraction a survey sachin pawar girish k palshikar pushpak bhattacharyya arxiv 2017 paper https arxiv org pdf 1712 05191 pdf 3 2 event extraction 2 extracting events and their relations from texts a survey on recent research progress and challenges kang liu yubo chen jian liu xinyu zuo jun zhao ai open 2020 paper https www sciencedirect com science article pii s266665102100005x 3 3 open information extraction 1 a survey on open information extraction christina niklaus matthias cetto andre freitas siegfried handschuh coling 2018 paper https arxiv org pdf 1806 05599 4 information retrieval 1 pretrained transformers for text ranking bert and beyond andrew yates rodrigo nogueira jimmy lin wsdm 2021 paper https dl acm org doi pdf 10 1145 3437963 3441667 1 data mining and information retrieval in the 21st century a bibliographic review jiaying liu xiangjie kong xinyu zhou lei wang da zhang ivan lee bo xu feng xia science review 2019 paper https www sciencedirect com science article pii s1574013719301297 1 deep learning for matching in search and recommendation jun xu xiangnan he hang li sigir tutorial 2018 paper http staff ustc edu cn hexn papers www18 tutorial deep matching paper pdf 1 neural models for information retrieval bhaskar mitra nick craswell arxiv 2017 paper https arxiv org pdf 1705 01509 pdf 5 dialogue and question answering 5 1 dialogue 1 a survey on dialogue systems recent advances and new frontiers hongshen chen xiaorui liu dawei yin jiliang tang acm sigkdd explorations newsletter 2017 paper https arxiv org pdf 1711 01731 5 2 question answering 1 retrieving and reading a comprehensive survey on open domain question answering fengbin zhu wenqiang lei chao wang jianming zheng soujanya poria tat seng chua arxiv 2021 paper https arxiv org pdf 2101 00774 1 core techniques of question answering systems over knowledge bases a survey dennis diefenbach vanessa lopez kamal singh pierre maret kais 2018 paper https hal archives ouvertes fr hal 01637143 document 1 question answering systems survey and trends abdelghani bouziane djelloul bouchiha noureddine doumi mimoun malki procedia computer science 2015 paper https www sciencedirect com science article pii s1877050915034663 6 representation learning 6 1 representation learning 1 representation learning a review and new perspectives yoshua bengio aaron courville and pascal vincent tpami 2013 paper https arxiv org pdf 1206 5538 6 2 word representation learning 1 from word to sense embeddings a survey on vector representations of meaning jose camacho collados mohammad taher pilehvar jair 2018 paper https www jair org index php jair article download 11259 26454 6 3 network representation learning 1 network representation learning a macro and micro view xueyi liu jie tang ai open 2021 paper https www sciencedirect com science article pii s2666651021000024 1 a survey on network embedding peng cui xiao wang jian pei wenwu zhu tkde 2018 paper https arxiv org pdf 1711 08752 1 network representation learning a survey daokun zhang jie yin xingquan zhu chengqi zhang tbd 2018 paper https arxiv org pdf 1801 05852 pdf 1 network representation learning an overview cunchao tu cheng yang zhiyuan liu maosong sun ssi 2017 paper http engine scichina com publisher scp journal ssi 47 8 10 1360 n112017 00145 7 knowledge graph 7 1 knowledge graph 1 neural symbolic and neural symbolic reasoning on knowledge graphs jing zhang bo chen lingxi zhang xirui ke haipeng ding ai open 2021 paper https www sciencedirect com science article pii s2666651021000061 1 knowledge graph embedding a survey of approaches and applications quan wang zhendong mao bin wang li guo tkde 2017 paper http ieeexplore ieee org abstract document 8047276 1 knowledge graph refinement a survey of approaches and evaluation methods heiko paulheim semantic web 2017 paper http www semantic web journal net system files swj1167 pdf 1 knowledge representation learning a review zhiyuan liu maosong sun yankai lin ruobing xie jcrd 2016 paper https crad ict ac cn en article downloadarticlefile do attachtype pdf id 3099 1 a review of relational machine learning for knowledge graphs maximilian nickel kevin murphy volker tresp evgeniy gabrilovich proceedings of the ieee 2015 paper https arxiv org pdf 1503 00759 7 2 common sense knowledge graph 1 sememe knowledge computation a review of recent advances in application and expansion of sememe knowledge bases fanchao qi ruobing xie yuan zang zhiyuan liu maosong sun fcs 2021 paper https link springer com article 10 1007 s11704 020 0002 4 8 machine learning for natural language processing 8 1 deep learning for natural language processing 1 a survey of the usages of deep learning for natural language processing daniel w otter julian r medina jugal k kalita tnnls 2021 paper https arxiv org pdf 1807 10854 1 recent trends in deep learning based natural language processing tom young devamanyu hazarika soujanya poria erik cambria cim 2018 paper https arxiv org pdf 1708 02709 pdf 8 2 transformers and pre trained language models 1 a survey of transformers tianyang lin yuxin wang xiangyang liu xipeng qiu arxiv 2021 paper https arxiv org pdf 2106 04554 1 pre trained models past present and future xu han zhengyan zhang ning ding yuxian gu xiao liu yuqi huo jiezhong qiu liang zhang wentao han minlie huang qin jin yanyan lan yang liu zhiyuan liu zhiwu lu xipeng qiu ruihua song jie tang ji rong wen jinhui yuan wayne xin zhao jun zhu arxiv 2021 paper https arxiv org abs 2106 07139 1 pre trained models for natural language processing a survey xipeng qiu tianxiang sun yige xu yunfan shao ning dai xuanjing huang science china technological sciences 2020 paper https link springer com content pdf 10 1007 s11431 020 1647 3 pdf 1 efficient transformers a survey yi tay mostafa dehghani dara bahri donald metzler arxiv 2020 paper https arxiv org pdf
2009 06732 8 3 graph neural networks 1 graph neural networks for natural language processing a survey lingfei wu yu chen kai shen xiaojie guo hanning gao shucheng li jian pei bo long arxiv 2021 paper https arxiv org pdf 2106 04554 1 robustness of deep learning models on graphs a survey jiarong xu junru chen siqi you zhiqing xiao yang yang jiangang lu ai open 2021 paper https www sciencedirect com science article pii s2666651021000139 1 graph neural networks a review of methods and applications jie zhou ganqu cui shengding hu zhengyan zhang cheng yang zhiyuan liu lifeng wang changcheng li maosong sun ai open 2020 paper https www sciencedirect com science article pii s2666651021000012 1 a comprehensive survey on graph neural networks zonghan wu shirui pan fengwen chen guodong long chengqi zhang philip s yu tnnls 2020 paper https arxiv org pdf 1901 00596 pdf 8 4 reinforcement learning 1 a survey of reinforcement learning informed by natural language jelena luketina nantas nardelli gregory farquhar jakob foerster jacob andreas edward grefenstette shimon whiteson tim rocktäschel ijcai 2019 paper https arxiv org pdf 1906 03926 8 5 data augmentation 2 a survey of data augmentation approaches for nlp steven y feng varun gangal jason wei sarath chandar soroush vosoughi teruko mitamura eduard hovy acl findings 2021 paper https arxiv org pdf 2105 03075 3 an empirical survey of data augmentation for limited data learning in nlp jiaao chen derek tam colin raffel mohit bansal diyi yang arxiv 2021 paper https arxiv org pdf 2106 07499 8 6 few and zero shot learning 1 a survey on recent approaches for natural language processing in low resource scenarios michael a hedderich lukas lange heike adel jannik strötgen dietrich klakow naacl 2021 paper https aclanthology org 2021 naacl main 201 pdf 1 a survey of zero shot
learning settings methods and applications wei wang vincent w zheng han yu chunyan miao acm tist 2019 paper https www ntulily org wp content uploads journal a survey of zero shot learning settings methods and applications accepted pdf 8 7 meta learning 1 meta learning in neural networks a survey timothy hospedales antreas antoniou paul micaelli amos storkey pami 2020 paper https arxiv org pdf 2004 05439 1 a survey of deep meta learning mike huisman jan n van rijn aske plaat air 2021 paper https link springer com content pdf 10 1007 s10462 021 10004 4 pdf 8 8 continual learning 1 a continual learning survey defying forgetting in classification tasks matthias delange rahaf aljundi marc masana sarah parisot xu jia ales leonardis greg slabaugh tinne tuytelaars pami 2021 paper https arxiv org pdf 1909 08383 8 9 contrastive learning 1 a survey on contrastive self supervised learning ashish jaiswal ashwin ramesh babu mohammad zaki zadeh debapriya banerjee fillia makedon technologies 2021 paper https www mdpi com 2227 7080 9 1 2 pdf 8 10 multi task learning 1 multi task learning for natural language processing in the 2020s where are we going joseph worsham jugal kalita pattern recognition letters 2020 paper https arxiv org pdf 2007 16008 1 an overview of multi task learning in deep neural networks sebastian ruder arxiv 2017 paper https arxiv org pdf 1706 05098 8 11 interpretability and analysis 1 on interpretability of artificial neural networks a survey feng lei fan jinjun xiong mengzhou li ge wang trpms 2021 paper https arxiv org pdf 2001 02522 1 a survey of the state of explainable ai for natural language processing marina danilevsky kun qian ranit aharonov yannis katsis ban kawas prithviraj sen aacl 2020 paper https arxiv org pdf 2010 00711 1 machine learning interpretability a survey on methods and metrics diogo v carvalho eduardo m pereira jaime s cardoso electronics 2019 paper https www mdpi com 2079 9292 8 8 832 pdf 1 analysis methods in neural language processing a survey yonatan belinkov james glass tacl 2019 paper https direct mit edu tacl article pdf doi 10 1162 tacl a 00254 1923061 tacl a 00254 pdf 1 teach me to explain a review of datasets for explainable nlp sarah wiegreffe ana marasović arxiv 2021 paper https arxiv org pdf 2102 12060 8 12 security threats and defense 1 adversarial attacks on deep learning models in natural language processing a survey wei emma zhang quan z sheng ahoud alhazmi chenliang li acm tist 2020 paper https dl acm org doi pdf 10 1145 3374217 1 backdoor learning a survey yiming li baoyuan wu yong jiang zhifeng li shu tao xia arxiv 2020 paper https arxiv org pdf 2007 08745 pdf 1 a survey of privacy attacks in machine learning maria rigaki sebastian garcia arxiv 2020 paper https arxiv org pdf 2007 07646 9 natural language processing applications 9 1 legal intelligence 1 how does nlp benefit legal system a summary of legal artificial intelligence haoxi zhong chaojun xiao cunchao tu tianyang zhang zhiyuan liu maosong sun acl 2020 paper https arxiv org pdf 2004 12158 9 2 bioinformatics 1 survey of natural language processing techniques in bioinformatics zhiqiang zeng hua shi yun wu zhiling hong cmmm 2015 paper https pdfs semanticscholar org 7013 479be7dda124750aa22fb6231eea2671f630 pdf 9 3 financial intelligence 1 natural language based financial forecasting a survey frank z xing erik cambria roy e welsch air 2018 paper https dspace mit edu bitstream handle 1721 1 116314 10462 2017 9588 referencepdf pdf sequence 2 isallowed y 9 4 recommendation 1 knowledge transfer
via pre training for recommendation a review and prospect zheni zeng chaojun xiao yuan yao ruobing xie zhiyuan liu fen lin leyu lin maosong sun frontiers in big data 2021 paper https www ncbi nlm nih gov pmc articles pmc8013982 9 5 computational social science 1 from symbols to embeddings a tale of two representations in computational social science huimin chen cheng yang xuanming zhang zhiyuan liu maosong sun jianbin jin arxiv 2021 paper https arxiv org pdf 2106 14198 acknowledgements great thanks to other contributors shengding hu https github com shengdinghu chenglei si https github com noviscl and han gil kim https github com uoneway names are not listed in any particular order please contact us if we miss your name in this list we will add you back asap
ai
NW-MSDS-434-fx-rl-project
nw msds 434 fx rl project landing page example home png this is my personal project for northwestern school of professional studies masters of data science s 434 analytics application engineering it is a reinforcement learning algorithm based on an openai gym implementation for forex training it is deployed as a flask app backend with a simple bootstrapped frontend managed via circleci and deployed on google cloud platform s appengine run locally simply by running from the app dir pip3 install r requirements txt python3 main py then navigate to your local host as directed by flask model parameters currency pair trading pair that will be used eur usd usd jpy gbp usd aud usd nzd usd gbp jpy eur gbp eur cad eur sek eur chf eur huf eur jpy usd cny usd hkd usd sgd usd inr usd mxn usd php usd idr usd thb usd myr usd zar usd rub period 1d 5d 1mo 3mo 6mo 1y 2y 5y 10y ytd max
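the readme does not show the training loop itself, so as a rough illustration of the reinforcement learning setup such a project builds on, here is a generic agent-environment loop written against the classic openai gym api; the environment name and the random policy are stand-ins, not the project's actual forex environment, and newer gym/gymnasium versions return a five-element tuple from step

import gym

env = gym.make("CartPole-v1")  # stand-in; the project would use a forex trading env instead

for episode in range(3):
    obs = env.reset()
    total_reward, done = 0.0, False
    while not done:
        action = env.action_space.sample()  # placeholder agent: random policy
        obs, reward, done, info = env.step(action)
        total_reward += reward
    print("episode", episode, "reward", total_reward)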
cloud
NLP
natural language processing nlp author bill zichos billzichos hotmail com i use this repository to store apps that help in the processing of natural language for sentiment analysis text classification or tagging a majority of my nlp tasks are written in python basic nlp feature gathering word count sentence count lexical diversity misspellings search word length frequency distribution text classification the first generation of my text classification will be used to classify whether or not a text request for a free pizza is compelling enough for someone to fulfill word sentiment list i created a dictionary to store positive and negative words to be used in keyword searches in a sentiment analysis words are flagged as either having a positive or negative sentiment
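a rough sketch of the basic feature gathering listed above (word count, sentence count, lexical diversity, word length and a frequency distribution) using only the python standard library; the tokenization is deliberately naive

import re
from collections import Counter

def basic_features(text):
    words = re.findall(r"[a-z']+", text.lower())
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    return {
        "word_count": len(words),
        "sentence_count": len(sentences),
        "lexical_diversity": len(set(words)) / len(words) if words else 0.0,
        "avg_word_length": sum(map(len, words)) / len(words) if words else 0.0,
        "freq_dist": Counter(words).most_common(5),
    }

print(basic_features("I want a free pizza. This request is compelling, I promise!"))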
ai
LLMeBench
llmebench a flexible framework for accelerating llms benchmarking this repository contains code for the llmebench framework described in this paper https arxiv org abs 2308 04945 the framework currently supports evaluation of a variety of nlp tasks using three model providers openai e g gpt https platform openai com docs guides gpt huggingface inference api https huggingface co docs api inference and petals e g bloomz https huggingface co bigscience bloomz it can be seamlessly customized for any nlp task llm model and dataset regardless of language https github com qcri llmebench assets 3918663 15d989e0 edc7 489a ba3b 36184a715383 figure the architecture of the llmebench framework https github com qcri llmebench assets 3918663 7f7a0da8 cd73 49d5 90d6 e5c62781b5c3 overview figure summary and examples of the 53 datasets 31 tasks 3 model providers and metrics currently implemented and validated in llmebench https github com qcri llmebench assets 3918663 8a0ddf60 5d2f 4e8c a7d9 de37cdeac104 developing llmebench is an ongoing effort and it will be continuously expanded currently the framework features the following supports 31 tasks llmebench tasks featuring 3 model providers llmebench models tested with 53 datasets llmebench datasets associated with 12 languages resulting in 200 benchmarking assets assets ready to run easily extensible to new models accessible through apis extensive caching capabilities to avoid costly api re calls for repeated experiments supports zero and few shot learning paradigms on the fly datasets download and dataset caching open source quick start 1 install https github com qcri llmebench blob main readme md installation llmebench 2 create a new folder data then download arsas dataset https llmebench qcri org data arsas zip into data and unzip it 3 evaluate for example to evaluate the performance of a random baseline llmebench models randomgpt py for sentiment analysis on arsas dataset https github com qcri llmebench blob main llmebench datasets arsas py you need to create an asset assets ar sentiment emotion others sentiment arsas random py a file that specifies the dataset model and task to evaluate then run the evaluation as follows bash python m llmebench filter sentiment arsas random assets results where arsas random is the asset name referring to the arsas dataset name and the random model and assets ar sentiment emotion others sentiment is the directory where the benchmarking asset for the sentiment analysis task on arabic arsas dataset can be found results will be saved in a directory called results installation pip package to be made available soon clone this repository bash git clone https github com qcri llmebench git cd llmebench create and activate virtual environment bash python m venv envs llmebench source envs llmebench bin activate install the dependencies and benchmarking package bash pip install e dev fewshot get the benchmark data in addition to supporting the user to implement their own llm evaluation and benchmarking experiments the framework comes equipped with benchmarking assets over a large variety of datasets and nlp tasks to benchmark models on the same datasets download the benchmarking data from here https llmebench qcri org data an example command to download all these datasets bash mkdir data cd data wget r np nh cut dirs 3 a zip r index html https llmebench qcri org data next unzip the downloaded
files to get a directory per dataset bash for i in zip do unzip i d i zip done voilà all ready to start evaluation note some datasets and associated assets are implemented in llmebench but the dataset files can t be re distributed it is the responsibility of the framework user to acquire them from their original sources the metadata for each dataset includes a link to the primary page for the dataset which can be used to obtain the data disclaimer the datasets associated with the current version of llmebench are either existing datasets or processed versions of them we refer users to the original license accompanying each dataset as provided in the metadata for each dataset script https github com qcri llmebench tree main llmebench datasets it is our understanding that these licenses allow for datasets use and redistribution for research or non commercial purposes usage to run the benchmark bash python m llmebench filter benchmarking asset limit k n shots n ignore cache benchmark dir results dir parameters filter benchmarking asset optional this flag indicates specific tasks in the benchmark to run the framework will run a wildcard search using benchmarking asset in the assets directory specified by benchmark dir if not set the framework will run the entire benchmark limit k optional specify the number of samples from input data to run through the pipeline to allow efficient testing if not set all the samples in a dataset will be evaluated n shots n optional if defined the framework will expect a few shot asset and will run the few shot learning paradigm with n as the number of shots if not set zero shot will be assumed ignore cache optional a flag to ignore loading and saving intermediate model responses from to cache benchmark dir path of the directory where the benchmarking assets can be found results dir path of the directory where to save output results along with intermediate cached values you might need to also define environment variables like access tokens and api urls e g azure api url and azure api key depending on the benchmark you are running this can be done by either export azure api key before running the above command or prepending azure api url azure api key to the above command supplying a dotenv file using the env flag sample dotenv files are provided in the env folder each model provider s llmebench models documentation specifies what environment variables are expected at runtime outputs format results dir this folder will contain the outputs resulting from running assets it follows this structure all results json a file that presents summarized output of all assets that were run where results dir was specified as the output directory the framework will create a sub folder per benchmarking asset in this directory a sub folder will contain n json a file per dataset sample where n indicates sample order in the dataset input file this file contains input sample full prompt sent to the model full model response and the model output after post processing as defined in the asset file summary jsonl lists all input samples and for each a summarized model prediction and the post processed model prediction summary failed jsonl lists all input samples that didn t get a successful response from the model in addition to output model s reason behind failure results json contains a summary on number of processed and failed input samples and evaluation results for few shot experiments all results are stored in a sub folder named like 3 shot where the number signifies the number of few
shots samples provided in that particular experiment jq https jqlang github io jq is a helpful command line utility to analyze the resulting json files the simplest usage is jq summary jsonl which will print a summary of all samples and model responses in a readable form caching the framework provides caching if ignore cache isn t passed to enable the following allowing users to bypass making api calls for items that have already been successfully processed enhancing the post processing of the models output as post processing can be performed repeatedly without having to call the api every time running few shot assets the framework has some preliminary support to automatically select n examples per test sample based on a maximal marginal relevance based approach using langchain s implementation https python langchain com docs modules model io prompts example selectors mmr this will be expanded in the future to have more few shot example selection mechanisms e g random class based etc to run few shot assets supply the n shots n option to the benchmarking script this is set to 0 by default and will run only zero shot assets if n shots is zero only few shot assets are run tutorial the tutorials directory docs tutorials provides tutorials on the following updating an existing asset advanced usage commands to run different benchmarking use cases and extending the framework by at least one of these components model provider task dataset asset citation please cite our paper when referring to this framework article dalvi2023llmebench title llmebench a flexible framework for accelerating llms benchmarking author fahim dalvi and maram hasanain and sabri boughorbel and basel mousi and samir abdaljalil and nizi nazar and ahmed abdelali and shammur absar chowdhury and hamdy mubarak and ahmed ali and majd hawasly and nadir durrani and firoj alam year 2023 eprint 2308 04945 journal arxiv 2308 04945 primaryclass cs cl url https arxiv org abs 2308 04945
benchmarking large-language-models llm multilingual
ai
getloc-apps
getloc apps getloc get location github repositories of bangkit academy 2021 capstone project from 3 learning paths machine learning mobile development cloud computing from team b21 cap0128 about the project before traveling usually we will make a plan in advance about the location to be visited and the time of departure this is done to avoid problems one of which is that the distance to be traveled is farther than planned so the time needed does not match expectations to overcome this our team uses the traveling salesman problem method a minimal sketch of this route ordering idea appears at the end of this readme we present getloc as an application that is able to recommend several tourist attractions according to the location you want to go to and provide the fastest and cheapest routes in visiting these places screenshots logo splash png logo homeipohn png logo rekomendasi png team members team id b21 cap0128
name | student id | path
agung prabowo | m3142818 | machine learning
sabrina mutamimul ula | m1071410 | machine learning
annisa syalsabila | a0040224 | android development
faizal surya prabowo | a0070729 | android development
dimas kuncoro jati | c2772526 | cloud computing
luky mulana | c3142828 | cloud computing
resources in our project are divided into four branches 1 main https github com agungp88 getloc apps tree main 2 android development https github com agungp88 getloc apps tree android development 3 cloud computing https github com agungp88 getloc apps tree cloud computing 4 machine learning https github com agungp88 getloc apps tree machine learning getting started prerequisites 1 android 2 internet connection 3 gps location installation 1 download the apk 2 install the apk register 1 open getloc application 2 register your email address usage how to find the nearest tourist spots 1 log in to your account 2 click the search button 3 input your current location 4 getloc will recommend you the nearest tourist spots how to find the tourist spot based on time and cost we make packages of trip plans based on the cost and time 1 log in to your account 2 click recommendation trip plan 3 choose the trip plan how to get the shortest route from several tourist destinations 1 you can choose from a trip plan there are already several tourist destinations in there or you can choose tourist destinations one by one and then click next 2 you will be directed to the next page that will show you a solution of the best and shortest route product 1 getloc apps https storage googleapis com getloc 314510 appspot com getloc 1 0 apk 2 web service http getloc 314510 et r appspot com technology used 1 machine learning tensorflow https www tensorflow org python https www python org 2 android development kotlin https kotlinlang org firebase https firebase google com 3 cloud computing flask https flask palletsprojects com google compute engine https cloud google com compute google app engine https cloud google com appengine google cloud function https cloud google com functions google cloud sql https cloud google com sql 4 design figma https www figma com file pj59hwcjstaf1tqfbji0jk design node id 0 3a1 api documentation for api documentation see the following link restful apis getloc https github com agungp88 getloc apps tree cloud computing restful apis contributing we are very open to any input therefore we want to make contributing to this project as easy and transparent as possible whether it s reporting a bug discussing the current state of the code submitting a fix proposing new
features becoming a maintainer if you think something important is missing or should be different based on your experience we d love to have you contribute to this project if you have suggestions for improving these apps please contact https github com agungp88 getloc apps contact the existing ones license distributed under the gnu general public license version 3 see license for more information contact agung prabowo linkedin https www linkedin com in agung prabowo8800 github https github com agungp88 sabrina mutamimul ula linkedin https www linkedin com in sabrina m a65441130 github https github com sabrinaa68 annisa syalsabila linkedin https www linkedin com in annisa syalsabila 590099207 github https github com annisasyalsabila faizal surya prabowo linkedin https www linkedin com in faizal surya github https github com solsur dimas kuncoro jati linkedin https www linkedin com in dimas k jati github https github com dimaskunj luky mulana linkedin https www linkedin com in lukymulana github https github com lukymulana acknowledgments reference https github com alexandresanlim badges4 readme md profile
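the about section attributes route planning to the traveling salesman problem; as a minimal sketch of that route ordering idea (not the team's actual algorithm), here is a nearest-neighbour heuristic over straight-line distances between places

import math

def nearest_neighbor_route(places):
    # places: list of (name, lat, lon); the first entry is the starting point
    def dist(a, b):
        # straight-line distance as a stand-in for real travel time or cost
        return math.hypot(a[1] - b[1], a[2] - b[2])

    route, remaining = [places[0]], list(places[1:])
    while remaining:
        nxt = min(remaining, key=lambda p: dist(route[-1], p))  # greedy hop to closest
        route.append(nxt)
        remaining.remove(nxt)
    return [name for name, _, _ in route]

print(nearest_neighbor_route([
    ("current location", -6.2000, 106.8000),
    ("monas", -6.1754, 106.8272),
    ("ancol", -6.1223, 106.8317),
    ("ragunan", -6.3124, 106.8201),
]))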
front_end
udapeople
circleci https circleci com gh vincentiroleh udapeople tree master udapeople project 3 cloud devops engineering udacity nanodegree submission links url 01 https github com vincentiroleh udapeople url 02 http udapeople f207e34 s3 website us east 1 amazonaws com url 03 http d14mqrutamrgh2 cloudfront net url 04 http 3 82 198 88 3030 api status url 05 http ec2 3 87 231 17 compute 1 amazonaws com 9090 targets
cloud
Mumono-IoT
mumono mumono internet of things device is an embedded systems project designed to provide remote safety for babies marketing website visit marketing website on https mumono squarespace com web interface find webapp on our website and backend code in directory babymon2 webapp can also be accessed here https whyphybabymonitor firebaseapp com installation ensure python 3 and the requests library are installed on your pc runtime call python3 runtime py to initiate full functionality of the product the script calls driver package py which handles polling of all sensors communications py handles all data formatting and communicates correctly with the firebase app exceptiondefinitions py handles all exception errors that could occur in sensor runtime accelerometer py air quality sensor py and temperature sensor py contain our sensor interaction code communication https communication was used with post get and patch to interact with the firebase app an mqtt client and publisher were trialled but unused as https has equivalent abstraction contributors this project was completed with neelesh ravichandran neelesh99 and nidhi jaju nidhijaju
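a rough sketch of the poll-and-push pattern the readme describes, using the requests library against the firebase realtime database rest api; the database url and the sensor stub are hypothetical, the real schema and polling live in communications py and driver package py

import time
import requests

# hypothetical database path; the production app defines its own schema
FIREBASE_URL = "https://whyphybabymonitor.firebaseio.com/readings.json"

def read_sensors():
    # placeholder for driver_package.py polling the accelerometer,
    # air quality and temperature sensors
    return {"temperature_c": 24.5, "air_quality_ppm": 12, "accel_g": [0.0, 0.0, 1.0]}

while True:
    reading = read_sensors()
    reading["ts"] = int(time.time())
    # patch merges fields into the node, matching the post/get/patch usage above
    requests.patch(FIREBASE_URL, json=reading, timeout=5)
    time.sleep(10)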
os
THUMT
thumt an open source toolkit for neural machine translation contents introduction introduction online demo online demo implementations implementations notable features notable features documentation documentation license license citation citation development team development team contact contact derivative repositories derivative repositories introduction machine translation is a natural language processing task that aims to translate natural languages automatically using computers the past several years have witnessed the rapid development of end to end neural machine translation which has become the new mainstream method in practical mt systems thumt is an open source toolkit for neural machine translation developed by the natural language processing group at tsinghua university http nlp csai tsinghua edu cn site2 index php lang en the website of thumt is http thumt thunlp org http thumt thunlp org online demo the online demo of thumt is available at http translate thumt cn http 101 6 5 207 3892 the languages involved include ancient chinese arabic chinese english french german indonesian japanese portuguese russian and spanish implementations thumt currently has three main implementations thumt pytorch https github com thumt thumt a new implementation developed with pytorch https github com pytorch pytorch it implements the transformer model transformer vaswani et al 2017 https arxiv org abs 1706 03762 thumt tensorflow https github com thumt thumt tree tensorflow an implementation developed with tensorflow https github com tensorflow tensorflow it implements the sequence to sequence model seq2seq sutskever et al 2014 https papers nips cc paper 5346 sequence to sequence learning with neural networks pdf the standard attention based model rnnsearch bahdanau et al 2014 https arxiv org pdf 1409 0473 pdf and the transformer model transformer vaswani et al 2017 https arxiv org abs 1706 03762 thumt theano https github com thumt thumt tree theano the original project developed with theano https github com theano theano which is no longer updated because mila put an end to theano https github com theano theano it implements the standard attention based model rnnsearch bahdanau et al 2014 https arxiv org pdf 1409 0473 pdf minimum risk training mrt shen et al 2016 http nlp csai tsinghua edu cn ly papers acl2016 mrt pdf for optimizing model parameters with respect to evaluation metrics semi supervised training sst cheng et al 2016 http nlp csai tsinghua edu cn ly papers acl2016 semi pdf for exploiting monolingual corpora to learn bi directional translation models and layer wise relevance propagation lrp ding et al 2017 http nlp csai tsinghua edu cn ly papers acl2017 dyz pdf for visualizing and analyzing rnnsearch the following table summarizes the features of the three implementations
implementation | model | criterion | optimizer | lrp
theano | rnnsearch | mle mrt sst | sgd adadelta adam | rnnsearch
tensorflow | seq2seq rnnsearch transformer | mle | adam | rnnsearch transformer
pytorch | transformer | mle | sgd adadelta adam | n a
we recommend using thumt pytorch https github com thumt thumt or thumt tensorflow https github com thumt thumt tree tensorflow which deliver better translation performance than thumt theano https github com thumt thumt tree theano we will keep adding new features to thumt pytorch https github com thumt thumt and thumt tensorflow https github com thumt thumt tree tensorflow notable features transformer vaswani et al 2017 https arxiv org abs 1706 03762 multi gpu training decoding multi worker distributed
training mixed precision training decoding model ensemble averaging gradient aggregation tensorboard for visualization documentation the documentation of the pytorch implementation is available at docs index md license the source code is dual licensed open source licensing is under the bsd 3 clause https opensource org licenses bsd 3 clause which allows free use for research purposes for commercial licensing please email thumt17 gmail com mailto thumt17 gmail com citation please cite the following papers zhixing tan jiacheng zhang xuancheng huang gang chen shuo wang maosong sun huanbo luan yang liu thumt an open source toolkit for neural machine translation https www aclweb org anthology 2020 amta research 11 amta 2020 jiacheng zhang yanzhuo ding shiqi shen yong cheng maosong sun huanbo luan yang liu 2017 thumt an open source toolkit for neural machine translation https arxiv org abs 1706 06415 arxiv 1706 06415 development team project leaders maosong sun http www thunlp org site2 index php zh people id 16 yang liu http nlp csai tsinghua edu cn ly huanbo luan project members theano jiacheng zhang yanzhuo ding shiqi shen yong cheng tensorflow zhixing tan jiacheng zhang xuancheng huang gang chen shuo wang zonghan yang pytorch zhixing tan gang chen contact if you have questions suggestions or bug reports please email thumt17 gmail com mailto thumt17 gmail com derivative repositories uce4bt https github com thunlp mt uce4bt improving back translation with uncertainty based confidence estimation l2copy4ape https github com thunlp mt l2copy4ape learning to copy for automatic post editing document transformer https github com thunlp mt document transformer improving the transformer translation model with document level context pr4nmt https github com thunlp mt pr4nmt prior knowledge integration for neural machine translation using posterior regularization
neural-machine-translation machine-translation deep-learning
ai
IoT-Implant-Toolkit
py3 6 https img shields io badge python 3 6 blue svg mit https img shields io github license mashape apistatus svg iot implant toolkit iot implant toolkit is a framework of useful tools for malware implantation research on iot devices it is a toolkit consisting of essential software tools for firmware modification serial port debugging software analysis and stable spy clients with an easy to use and extensible shell like environment iot implant toolkit is a one stop shop toolkit that simplifies the complex procedure of iot malware implantation in our research we have successfully implanted trojans in eight devices including smart speakers cameras driving recorders and mobile translators with iot implant toolkit a demo video is below implantdemo gif resources implantdemo gif how to use installation make sure you have git python3 and setuptools installed for audio processing and playing you should install alsa built in in linux sox and ffplay on ubuntu18 04 bash sudo apt install sox ffmpeg download the source code from our github bash git clone https github com arthastang iot implant toolkit git set up the environment and install dependencies bash cd iot implant toolkit python3 setup py install run run the toolkit bash python3 b iot implant toolkit py iot implant toolkit a framework for iot implantation research by marvel team command list list all tools run run a specific tool exit exit implant toolkit three commands are supported list list all plugins run run a specific plugin with run plugin parameters exit exit features each software tool acts as a plugin which can be easily added into the framework there are more than ten plugins in four categories covering serial port debugging firmware pack unpack software analysis and implanted spy programs list of plugins existing plugins in our framework

| categories | tools | descriptions | reference |
| --- | --- | --- | --- |
| serial port debugging | pyserial | modem control and terminal emulation program | https github com pyserial pyserial |
| serial port debugging | baudrate py | find correct baudrate | https github com devttys0 baudrate |
| firmware pack unpack | mksquashfs | create and extract squashfs filesystem | https github com plougher squashfs tools |
| firmware pack unpack | mkbootimg tools | unpack repack boot img for android | https github com xiaolu mkbootimg tools |
| firmware pack unpack | cramfs | make cramfs filesystem | https sourceforge net projects cramfs files cramfs 1 1 |
| firmware pack unpack | mountimg | mount unmount ext4 filesystems for android system img data img | on our github |
| software analysis | setools android | setools for android with sepolicy inject | https github com xmikos setools android |
| software analysis | crosscompile | crosscompile toolchain for arm | on our github later |
| software analysis | odex | unpack odex to smali for android | on our github |
| binary implant | spy client server | a stable spy client and server source and pre built bins | on our github |
| binary implant | denoise tool | denoise tool for audio processing | on our github |

create new plugins code structure bash iot implant toolkit py startup script outputs default folder of outputs toolkit core basic basic plugin class definition cli shell like cli definition toollist auto updating toollist of plugins plugins firmware plugins for firmware modification implant plugins for generating spy programs serialport plugins for serial port debugging software plugins for software analysis especially for android tools other tools create newplugin py in the corresponding folder category and define init attributes to add a new plugin to iot implant toolkit the framework will detect the new plugin
automatically at startup other tools hardware tools essential hardware tools for malware implantation research see pictures in hardwaretools

| name | description |
| --- | --- |
| soldering iron | solder tools |
| solder wire | solder tools |
| solder paste | solder tools |
| solder wick | solder tools |
| hot air gun | solder tools |
| reballing tool | reballing tool |
| usb to ttl | debug console cable |
| dupont wire | electrical wire |
| eprom burner programmer | burner programmer |

other useful software tools we have not added more plugins due to time limitations the chart below lists tools that do not fit our framework but may be useful we hope that iot implant toolkit will be an essential toolkit in malware implantation

| categories | tools | descriptions | reference |
| --- | --- | --- | --- |
| firmware analysis | binwalk | a fast easy to use tool for analyzing reverse engineering and extracting firmware images | https github com refirmlabs binwalk |
| firmware modify | firmware mod kit | a collection of scripts and utilities to extract and rebuild linux based firmware images | https github com rampagex firmware mod kit |
| cross compiler | buildroot | cross compiler for arm mips powerpc | https buildroot org |
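The plugin mechanism above only asks for a new file with init attributes in the right category folder. The sketch below shows the general shape such a plugin could take; the base class and attribute names used here (Plugin, name, description, execute) are illustrative guesses, not the toolkit's actual API, so check toolkit/core/basic for the real definitions:

```python
# Hypothetical sketch of a new IoT-Implant-Toolkit plugin. The real base
# class and attribute names live in toolkit/core/basic; the ones used here
# (Plugin, self.name, self.description, execute) are illustrative guesses,
# not the toolkit's actual API.
class Plugin:
    """Stand-in for the toolkit's basic plugin class."""
    def __init__(self):
        self.name = ""
        self.description = ""

class HelloPlugin(Plugin):
    def __init__(self):
        super().__init__()
        # attributes the framework would read when auto-building its tool list
        self.name = "helloplugin"
        self.description = "example plugin that just prints its arguments"

    def execute(self, args):
        print(f"[{self.name}] running with args: {args}")

if __name__ == "__main__":
    HelloPlugin().execute(["--demo"])
```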
server
Responsive-Web-Design-Artboards
responsive web design artboards use these adobe illustrator files when creating different layouts for responsive web design files desktop responsive design artboards ai contains four 4 artboards for desktop browsing desktop 800x600 778x400 desktop 1024x768 1002x658 desktop 1280x800 1258x760 desktop 1680x1050 1418x700 note that the heights are based on default window sizing i e stuff that s above the fold iphone ipad web design artboards ai contains eighteen 18 artboards for iphone and ipad ipad landscape 1024x768 ipad portrait 768x1024 ipad retina landscape 2048x1536 ipad retina portrait 1536x2048 iphone retina landscape 640x960 iphone retina portrait 960x640 iphone landscape 320x480 iphone portrait 480x320 iphone 5 5s landscape 1136x640 iphone 5 5s portrait 640x1136 iphone 6 display zoom landscape 1136x640 iphone 6 display zoom portrait 640x1136 iphone 6 landscape 1334x750 iphone 6 portrait 750x1334 iphone 6 plus display zoom landscape 2001x1125 iphone 6 plus display zoom portrait 1125x2001 iphone 6 plus landscape 2208x1242 iphone 6 plus portrait 1242x2208 plays well with others the responsive web design artboards were designed for use with adapt js https github com nathansmith adapt take a look at adapt js when you are ready to take the designs into web production for help with designing native ios apps check out iphone sketch elements http www teehanlax com downloads iphone sketch elements ai and ipad stencils http www teehanlax com downloads ipad sketch elements ai contributing feel free to download and submit your own artboards for inclusion into the set
front_end
WeBall_Statistics-Backend
weball statistics the weball statistics application is a league statistics application for basketball which was created as part of the course apps development for mobile devices university of macedonia applied informatics academic year 2021 2022 6th semester part of the course was to get organized into groups of 10 people our team team 2 consists of the following students alphabetically ampatzidou elisavet charakopoulos minas theodoros dasyra evmorfia elpida iordanou sofia lougaris dionisis lousta aravella machairas panagiotis ouzounidis kyriakos pepa leonard stefou george john video presentation of the app on youtube presentation video https www youtube com watch v ouzmwkucq s list ll index 12 visit the other repository with the android mobile application android app front end https github com uom android team2 weball statistics navigating the app s back end ui this is the home index page of the back end admin ui from there one can see information about the software that was developed such as team members course teachers etc through this page the admin user has the options to login or create an account by selecting the appropriate link from the nav bar through the first screen the admin can login to their account if he does not have an account he can create one through the form shown on the second screen after successfully logging into his account he will go to the admin panel from there he has the following possibilities create team and view a list with all the teams in the mysql database create player and view a list with all the players in the mysql database create league manually for every week championship draw selection automatically through our own algorithm reset his password load data from json files with ready data for players and teams so he doesn t have to put it all in manually check the manual logout through this screen he can create teams and see the existing ones through this screen he can create players and see the existing ones through this screen the admin can create the league matches manually through this screen the admin can create the league matches automatically by simply pressing the button and waiting 5sec for the process to complete this is the last screen where the admin will find information about managing the admin panel prerequisites android studio xampp control panel an emulator installed e g nexus 5 api 30 pixel 3 xl api 29 internet connection local installation for the correct use of the application the following actions are required run the back end first git clone https github com uom android team2 weball statistics backend git or download the zip from github and extract it store or move the root folder weball statistics backend master in the path xampp htdocs folder open xampp control panel and start the apache and mysql servers visit http localhost weball statistics backend index from your browser then register or login and follow the manual now for the mobile application git clone https github com uom android team2 weball statistics git or download the zip from github and extract it store or move the root folder weball statistics main in the path androidstudioprojects open android studio and the app root folder config the app set public static final string ip your ip in java uom team2 weball statistics configuration config java and set domain includesubdomains true your ip domain in res xml network security config xml start any emulator and then you are ready to launch the app note you may already see data for live matches because a real time cloud service was used firebase realtime database
6th-semester api applied-informatics basketball-stats css html javascript json mysql php postman statistics uom xampp
front_end
Embedded-Systems-Design
embedded systems design a series of embedded systems design experiments covering various aspects of embedded systems microcontroller programming and hardware interfacing exp1 matrix keypad and 7 segment display input system design exp2 pwm based breathing led system design exp3 traffic signal system with independent 7 segment displays and tricolor leds exp4 digital clock system with scanning 7 segment displays exp5 electronic piano system with passive buzzer and matrix keypad equipment or software needed step mxo2 v2 fpga development board step baseboard v2 peripheral experimentation board lattice s diamond fpga integrated development tool modelsim simulation software
os
OSAS
one stop anomaly shop osas this repository implements the models methods and techniques presented in our paper a principled approach to enriching security related data for running processes through statistics and natural language processing https www scitepress org papers 2021 103814 103814 pdf introduction video follows quick start guide this video is a recording of our hack in the box hitb security conference 2021 amsterdam presentation https www youtube com watch v wi5nxgzsfc4 quick start guide step 1 get build the docker image option 1 use precompiled image might not reflect latest changes shell docker pull tiberiu44 osas latest docker image tag tiberiu44 osas latest osas latest option 2 build the image locally shell git clone https github com adobe osas git cd osas docker build f docker osas elastic dockerfile t osas latest step 2 after building the docker image you can start osas by typing shell docker run p 8888 8888 tcp p 5601 5601 tcp v absolute path to data folder app osas important note please modify the above command by adding the absolute path to your data folder in the appropriate location after osas has started it might take 1 2 minutes you can use your browser to access some standard endpoints http localhost 5601 app home access to kibana frontend this is where you will see your data http localhost 8888 token osas access to jupyter lab open a terminal or create a notebook for debugging in case you need to shell docker run p 8888 8888 tcp p 5601 5601 tcp v absolute path to data folder app ti osas bin bash building the test pipeline this guide will take you through all the necessary steps to configure train and run your own pipeline on your own dataset prerequisite add your own csv dataset into your data folder the one provided in the docker run command once you have started your docker image use the osas console http localhost 8888 osas console to gain cli access to all the tools in what follows we assume that your dataset is called dataset csv please update the commands as necessary in case you use a different name location be sure you are running scripts in the root folder of osas bash cd osas step 1 build a custom pipeline configuration file this can be done fully manually or by bootstrapping using our conf autogenerator script bash python3 osas main autoconfig py input file app dataset csv output file app dataset conf the above command will generate a custom configuration file for your dataset it will try to guess field types and optimal combinations between fields you can edit the generated file which should be available in the shared data folder using your favourite editor standard templates for label generator types are lg multinomial generator type multinomialfield field name field name absolute threshold 10 relative threshold 0 1 group by none this is an optional field it can be a single attribute name or a list of names lg text generator type textfield field name field name lm mode char ngram range 3 5 lg numeric generator type numericfield field name field name group by none this is an optional field it can be a single attribute name or a list of names lg multinomial combiner generator type multinomialfieldcombiner field names field 1 field 2 absolute threshold 10 relative threshold 0 1 group by none this is an optional field it can be a single attribute name or a list of names lg keyword generator type keywordbased field name field name keyword
list keyword 1 keyword 2 keyword 3 lg regex generator type knowledgebased field name field name rules and labels tuple list regex 1 label 1 regex 2 label 2 you can use the above templates to add as many label generators as you want just make sure that the header ids are unique in the configuration file step 2 train the pipeline bash python3 osas main train pipeline py conf file app dataset conf input file app dataset csv model file app dataset json the above command will generate a pretrained pipeline using the previously created configuration file and the dataset step 3 run the pipeline on a dataset bash python3 osas main run pipeline py conf file app dataset conf model file app dataset json input file app dataset csv output file app dataset out csv the above command will run the pretrained pipeline on any compatible dataset in the example we run the pipeline on the training data but you can use previously unseen data it will generate an output file with labels and anomaly scores and it will also import your data into elasticsearch kibana to view the result just use the web interface http localhost 5601 app dashboards developing models now that everything is up and running we have prepared a set of development guidelines that will help you apply osas on your own dataset 1 pipeline configuration docs pipeline configuration md this will help you understand how the label generators and anomaly scoring work in osas 2 rule based score modifiers and labeling docs rules md once you have a working osas pipeline you can further refine your results by adding new labels and modifying the anomaly scoring based on static rules citing and attribution full text paper a principled approach to enriching security related data for running processes through statistics and natural language processing https www scitepress org papers 2021 103814 103814 pdf if you want to use this repository in any academic work please cite the following work mla boros tiberiu et al a principled approach to enriching security related data for running processes through statistics and natural language processing iotbds 2021 6th international conference on internet of things big data and security 2021 apa boros t cotaie a vikramjeet k malik v park l pachis n 2021 a principled approach to enriching security related data for running processes through statistics and natural language processing iotbds 2021 6th international conference on internet of things big data and security chicago boros tiberiu andrei cotaie kumar vikramjeet vivek malik lauren park and nick pachis a principled approach to enriching security related data for running processes through statistics and natural language processing in iotbds 2021 6th international conference on internet of things big data and security 2021 bibtex article boros2021principled title a principled approach to enriching security related data for running processes through statistics and natural language processing author boros tiberiu and cotaie andrei and vikramjeet kumar and malik vivek and park lauren and pachis nick year 2021 booktitle iotbds 2021 6th international conference on internet of things big data and security
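Once run_pipeline.py has produced dataset_out.csv, a quick way to triage results outside Kibana is to load the file with pandas. The column names below ("labels", "score") are assumptions for illustration; inspect the real header first:

```python
# Quick look at the pipeline output produced by run_pipeline.py above.
# Column names ("labels", "score") are assumptions for illustration;
# check the actual header of your dataset_out.csv before relying on them.
import pandas as pd

df = pd.read_csv("/app/dataset_out.csv")
print(df.columns.tolist())                 # inspect the real column names first

# rank events by anomaly score, highest first (assumed column name "score")
top = df.sort_values("score", ascending=False).head(20)
print(top[["labels", "score"]])
```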
ai
WebApplicationsDevelopmentLessons
join the chat at https gitter im lanubisl webapplicationsdevelopmentlessons https badges gitter im join 20chat svg https gitter im lanubisl webapplicationsdevelopmentlessons utm source badge utm medium badge utm campaign pr badge utm content badge c microsoft asp net mvc microsoft sql server https github com lanubisl webapplicationsdevelopmentlessons blob master lessonsplan md https github com lanubisl webapplicationsdevelopmentlessons https www youtube com channel ucqu1ll4wgyxuxmvsdml0amw https github com lanubisl webapplicationsdevelopmentlessons blob master coursework md
front_end
PrettyOS
prettyos a preemptive hard real time kernel for embedded devices list of supported features static and dynamic priority schedulers preemptive scheduling using a static priority scheduling class an rms rate monotonic scheduling https en wikipedia org wiki rate monotonic scheduling policy can be effective when the number of tasks at each priority level is 1 edf earliest deadline first with limited support for kernel services configurable number of tasks lock unlock scheduler support memory management using a basic memory manager for fixed sized allocatable objects in a memory partition i e region for static priority scheduling runtime priority change suspend resume tasks mutex support including ocpp original ceiling priority protocol https en wikipedia org wiki priority ceiling protocol to overcome priority inversion scenarios support for semaphores message mailboxes and eventflags hooks apis at application and cpu port level software based task stack overflow detection porting availability

| system | bsp | cpu port | notes |
| --- | --- | --- | --- |
| ti stellaris lm4f120 | | | |
| linux machine | | | requires posix 1b standards as minimal |

to add another port please read this porting guide port porting guide md first include the rtos you include only a single header file pretty os h kernel pretty os h which contains the list of the public apis with a proper description for each one license copyright 2020 present yahia farghaly ashour this project is mit https github com yahiafarghaly prettyos blob master license licensed
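For background on the RMS policy mentioned in the feature list, the classic Liu & Layland bound gives a quick sufficient schedulability check: n periodic tasks are schedulable under rate-monotonic priorities if total utilization stays below n(2^(1/n) - 1). A small Python illustration of that theory (not PrettyOS code):

```python
# Liu & Layland (1973) utilization bound for rate-monotonic scheduling:
# n periodic tasks are schedulable under RMS if
#   sum(C_i / T_i) <= n * (2**(1/n) - 1).
# Illustration of the theory behind RMS, not PrettyOS code.

def rms_schedulable(tasks):
    """tasks: list of (worst_case_exec_time, period) tuples."""
    n = len(tasks)
    utilization = sum(c / t for c, t in tasks)
    bound = n * (2 ** (1.0 / n) - 1)
    return utilization, bound, utilization <= bound

# three tasks: (C=1, T=4), (C=2, T=8), (C=1, T=16)
u, b, ok = rms_schedulable([(1, 4), (2, 8), (1, 16)])
print(f"U={u:.3f}, bound={b:.3f}, schedulable={ok}")  # U=0.562 <= 0.780
```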
rtos embedded-systems kernel real-time os operating-systems cortex-m4 arm embedded-devices edf-scheduling scheduling static-priority-scheduling
os
Seeed_Arduino_FreeRTOS
seeed arduino freertos build status https travis ci com seeed studio seeed arduino freertos svg branch master https travis ci com seeed studio seeed arduino freertos introduction this library ports the latest freertos 10 4 3 and allows you to create freertos projects in the arduino ide boards samd21 series zero seeeduino xiao samd51 series wio terminal license this software is written by lynnl4 for seeed studio email hongtai liu seeed cc and is licensed under the mit license check license txt for more information contributing contributing to this software is warmly welcomed you can do this basically by forking committing modifications and then pulling requests follow the links above for the operating guide adding a change log and your contact into the file header is encouraged thanks for your contribution seeed studio is an open hardware facilitation company based in shenzhen china benefiting from local manufacturing power and a convenient global logistics system we integrate resources to serve the new era of innovation seeed also works with global distributors and partners to push the open hardware movement
freertos samd21 arduino-library arduino samd51
os
Mobile-Application-Development-JavaScript-Frameworks
mobile application development javascript frameworks code repository for mobile application development javascript frameworks published by packt what you will learn develop build run and deploy great cross platform mobile applications using apache cordova create complete mobile apps using apache cordova that run on apple ios google android and windows phone create a neat user interface for your mobile application using jquery mobile gain an in depth understanding of how react native works behind the scenes write your own custom native ui components learn the ins and outs of screen navigation and master the art of layouts and styles develop native modules in objective c and java that interact with javascript get to know ionic by creating three complete mobile applications test your apps to improve and optimize performance key features extend the power of apache cordova by creating your own apache cordova cross platform mobile plugins leverage your javascript skills to become a native app developer follow the featured sample projects to experience ionic s impressive capabilities extend your developer skillset to build test and launch mobile apps with confidence note modules 1 2 and 3 have code arranged by chapter for the chapters that have code click here https docs google com forms d e 1faipqlse5qwunkgf6puvzpirpdtuy1du5rlzew23ubp2s p3wb gcwq viewform if you have any feedback or suggestions download a free pdf if you have already purchased a print or kindle version of this book you can get a drm free pdf version at no cost simply click on the link to claim your free pdf https packt link free ebook 9781787129955
front_end
cloud-drive
clouddrive this project was generated with angular cli https github com angular angular cli version 1 4 9 development server run ng serve for a dev server navigate to http localhost 4200 the app will automatically reload if you change any of the source files code scaffolding run ng generate component component name to generate a new component you can also use ng generate directive pipe service class guard interface enum module build run ng build to build the project the build artifacts will be stored in the dist directory use the prod flag for a production build running unit tests run ng test to execute the unit tests via karma https karma runner github io running end to end tests run ng e2e to execute the end to end tests via protractor http www protractortest org further help to get more help on the angular cli use ng help or go check out the angular cli readme https github com angular angular cli blob master readme md
cloud
June2023_LLM_repo
june2023 llm repo unlocking the power of large language models pmlg talk slide decks from the june 22nd perth machine learning group meetup on large language models and next generation architectures
ai
NBlockchain
nblockchain nblockchain is a net standard library for building blockchain applications this project is currently in alpha status and any contributions are welcome the idea is that the developer would only need to focus on the data and rules for a blockchain and not worry about having to build all the infrastructure to facilitate a blockchain the developer would need to define the schema of data transactions they would like to store on the blockchain define the rules for a valid transaction select or create an appropriate local database select or create an appropriate network implementation select or create one or more peer discovery implementations beyond this it is meant to be highly customizable you can switch out the default services for address encoding signing hashing algorithm block verification block consensus method eg proof of work etc installation using nuget package console pm install package nblockchain version 0 5 0 alpha using net cli dotnet add package nblockchain version 0 5 0 alpha samples digital currency samples digitalcurrency local database stores litedb default built in mongodb providers nblockchain mongodb networking implementations in memory mostly for testing demo purposes tcp sockets peer discovery implementations static from a config file etc multicast for finding peers on the local network more to come key features automatic chain fork detection and resolution open flexible transaction schema customizable transaction level rules customizable block level rules peer discovery proof of work management documentation https github com danielgerlag nblockchain tree master doc outstanding items for v1 nat traversal more peer discovery options integration tests authors daniel gerlag daniel gerlag ca license this project is licensed under the mit license see the license license file for details
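To make the proof-of-work management feature concrete, here is a generic proof-of-work loop sketched in Python; this illustrates the concept only and is not NBlockchain's .NET API:

```python
# Minimal proof-of-work loop to illustrate the concept NBlockchain manages
# for you. This is a generic sketch in Python, not the library's .NET API.
import hashlib

def mine(block_data: bytes, difficulty: int) -> int:
    """Find a nonce so the block hash starts with `difficulty` zero hex digits."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + str(nonce).encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

nonce = mine(b"example transactions", difficulty=4)
print("found nonce:", nonce)
```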
blockchain
blockchain
QWOYN
qwoyn decentralized gaming blockchain built using cosmos sdk license https github com cosmic horizon qwoyn blob main license version https github com cosmic horizon qwoyn releases latest go doc https pkg go dev github com cosmic horizon qwoyn v5 issues https github com cosmic horizon qwoyn issues good first issues https github com cosmic horizon qwoyn issues q is 3aissue is 3aopen label 3a 22good first issue 22 discussions https github com cosmic horizon qwoyn discussions discord https discord cosmic horizon com code coverage https codecov io gh cosmic horizon qwoyn introduction qwoyn blockchain is under heavy development and as a result the above features are implemented to varying degrees of completeness for more information about our approach and vision see the qwoyn blockchain specification specs qwoyn blockchain md documentation documentation for qwoyn blockchain is hosted at docs qwoyn studio https docs qwoyn studio this includes installation instructions for users and developers information about live networks running qwoyn blockchain instructions on how to interact with local and live networks infrastructure and module specific documentation tutorials for users and developers migration guides for developers upgrade guides for validators a complete list of available commands and more contributing contributions are more than welcome and greatly appreciated all the information you need to get started should be available in the contributing guidelines contributing md please take the time to read through the contributing guidelines before opening an issue or pull request the following prerequisites and commands cover the basics prerequisites git https git scm com 2 make https www gnu org software make 4 go https golang org 1 18 go tools install go tools make tools git hooks configure git hooks git config core hookspath scripts githooks lint and format run linter in all go modules make lint run linter and attempt to fix errors in all go modules make lint fix run formatting in all go modules make format run linter for all proto files make proto lint run linter and attempt to fix errors for all proto files make proto lint fix run formatting for all proto files make proto format running tests run all unit and integration tests make test manual testing build the qwoynd binary make build view the available commands build qwoyn help related repositories cosmic horizon governance https github com cosmic horizon governance guidelines and long form proposals for qwoyn mainnet cosmic horizon mainnet https github com cosmic horizon mainnet additional information and historical record for qwoyn mainnet cosmic horizon testnets https github com cosmic horizon testnets
additional information and historical record for qwoyn testnets
blockchain
HCMUS-Handwritting-Recognition
hcmus handwritting recognition 1 authors this is our final project for introduction to information technology our team 1 phuc song dong gia https github com fusodoya 2 loi nguyen minh https github com mf0212 3 thang nguyen quang https github com thanguyen165 2 environment 2 1 python 3 7 download at https docs conda io en latest miniconda html 2 2 visual studio code vs code download at https code visualstudio com download install the python extension after installing vs code 2 3 numpy library pip install numpy 2 4 matplotlib library pip install matplotlib 2 5 cv2 library pip install opencv python 3 prepare mnist dataset download the mnist dataset at http yann lecun com exdb mnist and do not unzip the files the mnist dataset contains 60 000 images used to recognise input numbers called train and 10 000 images used to check if the algorithm is good or bad called test every image has a label corresponding to the number written in the image 4 organise project the 4 zips of the mnist dataset go in the data subfolder 5 before we start run the test mnist py file to make sure the mnist dataset is successfully installed and set up 6 how do we recognise the numbers step 1 vectorize all the images of the train dataset and the input img step 2 find the distance between the input img and each img in train step 3 sort all the distances in increasing order step 4 choose the k smallest values called the k nearest neighbours knn k can be 50 100 500 etc you can choose any value for it step 5 count and find in the k labels which label has the largest frequency that is the number this algorithm guesses 7 run code run the file main py with this cmd python main py 8 optimize speed use c++ https www freecodecamp org news the c plus plus programming language code to increase speed 8 1 prepare you must have a c++ compiler to compile the c++ code get the lib hpp and lib cpp files run these commands i use gnu gcc https gcc gnu org g c fpic lib cpp o lib o g shared lib o o lib so or compile them with visual studio https visualstudio microsoft com vs you now have a lib so file keep this file and the main optimize py file in the same directory if you don t want to edit the library or you don t have a compiler use mine instead of building it yourself 8 2 let s rock run the main optimize py file instead of the main py file the only difference between these files is that main py runs the guess function in python but in main optimize py the guess function calls the guess optimize function written in c++ in lib so
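The five recognition steps above map directly onto a few lines of NumPy. This is a compact illustration of the same algorithm; the file names and array shapes here are illustrative, and the repo's own loader lives in main.py:

```python
# A compact NumPy version of the five recognition steps described above
# (vectorize, distance, sort, take k nearest, majority vote). Shapes and
# data here are illustrative; the repo loads MNIST from its data/ folder.
import numpy as np

def knn_guess(train_imgs, train_labels, input_img, k=100):
    # step 1: vectorize, flattening 28x28 images into 784-dim vectors
    train = train_imgs.reshape(len(train_imgs), -1).astype(np.float32)
    x = input_img.reshape(-1).astype(np.float32)
    # step 2: euclidean distance from the input to every training image
    dists = np.linalg.norm(train - x, axis=1)
    # steps 3-4: sort and keep the k nearest neighbours
    nearest = np.argsort(dists)[:k]
    # step 5: the most frequent label among the neighbours is the guess
    return np.bincount(train_labels[nearest]).argmax()

# toy usage with random data standing in for MNIST
rng = np.random.default_rng(0)
imgs = rng.integers(0, 256, (1000, 28, 28))
labels = rng.integers(0, 10, 1000)
print(knn_guess(imgs, labels, imgs[0], k=50))  # should often match labels[0]
```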
server
cloud-data-analysis-at-scale
data analysis at scale in the cloud course taught at duke mids https datascience duke edu noah gift spring 2020 2022 by noah gift https www noahgift com this is the course syllabus https noahgift github io cloud data analysis at scale syllabus these are the projects in the course https noahgift github io cloud data analysis at scale projects this is the week by week calendar https noahgift github io cloud data analysis at scale calendar 2022 this is the rubric for grading assignments https noahgift github io cloud data analysis at scale rubric this is the grading for the course https noahgift github io cloud data analysis at scale grading this is the faq https noahgift github io cloud data analysis at scale faq a complete online book with screencast videos is available here https paiml com docs home books cloud computing for data chapter01 getting started the coursera course building cloud computing solutions at scale specialization can be found here https www coursera org specializations building cloud computing solutions at scale guest lecture 2022 async gpt 3 book https learning oreilly com library view gpt 3 9781098113612 interview https learning oreilly com videos 52 weeks of 021822022videopaiml shubham saboo sandra kublik prequel material these resources could be helpful before starting this course duke coursera foundations of data engineering course launching early 2022 course1 python and pandas for data engineering course2 linux and bash for data engineering github repos for projects in course week1 using linux lesson 1 using linux shell lab https github com noahgift coursera de c2 using linux lesson 2 how shell piping works https github com noahgift coursera de c2 shell piping lesson 3 using ssh https github com noahgift ssh tips tricks week2 using bash lesson 1 create and use bashrc https github com noahgift coursera de c2 configure shell lesson 2 sourcing shell variables from a script https github com noahgift coursera de c2 shell variables lesson 3 using stdout and stdin https github com noahgift coursera de c2 standard streams week3 building bash scripts lesson 1 build a for loop in bash https github com noahgift coursera de c2 use shell logic and control flow lesson 2 truncate large files with bash https github com noahgift coursera de c2 truncate file lesson 3 building a command line tool for data processing https github com noahgift coursera de c2 bash cli reverse string lesson 4 build bash cli with options https github com noahgift coursera de c2 lab3 building bash scripts git week4 composing file and data management solutions with linux lesson 1 understand the search commands https github com noahgift coursera de c2 search commands lesson 2 setting permissions https github com noahgift coursera de c2 files directories permissions lesson 3 using regex to process text from file https github com noahgift coursera de c2 using regex search lesson 4 search the filesystem with find https github com noahgift coursera de c2 lab4 composing file data solutions course3 python and sql for data engineering course4 building data engineering solutions with python for web applications command line tools and notebooks sequel material these resources could be helpful after starting this course duke coursera applied data engineering course launching late 2022 github repos referenced duke coursera course course 1 cloud computing foundations practice markdown https github com noahgift duke coursera ccf lab1 blob main
practice markdown ipynb github actions pytest https github com noahgift github actions pytest google app engine continuous delivery https github com noahgift gcp flask ml deploy hello world flask https github com noahgift flask hello coursera hugo continuous delivery on aws https github com noahgift dukehugofeb1 course 2 cloud computing building blocks lint dockerfile https github com noahgift duke coursera ccb lab1 flask change microservice lecture topics getting started week1 getting started https paiml com docs home books cloud computing for data chapter01 getting started cloud computing foundations week2 cloud computing foundations https paiml com docs home books cloud computing for data chapter02 cloud foundations virtualization and containers week3 week 4 containers virtualization and elasticity https paiml com docs home books cloud computing for data chapter03 virtualization containers elasticity challenges and opportunities in distributed computing week 5 week 6 distributed computing https paiml com docs home books cloud computing for data chapter04 distributed computing cloud storage week 7 week 8 cloud storage https paiml com docs home books cloud computing for data chapter05 cloud storage serverless week 9 week 10 serverless https paiml com docs home books cloud computing for data chapter06 serverless etl mlops big data and edge computer vision week 11 week 12 week 13 managed ml systems https paiml com docs home books cloud computing for data chapter07 managed ml edge computer vision notebooks and code https github com noahgift edge computer vision huggingface https learning oreilly com videos applied hugging face 10212022videopaiml openai https learning oreilly com videos assimilate openai 08252022videopaiml general key terms https noahgift github io cloud data analysis at scale topics key terms q a question answer https noahgift github io cloud data analysis at scale topics question answer student example projects 434 analytics application development by steve depp http www stevedepp com learn school msds de 434 html 462 computer vision by steve depp http www stevedepp com learn school msds ai 462 html a practical guide to data science machine learning engineering and data engineering read cloud computing for data book https paiml com docs home books cloud computing for data free book developing on aws with csharp https d1 awsstatic com developer center developing on aws with csharp pdf next steps take coursera mlops course take the specialization https www coursera org learn cloud computing foundations duke specialization building cloud computing solutions at scale cloud computing foundations https www coursera org learn cloud computing foundations duke specialization building cloud computing solutions at scale cloud virtualization containers and apis https www coursera org learn cloud virtualization containers api duke specialization building cloud computing solutions at scale cloud data engineering https www coursera org learn cloud data engineering duke specialization building cloud computing solutions at scale cloud machine learning engineering and mlops https www coursera org learn cloud machine learning engineering mlops duke specialization
building cloud computing solutions at scale text and code license the text and code content of notebooks and documents is released under the cc by nc nd license https github com noahgift cloud data analysis at scale blob master license md
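Among the linked repos, the github actions pytest project pairs CI with simple Python tests. As a flavor of that pattern, here is a tiny example; the function under test is hypothetical and written only to show the test shape:

```python
# A tiny pytest example of the kind the "github actions pytest" repo
# exercises in CI. The function under test here is hypothetical, written
# only to show the test pattern.
def add(x: int, y: int) -> int:
    return x + y

def test_add():
    assert add(2, 3) == 5

def test_add_negative():
    assert add(-2, -3) == -5

# run locally with:  python -m pytest -q
```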
duke mids cloud data analytics machine-learning syllabus github hugging huggingface
cloud
Android-ParentalControl
android parentalcontrol wip project for my mobile app development class description this project is composed of two applications one for the parent and one for the child it logs the phone calls sms messages and location of the child the parent can see the data in real time using his own application what i ve learned so far working on the ui of an android application using intents how to request permissions from a user when launching an application using google firebase realtime database to write and retrieve json data creating a basic background service using broadcast receivers phone sms related broadcasts using content observers to observe changes in content sms to log outgoing sms messages using location services logging the location of the child using recyclerview cardview along with firebaserecycleradapter to display json data from a firebase realtime database in a rich format first time working with maps sdk marking the location of the child
front_end
azure-rtos-learn-samples
page type sample languages c asm name azure rtos microsoft learning samples description sample projects for azure rtos microsoft learning courses products azure rtos azure rtos microsoft learning samples this repo contains sample projects for the azure rtos threadx learning path https learn microsoft com training paths azure rtos threadx and the azure rtos netx duo learning path https learn microsoft com training paths azure rtos netx duo get started use github codespaces github codespaces https github com features codespaces is the preferred way to build and run these samples if you have your github account enabled for this feature otherwise you can still use it with the local dev container https code visualstudio com docs remote containers or set up the toolchain on your own follow the set up environment https learn microsoft com training modules introduction azure rtos 2 set up environment unit to get started with the samples directory layout cmake cmakelist files for building the project docs documentation supplements courses source code for learning paths netxduo netx duo samples threadx threadx samples libs submoduled threadx and netx duo source code tools required scripts for using netx duo within the container use visual studio you can also find the sample projects that can be built and run with visual studio in the release page https github com azure samples azure rtos learn samples releases tag vs an alternative for using the sample projects follow the get started section above or the readme file in the zip to learn how to use it resources azure rtos https aka ms rtos azure rtos on github https github com azure rtos pdf real time embedded multithreading using threadx 4th edition https github com azure samples azure rtos learn samples releases download book real time embedded multithreading with threadx 4th edition pdf for some common issues we found please visit the wiki https github com azure samples azure rtos learn samples wiki
os
hard-cv
hard cv a repository of ips for hardware computer vision fpga this project aims at creating an open source library of synthesizable vhdl designs for computer vision the project is divided into different operator sets bus contains bus peripherals commonly used fifo registers com defines protocols for communication outside the fpga conf defines configuration sets for camera and display image contains basic building blocks for image processing tasks image filter image filtering components gaussian sobel thresholding erode dilate image feature feature detection and processing in images harris brief descriptor correlator image classifier classification algorithms color classifier image blob blob detection algorithm image graphic drawing functions interface components to interface the fpga to sensors camera display and processor spi memory bus i2c primitive components to instantiate fpga resources memory multipliers utils all application agnostic components fifo counter registers delay these ips are free to use don t hesitate to contact me with any problems projects using this library logi boards a family of spartan6 based boards that connect to raspberry pi logipi beaglebone logibone
ai
ML-Quantamental
machine learning driven quantamental investing this repository contains the codes and data for the paper machine learning driven quantamental investing published in a chinese journal bin li xinyue shao and yueyang li 2019 machine learning driven quantamental investing china industrial economics 08 61 79 abstract quantamental investing is an emerging hot topic in financial technology and quantitative investments as a representative technique in artificial intelligence ai machine learning can significantly improve prediction tasks in economics and management this paper investigates the application of machine learning in quantamental investing based on 96 anomaly factors in chinese markets ranging from january 1997 to october 2018 we adopt lasso regression ridge regression elastic net partial least squares forecast combination support vector machines gradient boosted trees extreme gradient boosted trees ensemble neural networks deep feedforward networks recurrent neural networks and long short term memory to build stock return prediction models and construct portfolios empirical evidence shows that machine learning algorithms can efficiently identify complex patterns among anomaly factors and excess returns the investment strategy can deliver better performance than the traditional linear model and all factors long short portfolios based on the forecast of the deep feedforward network can obtain a monthly return of 2 78 we further evaluate factor importance in the prediction model and find that trading friction factors demonstrate better predictive ability in the chinese stock markets deep feedforward network driven quantamental investing models running on the selected feature set provide the best performance of 3 41 per month this study introduces machine learning techniques to the research on quantamental investing which will further facilitate joint research on artificial intelligence machine learning and economics and management and will finally boost the national strategy of artificial intelligence key words quantamental investing machine learning market anomaly factors deep learning jel classification c8 g0 o11
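The long-short portfolios in the abstract follow a standard recipe: each month, rank stocks by predicted return, buy the top decile and sell the bottom decile. A hedged NumPy/pandas sketch of that construction, on illustrative data rather than the paper's code:

```python
# Sketch of the decile long-short construction the abstract describes:
# each month, rank stocks by predicted return, go long the top decile and
# short the bottom decile. Data and column names here are illustrative.
import numpy as np
import pandas as pd

def long_short_return(pred: pd.Series, realized: pd.Series, q: float = 0.1):
    """pred/realized: per-stock predicted and realized returns for one month."""
    hi = pred.quantile(1 - q)   # top-decile cutoff
    lo = pred.quantile(q)       # bottom-decile cutoff
    long_leg = realized[pred >= hi].mean()
    short_leg = realized[pred <= lo].mean()
    return long_leg - short_leg  # equal-weighted long-short spread

rng = np.random.default_rng(1)
pred = pd.Series(rng.normal(size=500))
realized = pd.Series(0.3 * pred + rng.normal(scale=0.05, size=500))
print(f"monthly long-short return: {long_short_return(pred, realized):.4f}")
```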
machine-learning anomaly-factor quantamental-investments
ai
FreeRTOS-CMake
freertos cmake this repository covers the replacement of the make build used with freertos by cmake the goal is to use the posix linux simulator demo for freertos https www freertos org freertos simulator for linux html in order to run freertos on linux this work is based on the posix gcc demo https github com freertos freertos tree master freertos demo posix gcc of freertos prerequisites gnu make 4 1 cmake version 3 10 2 gcc 7 5 0 tested on an ubuntu 18 04 lts distribution with windows wsl project structure the project is a simple demo responsible for creating two tasks communicating with a queue and triggered by timers it will print received messages in the console main c main c contains the application code console h c console h are used to trace code by avoiding concurrency issues freertosconfig h freertosconfig h is needed by freertos it contains application specific definitions that are adjusted for particular hardware and application requirements this one comes from the posix demo https github com freertos freertos blob master freertos demo posix gcc freertosconfig h makefile makefile used for direct compilation with make cmakelists txt cmakelists txt used by cmake makefile back makefile back backup of the manual makefile makefile running example with make after cloning the repository ensure submodules and sub submodules are correctly initialized and updated the project makefile makefile is based on the freertos demo one https github com freertos freertos blob master freertos demo posix gcc makefile but has been simplified to remove freertos dependencies you can try it with make and run it with build posix demo starting echo blinky demo message received from task message received from software timer message received from task message received from task message received from software timer message received from task cmake usage in order to replace make by cmake a cmakelists txt cmakelists txt file has been written its goal is to include freertos and compile the demo code you can try it with cmake at this step a new makefile has been generated a copy of the initial one is in makefile back makefile back once again you can make the program with make and run it with build posix demo starting echo blinky demo message received from task message received from software timer message received from task message received from task message received from software timer message received from task
os
sonoff-th-homekit
sonoff th homekit an alternative firmware for the 6 itead sonoff th10 th16 this firmware supports the apple homekit framework and qr code pairing for it poc prerequisites for building ota update images are openssl https www openssl org and the esptool2 https github com raburton esptool2 on your system usage 1 install esp open sdk https github com pfalcon esp open sdk build it with make toolchain esptool libhal standalone n then edit your path and add the generated toolchain bin directory the path will be something like home path to esp open sdk xtensa lx106 elf bin 2 checkout esp open rtos https github com superhouse esp open rtos and set the sdk path environment variable pointing to it e g export sdk path home espressif esp open rtos 3 initialize and sync all submodules recursively shell git submodule update init recursive 4 install the required python modules for qr code creation shell pip install pyqrcode pypng 5 create a new qr code for homekit pairing as every device on your network needs a unique setup id shell make homekitsettings print the qr code qrcode png and stick it on the device or save it in a safe place for documentation 6 set your environment variables to match your needs set the espport environment variable pointing to the usb device your esp8266 is attached to assuming your device is at dev tty slab usbtouart shell export espport dev tty slab usbtouart set your ota update server environment variables set the server shell export ota update server 192 168 1 2 set the port shell export ota update port 8080 set the path on the server to the firmware files shell export ota update path ota sonoff th name your firmware files defaults to latest shell export ota update firmware name latest 7 to prevent any effects from previous firmware it is highly recommended to erase flash shell make erase flash or if you didn t set the espport environment variable shell make erase flash espport dev tty slab usbtouart 8 flash the firmware on the sonoff th10 th16 shell make j4 test or shell make j4 flash make monitor ota updates again the prerequisites for building ota update images are openssl https www openssl org and the esptool2 https github com raburton esptool2 on your system if you later just want to compile the ota firmware images enter shell make j4 ota images and copy the compiled linked and hashed firmware images to your update server using the path you have chosen at compilation time you ll find the files in the firmware ota directory copy all files to your webserver directory accessible via your chosen update server path to initiate an ota update of the device just long press the device button 4 secs on an already flashed device
sonoff esp8266 homekit iot internet-of-things esp-open-rtos sonoff-th
os
presentations
a collection of presentations for java web development prerequisites nodejs npm to contribute 1 fork this repository 2 clone it to your local workstation 3 run npm install 4 run npm start
front_end
Django-Practice
django practice so i started my backend development journey with django and as always the first app we create is a calculator here is how to deploy it
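Since the repository text does not show the calculator itself, here is a hypothetical minimal Django view of the kind such an app could use; it is illustrative only and not code from this repo:

```python
# Hypothetical minimal view for the kind of calculator app the readme
# mentions -- illustrative only, not code from this repository.
# calculator/views.py
from django.http import JsonResponse

def add(request):
    """GET /add/?a=2&b=3 -> {"result": 5.0}"""
    try:
        a = float(request.GET.get("a", 0))
        b = float(request.GET.get("b", 0))
    except ValueError:
        return JsonResponse({"error": "a and b must be numbers"}, status=400)
    return JsonResponse({"result": a + b})

# calculator/urls.py would map the route, e.g.:
# urlpatterns = [path("add/", views.add)]
```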
server
Mobile-Demos
mobile demos you can clone this repo familiarize yourself with the implementations and then try to add functionality to any of the projects
front_end
Metro-UI-CSS
metro 4 components library sleek intuitive and powerful front end framework for faster and easier web development build responsive mobile first projects on the web with the first front end component library in metro style explore metro 4 docs https metroui org ua metro 4 is an mit licensed open source project it s an independent project with its ongoing development made possible entirely thanks to the support of these awesome backers https metroui org ua support html become a patron https www patreon com metroui buy me a coffee https www buymeacoffee com pimenov build status https travis ci org olton metro ui css css gzip https github com olton metro ui css blob master build css metro all min css js gzip https github com olton metro ui css blob master build js metro min js icons gzip https github com olton metro ui css tree master build mif github release npm version https badge fury io js metro4 nuget website https metroui org ua license mit github issues github code size in bytes license and premium features this project is licensed under the mit license in addition a support pack is available for an annual subscription on https korzh com and for patreon patrons https www patreon com metroui the support pack includes extra time for priority support by email and other options community facebook group https www facebook com groups metrouicss join the chat on discord https discord gg nydrab3 contributing all contributions are welcome learn more about how you can contribute to this project here contributing md important before you create a pull request you must sign the cla docs please click here https metroui org ua for documentation and demo releases metro 4 releases frequently i create a release when there are significant bug fixes new apis or components the usual frequency of release of a new version is one week on sundays lts long term support of older versions of metro 4 does not currently exist if your current version of metro 4 works for you you can stay on it for as long as you d like if you want to make use of new features as they come in you should upgrade to a newer version browser compatibility the latest 2 versions of chrome firefox edge safari and opera are supported old versions metro ui css 3 x can be found in the repo metro ui css 3 https github com olton metro ui css 3 metro ui css 2 x can be found in the repo metro ui css 2 https github com olton metro ui css 2 metro ui css 0 x can be found in the repo metro ui css 095 https github com olton metro ui css 095 documentation and demo for v3 metroui org ua v3 https metroui org ua v3 documentation and demo for v2 metroui org ua v2 https metroui org ua v2 documentation and demo for v0 metroui org ua v0 https metroui org ua v0 thanks thanks to all special thanks to all those who financially supported the project credits you can read about credits here credits md 2012 2020 copyright by serhii pimenov all rights reserved created by serhii pimenov
css javascript jquery less html html5 metro metro-ui metro-ui-css metro4 bootstrap bootstrap-replacement components component-library no-dependencies no-jquery-required css-frameworks css-framework metro-style
front_end
neopop-web
div align center img src neopop land png alt neopop banner h1 neopop h1 strong neopop is cred s inbuilt library for using neopop components in your web app strong div br div align center a href https github com cred club neopop web blob main license img src https badgen net github license cred club neopop web alt license a a href https twitter com cred club img src https img shields io twitter follow cred club label twitter style flat logo twitter color 1da1f2 alt cred twitter a div div align center br a href https playground cred club b playground docs b a br br div what is neopop neopop was created with one simple goal to create the next generation of a beautiful affirmative design system neopop stays true to everything that design at cred stands for what this library features easy to use and beautifully designed react components based on the neopop design system flexible and composable components which accept custom configurations commonly used utility methods and functions fluid and highly optimized animations note currently the components in this library are optimized for mobile views we will soon release support for desktop views how to install to use the neopop library all you need to do is install the cred neopop web package and its peer dependencies sh yarn add cred neopop web react react dom styled components or npm i cred neopop web react react dom styled components how to use to start using the library you can import components from cred neopop web lib components import primitives from cred neopop web lib primitives import hooks from cred neopop web lib hooks import utils from cred neopop web lib utils for example to use a button refer to the following code snippet jsx import button from cred neopop web lib components const page return button variant primary kind elevated size big colormode dark onclick console log i m clicked primary button export default page detailed documentation and an interactive playground can be found here https playground cred club contributing pull requests are welcome we d love help improving this library feel free to browse through the open issues to look for things that need work if you have a feature request or bug please open a new issue so we can track it contributors chirag mittal github https github com mittalchirag linkedin https www linkedin com in mittalchirag tripurari shankar github https github com tripurari001 linkedin https www linkedin com in tripurari shankar 91907189 rahul jain github https github com rahuldkjain linkedin https www linkedin com in rahuldkjain utkarsh gupta github https github com utkarsh9799 linkedin https www linkedin com in utkarsh gupta 99923916a aditya sharma github https github com sharmaaditya570191 linkedin https www linkedin com in sharmaaditya570191 abhishek naidu github https github com abhisheknaiidu linkedin https www linkedin com in abhisheknaiidu license copyright 2022 dreamplug technologies private limited licensed under the apache license version 2 0 the license you may not use this file except in compliance with the license you may obtain a copy of the license at http www apache org licenses license 2 0 unless required by applicable law or agreed to in writing software distributed under the license is distributed on an as is basis without warranties or conditions of any kind either express or implied see the license for the specific language governing permissions and limitations under the license
cred neopop design system react styled-components ui-components
os
Portfolio
portfolio mark kogon s portfolio detailing his latest work on various cloud engineering projects
cloud
iotwifi
if you are interested in contributing to this project please use the https github com txn2 txwifi fork this version is archived for reference update 2018 12 01 i am archiving this project the original use case was to enable the configuration of wifi over wifi like many iot devices on the market it has worked well for me for this purpose however many of the issues people have been reporting as bugs are simply other opinions on how it should work for them and outside of the original use case unfortunately i don t have the personal resources to help with these requests if others are willing to be contributors i would be grateful until then this project is for reference only update looking for contributors maintainers notice this project is intended to aid in developing configure wifi over wifi solutions for iot projects using the raspberry pi the main use case for this project is to reproduce functionality common to devices like nest or echo where the user turns on the device connects to it and configures it for wifi i have over 800 devices running this software in production and all have had their wifi configured using it this is not a captive portal project while i have personally used it for this it requires additional networking and can be unstable i don t support this use and so your mileage may vary iotwifi is only expected to run properly on stock raspberry pis and is not tested on any other hardware configurations iot wifi raspberry pi ap client go report card https goreportcard com badge github com cjimti iotwifi https goreportcard com report github com cjimti iotwifi godoc https godoc org github com cjimti iotwifi iotwifi status svg https godoc org github com cjimti iotwifi iotwifi docker container image size https shields beevelop com docker image image size cjimti iotwifi latest svg https hub docker com r cjimti iotwifi docker container layers https shields beevelop com docker image layers cjimti iotwifi latest svg https hub docker com r cjimti iotwifi docker container pulls https img shields io docker pulls cjimti iotwifi svg https hub docker com r cjimti iotwifi waffle io columns and their card count https badge waffle io cjimti iotwifi svg columns all https waffle io cjimti iotwifi iot wifi is a very small 8mb docker container built for the raspberry pi 3 https amzn to 2jfxhca iot wifi exposes a simple json based rest api for controlling the wireless network interface this container allows the raspberry pi to accept wifi connections as an access point aka ap while at the same time connecting to an existing wifi network station mode go golang was used to develop the main application code to produce a minimal docker image with great performance the container runs alpine linux https alpinelinux org with small optimized versions of hostapd wpa supplicant and dnsmasq controlled by the container s api endpoints if you have a raspberry pi 3 and you want to provide wifi based configuration abilities all you need is docker installed and a little over 8mb of free drive space raspberry pi ap client doc assets pi jpg https amzn to 2jfxhca table of contents iot wifi raspberry pi ap client iot wifi raspberry pi ap client background background getting started getting started disable wpa supplicant on raspberry pi disable wpa supplicant on raspberry pi install docker on raspberry pi install docker on raspberry pi pull the iot wifi docker image pull the iot wifi docker image iot wifi configuration iot wifi configuration run the iot wifi docker container run the iot wifi docker container connect to the
pi over wifi connect to the pi over wifi connect the pi to a wifi network connect the pi to a wifi network check the network interface status check the network interface status conclusion conclusion tl dr if you are not interested in reading all this you can skip ahead to getting started getting started iot wifi is a raspberry pi wifi management rest service written in go and intended to run in a docker container on a raspberry pi https hub docker com r cjimti iotwifi iot wifi sets up the network interfaces and runs hostapd wpa supplicant and dnsmasq simultaneously allowing a user or another service to connect to the raspberry pi via hostapd dnsmasq and issue commands that configure and connect wpa supplicant to another ap iot wifi then exposes a small web server on the pi and offers a json based rest api to configure wifi the iot wifi container allows you to build a custom captive portal web page or even programmatically connect from another device and use the exposed api to configure the target device using wifi to configure a wifi connection is often a standard requirement for iot as raspberry pis are becoming a popular choice as an iot platform this helps solve the frequent need to manage ap and station modes background over a year ago i wrote a blog post called raspberry pi 3 wifi station ap with my notes on setting up a raspberry pi 3 to run as a wifi access point ap hotspot and a wifi client aka wifi station station simultaneously this old blog post gets a considerable amount of traffic so it seems there is quite a bit of interest in this i have come to realize that some of the steps in my old post have changed since newer versions of the raspbian noobs build were released since writing the post i have had a few personal and professional projects requiring a raspberry pi to allow wifi setup over wifi i decided to open source this simple project to help others with similar requirements as well as gain some feedback on where and how i can improve it i would welcome any contribution and credit any contributors getting started you will need a raspberry pi 3 running raspbian stretch you can use the noobs release to install the latest version of raspbian disable wpa supplicant on raspberry pi you do not want the default wpa supplicant the software that communicates with the wifi driver and connects to wifi networks running and competing with the iot wifi container bash prevent wpa supplicant from starting on boot sudo systemctl mask wpa supplicant service rename wpa supplicant on the host to ensure that it is not used sudo mv sbin wpa supplicant sbin no wpa supplicant kill any running processes named wpa supplicant sudo pkill wpa supplicant install docker on raspberry pi ssh into the pi or use the terminal application from the desktop on the pi to get a bash shell bash docker install script curl ssl https get docker com sh install docker doc assets install docker gif bash add the pi user to the docker user group sudo usermod ag docker pi usermod docker doc assets usermod gif reboot the pi and test docker bash sudo reboot after the reboot ensure docker is installed correctly by running a hello world docker container bash run the docker hello world container and remove the container when finished the rm flag docker run rm hello world docker hello world on raspberry pi doc assets docker hello world gif pull the iot wifi docker image you can optionally clone and build the entire project however to get started quickly i ll show you how to use a pre built docker image at only 16mb this little image contains
everything you need the image is based on alpine linux and contains hostapd wpa supplicant and dnsmasq along with a compiled wifi management utility written in go the source is found in this repository https github com cjimti iotwifi bash pull the iot wifi docker image docker pull cjimti iotwifi docker iot wifi image doc assets docker pull image gif iot wifi configuration you will need a configuration json file you can download a default as a template or just use it unmodified for testing you can mount the configuration file into the container or specify a location with an environment variable use the default configuration file and location for testing bash download the default configuration file wget https raw githubusercontent com cjimti iotwifi master cfg wificfg json download configuration doc assets download config gif the default configuration looks like this json dnsmasq cfg address 192 168 27 1 dhcp range 192 168 27 100 192 168 27 150 1h vendor class set device iot host apd cfg ip 192 168 27 1 ssid iot wifi cfg 3 wpa passphrase iotwifipass channel 6 wpa supplicant cfg cfg file etc wpa supplicant wpa supplicant conf you may want to change the ssid ap hotspot name and the wpa passphrase to something more appropriate to your needs however the defaults are fine for testing run the iot wifi docker container the following docker run command will create a running docker container from the cjimti iotwifi docker image we pulled in the steps above the container needs to run in privileged mode and have access to the host network the raspberry pi device to configure and manage the network interfaces on the raspberry pi we will also need to mount the configuration file we will run it in the foreground to observe the startup process if you want it to run in the background you need to remove the rm flag and pass the d flag if you want it to restart on reboot or failure you can pass the flag restart unless stopped read more on the docker run command https docs docker com engine reference run bash docker run rm privileged net host v pwd wificfg json cfg wificfg json cjimti iotwifi optionally you can also provide a wpa supplicant conf like so bash docker run rm privileged net host v pwd wificfg json cfg wificfg json v host path wpa supplicant conf container path wpa supplicant conf cjimti iotwifi where container path is the path to the wpa supplicant conf specified in wificfg json the iot wifi container outputs logs in json format while this makes them a bit more challenging to read we can feed them directly or indirectly into tools like elasticsearch or other databases for alerting or analytics you should see some initial json objects with messages like starting iot wifi json hostname raspberrypi level 30 msg starting iot wifi name iotwifi pid 0 time 2018 03 15t20 19 50 374z v 0 keeping the current terminal open you can log in to another terminal and take a look at the network interfaces on the raspberry pi bash use ifconfig to view the network interfaces ifconfig you should see a new interface called uap0 plain uap0 flags 4163 up broadcast running multicast mtu 1500 inet 192 168 27 1 netmask 255 255 255 0 broadcast 192 168 27 255 inet6 fe80 6e13 d169 b00b c946 prefixlen 64 scopeid 0x20 link ether b8 27 eb fe c8 ab txqueuelen 1000 ethernet rx packets 111 bytes 8932 8 7 kib rx errors 0 dropped 0 overruns 0 frame 0 tx packets 182 bytes 24416 23 8 kib tx errors 0 dropped 0 overruns 0 carrier 0 collisions 0 the standard wifi interface wlan0 should be available yet unconfigured since we have not yet connected to an
external wifi network access point plain wlan0 flags 4099 up broadcast multicast mtu 1500 ether b8 27 eb fe c8 ab txqueuelen 1000 ethernet rx packets 0 bytes 0 0 0 b rx errors 0 dropped 0 overruns 0 frame 0 tx packets 0 bytes 0 0 0 b tx errors 0 dropped 0 overruns 0 carrier 0 collisions 0 connect to the pi over wifi on your laptop or phone you should now see a wifi network named iot wifi cfg 3 assuming you did not change it from the default the default password for this network is iotwifipass once connected to this network you should get an ip address assigned from the range specified in the config 192 168 27 100 192 168 27 150 1h connect phone doc assets phone jpg once connected open a web browser and go to http 192 168 27 1 8080 status you can access this api endpoint on the raspberry pi device itself from localhost on the pi try the curl command curl http localhost 8080 status you should receive a json message similar to the following json status ok message status payload address b8 27 eb fe c8 ab uuid a736659a ae85 5e03 9754 dd808ea0d7f2 wpa state inactive from now on i ll demonstrate api calls to the new container with the curl command https en wikipedia org wiki curl on the device if you were developing a captive portal or configuration web page you could translate these calls into javascript and control the device wifi with ajax a python sketch of the same calls is given at the end of this readme you can use my simple static web server iot web container for hosting a captive portal or configuration web page see https github com cjimti iotweb to get a list of wifi networks the device can see issue a call to the scan endpoint bash curl http localhost 8080 scan connect the pi to a wifi network the device can connect to any network it can see after running a network scan curl http localhost 8080 scan you can choose a network and post the login credentials to iot wifi bash post wifi credentials curl w n d ssid home network psk mystrongpassword h content type application json x post localhost 8080 connect you should get a json response message after a few seconds if everything went well you will see something like the following json status ok message connection payload ssid straylight g state completed ip message you can get the status at any time with the following call to the status endpoint here is an example bash get the wifi status curl w n http localhost 8080 status sample return json json status ok message status payload address b7 26 ab fa c9 a4 bssid 50 3b cb c8 d3 cd freq 2437 group cipher ccmp id 0 ip address 192 168 86 116 key mgmt wpa2 psk mode station p2p device address fa 27 eb fe c9 ab pairwise cipher ccmp ssid straylight g uuid a736659a ae85 5e03 9754 dd808ea0d7f2 wpa state completed check the network interface status wlan0 is now a client on a wifi network in this case it received the ip address 192 168 86 116 we can check the status of wlan0 with ifconfig bash check the status of the wlan0 wireless interface ifconfig wlan0 example return plain wlan0 flags 4163 up broadcast running multicast mtu 1500 inet 192 168 86 116 netmask 255 255 255 0 broadcast 192 168 86 255 inet6 fe80 9988 beab 290e a6af prefixlen 64 scopeid 0x20 link ether b8 27 eb fe c8 ab txqueuelen 1000 ethernet rx packets 547 bytes 68641 67 0 kib rx errors 0 dropped 0 overruns 0 frame 0 tx packets 36 bytes 6025 5 8 kib tx errors 0 dropped 0 overruns 0 carrier 0 collisions 0 we can also check the connection by issuing a ping command from the device specifying the network interface to use bash ping out from the wlan0 interface ping i wlan0 8 8 8 8 hit control c to stop the ping
and get calculations plain ping 8 8 8 8 8 8 8 8 from 192 168 86 116 wlan0 56 84 bytes of data 64 bytes from 8 8 8 8 icmp seq 1 ttl 57 time 20 9 ms 64 bytes from 8 8 8 8 icmp seq 2 ttl 57 time 23 4 ms 64 bytes from 8 8 8 8 icmp seq 3 ttl 57 time 16 0 ms c 8 8 8 8 ping statistics 3 packets transmitted 3 received 0 packet loss time 2002ms rtt min avg max mdev 16 075 20 138 23 422 3 049 ms conclusion wrapping all the complexity of wifi management into a small docker container accessible over a web based rest api reduces the dependencies on the device to only require docker there are many ways to handle security using middleware or ip tables and a separate container can also manage security check out the project iot web https github com cjimti iotweb to get started with a tiny static web container suitable for building user interfaces for wifi management or captive portals submit a github issue or pull request if there are features or bug fixes you would like added to the project raspberry pi 3 wifi station ap http imti co post 145442415333 raspberry pi 3 wifi station ap raspberry pi https amzn to 2jfxhca raspbian https www raspberrypi org downloads raspbian noobs https www raspberrypi org downloads noobs hostapd https w1 fi hostapd wpa supplicant https w1 fi wpa supplicant dnsmasq http www thekelleys org uk dnsmasq doc html captive portal https en wikipedia org wiki captive portal ap https en wikipedia org wiki wireless access point station https en wikipedia org wiki station networking go https golang org iot https en wikipedia org wiki internet of things docker https www docker com alpine linux https alpinelinux org cjimti iotwifi https hub docker com r cjimti iotwifi
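to make the api above easier to script here is a minimal python sketch of the same three calls it is not part of the project and assumes the container is reachable at http localhost 8080 as in the curl examples the ssid and passphrase are placeholders and the third party requests package must be installed separately

```python
# minimal client for the iot wifi rest api described above (a sketch, not project code)
import requests

BASE = "http://localhost:8080"

# check the current wpa state, mirroring: curl http://localhost:8080/status
status = requests.get(f"{BASE}/status").json()
print(status["payload"].get("wpa_state"))

# list the networks the device can see, mirroring: curl http://localhost:8080/scan
print(requests.get(f"{BASE}/scan").json())

# join a network, mirroring the curl POST to /connect (placeholder credentials)
resp = requests.post(
    f"{BASE}/connect",
    json={"ssid": "home_network", "psk": "mystrongpassword"},
)
print(resp.json())
```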
raspberry-pi-3 raspberry-pi wifi docker iot iot-platform hostapd wpa-supplicant golang armv7 armhf docker-container docker-image dnsmasq hostap rest-api json json-api
server
rusty-machine
rusty machine this library is no longer actively maintained join the chat at https gitter im athemathmo rusty machine https badges gitter im athemathmo rusty machine svg https gitter im athemathmo rusty machine utm source badge utm medium badge utm campaign pr badge utm content badge build status https travis ci org athemathmo rusty machine svg branch master https travis ci org athemathmo rusty machine the crate is currently on version 0 5 4 https crates io crates rusty machine read the api documentation https athemathmo github io rusty machine to learn more and here is a document detailing development efforts including a projected timeline for immediate features please feel free to give feedback and let me know if there are any features you believe should take precedence development development md summary rusty machine is a general purpose machine learning library implemented entirely in rust it aims to combine speed and ease of use without requiring a huge number of external dependencies this project began as a way for me to learn rust and brush up on some less familiar machine learning algorithms and techniques now the project aims to provide a complete easy to use machine learning library for rust this library is still very much in the early stages of development although there are a good number of algorithms many other things are missing rusty machine is probably not the best choice for any serious projects but hopefully that can change in the near future contributing this project is currently looking for contributors of all capacities i have now created a dedicated page for contributing contributing md if you re interested please take a look implementation this project is implemented using rust https www rust lang org currently there are no other dependencies though we are planning on introducing optional blas lapack dependencies soon current progress rusty machine uses rulinalg https github com athemathmo rulinalg for its linear algebra back end this is fairly complete but there is still lots of room for optimization and we should provide blas lapack support machine learning linear regression logistic regression generalized linear models k means clustering neural networks gaussian process regression support vector machines gaussian mixture models naive bayes classifiers dbscan k nearest neighbor classifiers principal component analysis there is also a basic stats module behind a feature flag usage the library usage is described well in the api documentation https athemathmo github io rusty machine including example code i will provide a brief overview of the library in its current state and intended usage installation the library is most easily used with cargo http doc crates io guide html simply include the following in your cargo toml file toml dependencies rusty machine 0 5 4 and then import the library using rust extern crate rusty machine as rm the library consists of two core components the linear algebra module and the learning module linalg the linear algebra module contains reexports from the rulinalg https github com athemathmo rulinalg crate this is to provide easy access to components which are used frequently within rusty machine more detailed coverage can be found in the api documentation https athemathmo github io rusty machine learning the learning module contains machine learning models the machine learning implementations are designed with simplicity and customization in mind this means you can control the optimization algorithms but still retain the ease of using default
values this is an area i am actively trying to improve on the models all provide predict and train methods enforced by the supmodel and unsupmodel traits there are some examples within this repository that can help you familiarize yourself with the library
machine-learning rust
ai
awesome-robotics-datasets
toc toc dataset collections robotics radish the robotics data set repository http radish sourceforge net andrew howard and nicholas roy not working repository of robotics and computer vision datasets https www mrpt org robotics datasets mrpt memo it includes malaga datasets and some of the classic datasets published in radish http radish sourceforge net ijrr data papers http journals sagepub com topic collections ijr 3 datapapers ijr ijrr awesome slam datasets https github com youngguncho awesome slam datasets younggun cho 1 computer vision cvonline image databases http homepages inf ed ac uk rbf cvonline imagedbase htm cvonline computer vision datasets on the web http www cvpapers com datasets html cvpapers 1 yacvid yet another computer vision index to datasets http riemenschneider hayko at vision dataset hayko riemenschneider 1 computer vision online datasets https computervisiononline com datasets computer vision online others machine learning repository http archive ics uci edu ml uci kaggle datasets https www kaggle com datasets kaggle ieee dataport https ieee dataport org ieee place specific datasets driving datasets kitti vision benchmark suite http www cvlibs net datasets kitti and kitti 360 http www cvlibs net datasets kitti 360 andreas geiger et al 1 semantickitti http semantic kitti org jens behley et al waymo open dataset https waymo com open waymo cityscapes dataset https www cityscapes dataset com apolloscape dataset http apolloscape auto berkeley deepdrive dataset https bdd data berkeley edu bdd100k bair at uc berkeley nuscenes dataset https www nuscenes org aptiv d 2 city dataset https outreach didichuxing com d2city d2city didi ford campus vision and lidar data set http robots engin umich edu softwaredata ford perl at univ of michigan mit darpa urban challenge dataset http grandchallenge mit edu wiki index php title publicdata mit kaist multi spectral recognition dataset in day and night https sites google com view multispectral rcv lab at kaist kaist complex urban dataset http irap kaist ac kr dataset irap lab at kaist new college dataset http www robots ox ac uk newcollegedata index php mrg at oxford univ chinese driving from a bike view http www sujingwang name cdbv html cdbv cas culane dataset https xingangpan github io projects culane html cuhk roma road markings image database http perso lcpc fr tarel jean philippe bdd jean philippe tarel et al flying datasets the zurich urban micro aerial vehicle dataset http rpg ifi uzh ch zurichmavdataset html rpg at ethz the uzh fpv drone racing dataset http rpg ifi uzh ch uzh fpv html rpg at ethz multidrone public dataset https multidrone eu multidrone public dataset multidrone project the blackbird dataset https github com mit fast blackbird dataset agiledrones group at mit underwater datasets marine robotics datasets http marine acfr usyd edu au datasets acfr outdoor datasets the rawseeds project http www rawseeds org memo it includes the bovisa dataset for outdoor and the bicocca dataset for indoor planetary mapping and navigation datasets http asrl utias utoronto ca datasets asrl at univ of toronto indoor datasets robotics 2d laser datasets http www ipb uni bonn de datasets cyrill stachniss memo it includes some of the classic datasets published in radish http radish sourceforge net long term mobile robot operations http robotics researchdata lncn eu lincoln univ mit stata center data set http projects csail mit edu stata marine robotics group at mit kth and cold database https www pronobis pro data andrzej pronobis shopping mall
datasets http www irc atr jp sets temposan dataset irc at atr rgb d dataset 7 scenes https www microsoft com en us research project rgb d dataset 7 scenes microsoft topic specific datasets for robotics localization mapping and slam slam benchmarking http ais informatik uni freiburg de slamevaluation ais at univ of freiburg robotic 3d scan repository http kos informatik uni osnabrueck de 3dscans univ of wurzburg and univ of osnabruck 3d pose graph optimization https lucacarlone mit edu datasets luca carlone landmark based localization range only data for localization http www frc ri cmu edu projects emergencyresponse rangedata cmu ri roh s angulation dataset https github com sunglok triangulationtoolbox tree master dataset roh hyunchul roh wireless sensor network dataset http www cs virginia edu whitehouse research localization kamin whitehouse path planning and navigation pathfinding benchmarks http www movingai com benchmarks moving ai lab at univ of denver task and motion planner benchmarking http www neil dantam name 2018 rss tmp workshop benchmarks rss 2018 workshop topic specific datasets for computer vision features affine covariant features datasets https www robots ox ac uk vgg data affine vgg at oxford repeatability benchmark tutorial https www vlfeat org benchmarks overview repeatability html vlfeat a list of feature performance evaluation datasets https github com openmvg features repeatability maintained by openmvg saliency and foreground saliency mit saliency benchmark http saliency mit edu mit salient object detection a benchmark http mmcheng net salobjbenchmark ming ming cheng foreground change detection background subtraction changedetection net http www changedetection net a k a cdnet motion and pose estimation adelaidermf robust model fitting data set https cs adelaide edu au hwong doku php id data hoi sim wong structure from motion and 3d reconstruction objects ivl synthesfm v2 https board unimib it datasets fnxy8z8894 1 davide marelli et al fuji sfm dataset https zenodo org record 3712808 ysfts44zaul jordi gene mola et al large geometric models archive https www cc gatech edu projects large models georgia tech the stanford 3d scanning repository http graphics stanford edu data 3dscanrep stanford univ places photo tourism data http phototour cs washington edu uw and microsoft object tracking visual object tracking challenge http www votchallenge net a k a vot 1 visual tracker benchmark http cvlab hanyang ac kr tracker benchmark a k a otb object place and event recognition pedestrians eurocity persons dataset https eurocity dataset tudelft nl a k a ecp daimler pedestrian benchmark data sets http www gavrila net datasets daimler pedestrian benchmark d daimler pedestrian benchmark d html crowdhuman http www crowdhuman org objects rgb d object dataset http rgbd dataset cs washington edu uw sweet pepper and peduncle 3d datasets http enddl22 net wordpress datasets sweet pepper and peduncle 3d datasets inkyu sa places loop closure detection http cogrob ensta paristech fr loopclosure html david filliat et al traffic and surveillance best benchmark and evaluation of surveillance task http best sjtu edu cn data list datasets sjtu virat video dataset http www viratdata org research groups tum cvg datasets https vision in tum de data datasets tags visual inertia odometry visual slam 3d reconstruction oxford vgg datasets http www robots ox ac uk vgg data tags visual features visual recognition 3d reconstruction qut cyphy datasets https wiki qut edu au display cyphy datasets tags 
visual slam lidar slam univ of bonn univ stachniss lab datasets https www ipb uni bonn de data tags slam epfl cvlab datasets https cvlab epfl ch data tags 3d reconstruction local keypoint optical flow rgb d pedestrian the middlebury computer vision pages http vision middlebury edu tags stereo matching 3d reconstruction mrf optical flow color caltech cvg datasets http www vision caltech edu archive html tags objects pedestrian car face 3d reconstruction on turntables
dataset robotics computer-vision
ai
SpeedyBite
speedybite a project using aws cloud functions for the service engineering lecture in the masters degree of software engineering in fctuc this small project consists of two react websites frontoffice and backoffice these two will interact with our backend made with django and provide an interface to the clients and to the restaurant cooks technologies used react django boto3 aws cloud functions aws rekognition overview of the service architecture image https user images githubusercontent com 34323311 166493154 f50e7ed1 b22c 4fc0 b01e 09f15cb2bcfb png
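the readme does not show how the django backend calls rekognition so the following is an illustration only a hedged boto3 sketch of the kind of label detection call such a backend might make the region image file name and thresholds are placeholders

```python
# illustrative aws rekognition call via boto3 (not taken from this repo)
import boto3

# region is a placeholder; use the region the project deploys to
client = boto3.client("rekognition", region_name="us-east-1")

# meal.jpg is a hypothetical image uploaded by a client
with open("meal.jpg", "rb") as image_file:
    response = client.detect_labels(
        Image={"Bytes": image_file.read()},
        MaxLabels=10,
        MinConfidence=80,
    )

# print each detected label with its confidence score
for label in response["Labels"]:
    print(label["Name"], round(label["Confidence"], 1))
```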
cloud
NLPH
the open natural language processing in hebrew initiative the open natural language processing in hebrew nlph initiative is a joint effort by members of datahack http www datahack il com and the public knowledge workshop http www hasadna org il en to promote open tools and resources for natural language processing in hebrew https github com nlph nlph resources vision our vision is to bring natural language processing capabilities in hebrew to a level on par with international industry standards keeping up with state of the art techniques by providing open source implementations of new algorithms and tools and making these capabilities publicly available for both public and commercial use goals 1 creating maintaining adapting and spreading resources that enable high quality production ready open licensed natural language processing in hebrew 2 enabling fostering and catalyzing cooperation between stakeholders in academia and the private and public sectors in order to promote better open source hebrew nlp solutions and share existing knowledge and tools who s taking part the public knowledge workshop http www hasadna org il en datahack http www datahack il com dr reut tsarfaty s natural language processing lab at the open university of israel http www openu ac il en personalsites reuttsarfaty aspx dr yoav goldberg s lab at the bar ilan university http u cs biu ac il yogo your company organization lab or faculty we hope active projects hebrew nlp resources list maintaining a list of relevant resources https github com nlph nlph resources common voice hebrew a hebrew version of the mozilla common voice apps what s our current focus forming a group of volunteers to start work on the core projects during the developer meetings of the public knowledge workshop and in other frameworks including events like hackathons and as part of educational and research projects encouraging the open licensing of high quality tagged and labelled datasets from various domains social media articles research papers etc and for various tasks part of speech tagging text classification sentiment analysis named entity recognition etc and helping author these datasets where they are missing adapting and integrating existing hebrew nlp python tools with existing popular frameworks nltk http www nltk org plugin spacy https spacy io models creating those tools when they are missing focusing on tokenization specifically stemming and lemmatization a word embeddings model for hebrew part of speech tagger how can i help help expand our list of resources for nlp in hebrew https github com nlph nlph resources join our mailing list https mailchi mp 77178ddb4727 the nlph mailing list for updates and for opportunities to contribute need something more specific email us at contact nlph org il mailto contact nlph org il join the discussion in our facebook group https www facebook com groups 157877988136954 if you are associated with an organization that already has good working solutions for some of the problems we are interested in and you d like to consider sharing those solutions or a subset thereof under a suitable open license we d love to hear from you documents we are maintaining a list of resources for nlp in hebrew here https github com nlph nlph resources another great list of nlp tools for hebrew can be found here https github com iddoberger awesome hebrew nlp meeting summaries nlph development meeting 1 21 12 17 https github com nlph nlph blob master meeting summaries dev meeting 1 2017 12 25 rst
ai
CloudFormation-only-IaC
deploy a high availability web app aws cloudformation nd9991 c2 infrastructure as code deploy a high availability web app using cloudformation project cloud devops engineering course 2 this directory contains the following files final project starter yml write cloudformation code using this yaml template for building the cloud infrastructure based on the specification and or requirements server parameters json a json file that parameterizes the yaml template making the code more generic for example the json file contains a parameterkey of environmentname and a parametervalue of udacityproject in the yaml code environmentname would be substituted with udacityproject accordingly bash scripts create stack sh delete stack sh update stack sh these are reusable helper scripts that can be used to create delete and update cloudformation stacks a python equivalent is sketched at the end of this readme infrastructure diagram sample infrastructure diagram aws infrastructure diagram jpg load balancer dns the web application can be accessed via a load balancer url http udagr webap gq3z3hpiuiwt 818124837 us east 2 elb amazonaws com domain name
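the helper scripts above wrap the aws cli for reference here is a hedged boto3 equivalent of the create stack script the stack name region and exact file names are placeholders based on the descriptions in this readme

```python
# hedged python equivalent of the create-stack helper script (illustration only)
import json
import boto3

# region and stack name are placeholders
client = boto3.client("cloudformation", region_name="us-east-2")

# file names follow the descriptions above; adjust to the actual names in the repo
with open("final_project_starter.yml") as template_file:
    template_body = template_file.read()

# the parameters file holds a list of {"ParameterKey": ..., "ParameterValue": ...} pairs
with open("server_parameters.json") as params_file:
    parameters = json.load(params_file)

client.create_stack(
    StackName="udacity-webapp",
    TemplateBody=template_body,
    Parameters=parameters,
    Capabilities=["CAPABILITY_NAMED_IAM"],  # needed if the template creates named IAM resources
)
```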
bash-scripting cloudformation json yaml
cloud
CAiRE-COVID
caire covid a machine learning based system that uses state of the art natural language processing nlp question answering qa techniques combined with summarization for mining the available scientific literature img src img tensorflow png width 12 img src img pytorch logo dark png width 12 license mit https img shields io badge license mit yellow svg https opensource org licenses mit img align right src img hkust jpg width 12 if you use any source code or datasets included in this toolkit in your work please cite the following paper the bibtex is listed below pre inproceedings su2020caire title caire covid a question answering and query focused multi document summarization system for covid 19 scholarly information management author su dan and xu yan and yu tiezheng and siddique farhad bin and barezi elham and fung pascale booktitle proceedings of the 1st workshop on nlp for covid 19 part 2 at emnlp 2020 year 2020 pre abstract we present caire covid a real time question answering qa and multi document summarization system which won one of the 10 tasks in the kaggle covid 19 open research dataset challenge judged by medical experts our system aims to tackle the recent challenge of mining the numerous scientific articles being published on covid 19 by answering high priority questions from the community and summarizing salient question related information it combines information extraction with state of the art qa and query focused multi document summarization techniques selecting and highlighting evidence snippets from existing literature given a query we also propose query focused abstractive and extractive multi document summarization methods to provide more relevant information related to the question we further conduct quantitative experiments that show consistent improvements on various metrics for each module we have launched our website caire covid for broader use by the medical community and have open sourced the code for our system to bootstrap further study by other researchers system online the caire covid system has already been launched please access the system at http caire ust hk covid http caire ust hk covid kaggle cord 19 task winner we are honored that our submission won as the best response to the task what has been published about information sharing and inter sectoral collaboration https www kaggle com sudansudan caire cord task10 install 1 you can install the requirements by pip install r requirements txt 2 in addition you also need to install pytorch https pytorch org system modules usage if you are interested in trying out the system modules yourself you can use them as follows document retriever 1 query paraphrasing for this part you can implement your own methods or skip this step if your queries are relatively short and simple or you are not pursuing sota performance 2 search engine 2 1 install python dependencies and a pre built index following the lucene anserini information retrieval setup as described in https github com castorini anserini blob master docs experiments covid md https github com castorini anserini blob master docs experiments covid md set up java sdk 11 first curl o https download java net java ga jdk11 9 gpl openjdk 11 0 2 linux x64 bin tar gz mv openjdk 11 0 2 linux x64 bin tar gz usr lib jvm cd usr lib jvm tar zxvf openjdk 11 0 2 linux x64 bin tar gz update alternatives install usr bin java java usr lib jvm jdk 11 0 2 bin java 1 update alternatives set java usr lib jvm jdk 11 0 2 bin
java python import os os environ java home usr lib jvm jdk 11 0 2 2 2 get the pyserini library which is anserini wrapped with python pip install pyserini 0 8 1 0 we can build the lucene index of the covid 19 dataset from scratch or get one of the pre built indexes we use the paragraph indexing which indexes each paragraph of an article we have already uploaded the index as a dataset it can be downloaded from this link https hkustconnect my sharepoint com u g personal dsu connect ust hk exggmqssoijejai8bygmmhwbhbewm5v38 a41qw7tbbn8q python from pyserini search import pysearch covid index the directory name of the index you downloaded from the above link the indexing is done based on each paragraph merged with the title and abstract given an article with id doc id the index will be as follows doc id title abstract doc id 00001 title abstract 1st paragraph docid 00002 title abstract 2nd paragraph docid 00003 title abstract 3rd paragraph 2 3 try the example python project retrieval py a hedged retrieval sketch is also given at the end of this readme relevant snippet selection you can use our package by installing it with pip or use the source code pip install cairecovid question answering system in this system we build qa modules with an ensemble of two qa models the biobert https github com dmis lab bioasq biobert model which is fine tuned on squad and the mrqa model which is our submission to mrqa emnlp 2019 the mrqa model and the exported biobert model that are utilized in this project can be downloaded from this link https drive google com drive folders 1yjzyn kcz8ulobqauddftbgpaz6usddj usp sharing if you want to use our mrqa model in your work please cite the following paper the bibtex is listed below pre inproceedings su2019generalizing title generalizing question answering system with pre trained language model fine tuning author su dan and xu yan and winata genta indra and xu peng and kim hyeondey and liu zihan and fung pascale booktitle proceedings of the 2nd workshop on machine reading for question answering pages 203 211 year 2019 pre we provide an example script but you need to change the paths to the qa models in project qa py note that the final output is already re ranked based on the re ranking score python project qa py highlighting keyword highlighting is mainly implemented by term matching the code can be found in src covidqa highlights py summarization you can use our package by installing it with pip or use the source code pip install covidsumm we provide example scripts for both abstractive and extractive summarization python project abstractive summarization py python project extractive summarization py
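as a compact illustration of step 2 above here is a hedged retrieval sketch against the pre built paragraph index with pyserini 0 8 1 0 the index path and query are placeholders and hit attributes can differ between pyserini versions so treat this as a sketch rather than project code

```python
# sketch of querying the pre-built covid-19 paragraph index (assumes pyserini 0.8.1.0)
import os

# point the jvm at the java 11 install before importing pyserini, as described above
os.environ["JAVA_HOME"] = "/usr/lib/jvm/jdk-11.0.2"

from pyserini.search import pysearch

# placeholder: the directory of the index downloaded from the link above
COVID_INDEX = "lucene-index-covid-paragraph"

searcher = pysearch.SimpleSearcher(COVID_INDEX)
hits = searcher.search("What do we know about COVID-19 risk factors?", k=10)

# each hit corresponds to one title + abstract + paragraph entry in the index
for hit in hits:
    print(hit.docid, round(hit.score, 3))
```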
question-answering qa summarization search-engine nlp kaggle
ai
ot-rtos
build status ot rtos travis https travis ci org openthread ot rtos svg branch main https travis ci org openthread ot rtos openthread rtos the openthread rtos project provides an integration of 1 openthread https github com openthread openthread an open source implementation of the thread networking protocol 2 lwip https git savannah nongnu org git lwip lwip contrib git a small independent implementation of the tcp ip protocol suite 3 freertos https www freertos org a real time operating system for microcontrollers openthread rtos includes a number of application layer demonstrations including mqtt http mqtt org a machine to machine m2m internet of things connectivity protocol http https en wikipedia org wiki hypertext transfer protocol the underlying protocol used by the world wide web tcp https en wikipedia org wiki transmission control protocol one of the main transport protocols in the internet protocol suite getting started linux simulation sh git submodule update init mkdir build cd build cmake dplatform name linux make j12 this will build the cli test application in build ot cli linux nordic nrf52840 sh git submodule update init mkdir build cd build cmake dcmake toolchain file cmake arm none eabi cmake dplatform name nrf52 make j12 this will build the cli test application in build ot cli nrf52840 hex you can flash the binary with nrfjprog download https www nordicsemi com software and tools development tools nrf5 command line tools and connect to the nrf52840 dk serial port this will also build the demo application in build ot demo 101 see the demo 101 readme examples apps demo 101 readme md for a description of the demo application contributing we would love for you to contribute to openthread rtos and help make it even better than it is today see our contributing guidelines https github com openthread ot rtos blob main contributing md for more information contributors are required to abide by our code of conduct https github com openthread ot rtos blob main code of conduct md and coding conventions and style guide https github com openthread ot rtos blob main style guide md we follow the philosophy of scripts to rule them all https github com github scripts to rule them all license openthread rtos is released under the bsd 3 clause license https github com openthread ot rtos blob main license see the license https github com openthread ot rtos blob main license file for more information please only use the openthread name and marks when accurately referencing this software distribution do not use the marks in a way that suggests you are endorsed by or otherwise affiliated with nest google or the thread group need help openthread support is available on github openthread rtos bugs and feature requests submit to the openthread ot rtos issue tracker https github com openthread ot rtos issues openthread bugs and feature requests submit to the openthread issue tracker https github com openthread openthread issues community discussion ask questions share ideas and engage with other community members https github com openthread openthread discussions openthread to learn more about openthread see the openthread repository https github com openthread openthread
openthread iot google nest wireless mesh-networks ieee-802154 ipv6 internet-of-things embedded freertos lwip
os
m2-devtools
magento 2 devtools circleci https circleci com gh magento m2 devtools svg style svg https circleci com gh magento m2 devtools an extension for google chrome and likely mozilla firefox https developer mozilla org en us docs mozilla add ons webextensions that exposes helpful debugging utilities for magento 2 front ends early release this is a very new project with little to no documentation published to solicit feedback from early adopters the extension is currently only available through manual installation of the development build and will be published to the chrome web store at a future time documentation docs readme md usage whenever you navigate to a page running magento 2 a new tab should appear in devtools from this extension p align center img src screenshot png p in progress features requirejs optimizer https requirejs org docs optimization html configuration generator including magento module for quick install requirejs module registry inspector possible future features uicomponents explorer inspector think react angular devtools m2 front end best practices checks running development build google chrome prerequisites node js 8 x npm 6 x setup 1 clone the repository 2 run npm install 3 run npm start 4 navigate to chrome extensions 5 enable developer mode 6 click load unpacked 7 select the extension folder in the root of this repository notes to run a single build use npm run build instead of npm start if you have chrome devtools open when you make a change in src you ll need to close and re open devtools to see the changes if you need to debug the devtools page react app in src open the magento 2 tab in devtools then right click inspect element this will open a new instance of the devtools pointed at the react application
front_end
cde
cde cloud data engineering
cloud
R2D2
training center platform client this is the front end repository of the training center platform you can find the back end here https github com training center hades we keep a document with the functional requirements requisitos plataforma md that this platform must meet this way we always have a clear direction regarding the features that still need to be implemented development installing dependencies we recommend using yarn https yarnpkg com to manage the project dependencies but you can also use npm if you prefer sh yarn or npm i running the project sh yarn start or npm start running the tests sh yarn test or npm test to enable watch mode run the command with the following parameter sh yarn test watch or npm test watch storybook we use storybook https storybook js org to help us validate both the visual and the functional aspects of the react components while we develop them you can check it locally by running the following command sh yarn storybook or npm run storybook and opening the url localhost 6006 http localhost 6006 in your browser contributing you can help us in several ways take a look at the open issues https github com training center r2d2 issues and if you want to work on one of them leave a comment found an improvement you would like to make open a new issue or send a pull request with your proposal when opening a pull request first fork the project to work on your own version and create a branch with a descriptive name when committing keep the commits descriptive and concise as well this way when we review the change history we can track the points where each feature was introduced and catch bugs faster license this project is licensed under the mit license which means you can use it however you prefer including making your own modifications in your own versions
react reactjs javascript mentoria mentorship
front_end
foml
foundations of machine learning editing rebuilding and deploying this page building locally quickstart be sure to have node js https iojs org 7 x installed run npm install in the project root this will install some build tools we use run npm run build in place to do the templating and stylesheet compilation in place you can then view the site by simply opening index html run build sh to do a local build into the out directory you can then preview the site at out index html this should usually be the same as just index html but it is good to check before committing since this local deploy process is slightly more complicated than the in place build process how to edit content the file index hbs is usually what you should edit basically as though you were editing an html file the final html is generated with some javascript processing that pulls in data from the yaml files in the data directory basically the information in the lectures and assignments tables deployment run the script deploy sh from the root directory of the project to build and deploy the page to github the script does the following 1 pulls down the gh pages branch into a folder called out in the project root directory github serves webpages from gh pages branches then it runs npm run build which compiles the page and puts the output into out then the revised out folder is committed and pushed back to the gh pages branch ready to be served technologies used stylus https learnboost github io stylus is used for styling handlebars http handlebarsjs com is used for templating index hbs is minimally templated mostly delegating to the partials in templates those pull their data from data the logic that ties them all together is in build templater js the site is intended to be responsive which we accomplish with per device stylesheets and media queries in the html things to keep in mind while editing you should be using an editorconfig http editorconfig org plugin for your text editor to enforce a few basic stylistic things we are trying to maintain a reasonable html document outline so don t use section as if it were div to preview the document outline use the html 5 outliner tool https gsnedders html5 org outliner
ai
Espers
espers esp 2016 2023 integration staging tree project website https espers io navigation of these documents general readme you are here latest releases https github com cryptocoderz espers releases features specifications doc readme md installation guide install md asset attributions doc assets attribution md license espers esp is released under the terms of the mit license see copying copying for more information or see https opensource org licenses mit assistance and contact we strive to be an open and friendly community that is eager to help each other through the journey of the espers esp project development if at any time someone needs assistance or just a community member to talk to please feel free to join our discord at https discord gg cn3afps development process the master branch is regularly built and tested but is not guaranteed to be completely stable tags https github com cryptocoderz espers tags are created regularly to indicate new official stable release versions of espers esp the contribution workflow is described in contributing md contributing md the developer discord should be used to discuss complicated or controversial changes before working on a patch set developer discord can be found at https discord gg cn3afps testing testing and code review is the bottleneck for development we get more pull requests than we can review and test on short notice please be patient and help out by testing other people s pull requests and remember this is a security critical project where any mistake might cost people lots of money automated testing developers are strongly encouraged to write unit tests doc unit tests md for new code and to submit new unit tests for old code unit tests can be compiled and run assuming they weren t disabled in configure with make check there are also regression and integration tests qa of the rpc interface written in python that are run automatically on the build server manual quality assurance qa testing changes should be tested by somebody other than the developer who wrote the code this is especially important for large or high risk changes it is useful to add a test plan to the pull request description if testing the changes is not straightforward
blockchain
core
clarity logo png clarity core build https github com vmware clarity core workflows build badge svg npm version core https img shields io npm v cds core next label 40cds 2fcore style flat square https www npmjs com package cds core npm version cds react https img shields io npm v cds angular next label 40cds 2freact style flat square https www npmjs com package cds react npm version cds angular https img shields io npm v cds angular next label 40cds 2fangular style flat square https www npmjs com package cds angular npm version clarity city https img shields io npm v cds city latest label 40cds 2fcity style flat square https www npmjs com package cds city clarity is an open source design system that brings together ux guidelines design resources and coding implementations with web components this repository includes everything you need to build customize test and deploy clarity for complete documentation visit the clarity website https clarity design getting started clarity is published as five npm packages npm version core https img shields io npm v cds core next label 40cds 2fcore style flat square https www npmjs com package cds core contains the web components that work in any javascript framework npm version cds angular https img shields io npm v cds angular next label 40cds 2fangular style flat square https www npmjs com package cds angular contains shims for core usage in an angular environment npm version cds react https img shields io npm v cds angular next label 40cds 2freact style flat square https www npmjs com package cds react contains shims for core usage in a react environment npm version clarity city https img shields io npm v cds city latest label 40cds 2fcity style flat square https www npmjs com package cds city our open source sans serif typeface installing clarity visit our documentation at https clarity design get started documentation api documentation https clarity design storybook core usage guidelines https clarity design contributing the clarity project team welcomes contributions from the community for more detailed information see our contribution guidance docs contributing md licenses the clarity design system is licensed under the mit license license feedback if you find a bug or want to request a new feature please open a github issue https github com vmware clarity core issues include a link to the reproduction scenario you created by forking one of the clarity stackblitz templates for the version you are using at clarity stackblitz templates https stackblitz com clr team support for our support policies please visit https clarity design get started support for questions ideas or just reaching out to the team feel free to open a discussion in our github discussion section https github com vmware clarity core discussions
design-system ui-components clarity a11y web-components react angular vue javascript
os
dac_sdc_2020
dac 2020 design contest note these files are still under development they should be stable by the end of january for full contest details please see the 2020 dac system design contest https dac sdc 2020 groups et byu net doku php page for general questions regarding this contest please use the piazza page piazza com dac 2018 winter2020 dacsdc2020 home setup pynq on your ultra96v2 board download the ultra96v2 board image from http www pynq io board html follow the instructions to image the sd card at https pynq readthedocs io en latest getting started pynq image html follow the instructions to set up and connect to the board at https ultra96 pynq readthedocs io en latest getting started html usage to get started users have to run the following command on the ultra96 board shell cd home xilinx jupyter notebooks sudo git clone https github com jgoeders dac sdc 2020 git remember the user name and password are both xilinx for super user after the above step is completed successfully you will see a folder dac sdc 2020 under your jupyter notebook dashboard open the sample team dac sdc ipynb notebook for directions on where to begin folder structure 1 sample team this folder contains files for a sample team this includes a teamname bit and teamname tcl file that defines the hardware and an ipynb jupyter notebook and a hw folder that is used to create a vivado project 2 images all the test images are stored in this folder 3 result this folder contains the output xml produced when execution is complete including the runtime energy usage and predicted location of each object in each image
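As an illustration of the result folder's contents, the sketch below shows one way a team notebook could record runtime, energy usage and per-image detections into an output XML; the tag and attribute names here are assumptions for illustration, not the contest's official schema.

```python
# illustrative only, not the contest's official schema: one way a notebook
# could record runtime, energy and per-image detections in the result xml
import xml.etree.ElementTree as ET

results = ET.Element("results", {"team": "sample_team"})   # hypothetical tags
ET.SubElement(results, "runtime").text = "0.85"            # seconds per image
ET.SubElement(results, "energy").text = "1.20"             # joules per image

image = ET.SubElement(results, "image", {"name": "0001.jpg"})
ET.SubElement(image, "bbox", {"xmin": "120", "ymin": "64",
                              "xmax": "210", "ymax": "150"})

ET.ElementTree(results).write("result/sample_team.xml")
```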
os
SQL-Employee-Database
employee database a mystery in two parts sql png sql png background it is a beautiful spring day and it is two weeks since you have been hired as a new data engineer at pewlett hackard your first major task is a research project on employees of the corporation from the 1980s and 1990s all that remain of the database of employees from that period are six csv files in this assignment you will design the tables to hold data in the csvs import the csvs into a sql database and answer questions about the data in other words you will perform 1 data modeling 2 data engineering 3 data analysis instructions data modeling inspect the csvs and sketch out an erd of the tables feel free to use a tool like http www quickdatabasediagrams com http www quickdatabasediagrams com data engineering use the information you have to create a table schema for each of the six csv files remember to specify data types primary keys foreign keys and other constraints import each csv file into the corresponding sql table data analysis once you have a complete database do the following 1 list the following details of each employee employee number last name first name gender and salary 2 list employees who were hired in 1986 3 list the manager of each department with the following information department number department name the manager s employee number last name first name and start and end employment dates 4 list the department of each employee with the following information employee number last name first name and department name 5 list all employees whose first name is hercules and last names begin with b 6 list all employees in the sales department including their employee number last name first name and department name 7 list all employees in the sales and development departments including their employee number last name first name and department name 8 in descending order list the frequency count of employee last names i e how many employees share each last name bonus optional as you examine the data you are overcome with a creeping suspicion that the dataset is fake you surmise that your boss handed you spurious data in order to test the data engineering skills of a new employee to confirm your hunch you decide to take the following steps to generate a visualization of the data with which you will confront your boss 1 import the sql database into pandas yes you could read the csvs directly in pandas but you are after all trying to prove your technical mettle this step may require some research feel free to use the code below to get started be sure to make any necessary modifications for your username password host port and database name sql from sqlalchemy import create engine engine create engine postgresql localhost 5432 your db name connection engine connect consult sqlalchemy documentation https docs sqlalchemy org en latest core engines html postgresql for more information if using a password do not upload your password to your github repository see https www youtube com watch v 2uatpmnvh0i https www youtube com watch v 2uatpmnvh0i and https martin thoma com configuration files in python https martin thoma com configuration files in python for more information 2 create a bar chart of average salary by title 3 you may also include a technical report in markdown format in which you outline the data engineering steps taken in the homework assignment epilogue evidence in hand you march into your boss s office and present the visualization with a sly grin your boss thanks you for your work on your way out of the 
office you hear the words search your id number you look down at your badge to see that your employee id number is 499942 submission create an image file of your erd create a sql file of your table schemata create a sql file of your queries optional create a jupyter notebook of the bonus analysis create and upload a repository with the above files to github and post a link on bootcamp spot
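For the bonus step above, here is a minimal sketch of importing the finished SQL database into pandas and charting average salary by title; the table and column names (employees, salaries, titles, emp_title_id) are assumptions, so match them to the schema you designed.

```python
# a minimal sketch of the bonus: load the sql database into pandas and chart
# average salary by title; table and column names here are assumptions --
# match them to your own schema
import pandas as pd
import matplotlib.pyplot as plt
from sqlalchemy import create_engine

engine = create_engine("postgresql://localhost:5432/<your_db_name>")

query = """
    SELECT t.title, AVG(s.salary) AS avg_salary
    FROM employees e
    JOIN salaries s ON s.emp_no = e.emp_no
    JOIN titles t ON t.title_id = e.emp_title_id
    GROUP BY t.title;
"""
df = pd.read_sql(query, engine)

df.plot.bar(x="title", y="avg_salary", legend=False)
plt.ylabel("average salary")
plt.tight_layout()
plt.savefig("avg_salary_by_title.png")
```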
server
Maxim
maxim ok so this was a quickly hacked together thing we did for a mooc in 2013 i really don t think you want to use this anymore but i ll leave it here for posterity cross platform javascript java audio dsp and mobile web development library compatible with processing maxim is designed to make it easier to program cross platform audio for desktops and mobile platforms it provides a single api for building complex audio applications on android ios and the desktop using the webaudioapi in combination with traditional java approaches for compatibility it s a work in progress but vastly simplifies the process of getting started writing audio and music software for mobile platforms some notes if you are using javascript mode make sure your browser supports webaudioapi properly see here for a list of browsers that support webaudio http caniuse com audio api
front_end
Full-Stack-Web-Development
full stack web development this course teaches full stack web development using php how should you use the resources available here 1 there are 2 different directories for cse b and cse c which will be updated periodically 2 we are going to start this course in a top to bottom fashion i e begin with php backend and gradually understand the concepts of html css and java script frontend 3 we will invest considerable time in learning java script through some small assignments games mini projects 4 students should create separate repositories where they will keep pushing solutions of assignments 5 in the later phase of this course multiple students a team of 4 to 6 students should collaborate on the projects 6 tutorials will be updated frequently in the main directory and lectures and labs will focus on solving these tutorials note in case of any doubt clarification or suggestion please contact deepakuniyal geu ac in deepak uniyal assistant professor cse graphic era deemed to be university dehradun uttarakhand to visit my linkedin profile click here https www linkedin com in deepak uniyal 592b7545
cse php java-script
front_end
web3.swift
web3 swift ethereum api for swift https raw github com argentlabs web3 swift master web3swift png swift https github com argentlabs web3 swift actions workflows swift yml badge svg branch develop https github com argentlabs web3 swift actions workflows swift yml installation swift package manager use xcode to add to the project file swift packages or add this to your package swift file swift package url https github com argentlabs web3 swift from 1 1 0 cocoapods add web3 swift to your podfile ruby pod web3 swift then run the following command bash pod install usage getting started create an instance of ethereumaccount with a ethereumkeystorage provider this provides a wrapper around your key for web3 swift to use note we recommend you implement your own keystorage provider instead of relying on the provided ethereumkeylocalstorage class this is provided as an example for conformity to the ethereumsinglekeystorageprotocol swift import web3 this is just an example ethereumkeylocalstorage should not be used in production code let keystorage ethereumkeylocalstorage let account try ethereumaccount create replacing keystorage keystorepassword my password create an instance of ethereumhttpclient or ethereumwebsocketclient this will then provide you access to a set of functions for interacting with the blockchain ethereumhttpclient swift guard let clienturl url string https an infura or similar url com 123 else return let client ethereumhttpclient url clienturl or ethereumwebsocketclient swift guard let clienturl url string wss goerli infura io ws v3 123 else return let client ethereumwebsocketclient url clienturl you can then interact with the client methods such as to get the current gas price swift client eth gasprice error currentprice in print the current gas price is currentprice if using async await you can await on the result swift let gasprice try await client eth gasprice smart contracts static types given a smart contract function abi like erc20 transfer javascript function transfer address recipient uint256 amount public returns bool then you can define an abifunction with corresponding encodable swift types like so swift public struct transfer abifunction public static let name transfer public let gasprice biguint nil public let gaslimit biguint nil public var contract ethereumaddress public let from ethereumaddress public let to ethereumaddress public let value biguint public init contract ethereumaddress from ethereumaddress nil to ethereumaddress value biguint self contract contract self from from self to to self value value public func encode to encoder abifunctionencoder throws try encoder encode to try encoder encode value this function can be used to generate contract call transactions to send with the client swift let function transfer contract 0xtokenaddress from 0xfrom to 0xto value 100 let transaction try function transaction client eth sendrawtransaction transaction withaccount account error txhash in print tx hash txhash if using async await you can await on the result swift let txhash try await client eth sendrawtransaction transaction withaccount account generating abi from a smart contract abi file currently we don t support code generation as making it properly is a bigger project and should possibly live outside of this repository you can try this project instead imanrep swiftabigen https github com imanrep swiftabigen data types the library provides some types and helpers to make interacting with web3 and ethereum easier ethereumaddress for representation of 
addresses including checksum support bigint and biguint using bigint https github com attaswift bigint library ethereumblock represents the block either number or rpc specific definitions like earliest or latest ethereumtransaction wraps a transaction encoders and decoders can work with it to generate proper data fields conversion from and to foundation types all extensions are namespaced under type web3 so for example to convert an int to a hex string swift let gwei 100 let hexgwei gwei web3 hexstring supported conversions convert from hex byte string 0xabc to data convert from hex byte string 0xabc to int convert from hex byte string 0xabc to biguint convert string int biguint data to a hex byte string 0xabc add or remove hex prefixes when working with string erc20 we support querying erc20 token data via the erc20 struct calls allow you to get the token symbol name and decimals get a token balance retrieve transfer events erc721 we support querying erc721 token data via the erc721 struct including get the token symbol name and decimals get a token balance retrieve transfer events decode standard json for nft metadata please be aware some smart contracts are not 100 compliant with standard zksync era we also include additional helpers to interact with zksync era https zksync io by importing web3 zksync take a look at zksynctransaction https github com argentlabs web3 swift blob develop web3swift src zksync zksynctransaction swift or use zksyncclient https github com argentlabs web3 swift blob develop web3swift src zksync zksyncprovider swift directly which has a similar api to the ethereumclient running tests some of the tests require a private key which is not stored in the repository you can ignore these while testing locally as ci will use the encrypted secret key from github it s better to run only the tests you need instead of the whole test suite while developing if you ever need to set up the key locally take a look at testconfig swift where you can manually set it up alternatively you can set it up by calling the script setupkey sh and passing the value adding 0x so it s written to an ignored file dependencies we built web3 swift to be as lightweight as possible however given the cryptographic nature of ethereum there s a couple of reliable c libraries you will find packaged with this framework keccak tiny https github com coruus keccak tiny an implementation of the fips 202 defined sha 3 and shake functions in 120 cloc 156 lines tiny aes https github com kokke tiny aes c a small and portable implementation of the aes ecb ctr and cbc encryption algorithms secp256k1 swift https github com boilertalk secp256k1 swift package dependencies bigint https github com attaswift bigint genericjson https github com iwill generic json swift secp256k1 https github com gigabitcoin secp256k1 swift git vapor websocket https github com vapor websocket kit apple swift log https github com apple swift log git also for linux build we can t use apple crypto apis so we embedded a small subset of cryptoswift instead of importing the whole library credit to marcin krzyżanowski https github com krzyzanowskim cryptoswift contributors the initial project was crafted by the team at argent however we encourage anyone to help implement new features and to keep this library up to date for features and fixes simply submit a pull request to the develop https github com argentlabs web3 swift tree develop branch please follow the contributing guidelines https github com argentlabs web3 swift blob master contributing
md for bug reports and feature requests please open an issue https github com argentlabs web3 swift issues license released under the mit license https github com argentlabs web3 swift blob master license
ethereum swift blockchain ethereum-api
blockchain
mad
mad github for mobile application development useful links http www ntu edu sg home ehchua programming android android ndk html http www ntu edu sg home ehchua programming android android howto html
front_end
MSVC-FreeRTOS-Template-202012.00
msvc freertos template 202012 00 a minimalistic freertos project only for windows excl demos etc just clone this project to where you want to start a freertos project on windows example 1 go to the folder where you want your project stored 2 open a command prompt and clone this repository git clone https github com ihavn msvc freertos template 202012 00 wanted name of your project 3 open the project in visual studio 4 set the solution target to x86 5 rename the project to the project name you want
os
SmartParkingSystem
smartparkingsystem made for the course embedded system design hardware simulation on tinkercad tech stack involved flask firebase tinkercad
smart-parking-system flask thingspeak tinkercad machine-learning hacktoberfest
os
FPGA-Design-for-Embedded-Systems-Specialization
fpga design for embedded systems specialization
os
Group-12-Mobile-Banking
group 12 jk2r mobile banking app final project in the subject mobile application development capture https user images githubusercontent com 107467802 179892522 a910fffe 7465 4c71 8b4d 3b110dfd6d29 png overview this is an application created in android studio using the java programming language for android the back end used is the firebase firestore database the app is real time displaying the current status that s on the firebase features the screenshots below comprise the entirety of this mobile banking application project otp generation you must carry out this operation before you may launch the jk2r mobile banking application the first step is to enter your active cell number to get your otp simply click get otp to process the code that will be supplied to you in less than a minute after waiting a short while for the code to be sent move on to the next stage screenshot 2022 07 18 15 02 50 41 https user images githubusercontent com 107467802 179890433 e5084d38 bfb4 49ce 9572 744171d821c6 jpg otp verification we are now verifying the mobile number you provided to receive the otp since the otp code was already sent to it a minute ago additionally this is being done to ensure the user s safety while utilizing the application the supplied code is already present on the screen as you can see simply copy the code click verify and sign in wait a few minutes for verification and if receiving the otp code proves challenging wait at least a minute before trying again screenshot 2022 07 18 15 02 54 79 https user images githubusercontent com 107467802 179890456 99206937 ee85 4da5 bc51 eb10e032451d jpg account registration to create an account as a new user of this application simply click create account and then provide the necessary details including the complete name email address password and mobile number for the application to function after entering the required data you can now click the register button screenshot 2022 07 18 15 36 49 28 https user images githubusercontent com 107467802 179891674 ccdcef07 7174 4a23 a801 85c5b2fc4009 jpg account log in if the user already has an account they can proceed with the transaction and other capabilities of the jk2r mobile banking application by entering the email address and password that they used to register screenshot 2022 07 18 15 03 30 37 https user images githubusercontent com 107467802 179891713 b28fa1af 1799 41fe a2ff 23c9cba28043 jpg main menu after signing in we may access the main menu to check our balance set savings objectives transfer money and change our profile you have access to all of our application s features from this point screenshot 2022 07 18 14 33 47 08 https user images githubusercontent com 107467802 179890062 84ac4ea7 f493 4749 87c1 54e993a3acdb jpg check balance transferring money checking your balance or starting a savings account are your three alternatives for completing the transaction to check your balance or to make a deposit enter the necessary amount of cash to start a withdrawal you click the button that says withdraw screenshot 2022 07 18 15 36 36 72 https user images githubusercontent com 107467802 179891568 8d0928db 81fa 4e23 a0e5 26a436183d85 jpg savings balance with the savings account the client can deposit money to add to the current savings account screenshot 2022 07 18 15 35 30 17 https user images githubusercontent com 107467802 179891339 242e4206 4139 4594 b9a8 b05a8d05ae2a jpg transfer of funds the transfer fund function serves as the means to move the client s money from savings
to checking or vice versa screenshot 2022 07 18 15 28 08 90 https user images githubusercontent com 107467802 179891084 e802e9df 6ae9 4337 8f03 9f26675aba3b jpg user profile the user can examine all the transactions and details they can also see the edit button which allows them to change or modify their existing data as they can see there is also a logout option which the user must use to exit their account for security reasons 293137945 422562293158001 7238145427493069215 n https user images githubusercontent com 107467802 179890400 d8c29214 df25 4db2 912b b56f4ca188ba jpg requirements android studio 4 0 or higher installed on a mac linux or windows machine android device in developer mode with usb debugging enabled usb cable to connect android device to your computer or use the configured specific mobile device android version of android studio emulator demo video click the link to watch https youtu be zv oai bwpe researcher developer joshua rufo kentmer daet kingsley alexis tumbaga rizza micah bibay subject mobile application development adviser prof jefferson a costales school name eulogio amang rodriguez institute of science and technology nagtahan sampaloc manila philippines college of arts and sciences ite department course and section bachelor of science in computer science 3 c date july 20 2022
front_end
react-universal
react universal app with social login starter kit minimal react redux boilerplate mern for desktop mobile and web app with social login feature inspired by creating universal apps like slack skype etc demo web app https react universal web herokuapp com expo mobile https expo io appetize simulator url https expo io by12380 react universal instructions click tap to play open with chrome click always scroll down the web page and click open project using expo electron app download links mac https www dropbox com s 2vnwx9dttz083or react universal 0 2 7 dmg raw 1 windows https www dropbox com s o97syfutahencpg react universal 20setup 200 2 7 exe raw 1 linux https www dropbox com s zrd413nhrmhibqg react universal 0 2 7 x86 64 appimage raw 1 br img src demo gif width 1000px br sample app using react universal todos app demo https react todo universal herokuapp com br img src https raw githubusercontent com by12380 react todo universal master todos demo gif width 1000px br features react universal redux universal electron desktop expo mobile express app server optional mongo db database optional automatic re login session storage sync app across all devices socket io social login auth0 https auth0 com default setup to demonstrate multiple social login platforms google facebook github twitter etc getting started git clone https github com by12380 react universal git cd react universal general setup auth0 auth0 setup for development web app client react electron app client electron expo app client expo app server server optional auth0 setup for development 1 sign in register auth0 account 2 go to application your app name settings 3 in allowed callback urls add http localhost 3000 callback for web and electron app https auth expo io your expo account username react universal for expo app in allowed logout urls add http localhost 3000 4 go to apis create api 5 set identifier ex https api react universal com hit create 6 go to settings toggle allow offline access save
react redux expo react-native electron social-login socket-io
front_end
Data-Engineering-Project-using-Azure-on-Olympics-Dataset
data engineering on olympics dataset using azure services leveraging azure cloud services end to end data engineering project on olympics dataset architecture diagram img src img architecture jpeg project tasks 1 get the data from the data source https www kaggle com datasets arjunprasadsarkhel 2021 olympics in tokyo 2 ingest the raw data into the data lake using the data factory 3 apply some transformations on the data in the databricks platform using apache spark 4 store the transformed data back in the data lake 5 analyze the transformed data using synapse analytics 6 build dashboards using power bi tableau or looker studio services used 1 azure data factory 2 azure data lake storage gen 2 3 azure data bricks 4 apache spark 5 azure synapse analytics 6 power bi data pipeline img src img data pipeline png visualizations ordering countries based on number of athletes img src img countries number of athletes jpeg ordering countries based on number of medals won img src img countries medals jpeg
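As a rough illustration of step 3 above (transforming the data on Databricks with Apache Spark), the sketch below reads a raw CSV from the data lake and computes athletes per country, matching the first visualization; the abfss paths, storage account placeholder and column names are assumptions based on the Kaggle dataset, not the project's actual code.

```python
# illustrative only: a databricks-style pyspark job for the transform step;
# the storage account, container names and csv columns are assumptions
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("olympics-transform").getOrCreate()

# read the raw athletes file ingested by data factory into the data lake
raw = (spark.read.option("header", "true")
       .csv("abfss://raw@<storage_account>.dfs.core.windows.net/athletes.csv"))

# athletes per country, matching the "countries by number of athletes" chart
athletes_per_country = (raw.groupBy("Country")
                        .count()
                        .orderBy(col("count").desc()))

# write the transformed data back to the lake for synapse analytics to query
(athletes_per_country.write.mode("overwrite")
 .parquet("abfss://transformed@<storage_account>.dfs.core.windows.net/athletes_per_country"))
```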
cloud
PVIT
position enhanced visual instruction tuning for multimodal large language models extending the functionality of mllms by integrating an additional region level vision encoder paper https arxiv org abs 2308 13437 demo https huggingface co spaces pvit pvit code license https img shields io badge code 20license apache 2 0 green svg https github com tatsu lab stanford alpaca blob main license data license https img shields io badge data 20license cc 20by 20nc 204 0 red svg https github com tatsu lab stanford alpaca blob main data license usage and license notices the data and checkpoint is intended and licensed for research use only they are also restricted to uses that follow the license agreement of llama vicuna and gpt 4 the dataset is cc by nc 4 0 allowing only non commercial use and models trained using the dataset should not be used outside of research purposes contents install install pvit weights pvit weights data generation data generation demo demo data data train train evaluation evaluation install 1 clone this repository and navigate to pvit folder shell git clone https github com thunlp mt pvit git cd pvit 2 install package shell conda create n pvit python 3 9 6 conda activate pvit pip install r requirements txt 3 install regionclip shell git clone https github com microsoft regionclip git pip install e regionclip click here https github com microsoft regionclip for more details pvit weights to get pvit weights please first download weights of llama https ai meta com resources models and libraries llama downloads and regionclip https github com microsoft regionclip for regionclip please download regionclip pretrained cc rn50x4 pth click here https huggingface co pvit pvit for pvit checkpoints please put all the weights in folder model weights and merge pvit weights with llama weights through the following command shell base model model weights llama 7b target model model weights pvit delta model weights pvit delta scripts delta apply sh data generation we provide prompts and few shot examples used when querying chatgpt in both task specific instruction data generation and general instruction data generation figure 3 b and figure 3 c in our paper https arxiv org abs 2308 13437 the data generation task specific folder includes seeds prompts and examples in single turn conversation generation and multi turn conversation generation single turn conversation generation includes five types of tasks small object recognition object relationship based reasoning optical character recognition ocr object attribute based reasoning and same category object discrimination the data generation general folder includes seeds prompts and examples used in general instruction data generation demo to run our demo you need to prepare pvit checkpoints locally please follow the instructions here pvit weights to download and merge the checkpoints web server to run the demo please first launch a web server with the following command shell model path model weights pvit controller port 39996 worker port 40004 scripts model up sh streamlit web ui run the following command to run a streamlit demo locally the port of model addr should be consistant with worker port shell model addr http 0 0 0 0 40004 scripts run demo sh cli inference run the following command to do cli inference locally the port of model addr should be consistant with worker port shell model addr http 0 0 0 0 40004 scripts run cli sh data you can download stage1 https huggingface co datasets pvit pvit data stage1 and stage2 https huggingface co 
datasets pvit pvit data stage2 training data on huggingface you are required to download pictures of coco2017 train https cocodataset org sbu captioned photo https www cs rice edu vo9 sbucaptions visual genome https homes cs washington edu ranjay visualgenome gqa https cs stanford edu people dorarad gqa and visual commonsense reasoning http visualcommonsense com datasets as well please put stage1 and stage2 data and the downloaded pictures in folder data as follows you can modify image paths in data stage1 mapping yaml and data stage2 mapping yaml to change the path of downloaded pictures img src assets tree jpg width 50 height 50 train our model is trained in two stages in stage 1 we initialize the model with the pre trained llava and only train the linear projection layer that is responsible for transforming the region features in stage 2 we only keep the parameters of the image encoder and the region encoder frozen and fine tune the rest of the model to train pvit please download the pretrained llava checkpoints https github com haotian liu llava and put it in folder model weights the following commands are for stage 1 training shell export model path model weights llava lightning 7b v1 export region clip path model weights regionclip pretrained cc rn50x4 pth export data path data stage1 export output dir checkpoints stage1 ckpt export port 25001 scripts train stage1 sh the following commands are for stage 2 training shell export model path checkpoints stage1 ckpt export region clip path model weights regionclip pretrained cc rn50x4 pth export data path data stage2 export output dir checkpoints stage2 ckpt export port 25001 scripts train stage2 sh evaluation we propose fineeval dataset for human evaluation see folder fine eval for the dataset and model outputs the files in the folder are as follows images image files of fineeval dataset instructions jsonl questions of fineeval dataset pvit jsonl the results of pvit ours model llava jsonl the results of llava model shikra jsonl the results of shikra model gpt4roi jsonl the results of gpt4roi model to run pvit on fineeval dataset you can launch a web server and run the following command the port of model addr should be consistant with worker port shell model addr http 0 0 0 0 40004 scripts run fine eval sh citation if you find pvit useful for your research and applications please cite using this bibtex bibtex misc chen2023positionenhanced title position enhanced visual instruction tuning for multimodal large language models author chi chen and ruoyu qin and fuwen luo and xiaoyue mi and peng li and maosong sun and yang liu year 2023 eprint 2308 13437 archiveprefix arxiv primaryclass cs cv acknowledgement llava https github com haotian liu llava the codebase we built upon which has the amazing multi modal capabilities vicuna https github com lm sys fastchat the codebase llava built upon and the base model vicuna 13b that has the amazing language capabilities regionclip https github com microsoft regionclip our prompt encoder related projects instruction tuning with gpt 4 https github com instruction tuning with gpt 4 gpt 4 llm llava large language and vision assistant https github com haotian liu llava llava med training a large language and vision assistant for biomedicine in one day https github com microsoft llava med otter in context multi modal instruction tuning https github com luodian otter shikra unleashing multimodal llm s referential dialogue magic https github com shikras shikra gpt4roi instruction tuning large language model on 
region of interest https github com jshilong gpt4roi
ai
ibu-devops-engineering-on-aws-cloud-g4
ibu devops engineering on aws cloud group 4 team members 1 amina kod aga https github com aminakodzaga 2 adnan krnd ija https github com adnankrndzijaa 3 adnan brki https github com kojiado repository description there are two folders on the github repository resources and docs the application code is in the resources folder and all the documentation required for the project is in the docs folder the documentation includes an estimate of costs on a monthly and annual level including the necessary services in aws an architectural diagram of the project and all scripts we used for implementation of the project project description university website phase 1 in the first phase we created a diagram and estimated costs for aws services on a monthly and annual basis on the diagram you can see how we planned our project to look and in the end we did everything that way we divided the system into two availability zones us east 1a us east 1b and did a part of the work in each one so that in the end we could combine it all and get a project that as a whole is fully functional image https github com adnankrndzijaa ibu devops engineering on aws cloud group 4 assets 92021913 4c50fbd4 a856 4dc6 8441 63985a1c27cd image https github com adnankrndzijaa ibu devops engineering on aws cloud group 4 assets 92021913 834940bc f1b9 4b35 9369 8111db6005bc phase 2 in the second phase we created a vpc 2 public and 2 private subnets an internet gateway a nat gateway and security groups after which we connected all these things so they work properly together we used a script with code from the aws academy course when creating a new instance group4 publicinstance this instance was created for the website together with the database so it was necessary to use a session manager to successfully bring everything together and work we successfully created a virtual machine as an ec2 instance using an ubuntu amazon machine image then we tested the crud operations on our website and found that everything works correctly adding deleting updating and listing data phase 3 given that we already created the necessary 2 private subnets in the previous phase each for one availability zone we moved on to creating a new instance phase3instance for which we used a new script without a predefined database and using the labinstanceprofile iam role after that we created an amazon rds database for mysql which will later be connected to a new instance immediately after that we created cloud9 and ran the script to create a secret on the secrets manager inside the secrets manager service there was a secret that we created and it contained all the data about the database including the user db name host endpoint and db password when we checked that everything was created as we set it up we moved on to migrating the old database to the new one and connecting the new instance to that created database within the amazon rds service using the commands from the script we managed to migrate the data from the first original database to the newly created one and thus successfully merge and complete this phase of the project after a successful connection we were able to successfully read old data edit and delete them but also to add new data on the new instance image https github com adnankrndzijaa ibu devops engineering on aws cloud group 4 assets 92021913 de9df513 2415 4b76 ada5 0f63f8ef6a03 phase 4 in the last phase we implemented high availability and scalability services that will improve our website first we created the ami image for the instance we
created in phase 3 then we created the load balancer and auto scaling group for asg we primarily needed a launch template which we also had to create we checked the created services and confirmed that they were successfully created as we specified them and then using the dns name from elb we accessed our website and tested it our website was now fully created with all the plugins we created and set up afterwards as the last task within this phase we performed load testing of this application with the help of commands from the script we started the load testing installation command within the cloud9 service and then the testing command image https github com adnankrndzijaa ibu devops engineering on aws cloud group 4 assets 92021913 36d898df aad4 4d7d afd4 5897ef5897f9
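As an illustration of how phase 3's credentials flow might look in code, this sketch reads the database secret back out of AWS Secrets Manager with boto3; the secret name, region and JSON keys are assumptions, so use whatever the creation script actually stored.

```python
# illustration of reading phase 3's database credentials back out of aws
# secrets manager; the secret name, region and json keys are assumptions
import json

import boto3

client = boto3.client("secretsmanager", region_name="us-east-1")
secret = client.get_secret_value(SecretId="group4-db-secret")  # hypothetical name
creds = json.loads(secret["SecretString"])

# e.g. hand these to the website's mysql connection; keep the password out of logs
print(creds["host"], creds["dbname"], creds["user"])
```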
cloud
notionGPT
notiongpt notiongpt a practical tool built on top of the chatgpt large language model make it your note taking assistant notiongpt combines 1 the notion api https developers notion com reference intro 2 snownlp https github com isnowfy snownlp 3 the openai embedding api https platform openai com docs api reference embeddings 4 pinecone 5 prompt engineering 6 the openai qa api tech stack 1 notion api get database content page content 2 snownlp chinese sentence segmentation 3 pinecone vectordb personal knowledge db upsert query 4 openai sentence embedding prompt qa 5 fastapi frontend web ui 6 prompt engineering cot chain of thought search in chain chain of keyword usage 1 notion database id auth token 2 pinecone pinecone api key pinecone env name proxy needed 3 openai openai api key 4 git clone https github com suiwan notiongpt git 5 pip install r requirements txt 6 fill in code 7 python webui py todo add function of updating notes to database better prompt engineering other qa llm support local vector database reference chain of thought https arxiv org abs 2201 11903 search in chain https arxiv org abs 2304 14732
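A minimal sketch of the embed-upsert-query loop described in the tech stack, written against the pre-1.0 openai SDK and pinecone-client v2; the index name, record ids, metadata and example texts are made up for illustration and the repo's actual code may differ.

```python
# a sketch of the embed -> upsert -> query flow, not the repo's actual code;
# assumes openai<1.0 and pinecone-client v2, with placeholder keys and names
import openai
import pinecone

openai.api_key = "OPENAI_API_KEY"          # openai api key from the usage steps
pinecone.init(api_key="PINECONE_API_KEY",  # pinecone api key and env name
              environment="PINECONE_ENV_NAME")
index = pinecone.Index("notion-notes")     # hypothetical index name

def embed(text: str) -> list[float]:
    # openai sentence embedding, as listed in the tech stack
    resp = openai.Embedding.create(model="text-embedding-ada-002", input=[text])
    return resp["data"][0]["embedding"]

# upsert one snownlp-segmented sentence pulled from a notion page
index.upsert(vectors=[
    ("note-1", embed("meeting notes about the q3 roadmap"), {"page": "roadmap"}),
])

# query the personal knowledge db for context to stuff into the qa prompt
result = index.query(vector=embed("what did we decide about q3?"),
                     top_k=3, include_metadata=True)
for match in result["matches"]:
    print(match["id"], match["score"], match["metadata"])
```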
python vectordb pinecone fastapi chatgpt llm
ai
web-development-and-mobile-challenges
web development and mobile challenges this repository is designed so that students can easily find small projects to learn web and mobile development choose your challenge 0 hello world it won t count challenges 0 hello world readme md 1 average test score challenges 1 average test score readme md 2 investment challenges 2 investment readme md 3 interest rate challenges 3 interest rate readme md 4 la padarie challenges 4 la padarie readme md contributing pull requests are welcome for major changes please open an issue first to discuss what you would like to change read our contribution guidelines here https github com infojrufba web development and mobile challenges blob main contributing md updating your fork does this repo have more challenges than yours then update your fork there are two ways to update your fork 1 github ui in your repo make a pr of this repository for yours and accept simple as that smile 2 terminal bash 1 switch to the master branch git checkout master 2 check if your local copy has the link to the original git remote v 3 if not add the original link git remote add upstream https github com infojrufba web development and mobile challenges git 4 make sure that the link has been added git remote v 5 you can now fetch with the original repo assuming the link name is upstream git fetch upstream 6 merge updates to your master branch git merge upstream master master 7 push the new changes to your fork git push origin master
hacktoberfest hacktoberfest2021
front_end
vit
vit img src images great tit square small png alt logo width 150 height 150 align right visual interactive taskwarrior full screen terminal interface for vit 1 3 visit here https github com vit project vit tree 1 3 features fully customizable key bindings default vim like uncluttered display no mouse speed per column colorization advanced tab completion multiple customizable themes override customize column formatters intelligent sub project indenting requirements taskwarrior https taskwarrior org python https www python org 3 7 pip https pypi org project pip installation follow the directions in install md install md quick start run vit help from the command line for basic usage instructions run vit from the command line to start vit with default config report and filters while vit is running type help followed by enter to review basic command navigation actions recommendations vit will suggest to install a default user config file if none exists it s fully commented with all configuration options check it out do vit help know the vit command line arguments do help in vit look over the commands use an xterm terminal for full color support for suggestions on further tweaks see customize md customize md vit handles task coloring differently than taskwarrior see color md color md for more details troubleshooting see faq md faq md upgrading follow the directions in upgrade md upgrade md development interested in the architecture or in helping out with development see development md development md in tribute our friend and collaborator steve rader passed away in may 2013 we owe a lot to steve for his excellent work and so vit is preserved maintained and continued taskwarrior team support taskwarrior org
hacktoberfest
front_end
Cloud_Engineering_Project
cloud engineering final project predicting airline prices developed by alejandra lelo de larrea ibarra bannasorn paspanthong ruben nakano samuel swain this project develops a model to predict airline ticket prices for different flight configurations and deploys it as a web application br table of contents business problem id businessproblem data description id datadesc data science project id dsproject pipeline id pipeline web application id webapp project structure id structure br div id id buisnessproblem business problem airline ticket prices are subject to constant fluctuations due to various factors such as demand availability competition seasonal trends and economic conditions for airlines having a reliable prediction system can greatly assist in making informed decisions regarding pricing strategies revenue management and resource allocation based on the forecasts revenue optimization airlines can leverage price predictions to optimize their resource allocation and revenue management strategies customer retention satisfied customers are more likely to choose an airline that consistently offers transparent and fair pricing competitive advantage developing a robust prediction system can provide airlines with a competitive edge in the market helping airlines differentiate themselves from competitors additionally travelers can benefit from accurate price predictions by improving the decision making travelers can plan their trips more effectively by choosing the most cost effective options enhancing customer experience reliable price prediction reduces uncertainty and allows customers to secure the best deal br div id id datadesc data description data for the project is collected from easemytrip com and available on kaggle https www kaggle com datasets shubhambathwal flight price prediction it contains information about flights between india s top 6 metropolitan cities approximately 300 000 data points 25 mb from feb 11th 2022 to mar 31st 2022 50 days with the following features airline flight number seat type departure time arrival time origin destination number of stops flight duration br div id id dsproject data science project the goal of this project is to develop a predictive model to forecast airline ticket prices and deploy the solution as a web application to accomplish this we follow the following steps data collection obtain comprehensive data on historic flight prices and related variables data preparation exploration clean and explore the data for insights enrich dataset with extra features model development training train selected machine learning models on data model evaluation validation evaluate models hyperparameters using validation accuracy model deployment integrate the model into the travel agency s website the data science solution is implemented in two steps pipeline and web application we write python modules for the different tasks and provide docker images so that the solution can be implemented regardless of the os and for reproducibility purposes both solutions are implemented leveraging aws the code for pipeline and app implementation can be found under the 04 implementation folder architecture diagram 03 img architecturediagram png br div id id pipeline stage 1 pipeline this stage of the project develops the entire pipeline to train different ml models to predict airline prices for different flight configurations detailed instructions to run and reproduce the pipeline are included in the readme file inside the pipeline folder br div id id webapp stage 2 web application this stage of the
project uses a flask application as backend to serve the trained model as the model endpoint and streamlit as front end to deploy a web application that can predict the price of a flight given some flight characteristics detailed instructions to run and reproduce the application are included in the readme file inside the app folder br div id id structure project structure app readme config logging webapp log yaml webapp yaml dockerfiles dockerfile plane jpg requirements txt src aggregate data py predict api py webapp py test py pipeline readme config default config yaml logging local conf pipeline log dockerfiles dockerfile pipeline main dockerfile tests notebooks clean data ipynb eda ipynb modeling ipynb pipeline py requirements main txt requirements tests txt src init py aws utils py clean data py generate features py raw data py train model py tests init py test clean data py test generate features py test train model py
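To make the front-end/back-end split concrete, here is a pared-down sketch of a Streamlit page posting to a Flask prediction endpoint over HTTP; the route, port, payload fields and response key are assumptions for illustration, not the project's actual API.

```python
# a pared-down sketch of the streamlit front end posting to the flask model
# endpoint; route, port and payload fields are assumptions for illustration
import requests
import streamlit as st

st.title("flight price prediction")

airline = st.selectbox("airline", ["AirAsia", "Indigo", "Vistara"])
stops = st.selectbox("number of stops", ["zero", "one", "two_or_more"])
duration = st.number_input("duration (hours)", min_value=0.5, value=2.0)

if st.button("predict price"):
    payload = {"airline": airline, "stops": stops, "duration": duration}
    resp = requests.post("http://localhost:5000/predict", json=payload, timeout=10)
    st.write(f"estimated price: {resp.json()['prediction']}")
```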
cloud
GCWeb
continuous deployment status https github com wet boew gcweb workflows continuous 20deployment badge svg devdependency status https david dm org wet boew gcweb dev status png theme shields io https david dm org wet boew gcweb info devdependencies contributor covenant https img shields io badge contributor 20covenant v1 4 20adopted ff69b4 svg code of conduct md slack https img shields io badge slack gc 20design 20system 20workspace yellow style flat logo slack https join slack com t design gc conception shared invite enqtode1otc5mzg5nzq4lwq3mjzjmtdjmjk2ztzmmtjjywq3zmrindywyjrmn2njyzqynjflndbly2fknwe1odg2yjexy2qwzmvjn2mwmgm gcweb canada ca theme navigate to the official canada ca reference implementation theme gcweb https wet boew github io gcweb purpose the purpose of this repository is to allow the developer community to use test and experiment with new components layouts templates and features for building digital products under the canada ca brand use this theme to provide a more usable consistent and trustworthy online experience for people who access government of canada digital services start with user tested templates patterns and design principles to get going quickly and help save time and money how does it work coming who will use this project coming what is the goal of this project coming how to contribute see contributing md contributing md how to build and run the tests with docker docker build t gcweb p this command will build gcweb and run the full set of tests once complete you should see that the docker container is listening on 0 0 0 0 8000 but this does not work exactly as intended to rebuild you should stop it and start it again license unless otherwise noted the source code of this project is covered under crown copyright government of canada and is distributed under the mit license license the canada wordmark and related graphics associated with this distribution are protected under trademark law and copyright law no permission is granted to use them outside the parameters of the government of canada s corporate identity program for more information see federal identity requirements https www canada ca en treasury board secretariat topics government communications federal identity requirements html
os
machine-learning-systems-design
machine learning systems design read this booklet here https huyenchip com machine learning systems design toc html this booklet was my initial attempt to write about machine learning systems design back in 2019 my understanding of the topic has gone through significant iterations since then my book designing machine learning systems https www amazon com designing machine learning systems production ready dp 1098107969 o reilly june 2022 is much more comprehensive and up to date the new book s repo https github com chiphuyen dmls book contains the full table of contents chapter summaries and random thoughts on mlops tooling this booklet covers four main steps of designing a machine learning system 1 project setup 2 data pipeline 3 modeling selecting training and debugging 4 serving testing deploying and maintaining it comes with links to practical resources that explain each aspect in more details it also suggests case studies written by machine learning engineers at major tech companies who have deployed machine learning systems to solve real world problems at the end the booklet contains 27 open ended machine learning systems design questions that might come up in machine learning interviews the answers for these questions will be published in the book machine learning interviews you can look at and contribute to community answers to these questions on github here https github com chiphuyen machine learning systems design tree master answers you can read more about the book and sign up for the book s mailing list here https huyenchip com 2019 07 21 machine learning interviews html contribute this is work in progress so any type of contribution is very much appreciated here are a few ways you can contribute 1 improve the text by fixing any lexical grammatical or technical error 1 add more relevant resources to each aspect of the machine learning project flow 1 add edit questions 1 add edit answers 1 other this book was created using the wonderful magicbook https github com magicbookproject magicbook package for detailed instructions on how to use the package see their github repo the package requires that you have node if you re on mac you can install node using brew install node install magicbook with npm install magicbook clone this repository git clone https github com chiphuyen machine learning systems design git cd machine learning systems design after you ve made changes to the content in the content folder you can build the booklet by the following steps magicbook build you ll find the generated html and pdf files in the folder build acknowledgment i d like to thank ben krause for being a great friend and helping me with this draft citation
data-science machine-learning-production mlops
os
dataengineering-youtube-analysis-using-aws-cloud
data engineering youtube analysis project overview this project aims to securely manage streamline and perform analysis on the structured and semi structured youtube videos data based on the video categories and the trending metrics project goals 1 data ingestion build a mechanism to ingest data from different sources 2 etl system we are getting data in raw format transforming this data into the proper format 3 data lake we will be getting data from multiple sources so we need a centralized repo to store them 4 scalability as the size of our data increases we need to make sure our system scales with it 5 cloud we can t process vast amounts of data on our local computer so we need to use the cloud in this case we will use aws 6 reporting build a dashboard to get answers to the questions we asked earlier services used 1 aws iam identity and access management which enables us to manage access to aws services and resources securely 2 amazon s3 amazon s3 is an object storage service that provides industry leading scalability data availability security and performance 3 aws glue a serverless data integration service that makes it easy to discover prepare and combine data for analytics machine learning and application development 4 aws lambda lambda is a computing service that allows programmers to run code without creating or managing servers 5 aws athena athena is an interactive query service for s3 in which there is no need to load data it stays in s3 6 quicksight amazon quicksight is a scalable serverless embeddable machine learning powered business intelligence bi service built for the cloud dataset used this kaggle dataset contains statistics csv files on daily popular youtube videos over the course of many months there are up to 200 trending videos published every day for many locations the data for each region is in its own file the video title channel title publication time tags views likes and dislikes description and comment count are among the items included in the data a category id field which differs by area is also included in the json file linked to the region https www kaggle com datasets datasnaek youtube new architecture diagram img src architecture jpeg note i have attached the work done in the aws console in aws console videos directory where i recorded the s3 buckets lambda functions glue crawlers and jobs etc created aws cli s3 shell commands sh contains aws cli commands to upload data into aws s3 bucket lambda function py is used to create trigger function in aws lambda pyspark py is used to create etl job in aws glue job
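A simplified sketch of what the S3-triggered Lambda mentioned above might look like: read the uploaded raw JSON, flatten the nested records, and land Parquet in the cleansed layer; the environment variable names, the nested "items" key and the availability of an awswrangler Lambda layer are assumptions.

```python
# a simplified take on the lambda trigger function mentioned above; assumes
# the awswrangler lambda layer and hypothetical environment variable names
import os
import urllib.parse

import awswrangler as wr
import pandas as pd

def lambda_handler(event, context):
    # bucket/key of the raw json file that fired the s3 trigger
    bucket = event["Records"][0]["s3"]["bucket"]["name"]
    key = urllib.parse.unquote_plus(event["Records"][0]["s3"]["object"]["key"])

    # the category json keeps its useful records under a nested "items" key
    df_raw = wr.s3.read_json(f"s3://{bucket}/{key}")
    df_items = pd.json_normalize(df_raw["items"])

    # land the flattened data as parquet and register it in the glue catalog
    wr.s3.to_parquet(
        df=df_items,
        path=os.environ["s3_cleansed_layer"],
        dataset=True,
        database=os.environ["glue_catalog_db_name"],
        table=os.environ["glue_catalog_table_name"],
        mode="append",
    )
```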
cloud
MuDPT
mudpt this repository contains the official codes for multi modal deep prompt tuning for large pre trained vision language models https arxiv org abs 2306 11400 news 2023 8 15 released the initial code some parts still need to be updated 2023 7 10 we presented our paper at the icme 2023 https www 2023 ieeeicme org program php ss1 special session advances in language and vision research oral 2023 3 12 our paper was accepted by icme 2023 icme 2023 yongzhu miao shasha li jintao tang ting wang mudpt multi modal deep prompt tuning for large pre trained vision language models img mudpt png how to use our code citation if you re using mudpt in your research or applications please cite using this bibtex bibtex inproceedings 10219840 author miao yongzhu and li shasha and tang jintao and wang ting booktitle 2023 ieee international conference on multimedia and expo icme title mudpt multi modal deep symphysis prompt tuning for large pre trained vision language models year 2023 volume number pages 25 30 doi 10 1109 icme55011 2023 00013
ai
radmind
radmind radmind is a suite of unix command line tools and a server designed to remotely administer the file systems of multiple unix machines copyright c 2003 regents of the university of michigan all rights reserved see copyright contents quick installation instructions quick installation instructions detailed installation instructions detailed installation instructions configuring for ubuntu 18 04 configuring for ubuntu 1804 configuring for centos 7 configuring for centos 7 configuring for freebsd 11 configuring for freebsd 11 configuring for redhat 9 configuring for redhat 9 configuring for macos configuring for macos getting the source getting the source configuring and building configuring and building building an os x installer package building an os x installer package known issues known issues more information more information references references quick installation instructions from within the source directory configure make make install detailed installation instructions configuring for ubuntu 18 04 these are the commands i had to run to get radmind to build using vagrant and generic ubuntu1804 https app vagrantup com generic boxes ubuntu1804 sudo apt get y update sudo apt get y install gcc sudo apt get y install libssl dev sudo apt get y install make sudo apt get y install autoconf configuring for centos 7 these are the commands i had to run to get radmind to build using vagrant and generic centos7 https app vagrantup com generic boxes centos7 sudo yum y update sudo yum y install git sudo yum y install openssl devel x86 64 configuring for freebsd 11 these are the commands i had to run to get radmind to build using vagrant and generic freebsd11 https app vagrantup com generic boxes freebsd11 sudo pkg update sudo pkg install y git sudo pkg install y autoconf configuring for redhat 9 to properly build radmind on redhat 9 with ssl support you have to specify the location of your kerberos files export cppflags i usr kerberos include configuring for macos last tested on 10 14 using homebrew install xcode https developer apple com xcode install brew https brew sh run these commands as an admin user brew install autoconf brew install openssl sudo ln s usr local opt openssl usr local openssl last tested on 10 14 using fink install xcode https developer apple com xcode install fink https finkproject org run these commands as an admin user fink install autoconf fink install openssl configuring for raspbian stretch debian 9 sudo apt get y update sudo apt get y install git sudo apt get y install autoconf sudo apt get y install libssl dev getting the source you can either download the source from the radmind project homepage http radmind org and uncompress it into a directory of your choice or else you can use git to build the most recent development source of the project building radmind from the git repository is a good way to ensure you ve got the most up to date version of everything you can also help contribute by filing bug reports on the radmind github page https github com radmind radmind first clone the repository locally git clone https github com radmind radmind git radmind then move into the directory and check out the required submodules 1 cd radmind sh bootstrap sh configuring and building if the configure files are to be rebuilt because we installed it issue the following commands autoconf cd libsnet autoconf cd now that everything is set up we have to actually do the configuration and installation configure the build configure note that the configure scripts take several options
to see them all do configure help now we re ready to actually build everything together make make install building an os x installer package the radmind makefile contains a target called package which will construct a mac os x installer package suitable for distribution to make the package log in as an administrator enter the radmind source directory and follow the steps below configure make package during the build process you will be prompted for your password packagemaker currently does not work with make so at the end of the build process you will see make package error 2 even though the package was created successfully after the source has been built and the package created you will be left with a package called radmindtools pkg in the parent directory of the radmind source this file may be double clicked in the finder to launch the installer this target will fail if it is used on a system other than mac os x known issues on opendarwin based systems the message hfs bwrite called with lock bit set is logged when you are doing a high volume of writes to a volume lcksum s progress output currently does not provide steady feedback increments more information if you have any problems with this source you may want to check the issue tracker issues to see if any problems have been reported you can also contact the radmind development team by e mailing mlib its mac github radmind lists utah edu mailto mlib its mac github radmind lists utah edu an archived e mail discussion list has also been set up see the website for details on how to join in june of 2015 management of this project was transferred from the university of michigan to the university of utah the university of utah decided to migrate the project from the rapidly deteriorating sourceforge hosting site over to github we felt that this would help keep the project alive and make it easier to maintain note that the university of utah while longtime users of radmind are no longer contributing to the upkeep of the project except to merge pull requests make sure it builds and keep this readme updated if you feel you would be a better steward for radmind s future contact us the transfer of issues bugs and their comments was automated using the gosf2github https github com cmungall gosf2github script because no username map was readily available all of the issues and comments were automatically assigned to the member of the university of utah s team who managed the migration pdarragh https github com pdarragh references 1 current submodules libsnet http sourceforge net projects libsnet a networking library with tls support
os
generative-agents
generative agents a simple framework for working with generative agents powered by llms license mit https img shields io badge license mit green svg twitter follow https img shields io twitter follow toughyear style social https twitter com toughyear https dcbadge vercel app api server 9njpmxtvaw compact true style flat https discord gg 9njpmxtvaw basic demo https raw githubusercontent com toughyear blog uploads main uploads ga generative agents basic demo gif what are generative agents these agents act as drop in replacement for humans in online sandbox environments you can use them to simulate complex social behavior in a sandbox environment create dynamic npcs for your games create autonomous agents that acquire new skills while interacting with the environment like minecraft see it in the wild basic demo https raw githubusercontent com toughyear blog uploads main uploads demo 2 gif we created a sandbox simulation based on the alpha version of the agent architecture check out the live demo http demo multimode run or check out the demo video https www youtube com watch v hu4fj1gwxag on youtube openai apis can be slow and unstable at times check the network tab to see if the requests are failing or getting rate limited we are currently working on a more stable solution usage we have started with a typescript implementation of the agent architecture and plan to add more languages in the future you can create a sample agent with the following code ts import agent agentengine from generative agents create an engine that manages the network requests and overall state of the world const engine new agentengine openaikey create an agent const agent new agent engine thomas miller agent id thomas miller agent s name 42 agent s age background runs taiki seafood restaurant with his wife susan has a daughter lucy and a son mike currentgoal creating a new speciality dish for restaurant and getting a new chef innatetendency curious adventurous optimistic learnedtendency hardworking responsible diligent lifestyle goes to work comes home spends time with family goes to bed early values family honesty integrity once created the agent has following methods available ts stream relevant observations to the agents to make the world believable observe description string promise void agents auto reflect on their observations and update their internal state after crossing certain thresholds reflect promise void at initialization agents create a plan for the day createplan testing boolean promise void execute the current task in the plan executecurrenttask promise void agents can chat with each other and external users like humans replywithcontext message string participants string promise string repo structure the repository is divided into two parts core language examples type core contains the core implementation of the agent architecture it is divided into language specific folders currently we only have a typescript implementation examples contains examples of how to use the core implementation currently we have an example of a react phaser implementation of the agent architecture in a 2d top down pixelart game we look forward to adding more examples in the future contributions are welcome contributing yes we need you the current implementation is experimental if you are interested we would love to plan the future of this project and build with you all it will be interesting to see what we all can build with it next up on the roadmap is to add more languages and examples we are also working on a more stable 
solution for the openai api we are also planning to work on a generic minecraft agent that can acquire new skills while interacting with the environment help us build it join the development on discord https discord gg 9njpmxtvaw if you have any ideas or suggestions please open an issue or a pull request twitter dms are open here https twitter com toughyear we are so back let s build the matrix together notice for ease of access we have put our openai key in the code hopefully we will not see any abuse if you are planning to use this code for your own project we request you to create your own openai key and use it instead
generative-agents llm openai
ai
LLM-scientific-feedback
can large language models provide useful feedback on research papers a large scale empirical analysis python 3 10 https img shields io badge python 3 10 blue svg https www python org downloads release python 3100 black https img shields io badge code 20style black 000000 svg https github com ambv black arxiv https img shields io badge arxiv 2310 01783 b31b1b svg https arxiv org abs 2310 01783 this repo provides the python source code of our paper can large language models provide useful feedback on research papers a large scale empirical analysis https arxiv org abs 2310 01783 pdf https arxiv org pdf 2310 01783 pdf twitter https twitter com james y zou status 1709608909395357946 inproceedings llm research feedback 2023 title can large language models provide useful feedback on research papers a large scale empirical analysis author liang weixin and zhang yuhui and cao hancheng and wang binglu and ding daisy and yang xinyu and vodrahalli kailas and he siyu and smith daniel and yin yian and mcfarland daniel and zou james booktitle arxiv preprint arxiv 2310 01783 year 2023 abstract expert feedback lays the foundation of rigorous research however the rapid growth of scholarly production and intricate knowledge specialization challenge the conventional scientific feedback mechanisms high quality peer reviews are increasingly difficult to obtain researchers who are more junior or from under resourced settings have especially hard times getting timely feedback with the breakthrough of large language models llm such as gpt 4 there is growing interest in using llms to generate scientific feedback on research manuscripts however the utility of llm generated feedback has not been systematically studied to address this gap we created an automated pipeline using gpt 4 to provide comments on the full pdfs of scientific papers we evaluated the quality of gpt 4 s feedback through two large scale studies we first quantitatively compared gpt 4 s generated feedback with human peer reviewer feedback in 15 nature family journals 3 096 papers in total and the iclr machine learning conference 1 709 papers the overlap in the points raised by gpt 4 and by human reviewers average overlap 30 85 for nature journals 39 23 for iclr is comparable to the overlap between two human reviewers average overlap 28 58 for nature journals 35 25 for iclr the overlap between gpt 4 and human reviewers is larger for the weaker papers i e rejected iclr papers average overlap 43 80 we then conducted a prospective user study with 308 researchers from 110 us institutions in the field of ai and computational biology to understand how researchers perceive feedback generated by our gpt 4 system on their own papers overall more than half 57 4 of the users found gpt 4 generated feedback helpful very helpful and 82 4 found it more beneficial than feedback from at least some human reviewers while our findings show that llm generated feedback can help researchers we also identify several limitations for example gpt 4 tends to focus on certain aspects of scientific feedback e g add experiments on more datasets and often struggles to provide in depth critique of method design together our results suggest that llm and human feedback can complement each other while human expert review is and should continue to be the foundation of rigorous scientific process llm feedback could benefit researchers especially when timely expert feedback is not available and in earlier stages of manuscript preparation before peer review 1 https github com weixin liang 
llm scientific feedback assets 32794044 8958eb56 a652 45bb 9347 e9578f432ae0 2 https github com weixin liang llm scientific feedback assets 32794044 6228288b 9a54 4c90 8510 32bb823f1e05 usage to run the code you need to 1 create a pdf parsing server and run in the background 2 create the llm feedback server 3 open the web browser and upload your paper create and run pdf parsing server sciencebeam pdf parser only supports x86 linux operating system please let us know if you find solutions for other operating systems bash conda env create f conda environment yml conda activate sciencebeam python m sciencebeam parser service server port 8080 make sure this is running in the background create and run llm feedback server bash conda create n llm python 3 10 conda activate llm pip install r requirements txt cat your openai api key key txt replace your openai api key with your openai api key starting with sk python main py if you have installed sciencebeam using x86 linux and want to generate feedback from the raw pdf file python main from text py if you are using other operating systems or want to generate feedback from the parsed paper in text format open the web browser and upload your paper open http 0 0 0 0 7799 and upload your paper the feedback will be generated in around 120 seconds you should get the following output demo demo png if you encounter any error please first check the server log and then open an issue
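The repo's own main.py does more than this (PDF parsing, prompting, the web UI), but since it pins openai==0.27.8, the core feedback call has roughly the shape below; the prompt wording and the key.txt handling here are illustrative assumptions, not the paper's exact pipeline:

```python
import openai

# the repo stores the key in key.txt; reading it like this is an assumption
openai.api_key = open("key.txt").read().strip()

def generate_feedback(paper_text: str) -> str:
    # illustrative prompt only -- the actual prompt used in the paper differs
    prompt = (
        "You are a scientific reviewer. Read the paper below and give "
        "feedback on significance, novelty, reasons for acceptance or "
        "rejection, and suggestions for improvement.\n\n" + paper_text[:24000]
    )
    resp = openai.ChatCompletion.create(  # openai 0.27.x style API
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
        temperature=0.0,
    )
    return resp["choices"][0]["message"]["content"]
```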
ai
accel-brain-code
accel brain code from proof of concept to prototype the purpose of this repository is to make prototypes as case study in the context of proof of concept poc and research and development r d that i have written in my website accel brain https accel brain com japanese and accel brain co ltd https accel brain co jp japanese the main research topics are auto encoders in relation to the representation learning the statistical machine learning for energy based models adversarial generation networks gans deep reinforcement learning such as deep q networks semi supervised learning and neural network language model for natural language processing problem setting deep learning after the era of democratization of artificial intelligence ai how the research and development r d on the subject of machine learning including deep learning after the era of democratization of artificial intelligence ai can become possible simply implementing the models and algorithms provided by standard machine learning libraries and applications like automl would reinvent the wheel if you just copy and paste the demo code from the library and use it your r d would fall into dogmatically authoritarian development or so called the hype driven development if you fall in love with the concept of democratization of ai you may forget the reality that the r d is under the influence of not only democracy but also capitalism the r d provides economic value when its r d artifacts are distinguished from the models and algorithms realized by standard machine learning libraries and applications such as automl in general terms r d must provide a differentiator to maximize the scarcity of its implementation artifacts on the other hand it must be remembered that any r d builds on the history of the social structure and the semantics of the concepts envisioned by previous studies many models and algorithms are variants derived not only from research but also from the relationship with business domains it is impossible to assume differentiating factors without taking commonality and identity between society and its history problem solution poc of poc the blind spot of democratization of ai occurs when a new concept is created throughout the society including business it takes time before a new concept can be broken down into an interface specification from a perspective such as object oriented analysis and code that conforms to the interface specification can be implemented there will always be some difference between the new ai created in this way and the ai already democratized in a more realistic perspective casual users who are just waiting for the ai to be democratized will always fall behind on the contrary those who can create new concepts and new ais with poc will always continue to have a leading advantage in the market where ai is the main topic hiding behind the democratic movement of ai democratization is the dry reality of capitalist competition lifehack of lifehack the basic theme in my poc is a lifehack which is any technique that reduces the burden of our life and make it easier to control or more convenient considering that many lifehack solutions are technological and obviously product design and development technology are kind of life which can be hacked lifehack itself also can be purpose of lifehack because of this autologie a seemingly endless round of my poc and technological prototypes is rotary driven by selbstreferenz to repeat lifehack of lifehack cyclically in this problem setting and recursive solutions this 
repository is functionally differentiated by compositions such as information collection searching optimal solution and focus booster each function can be considered an integral component of lifehack solutions these tools make the process of contemplation more efficient and accelerate our brain enabling provisions for the development of other tools in this repository all code implemented as in an algorithm of machine learning or data science reflects the concept of proof of concept poc problem solution accel brain base https github com accel brain accel brain code tree master accel brain base as part of prototyping this repository publishes a special machine learning library accel brain base https github com accel brain accel brain code tree master accel brain base accel brain base is a basic library of the deep learning for rapid development at low cost this library makes it possible to design and implement deep learning which must be configured as a complex system or a system of systems by combining a plurality of functionally differentiated modules such as a restricted boltzmann machine rbm deep boltzmann machines dbms a stacked auto encoder an encoder decoder based on long short term memory lstm and a convolutional auto encoder cae div align center img src https storage googleapis com accel brain code deep learning by means of design pattern img horse099 jpg p image in a href https avaminzhang wordpress com 2012 12 07 e3 80 90dataset e3 80 91weizmann horses target blank the weizmann horse dataset a p div div align center img src https storage googleapis com accel brain code deep learning by means of design pattern img reconstructed by cae gif p reconstructed image by strong convolutional auto encoder strong p div from the viewpoints of functional equivalents and structural expansions this library also prototypes many variants such as energy based models and generative models typical examples are generative adversarial networks gans and adversarial auto encoders aaes in addition it provides deep reinforcement learning that applies the neural network described above as a function approximator considering many variable parts structural unions and functional equivalents in the deep learning paradigm which are variants derived not only from research but also from the relationship with business domains from the perspective of commonality variability analysis in order to practice object oriented design this library provides abstract classes that define the skeleton of the deep learning algorithm in an operation deferring some steps in concrete variant algorithms such as the deep boltzmann machines stacked auto encoder encoder decoder based on lstm and convolutional auto encoder to client subclasses the abstract classes and the interfaces in this library let subclasses redefine certain steps of the deep learning algorithm without changing the algorithm s structure these abstract classes can also provide new original models and algorithms such as generative adversarial networks gans deep reinforcement learning or neural network language model by implementing the variable parts of the fluid elements of objects documentation full documentation is available on https code accel brain com accel brain base readme html https code accel brain com accel brain base readme html this document contains information on functionally reusability functional scalability and functional extensibility problem solution automatic summarization library pysummarization https github com chimera0 accel brain
code tree master automatic summarization pysummarization is a python3 library for the automatic summarization document abstraction and text filtering the function of this library is automatic summarization using a kind of natural language processing this library enables you to create a summary with the major points of the original document or web scraped text that is filtered by text clustering documentation full documentation is available on https code accel brain com automatic summarization https code accel brain com automatic summarization this document contains information on functionally reusability functional scalability and functional extensibility problem solution reinforcement learning library pyqlearning https github com chimera0 accel brain code tree master reinforcement learning pyqlearning is a python library to implement reinforcement learning and deep reinforcement learning especially for q learning deep q network and multi agent deep q network which can be optimized by annealing models such as simulated annealing adaptive simulated annealing and quantum monte carlo method according to the reinforcement learning problem settings q learning is a kind of temporal difference learning td learning that can be considered as hybrid of monte carlo method and dynamic programming method as monte carlo method td learning algorithm can learn by experience without model of environment and this learning algorithm is functional extension of bootstrap method as dynamic programming method the commonality variability of q learning in this library q learning can be distinguished into epsilon greedy q learning and boltzmann q learning these algorithms are functionally equivalent but their structures should be conceptually distinguished considering many variable parts and functional extensions in the q learning paradigm from perspective of commonality variability analysis in order to practice object oriented design this library provides abstract class that defines the skeleton of a q learning algorithm in an operation deferring some steps in concrete variant algorithms such as epsilon greedy q learning and boltzmann q learning to client subclasses the abstract class in this library lets subclasses redefine certain steps of a q learning algorithm without changing the algorithm s structure simple maze solving by deep q network demo search maze by deep q network ipynb https github com chimera0 accel brain code blob master reinforcement learning demo search maze by deep q network ipynb is a jupyter notebook which demonstrates a maze solving algorithm based on deep q network rigidly coupled with deep convolutional neural networks deep cnns the function of the deep learning is generalisation and cnns is a function approximator in this notebook several functional equivalents such as cnn long short term memory lstm networks and the model which loosely coupled cnn and lstm can be compared from a functional point of view div align center p a href https github com chimera0 accel brain code blob master reinforcement learning demo search maze by deep q network ipynb target blank img src https storage googleapis com accel brain code reinforcement learning img dqn single agent goal compressed gif a p p deep reinforcement learning to solve the maze p div black squares represent a wall light gray squares represent passages a dark gray square represents a start point a white square represents a goal point the pursuit evasion game expanding the search problem of the maze makes it possible to describe the pursuit evasion
game that is a family of problems in mathematics and computer science in which one group attempts to track down members of another group in an environment this problem can be re described as the multi agent control problem which involves decomposing the global system state into an image like representation with information encoded in separate channels this reformulation allows us to use convolutional neural networks to efficiently extract important features from the image like state demo search maze by deep q network ipynb https github com chimera0 accel brain code blob master reinforcement learning demo search maze by deep q network ipynb also prototypes multi agent deep q network to solve the pursuit evasion game based on the image like state representation of the multi agent div align center table style border none tr td width 45 align center p a href https github com chimera0 accel brain code blob master reinforcement learning demo search maze by deep q network ipynb target blank img src https storage googleapis com accel brain code reinforcement learning img dqn multi agent demo crash enemy 2 compressed gif a p p multi agent deep reinforcement learning to solve the pursuit evasion game the player is caught by enemies p td td width 45 align center p a href https github com chimera0 accel brain code blob master reinforcement learning demo search maze by deep q network ipynb target blank img src https storage googleapis com accel brain code reinforcement learning img dqn multi agent demo goal enemy 2 compressed gif a p p p multi agent deep reinforcement learning to solve the pursuit evasion game the player reaches the goal p td tr table div black squares represent a wall light gray squares represent passages a dark gray square represents a start point moving dark gray squares represent enemies a white square represents a goal point combinatorial optimization problem and simulated annealing there are many hyperparameters that we have to set before the actual searching and learning process begins each parameter should be decided in relation to reinforcement learning theory and it causes side effects in the training model this issue can be considered as a combinatorial optimization problem which is an optimization problem where an optimal solution has to be identified from a finite set of solutions in this problem setting this library provides an annealing model such as simulated annealing to search optimal combination of hyperparameters as exemplified in annealing hand written digits ipynb https github com chimera0 accel brain code blob master reinforcement learning annealing hand written digits ipynb there are many functional extensions and functional equivalents of simulated annealing for instance adaptive simulated annealing also known as the very fast simulated reannealing is a very efficient version of simulated annealing and quantum monte carlo which is generally known as a stochastic method to solve the schrödinger equation is one of the earliest types of solution in order to simulate the quantum annealing on a classical computer documentation full documentation is available on https code accel brain com reinforcement learning https code accel brain com reinforcement learning this document contains information on functionally reusability functional scalability and functional extensibility problem solution generative adversarial networks library pygan https github com chimera0 accel brain code tree master generative adversarial networks pygan is a python library to implement generative
adversarial networks gans and adversarial auto encoders aaes this library makes it possible to design the generative models based on the statistical machine learning problems in relation to generative adversarial networks gans and adversarial auto encoders aaes to practice algorithm design for semi supervised learning the generative adversarial networks gans goodfellow et al 2014 framework establishes a min max adversarial game between two neural networks a generative model g and a discriminative model d the discriminator model d x is a neural network that computes the probability that a observed data point x in data space is a sample from the data distribution positive samples that we are trying to model rather than a sample from our generative model negative samples concurrently the generator uses a function g z that maps samples z from the prior p z to the data space g z is trained to maximally confuse the discriminator into believing that samples it generates come from the data distribution the generator is trained by leveraging the gradient of d x w r t x and using that to modify its parameters this library provides the adversarial auto encoders aaes which is a probabilistic auto encoder that uses gans to perform variational inference by matching the aggregated posterior of the feature points in hidden layer of the auto encoder with an arbitrary prior distribution makhzani a et al 2015 matching the aggregated posterior to the prior ensures that generating from any part of prior space results in meaningful samples as a result the decoder of the adversarial auto encoder learns a deep generative model that maps the imposed prior to the data distribution documentation full documentation is available on https code accel brain com generative adversarial networks https code accel brain com generative adversarial networks this document contains information on functionally reusability functional scalability and functional extensibility problem solution algorithmic composition https github com chimera0 accel brain code tree master algorithmic composition pycomposer is python library for algorithmic composition or automatic composition by reinforcement learning such as q learning and recurrent temporal restricted boltzmann machine rtrbm q learning and rtrbm in this library allows you to extract the melody information about a midi tracks and these models can learn and inference patterns of the melody and this library has wrapper class for converting melody data inferenced by q learning and rtrbm into midi file documentation full documentation is available on https code accel brain com algorithmic composition https code accel brain com algorithmic composition this document contains information on functionally reusability functional scalability and functional extensibility problem solution cardbox https github com chimera0 accel brain code tree master cardbox this is the simple card box system that make you able to find and save your ideas you can write down as many ideas as possible onto cards like the kj method or the mindmap tools this simple javascript tool helps us to discover potential relations among the cards that you created and the tagging function allow you to generate metadata of cards as to make their meaning and relationships understandable problem solution binaural beat and monaural beat with python https github com chimera0 accel brain code tree master binaural beat and monaural beat with python accelbrainbeat is a python library for creating the binaural beats or monaural beats you 
can play these beats and generate wav files the frequencies can be optionally selected this python script enables you to handle your mind state by a kind of brain wave controller which is generally known as binaural beats or monaural beats in a simplified method documentation full documentation is available on https code accel brain com binaural beat and monaural beat with python https code accel brain com binaural beat and monaural beat with python this document contains information on functionally reusability functional scalability and functional extensibility problem solution binaural beat and monaural beat with js https github com chimera0 accel brain code tree master binaural beat and monaural beat with js these modules are functionally equivalent to python scripts in accelbrainbeat problem solution subliminal perception https github com chimera0 accel brain code tree master subliminal perception these javascript tools are for experimentation of subliminal perception this is demo code for my case study in the context of my website references the basic concepts theories and methods behind this library are described in the following books div align center a href https www amazon co jp dp b08pv4zqg5 target blank img src https storage googleapis com accel brain code accel brain books in house r and d in the era of democratization of ai book cover jpg width 160px a p a href https www amazon co jp dp b08pv4zqg5 ref sr 1 1 dchild 1 qid 1607343553 s digital text sr 1 1 text e6 a0 aa e5 bc 8f e4 bc 9a e7 a4 beaccel brain target blank ai a japanese p div br div align center a href https www amazon co jp dp b093z533lk target blank img src https storage googleapis com accel brain code accel brain books ai vs investors as noise traders book cover jpg width 160px a p a href https www amazon co jp dp b093z533lk target blank ai vs a japanese p div br div align center a href https www amazon co jp dp b0994ch3cm target blank img src https storage googleapis com accel brain code accel brain books babel of natural language processing book cover jpg width 160px a p a href https www amazon co jp dp b0994ch3cm target blank ai a japanese p div div align center a href https www amazon co jp dp b09c4kyzbx target blank img src https storage googleapis com accel brain code accel brain books origin of the statistical machine learning book cover jpg width 160px a p a href https www amazon co jp dp b09c4kyzbx target blank a japanese p div br div align center a href https www amazon co jp dp b09jc4z7b4 target blank img border 0 src https storage googleapis com accel brain code accel brain books media aesthetics of data visualization book cover jpg width 160px a p a href https www amazon co jp dp b09jc4z7b4 target blank a japanese p div div align center a href https www amazon co jp dp b09p4jxqwb target blank img border 0 src https storage googleapis com accel brain code accel brain books education of the faustian man book cover jpg width 160px a p a href https www amazon co jp dp b09p4jxqwb target blank a japanese p div author accel brain co ltd author uri http accel brain com http accel brain co jp license gnu general public license v2 0
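pyqlearning's real class hierarchy differs, but the epsilon-greedy variant discussed in the reinforcement-learning section above boils down to the following generic tabular sketch; the state/action types and hyperparameter values here are arbitrary assumptions, not the library's API:

```python
import random
from collections import defaultdict

class EpsilonGreedyQLearner:
    """Generic tabular epsilon-greedy Q-learning; not pyqlearning's actual API."""

    def __init__(self, actions, alpha=0.1, gamma=0.9, epsilon=0.1):
        self.q = defaultdict(float)  # (state, action) -> estimated value
        self.actions = list(actions)
        self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon

    def select_action(self, state):
        # explore with probability epsilon, otherwise act greedily
        if random.random() < self.epsilon:
            return random.choice(self.actions)
        return max(self.actions, key=lambda a: self.q[(state, a)])

    def update(self, state, action, reward, next_state):
        # td(0) target: r + gamma * max_a' q(s', a')
        best_next = max(self.q[(next_state, a)] for a in self.actions)
        td_target = reward + self.gamma * best_next
        self.q[(state, action)] += self.alpha * (td_target - self.q[(state, action)])

# toy usage on a grid-world style state
learner = EpsilonGreedyQLearner(actions=["up", "down", "left", "right"])
a = learner.select_action(state=(0, 0))
learner.update((0, 0), a, reward=-1.0, next_state=(0, 1))
```

The Boltzmann variant the readme contrasts with this one replaces the epsilon coin-flip in select_action with sampling from a softmax over the Q-values.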
automatic-summarization deep-learning deep-reinforcement-learning reinforcement-learning q-learning restricted-boltzmann-machine simulated-annealing quantum-monte-carlo quantum-annealing auto-encoder combinatorial-optimization transfer-learning lstm multi-agent-reinforcement-learning deep-q-network semi-supervised-learning generative-adversarial-network energy-based-model self-supervised-learning
ai
VisionWorks
visionworks basic computer vision problem and work contents basic matlab basicmatlab hybrid image hybridimage corner detection cornerdetection panorama stitching panoramastitching seam carving seamcarving
ai
ethers.js
the ethers project npm tag https img shields io npm v ethers https www npmjs com package ethers ci tests https github com ethers io ethers js actions workflows test ci yml badge svg branch main https github com ethers io ethers js actions workflows test ci yml npm bundle size version https img shields io bundlephobia minzip ethers npm downloads https img shields io npm dm ethers gitpoap badge https public api gitpoap io v1 repo ethers io ethers js badge https www gitpoap io gh ethers io ethers js twitter follow https img shields io twitter follow ricmoo style social https twitter com ricmoo a complete compact and simple library for ethereum and ilk written in typescript https www typescriptlang org features keep your private keys in your client safe and sound import and export json wallets geth parity and crowdsale import and export bip 39 mnemonic phrases 12 word backup phrases and hd wallets english as well as czech french italian japanese korean simplified chinese spanish traditional chinese meta classes create javascript objects from any contract abi including abiv2 and human readable abi connect to ethereum nodes over json rpc https github com ethereum wiki wiki json rpc infura https infura io etherscan https etherscan io alchemy https alchemyapi io ankr https ankr com or metamask https metamask io ens names are first class citizens they can be used anywhere an ethereum address can be used small 144kb compressed 460kb uncompressed tree shaking focused include only what you need during bundling complete functionality for all your ethereum desires extensive documentation https docs ethers org v6 large collection of test cases which are maintained and added to fully written in typescript with strict types for security and safety mit license including all dependencies completely open source to do with as you please keep updated for advisories and important notices follow ethersproject https twitter com ethersproject on twitter low traffic non marketing important information only as well as watch this github project for more general news discussions and feedback follow or dm me ricmoo https twitter com ricmoo on twitter or on the ethers discord https discord gg qytsscgyyc for the latest changes see the changelog https github com ethers io ethers js blob main changelog md summaries september 2022 https blog ricmoo com highlights ethers js september 2022 d7bda0fc37ed june 2022 https blog ricmoo com highlights ethers js june 2022 f5328932e35d march 2022 https blog ricmoo com highlights ethers js march 2022 f511fe1e88a1 december 2021 https blog ricmoo com highlights ethers js december 2021 dc1adb779d1a september 2021 https blog ricmoo com highlights ethers js september 2021 1bf7cb47d348 may 2021 https blog ricmoo com highlights ethers js may 2021 2826e858277d march 2021 https blog ricmoo com highlights ethers js march 2021 173d3a545b8d december 2020 https blog ricmoo com highlights ethers js december 2020 2e2db8bc800a installing nodejs home ricmoo some project npm install ethers browser esm the bundled library is available in the dist folder in this repo script type module import ethers from dist ethers min js script documentation browse the documentation https docs ethers org online getting started https docs ethers org v6 getting started full api documentation https docs ethers org v6 api various ethereum articles https blog ricmoo com providers ethers works closely with an ever growing list of third party providers to ensure getting started is quick and easy by providing default keys to
each service these built in keys mean you can use ethers getdefaultprovider and start developing right away however the api keys provided to ethers are also shared and are intentionally throttled to encourage developers to eventually get their own keys which unlock many other features such as faster responses more capacity analytics and other features like archival data when you are ready to sign up and start using for your own keys please check out the provider api keys https docs ethers org v5 api keys in the documentation a special thanks to these services for providing community resources ankr https www ankr com quicknode https www quicknode com etherscan https etherscan io infura https infura io alchemy https dashboard alchemyapi io signup referral 55a35117 028e 4b7c 9e47 e275ad0acc6d pocket https pokt network pocket gateway ethereum mainnet extension packages the ethers package only includes the most common and most core functionality to interact with ethereum there are many other packages designed to further enhance the functionality and experience hardware wallets coming soon account abstraction coming soon license mit license including all dependencies
ethereum javascript typescript ethers web3 blockchain
blockchain
jquery-web-dev-lab
jquery web development lab starter files the lab instructions are in the pdf in the docs folder docs jquery w slider pdf
front_end
CodingWithMitch-Blog-Course
a href https codingwithmitch com courses building a website django python img class header img src https codingwithmitch s3 amazonaws com static building a website django python images python web development png a a href https codingwithmitch com courses building a website django python h1 building a website with django python h1 a p learn how to build a website with strong django strong the web development framework for strong python strong p p in the course you ll learn p ul li register update authenticate delete users li li blog posting create retrieve update delete crud li li building an admin panel li li launch a production website just like a href codingwithmitch com target blank codingwithmitch com a li li use aws s3 to store and serve static files on your server images li li bootstrap li li html li li css li li and much more li ul p when you re done the course push your website strong live on the internet strong with these videos p ol li a href https codingwithmitch com courses hosting a django website with digital ocean hosting django website digital ocean target blank host your website with digital ocean a li li a href https codingwithmitch com courses hosting a django website with digital ocean registering domain name namecheap target blank register a custom domain a li li a href https codingwithmitch com courses hosting a django website with digital ocean setup https nginx and certbot target blank set up https on your server a li li a href https codingwithmitch com courses hosting a django website with digital ocean object storage django spaces s3 and aws target blank object storage with django digital ocean spaces s3 aws a li li a href https codingwithmitch com courses hosting a django website with digital ocean send password reset emails production env target blank send emails with django for password reset a li ol p then we ll take development to the next level by building a strong rest api strong on the website so other technologies can communicate with it strong ex an android app strong p ol li a href https codingwithmitch com courses build a rest api target blank build a rest api a li ol
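The course's source code is not reproduced here, but the blog-posting CRUD it teaches centers on a Django model roughly like this sketch — the field names and app layout are assumptions, not the course's exact code, and ImageField additionally requires Pillow:

```python
# blog/models.py -- illustrative only
from django.conf import settings
from django.db import models

class BlogPost(models.Model):
    title = models.CharField(max_length=120)
    body = models.TextField()
    image = models.ImageField(upload_to="blog_images/")  # served from S3 in production
    author = models.ForeignKey(settings.AUTH_USER_MODEL, on_delete=models.CASCADE)
    date_published = models.DateTimeField(auto_now_add=True)
    slug = models.SlugField(unique=True)

    def __str__(self):
        return self.title
```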
python django digital-ocean digital-ocean-django
front_end
MCW-Azure-Blockchain
azure blockchain this workshop is archived and is no longer being maintained content is read only northwind traders is the world s largest food and beverage company the company has a long history of innovation since its founding more than 150 years ago over the last few years northwind has been increasing their emphasis on tracking their products from the origin of the raw materials all way through the manufacturing process to the consumer they pride themselves on being able to certify both the origin and delivery of their products with high accuracy jill anders the cto of northwind traders has reached out to you to help them build a truly innovative solution to better track their shipments jill says we need a system that is more secure more efficient and will help us lower not just it costs but other costs across the organization they ve heard of blockchain and smart contract technologies and are thinking these may help them solve this problem target audience developers it professional cloud solution architect workshop in this workshop you will learn how to design a solution with ethereum blockchain ledger and several azure services to collect device telemetry information and enforce contract specifics related to conditions during the transport of goods at the end of this workshop you will be better equipped to deploy and configure azure blockchain workbench write and deploy ethereum smart contracts with solidity and integrate iot and the blockchain ledger into a single solution whiteboard design session in this whiteboard design session you will work with a group to learn how to build and configure an internet of things iot audit solution using azure blockchain services you will do this using ethereum blockchain ledger with the use of smart contracts to collect device telemetry information in addition to enforce contract specifics related to conditions during transport of goods specifically the iot devices will report temperature and humidity data that will be validated through the smart contracts against agreed upon acceptable ranges at the end of this session you will be able to deploy and configure azure blockchain workbench write and deploy ethereum smart contracts with solidity and integrate both iot and blockchain together into a single solution hands on lab in this lab you will learn how to build and configure an internet of things iot audit solution using azure blockchain services you will do this using ethereum blockchain ledger with the use of smart contracts to collect device telemetry information in addition to enforce contract specifics related to conditions during transport of goods specifically the iot devices will report temperature and humidity data that will be validated through the smart contracts against agreed upon acceptable ranges at the end of this hands on lab you will be better able to build a solution to deploy and configure azure blockchain workbench write and deploy ethereum smart contracts with solidity and integrate both iot and blockchain together into a single solution azure services and related products blockchain workbench iot hub service bus sql database key vault event grid application insights related references microsoft cloud workshop https microsoftcloudworkshop com index html
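The workshop enforces the telemetry rule inside an Ethereum smart contract written in Solidity; purely to illustrate the rule itself, here is the same compliance check restated in Python, with the acceptable ranges picked arbitrarily rather than taken from the workshop's contract:

```python
# thresholds are made-up examples, not the workshop's contract terms
MIN_TEMP, MAX_TEMP = 2.0, 8.0            # degrees celsius
MIN_HUMIDITY, MAX_HUMIDITY = 40.0, 60.0  # percent relative humidity

def telemetry_compliant(temperature: float, humidity: float) -> bool:
    """False means a reading fell outside the agreed ranges, which is what
    would move the smart contract into an out-of-compliance state."""
    return (MIN_TEMP <= temperature <= MAX_TEMP
            and MIN_HUMIDITY <= humidity <= MAX_HUMIDITY)

assert telemetry_compliant(4.5, 50.0)
assert not telemetry_compliant(12.0, 50.0)
```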
mcw blockchain app-builder aad smart-contracts ethereum event-hub stream-analytics logic-apps
blockchain
c1m1
author k shreenath bohra details assignment submission week 1 introduction to embedded systems
os
Information-Academy
information academy a technology information website
server
-Big-Data-and-ML-on-Google-Cloud
big data and ml on google cloud
cloud
sparkify-redshift-data-warehouse
creating a redshift data warehouse on aws for the music streaming service sparkify table of contents 1 project motivation and description project motivation 2 installation installation 3 file descriptions file descriptions 4 authors and acknowledgements authors acknowledgements project motivation and description a name project motivation a the analytics team of the fictional music streaming service sparkify wants to be enabled to understand what songs users are listening to this project aims to support this need by modeling log data which resides in s3 in json format and setting up a redshift data warehouse in the amazon cloud to make the data available for analysis in more detail a cloud based etl pipeline has to be implemented to load data from s3 transform it and load it into the newly designed and created redshift database installation a name installation a aws 0 create iam user dwhadmin in the aws management console files 1 add aws key and secret of the dwhadmin user into dwh example cfg and save it under dwh cfg 2 run create cluster py to create an aws redshift data warehouse 3 add dwh endpoint and dwh role arn into dwh cfg they get logged in step 2 4 run create tables py to create the tables in aws redshift 5 run etl py to load data from staging tables to analytics tables on redshift make sure to delete your redshift cluster afterwards you can use drop cluster py if not needed anymore to prevent unnecessary costs to check if your cluster is still running use check running cluster py file descriptions a name file descriptions a check running cluster py returns a list of running redshift clusters of the attached user create cluster py creates the aws redshift data warehouse and dwh role arn create tables py creates the fact and dimension tables defined in sql queries py in redshift drop cluster py deletes the redshift dwh and iam role created in create cluster py dwh example cfg an example of the configuration file etl py implements the etl pipeline that loads data from s3 into staging tables on redshift processes them and finally loads them into the redshift analytics tables sql queries py contains all sql statements needed within the above files authors and acknowledgements a name authors acknowledgements a this project has been implemented as part of the udacity data engineering nanodegree program the data has been provided by udacity accordingly as well as the project structure file templates
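To make step 5 concrete, a typical etl.py for this kind of project connects with psycopg2 and issues Redshift COPY statements to fill the staging tables; the config section/key names and the S3 path below are assumptions for illustration, not necessarily this repo's exact values:

```python
import configparser
import psycopg2

config = configparser.ConfigParser()
config.read("dwh.cfg")  # section and key names are assumed

conn = psycopg2.connect(
    host=config["CLUSTER"]["HOST"],
    dbname=config["CLUSTER"]["DB_NAME"],
    user=config["CLUSTER"]["DB_USER"],
    password=config["CLUSTER"]["DB_PASSWORD"],
    port=config["CLUSTER"]["DB_PORT"],
)
cur = conn.cursor()

# a redshift COPY loads json logs from s3 into a staging table in parallel
copy_events = """
    COPY staging_events FROM 's3://example-bucket/log_data'
    IAM_ROLE '{}'
    FORMAT AS JSON 'auto'
    REGION 'us-west-2';
""".format(config["IAM_ROLE"]["ARN"])

cur.execute(copy_events)
conn.commit()
conn.close()
```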
aws redshift cloud etl dwh data-engineering data-warehouse amazon-web-services amazon-s3
cloud
felm
felm image title png p align center a href https hkust nlp github io felm target blank website a a href https huggingface co datasets hkust nlp felm target blank hugging face dataset a a href http arxiv org abs 2310 00741 target blank paper a p felm is a meta benchmark to evaluate factuality evaluation for large language models the benchmark comprises 847 questions that span five distinct domains world knowledge science technology writing recommendation reasoning and math we gather prompts corresponding to each domain by various sources including standard datasets like truthfulqa online platforms like github repositories chatgpt generation or drafted by authors we then obtain responses from chatgpt for these prompts for each response we employ fine grained annotation at the segment level which includes reference links identified error types and the reasons behind these errors as provided by our annotators image felm examples png download method 1 download the whole dataset by wget https huggingface co datasets hkust nlp felm blob main all jsonl method 2 load the dataset using hugging face datasets https huggingface co datasets hkust nlp felm python from datasets import load dataset dataset load dataset r hkust nlp felm wk print dataset test 0 data description dataset snapshot category data number of instances 847 number of fields 5 labeled classes 2 number of labels 4427 descriptive statistics statistic all world knowledge reasoning math science tech writing recommendation segments 4427 532 1025 599 683 1588 positive segments 3642 385 877 477 582 1321 negative segments 785 147 148 122 101 267 data fields field name field value description index integer the order number of the data point source string the prompt source prompt string the prompt for generating response response string the response of chatgpt for prompt segmented response list segments of response labels list factuality labels for segmented response comment list error reasons for segments with factual error type list error types for segments with factual error ref list reference links typical data point index 0 source quora prompt which country or city has the maximum number of nuclear power plants response the united states has the highest number of nuclear power plants in the world with 94 operating reactors other countries with a significant number of nuclear power plants include france china russia and south korea segmented response the united states has the highest number of nuclear power plants in the world with 94 operating reactors other countries with a significant number of nuclear power plants include france china russia and south korea labels false true comment as of december 2022 there were 92 operable nuclear power reactors in the united states type knowledge error null ref https www eia gov tools faqs faq php id 207 t 3 evaluation on felm environment requirements transformers 4 32 0 openai 0 27 8 tenacity 8 2 2 tokenizer 3 4 2 pandas 2 0 3 to reproduce our results cd eval put all jsonl here downloaded by method 1 bash eval sh you can choose vicuna 30b gpt 3 5 turbo and gpt 4 for the parameter model you can choose raw cot link content and cot cons cot cons means cot self consistency method for the parameter method replace your openai key with your openai api key if using gpt 3 5 or gpt 4 leaderboard in segment level model f1 score balanced accuracy gpt4 48 3 67 1 vicuna 33b 32 5 56 5 chatgpt 25 5 55 9 we only report the highest scores in this table licenses mit license https img shields io badge license mit blue
svg https lbesson mit license org this work is licensed under a mit license https lbesson mit license org cc by nc sa 4 0 https img shields io badge license cc 20by nc sa 204 0 lightgrey svg http creativecommons org licenses by nc sa 4 0 the felm dataset is licensed under a creative commons attribution noncommercial sharealike 4 0 international license http creativecommons org licenses by nc sa 4 0 citation please cite our paper if you use our dataset bibtex inproceedings chen2023felm title felm benchmarking factuality evaluation of large language models author chen shiqi and zhao yiran and zhang jinghan and chern i chun and gao siyang and liu pengfei and he junxian booktitle thirty seventh conference on neural information processing systems datasets and benchmarks track year 2023 url http arxiv org abs 2310 00741
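The leaderboard above reports F1 and balanced accuracy at the segment level; assuming the factual-error class is treated as the positive label (an assumption about FELM's scoring, not stated above), the two metrics relate to the per-segment labels like this toy scikit-learn sketch:

```python
from sklearn.metrics import balanced_accuracy_score, f1_score

# toy gold and predicted factuality labels per segment (True = factual)
gold = [True, True, False, True, False]
pred = [True, False, False, True, True]

# map to ints with "has factual error" as the positive class (an assumption)
y_true = [0 if g else 1 for g in gold]
y_pred = [0 if p else 1 for p in pred]

print("f1 on the error class:", f1_score(y_true, y_pred))
print("balanced accuracy:", balanced_accuracy_score(y_true, y_pred))
```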
ai
Arduino-FreeRTOS-SAMD21
freertos v10 2 1 for arduino samd21 boards this library will allow you to create freertos projects in the arduino ide and run them on your samd21 boards want freertos for the samd51 use this other repository https github com briscoetech arduino freertos samd51 tested boards sparkfun samd21 mini sparkfun samd21 dev adafruit feather m0 atmel xplained samd21 what s new in the recent versions added and updated example projects with lessons learned to help you get started setting up a new project added optional serial printing when the rtos fails makes tracking down and diagnosing project problems easier added example project demonstrating the most common rtos failures and how you might detect them optional feature wrapped memory functions this linker setting change will allow all microcontroller malloc free realloc calloc operations to be managed by freertos this could eliminate memory corruption issues on c intensive projects or projects that might be fragmenting the heap implementation guide can be found in wrapping memory functions platform local txt special thanks to these people for your help and guidance reference material and hard work on contributions richard barry for creating freertos and sharing it with the world www freertos org trlafleur drewfish baekgaard sergiotomasello godario tomasroj feilipu greiman
os
CLI-App-Generation-Bootcamp
command line interface cafe app python project p align center img src images app menu png width 600 p index overview overview initialisation initialisation database database features features conclusion conclusion overview a cafe app that uses the command line interface for python3 br using python code to natively view create update and remove from the products courier and order tables br integration with docker using containers such as adminer for providing the database and mysql for management of the data initialisation start by creating and running a virtual environment br python3 m venv venv br install required packages br pip install r requirements txt br run docker for mysql database br docker compose up d br connect to adminer by the following link select the miniproject tab sh http localhost 8080 input the username and password to login br p align center img src images login gif width 600 p run the saucy app in the saucy folder br py app py database with adminer the changes are made instantly in realtime to the database through the backend without having to close the app through the frontend we can see the change to the database reflected upon site refresh br for example we can see the functionality work below when creating a new coconut product p align center img src images database gif width 600 p this extends to all additional functions such as update and delete features traditionally python s cli terminals do not come with visual aid so i used packages installed from npm to supplement the ui aspect of the front end with the use of packages i was able to utilise colours menu ascii art loading bars tables and cowsay features to give an element of interactivity to a cli app p align center img src images feature gif width 600 p i was also able to take this a step further with the back end of the app having error exceptions in mind for inputs i was also able to colour code for exceptions to print in red while successful entries printed in green for example execute query sql bars loading info info adding product to database entry function clear screen print fore green f name with a price of price has been added to the database except exception as e function clear screen bars loading info info processing print fore red restarting error str e break we can see the outcome of a wrong input below in red alongside the error type printed p align center img src images errorlog gif width 600 p conclusion the project originally included reading and writing to csv where the app relied solely on python coding for earlier versions br can be found in function py br as the project scope changed towards the final objective using a mysql database so did the code it was started from scratch as we worked on adding and removing more code to fit incrementally with understanding the developing nature building an app br this app was an independent project that we presented to our cohorts and mentors upon reaching the deadline to demonstrate our abilities with python3
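The snippet quoted above lost its formatting in this dump; as a rough reconstruction of the same create-product flow, here is a self-contained sketch using pymysql and colorama — the table schema, connection details, and function names are assumptions, not the app's actual code:

```python
import pymysql
from colorama import Fore, init

init(autoreset=True)  # reset colour after every print

def add_product(conn, name: str, price: float) -> None:
    try:
        with conn.cursor() as cur:
            cur.execute(
                "INSERT INTO products (name, price) VALUES (%s, %s)",
                (name, price),
            )
        conn.commit()
        print(Fore.GREEN + f"{name} with a price of {price} has been added to the database")
    except Exception as e:
        print(Fore.RED + f"restarting, error: {e}")

conn = pymysql.connect(host="localhost", port=3306, user="root",
                       password="password", database="miniproject")
add_product(conn, "coconut", 2.50)
```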
server
real-time-iot-device-monitoring-with-kinesis
deprecation notice this aws solution has been archived and is no longer maintained by aws to discover other solutions please visit the aws solutions library https aws amazon com solutions real time iot device monitoring with kinesis analytics aws solution for analyzing iot device connectivity using kinesis analytics os python environment setup bash sudo apt get update sudo apt get install zip wget gawk sed y building lambda package bash cd deployment build s3 dist sh source bucket base name solution name solution version source bucket base name should be the base name for the s3 bucket location where the template will source the lambda code from the template will append region name to this value for example build s3 dist sh solutions the template will then expect the source code to be located in the solutions region name bucket cf template and lambda function the cf template is located in the deployment global s3 assets directory the lambda function is located in the deployment regional s3 assets directory collection of operational metrics this solution collects anonymous operational metrics to help aws improve the quality and features of the solution for more information including how to disable this capability please see the implementation guide https docs aws amazon com solutions latest real time iot device monitoring with kinesis appendix c html copyright 2019 amazon com inc or its affiliates all rights reserved licensed under the apache license version 2 0 the license you may not use this file except in compliance with the license a copy of the license is located at http www apache org licenses or in the license file accompanying this file this file is distributed on an as is basis without warranties or conditions of any kind express or implied see the license for the specific language governing permissions and limitations under the license
server
compromise
div align center img height 15px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png div b compromise b div img src https user images githubusercontent com 399657 68222691 6597f180 ffb9 11e9 8a32 a7f38aa8bded png div modest natural language processing div div code npm install compromise code div div align center sub by a href https spencermounta in spencer kelly a and a href https github com spencermountain compromise graphs contributors many contributors a sub div img height 22px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png div div align center div a href https npmjs org package compromise img src https img shields io npm v compromise svg style flat square a a href https codecov io gh spencermountain compromise img src https codecov io gh spencermountain compromise branch master graph badge svg a a href https bundlephobia com result p compromise img src https img shields io bundlephobia min compromise img src https badge size herokuapp com spencermountain compromise master builds compromise min js a div div align center sub a href https github com nlp compromise fr compromise french a a href https github com nlp compromise de compromise german a a href https github com nlp compromise it compromise italian a a href https github com nlp compromise es compromise spanish a sub div div spacer img height 25px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png div align left don t you find it strange br ul img height 2px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png sub how easy b text b is to b make b sub br img height 2px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png nbsp i sub sub b b i nbsp sub sub and how hard it is to actually b parse b and i use i ul div spacer img height 45px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png div align left img height 10px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png compromise i a href https observablehq com spencermountain compromise justification tries its best a i to turn text into data br img height 30px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png it makes limited and sensible decisions br sub img height 15px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png it s not as smart as you d think sub img height 45px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png it is a href https docs compromise cool compromise filesize small a href https docs compromise cool compromise performance quick a and often i a href https docs compromise cool compromise accuracy good enough a i br div img height 30px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png js import nlp from compromise let doc nlp she sells seashells by the seashore doc verbs topasttense doc text she sold seashells by the seashore spacer img height 50px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png div align left i don t be fancy at all i div js if doc has simon says verb return true spacer img height 30px src https user images githubusercontent com 399657 68221862 17ceb980 
ffb8 11e9 87d4 7b30b6488f16 png div align center img height 50px src https user images githubusercontent com 399657 68221814 05ed1680 ffb8 11e9 8b6b c7528d163871 png div div align left i grab parts of the text i div js let doc nlp entirenovel doc match the adjective of times text the blurst of times div align right a href https docs compromise cool compromise match match docs a div div align center img height 50px src https user images githubusercontent com 399657 68221837 0d142480 ffb8 11e9 9d30 90669f1b897c png div spacer img height 30px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png i and get data i js import plg from compromise speech nlp extend plg let doc nlp milwaukee has certainly had its share of visitors doc compute syllables doc places json text milwaukee terms normal milwaukee syllables mil wau kee div align right a href https docs compromise cool compromise json json docs a div div align center img height 50px src https user images githubusercontent com 399657 68221814 05ed1680 ffb8 11e9 8b6b c7528d163871 png div spacer img height 30px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png avoid the problems of brittle parsers js let doc nlp we re not gonna take it doc has gonna true doc has going to true implicit transform doc contractions expand doc text we are not going to take it div align right a href https docs compromise cool compromise contractions contraction docs a div div align center img height 50px src https user images githubusercontent com 399657 68221814 05ed1680 ffb8 11e9 8b6b c7528d163871 png div spacer img height 30 src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png and whip stuff around like it s data js let doc nlp ninety five thousand and fifty two doc numbers add 20 doc text ninety five thousand and seventy two div align right a href https docs compromise cool compromise values number docs a div div align center img height 50px src https user images githubusercontent com 399657 68221837 0d142480 ffb8 11e9 9d30 90669f1b897c png div spacer img height 30 src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png sub because it actually is sub js let doc nlp the purple dinosaur doc nouns toplural doc text the purple dinosaurs div align right a href https docs compromise cool nouns noun docs a div div align center img height 50px src https user images githubusercontent com 399657 68221731 e8b84800 ffb7 11e9 8453 6395e0e903fa png div spacer img height 50px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png use it on the client side html script src https unpkg com compromise script script var doc nlp two bottles of beer doc numbers minus 1 document body innerhtml doc text one bottle of beer script or likewise typescript import nlp from compromise var doc nlp london is calling doc verbs tonegative london is not calling img height 75px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png bragging graphs spacer img height 30 src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png compromise is 250kb minified div align center filesize a href https bundlephobia com result p compromise img width 600 src https user images githubusercontent com 399657 68234819 14dfc300 ffd0 11e9 8b30 cb8545707b29 png a div it s pretty fast it can run on 
keypress div align center a href https observablehq com spencermountain compromise performance img width 600 src https user images githubusercontent com 399657 159795115 ed62440a be41 424c baa4 8dd15c48377d png a div it works mainly by a href https observablehq com spencermountain verbs conjugating all forms a of a basic word list the final lexicon is a href https observablehq com spencermountain compromise lexicon 14 000 words a div align center img width 600 src https user images githubusercontent com 399657 68234805 0d201e80 ffd0 11e9 8dc6 f7a600352555 png div you can read more about how it works here https observablehq com spencermountain compromise internals it s weird spacer img height 75px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png one two three parts p align left sub okay sub h1 code compromise one code h1 p align center a code tokenizer code of words sentences and punctuation p img height 15px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png p js import nlp from compromise one let doc nlp wayne s world party time let data doc json normal wayne s world party time terms text wayne s normal wayne div align right a href https docs compromise cool compromise tokenization tokenizer docs a div b compromise one b splits your text up wraps it in a handy api ul sub and does nothing else sub ul img height 25px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png b one b is quick most sentences take a 10th of a millisecond it can do b 1mb b of text a second or 10 wikipedia pages i infinite jest i takes 3s div align right you can also parallelize or stream text to it with a href https github com spencermountain compromise tree master plugins speed compromise speed a div spacer img height 60px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png two p align center h1 align left code compromise two code h1 p align center a code part of speech code tagger and grammar interpreter p img height 15px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png p js import nlp from compromise two let doc nlp wayne s world party time let str doc match possessive noun text wayne s world div align right a href https docs compromise cool compromise tagger tagger docs a div p img height 25px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png p b compromise two b automatically calculates the very basic grammar of each word sub this is more useful than people sometimes realize sub light grammar helps you write cleaner templates and get closer to the information part of speech tagging is a profoundly difficult task to get 100 on it is also a profoundly easy task to get 85 on img height 50px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png compromise has b 83 tags b arranged in a href https observablehq com spencermountain compromise tags a handsome graph a b firstname b b person b b propernoun b b noun b you can see the grammar of each word by running doc debug you can see the reasoning for each tag with nlp verbose tagger if you prefer a href https www ling upenn edu courses fall 2003 ling001 penn treebank pos html i penn tags i a you can derive them with js let doc nlp welcome thrillho doc compute penn doc json img height 60px src https user images
githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png three p align center h1 align left code compromise three code h1 p align center code phrase code and sentence tooling p img height 15px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png p js import nlp from compromise three let doc nlp wayne s world party time let str doc people normalize text wayne div align right a href https docs compromise cool compromise selections selection docs a div b compromise three b is a set of tooling to i zoom into i and operate on parts of a text numbers grabs all the numbers in a document for example and extends it with new methods like subtract when you have a phrase or group of words you can see additional metadata about it with json js let doc nlp four out of five dentists console log doc fractions json text four out of five terms object object object object fraction numerator 4 denominator 5 decimal 0 8 js let doc nlp 4 09cad doc money json text 4 09cad terms object number prefix num 4 09 suffix cad img height 80px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png api compromise one output text https observablehq com spencermountain compromise text return the document as text json https observablehq com spencermountain compromise json return the document as data debug https observablehq com spencermountain compromise output pretty print the interpreted document out https observablehq com spencermountain compromise output a named or custom output html https observablehq com spencermountain compromise html output custom html tags for matches wrap https observablehq com spencermountain compromise output produce custom output for document matches utils found https observablehq com spencermountain compromise utils getter is this document empty docs https observablehq com spencermountain compromise utils getter get term objects as json length https observablehq com spencermountain compromise utils getter count the number of characters in the document string length isview https observablehq com spencermountain compromise utils getter identify a compromise object compute https observablehq com spencermountain compromise compute run a named analysis on the document clone https observablehq com spencermountain compromise utils deep copy the document so that no references remain termlist https observablehq com spencermountain compromise accessors return a flat list of all term objects in match cache https observablehq com spencermountain compromise cache freeze the current state of the document for speed purposes uncache https observablehq com spencermountain compromise cache un freezes the current state of the document so it may be transformed accessors all https observablehq com spencermountain compromise utils return the whole original document zoom out terms https observablehq com spencermountain compromise selections split up results by each individual term first n https observablehq com spencermountain compromise accessors use only the first result s last n https observablehq com spencermountain compromise accessors use only the last result s slice n n https observablehq com spencermountain compromise accessors grab a subset of the results eq n https observablehq com spencermountain compromise accessors use only the nth result firstterms https observablehq com spencermountain compromise accessors get the first word in each match lastterms https observablehq com spencermountain compromise
accessors get the end word in each match fullsentences https observablehq com spencermountain compromise accessors get the whole sentence for each match groups https observablehq com spencermountain compromise accessors grab any named capture groups from a match wordcount https observablehq com spencermountain compromise utils count the number of terms in the document confidence https observablehq com spencermountain compromise utils an average score for pos tag interpretations match match methods use the match syntax https docs compromise cool compromise match syntax match https observablehq com spencermountain compromise match return a new doc with this one as a parent not https observablehq com spencermountain compromise match return all results except for this matchone https observablehq com spencermountain compromise match return only the first match if https observablehq com spencermountain compromise match return each current phrase only if it contains this match only ifno https observablehq com spencermountain compromise match filter out any current phrases that have this match notif has https observablehq com spencermountain compromise match return a boolean if this match exists before https observablehq com spencermountain compromise match return all terms before a match in each phrase after https observablehq com spencermountain compromise match return all terms after a match in each phrase union https observablehq com spencermountain compromise pointers return combined matches without duplicates intersection https observablehq com spencermountain compromise pointers return only duplicate matches complement https observablehq com spencermountain compromise pointers get everything not in another match settle https observablehq com spencermountain compromise pointers remove overlaps from matches growright https observablehq com spencermountain compromise match add any matching terms immediately after each match growleft https observablehq com spencermountain compromise match add any matching terms immediately before each match grow https observablehq com spencermountain compromise match add any matching terms before or after each match sweep net https observablehq com spencermountain compromise sweep apply a series of match objects to the document spliton https observablehq com spencermountain compromise split return a document with three parts for every match spliton splitbefore https observablehq com spencermountain compromise split partition a phrase before each matching segment splitafter https observablehq com spencermountain compromise split partition a phrase after each matching segment lookup https observablehq com spencermountain compromise match quick find for an array of string matches autofill https observablehq com spencermountain compromise typeahead create type ahead assumptions on the document tag tag https observablehq com spencermountain compromise tagger give all terms the given tag tagsafe https observablehq com spencermountain compromise tagger only apply tag to terms if it is consistent with current tags untag https observablehq com spencermountain compromise tagger remove this tag from the given terms canbe https observablehq com spencermountain compromise tagger return only the terms that can be this tag case tolowercase https observablehq com spencermountain compromise case turn every letter of every term to lower case touppercase https observablehq com spencermountain compromise case turn every letter of every term to upper case totitlecase https observablehq
com spencermountain compromise case upper case the first letter of each term tocamelcase https observablehq com spencermountain compromise case remove whitespace and title case each term whitespace pre https observablehq com spencermountain compromise whitespace add this punctuation or whitespace before each match post https observablehq com spencermountain compromise whitespace add this punctuation or whitespace after each match trim https observablehq com spencermountain compromise whitespace remove start and end whitespace hyphenate https observablehq com spencermountain compromise whitespace connect words with hyphen and remove whitespace dehyphenate https observablehq com spencermountain compromise whitespace remove hyphens between words and set whitespace toquotations https observablehq com spencermountain compromise whitespace add quotation marks around these matches toparentheses https observablehq com spencermountain compromise whitespace add brackets around these matches loops map fn https observablehq com spencermountain compromise loops run each phrase through a function and create a new document foreach fn https observablehq com spencermountain compromise loops run a function on each phrase as an individual document filter fn https observablehq com spencermountain compromise loops return only the phrases that return true find fn https observablehq com spencermountain compromise loops return a document with only the first phrase that matches some fn https observablehq com spencermountain compromise loops return true or false if there is one matching phrase random fn https observablehq com spencermountain compromise loops sample a subset of the results insert replace match replace https observablehq com spencermountain compromise insert search and replace match with new content replacewith replace https observablehq com spencermountain compromise insert substitute in new text remove https observablehq com spencermountain compromise insert fully remove these terms from the document insertbefore str https observablehq com spencermountain compromise insert add these new terms to the front of each match prepend insertafter str https observablehq com spencermountain compromise insert add these new terms to the end of each match append concat https observablehq com spencermountain compromise insert add these new things to the end swap fromlemma tolemma https observablehq com spencermountain compromise root smart replace of root words using proper conjugation transform sort method https observablehq com spencermountain compromise sorting re arrange the order of the matches in place reverse https observablehq com spencermountain compromise sorting reverse the order of the matches but not the words normalize https observablehq com spencermountain compromise normalization clean up the text in various ways unique https observablehq com spencermountain compromise sorting remove any duplicate matches lib these methods are on the main nlp object nlp tokenize str https observablehq com spencermountain compromise tokenization parse text without running pos tagging nlp lazy str match https observablehq com spencermountain compromise performance scan through a text with minimal analysis nlp plugin https observablehq com spencermountain compromise constructor methods mix in a compromise plugin nlp parsematch str https observablehq com spencermountain compromise constructor methods pre parse any match statements into json nlp world https observablehq com spencermountain compromise constructor 
methods grab or change library internals nlp model https observablehq com spencermountain compromise constructor methods grab all current linguistic data nlp methods https observablehq com spencermountain compromise constructor methods grab or change internal methods nlp hooks https observablehq com spencermountain compromise constructor methods see which compute methods run automatically nlp verbose mode https observablehq com spencermountain compromise constructor methods log our decision making for debugging nlp version https observablehq com spencermountain compromise constructor methods current semver version of the library nlp addwords obj https observablehq com spencermountain compromise plugin add new words to the lexicon nlp addtags obj https observablehq com spencermountain compromise plugin add new tags to the tagset nlp typeahead arr https observablehq com spencermountain compromise typeahead add words to the auto fill dictionary nlp buildtrie arr https observablehq com spencermountain compromise lookup compile a list of words into a fast lookup form nlp buildnet arr https observablehq com spencermountain compromise sweep compile a list of matches into a fast match form spacer img height 30px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png compromise two contractions contractions https observablehq com spencermountain compromise contractions things like didn t contractions expand https observablehq com spencermountain compromise contractions things like didn t contract https observablehq com spencermountain compromise contractions things like didn t spacer img height 30px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png compromise three nouns nouns https observablehq com spencermountain nouns return any subsequent terms tagged as a noun nouns json https observablehq com spencermountain nouns overloaded output with noun metadata nouns parse https observablehq com spencermountain nouns get tokenized noun phrase nouns isplural https observablehq com spencermountain nouns return only plural nouns nouns issingular https observablehq com spencermountain nouns return only singular nouns nouns toplural https observablehq com spencermountain nouns football captain football captains nouns tosingular https observablehq com spencermountain nouns turnovers turnover nouns adjectives https observablehq com spencermountain nouns get any adjectives describing this noun verbs verbs https observablehq com spencermountain verbs return any subsequent terms tagged as a verb verbs json https observablehq com spencermountain verbs overloaded output with verb metadata verbs parse https observablehq com spencermountain verbs get tokenized verb phrase verbs subjects https observablehq com spencermountain verbs what is doing the verb action verbs adverbs https observablehq com spencermountain verbs return the adverbs describing this verb verbs issingular https observablehq com spencermountain verbs return singular verbs like spencer walks verbs isplural https observablehq com spencermountain verbs return plural verbs like we walk verbs isimperative https observablehq com spencermountain verbs only instruction verbs like eat it verbs topasttense https observablehq com spencermountain verbs will go went verbs topresenttense https observablehq com spencermountain verbs walked walks verbs tofuturetense https observablehq com spencermountain verbs walked will walk verbs toinfinitive https observablehq com 
spencermountain verbs walks walk verbs togerund https observablehq com spencermountain verbs walks walking verbs topastparticiple https observablehq com spencermountain verbs drive had driven verbs conjugate https observablehq com spencermountain verbs return all conjugations of these verbs verbs isnegative https observablehq com spencermountain verbs return verbs with not never or no verbs ispositive https observablehq com spencermountain verbs only verbs without not never or no verbs tonegative https observablehq com spencermountain verbs went did not go verbs topositive https observablehq com spencermountain verbs didn t study studied numbers numbers https observablehq com spencermountain compromise values grab all written and numeric values numbers parse https observablehq com spencermountain compromise values get tokenized number phrase numbers get https observablehq com spencermountain compromise values get a simple javascript number numbers json https observablehq com spencermountain compromise values overloaded output with number metadata numbers tonumber https observablehq com spencermountain compromise values convert five to 5 numbers tolocalestring https observablehq com spencermountain compromise values add commas or nicer formatting for numbers numbers totext https observablehq com spencermountain compromise values convert 5 to five numbers toordinal https observablehq com spencermountain compromise values convert five to fifth or 5th numbers tocardinal https observablehq com spencermountain compromise values convert fifth to five or 5 numbers isordinal https observablehq com spencermountain compromise values return only ordinal numbers numbers iscardinal https observablehq com spencermountain compromise values return only cardinal numbers numbers isequal n https observablehq com spencermountain compromise values return numbers with this value numbers greaterthan min https observablehq com spencermountain compromise values return numbers bigger than n numbers lessthan max https observablehq com spencermountain compromise values return numbers smaller than n numbers between min max https observablehq com spencermountain compromise values return numbers between min and max numbers set n https observablehq com spencermountain compromise values set number to n numbers add n https observablehq com spencermountain compromise values increase number by n numbers subtract n https observablehq com spencermountain compromise values decrease number by n numbers increment https observablehq com spencermountain compromise values increase number by 1 numbers decrement https observablehq com spencermountain compromise values decrease number by 1 money https observablehq com spencermountain compromise values things like 2 50 money get https observablehq com spencermountain compromise values retrieve the parsed amount s of money money json https observablehq com spencermountain compromise values currency number info money currency https observablehq com spencermountain compromise values which currency the money is in fractions https observablehq com spencermountain compromise values like 2 3rds or one out of five fractions parse https observablehq com spencermountain compromise values get tokenized fraction fractions get https observablehq com spencermountain compromise values simple numerator denominator data fractions json https observablehq com spencermountain compromise values json method overloaded with fractions data fractions todecimal https observablehq com spencermountain compromise
values 2 3 0 66 fractions normalize https observablehq com spencermountain compromise values four out of 10 4 10 fractions totext https observablehq com spencermountain compromise values 4 10 four tenths fractions topercentage https observablehq com spencermountain compromise values 4 10 40 percentages https observablehq com spencermountain compromise values like 2.5% percentages get https observablehq com spencermountain compromise values return the percentage number 100 percentages json https observablehq com spencermountain compromise values json overloaded with percentage information percentages tofraction https observablehq com spencermountain compromise values 80 8 10 sentences sentences https observablehq com spencermountain compromise sentences return a sentence class with additional methods sentences json https observablehq com spencermountain compromise sentences overloaded output with sentence metadata sentences subjects https observablehq com spencermountain compromise sentences return the main noun of each sentence sentences topasttense https observablehq com spencermountain compromise sentences he walks he walked sentences topresenttense https observablehq com spencermountain compromise sentences he walked he walks sentences tofuturetense https observablehq com spencermountain compromise sentences he walks he will walk sentences toinfinitive https observablehq com spencermountain compromise sentences verb root form he walks he walk sentences tonegative https observablehq com spencermountain compromise sentences he walks he didn t walk sentences isquestion https observablehq com spencermountain compromise sentences return questions that end with a question mark sentences isexclamation https observablehq com spencermountain compromise sentences return sentences that end with an exclamation mark sentences isstatement https observablehq com spencermountain compromise sentences return sentences without a question mark or exclamation mark adjectives adjectives https observablehq com spencermountain compromise selections things like quick adjectives json https observablehq com spencermountain compromise selections get adjective metadata adjectives conjugate https observablehq com spencermountain compromise selections return all inflections of these adjectives adjectives adverbs https observablehq com spencermountain compromise selections get adverbs describing this adjective adjectives tocomparative https observablehq com spencermountain compromise selections quick quicker adjectives tosuperlative https observablehq com spencermountain compromise selections quick quickest adjectives toadverb https observablehq com spencermountain compromise selections quick quickly adjectives tonoun https observablehq com spencermountain compromise selections quick quickness misc selections clauses https observablehq com spencermountain compromise selections split up sentences into multi term phrases chunks https observablehq com spencermountain compromise selections split up sentences noun phrases and verb phrases hyphenated https observablehq com spencermountain compromise selections all terms connected with a hyphen or dash like wash out phonenumbers https observablehq com spencermountain compromise selections things like 939 555 0113 hashtags https observablehq com spencermountain compromise selections things like nlp emails https observablehq com spencermountain compromise selections things like hi compromise cool emoticons https observablehq com spencermountain compromise selections things like emojis https observablehq com spencermountain compromise selections things like
atmentions https observablehq com spencermountain compromise selections things like nlp compromise urls https observablehq com spencermountain compromise selections things like compromise cool pronouns https observablehq com spencermountain compromise selections things like he conjunctions https observablehq com spencermountain compromise selections things like but prepositions https observablehq com spencermountain compromise selections things like of abbreviations https observablehq com spencermountain compromise selections things like mrs people https observablehq com spencermountain topics named entity recognition names like john f kennedy people json https observablehq com spencermountain topics named entity recognition get person name metadata people parse https observablehq com spencermountain topics named entity recognition get person name interpretation places https observablehq com spencermountain topics named entity recognition like paris france organizations https observablehq com spencermountain topics named entity recognition like google inc topics https observablehq com spencermountain topics named entity recognition people places organizations adverbs https observablehq com spencermountain compromise selections things like quickly adverbs json https observablehq com spencermountain compromise selections get adverb metadata acronyms https observablehq com spencermountain compromise selections things like fbi acronyms strip https observablehq com spencermountain compromise selections remove periods from acronyms acronyms addperiods https observablehq com spencermountain compromise selections add periods to acronyms parentheses https observablehq com spencermountain compromise selections return anything inside parentheses parentheses strip https observablehq com spencermountain compromise selections remove brackets possessives https observablehq com spencermountain compromise selections things like spencer s possessives strip https observablehq com spencermountain compromise selections spencer s spencer quotations https observablehq com spencermountain compromise selections return any terms inside paired quotation marks quotations strip https observablehq com spencermountain compromise selections remove quotation marks p img height 85px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png p div align center img src https user images githubusercontent com 399657 68221814 05ed1680 ffb8 11e9 8b6b c7528d163871 png div extend this library comes with a considerate common sense baseline for english grammar you re free to change or lay waste to any settings which is the fun part actually the easiest part is just to suggest tags for any given words

```js
let myWords = {
  kermit: 'FirstName',
  fozzie: 'FirstName',
}
let doc = nlp(muppetText, myWords)
```

or make heavier changes with a compromise plugin https observablehq com spencermountain compromise plugins

```js
import nlp from 'compromise'

nlp.extend({
  // add new tags
  tags: {
    Character: { isA: 'Person', notA: 'Adjective' },
  },
  // add or change words in the lexicon
  words: {
    kermit: 'Character',
    gonzo: 'Character',
  },
  // change inflections
  irregulars: {
    get: { pastTense: 'gotten', gerund: 'gettin' },
  },
  // add new methods to the compromise api
  api: (View) => {
    View.prototype.kermitVoice = function () {
      this.sentences().prepend('well,')
      this.match('i [(am|was)]').prepend('um,')
      return this
    }
  },
})
```

div align right a href https docs compromise cool compromise plugins plugin docs a div div align center img height 50px src https user images githubusercontent com 399657 68221848 11404200 ffb8 11e9 90cd 3adee8d8564f png div spacer div
img height 50px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png div docs gentle introduction 1 input output https docs compromise cool tutorial 1 2 match transform https docs compromise cool compromise tutorial 2 3 making a chat bot https docs compromise cool compromise making a bot tutorial 4 making a plugin div img height 25px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png div documentation concepts api plugins accuracy https observablehq com spencermountain compromise accuracy accessors https observablehq com spencermountain compromise accessors adjectives https observablehq com spencermountain compromise adjectives caching https observablehq com spencermountain compromise cache constructor methods https observablehq com spencermountain compromise constructor methods dates https observablehq com spencermountain compromise dates case https observablehq com spencermountain compromise case contractions https observablehq com spencermountain compromise contractions export https observablehq com spencermountain compromise export filesize https observablehq com spencermountain compromise filesize insert https observablehq com spencermountain compromise insert hash https observablehq com spencermountain compromise hash internals https observablehq com spencermountain compromise internals json https observablehq com spencermountain compromise json html https observablehq com spencermountain compromise html justification https observablehq com spencermountain compromise justification character offsets https observablehq com spencermountain compromise offsets keypress https observablehq com spencermountain compromise keypress lexicon https observablehq com spencermountain compromise lexicon loops https observablehq com spencermountain compromise loops ngrams https observablehq com spencermountain compromise ngram match syntax https observablehq com spencermountain compromise match syntax match https observablehq com spencermountain compromise match numbers https observablehq com spencermountain compromise values performance https observablehq com spencermountain compromise performance nouns https observablehq com spencermountain nouns paragraphs https observablehq com spencermountain compromise paragraphs plugins https observablehq com spencermountain compromise plugins output https observablehq com spencermountain compromise output scan https observablehq com spencermountain compromise scan projects https observablehq com spencermountain compromise projects selections https observablehq com spencermountain compromise selections sentences https observablehq com spencermountain compromise sentences tagger https observablehq com spencermountain compromise tagger sorting https observablehq com spencermountain compromise sorting syllables https observablehq com spencermountain compromise syllables tags https observablehq com spencermountain compromise tags split https observablehq com spencermountain compromise split pronounce https observablehq com spencermountain compromise pronounce tokenization https observablehq com spencermountain compromise tokenization text https observablehq com spencermountain compromise text strict https observablehq com spencermountain compromise strict named entities https observablehq com spencermountain topics named entity recognition utils https observablehq com spencermountain compromise utils penn tags https observablehq com spencermountain compromise penn tags 
whitespace https observablehq com spencermountain compromise whitespace verbs https observablehq com spencermountain verbs typeahead https observablehq com spencermountain compromise compromise typeahead world data https observablehq com spencermountain compromise world normalization https observablehq com spencermountain compromise normalization sweep https observablehq com spencermountain compromise sweep fuzzy matching https observablehq com spencermountain compromise fuzzy matching typescript https observablehq com spencermountain compromise typescript mutation https observablehq com spencermountain compromise mutation root forms https observablehq com spencermountain compromise root div img height 25px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png div talks language as an interface https www youtube com watch v wupvs2tcg8s by spencer kelly coding chat bots https www youtube com watch v c hmwfwvo0u by kahwee teng on typing and data https vimeo com 496095722 by spencer kelly articles geocoding social conversations with nlp and javascript http compromise cool by microsoft microservice recipe https eventn com recipes text parsing with nlp compromise by eventn adventure game sentence parsing with compromise https killalldefects com 2020 02 20 adventure game sentence parsing with compromise building text based games https killalldefects com 2019 09 24 building text based games with compromise nlp by matt eland fun with javascript in bigquery https medium com hoffa new in bigquery persistent udfs c9ea4100fd83 6e09 by felipe hoffa natural language processing in the browser https dev to charlesdlandau natural language processing in the browser 52hj by charles landau some fun applications automated bechdel test https github com guardian bechdel test by the guardian story generation framework https perchance org welcome by jose phrocca tumblr blog of lists https leanstooneside tumblr com horse ebooks like lists by michael paulukonis video editing from transcription https newtheory io by new theory browser extension fact checking https github com alexanderkidd factoidl by alexander kidd siri shortcut https routinehub co shortcut 3260 by michael byrns amazon skill https github com tajddin voiceplay by tajddin maghni tasking slack bot https github com kevinsuh toki by kevin suh see more https observablehq com spencermountain compromise projects comparisons compromise and spacy https observablehq com spencermountain compromise and spacy compromise and nltk https observablehq com spencermountain compromise and nltk spacer div align center img height 25px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png hr div div align center img height 50px src https user images githubusercontent com 399657 68221632 b9094000 ffb7 11e9 99e0 b48edd6cdf8a png div div align center img height 50px src https user images githubusercontent com 399657 68221824 09809d80 ffb8 11e9 9ef0 6ed3574b0ce8 png div plugins these are some helpful extensions dates npm install compromise dates dates https observablehq com spencermountain compromise dates find dates like june 8th or 03 03 18 dates get https observablehq com spencermountain compromise dates simple start end json result dates json https observablehq com spencermountain compromise dates overloaded output with date metadata dates format https observablehq com spencermountain compromise dates convert the dates to specific formats dates toshortform https observablehq com
spencermountain compromise dates convert wednesday to wed etc dates tolongform https observablehq com spencermountain compromise dates convert feb to february etc durations https observablehq com spencermountain compromise dates 2 weeks or 5mins durations get https observablehq com spencermountain compromise dates return simple json for duration durations json https observablehq com spencermountain compromise dates overloaded output with duration metadata times https observablehq com spencermountain compromise dates 4 30pm or half past five times get https observablehq com spencermountain compromise dates return simple json for times times json https observablehq com spencermountain compromise dates overloaded output with time metadata stats npm install compromise stats tfidf https observablehq com spencermountain compromise tfidf rank words by frequency and uniqueness ngrams https observablehq com spencermountain compromise ngram list all repeating sub phrases by word count unigrams https observablehq com spencermountain compromise ngram n grams with one word bigrams https observablehq com spencermountain compromise ngram n grams with two words trigrams https observablehq com spencermountain compromise ngram n grams with three words startgrams https observablehq com spencermountain compromise ngram n grams including the first term of a phrase endgrams https observablehq com spencermountain compromise ngram n grams including the last term of a phrase edgegrams https observablehq com spencermountain compromise ngram n grams including the first or last term of a phrase speech npm install compromise syllables syllables https observablehq com spencermountain compromise syllables split each term by its typical pronunciation soundslike https observablehq com spencermountain compromise soundslike produce an estimated pronunciation wikipedia npm install compromise wikipedia wikipedia https observablehq com spencermountain compromise wikipedia compressed article reconciliation spacer div img height 25px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png hr div typescript we re committed to typescript deno support both in main and in the official plugins

```ts
import nlp from 'compromise'
import stats from 'compromise-stats'

const nlpEx = nlp.extend(stats)

nlpEx('this is type-safe!').ngrams({ min: 1 })
```

div align right a href https docs compromise cool compromise typescript typescript docs a div div img height 50px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png div limitations slash support we currently split slashes up as different words like we do for hyphens so things like this don t work code nlp the koala eats shoots leaves has koala leaves false code inter sentence match by default sentences are the top level abstraction inter sentence or multi sentence matches aren t supported without a href https github com spencermountain compromise tree master plugins paragraphs a plugin a code nlp that s it back to winnipeg has it back false code nested match syntax the danger and beauty of regex is that you can recurse indefinitely our match syntax is much weaker things like this are not i yet i possible code doc match modern major minor general code complex matches must be achieved with successive match statements dependency parsing proper sentence transformation requires understanding the syntax tree https en wikipedia org wiki parse tree of a sentence which we don t currently do we should help is wanted with this faq ul
align left p details summary isn t javascript too summary p p ul yeah it is br it wasn t built to compete with nltk and may not fit every project br string processing is synchronous too and parallelizing node processes is weird br see a href https observablehq com spencermountain compromise performance here a for information about speed performance and a href https observablehq com spencermountain compromise justification here a for project motivations ul p p details p p details summary can it run on my arduino watch summary p p ul only if it s water proof br read a href https observablehq com spencermountain compromise quickstart quick start a for running compromise in workers mobile apps and all sorts of funny environments ul p p details p p details summary compromise in other languages summary p p ul we ve got work in progress forks for a href https github com nlp compromise de compromise german a a href https github com nlp compromise fr compromise french a a href https github com nlp compromise es compromise spanish a and a href https github com nlp compromise it compromise italian a in the same philosophy br and need some help ul p p details p p details summary partial builds summary p p ul we do offer a a href https observablehq com spencermountain compromise filesize tokenize only a build which has the pos tagger pulled out br but otherwise compromise isn t easily tree shaken br the tagging methods are competitive and greedy so it s not recommended to pull things out br note that without a full pos tagging the contraction parser won t work perfectly i spencer s cool i vs i spencer s house i br it s recommended to run the library fully ul p p details p ul div align center img src https user images githubusercontent com 399657 68221731 e8b84800 ffb7 11e9 8453 6395e0e903fa png div see also nbsp en pos https github com finnlp en pos very clever javascript pos tagger by alex corvi https github com alexcorvi nbsp naturalnode https github com naturalnode natural fancier statistical nlp in javascript nbsp winkjs https winkjs org pos tagger tokenizer machine learning in javascript nbsp dariusk pos js https github com dariusk pos js fasttag fork in javascript nbsp compendium js https github com ulflander compendium js pos and sentiment analysis in javascript nbsp nodebox linguistics https www nodebox net code index php linguistics conjugation inflection in javascript nbsp retext https github com wooorm retext very impressive text utilities https github com wooorm retext blob master doc plugins md in javascript nbsp superscript https github com superscriptjs superscript conversation engine in js nbsp jspos https code google com archive p jspos javascript build of the time tested brill tagger nbsp spacy https spacy io speedy multilingual tagger in c python nbsp prose https github com jdkato prose quick tagger in go by joseph kato nbsp textblob https github com sloria textblob python tagger img height 25px src https user images githubusercontent com 399657 68221862 17ceb980 ffb8 11e9 87d4 7b30b6488f16 png b mit b
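pulling a few of the documented pieces together a small end to end sketch the sample text and the outputs shown in comments are illustrative guesses not taken from the docs

```js
import nlp from 'compromise'

let doc = nlp('Tony Hawk flew to Toronto. He greeted four hundred fans.')

// entity helpers from compromise/three (documented above)
doc.people().text()   // 'Tony Hawk'  (roughly)
doc.places().text()   // 'Toronto'    (roughly)

// transform numbers and tense with the documented methods
doc.numbers().add(2)            // 'four hundred' -> 'four hundred and two'
doc.sentences().toFutureTense()
doc.text()
// -> something like 'Tony Hawk will fly to Toronto. He will greet four hundred and two fans.'
```

this uses only methods listed in the api section above though the exact rendered output may differ slightly between versions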
nlp part-of-speech named-entity-recognition
ai
hig
weave greenkeeper badge https badges greenkeeper io autodesk hig svg https greenkeeper io weave is autodesk s unified design system so we can build better products faster contents getting started theme data and theming components react components basics components contributing getting started add a component to your app

```sh
yarn add @weave-design/button
```

import and render the component

```jsx
import Button from "@weave-design/button";

function MyComponent() {
  return <Button title="Hello, world!" />;
}
```

theme data and theming components theme data is a representation of the weave visual design language in the form of data with weave design theme data packages theme data readme md we publish the data in ecmascript module json and scss formats this data includes 8 themes that can be used on any platform there are four color schemes including light gray dark gray dark blue with two densities high and low for each scheme see how to provide a theme to components packages theme context provide a theme to components weave design theme data packages theme data readme md weave design spec as data themecontext packages theme context readme md a component to ease consumption of theme data from within react components react components basics typography see typography packages typography readme md and richtext packages rich text readme md layout see spacer packages spacer readme md and surface packages surface readme md icons see icons packages icons readme md to easily render icons in react and the weave design icons packages icons readme md package for svg data components each weave pattern is implemented as a set of react components each pattern is published to npm individually under the weave design namespace accordion packages accordion readme md make content heavy pages appear less so by vertically stacking items in lists that users can expand or contract avatar packages avatar readme md a visual representation of a customer s identity avatarbundle packages avatar bundle readme md indicate a group of people who are associated with a task or information badge packages badge readme md visual indicators that communicate status and draw attention to an object banner packages banner readme md an alert that requires a user action button packages button readme md trigger actions or changes checkbox packages checkbox readme md a control to select from non exclusive options divider packages divider readme md separate content inline or in a stack dropdown packages dropdown readme md a menu to select one or many from a list flyout packages flyout readme md a lightweight popup container icons packages icons readme md represents a concept in graphical form iconbutton packages icon button readme md action buttons that include an icon only label packages label readme md a caption for an item in a user interface menu packages menu readme md display a list of choices through interaction with a button icon or other controls modal packages modal readme md an overlay that focuses the customer s attention notificationsflyout packages notifications flyout readme md a less intrusive way of announcing an event of potential interest to the user notificationstoast packages notifications toast readme md floating message boxes that appear numericinput packages
numeric input readme md for numerical values that allows freehand text entries or toggling values up and down progressbar packages progress bar readme md an indication of content loading presented horizontally progressring packages progress ring readme md an indication of content loading presented circularly radiobutton packages radio button readme md a control to select one exclusively from a list richtext packages rich text readme md applies the hig typography styles to whatever is passed to it skeletonitem packages skeleton item readme md a placeholder for loading content slider packages slider readme md a control for selecting a single numeric value from a range spacer packages spacer readme md a square of empty space meant to add space between other components surface packages surface readme md a themable container with the appropriate background color for the current theme table packages table readme md a collection of data in rows and columns tabs packages tabs readme md keeps related content in a single container tag packages tag readme md compact elements that can be used to represent small blocks of information textarea packages text area readme md a control to provide a large amount of freeform text textlink packages text link readme md directs visitors to another location thumbnail packages thumbnail readme md visual anchors and identifiers for objects tile packages tile readme md a themable container that display information related to a single subject or destination timestamp packages timestamp readme md presents a date with humanized phrasing toggle packages toggle readme md a control for a single action with a clear on off or start stop tooltip packages tooltip readme md provides context in a small popup container treeview packages tree view readme md a way to view and manipulate a list of data typography packages typography readme md a set of components in many typographical variations contributing read our contribution guidelines here contributing md contributing md
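following the theme data and theme context notes above a hypothetical sketch of providing a theme to a component the @weave-design package namespace comes from this readme but the specific exports ThemeContext.Provider and darkBlueHighDensityTheme are assumptions check the theme-context and theme-data package readmes for the real names

```jsx
import React from "react";
import Button from "@weave-design/button";
// assumed import shapes -- not confirmed by this README:
import ThemeContext from "@weave-design/theme-context";
import { darkBlueHighDensityTheme } from "@weave-design/theme-data";

function ThemedApp() {
  return (
    <ThemeContext.Provider value={darkBlueHighDensityTheme}>
      {/* Button's `title` prop follows the getting-started example above */}
      <Button title="hello world" />
    </ThemeContext.Provider>
  );
}
```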
os