Dataset fields:

| field | type | stats |
| --- | --- | --- |
| names | stringlengths | 1-98 |
| readmes | stringlengths | 8-608k |
| topics | stringlengths | 0-442 |
| labels | stringclasses | 6 values |
MC-Planner
Describe, Explain, Plan and Select: Interactive Planning with Large Language Models Enables Open-World Multi-Task Agents

<div align="center">

[Website](http://www.craftjarvis.org/) | [Arxiv Paper](https://arxiv.org/pdf/2302.01560.pdf) | [Team](https://github.com/CraftJarvis)

[![PyPI Python Version](https://img.shields.io/pypi/pyversions/minedojo)](https://pypi.org/project/minedojo) ![framework](https://img.shields.io/badge/framework-pytorch-red.svg) [![GitHub license](https://img.shields.io/github/license/minedojo/mineclip)](https://github.com/minedojo/mineclip/blob/main/license)

</div>

## Updates

- **2023-03-28**: Due to the cancellation of access to Codex by OpenAI, planning based on Codex is no longer supported by this repository. We will update to the latest OpenAI model (ChatGPT), which has better performance, as soon as possible.

## Prepare packages

Our codebase requires Python 3.9. Please run the following commands to prepare the environment:

```sh
conda create -n planner python=3.9
conda activate planner
python -m pip install numpy torch==2.0.0.dev20230208+cu117 --index-url https://download.pytorch.org/whl/nightly/cu117
python -m pip install -r requirements.txt
python -m pip install git+https://github.com/minedojo/mineclip
```

## Prepare environment

It also requires a [modified version of MineDojo](https://github.com/craftjarvis/mc-simulator) as the simulator and a [goal-conditioned controller](https://github.com/craftjarvis/mc-controller):

```sh
git clone https://github.com/craftjarvis/mc-simulator.git
cd mc-simulator
pip install -e .
```

## Prepare controller checkpoints

Below are the configurations and weights of the models:

| configure | download | biome | number of goals |
| --- | --- | --- | --- |
| transformer | [weights](https://pkueducn-my.sharepoint.com/:f:/g/personal/zhwang_pkueducn_onmicrosoft_com/ev7wgwhl5ppcjmkil0dyroub8nw0yqd8kuyfb47uxgojow?e=xtgtpy) | plains | 4 |

## Prepare OpenAI keys

Our planner depends on a large language model such as InstructGPT, Codex, or ChatGPT, so we need to supply the OpenAI keys in the file `data/openai_keys.txt`. An OpenAI key list is also accepted.

## Running agent models

To run the code, call:

```sh
python main.py model.load_ckpt_path=path/to/ckpt
```

After loading, you should see a window where agents are playing Minecraft.

| painting | wooden slab | stone stairs |
| --- | --- | --- |
| <img src="imgs/obtain_painting.gif" width="200"> | <img src="imgs/obtain_wooden_slab.gif" width="200"> | <img src="imgs/obtain_stone_stairs.gif" width="200"> |

Note: our planner depends on a stable OpenAI API connection. If you meet a connection error, please retry.

## Paper and citation

Our paper is posted on [Arxiv](https://arxiv.org/pdf/2302.01560.pdf). If it helps you, please consider citing us:

```bib
@article{wang2023describe,
  title={Describe, Explain, Plan and Select: Interactive Planning with Large Language Models Enables Open-World Multi-Task Agents},
  author={Wang, Zihao and Cai, Shaofei and Liu, Anji and Ma, Xiaojian and Liang, Yitao},
  journal={arXiv preprint arXiv:2302.01560},
  year={2023}
}
```
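The note above advises retrying when the OpenAI API connection fails. A minimal retry wrapper with exponential backoff might look like the sketch below; it is not part of this repository, and `call_with_retries` plus the use of `ConnectionError` as the transient failure type are illustrative assumptions (the OpenAI client raises its own exception classes):

```python
import time

def call_with_retries(fn, max_attempts=5, base_delay=1.0):
    """Retry fn() with exponential backoff; re-raise after the final failed attempt."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except ConnectionError:
            if attempt == max_attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))

# toy demonstration: a "flaky" call that fails twice before succeeding
state = {"calls": 0}

def flaky():
    state["calls"] += 1
    if state["calls"] < 3:
        raise ConnectionError("transient")
    return "plan"

result = call_with_retries(flaky, base_delay=0.0)
```

In practice you would wrap whatever function issues the planner's API request, and catch the client library's own timeout/connection exceptions instead of the bare `ConnectionError` used here.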
language-model minecraft
ai
employment-database-data-engineering
09-SQL-Challenge: SQL homework for the USC Data Analytics Bootcamp.
postgresql database data-engineering sqlalchemy
server
front-ui
Front-UI is used to accumulate some front-end effects. Continuously updated, so you can always find the effect you need. If you like it, please give it a star :star:.

![background image](https://github.com/silencehvk/articles/raw/master/assets/images/bgimages/bg1.jpg)

Directory: [Canvas](#canvas) | Components | [Layout](#layout) | [Other](#other)

### Canvas

1. [draw wolf](https://htmlpreview.github.io/?https://github.com/silencehvk/frontui/blob/master/canvas/draw-wolf/index.html)
   <img src="https://github.com/silencehvk/front-ui/raw/master/assets/images/rendering/canvas/draw-wolf.png" width="183px" height="200px" alt="draw wolf">
2. [draw animation](https://htmlpreview.github.io/?https://github.com/silencehvk/frontui/blob/master/canvas/draw-animation/index.html)
   <img src="https://github.com/silencehvk/front-ui/raw/master/assets/images/rendering/canvas/draw-animation.gif" width="419px" height="183px" alt="draw animation">
3. [fire works](https://htmlpreview.github.io/?https://github.com/silencehvk/frontui/blob/master/canvas/fire-works/index.html)
   <img src="https://github.com/silencehvk/front-ui/raw/master/assets/images/rendering/canvas/fire-works.gif" width="419px" height="183px" alt="fire works">
4. [hex](https://htmlpreview.github.io/?https://github.com/silencehvk/frontui/blob/master/canvas/hex/index.html)
   <img src="https://github.com/silencehvk/front-ui/raw/master/assets/images/rendering/canvas/hex.gif" width="419px" height="183px" alt="hex">
5. [storm](https://htmlpreview.github.io/?https://github.com/silencehvk/frontui/blob/master/canvas/storm/index.html)
   <img src="https://github.com/silencehvk/front-ui/raw/master/assets/images/rendering/canvas/storm.gif" width="419px" height="183px" alt="storm">
6. [love](https://htmlpreview.github.io/?https://github.com/silencehvk/frontui/blob/master/canvas/love/index.html)
   <img src="https://github.com/silencehvk/front-ui/raw/master/assets/images/rendering/canvas/love.gif" width="419px" height="230px" alt="love">
7. [rotation of the ball](https://htmlpreview.github.io/?https://github.com/silencehvk/frontui/blob/master/canvas/rotation-of-the-ball/index.html)
   <img src="https://github.com/silencehvk/front-ui/raw/master/assets/images/rendering/canvas/rotation-of-the-ball.gif" width="419px" height="230px" alt="rotation of the ball">
8. [the clock](https://htmlpreview.github.io/?https://github.com/silencehvk/frontui/blob/master/canvas/the-clock/index.html)
   <img src="https://github.com/silencehvk/front-ui/raw/master/assets/images/rendering/canvas/the-clock.gif" width="419px" height="419px" alt="the clock">

### Layout

1. [single page switch](https://htmlpreview.github.io/?https://github.com/silencehvk/frontui/blob/master/layout/single-page-switch/index.html)
   ![single page switch](https://github.com/silencehvk/front-ui/raw/master/assets/images/rendering/layout/single-page-switch.gif)

### Other

1. [emoji](https://htmlpreview.github.io/?https://github.com/silencehvk/frontui/blob/master/other/emoji/index.html)
   ![emoji](https://github.com/silencehvk/front-ui/raw/master/assets/images/rendering/other/emoji.gif)
2. [loading](https://htmlpreview.github.io/?https://github.com/silencehvk/frontui/blob/master/other/loading/index.html)
   ![loading](https://github.com/silencehvk/front-ui/raw/master/assets/images/rendering/other/loading.gif)
some-demo javascript html-css-javascript html5 css3 canvas
front_end
IoTMQTTSample
---
page_type: sample
languages:
- c
- python
products:
- azure-iot-hub
name: IoT MQTT samples
description: Using MQTT with Azure IoT Hub without an SDK
---

# Using MQTT with Azure IoT Hub without an SDK

This set of samples demonstrates how to connect and send messages to an Azure IoT Hub without using the Azure IoT SDKs. Read [Communicate with your IoT hub using the MQTT protocol](https://docs.microsoft.com/azure/iot-hub/iot-hub-mqtt-support) for detailed instructions on connecting your device to IoT Hub using MQTT without an SDK.

## Why use an Azure IoT device SDK?

Azure provides a set of SDKs across multiple languages for connecting devices to [IoT Hub](https://docs.microsoft.com/azure/iot-hub/iot-concepts-and-iot-hub) and [DPS](https://docs.microsoft.com/azure/iot-dps). The advantages of using an [Azure IoT device SDK](https://docs.microsoft.com/azure/iot-develop/about-iot-sdks) over building a custom connection layer are outlined below:

| | Custom connection layer | Azure IoT device SDKs |
| --- | --- | --- |
| Support | Need to support and document your solution | Access to Microsoft support (GitHub, Microsoft Q&A, Microsoft technical documentation, customer support teams) |
| New features | Need to manually add new Azure features | Can immediately take advantage of new features as they are added |
| Investment | Invest hundreds of hours of embedded development to design, build, test, and maintain a custom version | Can take advantage of free, open-source tools; the only cost associated with the SDKs is the learning curve |

For more information, refer to the [overview of Azure IoT device SDKs](https://docs.microsoft.com/azure/iot-develop/about-iot-sdks).

## Samples in this repo

This repository contains the following samples:

- Mosquitto client library: `mosquitto`
- mosquitto_pub CLI: `mosquitto_pub`
- Python: `python`

## Root certificates

> Note: Azure IoT services are moving to a new CA root. See [here](http://aka.ms/iot-ca-updates) for details.

This repository provides the file `IoTHubRootCA.cert.pem`, which contains the following root certificates:

| Certificate | Description |
| --- | --- |
| [Baltimore CyberTrust Root](https://www.digicert.com/kb/digicert-root-certificates.htm) | Not recommended, since the [IoT Hub migration out of Baltimore](https://techcommunity.microsoft.com/t5/internet-of-things-blog/azure-iot-tls-critical-changes-are-almost-here-and-why-you/ba-p/2393169) will start in Feb 2023 |
| [DigiCert Global Root G2](https://www.digicert.com/kb/digicert-root-certificates.htm) | Future root CA, to become active in Feb 2023 |
| [Microsoft RSA Root Certificate Authority 2017](https://www.microsoft.com/pkiops/docs/repository.htm) | Additional root, recommended to prevent disruption from future changes |

## General prerequisites

To be able to run these samples, the following general prerequisites are needed:

1. Clone this repository ([install Git](https://git-scm.com/downloads))
1. Change into the cloned directory: `cd IoTMQTTSample`
1. [Install the Azure CLI](https://learn.microsoft.com/en-us/cli/azure/install-azure-cli)
1. [Provision an IoT Hub](https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-create-through-portal) in your Azure subscription
1. [Register a device](https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-create-through-portal#register-a-new-device-in-the-iot-hub) within your IoT Hub
1. [Generate a SAS token](https://learn.microsoft.com/cli/azure/iot/hub#az-iot-hub-generate-sas-token) for the device using the Azure CLI

> Note: by default, the SAS token is valid for 60 minutes. Use the `--du` parameter to increase this if needed.

Additional prerequisites may be required by each individual sample.

## Contributing

For details on contributing to this repository, see the [contributing](CONTRIBUTING.md) guide.

## Reporting security vulnerabilities

If you believe you have found a security vulnerability in any Microsoft-owned repository that meets Microsoft's definition of a security vulnerability, please report it to the [Microsoft Security Response Center](SECURITY.md).
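The prerequisites above generate the device SAS token with the Azure CLI. As a stdlib-only illustration of what such a token contains (an HMAC-SHA256 signature over the URL-encoded resource URI and an expiry timestamp, per the MQTT support article linked above), here is a sketch; the hub name, device ID, and key are made-up placeholders, and in practice you should prefer `az iot hub generate-sas-token`:

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

def generate_sas_token(resource_uri, device_key_b64, ttl_seconds=3600):
    """Compose an IoT Hub style SAS token by signing '{url-encoded uri}\\n{expiry}'."""
    expiry = int(time.time()) + ttl_seconds
    encoded_uri = urllib.parse.quote(resource_uri, safe="")
    to_sign = f"{encoded_uri}\n{expiry}".encode("utf-8")
    key = base64.b64decode(device_key_b64)  # device keys are stored base64-encoded
    signature = base64.b64encode(hmac.new(key, to_sign, hashlib.sha256).digest())
    return (
        "SharedAccessSignature "
        f"sr={encoded_uri}&sig={urllib.parse.quote(signature, safe='')}&se={expiry}"
    )

# placeholder hub/device/key, not real credentials
token = generate_sas_token(
    "myhub.azure-devices.net/devices/mydevice",
    base64.b64encode(b"not-a-real-device-key").decode(),
)
```

The resulting string is what an MQTT client would pass as the CONNECT password when authenticating to IoT Hub with a SAS token.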
server
Computer-Vision
# Computer Vision

Computer vision exercises with Python and OpenCV. This repo contains three different Jupyter notebooks, divided into the different sections and problems of the Computer Vision subject at the University of Granada, from applying filters to an image to the estimation of the ![fundamental matrix](http://mathurl.com/jnymkmc.png) of the cameras. The code and the images are also available in the repo.

## Filtering and subsetting

### Gaussian filters
One of the exercises consists of creating a Gaussian filter to build a mask and convolving the images with these masks.

### Hybrid images
Hybrid images are two different images mixed into a new image that contains the low frequencies of one image and the high frequencies of the other, creating the illusion that, looking at the image up close, you see the image the high frequencies were taken from, while looking from further away, the low-frequency image appears.

### Gaussian pyramid
To make the effect of the hybrid images easier to appreciate, you can create a Gaussian pyramid, where the same image appears at different scales.

## Keypoints, descriptors, homographies and panoramas

### Harris points detection
This is my own implementation of a Harris points detector that detects and computes these points at three different scales and shows 100 of the 1500 total points for every scale on a new image. The green points belong to the original scale, the blue ones to the mid scale, and the red ones to the last scale.

### KAZE/AKAZE detectors
I use one of these detectors to detect and compute the keypoints of two images and calculate the matches between them with a brute-force matcher and cross validation, using the OpenCV functions `AKAZE_create` or `KAZE_create`, `detectAndCompute`, `BFMatcher`, and `match`.

### Panorama construction
To create a panorama, I use all of the previous points to find the homography between two images with `findHomography`. With this, I can create a linear panorama, using a white canvas to insert the images transformed by the homography.

## Camera estimation and epipolar geometry

### Camera estimation
Camera estimation from point correspondences, using the DLT algorithm and the Frobenius norm to calculate the error.

### Camera calibration
Camera calibration using chessboard images and the OpenCV functions `findChessboardCorners` and `drawChessboardCorners` to visualize the pattern, and `calibrateCamera` to calibrate the camera. To correct the lens distortion, I use `getOptimalNewCameraMatrix` and `undistort`.

### Fundamental matrix estimation
![Fundamental matrix](http://mathurl.com/jnymkmc.png) estimation using the BRISK/ORB detectors to get point correspondences, and the 8-point algorithm with RANSAC. We can also see the epilines on the images.

### Essential matrix estimation
Translation and rotation between two images: essential matrix estimation using point correspondences, and the 4 possible solutions to the [R|t] matrix problem.

## Requirements
- Python 3
- NumPy
- OpenCV
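The notebooks build the Gaussian masks with OpenCV and apply them to images. As a self-contained illustration of the same idea, here is a pure-Python sketch of a normalized 1D Gaussian mask and a "same"-size convolution with zero padding (the function names are my own, not from this repo):

```python
import math

def gaussian_mask(sigma):
    """1D Gaussian mask sampled on [-3*sigma, 3*sigma], normalized to sum to 1."""
    radius = int(math.ceil(3 * sigma))
    values = [math.exp(-(x * x) / (2 * sigma * sigma)) for x in range(-radius, radius + 1)]
    total = sum(values)
    return [v / total for v in values]

def convolve1d(signal, mask):
    """'Same'-size 1D convolution, treating samples outside the signal as zero."""
    radius = len(mask) // 2
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for k, w in enumerate(mask):
            j = i + k - radius
            if 0 <= j < len(signal):
                acc += w * signal[j]
        out.append(acc)
    return out

# smoothing an impulse reproduces the (normalized) mask itself
mask = gaussian_mask(1.0)
smoothed = convolve1d([0.0] * 5 + [1.0] + [0.0] * 5, mask)
```

A 2D Gaussian filter is separable, so in practice the same 1D mask is applied first along the rows and then along the columns of the image.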
computer-vision hybrid-image camera gaussian-filter python opencv panorama
ai
Front-end_TP_test
Frontend developer test task for Travelpayouts.

- Design files: `tp_test.sketch`, `tp_test.pdf`, `tp_test.png`
- [Layout preview](maket_preview.png?raw=true)
- Stack: HTML, webpack, PostCSS, SVG, TypeScript
- Supported viewport widths: 200px to 1024px
- Date format: dd.mm.yyyy
- Environment: GitHub, NIX/OS X
front_end
ChatProtect
<div align="center">
<img src="web/static/images/chatprotect_logo.svg" width="340">
<h2>Catch and revise hallucinations in large language models</h2>
</div>

This is the code for the paper [Self-contradictory Hallucinations of Large Language Models: Evaluation, Detection and Mitigation](https://arxiv.org/abs/2305.15852). An easy-to-use website presenting the tool and its use cases is hosted at https://chatprotect.ai.

## Installation

This project was tested with Python 3.10. Set up a virtual environment (or use conda) and install the requirements for this project:

```bash
conda create -n chatprotect pip pytorch python=3.10
conda activate chatprotect
python3 -m pip install -r requirements.txt
python3 -m spacy download en_core_web_sm
python3 -m pip install -e .
```

Create a `secret.py` and enter the required API keys:

```bash
cp secret_template.py secret.py
# use your favorite editor to set the API keys
```

### Using CompactIE for triple extraction

Set up [CompactIE](https://aclanthology.org/2022.naacl-main.65) from the attached zip in this repository. The zip file contains a forked version that provides a local API for triple extraction:

```bash
unzip compactie.zip -d compactie
cd compactie
```

Model checkpoints are available on [Zenodo](https://zenodo.org/record/6804440). Download the constituent extraction (CE) model and put it in the folder [save_results/models/constituent](https://github.com/farimafatahi/compactie/tree/master/save_results/models/constituent), and the constituent linking (CL) model under the [save_results/models/relation](https://github.com/farimafatahi/compactie/tree/master/save_results/models/relation) folder:

```bash
wget https://zenodo.org/record/6804440/files/ce_model?download=1
mv "ce_model?download=1" save_results/models/constituent/ce_model
wget https://zenodo.org/record/6804440/files/cl_model?download=1
mv "cl_model?download=1" save_results/models/relation/cl_model
```

Then install the requirements. You need Python 3.6 and PyTorch; we recommend creating a separate conda environment for this:

```bash
conda create -n compactie pip python=3.6 pytorch=1.9.0 -c pytorch
conda activate compactie
pip install transformers==4.2.2 configargparse==1.2.3 bidict==0.20.0 pyyaml==6.0.1
```

Run the following command to start the API, then return to the root directory and the chatprotect conda environment. Keep this process running and continue with [Running](#running) to run ChatProtect:

```bash
python api.py --config_file config.yml
```

## Running

Follow the instructions below to run the pipeline and website, or to reproduce the results locally.

### Running the complete process

First install the whole pipeline as described in the section [Installation](#installation). Then run the full pipeline on a single topic via:

```bash
python3 -m chatprotect --prompt "Please tell me about Thomas Chapais"
```

### Running the website API

This API provides the required streams to interact with the demo website:

```bash
uvicorn pipeline.api:app --reload --port 9913
```

### Running the pipeline step by step

To reproduce the results with GPT-4, ChatGPT, Llama-2-70b-chat, and Vicuna-13b-1.1, you will need to set up FastChat, the together.ai API key, and the OpenAI API key as described above. In order to be able to assess each step of the pipeline, and for scaling, the whole pipeline is split into several separate scripts that may be run step by step. This corresponds to the steps gLM gen-sentence, aLM detect, and aLM revise:

```bash
# generate answers to the prompt
python3 pipeline/0_generate_descriptions.py --prompt "Please tell me about Thomas Chapais"
# generate sentence/alternative-sentence pairs with a tag for inconsistency (gen sentence + detect, Figures 1 and 2)
python3 pipeline/1_generate_sentences.py --prompt "Please tell me about Thomas Chapais"
# generate new descriptions based on the original description and the tags (first step of revise, Figure 3)
python3 pipeline/2_generate_new_descriptions.py --prompt "Please tell me about Thomas Chapais"
# automatically execute further mitigation steps
bash pipeline/mitigation.sh --prompt "Please tell me about Thomas Chapais" --test_description_dir test/custom_new_descriptions
# run only a specific detect implementation (detect, Figure 2)
python3 pipeline/direct_sentences.py --prompt "Please tell me about Thomas Chapais"
```

Each script has more information about its parameters, such as the employed aLM or gLM, displayed via `--help`.

Note: the default sentence generation method is specialized for description generation. You may use the generalized prompt by changing the sentence method to 4.

### Reproducing results from the paper

To calculate the numbers presented in the figures in the paper, run the bash scripts in the `figures` directory from the root, like this. To compute the perplexity values, you will need a GPU with at least 5 GB VRAM, or the computation will be quite slow (it is disabled by default):

```bash
bash figures/run.sh
```
ai
demo_simple_blog
# Demo SimpleBlog

This is the code base for our demo app, SimpleBlog. It is designed to help show a full, end-to-end developer's workflow (see the video), as part of our prep work at [vikingcodeschool.com](http://vikingcodeschool.com).
front_end
QAmeleon
# QAmeleon

QAmeleon introduces synthetic multilingual QA data in 8 languages, generated using PaLM-540B, a large language model. This dataset was generated by prompt tuning PaLM with only five examples per language. We use the synthetic data to finetune downstream QA models, leading to improved accuracy in comparison to English-only and translation-based baselines.

The data is available at https://storage.googleapis.com/qameleon/qamelon_pt_accepted.csv.

More details can be found in the [paper](https://arxiv.org/abs/2211.08264), which can be cited as follows:

```bib
@misc{agrawal2022qameleon,
  title={QAmeleon: Multilingual QA with Only 5 Examples},
  author={Priyanka Agrawal and Chris Alberti and Fantine Huot and Joshua Maynez and Ji Ma and Sebastian Ruder and Kuzman Ganchev and Dipanjan Das and Mirella Lapata},
  year={2022},
  eprint={2211.08264},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```

This dataset contains a total of 47,173 question-answer instances across 8 languages. Following is the count per language:

| language | count |
| --- | --- |
| ar | 6966 |
| bn | 6084 |
| fi | 5028 |
| id | 6797 |
| ko | 6471 |
| ru | 5557 |
| sw | 5597 |
| te | 4673 |
| **total** | **47173** |

The QAmeleon dataset is released under the [CC BY 4.0](https://creativecommons.org/licenses/by/4.0/) license.
ai
Hands-On-Natural-Language-Processing-with-PyTorch-1.x
# Hands-On Natural Language Processing with PyTorch 1.x

<a href="https://www.packtpub.com/in/data/hands-on-natural-language-processing-with-pytorch-1-x?utm_source=github&utm_medium=repository&utm_campaign=9781789802740"><img src="https://www.packtpub.com/media/catalog/product/cache/bf3310292d6e1b4ca15aeea773aca35e/9/7/9781788830782-original_38.jpeg" alt="Hands-On Natural Language Processing with PyTorch 1.x" height="256px" align="right"></a>

This is the code repository for [Hands-On Natural Language Processing with PyTorch 1.x](https://www.packtpub.com/in/data/hands-on-natural-language-processing-with-pytorch-1-x?utm_source=github&utm_medium=repository&utm_campaign=9781789802740), published by Packt.

**Build smart, AI-driven linguistic applications using deep learning and NLP techniques**

## What is this book about?

Developers working with NLP will be able to put their knowledge to work with this practical guide to PyTorch. You will learn to use PyTorch's offerings and how to understand and analyze text using Python. You will learn to extract the underlying meaning in text using deep neural networks and modern deep learning algorithms.

This book covers the following exciting features:
* Use NLP techniques for understanding, processing, and generating text
* Understand PyTorch, its applications, and how it can be used to build deep linguistic models
* Explore the wide variety of deep learning architectures for NLP
* Develop the skills you need to process and represent both structured and unstructured NLP data
* Become well-versed with state-of-the-art technologies and exciting new developments in the NLP domain
* Create chatbots using attention-based neural networks

If you feel this book is for you, get your [copy](https://www.amazon.com/dp/1789802741) today!

<a href="https://www.packtpub.com/?utm_source=github&utm_medium=banner&utm_campaign=GitHubBanner"><img src="https://raw.githubusercontent.com/PacktPublishing/GitHub/master/GitHub.png" alt="https://www.packtpub.com/" border="5" /></a>

## Instructions and navigations

All of the code is organized into folders. The code will look like the following:

```python
word_dict = {}
inverse_word_dict = {}

for i, word in enumerate(corpus):
    word_dict[word] = i
    inverse_word_dict[i] = word
```

**Following is what you need for this book:**
This PyTorch book is for NLP developers, machine learning and deep learning developers, and anyone interested in building intelligent language applications using both traditional NLP approaches and deep learning architectures. If you're looking to adopt modern NLP techniques and models for your development projects, this book is for you. Working knowledge of Python programming, along with basic working knowledge of NLP tasks, is required.

With the following software and hardware list you can run all code files present in the book (Chapters 2-8).

### Errata
* Page 104: the GitHub link provided in the Technical requirements section is incorrect. The correct link is: https://github.com/PacktPublishing/Hands-On-Natural-Language-Processing-with-PyTorch-1.x/tree/master/Chapter05

### Software and hardware list

| Chapter | Software required | OS required |
| --- | --- | --- |
| 2-8 | Python 3.7, PyTorch 1.x, GPU (preferred) | Windows, Linux, macOS |

We also provide a PDF file that has color images of the screenshots/diagrams used in this book. [Click here to download it](https://static.packt-cdn.com/downloads/9781789802740_colorimages.pdf).

### Related products
* Hands-On Python Natural Language Processing [[Packt]](https://www.packtpub.com/data/hands-on-python-natural-language-processing?utm_source=github&utm_medium=repository&utm_campaign=9781838989590) [[Amazon]](https://www.amazon.com/dp/1838989595)
* Natural Language Processing with Python Quick Start Guide [[Packt]](https://www.packtpub.com/big-data-and-business-intelligence/natural-language-processing-python-quick-start-guide?utm_source=github&utm_medium=repository&utm_campaign=9781789130386) [[Amazon]](https://www.amazon.com/dp/1789130387)

## Get to know the author

**Thomas Dop** is a data scientist at MagicLab, a company that creates leading dating apps, including Bumble and Badoo. He works on a variety of areas within data science, including NLP, deep learning, computer vision, and predictive modeling. He holds an MSc in data science from the University of Amsterdam.

### Suggestions and feedback
[Click here](https://docs.google.com/forms/d/e/1faipqlsdy7datc6qmel81fiuuymz0wy9vh1jhkvpy57oimekgqib_ow/viewform) if you have any feedback or suggestions.
ai
Pewlett-Hackard-Analysis
# Pewlett-Hackard Analysis

## Overview of the analysis

The purpose of this analysis is to build a database with SQL by applying data modeling, engineering, and analysis. For this assignment, we have to determine the number of retiring employees per title and identify the employees who are eligible to participate in a mentorship program.

## Results

### Retiring titles

A query is written and executed to create a retirement titles table for employees who were born between January 1, 1952 and December 31, 1955. Based on the analysis, 90,398 employees are eligible for retirement.

![image](https://user-images.githubusercontent.com/95327338/153115922-f7124946-ef8d-4c86-8494-fb0307e62cad.png)
![image](https://user-images.githubusercontent.com/95327338/153113529-c2c9b480-adc8-43f5-affa-2146273cd351.png)

### Unique titles

A query is written and executed to create a unique titles table that contains the employee number, first and last name, and most recent title.

![image](https://user-images.githubusercontent.com/95327338/153120942-a6f2a4d4-48ab-4fc3-9b76-34a3055432a2.png)
![image](https://user-images.githubusercontent.com/95327338/153113682-ded6e1c1-3451-43aa-8088-6b456fb93814.png)

### Retirement titles

The three groups with the highest quantity of employees are Engineers, Staff, and Senior Engineers.

![image](https://user-images.githubusercontent.com/95327338/153118345-02ecee41-d743-4d4d-9be2-51a43ebb425f.png)
![image](https://user-images.githubusercontent.com/95327338/153113802-e7e6100f-530c-4326-83ff-a297a1ef47f0.png)

### Mentorship eligibility

I wrote a query to create a mentorship eligibility table that holds the employees who are eligible to participate in a mentorship program. I used a `DISTINCT ON` statement to retrieve the first occurrence of the employee number for each set of rows. The total eligible to participate in a mentorship program is 1,549 employees.

![image](https://user-images.githubusercontent.com/95327338/153121808-5404de30-96ac-4766-97ac-5878f3c07c1e.png)
![image](https://user-images.githubusercontent.com/95327338/153113892-af8b69b4-2656-4a4b-b0df-02f92e887f82.png)

## Summary

**How many roles will need to be filled as the "silver tsunami" begins to make an impact?** According to this analysis, 90,398 roles will need to be filled as the silver tsunami begins.

**Are there enough qualified, retirement-ready employees in the departments to mentor the next generation of Pewlett-Hackard employees?** From the retiring group, there are 1,549 employees that are candidates for the mentorship eligibility program, which is not enough to start the process.
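The analysis relies on PostgreSQL's `DISTINCT ON` to keep one row per employee. As a self-contained sketch of the same idea, here is an SQLite emulation using `GROUP BY` with `MAX()` (SQLite's bare columns follow the row that supplied the aggregate's max); the `titles` table and its columns below are illustrative toy data, not the assignment's actual database:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE titles (emp_no INTEGER, title TEXT, from_date TEXT);
INSERT INTO titles VALUES
  (10001, 'Engineer',        '1990-01-01'),
  (10001, 'Senior Engineer', '1995-06-01'),
  (10002, 'Staff',           '1992-03-15');
""")

# PostgreSQL equivalent: SELECT DISTINCT ON (emp_no) ... ORDER BY emp_no, from_date DESC;
# here, MAX(from_date) per group picks the most recent title for each employee
rows = conn.execute("""
    SELECT emp_no, title, MAX(from_date) AS from_date
    FROM titles
    GROUP BY emp_no
    ORDER BY emp_no
""").fetchall()
```

Each employee comes back exactly once, with their most recent title, which is the effect `DISTINCT ON (emp_no)` achieves in the actual assignment.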
server
Graph-Machine-Learning
# Graph Machine Learning

<a href="https://www.packtpub.com/product/graph-machine-learning/9781800204492"><img src="https://static.packt-cdn.com/products/9781800204492/cover/smaller" height="256px" align="right"></a>

This is the code repository for [Graph Machine Learning](https://www.packtpub.com/product/graph-machine-learning/9781800204492), published by Packt.

**Take graph data to the next level by applying machine learning techniques and algorithms**

## What is this book about?

Graph Machine Learning provides a new set of tools for processing network data and leveraging the power of the relations between entities that can be used for predictive modeling and analytics tasks.

This book covers the following exciting features:
* Write Python scripts to extract features from graphs
* Distinguish between the main graph representation learning techniques
* Become well-versed with extracting data from social networks, financial transaction systems, and more
* Implement the main unsupervised and supervised graph embedding techniques
* Get to grips with shallow embedding methods, graph neural networks, graph regularization methods, and more

If you feel this book is for you, get your [copy](https://www.amazon.com/dp/180020390X) today!

<a href="https://www.packtpub.com/?utm_source=github&utm_medium=banner&utm_campaign=GitHubBanner"><img src="https://raw.githubusercontent.com/PacktPublishing/GitHub/master/GitHub.png" alt="https://www.packtpub.com/" border="5" /></a>

## Errata
* Page 16: the expression `nt.to_numpy_matrix(G)` should be `nx.to_numpy_matrix(G)`.

## Instructions and navigations

All of the code is organized into folders. For example, Chapter02.

The code will look like the following:

```python
from stellargraph.mapper import HinSAGENodeGenerator

batch_size = 50
num_samples = [10, 5]
generator = HinSAGENodeGenerator(subgraph, batch_size, num_samples, head_node_type="document")
```

**Following is what you need for this book:**
This book is for data analysts, graph developers, graph analysts, and graph professionals who want to leverage the information embedded in the connections and relations between data points to boost their analysis and model performance. The book will also be useful for data scientists and machine learning developers who want to build ML-driven graph databases. A beginner-level understanding of graph databases and graph data is required. Intermediate-level working knowledge of Python programming and machine learning is also expected to make the most out of this book.

With the following software and hardware list you can run all code files present in the book (Chapters 1-14).

### Software and hardware list

| Chapter | Software required | OS required |
| --- | --- | --- |
| 1-10 | Python | Windows, Mac OS X, and Linux (any) |
| 1-10 | Neo4j | Windows, Mac OS X, and Linux (any) |
| 1-10 | Gephi | Windows, Mac OS X, and Linux (any) |
| 1-10 | Google Colab or Jupyter Notebook | Windows, Mac OS X, and Linux (any) |

We also provide a PDF file that has color images of the screenshots/diagrams used in this book. [Click here to download it](https://static.packt-cdn.com/downloads/9781800204492_colorimages.pdf).

### Related products
* Learn Grafana 7.0 [[Packt]](https://www.packtpub.com/product/learn-grafana-7-0/1838826580) [[Amazon]](https://www.amazon.com/dp/1788293770)
* Interactive Dashboards and Data Apps with Plotly and Dash [[Packt]](https://www.packtpub.com/product/interactive-dashboards-and-data-apps-with-plotly-and-dash/9781800568914) [[Amazon]](https://www.amazon.com/dp/1800568916)

## Get to know the authors

**Claudio Stamile** received an M.Sc. degree in computer science from the University of Calabria (Cosenza, Italy) in September 2013 and, in September 2017, he received his joint Ph.D. from KU Leuven (Leuven, Belgium) and Université Claude Bernard Lyon 1 (Lyon, France). During his career, he has developed a solid background in artificial intelligence, graph theory, and machine learning, with a focus on the biomedical field. He is currently a senior data scientist at CGnal, a consulting firm fully committed to helping its top-tier clients implement data-driven strategies and build AI-powered solutions to promote efficiency and support new business models.

**Aldo Marzullo** received an M.Sc. degree in computer science from the University of Calabria (Cosenza, Italy) in September 2016. During his studies, he developed a solid background in several areas, including algorithm design, graph theory, and machine learning. In January 2020, he received his joint Ph.D. from the University of Calabria and Université Claude Bernard Lyon 1 (Lyon, France), with a thesis entitled "Deep Learning and Graph Theory for Brain Connectivity Analysis in Multiple Sclerosis". He is currently a postdoctoral researcher at the University of Calabria and collaborates with several international institutions.

**Enrico Deusebio** is currently the chief operating officer at CGnal, a consulting firm that helps its top-tier clients implement data-driven strategies and build AI-powered solutions. He has been working with data and large-scale simulations, using high-performance facilities and large-scale computing centers, for over 10 years, in both academic and industrial contexts. He has collaborated and worked with top-tier universities such as the University of Cambridge, the University of Turin, and the Royal Institute of Technology (KTH) in Stockholm, where he obtained a Ph.D. in 2014. He also holds B.Sc. and M.Sc. degrees in aerospace engineering from Politecnico di Torino.

### Download a free PDF

<i>If you have already purchased a print or Kindle version of this book, you can get a DRM-free PDF version at no cost.<br>Simply click on the link to claim your free PDF.</i>
<p align="center"><a href="https://packt.link/free-ebook/9781800204492">https://packt.link/free-ebook/9781800204492</a></p>
ai
home_credit_risk
# HomeCredit risk analysis

Originally a Kaggle problem, this is my work on the Home Credit risk database. It includes exploratory data analysis, feature engineering, and model building for predicting repayment risk.

## Problem

Many people struggle to get loans due to insufficient or non-existent credit histories, and unfortunately, this population is often taken advantage of by untrustworthy lenders. In order to make sure this underserved population has a positive loan experience, Home Credit makes use of a variety of alternative data, including telco and transactional information, to predict their clients' repayment abilities. The task, therefore, is to predict whether a loan applicant will default on their loan or not.

## Dataset

The data can be found at https://www.kaggle.com/c/home-credit-default-risk/data.

Entity-relationship diagram of the entire dataset:

<img src="home_credit_data_description.png" align="center">

## Exploratory data analysis

The folder `feature_engineering` contains all the exploratory data analysis, done as multiple Jupyter notebooks. To sum it up: all variables were explored, skewness of the data and anomalies were checked, and new features were generated based on whatever domain knowledge I could get.

## Predictive models

I've tried predicting the outcomes for the test dataset using the engineered features and many different machine learning algorithms, which can be found in the `models` folder.
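As a toy sketch of the kind of feature generation described above (the `AMT_*` names follow the competition's application table, but treat them and the helper itself as illustrative assumptions, not code from these notebooks):

```python
def engineer_features(app):
    """Add simple ratio features to one application record, guarding against missing values."""
    out = dict(app)
    income = app.get("AMT_INCOME_TOTAL")
    credit = app.get("AMT_CREDIT")
    annuity = app.get("AMT_ANNUITY")
    # debt burden relative to income is a common hand-crafted credit feature
    out["CREDIT_INCOME_RATIO"] = credit / income if income and credit is not None else None
    out["ANNUITY_INCOME_RATIO"] = annuity / income if income and annuity is not None else None
    return out

row = engineer_features(
    {"AMT_INCOME_TOTAL": 200000.0, "AMT_CREDIT": 600000.0, "AMT_ANNUITY": 30000.0}
)
```

In the actual notebooks this kind of transformation would be vectorized over the whole table (e.g. with pandas) rather than applied row by row.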
server
ML-foundations
machine learning foundations this repo is home to the code that accompanies jon krohn s machine learning foundations curriculum which provides a comprehensive overview of all of the subjects across mathematics statistics and computer science that underlie contemporary machine learning approaches including deep learning and other artificial intelligence techniques there are eight subjects in the curriculum organized into four subject areas see the machine learning house section below for detail on why these are the essential foundational subject areas linear algebra 1 intro to linear algebra https github com jonkrohn ml foundations blob master notebooks 1 intro to linear algebra ipynb 2 linear algebra ii matrix operations https github com jonkrohn ml foundations blob master notebooks 2 linear algebra ii ipynb calculus 3 calculus i limits derivatives https github com jonkrohn ml foundations blob master notebooks 3 calculus i ipynb 4 calculus ii partial derivatives integrals https github com jonkrohn ml foundations blob master notebooks 4 calculus ii ipynb probability and statistics 5 probability information theory https github com jonkrohn ml foundations blob master notebooks 5 probability ipynb 6 intro to statistics https github com jonkrohn ml foundations blob master notebooks 6 statistics ipynb computer science 7 algorithms data structures https github com jonkrohn ml foundations blob master notebooks 7 algos and data structures ipynb 8 optimization https github com jonkrohn ml foundations blob master notebooks 8 optimization ipynb later subjects build upon content from earlier subjects so the recommended approach is to progress through the eight subjects in the order provided that said you re welcome to pick and choose individual subjects based on your interest or existing familiarity with the material in particular each of the four subject areas are fairly independent so could be approached separately where and when the eight ml foundations subjects were 
initially offered by jon krohn jonkrohn com as live online trainings in the o reilly learning platform https learning oreilly com home from may sep 2020 and were offered a second time from jul dec 2021 see here https www jonkrohn com talks for individual lecture dates to suit your preferred mode of learning the content is now available via several channels youtube linear algebra complete playlist here https www youtube com playlist list plrdl2inprwqw1qswhbu0ki jq uelkh2a and detailed blog post here https www jonkrohn com posts 2021 5 9 linear algebra for machine learning complete math course on youtube calculus complete playlist here https www youtube com playlist list plrdl2inprwqvu2ovntvtkrpj wz urmjx probability playlist https www youtube com playlist list plrdl2inprwqwwj1mh4tcuxllfz76c1zge is in active development sign up for my email newsletter at jonkrohn com https www jonkrohn com to be notified of new video releases in time all of the subjects of my ml foundations curriculum will be freely available on youtube o reilly many employers and educational institutions provide free access to this platform if you don t have access you can get a 30 day free trial with the code sdspod23 linear algebra videos https learning oreilly com videos linear algebra for 9780137398119 published in dec 2020 free hour long lesson https www youtube com watch v ug wjmuiggg calculus videos https learning oreilly com videos calculus for machine 9780137398171 published in jan 2021 free hour long lesson https youtu be zdax17ogmam probability and stats videos https learning oreilly com videos probability and statistics 9780137566273 published in may 2021 free hour long lesson https youtu be ujcgj k50ie computer science videos https learning oreilly com videos data structures algorithms 9780137644889 published in jun 2021 free hour long lesson https youtu be yfkkmdndy e for convenience this publisher compiled all 28 hours of the above four video series into a single playlist here https 
learning oreilly com videos 9780137903245 udemy all the linear algebra and calculus content has been live in a mathematical foundations of ml course https www udemy com course machine learning data science foundations masterclass since sep 2021 free overview video here https youtu be qhlo19eia4g while this course stands alone as a complete introduction to the math subjects subjects 5 8 will eventually be added as free bonus material open data science conference the entire series was taught live online from dec 2020 to jun 2021 on demand recordings of all these trainings are now available in the ai platform https aiplus odsc com pages mlbootcamp book chapter drafts to begin appearing in 2022 note that while youtube contains 100 of the taught content the paid options e g udemy o reilly and odsc contain comprehensive solution walk throughs for exercises that are not available on youtube some of the paid options also include exclusive platform specific features such as interactive testing cheat sheets and the awarding of a certificate for successful course completion push notifications to stay informed of future live training sessions new video releases and book chapter releases consider signing up for jon krohn s email newsletter via his homepage https www jonkrohn com notebooks all code is provided within jupyter notebooks in this directory https github com jonkrohn dltfpt blob master notebooks these notebooks are intended for use within the free colab cloud environment https colab research google com and that is the only environment currently actively supported that said if you are familiar with running jupyter notebooks locally you re welcome to do so note that the library versions in this repo s dockerfile https github com jonkrohn ml foundations blob master dockerfile are not necessarily current but may provide a reasonable starting point for running jupyter within a docker container the machine learning house p align center img src https github com jonkrohn ml 
foundations blob master img ml house png width 500 align center p to be an outstanding data scientist or ml engineer it doesn t suffice to only know how to use ml algorithms via the abstract interfaces that the most popular libraries e g scikit learn keras provide to train innovative models or deploy them to run performantly in production an in depth appreciation of machine learning theory pictured as the central purple floor of the machine learning house may be helpful or essential and to cultivate such in depth appreciation of ml one must possess a working understanding of the foundational subjects when the foundations of the machine learning house are firm it also makes it much easier to make the jump from general ml principles purple floor to specialized ml domains the top floor shown in gray such as deep learning natural language processing machine vision and reinforcement learning this is because the more specialized the application the more likely its details for implementation are available only in academic papers or graduate level textbooks either of which typically assume an understanding of the foundational subjects the content in this series may be particularly relevant for you if you use high level software libraries to train or deploy machine learning algorithms and would now like to understand the fundamentals underlying the abstractions enabling you to expand your capabilities you re a data scientist who would like to reinforce your understanding of the subjects at the core of your professional discipline you re a software developer who would like to develop a firm foundation for the deployment of machine learning algorithms into production systems you re a data analyst or a i enthusiast who would like to become a data scientist or data ml engineer and so you re keen to deeply understand the field you re entering from the ground up very wise of you you re simply keen to understand the essentials of linear algebra calculus probability stats 
algorithms and or data structures the foundational subjects have largely been unchanged in recent decades and are likely to remain so for the coming decades yet they re critical across all machine learning and data science approaches thus the foundations provide a solid career long bedrock pedagogical approach the purpose of this series is to provide you with a practical functional understanding of the content covered context will be given for each topic highlighting its relevance to machine learning as with other materials created by jon krohn such as the book deep learning illustrated https www deeplearningillustrated com and his 18 hour video series deep learning with tensorflow keras and pytorch https github com jonkrohn dltfpt the content in the series is brought to life through the combination of vivid full color illustrations paper and pencil comprehension exercises with fully worked solutions hundreds of straightforward examples of python code within hands on jupyter notebooks with a particular focus on the pytorch and tensorflow libraries resources for digging even deeper into topics that pique your curiosity prerequisites programming all code demos will be in python so experience with it or another object oriented programming language would be helpful for following along with the code examples a good and free resource for getting started with python is al sweigart s automate the boring stuff https automatetheboringstuff com mathematics familiarity with secondary school level mathematics will make the class easier to follow along with if you are comfortable dealing with quantitative information such as understanding charts and rearranging simple equations then you should be well prepared to follow along with all of the mathematics if you discover you have some math gaps as you work through this ml foundations curriculum i recommend the free comprehensive khan academy https www khanacademy org to fill those gaps in oboe finally here s an illustration of
oboe the machine learning foundations mascot created by the wonderful artist aglaé bassens https www aglaebassens com p align center img src https github com jonkrohn ml foundations blob master img oboe jpg width 400 align center p
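To give a flavour of the hands-on Jupyter-notebook style described above, here is a small NumPy sketch in the spirit of the linear-algebra subjects. The matrix is an arbitrary example chosen for this note, not one taken from the course materials:

```python
import numpy as np

# An eigenvector v of a square matrix A satisfies A @ v = lam * v:
# the matrix only stretches v (by the eigenvalue lam), never rotates it.
A = np.array([[4.0, 2.0],
              [1.0, 3.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# Verify the defining property A @ v = lam * v for every eigenpair.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)

# For this matrix the characteristic polynomial is lam**2 - 7*lam + 10,
# so the eigenvalues are 5 and 2.
assert np.allclose(sorted(eigenvalues.real), [2.0, 5.0])
```

The same check by hand (expand det(A - lam*I) = 0) is exactly the kind of paper-and-pencil exercise the curriculum pairs with its code demos.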
machine-learning data-science python mathematics linear-algebra calculus probability statistics computer-science data-structures numpy pytorch tensorflow jupyter-notebook
ai
cpre488
cpre 488 embedded systems design embedded microprocessors embedded memory and i o devices component interfaces embedded software program development basic compiler techniques platform based fpga technology hardware synthesis design methodology real time operating system concepts performance analysis and optimizations learning objectives by the end of the semester students will be able to simplify the design of modern embedded computing systems into concrete stages perform analysis for various embedded systems using multiple metrics for performance power thermal and efficiency express fluency with techniques in embedded system optimization including compiler architectural and system level techniques categorize on chip and off chip communication approaches commonly in use for embedded systems including real world interfacing considerations utilize embedded memory management techniques prototype systems with significant hardware and software components using an fpga based platform develop systems for multiple compelling embedded application domains including image video processing and control systems appreciate the challenges involved with embedded system software and operating systems and illustrate familiarity with configuring and installing embedded linux
os
unrealcv
unrealcv join the chat at https gitter im unrealcv unrealcv https badges gitter im unrealcv unrealcv svg https gitter im unrealcv unrealcv utm source badge utm medium badge utm campaign pr badge utm content badge docs status https readthedocs org projects unrealcv badge version latest http docs unrealcv org build status https travis ci org unrealcv unrealcv svg branch master https travis ci org unrealcv unrealcv unrealcv is a project to help computer vision researchers build virtual worlds using unreal engine 4 ue4 it extends ue4 with a plugin by providing 1 a set of unrealcv commands to interact with the virtual world 2 communication between ue4 and an external program such as caffe unrealcv can be used in two ways the first is using a compiled game binary with unrealcv embedded this is as simple as running a game no knowledge of unreal engine is required the second is installing the unrealcv plugin into unreal engine 4 ue4 and using the ue4 editor to build a new virtual world please read the tutorial getting started http unrealcv github io tutorial getting started html to learn how to use unrealcv center img src http unrealcv github io images homepage teaser png alt annotation images generated from the technical demo a href http docs unrealcv org en master reference model zoo html realisticrendering realisticrendering a br center new features call any blueprint function from python via the vbp obj name func name arg1 arg2 command support for rpc communication between server and client in linux higher fps and more reliable a set of new commands for camera control and object manipulation please refer to the command system https docs unrealcv org en latest reference commands html for more details how to install unrealcv to install the unrealcv server you need to 1 download the source code and place it in the plugins folder of a c ue4 project 2 launch the c project with visual studio 2019 unrealcv will be compiled at the same time 3 to check that unrealcv was installed successfully you can run
vget unrealcv status in the console press the backtick key to display the console to install the unrealcv client just run pip install unrealcv citation if you found this project useful please consider citing our paper bibtex article qiu2017unrealcv author weichao qiu fangwei zhong yi zhang siyuan qiao zihao xiao tae soo kim yizhou wang alan yuille journal acm multimedia open source software competition title unrealcv virtual worlds for computer vision year 2017 contact if you have any suggestions or are interested in using unrealcv please contact us http unrealcv github io contact html
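Driving the server from Python follows the pattern below. This is a minimal sketch that assumes a compiled binary or the UE4 editor is already running the UnrealCV server on its default port; the vget /unrealcv/status command is the same one shown for the in-game console, and the sketch degrades to a message when the client library or the server is unavailable:

```python
# Minimal UnrealCV client sketch (requires `pip install unrealcv` and a
# running UnrealCV server inside a game binary or the UE4 editor).
try:
    from unrealcv import client
except ImportError:
    client = None  # library not installed

def query_status():
    """Run the same status command as in the in-game console."""
    if client is None:
        return 'unrealcv client is not installed'
    client.connect()
    if not client.isconnected():
        return 'UnrealCV server is not running'
    try:
        # Commands mirror the console syntax; for example a camera capture
        # would be client.request('vget /camera/0/lit lit.png').
        return client.request('vget /unrealcv/status')
    finally:
        client.disconnect()

print(query_status())
```
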
virtual-worlds computer-vision ue4 embodied-ai machine-learning simulation synthetic-data
ai
master-frontend-lemoncode
Máster Front End Online Lemoncode. In this repository you will find the source code for the demos and examples from the Lemoncode online front-end master's program http lemoncode net master frontend. Many of the examples include step-by-step guides so you can reproduce them. The areas covered: layout (building layouts with HTML, CSS 3, SASS); languages (JavaScript ES5/ES6, TypeScript); bundling (webpack, parcel); frameworks (React, Vue.js, Angular); testing (Jest, React Testing Library); REST API (fetch, axios, GraphQL); cloud (demos on Heroku, Amazon, Azure, Docker, GitHub Actions); mobile (PWA, React Native; Flutter coming soon). Would you like to take our master's program? If you want to get up to date with front-end development, we offer an online master's in two formats: a cohort-based format, with live classes and continuous student-teacher interaction, and a self-paced format, in which a mentor accompanies you throughout. More information: Máster Front End Online Lemoncode http lemoncode net master frontend reviews from former students https lemoncode net lemoncode blog 2016 12 24 master lemoncode opiniones de los alumnos how the cohort-based format works https lemoncode net lemoncode blog 2017 2 6 master front end lemon que tiene esto de especial how the self-paced format works https lemoncode net lemoncode blog 2020 10 1 master front end continuo lemoncode. If you have any questions, you can write to us at info lemoncode net
front_end
EMBSYS110
embsys110 embedded systems 110 elapsed timer http img youtube com vi uajmhtpev e 0 jpg http www youtube com watch v uajmhtpev e embedded systems 110 elapsed timer
os
ejemplotareasfrontend
ejemplotareasfrontend. Step 1: install the dependencies. Run the following command in a console to install all of the project's dependencies: npm install. Install the Vue CLI: the project was built with the Vue CLI, so you must install it in order to work on the project: npm install -g vue-cli. Run the dev server: run the following command to serve the project locally and try it out: npm run serve. Build for production: npm run build. Configure the backend API correctly: inside the src folder is the main.js file, which contains the configuration of the backend URL. Edit it so that it connects to your backend service; you can then test and build for production: axios.defaults.baseURL = 'http://localhost:3000'. Deploy from the dist branch: there is a branch dedicated to holding the distribution (build) files, so that you can deploy with GitHub, even with continuous delivery. To do so, run the following commands (push-dir is used for this): npm run build, then npm run deploy.
front_end
amazon-sagemaker-examples
sagemaker https github com aws amazon sagemaker examples raw main static sagemaker banner png amazon sagemaker examples example jupyter notebooks that demonstrate how to build train and deploy machine learning models using amazon sagemaker books background amazon sagemaker https aws amazon com sagemaker is a fully managed service for data science and machine learning ml workflows you can use amazon sagemaker to simplify the process of building training and deploying ml models the sagemaker example notebooks https sagemaker examples readthedocs io en latest are jupyter notebooks that demonstrate the usage of amazon sagemaker the sagemaker example community repository https github com aws amazon sagemaker examples community contains additional notebooks beyond those critical for showcasing key sagemaker functionality that can be shared and explored by the community hammer and wrench setup the quickest setup to run example notebooks includes an aws account http docs aws amazon com sagemaker latest dg gs account html proper iam user and role http docs aws amazon com sagemaker latest dg authentication and access control html setup an amazon sagemaker notebook instance http docs aws amazon com sagemaker latest dg gs setup working env html an s3 bucket http docs aws amazon com sagemaker latest dg gs config permissions html computer usage these example notebooks are automatically loaded into sagemaker notebook instances they can be accessed by clicking on the sagemaker examples tab in jupyter or the sagemaker logo in jupyterlab although most examples utilize key amazon sagemaker functionality like distributed managed training or real time hosted endpoints these notebooks can be run outside of amazon sagemaker notebook instances with minimal modification updating the iam role definition and installing the necessary libraries as of february 7 2022 the default branch is named main see our announcement https github com aws amazon sagemaker examples discussions 3131 for details and how to
update your existing clone notebook examples introduction to geospatial capabilities these examples introduce sagemaker geospatial capabilities which makes it easy to build train and deploy ml models using geospatial data monitoring lake drought with sagemaker geospatial capabilities sagemaker geospatial lake mead drought monitoring shows how to monitor lake mead drought using sagemaker geospatial capabilities digital farming with amazon sagemaker geospatial capabilities sagemaker geospatial digital farming pipelines shows how geospatial capabilities can help accelerating optimizing and easing the processing of the geospatial data for the digital farming use cases assess wildfire damage with amazon sagemaker geospatial capabilities sagemaker geospatial dixie wildfire damage assessment dixie wildfire damage assessment ipynb demonstrates how amazon sagemaker geospatial capabilities can be used to identify and assess vegetation loss caused by the dixie wildfire in northern california how to use vector enrichment jobs for map matching sagemaker geospatial vector enrichment map matching vector enrichment map matching ipynb shows how to use vector enrichtment operations with amazon sagemaker geospatial capabilities to snap gps coordinates to road segments how to use vector enrichment jobs for reverse geocoding sagemaker geospatial vector enrichment reverse geocoding vector enrichment reverse geocoding ipynb shows how to use amazon sagemaker geospatial capabilities for reverse geocoding to obtain human readable addresses from data with latitude longitude information monitoring glacier melting with sagemaker geospatial capabilities sagemaker geospatial mount shasta glacier melting monitoring shows how to monitor glacier melting at mount shasta using sagemaker geospatial capabilities sagemaker pipelines with amazon sagemaker geospatial capabilities sagemaker geospatial geospatial pipeline geospatial pipelines ipynb shows how a geospatial data processing workflow can be 
automated by using an amazon sagemaker pipeline introduction to ground truth labeling jobs these examples provide quick walkthroughs to get you up and running with the labeling job workflow for amazon sagemaker ground truth bring your own model for sagemaker labeling workflows with active learning ground truth labeling jobs bring your own model for sagemaker labeling workflows with active learning is an end to end example that shows how to bring your custom training inference logic and active learning to the amazon sagemaker ecosystem from unlabeled data to a deployed machine learning model a sagemaker ground truth demonstration for image classification ground truth labeling jobs from unlabeled data to deployed machine learning model ground truth demo image classification is an end to end example that starts with an unlabeled dataset labels it using the ground truth api analyzes the results trains an image classification neural net using the annotated dataset and finally uses the trained model to perform batch and online inference ground truth object detection tutorial ground truth labeling jobs ground truth object detection tutorial is a similar end to end example but for an object detection task basic data analysis of an image classification output manifest ground truth labeling jobs data analysis of ground truth image classification output presents charts to visualize the number of annotations for each class differentiating between human annotations and automatic labels if your job used auto labeling it also displays sample images in each class and creates a pdf which concisely displays the full results training a machine learning model using an output manifest ground truth labeling jobs object detection augmented manifest training introduces the concept of an augmented manifest and demonstrates that the output file of a labeling job can be immediately used as the input file to train a sagemaker machine learning model annotation consolidation ground truth 
labeling jobs annotation consolidation demonstrates amazon sagemaker ground truth annotation consolidation techniques for image classification for a completed labeling job introduction to applying machine learning these examples provide a gentle introduction to machine learning concepts as they are applied in practical use cases across a variety of sectors predicting customer churn introduction to applying machine learning xgboost customer churn uses customer interaction and service usage data to find those most likely to churn and then walks through the cost benefit trade offs of providing retention incentives this uses amazon sagemaker s implementation of xgboost https github com dmlc xgboost to create a highly predictive model cancer prediction introduction to applying machine learning breast cancer prediction predicts breast cancer based on features derived from images using sagemaker s linear learner ensembling introduction to applying machine learning ensemble modeling predicts income using two amazon sagemaker models to show the advantages in ensembling video game sales introduction to applying machine learning video game sales develops a binary prediction model for the success of video games based on review scores mxnet gluon recommender system introduction to applying machine learning gluon recommender system uses neural network embeddings for non linear matrix factorization to predict user movie ratings on amazon digital reviews fair linear learner introduction to applying machine learning fair linear learner is an example of an effective way to create fair linear models with respect to sensitive features population segmentation of us census data using pca and kmeans introduction to applying machine learning us census population segmentation pca kmeans analyzes us census data and reduces dimensionality using pca then clusters us counties using kmeans to identify segments of similar counties document embedding using object2vec introduction to applying 
machine learning object2vec document embedding is an example to embed a large collection of documents in a common low dimensional space so that the semantic distances between these documents are preserved traffic violations forecasting using deepar introduction to applying machine learning deepar chicago traffic violations is an example that uses daily traffic violation data to predict patterns and seasonality with the amazon deepar algorithm visual inspection automation with pre trained amazon sagemaker models introduction to applying machine learning visual object detection is an example for fine tuning pre trained amazon sagemaker models on a target dataset create sagemaker models using the pytorch model zoo introduction to applying machine learning sagemaker pytorch model zoo contains an example notebook to create a sagemaker model leveraging the pytorch model zoo and visualize the results deep demand forecasting introduction to applying machine learning deep demand forecasting provides an end to end solution for the demand forecasting task using three state of the art time series algorithms lstnet prophet and sagemaker deepar which are available in gluonts and amazon sagemaker fraud detection using graph neural networks introduction to applying machine learning fraud detection using graph neural networks is an example to identify fraudulent transactions from transaction and user identity datasets identify key insights from textual document introduction to applying machine learning identify key insights from textual document contains comprehensive notebooks for five natural language processing tasks document summarization text classification question answering named entity recognition and semantic relation extraction synthetic churn prediction with text introduction to applying machine learning synthetic churn prediction with text contains an example notebook to train deploy and use a churn prediction model that processes numerical categorical and textual features to
make its prediction credit card fraud detector introduction to applying machine learning credit card fraud detector is an example of the core of a credit card fraud detection system using sagemaker with random cut forest and xgboost churn prediction multimodality of text and tabular introduction to applying machine learning churn prediction multimodality of text and tabular is an example notebook to train and deploy a churn prediction model that uses state of the art natural language processing model to find useful signals in text in addition to textual inputs this model uses traditional structured data inputs such as numerical and categorical fields sagemaker automatic model tuning these examples introduce sagemaker s hyperparameter tuning functionality which helps deliver the best possible predictions by running a large number of training jobs to determine which hyperparameter values are the most impactful xgboost tuning hyperparameter tuning xgboost direct marketing shows how to use sagemaker hyperparameter tuning to improve your model fit blazingtext tuning hyperparameter tuning blazingtext text classification 20 newsgroups shows how to use sagemaker hyperparameter tuning with the blazingtext built in algorithm and 20 newsgroups dataset tensorflow tuning hyperparameter tuning tensorflow mnist shows how to use sagemaker hyperparameter tuning with the pre built tensorflow container and mnist dataset mxnet tuning hyperparameter tuning mxnet mnist shows how to use sagemaker hyperparameter tuning with the pre built mxnet container and mnist dataset huggingface tuning hyperparameter tuning huggingface multiclass text classification 20 newsgroups shows how to use sagemaker hyperparameter tuning with the pre built huggingface container and 20 newsgroups dataset keras byo tuning hyperparameter tuning keras bring your own shows how to use sagemaker hyperparameter tuning with a custom container running a keras convolutional network on cifar 10 data r byo tuning 
hyperparameter tuning r bring your own shows how to use sagemaker hyperparameter tuning with the custom container from the bring your own r algorithm advanced functionality r bring your own example analyzing results hyperparameter tuning analyze results is a shared notebook that can be used after each of the above notebooks to provide analysis on how training jobs with different hyperparameters performed model tuning for distributed training hyperparameter tuning model tuning for distributed training shows how to use sagemaker hyperparameter tuning with hyperband strategy for optimizing model in distributed training neural architecture search for large language models hyperparameter tuning neural architecture search llm shows how to prune fine tuned large language models via neural architecture search sagemaker autopilot these examples introduce sagemaker autopilot autopilot automatically performs feature engineering model selection model tuning hyperparameter optimization and allows you to directly deploy the best model to an endpoint to serve inference requests customer churn automl autopilot shows how to use sagemaker autopilot to automatically train a model for the predicting customer churn introduction to applying machine learning xgboost customer churn task targeted direct marketing automl autopilot shows how to use sagemaker autopilot to automatically train a model housing prices automl sagemaker autopilot housing prices shows how to use sagemaker autopilot for a linear regression problem predict housing prices portfolio churn prediction with amazon sagemaker autopilot and neo4j autopilot sagemaker autopilot neo4j portfolio churn ipynb shows how to use sagemaker autopilot with graph embeddings to predict investment portfolio churn move amazon sagemaker autopilot ml models from experimentation to production using amazon sagemaker pipelines autopilot sagemaker autopilot pipelines shows how to use sagemaker autopilot in combination with sagemaker pipelines for 
end to end automl training automation amazon sagemaker autopilot models to serverless endpoints autopilot autopilot serverless inference shows how to deploy autopilot generated models to serverless endpoints introduction to amazon algorithms these examples provide quick walkthroughs to get you up and running with amazon sagemaker s custom developed algorithms most of these algorithms can train on distributed hardware scale incredibly well and are faster and cheaper than popular alternatives k means sagemaker python sdk 1p kmeans highlevel is our introductory example for amazon sagemaker it walks through the process of clustering mnist images of handwritten digits using amazon sagemaker k means factorization machines introduction to amazon algorithms factorization machines mnist showcases amazon sagemaker s implementation of the algorithm to predict whether a handwritten digit from the mnist dataset is a 0 or not using a binary classifier latent dirichlet allocation lda introduction to amazon algorithms lda topic modeling introduces topic modeling using amazon sagemaker latent dirichlet allocation lda on a synthetic dataset linear learner introduction to amazon algorithms linear learner mnist predicts whether a handwritten digit from the mnist dataset is a 0 or not using a binary classifier from amazon sagemaker linear learner neural topic model ntm introduction to amazon algorithms ntm synthetic uses amazon sagemaker neural topic model ntm to uncover topics in documents from a synthetic data source where topic distributions are known principal components analysis pca introduction to amazon algorithms pca mnist uses amazon sagemaker pca to calculate eigendigits from mnist seq2seq introduction to amazon algorithms seq2seq translation en de uses the amazon sagemaker seq2seq algorithm that s built on top of sockeye https github com awslabs sockeye which is a sequence to sequence framework for neural machine translation based on mxnet seq2seq implements state of the art 
encoder decoder architectures which can also be used for tasks like abstractive summarization in addition to machine translation this notebook shows translation from english to german text image classification introduction to amazon algorithms imageclassification caltech includes full training and transfer learning examples of amazon sagemaker s image classification algorithm this uses a resnet deep convolutional neural network to classify images from the caltech dataset xgboost for regression introduction to amazon algorithms xgboost abalone predicts the age of abalone abalone dataset https www csie ntu edu tw cjlin libsvmtools datasets regression html using regression from amazon sagemaker s implementation of xgboost https github com dmlc xgboost xgboost for multi class classification introduction to amazon algorithms xgboost mnist uses amazon sagemaker s implementation of xgboost https github com dmlc xgboost to classify handwritten digits from the mnist dataset as one of the ten digits using a multi class classifier both single machine and distributed use cases are presented deepar for time series forecasting introduction to amazon algorithms deepar synthetic illustrates how to use the amazon sagemaker deepar algorithm for time series forecasting on a synthetically generated data set blazingtext word2vec introduction to amazon algorithms blazingtext word2vec text8 generates word2vec embeddings from a cleaned text dump of wikipedia articles using sagemaker s fast and scalable blazingtext implementation object detection for bird images introduction to amazon algorithms object detection birds demonstrates how to use the amazon sagemaker object detection algorithm with a public dataset of bird images object2vec for movie recommendation introduction to amazon algorithms object2vec movie recommendation demonstrates how object2vec can be used to model data consisting of pairs of singleton tokens using movie recommendation as a running example object2vec for multi 
label classification introduction to amazon algorithms object2vec multilabel genre classification shows how the object2vec algorithm can train on data consisting of pairs of sequences and singleton tokens using the setting of genre prediction of movies based on their plot descriptions object2vec for sentence similarity introduction to amazon algorithms object2vec sentence similarity explains how to train object2vec using sequence pairs as input using sentence similarity analysis as the application ip insights for suspicious logins introduction to amazon algorithms ipinsights login shows how to train ip insights on login events for a web server to identify suspicious login attempts semantic segmentation introduction to amazon algorithms semantic segmentation pascalvoc shows how to train a semantic segmentation algorithm using the amazon sagemaker semantic segmentation algorithm it also demonstrates how to host the model and produce segmentation masks and probability of segmentation jumpstart instance segmentation introduction to amazon algorithms jumpstart instance segmentation demonstrates how to use a pre trained instance segmentation model available in jumpstart for inference jumpstart semantic segmentation introduction to amazon algorithms jumpstart semantic segmentation demonstrates how to use a pre trained semantic segmentation model available in jumpstart for inference how to finetune the pre trained model on a custom dataset using jumpstart transfer learning algorithm and how to use fine tuned model for inference jumpstart text generation introduction to amazon algorithms jumpstart text generation shows how to use jumpstart to generate text that appears indistinguishable from the hand written text jumpstart text summarization introduction to amazon algorithms jumpstart text summarization shows how to use jumpstart to summarize the text to contain only the important information jumpstart image embedding introduction to amazon algorithms jumpstart image
embedding demonstrates how to use a pre trained model available in jumpstart for image embedding jumpstart text embedding introduction to amazon algorithms jumpstart text embedding demonstrates how to use a pre trained model available in jumpstart for text embedding jumpstart object detection introduction to amazon algorithms jumpstart object detection demonstrates how to use a pre trained object detection model available in jumpstart for inference how to finetune the pre trained model on a custom dataset using jumpstart transfer learning algorithm and how to use fine tuned model for inference jumpstart machine translation introduction to amazon algorithms jumpstart machine translation demonstrates how to translate text from one language to another language in jumpstart jumpstart named entity recognition introduction to amazon algorithms jumpstart named entity recognition demonstrates how to identify named entities such as names locations etc in the text in jumpstart jumpstart text to image introduction to amazon algorithms jumpstart text to image demonstrates how to generate image conditioned on text in jumpstart jumpstart upscaling introduction to amazon algorithms jumpstart upscaling demonstrates how to enhance image quality with stable diffusion models in jumpstart jumpstart inpainting introduction to amazon algorithms jumpstart inpainting demonstrates how to inpaint an image with stable diffusion models in jumpstart in context learning with alexatm 20b introduction to amazon algorithms jumpstart alexatm20b demonstrates how to use alexatm 20b for in context learning in jumpstart amazon sagemaker rl the following provide examples demonstrating different capabilities of amazon sagemaker rl cartpole using coach reinforcement learning rl cartpole coach demonstrates the simplest use case of amazon sagemaker rl using intel s rl coach aws deepracer reinforcement learning rl deepracer robomaker coach gazebo demonstrates aws deepracer training using rl coach in the gazebo
environment hvac using energyplus reinforcement learning rl hvac coach energyplus demonstrates the training of hvac systems using the energyplus environment knapsack problem reinforcement learning rl knapsack coach custom demonstrates how to solve the knapsack problem using a custom environment mountain car reinforcement learning rl mountain car coach gymenv mountain car is a classic rl problem this notebook explains how to solve this using the openai gym environment distributed neural network compression reinforcement learning rl network compression ray custom this notebook explains how to compress resnets using rl using a custom environment and the rllib toolkit portfolio management reinforcement learning rl portfolio management coach customenv this notebook uses a custom gym environment to manage multiple financial investments autoscaling reinforcement learning rl predictive autoscaling coach customenv demonstrates how to adjust load depending on demand this uses rl coach and a custom environment roboschool reinforcement learning rl roboschool ray is an open source physics simulator that is commonly used to train rl policies for robotic systems this notebook demonstrates training a few agents using it stable baselines reinforcement learning rl roboschool stable baselines in this notebook example we will make the halfcheetah agent learn to walk using the stable baselines which are a set of improved implementations of reinforcement learning rl algorithms based on openai baselines travelling salesman reinforcement learning rl traveling salesman vehicle routing coach is a classic np hard problem which this notebook solves with aws sagemaker rl tic tac toe reinforcement learning rl tic tac toe coach customenv is a simple implementation of a custom gym environment to train and deploy an rl agent in coach that then plays tic tac toe interactively in a jupyter notebook unity game agent reinforcement learning rl unity ray shows how to use rl algorithms to train an agent 
to play a unity3d game scientific details of algorithms these examples provide more thorough mathematical treatment on a select group of algorithms streaming median scientific details of algorithms streaming median sequentially introduces concepts used in streaming algorithms which many sagemaker algorithms rely on to deliver speed and scalability latent dirichlet allocation lda scientific details of algorithms lda topic modeling dives into amazon sagemaker s spectral decomposition approach to lda linear learner features scientific details of algorithms linear learner class weights loss functions shows how to use the class weights and loss functions features of the sagemaker linear learner algorithm to improve performance on a credit card fraud prediction task amazon sagemaker debugger these examples provide an introduction to sagemaker debugger which provides debugging and monitoring capabilities for training of machine learning and deep learning algorithms note that although these notebooks focus on a specific framework the same approach works with all the frameworks that amazon sagemaker debugger supports the notebooks below are listed in the order in which we recommend you review them using a built in rule with tensorflow sagemaker debugger tensorflow builtin rule using a custom rule with tensorflow keras sagemaker debugger tensorflow keras custom rule interactive tensor analysis in notebook with mxnet sagemaker debugger mnist tensor analysis visualizing debugging tensors of mxnet training sagemaker debugger mnist tensor plot real time analysis in notebook with mxnet sagemaker debugger mxnet realtime analysis using a built in rule with xgboost sagemaker debugger xgboost builtin rules real time analysis in notebook with xgboost sagemaker debugger xgboost realtime analysis using sagemaker debugger with managed spot training and mxnet sagemaker debugger mxnet spot training reacting to cloudwatch events from rules to take an action based on status with tensorflow
sagemaker debugger tensorflow action on rule using sagemaker debugger with a custom pytorch container sagemaker debugger pytorch custom container amazon sagemaker distributed training these examples provide an introduction to sagemaker distributed training libraries for data parallelism and model parallelism the libraries are optimized for the sagemaker training environment help adapt your distributed training jobs to sagemaker and improve training speed and throughput more examples for models such as bert and yolov5 can be found in distributed training https github com aws amazon sagemaker examples tree main training distributed training train gpt 2 with sharded data parallel https github com aws amazon sagemaker examples tree main training distributed training pytorch model parallel gpt2 smp train gpt simple sharded data parallel ipynb shows how to train gpt 2 with near linear scaling using sharded data parallelism technique in sagemaker model parallelism library train eleutherai gpt j with model parallel https github com aws amazon sagemaker examples blob main training distributed training pytorch model parallel gpt j 11 train gptj smp tensor parallel notebook ipynb shows how to train eleutherai gpt j with pytorch and tensor parallelism technique in the sagemaker model parallelism library train maskrcnn with data parallel https github com aws amazon sagemaker examples blob main training distributed training pytorch data parallel maskrcnn pytorch smdataparallel maskrcnn demo ipynb shows how to train maskrcnn with pytorch and sagemaker data parallelism library amazon sagemaker clarify these examples provide an introduction to sagemaker clarify which provides machine learning developers with greater visibility into their training data and models so they can identify and limit bias and explain predictions fairness and explainability with sagemaker clarify sagemaker clarify fairness and explainability shows how to use sagemaker clarify processor api to measure the 
pre training bias of a dataset and post training bias of a model and explain the importance of the input features on the model s decision amazon sagemaker clarify model monitors sagemaker model monitor fairness and explainability shows how to use sagemaker clarify model monitor api to schedule bias monitor to monitor predictions for bias drift on a regular basis and schedule explainability monitor to monitor predictions for feature attribution drift on a regular basis publishing content from rstudio on amazon sagemaker to rstudio connect these examples show you how to run r examples and publish applications in rstudio on amazon sagemaker to rstudio connect publishing r markdown r examples rsconnect rmarkdown shows how you can author an r markdown document rmd rpres within rstudio on amazon sagemaker and publish to rstudio connect for wide consumption publishing r shiny apps r examples rsconnect shiny shows how you can author an r shiny application within rstudio on amazon sagemaker and publish to rstudio connect for wide consumption publishing streamlit apps r examples rsconnect streamlit shows how you can author a streamlit application within amazon sagemaker studio and publish to rstudio connect for wide consumption advanced amazon sagemaker functionality these examples showcase unique functionality available in amazon sagemaker they cover a broad range of topics and utilize a variety of methods but aim to provide the user with sufficient insight or inspiration to develop within amazon sagemaker data distribution types advanced functionality data distribution types showcases the difference between two methods for sending data from s3 to amazon sagemaker training instances this has particular implications for scalability and accuracy of distributed training encrypting your data advanced functionality handling kms encrypted data shows how to use server side kms encrypted data with amazon sagemaker training the iam role used for s3 access needs to have permissions
to encrypt and decrypt data with the kms key using parquet data advanced functionality parquet to recordio protobuf shows how to bring parquet https parquet apache org data sitting in s3 into an amazon sagemaker notebook and convert it into the recordio protobuf format that many sagemaker algorithms consume connecting to redshift advanced functionality working with redshift data demonstrates how to copy data from redshift to s3 and vice versa without leaving amazon sagemaker notebooks bring your own xgboost model advanced functionality xgboost bring your own model shows how to use amazon sagemaker algorithms containers to bring a pre trained model to a realtime hosted endpoint without ever needing to think about rest apis bring your own k means model advanced functionality kmeans bring your own model shows how to take a model that s been fit elsewhere and use amazon sagemaker algorithms containers to host it bring your own r algorithm advanced functionality r bring your own shows how to bring your own algorithm container to amazon sagemaker using the r language installing the r kernel advanced functionality install r kernel shows how to install the r kernel into an amazon sagemaker notebook instance bring your own scikit algorithm advanced functionality scikit bring your own provides a detailed walkthrough on how to package a scikit learn algorithm for training and production ready hosting bring your own mxnet model advanced functionality mxnet mnist byom shows how to bring a model trained anywhere using mxnet into amazon sagemaker bring your own tensorflow model advanced functionality tensorflow iris byom shows how to bring a model trained anywhere using tensorflow into amazon sagemaker bring your own model train and deploy bertopic advanced functionality pytorch extend container train deploy bertopic shows how to bring a model through an external library how to train it and deploy it into amazon sagemaker by extending the pytorch base containers experiment 
management capabilities with search advanced functionality search shows how to organize training jobs into projects and track relationships between models endpoints and training jobs host multiple models with your own algorithm advanced functionality multi model bring your own shows how to deploy multiple models to a realtime hosted endpoint with your own custom algorithm host multiple models with xgboost advanced functionality multi model xgboost home value shows how to deploy multiple models to a realtime hosted endpoint using a multi model enabled xgboost container host multiple models with sklearn advanced functionality multi model sklearn home value shows how to deploy multiple models to a realtime hosted endpoint using a multi model enabled sklearn container host multimodal huggingface model advanced functionality huggingface deploy instructpix2pix shows how to host an instruction based image editing model from huggingface as a sagemaker endpoint using single core or multi core gpu based instances inference recommender is used to run load tests and compare the performance of instances sagemaker training and inference with script mode sagemaker script mode shows how to use custom training and inference scripts similar to those you would use outside of sagemaker with sagemaker s prebuilt containers for various frameworks like scikit learn pytorch and xgboost host models with nvidia triton server sagemaker triton shows how to deploy models to a realtime hosted endpoint using triton https developer nvidia com nvidia triton inference server as the model inference server heterogeneous clusters training in tensorflow or pytorch training heterogeneous clusters readme md shows how to train using tensorflow tf data service distributed data pipeline or pytorch with grpc on top of amazon sagemaker heterogeneous clusters to overcome cpu bottlenecks by including different instance types gpu cpu in the same training job amazon sagemaker neo compilation jobs these examples
provide an introduction to how to use neo to compile and optimize deep learning models gluoncv ssd mobilenet sagemaker neo compilation jobs gluoncv ssd mobilenet shows how to train gluoncv ssd mobilenet and use amazon sagemaker neo to compile and optimize the trained model image classification sagemaker neo compilation jobs imageclassification caltech adapts from image classification introduction to amazon algorithms imageclassification caltech including neo api and comparison against the uncompiled baseline mnist with mxnet sagemaker neo compilation jobs mxnet mnist adapts from mxnet mnist sagemaker python sdk mxnet mnist including neo api and comparison against the uncompiled baseline deploying pre trained pytorch vision models sagemaker neo compilation jobs pytorch torchvision shows how to use amazon sagemaker neo to compile and optimize pre trained pytorch models from torchvision distributed tensorflow sagemaker neo compilation jobs tensorflow distributed mnist includes neo api and comparison against the uncompiled baseline predicting customer churn sagemaker neo compilation jobs xgboost customer churn adapts from xgboost customer churn introduction to applying machine learning xgboost customer churn including neo api and comparison against the uncompiled baseline amazon sagemaker processing these examples show you how to use sagemaker processing jobs to run data processing workloads scikit learn data processing and model evaluation sagemaker processing scikit learn data processing and model evaluation shows how to use sagemaker processing and the scikit learn container to run data preprocessing and model evaluation workloads feature transformation with amazon sagemaker processing and sparkml sagemaker processing feature transformation with sagemaker processing shows how to use sagemaker processing to run data processing workloads using sparkml prior to training feature transformation with amazon sagemaker processing and dask sagemaker processing feature 
transformation with sagemaker processing dask shows how to use sagemaker processing to transform data using dask distributed clusters distributed data processing using apache spark and sagemaker processing sagemaker processing spark distributed data processing shows how to use the built in spark container on sagemaker processing using the sagemaker python sdk amazon sagemaker pipelines these examples show you how to use sagemaker pipelines https aws amazon com sagemaker pipelines to create automate and manage end to end machine learning workflows amazon comprehend with sagemaker pipelines sagemaker pipelines nlp amazon comprehend sagemaker pipeline shows how to deploy a custom text classification using amazon comprehend and sagemaker pipelines amazon forecast with sagemaker pipelines sagemaker pipelines time series forecasting amazon forecast pipeline shows how you can create a dataset dataset group and predictor with amazon forecast and sagemaker pipelines multi model sagemaker pipeline with hyperparameter tuning and experiments sagemaker pipeline multi model shows how you can generate a regression model by training real estate data from athena using data wrangler and uses multiple algorithms both from a custom container and a sagemaker container in a single pipeline sagemaker pipeline local mode with frameworkprocessor and byoc for pytorch with sagemaker training toolkit sagemaker pipelines tabular local mode framework processor byoc sagemaker pipeline step caching sagemaker pipelines tabular caching shows how you can leverage pipeline step caching while building pipelines and shows expected cache hit cache miss behavior native automl step in sagemaker pipelines sagemaker pipelines tabular automl step sagemaker autopilot pipelines native auto ml step ipynb shows how you can use sagemaker autopilot with a native automl step in sagemaker pipelines for end to end automl training automation amazon sagemaker pre built framework containers and the python sdk pre built
deep learning framework containers these examples show you how to train and host in pre built deep learning framework containers using the sagemaker python sdk chainer cifar 10 sagemaker python sdk chainer cifar10 trains a vgg image classification network on cifar 10 using chainer both single machine and multi machine versions are included chainer mnist sagemaker python sdk chainer mnist trains a basic neural network on mnist using chainer shows how to use local mode chainer sentiment analysis sagemaker python sdk chainer sentiment analysis trains an lstm network with embeddings to predict text sentiment using chainer iris with scikit learn sagemaker python sdk scikit learn iris trains a scikit learn classifier on iris data model registry and batch transform with scikit learn sagemaker python sdk scikit learn model registry batch transform trains a scikit learn random forest model registers it in model registry and runs a batch transform job mnist with mxnet gluon sagemaker python sdk mxnet gluon mnist trains a basic neural network on the mnist handwritten digit dataset using mxnet gluon mnist with mxnet sagemaker python sdk mxnet mnist trains a basic neural network on the mnist handwritten digit data using mxnet s symbolic syntax sentiment analysis with mxnet gluon sagemaker python sdk mxnet gluon sentiment trains a text classifier using embeddings with mxnet gluon tensorflow training and serving sagemaker python sdk tensorflow script mode training and serving trains a basic neural network on mnist tensorflow with horovod sagemaker python sdk tensorflow script mode horovod trains on mnist using horovod for distributed training tensorflow using shell commands sagemaker python sdk tensorflow script mode using shell commands shows how to use a shell script for the container s entry point pre built machine learning framework containers these examples show you how to build machine learning models with frameworks like apache spark or scikit learn using sagemaker python
sdk inference with sparkml serving sagemaker python sdk sparkml serving emr mleap abalone shows how to build an ml model with apache spark using amazon emr on the abalone dataset and deploy in sagemaker with sagemaker sparkml serving pipeline inference with scikit learn and linearlearner sagemaker python sdk scikit learn inference pipeline builds an ml pipeline using scikit learn preprocessing and linearlearner algorithm in a single endpoint using amazon sagemaker with apache spark these examples show how to use amazon sagemaker for model training hosting and inference through apache spark using sagemaker spark https github com aws sagemaker spark sagemaker spark allows you to interleave spark pipeline stages with pipeline stages that interact with amazon sagemaker mnist with sagemaker pyspark sagemaker spark pyspark mnist parameterize spark configuration in pipeline pysparkprocessor execution sagemaker spark parameterize spark config pysparkprocessor pipeline shows how you can define spark configuration in different pipeline pysparkprocessor executions using amazon sagemaker with amazon keyspaces for apache cassandra these examples show how to use amazon sagemaker to read data from amazon keyspaces https docs aws amazon com keyspaces and train machine learning models using amazon keyspaces as a data source ingest data sagemaker keyspaces aws marketplace create algorithms model packages for listing in aws marketplace for machine learning these example notebooks show you how to package a model or algorithm for listing in aws marketplace for machine learning creating marketplace products aws marketplace creating marketplace products creating a model package listing on aws marketplace aws marketplace creating marketplace products models provides a detailed walkthrough on how to package a pre trained model as a sagemaker model package that can be listed on aws marketplace creating algorithm and model package listing on aws marketplace aws marketplace creating marketplace products
algorithms provides a detailed walkthrough on how to package a scikit learn algorithm to create sagemaker algorithm and sagemaker model package entities that can be used with the enhanced sagemaker train transform hosting tuning apis and listed on aws marketplace once you have created an algorithm or a model package to be listed in the aws marketplace the next step is to list it in aws marketplace and provide a sample notebook that customers can use to try your algorithm or model package curate your aws marketplace model package listing and sample notebook aws marketplace curating aws marketplace listing and sample notebook modelpackage provides instructions on how to craft a sample notebook to be associated with your listing and how to curate a good aws marketplace listing that makes it easy for aws customers to consume your model package curate your aws marketplace algorithm listing and sample notebook aws marketplace curating aws marketplace listing and sample notebook algorithm provides instructions on how to craft a sample notebook to be associated with your listing and how to curate a good aws marketplace listing that makes it easy for your customers to consume your algorithm use algorithms data and model packages from aws marketplace these examples show you how to use model packages and algorithms from aws marketplace and dataset products from aws data exchange for machine learning using algorithms aws marketplace using algorithms using algorithm from aws marketplace aws marketplace using algorithms amazon demo product provides a detailed walkthrough on how to use algorithm with the enhanced sagemaker train transform hosting tuning apis by choosing a canonical product listed on aws marketplace using automl algorithm aws marketplace using algorithms automl provides a detailed walkthrough on how to use automl algorithm from aws marketplace using model packages aws marketplace using model packages using model packages from aws marketplace aws marketplace using 
model packages generic sample notebook is a generic notebook which provides sample code snippets you can modify and use for performing inference on model packages from aws marketplace using amazon sagemaker using amazon demo product from aws marketplace aws marketplace using model packages amazon demo product provides a detailed walkthrough on how to use model package entities with the enhanced sagemaker transform hosting apis by choosing a canonical product listed on aws marketplace using models for extracting vehicle metadata aws marketplace using model packages auto insurance provides a detailed walkthrough on how to use pre trained models from aws marketplace for extracting metadata for a sample use case of auto insurance claim processing using models for identifying non compliance at a workplace aws marketplace using model packages improving industrial workplace safety provides a detailed walkthrough on how to use pre trained models from aws marketplace for extracting metadata for a sample use case of generating summary reports for identifying non compliance at a construction industrial workplace creative writing using gpt 2 text generation aws marketplace using model packages creative writing using gpt 2 text generation will show you how to use aws marketplace gpt 2 xl pre trained model on amazon sagemaker to generate text based on your prompt to help you author prose and poetry amazon augmented ai with aws marketplace ml models aws marketplace using model packages amazon augmented ai with aws marketplace ml models will show you how to use aws marketplace pre trained ml models with amazon augmented ai to implement human in the loop workflow reviews with your ml model predictions monitoring data quality in third party models from aws marketplace aws marketplace using model packages data quality monitoring will show you how to perform data quality monitoring on a pre trained third party model from aws marketplace evaluating ml models from aws marketplace for person
counting use case aws marketplace using model packages evaluating aws marketplace models for person counting use case will show you how to use two aws marketplace gluoncv pre trained ml models for person counting use case and evaluate each model for performance in different types of crowd images preprocessing audio data using a pre trained machine learning model using model packages preprocessing audio data using a machine learning model demonstrates the usage of a pre trained audio track separation model to create synthetic features and improve an acoustic classification model using dataset products aws marketplace using data using dataset product from aws data exchange with ml model from aws marketplace aws marketplace using data using data with ml model is a sample notebook which shows how a dataset from aws data exchange can be used with an ml model package from aws marketplace using shutterstock image datasets to train image classification models aws marketplace using data image classification with shutterstock image datasets provides a detailed walkthrough on how to use the free sample images metadata of whole foods shoppers https aws amazon com marketplace pp prodview y6xuddt42fmbu qid 1623195111604 sr 0 1 ref srh res product title offers from shutterstock s image datasets to train a multi label image classification model using shutterstock s pre labeled image assets you can learn more about this implementation from this blog post https aws amazon com blogs awsmarketplace using shutterstocks image datasets to train your computer vision models license this library is licensed under the apache 2 0 license http aws amazon com apache2 0 for more details please take a look at the license https github com aws amazon sagemaker examples blob master license txt file contributing although we re extremely excited to receive contributions from the community we re still working on the best mechanism to take in examples from external sources please
bear with us in the short term if pull requests take longer than expected or are closed please read our contributing guidelines https github com aws amazon sagemaker examples blob master contributing md if you d like to open an issue or submit a pull request
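The scientific details section above mentions a streaming median notebook that introduces concepts used in streaming algorithms. As a rough, self-contained illustration of the idea, here is the standard two-heap running-median technique in plain Python; this is a generic sketch of the concept, not necessarily the exact algorithm the notebook derives:

```python
import heapq

class RunningMedian:
    """Maintain the median of a stream with two heaps.

    `lo` is a max-heap (values stored negated) holding the smaller half
    of the stream; `hi` is a min-heap holding the larger half.
    """

    def __init__(self):
        self.lo, self.hi = [], []

    def add(self, x):
        # push through lo so every element of lo stays <= every element of hi
        heapq.heappush(self.lo, -x)
        heapq.heappush(self.hi, -heapq.heappop(self.lo))
        # rebalance so len(lo) is always len(hi) or len(hi) + 1
        if len(self.hi) > len(self.lo):
            heapq.heappush(self.lo, -heapq.heappop(self.hi))

    def median(self):
        if len(self.lo) > len(self.hi):
            return -self.lo[0]
        return (-self.lo[0] + self.hi[0]) / 2
```

Each `add` costs O(log n), so the median is available at any point in the stream without re-sorting — the kind of speed/scalability trade-off streaming algorithms are built around.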
sagemaker aws reinforcement-learning machine-learning deep-learning examples jupyter-notebook mlops data-science training inference
ai
NLP-with-LLMs
natural language processing with large language models purpose jon krohn https www jonkrohn com created this repo to accompany his half day training on nlp with gpt 4 and other llms from training to deployment with hugging face and pytorch lightning which was first offered at the open data science conference odsc east https odsc com boston in boston on may 10th 2023 code code can be found in the aptly named code https github com jonkrohn nlp with llms tree main code directory jupyter notebooks are directly supported for execution in google colab https colab research google com py files are for running at the command line see instructions https github com jonkrohn nlp with llms tree main instructions n b code is intended to be accompanied by live instructions and so it will not necessarily be self explanatory repo art the repo art img llamas jpeg https github com jonkrohn nlp with llms blob main img llamas jpeg was generated by prompting midjourney v5 with this artistic take on llms that was output by gpt 4 painting of a harmonious blend of alpacas https crfm stanford edu 2023 03 13 alpaca html and vicuñas https vicuna lmsys org in rich shades of caramel and soft gray amidst a lush futuristic landscape the animals are surrounded by a web of glowing pulsating neural network connections in hues of electric blue and neon green symbolizing the cutting edge and cost effective ai training techniques in the background a dynamic matrix of binary code cascades down further emphasizing the technological prowess of the scene
ai
MachineLearningEngineerInterviewChallenge
challenge

- explore the data for identified fraudsters and other users. what are your preliminary observations?
- databases: write an etl script in python to load the data into the postgresql database. the associated ddl should be executed through python and not directly in sql. you can find the desired schema in schema.yaml and some sample code for the etl.
- feature engineering: utilizing your findings from part a and some creativity, create some features and explain your reasoning behind the features. make a features.py script which, when executed, will create these features and store them in the db.
- model selection / validation: create an ml model which identifies fraudsters. assess the quality of your model and explain. make a train.py file which generates the fitted model artifact; it should be stored under the artifacts sub-directory.
- operationalization: how will you utilize this model to catch fraudsters? if a fraudster is identified, what should be the resulting action (lock user, alert agent, or both)? explain. make a patrol.py file and write a simple function which implements your logic from above. the function should accept a user id and yield the suggested action(s), e.g. patrol(user_id) -> lock_user, alert_agent.

my solution

- database: code/etl.py
- feature engineering: code/features.py
- model selection and validation: code/train.py
- operationalization: code/patrol.py

here is the jupyter notebook where i presented all my solution: run pipeline.ipynb https github com halilbilgin machinelearningengineerinterviewchallenge blob master notebooks run 20pipeline ipynb
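for illustration only, here is a hypothetical sketch of what the patrol logic could look like. the action names, the thresholds, and the idea of passing the model's fraud score in directly are all assumptions for the sketch, not the repository's actual patrol.py:

```python
# Hypothetical sketch of a patrol-style function: given a user id and a model
# score, yield the suggested action(s). Thresholds and action names are
# illustrative assumptions, not this repository's implementation.

def patrol(user_id, fraud_probability):
    """Yield suggested actions for a user based on a model's fraud score."""
    if fraud_probability >= 0.9:
        # very confident: lock the account and escalate to a human agent
        yield "LOCK_USER"
        yield "ALERT_AGENT"
    elif fraud_probability >= 0.5:
        # uncertain: keep the account usable but have an agent review it
        yield "ALERT_AGENT"
    # below the lower threshold: no action suggested

print(list(patrol("user-42", 0.95)))  # -> ['LOCK_USER', 'ALERT_AGENT']
```

yielding (rather than returning) the actions matches the challenge wording and lets the caller stream multiple suggested actions per user.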
server
guibot
guibot gh actions https github com intra2net guibot actions workflows ci yml badge svg https github com intra2net guibot actions workflows ci yml documentation status https readthedocs org projects guibot badge version latest http guibot readthedocs io en latest badge latest codeql https github com intra2net guibot actions workflows codeql yml badge svg https github com intra2net guibot actions workflows codeql yml codecov https codecov io gh intra2net guibot branch master graph badge svg https codecov io gh intra2net guibot a tool for gui automation using a variety of computer vision and display control backends introduction and concepts in order to do gui automation you usually need to solve two problems first you need to have a way to control and interact with the interface and platform you are automating and second you need to be able to locate the objects you are interested in on the screen guibot helps you do both to interact with guis guibot provides the controller https github com intra2net guibot blob master guibot controller py module which contains a common interface for different display backends with methods to move the mouse take screenshots type characters and so on the backend to use will depend on how your platform is accessible with some backends running directly as native binaries or python scripts on windows macos and linux while others connecting through remote vnc displays to locate an element on the screen you will need an image representing the screen a target https github com intra2net guibot blob master guibot target py representing the element an image or a text in the simplest cases and a finder https github com intra2net guibot blob master guibot finder py configured for the target the finder looks for the target within the screenshot image and returns the coordinates to the region where that target appears finders just like display controllers are wrappers around different backends supported by guibot that could vary from a simplest 1 
1 pixel matching by controller backends to template or feature matching mix by opencv to ocr and ml solutions by tesseract and ai frameworks finally to bridge the gap between controlling the gui and finding target elements the region https github com intra2net guibot blob master guibot region py module is provided it represents a subregion of a screen and contains methods to locate targets in this region using a choice of finder and interact with the graphical interface using a choice of controller supported backends supported computer vision cv backends are based on opencv https github com opencv opencv template matching contour matching feature matching haar cascade matching template feature and mixed matching tesseract ocr https github com tesseract ocr tesseract text matching through pytesseract tesserocr or opencv s bindings pytorch https github com pytorch pytorch r cnn matching through faster r cnn or mask r cnn autopy https github com msanders autopy autopy matching supported display controller dc backends are based on pyautogui https github com asweigart pyautogui autopy https github com msanders autopy vncdotool https github com sibson vncdotool xdotool https www semicomplete com projects xdotool further resources homepage http guibot org documentation http guibot readthedocs io installation https github com intra2net guibot wiki packaging issue tracking https github com intra2net guibot issues project wiki https github com intra2net guibot wiki
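to make the finder / controller / region split concrete, here is a toy, self-contained python sketch. this is not guibot's actual api — just the concept the readme describes: a finder locates a target on a screen capture and returns coordinates, and a region bridges a finder and a controller that performs the interaction:

```python
# Toy illustration of the finder/controller/region concept (NOT guibot's API).

class ExactTextFinder:
    """Finds a text target inside a screen 'capture' (here, plain text lines)."""
    def find(self, target, screen_lines):
        for y, line in enumerate(screen_lines):
            x = line.find(target)
            if x != -1:
                return (x, y)  # coordinates of the match
        return None

class LoggingController:
    """Stands in for a display backend: records clicks instead of sending them."""
    def __init__(self):
        self.clicks = []
    def click(self, x, y):
        self.clicks.append((x, y))

class Region:
    """Bridges finding and interacting, like the region module described above."""
    def __init__(self, finder, controller, screen_lines):
        self.finder, self.controller, self.screen = finder, controller, screen_lines
    def click_target(self, target):
        match = self.finder.find(target, self.screen)
        if match is None:
            raise LookupError(f"target {target!r} not found on screen")
        self.controller.click(*match)
        return match

screen = ["File  Edit  View", "   [ OK ]   [ Cancel ]"]
region = Region(ExactTextFinder(), LoggingController(), screen)
print(region.click_target("OK"))  # -> (5, 1)
```

swapping `ExactTextFinder` for a template matcher or OCR engine, and `LoggingController` for a native or vnc backend, is exactly the kind of backend substitution the library's design enables.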
ai
widgets
blockchain visualization widgets: a library of visual components to build data visualization interfaces to the blockchain data. a universal tool across all major blockchains: ethereum, tron, bitcoin, zcash, many more. tldr: try https explorer bitquery io which is open sourced https github com bitquery explorer and completely built using widgets. javascript programmers may start with jsfiddle examples https github com bitquery widgets wiki jsfiddle

screen preview: https raw githubusercontent com bitquery widgets master doc files screen2 png

every widget is a reusable visualization component to display the data from blockchain. typically every widget is displaying a result set from a graphql request.

usage: to start with, look at our wiki https github com bitquery widgets wiki you do not need programming skills to use and embed widgets; most tasks can be done by examples. to use specialized filtering and data querying you will need a basic understanding of the graphql language https graphql org

setup: clone this repo to your desktop and run npm install to install all the dependencies. for widget developers:

```bash
npm init -y
npm install webpack webpack-cli --save-dev
npm install --save graphql apollo-boost vue vue-i18n vue-google-charts vue-loader vue-template-compiler vue-template-loader lodash
npm install --save-dev sass-loader node-sass style-loader css-loader file-loader
npm install --save vis numeral
npm install --save react react-dom isomorphic-fetch graphiql
npm install --save-dev extract-text-webpack-plugin@next
npm install
```

run webpack:

```bash
node_modules/.bin/webpack
```

run webpack autocompile for dev:

```bash
node_modules/.bin/webpack --watch
```

license: you can check out the full license here https github com bitquery widgets blob master license this project is licensed under the terms of the mit license. credits: the project is developing with the binance fellowship program https binancex dev fellowship fellows html support
blockchain data widgets analytics statistics graphql
blockchain
ROS
ros

leertos: leer's rtos implementation, a task scheduler run on atmega328p. get more information on my chinese blog https leer moe 2019 05 12 ros

how to use

build: to manually build ros, the following avr and arduino tool chains are utilized: avr-gcc, avr-size, avr-objcopy and avrdude. the correct paths of these tool chains should be properly configured within the makefile. the default target all builds the target hex image, which depends on the target elf, which in turn depends on the object files; therefore the final build order is objects, then the elf, then the hex. the upload target will use the avrdude tool to upload the built hex file to the avr development board, in this case the arduino uno with atmega328p. the mcu variable can also be customized to build and upload for different development boards. then you can simply execute the command make all to both build and upload the code.

example

the following code is an example of using ros to blink two leds at different frequencies:

```c
/* blink example in ros: do not use any function in Arduino.h */
#include <avr/io.h>
#include <stdbool.h>
#include "ros.h"

ros_tcb task1;
ros_tcb task2;
uint8_t task1_stack[ROS_DEFAULT_STACK_SIZE];
uint8_t task2_stack[ROS_DEFAULT_STACK_SIZE];

#define LED1 13
#define LED2 12
#define bitSet(value, bit)   ((value) |= (1UL << (bit)))
#define bitClear(value, bit) ((value) &= ~(1UL << (bit)))
#define TASK1_PRIORITY 1
#define TASK2_PRIORITY 0 /* max priority */

void t1(void) {
    while (1) {
        /* set led1 high */
        bitSet(PORTB, 5);
        ros_delay(200);
        bitClear(PORTB, 5);
        ros_delay(200);
    }
}

void t2(void) {
    while (1) {
        bitSet(PORTB, 4);
        /* delay a second */
        ros_delay(100);
        bitClear(PORTB, 4);
        ros_delay(100);
    }
}

void setup(void) {
    /* set led 13 and led 12 as output */
    bitSet(DDRB, 5);
    bitSet(DDRB, 4);
    bool os_started = ros_init();
    if (os_started) {
        ros_create_task(&task1, t1, TASK1_PRIORITY, task1_stack, ROS_DEFAULT_STACK_SIZE);
        ros_create_task(&task2, t2, TASK2_PRIORITY, task2_stack, ROS_DEFAULT_STACK_SIZE);
        ros_schedule();
    }
}

int main(void) {
    setup();
    return 0;
}
```

the two tasks run as follows:

1. after the system initializes and creates the tasks, task1, task2 and the default task are all in the ready queue. the highest priority task, task2, is selected to run: led2 is lit, and then ros_delay(100) is encountered, so task2 will be blocked for 1 second.
2. task1 is run: led1 is lit, and then it calls ros_delay(200), which will block it for two seconds.
3. both tasks are blocked, so the default task runs.
4. after one second, task2 is woken up, continues to run, turns off led2, and then task2 will be blocked for another 1 second.
5. after two seconds, task1 and task2 are both woken up; task2, with the higher priority, runs first, lights led2 and blocks on the call ros_delay(100).
6. task1 is run: led1 is turned off, and the call ros_delay(200) blocks it again.

the two tasks continue in this order, with the effect that led1 lights up and off every 2 seconds while led2 lights up and off every 1 second. of course, this is just a simple basic example.

porting to other boards

implement the following 4 functions in your own porting file:

```c
void ros_init_timer(void);
void ros_idle_task(void);
void ros_task_context_init(ros_tcb *tcb_ptr, task_func task_f, void *stack_top);
void ros_switch_context(ros_tcb *old_tcb, ros_tcb *new_tcb);
```

related project(s): task, a full platform multi task library for c https github com leereindeer ros issues 1
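the scheduling behaviour in the numbered steps above can be mimicked in a few lines of python. this is purely illustrative: a toy strict-priority scheduler where one tick stands for 10 ms (so delay(100) is about 1 s), ignoring the default/idle task and real context switching — it is not the c implementation:

```python
# Toy model of the schedule described above: each task toggles its led and
# then sleeps in ros_delay. Priority 0 is highest, as in the example code.
def simulate(until=21):
    # (name, priority, delay_ticks): task2 mirrors delay(100), task1 delay(200)
    tasks = [("task2", 0, 10), ("task1", 1, 20)]
    wake = {name: 0 for name, _, _ in tasks}
    log = []
    for tick in range(until):
        # ready tasks run in priority order at this tick
        ready = sorted((prio, name, delay) for name, prio, delay in tasks
                       if wake[name] <= tick)
        for prio, name, delay in ready:
            log.append((tick, name))   # the task runs and toggles its led
            wake[name] = tick + delay  # then blocks in ros_delay
    return log

print(simulate(11))  # -> [(0, 'task2'), (0, 'task1'), (10, 'task2')]
```

the log reproduces the prose: at t=0 both tasks are ready and task2 runs first, at one second only task2 wakes, and at two seconds both wake with task2 again running before task1.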
rtos scheduler arduino-uno arduino avr
os
Kirti-dubey
kirti dubey technology and gaming information news
server
Amazon-Front-end-Engineer-Interview-prep
front end engineer tips

- familiarity with prominent programming languages, including the syntax of the language. pick the one you are most comfortable with and stick with it.
- know how to use languages, libraries and rendering technologies, e.g. web: javascript, templating languages, html, css, c, webviews, view frameworks.
- showcase your knowledge of front end system design, i.e. reusable components, separation of concerns (view models from business logic), application state management, and basic n-tier computing concepts (front end, middle tier, back end).
- understand the inner workings of common data structures and be able to compare and contrast their usage in various applications, e.g. retrieving json and using it to populate and power a user interface.
- research application performance concepts and technology, i.e. resource caching (images, fonts), content delivery, asynchronous programming and real user metrics.
- write syntactically correct code, no pseudo code; ensure it is scalable, robust and well tested.
- use object oriented design or functional programming, logical and maintainable code, and best practices to build lasting scalable software.
- familiarity with the devices and/or browsers which run your software: topics such as system availability, efficient resource usage (cpu, battery, screen size) and performance implications.
- familiarity with device and/or browser topics such as security, native api methods, local storage and compatibility.

technical tips

- get ready to solve in depth technical questions on concepts like front end application design, data structures and algorithms. this will likely include qualifying requirements, checking edge cases and white boarding your solutions with engineers. in person interviews tend to be more in depth than the types of questions asked during a phone interview.
- be prepared to discuss technologies listed on your resume, i.e. if you list javascript or ruby or python as technical competencies, expect technical questions about your experiences with these technologies. it is helpful to review the job description before your interview to align your qualifications against the job's specific requirements and responsibilities. when listing frameworks, be prepared to speak about these in depth and compare them to other frameworks.
- brush up on problem solving and core computer science fundamentals.

white boarding

- be prepared to white board: practice writing code, front end system design and creating rough ui wireframes. consider logical, maintainable and scalable code or design before you begin drafting.
- interact with your interviewer: you will be asked several questions related to design, so engage with your interviewer with the questions necessary to complete the exercise. dig for clarification.
- begin drawing a diagram once you've done enough digging. to begin white boarding your system design solution, start with shapes to represent different software components and data sources, and then arrows connecting them to show web services, apis and interactions between components.
- know how your solution solves the problem: if you suggest technology to help solve it, understand how that technology works.
- think out loud as you write out your code or system design; show us your ability to solve problems.
- application performance is a critical component of front end software design: consider how to store and retrieve data, client side vs server side processing, browser/device rendering efficiency and data flows. keep this in mind when diagramming and designing your software systems.
- operational performance is also a critical component of front end software design: how will you ensure this component or application is working at an acceptable level of performance? if a problem occurs, what will be involved to troubleshoot and resolve it quickly? what are the possible points of failure, and how can they be made more robust against failure?
- keep the customer front of mind: who is the customer, and what problem are you solving for them? write a list of requirements on the board and keep asking questions; this should be the first thing
you write out.

software development topics

- programming language: familiarity with a prominent language is generally a prerequisite for success, knowing the syntax of languages such as java, python, c, c++, c# or ruby. you should also know some of the language's nuances, such as how memory management works or the most commonly used collections, libraries etc. pick the one language you are most comfortable with and stick with it.
- data structures: knowledge of storing and providing access to data in efficient ways, understanding the inner workings of common data structures and being able to compare and contrast their usage in various applications, knowing the runtimes for common operations as well as how they use memory.
- algorithms: having a good understanding of the most common algorithms will likely make solving some of the questions a lot easier. review common algorithms such as traversals, divide and conquer, breadth first search vs depth first search, and understand the tradeoffs for each. knowing the runtimes, theoretical limitations and basic implementation strategies of different classes of algorithms is more important than memorizing the specific details of any given algorithm.
- coding: writing syntactically correct code, no pseudo code. get practice coding with a pen and paper, writing scalable, robust and well tested code. make sure that you check for edge cases and validate that no bad input can slip through.
- object oriented design: good software needs to be extensible and maintainable, so use object oriented design best practices. you should have a working knowledge of a few common and useful design patterns, along with how to write software in an object oriented way, and be able to defend your design choices.
- databases: figuring out how to most efficiently retrieve and store data for future use. no knowledge or any particular level of expertise is required with non relational databases, but you should be familiar with broad database concepts and their applications, knowing about tradeoffs between relational and non relational databases (dynamodb).
- distributed computing: understanding of a few basic distributed computing concepts, such as service oriented architectures, map reduce, distributed caching, load balancing and others.
- operating systems: you don't need to know how to build your own operating system from scratch, but you should be familiar with some os topics that can affect code performance, e.g. memory management, processes, threads, synchronization, paging and multithreading.
- internet topics: fundamentals of how the internet works and how browsers function at a high level, from dns lookups and tcp/ip to socket connections. having a solid understanding of the fundamentals of how the world wide web works is a requirement.
- general machine learning and artificial intelligence: data driven modeling, train/test protocols, error analysis and statistical significance. given a problem definition, you should be able to formulate it as a machine learning problem and propose a solution, including ideas for data sources, annotation, modeling approaches and potential pitfalls. understand the basic ai/ml methods and algorithms; revisit your favorite ml and ai textbooks.

recommendations

- reviewing (or, if you are like me, learning) computer science fundamentals
- practicing coding outside of an integrated development environment

resources

- cracking the coding interview is a good read
- my favorite is geeksforgeeks https www geeksforgeeks org
- if you need some help with fundamentals: freecodecamp https guide freecodecamp org javascript es6 arrow functions
- javascript array exercises, practice makes it perfect: https www w3resource com javascript exercises javascript array exercises php
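as a quick refresher for the traversal tradeoffs mentioned above, here is a small, self-contained python comparison of breadth first search and depth first search on the same graph (the graph itself is made up for illustration):

```python
# BFS vs DFS on the same graph. Both run in O(V + E) time, but BFS (queue)
# visits nodes in order of distance from the start and so finds shortest
# paths in unweighted graphs, while DFS (recursion/stack) goes deep first.
from collections import deque

graph = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}

def bfs(start):
    order, seen, queue = [], {start}, deque([start])
    while queue:
        node = queue.popleft()
        order.append(node)
        for nxt in graph[node]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return order

def dfs(start, seen=None):
    seen = seen if seen is not None else set()
    seen.add(start)
    order = [start]
    for nxt in graph[start]:
        if nxt not in seen:
            order.extend(dfs(nxt, seen))
    return order

print(bfs("a"))  # -> ['a', 'b', 'c', 'd']
print(dfs("a"))  # -> ['a', 'b', 'd', 'c']
```

being able to write both from memory, and to explain when each ordering matters, covers a large share of the traversal questions described above.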
front_end
Albert-Sentiment-Analysis
albert for sentiment analysis

dataset preparation: a tab separated (tsv) file is required with the name train, i.e. train.tsv. the train dataset needs to be placed in a folder.

how to fine tune: the following parameters are required:
1. data_dir: directory where data is stored
2. model_type: the model which we want to use for fine tuning, here we are using albert
3. model_name_or_path: the variant of albert which you want to use
4. output_dir: path where you want to save the model
5. do_train: because we are training the model

example:

```bash
python run_glue.py --data_dir data --model_type albert --model_name_or_path albert-base-v2 --output_dir output --do_train
```

different models available for use:

| model | average | squad1.1 | squad2.0 | mnli | sst-2 | race |
|---|---|---|---|---|---|---|
| v2 albert-base | 82.3 | 90.2/83.2 | 82.1/79.3 | 84.6 | 92.9 | 66.8 |
| v2 albert-large | 85.7 | 91.8/85.2 | 84.9/81.8 | 86.5 | 94.9 | 75.2 |
| v2 albert-xlarge | 87.9 | 92.9/86.4 | 87.9/84.1 | 87.9 | 95.4 | 80.7 |
| v2 albert-xxlarge | 90.9 | 94.6/89.1 | 89.8/86.9 | 90.6 | 96.8 | 86.8 |
| v1 albert-base | 80.1 | 89.3/82.3 | 80.0/77.1 | 81.6 | 90.3 | 64.0 |
| v1 albert-large | 82.4 | 90.6/83.9 | 82.3/79.4 | 83.5 | 91.7 | 68.5 |
| v1 albert-xlarge | 85.5 | 92.5/86.1 | 86.1/83.1 | 86.4 | 92.4 | 74.8 |
| v1 albert-xxlarge | 91.0 | 94.8/89.3 | 90.2/87.4 | 90.8 | 96.9 | 86.5 |

table taken from google research.

prediction: both docker and a python file are available for prediction.
1. set the name of the folder where model files are stored
2. run the api.py file: python api.py

or:

```python
from api import SentimentAnalyzer

classifier = SentimentAnalyzer()
print(classifier.predict("the movie was nice"))
```

thanks to huggingface for making the implementation simple and also google for this awesome pretrained model
ai
xcontrols-domino
xcontrols full documentation is available at xcontrols org http xcontrols org teamstudio xcontrols is a new set of controls for ibm domino xpage developers working on new xpages apps and on app modernisation projects it is effectively a re write of the teamstudio unplugged controls project but adding full support for pc browser based user interfaces as well as mobile interfaces xcontrols is built with bootstrap 3 x http getbootstrap com and the bootcards project http bootcards org it enables xpage developers to create controls that are responsive optimizing to the device whether it be a smartphone a tablet or a pc there are three key design goals for xcontrols 1 enable faster design and assembly of modern user interfaces using card list objects analogous to forms and views in traditional notes development 2 make it easy to write an xpages app which auto optimizes to smartphones tablets and pcs 3 make it easy to create xpages apps which work offline on mobile devices via the teamstudio unplugged xpages engine for ios and android the lead developer of the xcontrols is matt white mattwhite with technical design input from mark leusink and ui design input from jack herbert and steve ives the xcontrols project is tested using browserstack http browserstack com for a full release history you can view history md https github com teamstudio xcontrols domino blob master history md
front_end
TAMU-Eng-Edu-Database
tamu eng edu database a database of all texas a&m university engineering education publications
server
Velostrata-Runbook
velostrata runbook the velostrata runbook automation tool simplifies the migration of multiple vms simultaneously using the tool to test and perform your migration reduces both the complexity and risk associated with a migration from on premise to the cloud the automation tool is useful for business areas such as it services engineering and dev tests the automation tool is ideally suited to migrate the following types of environments complex applications that run on multiple vms multi tier environments with no database but with elements that have state such as load balancing and access data production applications with databases that communicate with other systems for example supply chain crm ticketing inventory publishing and collaboration systems there are several actions scenarios supported by the tool exporting inventory for configuration definition migration testing including creating local linked clones and running them in cloud migration including running the workloads in cloud migrating storage preparing for cache detach and detaching moving back to on premises that is move vms that are running in cloud back to their on premises location the velostrata runbook automation tool is built as a powershell script and can be downloaded here the automation actions run in a sequence as defined and are all re entrant that is if there is any failure in the process you can fix the problem and restart the script again full documentation is available here http docs velostrata com m 60079
cloud
klimadao
summary this repo contains 6 packages klimadao app app klimadao finance a standalone single page app for protocol interactions klimadao carbon projects a sanity cms that contains curated data for verra projects deployed to carbon projects sanity studio https carbon projects sanity studio and referenced by the carbonmark frontend and backend note unlike the other packages this one is not included as an npm workspace from the root package json to work with the cms you need to run sanity install from inside the carbon projects folder klimadao carbonmark the nextjs project that powers the carbonmark web application at carbonmark com the backend node js web service and smart contracts are located in separate repositories klimadao carbonmark api a fastify https www fastify io api acting as a backend for frontend for carbonmark klimadao carbonmark data carbon klimadao finance site klimadao cms a sanity cms that powers our blog deployed to klimadao sanity studio https klimadao sanity studio note unlike the other packages this one is not included as an npm workspace from the root package json to work with the cms you need to run sanity install from inside the cms folder klimadao lib components and utilities that are shared between packages klimadao site klimadao finance homepage content and cms powered pages requirements take note this repo utilizes newer features from node npm and typescript 4 5 node v18 17 1 npm v9 x for npm workspaces typescript 4 5 for esmodules support npm install g typescript vscode prettier extension https marketplace visualstudio com items itemname esbenp prettier vscode vscode eslint extension https marketplace visualstudio com items itemname dbaeumer vscode eslint install dependencies from the klimadao root folder not from individual packages npm install development a set of npm workspace commands are provided and can be run from the root folder npm run dev all run all workspaces with hot reloading enabled npm run dev app http localhost 3001 http 
localhost 3001 npm run dev carbonmark http localhost 3002 http localhost 3002 npm run dev carbonmark api http localhost 3003 http localhost 3003 npm run dev site http localhost 3000 http localhost 3000 npm run dev lib enable hot reload for changes to components or utils other scripts you should know about npm run build all build all workspaces npm run format all format all files with prettier npm run extract strings dev extract translation files for the source language en type generation typescript types for carbonmark and carbonmark api are generated via the generate types script in each respective project regenerate types on any of the following changes 1 models in the api model files 2 change to any of the dependent subgraphs see codegen constants ts 3 api version targeted by carbonmark translations for developers this repo uses lingui https lingui js org tutorials react html in combination with translation io https translation io we follow these rules don t use ids except for very long strings extract the source translation with npm run extract strings dev commit source language files en only targeting a local api instance if you would like carbonmark to target a local running instance of the carbonmark api you can set the url value via the next public carbonmark api url environment variable

```bash
NEXT_PUBLIC_CARBONMARK_API_URL=http://localhost:3003/api npm run dev-carbonmark
```

for translators setup create an account on translation io https translation io request access to the project atmosfearful klimadao in the content translation klimadao discord server translating log in translation io https translation io select the appropriate project klimadao site or klimadao app make sure you select the appropriate language in the top menu start translating select an item to translate in the left column on the bottom right you will notice the source text in english and the place to translate the text beneath you can prefill this by selecting an entry in the suggestion area
above pledge dashboard klima infinity pledge dashboard is backed by a firebase database follow the following to set up your dev environment if you d like to contribute to the project setup set up a firebase account and download the service account json file see here https firebase google com docs admin setup set up project and service account for more information remove the line breaks on the json file you can use this tool https www textfixer com tools remove line breaks php do not remove the n characters from the private key fields or the key will no longer be valid under site create a env local file with the environment variable firebase admin cert set to the service account json file with no line breaks it should look like this firebase admin cert type service account project id your database name private key id diagrams see this page in the wiki of this repo https github com klimadao klimadao wiki diagrams for architecture and other diagrams contributing the dao is looking for react typescript devs as well as experienced solidity devs enjoy a flexible work schedule and work on something truly ambitious and meaningful monthly compensation available based on your level of experience and degree of contribution if you d like to just take a ticket or fix a bug go for it always better to ask first though if you d like to become a regular contributor to the dao join the klimadao discord https discord com invite klimadao and follow the application instructions check out the contribution style guide https github com klimadao klimadao wiki
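the remove-the-line-breaks step for the firebase service account file can also be done locally instead of with the online tool. a minimal python sketch (the json shape below is a made-up stand-in for a real service account file) that flattens the json onto one line while keeping the \n escape sequences inside the private key intact:

```python
import json

# Illustrative cert with real line breaks and an escaped \n inside the key;
# a real service account file has more fields but the same idea applies.
raw = '{\n  "type": "service_account",\n  "private_key": "line1\\nline2"\n}'

cert = json.loads(raw)       # parsing removes the real newlines
one_line = json.dumps(cert)  # re-serialize onto a single line; \n escapes survive
print(one_line)
```

this works because json parsing only consumes the literal line breaks between fields, while the `\n` sequences inside string values are escapes that round-trip through `json.dumps` untouched — which is exactly the property the setup instructions warn about.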
os
ChatGPT-in-Academia
chatgpt in academia as the development of natural language processing nlp technologies advances large language models llms such as chatgpt have gained significant attention for their potential to aid in scientific writing however concerns have also been raised regarding the ethical implications and reliability of using llms in scientific writing this repository is created to demonstrate the policies of different publishers and conferences towards the use of llms in scientific writing this will help authors quickly understand what you should not do with llms in scientific writing through this repository we hope to facilitate an open and transparent discussion on the role of llms in scientific writing and to promote responsible and ethical use of these technologies feel free to create a pr if you have additional news to share

policy at a glance: for each venue the original table marks whether polishing your own text, literature search, generating new ideas, and listing chatgpt as an author are allowed with clear official regulations, not allowed with clear official regulations, to be used with caution without clear official regulations, or unknown.

- nature portfolio: latest updated jan 24 2023
- science: latest updated jan 26 2023
- arxiv preprint: latest updated jan 31 2023
- elsevier: latest updated n/a
- icml 2023: latest updated jan 2023
- acl 2023: latest updated jan 2023

publishing groups

nature
- 20 feb 2023: how nature readers are using chatgpt https www nature com articles d41586 023 00500 8 tldr: eighty percent of respondents have used ai chatbots and 57% say they use it for creative fun
- 17 feb 2023: how will ai change mathematics rise of chatbots highlights discussion https www nature com articles d41586 023 00487 2 tldr: machine learning tools already help mathematicians to formulate new theories and solve tough problems but they re set to shake up the field even more
- 03 feb 2023: chatgpt five priorities for research https www nature com articles d41586 023 00288 7 tldr: conversational ai is a game changer for science
- 24 jan 2023: tools such as chatgpt threaten transparent science here are our
ground rules for their use https www nature com articles d41586 023 00191 1 tldr nature portfolio has established their rules for the use of llm such as chatgpt 1 llm can not be listed as an author 2 the use of llm should be clearly described science aaas 26 jan 2023 chatgpt is fun but not an author https www science org doi 10 1126 science adg7879 tldr text generated from ai machine learning or similar algorithmic tools cannot be used in papers published in science journals nor can the accompanying figures images or graphics be the products of such tools without explicit permission from the editors arxiv 31 jan 2023 arxiv policy for authors use of generative ai language tools https blog arxiv org 2023 01 31 arxiv announces new policy on chatgpt and similar tools tldr 1 llm can not be listed as an author 2 irrespective of how the contents were generated the authors should take the responsibilities 3 report in their paper if generative ai was used for paper creating elsevier editorial policy the use of ai and ai assisted technologies in scientific writing https www elsevier com about policies publishing ethics tldr elsevier has issued a policy in response to the increasing use of generative ai and ai assisted technologies in content creation 1 authors should disclose the use of ai and ai assisted technologies in their manuscript and only use them to improve readability and language not to replace key researcher tasks 2 the authors are ultimately responsible for the contents of the work and should carefully review and edit the result of ai generated content 3 authors should not list ai and ai assisted technologies as an author or co author and should ensure the work is original and does not infringe third party rights conferences icml 2023 international conference on machine learning tldr the large language model llm policy for icml 2023 prohibits text produced entirely by llms i e generated this does not prohibit authors from using llms for editing or polishing 
author written text the llm policy is largely predicated on the principle of being conservative with respect to guarding against potential issues of using llms including plagiarism the llm policy applies to icml 2023 we expect this policy may evolve in future conferences as we understand llms and their impacts on scientific publishing better source url https icml cc conferences 2023 llm policy acl 2023 annual meeting of the association for computational linguistics tldr light editing and polishing for grammatical corrections are allowed without explicit declaration for literature search prompting with new ideas please use with caution and should make sure that relevant literatures are properly cited acl does not encourage the authors to entirely rely on llm for paper content generation for ai assisted coding the author should explicitly demonstrate its scope source url https 2023 aclweb org blog acl 2023 policy anti chatgpt tools in order to identify whether the text is from ai assisted engine such as chatgpt or human several tools are released which might help in identifying potential plagiarism and academic misconduct note these tools are not entirely reliable at the moment based on my own experience warning use with caution beware of data collection especially for those private data gptzero https gptzero me tldr api that detects whether text was generated by ai accept both files and text input and return probabilities on the sentence paragraph and document level features a minimum of 250 characters are required support pdf docx txt formats support api requests detailed scoring of the input text openai ai text classifier https platform openai com ai text classifier tldr this is an official ai classifier from openai to distinguish between ai written and human written text blog https openai com blog new ai classifier for indicating ai written text features requires a minimum of 1 000 characters which is approximately 150 250 words five levels of results very 
unlikely unlikely unclear if it is possibly or likely ai generated
chatgpt large-language-model llm scientific-writing
ai
DIG4639
DIG4639 - Mobile Development. "The prettiest people do the ugliest things / For the road to riches and diamond rings." (Kanye West)
front_end
ICT-Blog
ICT-Blog: a blog about information technology.
server
taquito
taquito logo img taquito png node js ci https github com ecadlabs taquito workflows node js 20ci badge svg https github com ecadlabs taquito actions workflows main yml codecov https codecov io gh ecadlabs taquito branch master graph badge svg https codecov io gh ecadlabs taquito cii best practices https bestpractices coreinfrastructure org projects 3204 badge https bestpractices coreinfrastructure org projects 3204 npm version https badge fury io js 40taquito 2ftaquito svg https badge fury io js 40taquito 2ftaquito welcome web3 developer what is taquito taquito is a fast and lightweight typescript https www typescriptlang org library to accelerate dapp development on the tezos https tezos com developers blockchain with it you can easily interact with smart contracts deployed to tezos it is distributed as a suite of individual npm packages to reduce bloat and improve application startup times what about smart contract development if you are a current or aspiring full stack blockchain developer be sure to check out taquito s sister project taqueria https taqueria io taqueria is a developer tool suite with rich support for smart contract development and orchestration on tezos and fully compliments taquito what is included in taquito taquito is primarily targeted at front end web3 developers so it comes with batteries included such as a react template project https github com ecadlabs taquito react template an extensible framework and many helpful utilities it can be used in many execution contexts including serverless node js deno and electron to name a few and has minimal dependencies who uses taquito taquito is used by over 80 of dapps in the tezos ecosystem it is easy to use proven secure https bestpractices coreinfrastructure org en projects 3204 security and tested continuously https github com ecadlabs taquito actions workflows main yml against current versions of tezos both mainnet and testnets why should i use taquito taquito provides convenient abstractions 
for a multitude of common operations including wallet interactions with walletconnect2 https docs walletconnect com 2 0 in the works batching operations calling into contracts querying the blockchain and more taquito will isolate your code from subtle and some not so subtle changes made to the underlying tezos protocol not to mention our thriving helpful and welcoming community ok i m ready to get started with taquito quickly visit the taquito quickstart https tezostaquito io docs quick start if you prefer to start with a skeleton project check out our taquito react template https github com ecadlabs taquito react template do you wish to make a contribution to taquito see below contributing to taquito contributors getting started supported versions of node taquito currently supports the following versions of node js version supported v12 lts v14 lts v16 13 1 v16 lts gallium 17 3 x v18 lts hydrogen v20 while other versions often work the above are what we officially support ymmv community support channels we are active and enthusiastic participants of the following community support channels ecad labs discord channel discord tezos stackexchange stackexchange project organization taquito is organized as a monorepo https en wikipedia org wiki monorepo and is composed of several npm packages that are published to npmjs org https www npmjs com package taquito taquito under the taquito handle each package has its own readme which can be found in the corresponding directory within packages high level packages responsibility taquito taquito packages taquito facade https en wikipedia org wiki facade pattern to lower level package specific functionality low level packages responsibility taquito local forging packages taquito local forging local forging serialization of tezos operations as bytes taquito michelson encoder packages taquito michelson encoder creates js abstractions of smart contracts taquito michel codec packages taquito michel codec converts michelson between 
forms expands macros etc taquito remote signer packages taquito remote signer provides the facility to use a remote signer such as https signatory io taquito rpc packages taquito rpc rpc client library every rpc endpoint has its own method taquito signer packages taquito signer provides functionality to sign data using tezos keys taquito utils packages taquito utils provides different encoding and decoding utilities taquito tzip12 packages taquito tzip12 tzip 12 allows retrieving nft token metadata taquito tzip16 packages taquito tzip16 tzip 16 allows retrieving contract metadata and executing off chain views taquito beacon wallet packages taquito beacon wallet tzip 10 implementation of a wallet api api documentation typedoc api documentation for taquito is available here https tezostaquito io typedoc versioning strategy supported versions of taquito packages are maintained for the current and next beta protocol versions taquito uses semantic versioning todo or semver but with a small twist the major version number that we use tracks the latest version of tezos the minor and patch numbers do however follow semver norms for example in a past release the protocol was at 004 and 005 was being promoted through the on chain amendment process a feature unique to tezos so at that time the current version for taquito was v4 0 0 and work commenced on version v5 0 0 beta 1 release timing when it becomes clear that the next protocol proposal will be promoted and we have implemented and tested interoperability with the new protocol we release the next version v5 0 0 beta 1 in this example before the chain transitions to the new protocol it is essential for updated packages to be released before the protocol changes so that taquito developers have time to update and test their projects during major version updates the taquito public apis may include breaking changes we endeavor to make this clear and document it in our release notes note that all previous releases are backwards 
compatible with chain data all the way back to the genesis protocol. Support for older Tezos node RPCs is maintained where feasible, but is eventually dropped. We encourage you to update older versions of Taquito, and to contact us with any technical issues that preclude doing so.

Releases

Releases are pushed to npmjs.org and the GitHub releases page. The maintainers sign all official releases: git tags and npm packages are signed either by keybase/jevonearth [2] or keybase/simrob [3]. Releases that are not signed, or are signed by other keys, should be brought to our attention immediately, please.

Contributors getting started

You would like to make a contribution to Taquito? Wonderful! Please read on.

Setup and build the Taquito project. It is important to perform the following in the stated order:

1. Install libudev-dev if developing on GNU/Linux. For Ubuntu and other Debian-based distros: `sudo apt-get install libudev-dev`. For Fedora and other RedHat-based distros: `sudo dnf install libudev-devel`. This package contains low-level files required to compile against libudev.
2. Use a suitable version of Node.js, as listed above. For example: `nvm use lts/gallium`.
3. Install lerna globally (used by our blazingly fast nx-based build system): `npm install --global lerna`.

Setup and build Taquito: now that your prerequisites have been installed, run the following commands:

```sh
npm clean-install
npm run build
```

If all goes well, the last step is to run the unit tests, which should all pass:

```sh
npm run test
```

Build gotchas:
- Do not delete node_modules manually, as this will confuse the build system (the Taquito build system is based on nx, which uses caching extensively); please use `npm run clean` instead.
- Do not use `npm install`, as it will unnecessarily update package.json. The `npm clean-install` command (or just `npm ci`) produces a stable installation of all dependencies and respects package-lock.json. This ensures a deterministic and repeatable build; it is also some 2x to 10x faster than `npm install`.
Hooray!

Useful npm command targets/scripts: see the top-level package.json "scripts" section. Common targets include:
- `npm run clean`: recursively delete all build artifacts
- `npm run test`: run the unit tests
- `npm run build`: generate bundles, typings, and create typedocs for all packages
- `npm run lint`: run the code linter (eslint)
- `npm run example`: run an example Node.js app that demonstrates all functionality

Running integration tests: the Taquito integration tests are located in the integration-tests directory. Please see the README.md in that directory for further information.

Modifying Taquito source: after making your changes to Taquito, lint and run the unit test suite. This will let you know if your changes are working well with the rest of the project:

```sh
npm run lint
npm run test
npm run commit
```

Please use `npm run commit` for your last commit before you push, as this will automagically formulate the correct commit format; a final lint and test cycle will take place before the commit is performed.

Running the website locally: you may wish to contribute to the live code examples; this explains how to do that. Note that the Tezos Taquito website [4] is built using Docusaurus [5]. To run the Taquito website in development mode locally, run the following commands from the top level:
- run `npm clean-install`
- run `npm -w taquito-website start`

Contributions

Reporting issues. Security issues: to report a security issue, please contact security@ecadlabs.com or keybase/jevonearth [2] on keybase.io; you can also encrypt your bug report using the keybase/jevonearth [2] key. Bug or feature requests: please use our GitHub issue tracker (https://github.com/ecadlabs/taquito/issues) to report bugs or request new features.

To contribute: please check the issue tracker to see if an existing issue exists for your planned contribution. If there's no issue, please create one first, and then submit a pull request with your contribution. For a contribution to be merged, it is required to have complete documentation, unit tests, and integration tests
as appropriate. Submitting a work-in-progress pull request for review/feedback is always welcome.

Disclaimer

This software is provided by the copyright holders and contributors "as is", and any express or implied warranties, including, but not limited to, any implied warranties of merchantability, noninfringement, or fitness for a particular purpose, are entirely disclaimed. In no event shall the copyright owner or contributors, or any affiliated parties or entities, be liable for any direct, indirect, incidental, special, exemplary, or consequential damages (including, but not limited to, procurement of substitute goods or services; loss of use, data, or profits; or business interruption) however caused and on any theory of liability, whether in contract, strict liability, or tort (including negligence or otherwise), arising in any way out of the use of this software, even if advised of the possibility of such damage. Persons using this software do so entirely at their own risk.

Credits

Special thanks to these libraries, which have been excellent references for developing Taquito:
- https://github.com/andrewkishino/sotez
- https://github.com/TezTech/eztz

[0]: https://github.com/ecadlabs/tezos-indexer-api
[2]: https://keybase.io/jevonearth
[3]: https://keybase.io/simrob
[4]: https://tezostaquito.io
[5]: https://docusaurus.io

Stackexchange: https://tezos.stackexchange.com/questions/tagged/taquito
Discord: https://discord.com/channels/934567382700146739/939205889901092874
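The versioning strategy described above (major version tracks the Tezos protocol number, while minor and patch follow semver norms) can be summarised in code. The following is a minimal illustrative sketch; the helper names are invented for this example and are not part of Taquito:

```python
def parse_semver(version: str):
    """Split a version like '5.0.0-beta.1' into ((5, 0, 0), 'beta.1')."""
    core, _, prerelease = version.partition("-")
    major, minor, patch = (int(part) for part in core.split("."))
    return (major, minor, patch), prerelease

def tracks_protocol(taquito_version: str, protocol_number: int) -> bool:
    """Per the scheme above, Taquito's major version should equal the
    Tezos protocol number it targets (e.g. v5.x for protocol 005)."""
    (major, _, _), _ = parse_semver(taquito_version)
    return major == protocol_number
```

For example, under this scheme `tracks_protocol("4.0.0", 4)` holds while work on protocol 005 happens on a `5.0.0-beta.1` pre-release.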
tezos typescript javascript blockchain dapps-development web3 taquito tezos-blockchain
blockchain
blockchain
blockchain [![Travis CI](https://travis-ci.org/paoloo/blockchain.svg?branch=master)](https://travis-ci.org/paoloo/blockchain)

A Clojure implementation of a blockchain, based on [Learn Blockchains by Building One](https://hackernoon.com/learn-blockchains-by-building-one-117428612f46).

Prerequisites: you will need [Leiningen](https://github.com/technomancy/leiningen) 2.0.0 or above installed.

Running: to get the dependencies, run `lein deps`; to start a server for the application, run `lein ring server`.

Usage:

Requesting the whole blockchain:

```
curl -X GET 127.0.0.1:8090/chain
```

Mining coins:

```
curl -X GET 127.0.0.1:8090/mine
```

Making a new transaction:

```
curl -X POST -H "Content-Type: application/json" -d '{"sender": "d4ee26eee15148ee92c6cd394edd974e", "recipient": "someone-other-address", "amount": 5}' http://127.0.0.1:8090/transactions/new
```

Registering a new node:

```
curl -X POST -H "Content-Type: application/json" -d '{"node": "http://127.0.0.1:8091"}' http://127.0.0.1:8090/nodes/register
```

Resolving blockchain differences in each node:

```
curl -X GET 127.0.0.1:8090/nodes/resolve
```

Tests: `lein test`.

Docker: create a self-contained version of the application with `lein ring uberjar`, run `docker build -t paoloo/blockchain .` to create the image, and finally run `docker run -p 8090:8090 paoloo/blockchain` to instantiate it.

License: MIT. Copyright (c) 2017 Paolo Oliveira.
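The /nodes/resolve endpoint applies the consensus rule from the tutorial this project follows: a node adopts the longest valid chain seen among its peers. A minimal Python sketch of that rule, assuming a hypothetical block layout (the Clojure implementation's exact data shapes may differ):

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Hash the block deterministically (sorted keys, so field order is irrelevant).
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def valid_chain(chain: list) -> bool:
    # A chain is valid when every block references the hash of its predecessor.
    for prev, block in zip(chain, chain[1:]):
        if block["previous_hash"] != block_hash(prev):
            return False
    return True

def resolve_conflicts(local_chain: list, peer_chains: list) -> list:
    # Longest-valid-chain rule: keep our chain unless a peer has a longer valid one.
    best = local_chain
    for chain in peer_chains:
        if len(chain) > len(best) and valid_chain(chain):
            best = chain
    return best
```

A node calling /nodes/resolve would effectively run `resolve_conflicts` over the chains fetched from every registered peer; tampered or shorter peer chains are rejected.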
blockchain clojure docker
blockchain
hitokage-haze-maize-meat
hitokage-haze-maize-meat

Game: https://haze-maize-meat.netlify.app
Landing page: https://haze-maize-meat.netlify.app/landingpage.html
css html javascript landing-page web-game css-animations
server
sara
Sara: C++ Computer Vision Library

[![GitLab CI build status](https://gitlab.com/oddkiva/sara/badges/master/pipeline.svg)](https://gitlab.com/oddkiva/sara/pipelines) [![GitHub Actions status](https://github.com/oddkiva/sara/actions/workflows/ci.yml/badge.svg)](https://github.com/oddkiva/sara/actions)

Sara is a Sanskrit word meaning "essence". Sara tries to focus on:

1. having an easy-to-use and simple API;
2. having easy-to-understand and efficient implementations of computer vision algorithms.

Sara is licensed with the [Mozilla Public License version 2.0](https://gitlab.com/oddkiva/sara/raw/master/LICENSE?ref_type=heads).

Documentation: you can find the API documentation [here](https://oddkiva.gitlab.io/sara). I also compile my personal notes [here](https://oddkiva.gitlab.io/sara-book), where I write down my understanding regarding the mathematical details, to explain how I implement some algorithms. In any case, you are always better off consulting the [examples folder](https://gitlab.com/oddkiva/sara/tree/master/cpp/examples) and the [test folder](https://gitlab.com/oddkiva/sara/tree/master/cpp/test).

Why yet another library? Like a few people out there, I never really liked OpenCV. I use it out of necessity, but I don't really like it. The two aspects below are always in my mind every time I write code for this library:

1. As a computer vision scientist, understanding as many algorithms as possible, and as deeply as possible, is one of my responsibilities.
2. I also try to write clear and efficient code.

This is a library made out of love, and I try to reflect this mindset as honestly as possible.

Build the libraries: please have a look at the CI scripts, like .gitlab-ci.yml or .github/workflows/ci.yml. If something does not work, help me and try fixing it :P
computer-vision python image-processing feature-detection feature-extraction multiple-view-geometry structure-from-motion cuda halide-lang swift tensorrt modern-cpp
ai
coincoin
coincoin [![Build Status](https://travis-ci.org/robinmonjo/coincoin.svg?branch=master)](https://travis-ci.org/robinmonjo/blockchain) <img align="right" src="logo.png" width="128px">

coincoin is a cryptocurrency proof-of-concept implemented in Elixir. It's an umbrella project that focuses on the 2 main components of most of the existing cryptocurrencies: the blockchain and digital transactions. Its goal is to be as simple as possible, but complete enough to technically understand what's going on behind Bitcoin or Ethereum, for example.

Setup

Blockchains are P2P software. To start using coincoin, we need to set up multiple nodes and connect them together. You need [Elixir](https://elixir-lang.org/install.html) installed. Clone this repository and go to the root of the project, then pull the dependencies using `mix deps.get`. To set up a 3-node blockchain, spawn 3 tabs in your terminal (node1, node2 and node3) and run:

```
# node1 (defaults: PORT=4000, P2P_PORT=5000)
iex -S mix phx.server

# node2
PORT=4001 P2P_PORT=5001 iex -S mix phx.server

# node3
PORT=4002 P2P_PORT=5002 iex -S mix phx.server
```

Then connect the nodes to create a P2P network:

```elixir
# node2
Blockchain.connect("localhost:5000")  # connect node2 to node1

# node3
Blockchain.connect("localhost:5001")  # connect node3 to node2
```

This will set up a simple network: node1 - node2 - node3.

You can also use the robinmonjo/coincoin Docker image available on the [Docker Hub](https://hub.docker.com/r/robinmonjo/coincoin):

```bash
docker run -it robinmonjo/coincoin
```

If you use Docker, in the `Blockchain.connect/1` call, make sure to pass your container IP address, and that this address is reachable.

Notes:
- If you don't want to interact with the REST API, you can skip the PORT env var and use `iex -S mix` instead of `iex -S mix phx.server`.
- `Blockchain.connect(5000)` is equivalent to `Blockchain.connect("localhost:5000")`.
- For releases, use `make release`.

Usage

When started, coincoin will start 3 apps:
- blockchain (apps/blockchain/README.md): a minimal blockchain
- token (apps/token/README.md): a minimal cryptocurrency implemented on top of the blockchain
- blockchain_web (apps/blockchain_web/README.md): a web interface to manage nodes of the blockchain

To manipulate the blockchain and store random data in it using the iex console, check out the blockchain app (apps/blockchain/README.md). To do the same using a REST API, check out the blockchain_web app (apps/blockchain_web/README.md). And finally, to play with a cryptocurrency and use the blockchain as a distributed ledger, check out the token app (apps/token/README.md).

Why coincoin?

Lately I heard a lot about:
1. how Elixir is awesome and is the future of complex system / web development;
2. how blockchain technology will be the next big thing.

So what about building a cryptocurrency proof-of-concept in Elixir? As I'm sure about 1, I still have some doubts about 2, even though the technologies behind cryptocurrencies are exciting. Also, "coin-coin" in French is the noise of a duck, hence Scrooge McDuck.

Final words

Issues, suggestions and pull requests are very welcome.
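The two components coincoin focuses on, a chain of blocks and the transactions recorded in them, can be illustrated with a short sketch of why such a ledger is tamper-evident. This is a generic illustration in Python, not the project's Elixir code, and all names here are invented for the example:

```python
import hashlib
import json

def sha256(data: dict) -> str:
    # Deterministic hash of a block (sorted keys so field order is irrelevant).
    return hashlib.sha256(json.dumps(data, sort_keys=True).encode()).hexdigest()

def new_block(transactions: list, previous_block=None) -> dict:
    # Each block commits to its transactions and to the hash of the previous block.
    return {
        "index": 0 if previous_block is None else previous_block["index"] + 1,
        "previous_hash": "0" * 64 if previous_block is None else sha256(previous_block),
        "transactions": transactions,
    }

def ledger_is_intact(chain: list) -> bool:
    # Changing any transaction changes its block's hash, breaking the next link.
    return all(b["previous_hash"] == sha256(a) for a, b in zip(chain, chain[1:]))
```

Appending a block with a transaction such as `{"sender": "alice", "recipient": "bob", "amount": 5}` and then editing that amount in place breaks the hash link to every later block, which is the property that lets a node use the blockchain as a distributed ledger.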
blockchain elixir phoenix cryptocurrency
blockchain
text-generation-webui-colab
please follow me for new updates https twitter com camenduru br please join our discord server https discord gg k5bwmmvjju br please join my patreon community https patreon com camenduru br wip colab colab info model page open in colab https colab research google com assets colab badge svg https colab research google com github camenduru text generation webui colab blob main vicuna 13b gptq 4bit 128g ipynb vicuna 13b gptq 4bit 128g br https vicuna lmsys org open in colab https colab research google com assets colab badge svg https colab research google com github camenduru text generation webui colab blob main vicuna 13b 1 1 gptq 4bit 128g ipynb vicuna 13b 1 1 gptq 4bit 128g br https vicuna lmsys org open in colab https colab research google com assets colab badge svg https colab research google com github camenduru text generation webui colab blob main stable vicuna 13b gptq 4bit 128g ipynb stable vicuna 13b gptq 4bit 128g br https huggingface co carperai stable vicuna 13b delta open in colab https colab research google com assets colab badge svg https colab research google com github camenduru text generation webui colab blob main gpt4 x alpaca 13b native 4bit 128g ipynb gpt4 x alpaca 13b native 4bit 128g br https huggingface co chavinlo gpt4 x alpaca open in colab https colab research google com assets colab badge svg https colab research google com github camenduru text generation webui colab blob main pyg 7b gptq 4bit 128g ipynb pyg 7b gptq 4bit 128g br https huggingface co neko institute of science pygmalion 7b open in colab https colab research google com assets colab badge svg https colab research google com github camenduru text generation webui colab blob main koala 13b gptq 4bit 128g ipynb koala 13b gptq 4bit 128g br https bair berkeley edu blog 2023 04 03 koala open in colab https colab research google com assets colab badge svg https colab research google com github camenduru text generation webui colab blob main oasst llama13b gptq 4bit 128g ipynb 
oasst llama13b gptq 4bit 128g br https open assistant io open in colab https colab research google com assets colab badge svg https colab research google com github camenduru text generation webui colab blob main wizard lm uncensored 7b gptq 4bit 128g ipynb wizard lm uncensored 7b gptq 4bit 128g br https github com nlpxucan wizardlm open in colab https colab research google com assets colab badge svg https colab research google com github camenduru text generation webui colab blob main mpt storywriter 7b gptq 4bit 128g ipynb mpt storywriter 7b gptq 4bit 128g br https www mosaicml com open in colab https colab research google com assets colab badge svg https colab research google com github camenduru text generation webui colab blob main wizard lm uncensored 13b gptq 4bit 128g ipynb wizard lm uncensored 13b gptq 4bit 128g br https github com nlpxucan wizardlm open in colab https colab research google com assets colab badge svg https colab research google com github camenduru text generation webui colab blob main pyg 13b gptq 4bit 128g ipynb pyg 13b gptq 4bit 128g br https huggingface co pygmalionai pygmalion 13b open in colab https colab research google com assets colab badge svg https colab research google com github camenduru text generation webui colab blob main falcon 7b instruct gptq 4bit ipynb falcon 7b instruct gptq 4bit br https falconllm tii ae open in colab https colab research google com assets colab badge svg https colab research google com github camenduru text generation webui colab blob main wizard lm 13b 1 1 gptq 4bit 128g ipynb wizard lm 13b 1 1 gptq 4bit 128g br https github com nlpxucan wizardlm open in colab https colab research google com assets colab badge svg https colab research google com github camenduru text generation webui colab blob main llama 2 7b chat gptq 4bit ipynb llama 2 7b chat gptq 4bit 4bit br https ai meta com llama open in colab https colab research google com assets colab badge svg https colab research google com github 
camenduru text generation webui colab blob main llama 2 13b chat gptq 4bit ipynb llama 2 13b chat gptq 4bit 4bit br https ai meta com llama br wip please try llama 2 13b chat or llama 2 7b chat or llama 2 7b chat gptq 4bit open in colab https colab research google com assets colab badge svg https colab research google com github camenduru text generation webui colab blob main llama 2 7b chat ipynb llama 2 7b chat 16bit br https ai meta com llama open in colab https colab research google com assets colab badge svg https colab research google com github camenduru text generation webui colab blob main llama 2 13b chat ipynb llama 2 13b chat 8bit br https ai meta com llama open in colab https colab research google com assets colab badge svg https colab research google com github camenduru text generation webui colab blob main redmond puffin 13b gptq 4bit ipynb redmond puffin 13b gptq 4bit 4bit br https huggingface co nousresearch redmond puffin 13b open in colab https colab research google com assets colab badge svg https colab research google com github camenduru text generation webui colab blob main stable beluga 7b ipynb stable beluga 7b 16bit br https huggingface co stabilityai stablebeluga 7b open in colab https colab research google com assets colab badge svg https colab research google com github camenduru text generation webui colab blob main doctor gpt 7b ipynb doctor gpt 7b 16bit br https ai meta com llama https github com llsourcell doctorgpt open in colab https colab research google com assets colab badge svg https colab research google com github camenduru text generation webui colab blob main code llama 7b ipynb code llama 7b 16bit br https github com facebookresearch codellama open in colab https colab research google com assets colab badge svg https colab research google com github camenduru text generation webui colab blob main code llama instruct 7b ipynb code llama instruct 7b 16bit br https github com facebookresearch codellama open in colab https 
colab research google com assets colab badge svg https colab research google com github camenduru text generation webui colab blob main code llama python 7b ipynb code llama python 7b 16bit br https github com facebookresearch codellama open in colab https colab research google com assets colab badge svg https colab research google com github camenduru text generation webui colab blob main mistral 7b instruct v0 1 8bit ipynb mistral 7b instruct v0 1 8bit 8bit br https mistral ai colab pro according to the facebook research llama license non commercial bespoke license maybe we cannot use this model with a colab pro account but yann lecun said gpl v3 https twitter com ylecun status 1629189925089296386 i am a little confused is it possible to use this with a non free colab pro account tutorial https www youtube com watch v kga7eku1xua if you encounter an indexerror list index out of range error please set the models instruction template screenshot 2023 08 28 165206 https github com camenduru text generation webui colab assets 54370274 7f619737 eb3e 4368 9b03 65836d1207f0 text generation web ui https github com oobabooga text generation webui https github com oobabooga text generation webui thanks to oobabooga models license model license vicuna 13b gptq 4bit 128g from https vicuna lmsys org the online demo is a research preview intended for non commercial use only subject to the model license https github com facebookresearch llama blob main model card md of llama terms of use of the data generated by openai and privacy practices of sharegpt please contact us if you find any potential violation the code is released under the apache license 2 0 gpt4 x alpaca 13b native 4bit 128g https huggingface co chavinlo alpaca native https huggingface co chavinlo alpaca 13b https huggingface co chavinlo gpt4 x alpaca llama 2 https ai meta com llama llama 2 is available for free for research and commercial use special thanks thanks to facebookresearch for https github com 
facebookresearch llama br thanks to lmsys for https huggingface co lmsys vicuna 13b delta v0 br thanks to anon8231489123 for https huggingface co anon8231489123 vicuna 13b gptq 4bit 128g gptq 4bit quantization of https huggingface co lmsys vicuna 13b delta v0 br thanks to tatsu lab for https github com tatsu lab stanford alpaca br thanks to chavinlo for https huggingface co chavinlo gpt4 x alpaca br thanks to qwopqwop200 for https github com qwopqwop200 gptq for llama br thanks to tsumeone for https huggingface co tsumeone gpt4 x alpaca 13b native 4bit 128g cuda gptq 4bit quantization of https huggingface co chavinlo gpt4 x alpaca br thanks to transformers for https github com huggingface transformers br thanks to gradio app for https github com gradio app gradio br thanks to thebloke for https huggingface co thebloke stable vicuna 13b gptq br thanks to neko institute of science for https huggingface co neko institute of science pygmalion 7b br thanks to gozfarb for https huggingface co gozfarb pygmalion 7b 4bit 128g cuda gptq 4bit quantization of https huggingface co neko institute of science pygmalion 7b br thanks to young geng for https huggingface co young geng koala br thanks to thebloke for https huggingface co thebloke koala 13b gptq 4bit 128g gptq 4bit quantization of https huggingface co young geng koala br thanks to dvruette for https huggingface co dvruette oasst llama 13b 2 epochs br thanks to gozfarb for https huggingface co gozfarb oasst llama13b 4bit 128g gptq 4bit quantization of https huggingface co dvruette oasst llama 13b 2 epochs br thanks to ehartford for https huggingface co ehartford wizardlm 7b uncensored br thanks to thebloke for https huggingface co thebloke wizardlm 7b uncensored gptq gptq 4bit quantization of https huggingface co ehartford wizardlm 7b uncensored br thanks to mosaicml for https huggingface co mosaicml mpt 7b storywriter br thanks to occamrazor for https huggingface co occamrazor mpt 7b storywriter 4bit 128g gptq 4bit 
quantization of https huggingface co mosaicml mpt 7b storywriter br thanks to ehartford for https huggingface co ehartford wizardlm 13b uncensored br thanks to ausboss for https huggingface co ausboss wizardlm 13b uncensored 4bit 128g gptq 4bit quantization of https huggingface co ehartford wizardlm 13b uncensored br thanks to pygmalionai for https huggingface co pygmalionai pygmalion 13b br thanks to notstoic for https huggingface co notstoic pygmalion 13b 4bit 128g gptq 4bit quantization of https huggingface co pygmalionai pygmalion 13b br thanks to wizardlm for https huggingface co wizardlm wizardlm 13b v1 1 br thanks to thebloke for https huggingface co thebloke wizardlm 13b v1 1 gptq gptq 4bit quantization of https huggingface co wizardlm wizardlm 13b v1 1 br thanks to meta llama for https huggingface co meta llama llama 2 7b chat hf br thanks to thebloke for https huggingface co thebloke llama 2 7b chat gptq gptq 4bit quantization of https huggingface co meta llama llama 2 7b chat hf br thanks to meta llama for https huggingface co meta llama llama 2 13b chat hf br thanks to localmodels for https huggingface co localmodels llama 2 13b chat gptq gptq 4bit quantization of https huggingface co meta llama llama 2 13b chat hf br thanks to nousresearch for https huggingface co nousresearch redmond puffin 13b br thanks to thebloke for https huggingface co thebloke redmond puffin 13b gptq gptq 4bit quantization of https huggingface co nousresearch redmond puffin 13b br thanks to llsourcell for https huggingface co llsourcell medllama2 7b br thanks to metaai for https ai meta com research publications code llama open foundation models for code br thanks to thebloke for https huggingface co thebloke codellama 7b fp16 br thanks to thebloke for https huggingface co thebloke codellama 7b instruct fp16 br thanks to thebloke for https huggingface co thebloke codellama 7b python fp16 br thanks to mistralai for https huggingface co mistralai mistral 7b instruct v0 1 br 
medical advice disclaimer disclaimer this website does not provide medical advice the information including but not limited to text graphics images and other material contained on this website are for informational purposes only no material on this site is intended to be a substitute for professional medical advice diagnosis or treatment always seek the advice of your physician or other qualified health care provider with any questions you may have regarding a medical condition or treatment and before undertaking a new health care regimen and never disregard professional medical advice or delay in seeking it because of something you have read on this website
colab colab-notebook colaboratory gradio llama llm alpaca koala lama llamas vicuna
ai
xcore_iot
xcore registered iot repository version https img shields io github v release xmos xcore iot include prereleases https github com xmos xcore iot releases latest issues https img shields io github issues xmos xcore iot https github com xmos xcore iot issues contributors https img shields io github contributors xmos xcore iot https github com xmos xcore iot graphs contributors prs welcome https img shields io badge prs welcome brightgreen svg style flat square https github com xmos xcore iot pulls xcore iot is a collection of c c software libraries designed to simplify and accelerate application development on xcore processors it is composed of the following components peripheral io libraries including uart i2c i2s spi qspi pdm microphones and usb these libraries support bare metal and rtos application development libraries core to dsp applications including vectorized math these libraries support bare metal and rtos application development voice processing libraries including adaptive echo cancellation adaptive gain control noise suppression interference cancellation ic and voice activity detection these libraries support bare metal and rtos application development libraries that enable multi core freertos development https www freertos org symmetric multiprocessing introduction html on xcore including a wide array of rtos drivers and middleware code examples examples showing a variety of xcore features based on bare metal and freertos programming xcore iot is designed to be used in conjunction with the xcore ai explorer board evaluation kit the example applications compile targeting this board further information about the explorer board and xcore ai devices is available on www xmos ai https www xmos ai build status build type status ci linux ci https github com xmos xcore iot actions workflows ci yml badge svg branch develop event push docs ci https github com xmos xcore iot actions workflows docs yml badge svg branch develop event push cloning some dependent
components are included as git submodules these can be obtained by cloning this repository with the following command git clone recurse submodules git github com xmos xcore iot git documentation see the official documentation https www xmos ai documentation xm 014660 pc 2 html for more information including instructions for modifying the software programming tutorials api references license this software is subject to the terms of the xmos public licence version 1 https github com xmos xcore iot blob develop license rst copyrights and licenses for third party components can be found in copyrights and licenses https github com xmos xcore iot blob develop doc shared legal rst
server
Embedded-Systems-Design
embedded systems design mct4334 compilation of exercises examples for course subject embedded systems design mct4334 iium video playlist at https youtube com playlist list plosksast2ch48tce6bepv7gtlhfi0fmjl
os
paper-computer-vision
papers of computer vision papers from arxiv cs cv http arxiv org 2018 2018 md 2019 2019 md 2020 2020 md 2021 2021 md 2022 2022 md new year 2022 is coming

aug 2022

| mon | tue | wed | thu | fri |
| --- | --- | --- | --- | --- |
| [1](2022/202208/20220801.md) | [2](2022/202208/20220802.md) | [3](2022/202208/20220803.md) | [4](2022/202208/20220804.md) | [5](2022/202208/20220805.md) |
| [8](2022/202208/20220808.md) | [9](2022/202208/20220809.md) | 10 | 11 | 12 |
| 15 | 16 | 17 | 18 | [19](2022/202208/20220819.md) |
| [22](2022/202208/20220822.md) | [23](2022/202208/20220823.md) | [24](2022/202208/20220824.md) | 25 | 26 |
| [29](2022/202208/20220829.md) | [30](2022/202208/20220830.md) | [31](2022/202208/20220831.md) | | |

sep 2022

| mon | tue | wed | thu | fri |
| --- | --- | --- | --- | --- |
| | | | [1](2022/202209/20220901.md) | [2](2022/202209/20220902.md) |
ai
simplePhpApi
api for an engineering degree project it works as a backend for a microsoft blend desktop app about potted plant info and care
server
FXFaceDetection
fxfacedetection license mit https img shields io badge license mit blue svg https raw githubusercontent com houarizegai prayertimes master license real time face detection with computer vision features x real time face detection video tracking x easy to use x responsive design x capture and save pictures on your pc thank you please star this repo and share it with others screenshots face detection face detection screenshoot screenshots face detection gif requirements java 8 opencv installation 1 press the fork button top right the page to save a copy of this project on your account 2 download the repository files project from the download section or clone this project by typing in the bash the following command git clone https github com houarizegai fxfacedetection git 3 import it in intellij idea or any other java ide and let maven download the required dependencies for you 4 setup opencv on your pc import it to your project follow this tutorial on medium https medium com aadimator how to set up opencv in intellij idea 6eb103c1d45c 5 run the application d contributing if you want to contribute to this project and make it better with new ideas your pull request is very welcomed if you find any issue just put it in the repository issue section thank you
java javafx opencv javafx-opencv face-detection face-tracking face-tracker opencv-java
ai
face-classification
h1 align center face classification h1 p align center img align center src https github com furkan gulsen face classification blob main assets faces jpg width 100 p p align center in this project one or more human faces are detected in real time and predictions are made about the faces detected by ai models trained in the background p h2 emotion recognition h2 img src https github com furkan gulsen face classification blob main outputs emotionrecognitionoutput gif raw true p i used only transfer learning models in this project to get faster and better results for emotion recognition p ul li vgg16 li li vgg19 li li resnet li li inception li li xception li ul p i experimented with transfer learning models i used the xception model in this section because it gives the best result p if you want to run real time face emotion recognition python python emotionrecognition py h2 gender classification h2 img src https github com furkan gulsen face classification blob main outputs genderclassificationoutput gif raw true p i used only transfer learning models in this project to get faster and better results for gender classification p ul li vgg16 li li vgg19 li li resnet li li inception li li xception li ul p i experimented with transfer learning models i used the vgg16 model in this section because it gives the best result p if you want to run real time gender classification python python genderclassificationwithdnn py or python python genderclassificationwithdlib py h4 license h4 p cover photo a href https www freepik com vectors people people vector created by pikisuperstar www freepik com a p
python machine-learning deep-learning tensorflow keras face-classification face-emotion-detection facenet dlib opencv emotion-recognition gender detecting-faces computer-vision age
ai
website
this template replaces readme md when someone creates a new repo with the fastpages template https github com nlp with transformers website workflows ci badge svg https github com nlp with transformers website workflows gh pages 20status badge svg https img shields io static v1 label fastai message fastpages color 57aeac labelcolor black style flat https github com fastai fastpages https nlp with transformers github io website website website for our o reilly book natural language processing with transformers https learning oreilly com library view natural language processing 9781098103231 adding new content if you d like to tweak the website s content just edit the index html file if you have docker installed you can view the changes locally by building the website with bash make server powered by fastpages https github com fastai fastpages
nlp transformers deep-learning huggingface
ai
thelocalhost
the localhost modern web development guides hints and tips follow the guides youtube playlist the post links youtube playlist https www youtube com playlist list plgi uhe v04drz58l 0o5cilalymkc7on the post https thelocalhost io 2019 10 31 build an mdx blog
gatsby mdx
front_end
GCP-Cloud-Engineering
google cloud platform cloud engineering course
cloud
EmbeddedSystemsDesign
embeddedsystemsdesign embedded systems design course spring 2016
os
mdn-fiori
mdn fiori mdn web docs front end style guide this has been merged into the main mdn web docs code repository at https github com mdn yari
styleguide mdn mdn-fiori
front_end
khs-blockchain-java-example
simple blockchain java reference implementation this repo contains an example of a blockchain without a specific vendor written in java this is a companion project to accompany the 2018 keyhole software https keyholesoftware com white paper blockchain for the enterprise there is a similar example companion project written in c https github com in the keyhole khs blockchain csharp example installation and running 1 clone repo 2 import into ide as maven project and or execute junit tests jdk 1 8 required junit tests helpers hashtest java calculate a sha256 hash string from message nonce noncetest java mine a hash using a nonce key simple chain simplechaintest java create a block chain and add and order blocks
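The JUnit helpers above (HashTest, NonceTest) hash a message together with a nonce and then brute-force a nonce for a target hash pattern. A minimal Python sketch of the same two operations (function names and the difficulty value are illustrative, not taken from the repo, which is written in Java):

```python
import hashlib

def sha256_hex(message: str, nonce: int) -> str:
    """SHA-256 hash of a message concatenated with a nonce (cf. HashTest)."""
    return hashlib.sha256(f"{message}{nonce}".encode()).hexdigest()

def mine(message: str, difficulty: int = 2) -> int:
    """Search for the first nonce whose hash starts with `difficulty` zero hex digits (cf. NonceTest)."""
    nonce = 0
    while not sha256_hex(message, nonce).startswith("0" * difficulty):
        nonce += 1
    return nonce

nonce = mine("hello block")
print(nonce, sha256_hex("hello block", nonce))
```

Each extra zero of difficulty multiplies the expected search time by 16, which is the knob proof-of-work schemes turn.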
blockchain
Udacity-image-filter-starter-code
please find the url to test the application http image filter starter code dev dev2 us east 2 elasticbeanstalk com
cloud
Database-Engineering
database engineering a series of colab notebooks focused on database engineering that cover a range of topics related to database design implementation and optimization including sql fundamentals data modeling indexing strategies and more list introduction to databases br version control
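One of the topics listed above, indexing strategies, can be demonstrated end-to-end in a notebook with Python's built-in sqlite3 module (a minimal sketch; the table, column, and index names are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")
conn.executemany("INSERT INTO users (email) VALUES (?)",
                 [(f"user{i}@example.com",) for i in range(1000)])

def plan(sql: str) -> str:
    # EXPLAIN QUERY PLAN reports whether SQLite scans the whole table or uses an index
    return " ".join(str(row) for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT id FROM users WHERE email = 'user500@example.com'"
before = plan(query)  # full-table scan: no index on email yet
conn.execute("CREATE INDEX idx_users_email ON users (email)")
after = plan(query)   # now a search using idx_users_email
print(before)
print(after)
```

The plan text switches from a table scan to an index search once the index exists; the exact wording varies slightly across SQLite versions.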
server
Hands-On-Natural-Language-Processing-with-Pytorch
hands on natural language processing with pytorch video this is the code repository for hands on natural language processing with pytorch video https www packtpub com application development hands natural language processing pytorch video utm source github utm medium repository utm campaign 9781789133974 published by packt https www packtpub com utm source github it contains all the supporting project files necessary to work through the video course from start to finish about the video course the main goal of this course is to train you to perform complex nlp tasks and build intelligent language applications using deep learning with pytorch you will build two complete real world nlp applications throughout the course the first application is a sentiment analyzer that analyzes data to determine whether a review is positive or negative towards a particular movie you will then create an advanced neural translation machine that is a speech translation engine using sequence to sequence models with the speed and flexibility of pytorch to translate given text into different languages by the end of the course you will have the skills to build your own real world nlp models using pytorch s deep learning capabilities h2 what you will learn h2 div class book info will learn text ul li processing insightful information from raw data using nlp techniques with pytorch li working with pytorch to take advantage of its maximum speed and flexibility li traditional and modern nlp methods amp tools like nltk spacy word2vec amp gensim li implementing word embedding model and using it with the gensim toolkit li sequence to sequence models used in translation that read one sequence amp produces another li usage of lstms using pytorch for sentiment analysis and how its different from rnns nbsp li comparing and analysing results using attention networks to improve your project s performance li ul div instructions and navigation assumed knowledge to fully benefit from the coverage included 
in this course you will need python programming skill some understanding of nlp basic understanding of matrix operations technical requirements this course has the following software requirements python 3 pip installer jupyter notebook this course has been tested on the following system configuration os any modern os windows mac or linux processor an intel i7 processor memory 8 16gb of ram hard disk space 250mb video card nvidia graphics card is highly recommended related products deep learning adventures with pytorch video https www packtpub com big data and business intelligence deep learning adventures pytorch video utm source github utm medium repository utm campaign 9781789138641 deep learning and neural networks in pytorch for beginners video https www packtpub com application development deep learning and neural networks pytorch beginners video utm source github utm medium repository utm campaign 9781789536249 dynamic neural network programming with pytorch video https www packtpub com application development dynamic neural network programming pytorch video utm source github utm medium repository utm campaign 9781789610314
ai
SWE-Words
swe words database the chinese english swe software engineering words database for foreign developers part i https github com yerzhanserikbay swe words database blob master part 20i md part ii https github com yerzhanserikbay swe words database blob master part 20ii md part iii https github com yerzhanserikbay swe words database blob master part 20iii md part iv https github com yerzhanserikbay swe words database blob master part 20iv md part v https github com yerzhanserikbay swe words database blob master part 20v md part vi https github com yerzhanserikbay swe words database blob master part 20vi md part vii https github com yerzhanserikbay swe words database blob master part 20vii md part viii https github com yerzhanserikbay swe words database blob master part 20viii md part ix https github com yerzhanserikbay swe words database blob master part 20ix md support if you have a question find a bug or just want to say hi please open an issue on github https github com yerzhanserikbay yerzhanserikbay github io issues new or give me feedback via email yerzhan serikbay gmail com license mit license license yerzhan serikbay
server
eth-indexer
eth indexer eth indexer is an ethereum blockchain indexer project to crawl blocks transactions state difference per block address into mysql database travis https travis ci com getamis eth indexer svg branch develop https travis ci com getamis eth indexer codecov https codecov io gh getamis eth indexer branch develop graph badge svg https codecov io gh getamis eth indexer go report card https goreportcard com badge github com getamis eth indexer https goreportcard com report github com getamis eth indexer getting started there are 3 main components in the project 1 geth modified geth to get state difference per block address 2 idx database mysql to store all indexed data 3 indexer indexer to crawl from geth then push to database prerequisites docker docker compose before building before building please make sure environment variables mysql data path and geth data path are setup properly which are used to mount local data folder to mysql and geth containers for data persistence one way to set this up is to have a env file in the same folder of the docker compose yml example env file mysql data path indexer data mysql geth data path indexer data geth configs and flags eth indexer supports two kinds of input 1 static config yaml files 2 dynamic flags through command line you can either define your configs config yml or pass flags e g indexer eth port 1234 from command line to start eth indexer if you use both settings eth indexer will load configs config yaml as default and overwrite the corresponding values with specified flags from command line build shell git clone git github com getamis eth indexer git cd eth indexer set mysql data path and geth data path environment variables docker compose build usage we use docker compose for testing and developing mysql data path geth data path environment variables are necessary create them out of eth indexer directory to store database and geth data first time to run indexer you need to create the database schema shell mkdir 
p indexer data mysql indexer data geth create database schema mysql data path home indexer data mysql docker compose up idx database idx migration press ctrl c when see eth indexer idx migration 1 exited with code 0 then use docker compose up with environment variables to start indexer shell mysql data path home indexer data mysql geth data path home indexer data geth docker compose up wait few minutes then you can see indexing messages from indexer inserted td for block number 0 td 17179869184 hash 0xd4e56740f876aef8c010b86a40d5f56745a118d0906a34e69aec8c0db1cb8fa3 example once there are some data in mysql you can query specific data from it e g you can get data from block headers and transactions table balance is slightly different and you can take a look at example example folder to see how to query them

```go
package main

import (
	"context"
	"fmt"

	"github.com/ethereum/go-ethereum/common"
	"github.com/getamis/eth-indexer/model"
	"github.com/getamis/eth-indexer/store/account"
	"github.com/getamis/eth-indexer/store/sqldb"
	"github.com/getamis/sirius/database"
	"github.com/getamis/sirius/database/mysql"
)

func main() {
	db := sqldb.New("mysql",
		database.DriverOption(
			mysql.Database("ethdb"),
			mysql.Connector(mysql.DefaultProtocol, "127.0.0.1", "3306"),
			mysql.UserInfo("root", "my-secret-pw"),
		),
	)
	addr := common.HexToAddress("0x756f45e3fa69347a9a973a725e3c98bc4db0b5a0")
	store := account.NewWithDB(db)
	account, err := store.FindAccount(context.Background(), model.ETHAddress, addr)
	if err != nil {
		fmt.Printf("failed to find account: %v\n", err)
	} else {
		fmt.Printf("find account, block number: %v, balance: %v\n", account.BlockNumber, account.Balance)
	}
}
```

erc20 is similar and you can see the test case for erc20 store balance erc20 test go to know how to use it contributing there are several ways to contribute to this project 1 find bug create an issue in our github issue tracker 2 fix a bug check our issue tracker leave comments and send a pull request to us to fix a bug 3 make new feature leave your idea in the issue tracker and discuss with us then send a pull request license this
project is licensed under the lgpl 3 see the license license file for details
golang blockchain ethereum geth indexer database
blockchain
Shelter_Mobile
shelter mobile overview a system to detect weather and potential disasters there are forest fire landslide flood and earthquake this system is able to forecast those things upon one year from now also as a platform to report it straight away features 1 various disaster prediction 2 weather prediction 3 reporting call and text 4 article news 5 account system technology stack 1 room for storage by using room for storage it will make our code cleaner because writing queries becomes simpler and directly generates an object 2 mvvm for design pattern beside we used mvvm because of google best practice mvvm could make our application more reactive we also make an improvement in model by using asynchronous view model so we can put preasync method in our view model the view can be adjusted before asynchronous task performed 3 retrofit for network because it will make the object relational mapping more clean than the other library rest client 4 clean architecture by using clean architecture we do separation of concern we divide between layers data and view so the dependency between them is loosely coupled so if there are some modification in one layer it will not make a major changes to another components machine learning documentation https github com alfalifr shelter ml cloud computing documentaion https github com alfalifr shelter cloud
front_end
duke-coursera-dennis
duke coursera dennis repo for everything related to the cloud engineering specialization at duke university
cloud
Cattle-grazing-auto-system-
cattle grazing auto system this is an automated cattle grazing system designed during my embedded systems course the system has a 7 segment display to register the number of hours the cattle have been grazing a keypad to open the paddocks 1 through 4 or 0 for all an lcd to keep track of the status of the paddocks main doors a buzzer that sounds when the time for grazing or for drinking water is up and red and green leds to represent each of the paddocks when the door is opened the cows move into the paddock graze for 4 minutes then move to the water subdivision and drink for two minutes before going back to the grazing paddock where they spend another 3 minutes and finally return to the resting area though the pattern is repeatable the 7 segment display will reset the count after each cycle
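The paddock routine described above is essentially a small timed state machine: graze 4 minutes, drink 2 minutes, graze 3 minutes, then rest. A Python sketch of the cycle timing (durations from the README; the 0-minute resting placeholder and all names are illustrative, since the real system drives the keypad, LCD, LEDs, and buzzer from a microcontroller):

```python
# Timed grazing cycle: graze 4 min -> drink 2 min -> graze 3 min -> rest, then repeat.
CYCLE = [("grazing", 4), ("drinking", 2), ("grazing", 3), ("resting", 0)]

def schedule(cycles: int):
    """Yield (elapsed_minutes, state) transitions; the buzzer fires at each one."""
    t = 0
    for _ in range(cycles):
        for state, minutes in CYCLE:
            yield t, state  # real firmware would update the LEDs/LCD here
            t += minutes
        # the 7-segment hour counter resets after each full cycle, as described above

events = list(schedule(1))
print(events)
```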
os
AAGPT
aagpt auto agent gpt a k a aagpt is another experimental open source application showcasing the capabilities of large language models strong support general tasks overcooked game strong language english readme md docs readme cn md div align center img src assets demo png width 200 height 200 img src assets demo overcooked000 png width 200 height 200 div features memory support gpt as memory vector database as memory requires a pinecone api key lifespan limit for an agent may save money support for playing the overcooked game installation to install aagpt follow these steps 1 clone the aagpt repository from github and navigate to the downloaded folder bash git clone git github com hyintell aagpt git cd aagpt 2 use the following command in your terminal with pip bash pip install r requirements txt quickplay just two steps you can start using aagpt s natural language processing abilities with your openai api key 1 open the setup game yaml file and enter your openai api key in the openai api key field 2 navigate to the aagpt folder and run the following command bash python aagpt py setup after installing aagpt you will need to set up related apis to use the application you can do this by following these steps 1 navigate to the setup folder in the aagpt directory bash cd setup 2 in the setup folder there are two game settings game yaml which uses chatgpt as memory store and game2 yaml which uses pinecone as memory store you can choose one of them to set up the api 3 in the game yaml file you will use gpt as memory store so please fill in the following information openai api key your openai api key if you don t have one you can create a free account and get an api key from the openai website openai model the openai chatgpt model to use choose from gpt 3 5 turbo gpt4 or text davinci 003 env openai api key openai chatgpt key for env you can keep same as the common agent openai api key openai chatgpt key for agents you can keep same as the common goal the main
objective of the ai agent init task the initial tasks to be appended to the task list agent life the life time of the agents in default we set it to 256 note optionally you can use game2 yaml which using pinecone as memory store in addition to the above settings you will need to fill in the following information agent pinecone api key the form will be a list your pinecone api your pinecone region the first is pinecone api and second will be the region of your index you can get it from the pinecone website agent pinecone index the index name of the pinecone index to use in default we use aagpt agent index playing overcooked div align center img src assets demo overcooked000 png width 200 height 200 div 1 install opencooking https github com hyintell opencooking envs 2 let s play bash python aagpt overcooked py render usage after setting the correct apis you can test aagpt by executing the aagpt py file in your terminal bash python aagpt py once aagpt is running you can start interacting with it by typing in prompts and observing its responses if you want to change the setup or memory setting you can use the following command bash python aagpt py world root setup game2 yaml todo ui more memory support multi agent support more llms support acknowledgement we are deeply grateful for the contributions made by open source projects auto gpt https github com significant gravitas auto gpt and babyagi https github com yoheinakajima babyagi references auto gpt babyagi gym cooking overcookedgpt overcooked ai
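Pieced together from the field descriptions above, a game.yaml could look roughly like this (a hedged sketch: the exact key spelling and nesting should be checked against the file shipped in the setup folder, and the goal/task values are only placeholders):

```yaml
openai_api_key: "sk-..."        # your OpenAI API key
openai_model: "gpt-3.5-turbo"   # or gpt4 / text-davinci-003
env:
  openai_api_key: "sk-..."      # can be the same as the common key
agent:
  openai_api_key: "sk-..."      # key used by the agents
goal: "placeholder: main objective of the AI agent"
init_task:
  - "placeholder: initial task appended to the task list"
agent_life: 256                 # default lifespan from the README
```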
gpt-35-turbo gpt-4 agent llms
ai
ehr
ehr health information sharing technology
server
itba-cloud-data-engineering-python-data-app
building a data system with airflow the goal of this practice is to build a system that logs the daily price of different stocks instructions 1 setup airflow using the official docker compose yaml file this can be obtained here https airflow apache org docs apache airflow stable start docker html docker compose yaml before starting we suggest changing one line in the docker compose file to disable the automatic loading of examples to avoid ui clutter airflow core load examples true to airflow core load examples false by default this setup creates a dags directory in the host from which airflow obtains the dag definitions after the initial setup an airflow instance should be reachable in http localhost 8080 home the default username and password are airflow airflow 2 create another database in the postgres used by airflow to store the stocks data 3 develop the data model to store the daily stock data symbol date open high low close using sqlalchemy s declarative base then create the tables in the db 4 create a python class similar to the sqliteclient of the practical airflow coursework in order to connect with the postgres db you should implement the same methods present in the sqliteclient bonus try to write a parent base db api class and make the sqlite and postgres client inherit from it 5 develop a dag that obtains the price information of google goog microsoft msft and amazon amzn and then inserts the data in the database using the python class developed in the previous point for this we suggest using the following api https www alphavantage co the python files that define the dags should be placed in the dags directory previously mentioned 6 add another task that depends on the first one that fetches data from the database of the last week and produces a plot of the value of each stock during said time period 7 add two unit tests runnable with pytest https docs pytest org that can be run from the commandline one that tests the extraction this refers to the
formatting that takes place after the data is fetched from the api that is to be inserted in the db another for the aggregation of the values used for plotting after they are extracted from the db 8 implement a ci step using github actions https docs github com en actions to run the unit tests using pytest each time a commit is pushed to a branch in a pr in case of failure the result should be visible in github s merge request ui extras using the suggested docker compose setup you can access the database using airflow as both username and password in the following way sudo docker compose exec airflow webserver psql h postgres password for user airflow psql 11 13 debian 11 13 0 deb10u1 server 13 4 debian 13 4 4 pgdg110 1 warning psql major version 11 server major version 13 some psql features might not work type help for help airflow in the same way you can open a shell to work inside the docker containers using sudo docker compose exec airflow webserver bin bash this can be useful when creating the tables that will hold the data when connecting to the db from inside the container you can use the default value of the airflow core sql alchemy conn variable defined in the compose file bonus points if you want to go an extra mile you can do the following add the configs for pylint https pylint org and black https black readthedocs io en stable implement a ci step using github actions https docs github com en actions to run pylint each time a commit is pushed to a branch in a pr
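Step 4's bonus (a shared base DB-API class that SQLite and Postgres clients inherit from) can be sketched with only the standard library. Everything below is illustrative, not coursework code; a Postgres subclass would return a psycopg2 connection and go through a cursor instead of sqlite3's connection.execute shortcut:

```python
import sqlite3
from abc import ABC, abstractmethod

class BaseDbClient(ABC):
    """Shared execute/query logic; subclasses only know how to connect."""

    def __init__(self):
        self._conn = self._connect()

    @abstractmethod
    def _connect(self):
        """Return a DB-API connection (sqlite3 here, psycopg2 for Postgres)."""

    def execute(self, sql, params=()):
        cur = self._conn.execute(sql, params)
        self._conn.commit()
        return cur.fetchall()

class SqliteClient(BaseDbClient):
    def __init__(self, path=":memory:"):
        self.path = path
        super().__init__()

    def _connect(self):
        return sqlite3.connect(self.path)

client = SqliteClient()
client.execute("CREATE TABLE stocks (symbol TEXT, date TEXT, close REAL)")
client.execute("INSERT INTO stocks VALUES (?, ?, ?)", ("GOOG", "2022-01-03", 2899.83))
print(client.execute("SELECT symbol, close FROM stocks"))
```

Keeping the connection and query plumbing in the parent means the DAG code from step 5 only ever sees `execute`, regardless of which backend is plugged in.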
cloud
12-text-processing
core skills data science springboard day 12 special data types natural language processing binder https mybinder org badge svg https mybinder org v2 gh core skills 12 text processing git master overview aims 1 gain a practical understanding of traditional and modern natural language processing techniques 2 develop an intuition for knowledge graphs and ontologies 3 familiarisation with basic text handling and processing such as lemmatisation stemming etc 4 gain intuition towards word vectors and their applications in natural language processing 5 develop an understanding of supervised learning using modern tools such as huggingface 6 develop an understanding of unsupervised learning using latent topic models schedule awst aest agenda 07 30 07 45 09 30 09 45 q a issues announcements 07 45 09 15 09 45 11 15 12 0 overview of nlp 09 15 09 30 11 15 11 30 morning tea 09 30 11 00 11 30 13 00 12 1 2 fundamentals of nlp 11 00 11 45 13 00 13 45 lunch 11 45 13 15 13 45 15 15 12 3 supervised learning 13 15 13 30 15 15 15 30 afternoon tea 13 30 14 30 15 30 16 30 12 4 unsupervised learning 14 30 14 55 16 30 16 55 closeout feedback 14 55 15 00 16 55 17 00 menti feedback 15 00 15 00 17 00 17 00 closeout instructions for setting up interactive google colab notebooks the following steps assume that you have access to a google account e g have a gmail email copying the code repository from github 1 go to https github com core skills 12 text processing and click on the green button clone or download download the repository zip file and unzip 2 you should now have a folder with the following structure 12 text processing data reviews data txt gz walden txt wamex xml zip handouts coreskills week 12 nlp pptx notebooks 12 1 an introduction to the nltk ipnyb 12 2 1 gensim word vector visualisation ipynb 12 2 2 wordembeddings ipynb 12 3 lda gswa ipynb 12 4 pytorchsentimentanalysis ipynb ds store license readme md environment yml download embedding data 1 download the embedding data glove 6b 
100d txt from the following google drive link https drive google com open id 1l9byied6jbtw3p4ywmnyvt178lup5mx9 2 place this file into 12 text processing data folder in the structure shown above copy the 12 text processing folder into google drive 1 copy the entire folder 12 text processing into your google drive my drive the path at the top of the drive should look like my drive 12 text processing master ensuring that google colab notebooks function correctly 1 in your google drive go to the folder my drive 12 text processing notebooks open the first notebook 12 1 an introduction to the nltk ipynb by right clicking on its name and selecting open with google colaboratory this will open a new tab in your browser 2 with the notebook open get familiar with mounting your google drive to the notebook we ll need to do this for the majority of the notebooks to ensure we can access the data for this particular notebook this is done in the first cell under the heading loading text data 3 press shift enter to execute the cell or click on the play button on the upper left hand side of the cell this will prompt you to follow a url and get an activation code to permit mounting the drive once you have done this it will not be required again and will permit you to access files on your google drive resources mentioned in the course 1 centre for transforming maintenance through data science ctmtds https www maintenance org au 2 ctmtds theme 1 support the maintainer wei tyler nlp tlp https www maintenance org au category rt1 3 uwa natural technical language processing group https nlp tlp org 4 industrial ontologies maintenance working group https www industrialontologies org page id 92 5 interactive word2vec embedding visualisation tool https ronxin github io wevi 6 huggingface https huggingface co 7 huggingface notebooks https github com huggingface transformers tree master notebooks 8 allen institute of artificial intelligence ai2 demos https demo allennlp org reading comprehension
bidaf elmo 9 gpt 3 language model demos https beta openai com examples dependencies used in the notebooks 1 spacy industrial strength natural language processing https spacy io 2 gensim topic modelling for humans https radimrehurek com gensim 3 nltk natural language toolkit https www nltk org 4 pytorch binary cross entropy loss bceloss https pytorch org docs stable nn html bceloss 5 pytorch recurrent neural network rnn module https pytorch org docs stable nn html rnn 6 cuda framework for gpu training https developer nvidia com cudnn 7 cuda supported gpus https developer nvidia com cuda gpus
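Course aim 4 is intuition for word vectors. A tiny pure-python sketch of the core idea (similar words have higher cosine similarity) is below; the 4-dimensional vectors and word list are invented for illustration, whereas the real GloVe 6B 100d file downloaded above holds 100-dimensional vectors, one word followed by its floats per line.

```python
# toy illustration of the word-vector intuition from the course aims:
# semantically similar words should have a higher cosine similarity.
# the 4-d vectors below are made up; real GloVe 6B vectors have 100 dims.
import math

def cosine(u, v):
    """Cosine similarity: dot product divided by the product of the norms."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

vectors = {
    "king":  [0.9, 0.8, 0.1, 0.2],
    "queen": [0.8, 0.9, 0.1, 0.3],
    "apple": [0.1, 0.0, 0.9, 0.8],
}

# "king" sits closer to "queen" than to "apple" in this toy space
assert cosine(vectors["king"], vectors["queen"]) > cosine(vectors["king"], vectors["apple"])
```

The same comparison run over the downloaded glove 6b 100d txt embeddings is essentially what the gensim visualisation notebook does at scale.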
ai
Hello-World
hello world learning git through dataquest io database engineering certification 230728
server
blockchain101
blockchain101 https taibiaoguo gitee io blockchain101 https taibiaoguo github io blockchain101 a rel license href http creativecommons org licenses by nc sa 4 0 img alt style border width 0 src https i creativecommons org l by nc sa 4 0 88x31 png a br a rel license href http creativecommons org licenses by nc sa 4 0 4 0 a
blockchain course mooc
blockchain
docker-nginx-php-lecture
itp 405 deeper dive into docker demo intro in class we have played a bit with laravel sail a local development tool powered by docker in the demo we ll look at a more generalized approach to deploying php applications with nginx and a cgi server this is more popular in the real world don t get caught up too much in the big words that are being thrown around just know this reverse proxy or nginx will refer to a web server that receives the request cgi server or php fpm will be the thing that actually executes our php code for us this demonstration is just meant to be a brief overview of what might go into deploying a laravel app in production i hope you get something out of learning what it takes to build a dockerfile the goal is to understand docker better not to become an nginx or cgi server expert high level overview on what we will build docker container with our application and two processes running nginx and php fpm that will run on heroku nginx is our reverse proxy server that will handle and forward requests to our cgi server php fpm php fpm is our cgi server that will receive the request from nginx and actually run our php application important note general best practices usually say to separate nginx and php fpm into separate containers to keep this demonstration simple we will keep them in one container prerequisites docker heroku cli background on nginx reverse proxies cgi servers what is a proxy server and a reverse proxy proxy server intermediary server that forwards requests for content from multiple clients to different servers broad reverse proxy type of proxy server that forwards requests to the appropriate backend server motivating factors for running a laravel app in front of a reverse proxy php laravel s built in development server is not meant for production proxying to multiple applications ie some requests go to your laravel app but others go to a django app logging well known proxy servers such as nginx and apache are built
for a production environment in general speed security configuration provided out of the box they abstract lots of concerns away from developers we will be using nginx as a reverse proxy server but don t worry you will not need any background if the configuration file looks scary that s totally ok you don t need to know what everything means this is just demonstrating how one might go about deploying containerized laravel apps in production review the nginx conf file look at the commented block above the root property this points to the public subdirectory within our laravel code look at the index property pointing to index php this is our entrypoint into our application look at the comment block that starts with cgi stuff this is where requests get proxied to our php fpm cgi server dockerfile pseudo code 1 install php fpm in container 2 install nginx in container 3 copy over our app code 4 copy over configuration files required 5 set the default command to first start nginx and then php fpm
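The pseudo code steps above might translate into a dockerfile along these lines. This is only a sketch, not the repo's actual file: the base image tag, the config file name, and the paths are assumptions.

```dockerfile
# sketch only -- image tag, file names, and paths are illustrative
# 1. php-fpm comes preinstalled in the official php fpm image variant
FROM php:8.2-fpm

# 2. add nginx alongside php-fpm in the same container (kept together
#    here for simplicity, as noted above)
RUN apt-get update && apt-get install -y nginx && rm -rf /var/lib/apt/lists/*

# 3. copy over our app code (laravel's public/ subdirectory is what
#    the nginx root property points at)
COPY . /var/www/html

# 4. copy the nginx config whose "cgi stuff" block proxies *.php
#    requests to php-fpm
COPY nginx.conf /etc/nginx/nginx.conf

# 5. start nginx first (it daemonizes by default), then php-fpm,
#    which stays in the foreground and keeps the container alive
CMD nginx && php-fpm
```

In a best-practice setup nginx and php-fpm would live in separate containers talking over a network, with nginx's fastcgi_pass pointing at the php-fpm container instead of localhost.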
server
interactive-map-dog-poison-in-belgrade-app
getting started with create react app this project was bootstrapped with create react app https github com facebook create react app available scripts in the project directory you can run npm start runs the app in the development mode open http localhost 3000 http localhost 3000 to view it in the browser the page will reload if you make edits you will also see any lint errors in the console npm test launches the test runner in the interactive watch mode see the section about running tests https facebook github io create react app docs running tests for more information npm run build builds the app for production to the build folder it correctly bundles react in production mode and optimizes the build for the best performance the build is minified and the filenames include the hashes your app is ready to be deployed see the section about deployment https facebook github io create react app docs deployment for more information npm run eject note this is a one way operation once you eject you can t go back if you aren t satisfied with the build tool and configuration choices you can eject at any time this command will remove the single build dependency from your project instead it will copy all the configuration files and the transitive dependencies webpack babel eslint etc right into your project so you have full control over them all of the commands except eject will still work but they will point to the copied scripts so you can tweak them at this point you re on your own you don t have to ever use eject the curated feature set is suitable for small and middle deployments and you shouldn t feel obligated to use this feature however we understand that this tool wouldn t be useful if you couldn t customize it when you are ready for it learn more you can learn more in the create react app documentation https facebook github io create react app docs getting started to learn react check out the react documentation https reactjs org code splitting this section has moved 
here https facebook github io create react app docs code splitting https facebook github io create react app docs code splitting analyzing the bundle size this section has moved here https facebook github io create react app docs analyzing the bundle size https facebook github io create react app docs analyzing the bundle size making a progressive web app this section has moved here https facebook github io create react app docs making a progressive web app https facebook github io create react app docs making a progressive web app advanced configuration this section has moved here https facebook github io create react app docs advanced configuration https facebook github io create react app docs advanced configuration deployment this section has moved here https facebook github io create react app docs deployment https facebook github io create react app docs deployment npm run build fails to minify this section has moved here https facebook github io create react app docs troubleshooting npm run build fails to minify https facebook github io create react app docs troubleshooting npm run build fails to minify
server
Pngyu
pngyu is front end gui application of pngquant pngyu is distributed under the bsd license
front_end
Node.js-Blog-Engine
this repository has moved you can find the code here https github com croach nodejs step by step code or within the main repository which contains the code as well as the scripts for this series here https github com croach nodejs step by step
front_end
A2D1_B2C2_2022BP1
api todolist in the todolist folder you will find a simple application with this application it is possible to keep a todolist the application was developed in net core 7 0 this is an example of what a layered application could look like see the last chapter below with further reading links on various topics from this assignment explanation of the existing application datalayer the data is currently stored in a text file in json format this is a simple storage mechanism and not suitable for multithreading or complex relations as would be possible with an rdbms for this example it suffices the actual dal is the jsondal in which all crud operations on the file are implemented the json file is automatically saved in your bin folder if the file is not found dummy data is created if you want to reset your data simply delete the file next to the dal there is an idataaccesslayer interface this was made so that other dals can be implemented in the future suppose you want to build an mssql dal you can do so based on the interface there is a dalsingleton class which following the singleton pattern ensures that you do not have to create your dal over and over again models this contains the only entity of this application it is a class worked out with properties and methods where the methods influence the behavior of the class the status property in your own application this should look the same program cs this is the program that currently executes all crud operations once todolist app explanation http img youtube com vi uh39jzkxyg4 0 jpg https youtu be uh39jzkxyg4 assignment the goal is to build an api for this existing application to do this carry out the following steps the steps are also shown in the video video to get started 1 fork the repository by clicking the fork button at the top right and following the steps you have now made a copy of the repository in your own github account
fork create fork png 2 get the code into visual studio by clicking code open in visual studio open code opencodeownrepo png 3 review the code based on the explanation above make sure you understand which components there are and how everything works run the console application and check that everything works as expected make a fork http img youtube com vi uiftnnkojko 0 jpg https youtu be uiftnnkojko build the api see also the video below 1 create a new project in your solution choose asp net core web api make sure you pick c give the project a suitable name for example todoapi do not change any of the default settings for the project open code createapiproject png 2 find the new project in your solution and right click it choose set as startup project when the solution is started this will be the project that runs so the api will start verify that this happens 3 investigate which components have been generated in your new project make sure you understand what everything is for 4 right click the controllers folder and add a controller class choose api controller with read write actions give the controller a suitable name such as todocontroller naming conventions indicate that it must end with controller 5 run your application again and see in the swagger ui that your new controller and its endpoints are recognized immediately there is however no logic in the api yet so nothing useful will happen when you call them 6 go to the dependencies of the todoapi project and add a project reference to the todolistmodel 7 implement the generated methods in such a way that your existing model project todolistmodel is used 8 create extra endpoints to assign a task to someone and to finish a task 9 test the api with the swagger ui make a fork http img youtube com vi ypsychl0ie0 0 jpg https youtu be ypsychl0ie0 deep dive status codes if you use http every request has a status code see here a list of
status codes https en wikipedia org wiki list of http status codes see also https docs microsoft com en us dotnet api system net httpstatuscode view net 6 0 see this documentation for the how to https docs microsoft com en us aspnet core web api action return types view aspnetcore 6 0 1 adjust your endpoint so that the created object is returned and you get a matching status code video deep dive documentation add the xml comments from your code to the documentation https learn microsoft com en us aspnet core tutorials getting started with swashbuckle view aspnetcore 6 0 tabs visual studio xml comments deep dive calling the api we have now called the api with the swagger ui in practice an api will be called by another system for example a frontend web mobile or another application try the application postman to call the api implement a simple console application yourself that calls the api https docs microsoft com en us aspnet web api overview advanced calling a web api from a net client deep dive authentication investigate how you can use authentication on the api https docs microsoft com en us aspnet web api overview security authentication and authorization in aspnet web api implement this yourself or try creating a new api project with the authentication checkbox ticked when creating the project and see what happens further reading json https www json org json en html interface https docs microsoft com en us dotnet csharp language reference keywords interface singleton pattern https en wikipedia org wiki singleton pattern fork https docs github com en get started quickstart fork a repo http status codes https en wikipedia org wiki list of http status codes http methods https en wikipedia org wiki representational state transfer semantics of http methods swagger https swagger io all videos https youtube com playlist list plv3 439d8hgalu3zye61x4 qsfvz17xkl
front_end
Principles-of-Machine-Learning-Python
principles of machine learning python principles of machine learning python
ai
sapper
sapper sapper https sapper svelte dev is deprecated in favor of its successor sveltekit https kit svelte dev which we recommend using instead financial support to support sapper sveltekit and the rest of the svelte ecosystem please consider contributing via opencollective https opencollective com svelte license mit license
front_end
Metacognitive-Prompting
metacognitive prompting improves understanding in large language models this repository contains datasets model descriptions the full set of prompts used in experiments and corresponding experimental results datasets we utilize multiple natural language understanding datasets for our experiments selected from glue https gluebenchmark com and superglue https super gluebenchmark com for evaluation purposes we utilize the development set corresponding to each task the overview of datasets is shown below task dataset input output metric sentiment sst 2 single sentence binary accuracy similarity sts b sentence pair continuous pearson spearman correlation paraphrase qqp question pair binary f1 accuracy qa nli qnli question passage binary accuracy nli wnli rte cb sentence pair binary ternary f1 accuracy wsd wic sentence pair target word binary accuracy coref wsc passage pronouns binary accuracy qa copa question choices binary accuracy here qa stands for question answering nli is natural language inference wsd is word sense disambiguation and coref is coreference resolution datasets can be obtained in datasets datasets models in our evaluation we consider five popular large language models llms the open source models llama 2 13b chat https arxiv org pdf 2307 09288 pdf and vicuna 13b v1 1 https lmsys org blog 2023 03 30 vicuna and the closed source models palm bison chat https arxiv org pdf 2305 10403 pdf gpt 3 5 turbo and gpt 4 https arxiv org pdf 2303 08774 pdf for all models we apply greedy decoding i e temperature 0 for response generation prompts metacognitive prompting mp is inspired by human introspective reasoning processes the figure below shows the alignment between human metacognitive processes and the stages of mp for llms div align center img width 75 alt image src https github com eternityyw metacognitive prompting blob main image sources human llm metacognition png div mp consists of five main stages 1 understanding the input text 2 making a preliminary
judgment 3 critically evaluating this preliminary analysis 4 reaching a final decision accompanied by an explanation of the reasoning and 5 evaluating the confidence level in the entire process a sample question chosen from the quora question pair qqp dataset demonstrates the overall mp process div align center img width 90 alt image src https github com eternityyw metacognitive prompting blob main image sources mp illustration png div the diagram features three columns from left to right representing the high level metacognitive stages specific metacognitive prompts fed into the llm and the llm s corresponding outputs prompts in the middle column are collectively fed into the llm as a single input during the experiments for our experiments we compare our proposed mp with standard prompting sp and chain of thought cot prompting each of these is conducted under zero shot and 5 shot learning settings for exemplars used under 5 shot learning settings they are randomly selected from the training set of each dataset each dataset has its own set of exemplars where exemplar answers are obtained through human annotation the full set of prompts used when applying mp sp and cot under zero shot and 5 shot learning paradigms can be found in prompts prompts experimental results the experimental results for each dataset can be found in results results for each dataset we experiment with three prompting methods in zero shot and 5 shot learning scenarios across five llms we report the best result after multiple experimental iterations please refer to our full paper https arxiv org pdf 2308 05342 pdf for more details citation if you find this work helpful please consider citing as follows ruby article wang2023metacognitive title metacognitive prompting improves understanding in large language models author wang yuqing and zhao yun journal arxiv preprint arxiv 2308 05342 year 2023
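Since the five stages are fed to the model collectively as a single input, assembling an MP prompt is just string concatenation. A sketch is below; the stage wording is paraphrased from the stage descriptions above for illustration, while the exact text used in the experiments lives in the repo's prompts directory.

```python
# sketch of assembling the five metacognitive stages into one prompt.
# stage wording is paraphrased for illustration; the repo's prompts/
# directory holds the exact text used in the experiments.
STAGES = [
    "1. clarify your understanding of the input text.",
    "2. make a preliminary judgment about the answer.",
    "3. critically evaluate that preliminary analysis.",
    "4. give your final decision and explain your reasoning.",
    "5. state your confidence level in the entire process.",
]

def build_mp_prompt(task_instruction, input_text):
    """Concatenate task, input, and the five stages into one model input."""
    return "\n".join([task_instruction, f"input: {input_text}", *STAGES])

prompt = build_mp_prompt(
    "decide whether the two questions are paraphrases.",
    "q1: ... q2: ...",
)
assert prompt.count("\n") == 6  # task line + input line + five stage lines
```

For the 5-shot setting, the same template would simply be preceded by the annotated exemplars before the test input.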
natural-language-understanding prompting
ai
SWVA
swva engineering database website
server
EDAutopilot-v2
edautopilot an autopilot bot for elite dangerous based on opencv python notice this program only works in elite dangerous horizons currently windows 11 compatible structure v2 struct img struct jpg what can it do auto align with the displayed navigation circle run the robigo sightseeing mission unmanned with support for full automated process showcase here https streamable com p8mhoz accelerated no manual key input at all provide a gamesession api which you can write your own autonomous script see game py game py for api reference examples on the way you can fork it as you wish to make changes getting started 1 download the source code git clone https github com matrixchung edautopilot git 1 make sure you have applied the useful settings useful settings best recommended because i write this entire project in those settings 2 run pip install r requirements txt to install all dependencies 3 run the gui entry python gameui py 4 click scripts load and select robigo py for robigo sightseeing mission 5 when you are at robigo mines enter the starport menu and go to passenger lounge then simply click the home button and you can just sit back 6 you can terminate the running process anytime when you click the end button sometimes it may not kick in when in a sleep loop 7 you can leave the process unsupervised but you should keep the mouse cursor on the client screen and manually takeover when it comes to interdict you 8 see the variables variables for any self configurable variables tips in order not to get interdicted you should only choose low value target mission which is displayed in the mission details the other traits can be ignored cause we are in the outpost and won t be scanned currently i m annoyed at money grabbing stuffs for my fleet carrier so currently this project will only play its role as a simple sightseeing mission bot given that i have already created a gamesession api we can expect more features like multi hop jumping assist for long distance travel etc 
in the future and if you want you can fork it to do any modifications you want variables robigo py scripts robigo py table tr th variable th th description th th default th tr tr th isdebug th th provides a useful debug window th th true th tr tr th showprocessedimg th th show the extra compassimg and navpointimg for detailed debug th th true th tr tr th usingwatchdog th th watchdog can help you force exit when being interdicted or attacked th th true th tr tr th stateoverride th th manually start from the given state especially after an unexpected exit th th empty th tr tr th firstjumpdest th th middle destination from robigo to sothis th th based on ship th tr tr th thirdjumpdest th th middle destination from sothis to robigo th th based on ship th tr tr th maxmissioncount th th max acceptable missions count depending on your ship s cabin config th th 8 th tr tr th missioncountoverride th th for any unread missions or improper mission count e g former process is killed by watchdog but newly start game won t record any ongoing mission th th 0 th tr table useful settings optimized graphics settings 1 1600x900 windowed and in primary screen 2 1920x1080 desktop resolution 3 detailed graphics settings graphics 1 img graphics1 jpg graphics 2 img graphics2 jpg graphics 3 img graphics3 jpg i don t know whether i used some 3rd party graphics quality patch a long time ago so the graphics may still be different from yours even if all in game settings are correct feel free to make issue if you still encounter problems such as can t recognize some images etc in game settings 1 set the navigation filter in the first panel to only stations and points of interest 2 bookmark sothis a 5 and robigo mines in the galaxy map which means you have to run the mission yourself for at least once to discover the destination planet you can see sothis a 5 templates robigo map sothis a 5 png and robigo mines templates robigo map robigom png for example 3 set the interface brightness to level 6 
7 in right panel 4 panel ship pilot preferences robigo mission preferred ship python https s orbis zone i1dm note for robigo mission whatever your ship is make sure its jumping capability is enough to provide a two hop route which means only one middle destination for a single way fov setting your appdata path local frontier developments elite dangerous options graphics settings xml fov 56 249001 fov keybinds the program will automatically parse your keybinds settings guicolor setting your steam library path steamapps common elite dangerous products elite dangerous 64 graphicsconfiguration xml xml guicolour default localisationname standard localisationname matrixred 0 0 15 1 matrixred matrixgreen 0 1 0 matrixgreen matrixblue 1 0 04 0 28 matrixblue default redtobluetest localisationname redtobluetest localisationname matrixred 0 0 1 matrixred matrixgreen 0 1 0 matrixgreen matrixblue 1 0 0 matrixblue redtobluetest desaturatetest localisationname desaturatetest localisationname matrixred 0 33 0 33 0 33 matrixred matrixgreen 0 33 0 33 0 33 matrixgreen matrixblue 0 33 0 33 0 33 matrixblue desaturatetest guicolour screenshots screenshot 1 img screenshot1 jpg screenshot 2 img screenshot2 jpg credits skai2 https github com skai2 edautopilot for initial idea and directinput system
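The guicolour block above is a 3x3 color matrix that graphicsconfiguration xml applies to the HUD's red, green, and blue channels, which is also why template colors matter to an opencv bot. A sketch of parsing one mode and applying its matrix is below; the CamelCase tag names (`GUIColour`, `MatrixRed`, ...) are assumed from the snippet above and should be checked against your own file.

```python
# sketch: read one GUIColour mode's 3x3 matrix from GraphicsConfiguration.xml
# and apply it to an rgb triple. tag names are assumed from the snippet above.
import xml.etree.ElementTree as ET

SAMPLE = """
<GUIColour>
  <Default>
    <MatrixRed>0, 0, 1</MatrixRed>
    <MatrixGreen>0, 1, 0</MatrixGreen>
    <MatrixBlue>1, 0, 0</MatrixBlue>
  </Default>
</GUIColour>
"""

def load_matrix(xml_text, mode="Default"):
    """Parse the three matrix rows of the given GUIColour mode."""
    node = ET.fromstring(xml_text).find(mode)
    return [
        [float(x) for x in node.find(tag).text.split(",")]
        for tag in ("MatrixRed", "MatrixGreen", "MatrixBlue")
    ]

def apply_matrix(matrix, rgb):
    # each output channel is a weighted mix of the input channels
    return tuple(sum(m * c for m, c in zip(row, rgb)) for row in matrix)

m = load_matrix(SAMPLE)
# this red<->blue swap matrix turns pure red into pure blue
assert apply_matrix(m, (1.0, 0.0, 0.0)) == (0.0, 0.0, 1.0)
```

This is why the template images shipped with the bot have to match the HUD color scheme: a non-default matrix shifts every pixel the screen capture sees.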
autopilot elite-dangerous elitedangerous elite-journal python image-recognition opencv
ai
udagram
udagram udagram is a simple instagram like serverless cloud application developed alongside the udacity cloud engineering nanodegree it allows users to register and log into a web client post photos to the feed the project is split into two parts 1 the simple frontend https github com mmoustafa salama udagram tree master frontend a basic ionic client web application which consumes the backend api 2 the backend api https github com mmoustafa salama udagram tree master backend a serverless service which can be deployed to aws getting setup tip this frontend is designed to work with the backend api https github com mmoustafa salama udagram tree master backend it is recommended you stand up the backend first test using postman and then the frontend should integrate installing node and npm this project depends on nodejs and node package manager npm before continuing you must download and install node npm is included from https nodejs com en download https nodejs org en download installing ionic cli the ionic command line interface is required to serve and build the frontend instructions for installing the cli can be found in the ionic framework docs https ionicframework com docs installation cli installing project dependencies this project uses npm to manage software dependencies npm relies on the package json file located in the root of this repository after cloning open your terminal and run bash npm install tip npm i is shorthand for npm install configure the backend endpoint ionic uses environment files located in src environments environment ts to load configuration variables at runtime by default environment ts is used for development and environment prod ts is used for production the apihost variable should be set to your backend api url running the development server ionic cli provides an easy to use development server to run and autoreload the frontend this allows you to make quick changes and see them in real time in your browser to run the development
server open terminal and run bash ionic serve building the static frontend files ionic cli can build the frontend into static html css javascript files these files can be uploaded to a host to be consumed by users on the web build artifacts are located in www to build from source open terminal and run bash ionic build features 1 register 2 login 3 post an image to the newsfeed 4 listing the posts features to be implemented 1 edit post 2 remove post 3 add edit delete comments 4 like unlike posts 5 search users 6 follow unfollow users 7 add ui improvements postman collection udagram postman collection json https github com mmoustafa salama udagram tree master backend udagram postman collection json
cloud
ReactJS_VideoGuides
react js web development the essentials bootcamp course logo course logo udemy png official repository guide to accompany the video lessons of the react js web development the essentials bootcamp course take the course here https www udemy com react js and redux mastering web apps https www udemy com react js and redux mastering web apps commit by lecture guide https github com 15dkatz react essentials bootcamp commits https github com 15dkatz react essentials bootcamp commits this breaks down the course one commit at a time per lecture for an easier checkpoint troubleshooting experience what you ll learn updated for 2022 2023 learn how to code with react redux react hooks and more from an engineer with 5 years of industry experience modern redux in 2022 2023 modern syntax and best practices react hooks in 2023 2023 explore fundamental hooks and build hooks from scratch learn react in 2022 2023 the right way and learn best practices from an engineer with 5 years of industry experience create industry relevant projects that you can use on your portfolio and resume access 3 hours of in depth javascript material to hone your js skills learn react the right way and learn best practices from an engineer with 5 years of industry experience modern react in 2022 2023 createstore functional components etc explore the react engine and learn how it works under the hood to better understand the virtual dom state props etc learn how to build applications from scratch setting up your own react app template see how react fits in the big picture of web development with a ton of detailed overviews on what is happening in the browser and the react engine this provides the completed projects for portfolio music master react app template evens or odds starter react hooks in this course you will dive into react code right away you will gain relevant experience as soon as the first section time is precious and i want to make sure that you ll never feel like you re wasting it in this 
course so in a matter of minutes you will be writing react code in the first section with a fully completed app by the end of it understand how react fits in the big picture of web development in the second section you will take an important step back and examine how react fits in the big picture of web development you ll build a react project from scratch discovering all the layers that are in between the supplies that support the react app and the browser which displays the react app create relevant and compelling react apps i m betting you ll find the apps both useful and interesting useful ones like the portfolio app will help you both learn react and be valuable as a completed project for your software engineering and web developer profile fun ones like music master will make coding lively giving you apps you want to show off to your friends and family
reactjs redux es6 tutorial
front_end
MLJ.jl
div align center img src material mljlogo2 svg alt mlj width 200 div h2 align center a machine learning framework for julia p align center a href https github com alan turing institute mlj jl actions img src https github com alan turing institute mlj jl workflows ci badge svg alt build status a a href https alan turing institute github io mlj jl dev img src https img shields io badge docs stable blue svg alt documentation a a href https opensource org licenses mit img src https img shields io badge license mit yellow alt license a a href bibliography md img src https img shields io badge cite bibtex blue alt bibtex a p h2 mlj machine learning in julia is a toolbox written in julia providing a common interface and meta algorithms for selecting tuning evaluating composing and comparing about 200 machine learning models https alan turing institute github io mlj jl dev model browser model browser written in julia and other languages new to mlj start here https alan turing institute github io mlj jl dev integrating an existing machine learning model into the mlj framework start here https alan turing institute github io mlj jl dev quick start guide to adding models wanting to contribute start here contributing md phd and postdoc opportunities see here https sebastian vollmer ms jobs mlj was initially created as a tools practices and systems project at the alan turing institute https www turing ac uk in 2019 current funding is provided by a new zealand strategic science investment fund https www mbie govt nz science and technology science and innovation funding information and opportunities investment funds strategic science investment fund ssif funded programmes university of auckland awarded to the university of auckland mlj has been developed with the support of the following organizations div align center img src material turing logo png width 100 img src material uoa logo png width 100 img src material iqvia logo png width 100 img src material warwick png width 100
img src material julia png width 100 div the mlj universe the functionality of mlj is distributed over several repositories illustrated in the dependency chart below these repositories live at the juliaai https github com juliaai umbrella organization div align center img src material mlj stack svg alt dependency chart div dependency chart for mlj repositories repositories with dashed connections do not currently exist but are planned proposed br p align center a href contributing md contributing a nbsp nbsp a href organization md code organization a nbsp nbsp a href roadmap md road map a br contributors core design a blaom f kiraly s vollmer lead contributor a blaom active maintainers a blaom s okon t lienart d aluthge
machine-learning julia pipelines tuning data-science tuning-parameters predictive-modeling classification regression statistics clustering stacking ensemble-learning pipeline
ai
libremdb
libremdb a free open source imdb front end inspired by projects like teddit https codeberg org teddit teddit nitter https github com zedeus nitter and many others similar projects img src public img misc preview jpg title screenshot desktop screen light mode width 1500 img src public img misc preview2 jpg title screenshot mobile screen dark mode width 400 some features no ads or tracking browse any movie info without being tracked or bombarded by annoying ads modern interface modern interface with curated colors supporting both dark and light themes responsive design be it your small mobile or big computer screen it s fully responsive lightweight up movie page https imdb com title tt1049413 tested on firefox v104 without scroll simulated regular 4g network tab stats libremdb imdb no of requests 22 180 data transferred gzipped 468kb 1 88mb load event fired in 6 22s 10 01s instances prettier ignore instance url region notes 1 clearnet libremdb iket me https libremdb iket me canada operated by me libremdb pussthecat org https libremdb pussthecat org germany operated by pussthecat org https pussthecat org ld vern cc https ld vern cc us operated by vern https vern cc binge whatever social https binge whatever social us germany operated by whatever social https whatever social libremdb lunar icu https libremdb lunar icu germany cloudflare operated by lunar icu https lunar icu libremdb jeikobu net https libremdb jeikobu net germany cloudflare operated by shindouj https github com shindouj lmdb hostux net https lmdb hostux net france operated by hostux net https hostux net binge whateveritworks org https binge whateveritworks org germany cloudflare operated by whateveritworks https github com whateveritworks libremdb nerdyfam tech https libremdb nerdyfam tech us operated by nerdyfam tech https nerdyfam tech libremdb tux pizza https libremdb tux pizza us operated by tux pizza https tux pizza libremdb frontendfriendly xyz https libremdb frontendfriendly xyz mdash operated by
frontendfriendly xyz https frontendfriendly xyz d opnxng com https d opnxng com singapore operated by opnxng https about opnxng com libremdb catsarch com https libremdb catsarch com us operated by butter cat https catsarch com 2 onion ld vernccvbvyi5qhfzyqengccj7lkove6bjot2xhh5kajhwvidqafczrad onion http ld vernccvbvyi5qhfzyqengccj7lkove6bjot2xhh5kajhwvidqafczrad onion us operated by vern https vern cc 3 i2p vernz3ubrntql4wrgyrssd6u3qzi36zrhz2agbo6vibzbs5olk2q b32 i2p http vernz3ubrntql4wrgyrssd6u3qzi36zrhz2agbo6vibzbs5olk2q b32 i2p us operated by vern https vern cc questions you might have how do i use it replace imdb com in any imdb url with any of the instances for example imdb com title tt1049413 https imdb com title tt1049413 to libremdb iket me title tt1049413 https libremdb iket me title tt1049413 to avoid changing the urls manually you can use extensions automatic redirection why is it so slow whenever you request info about a movie show on libremdb 4 trips are made 2 between your browser and libremdb s server and 2 between libremdb s server and imdb s server instead of the usual 2 trips when you visit a website for this reason there s a noticeable delay this is a bit of an inconvenience you ll have to face should you wish to use this website it doesn t have all routes i ll implement more with time is content served from third parties like amazon nope libremdb proxies all image and video requests through the instance to avoid exposing your ip address browser information and other personally identifiable metadata contributor https github com httpjamesm why not just use imdb refer to the features section some features above why didn t you use other databases like tmdb https www themoviedb org or omdb https www omdbapi com imdb simply has a superior dataset compared to all other alternatives with that being said i d encourage you to check out those alternatives too privacy information collected none information stored in your browser a key named theme is stored in
local storage provided by your browser if you ever override the default theme to remove it go to site data settings and clear the data for this website to permanently disable libremdb from storing your theme preferences either turn off javascript or disable access to local storage for libremdb information collected by other services none libremdb proxies images anonymously through the instance for maximum privacy contributor https github com httpjamesm to do add advanced search route x add did you know and reviews on movie info page x add a way to see trailer and other videos implement movie specific routes like reviews including critic reviews video image gallery sections under did you know release info parental guide implement other routes like lists moviemeter x person info includes directors and actors company info user info x use redis or any other caching strategy x implement a better installation method x serve images and videos from libremdb itself installation as libremdb is made with next js you can deploy it anywhere where next js is supported below are a few other methods manual 1 install node js and git for node js visit their website https nodejs org en for git run sudo apt install git if you re on a debian based distro else visit their website https git scm com 2 install redis optional you can install redis from here https redis io 3 clone and set up the repo bash git clone https github com zyachel libremdb git replace github com with codeberg org if you wish cd libremdb change the configuration file to your liking cp env local example env local replace pnpm with yarn or npm if you use those pnpm install pnpm build pnpm start optional if you re using redis redis server libremdb will start running at http localhost 3000 to change port modify the last command like this pnpm start p port number docker local you can build the docker image using the provided dockerfile thanks to httpjamesm https github com httpjamesm and set it up using the example
docker compose file docker compose example yml change the docker compose file to your liking and run docker compose up d to start the container that s all docker built there s a docker image https github com pussthecat org docker libremdb quay made by thefrenchghosty https github com thefrenchghosty for pussthecat org s instance https libremdb pussthecat org you can use that as well miscellaneous automatic redirection redirector https github com einaregilsson redirector config description redirect imdb to libremdb example url https www imdb com title tt0258463 ref tt sims tt t 4 include pattern https www imdb com redirect to https libremdb iket me 2 pattern type regular expression libredirect https github com libredirect libredirect privacy redirector https github com dybdeskarphet privacy redirector similar projects teddit https codeberg org teddit teddit teddit is an alternative reddit front end focused on privacy nitter https github com zedeus nitter nitter is a free and open source alternative twitter front end focused on privacy bibliogram https sr ht cadence bibliogram bibliogram is an alternative front end for instagram invidious https invidious io invidious is an alternative front end to youtube libreddit https github com spikecodes libreddit libreddit is an alternative private front end to reddit scribe https git sr ht edwardloveall scribe scribe is an alternative medium frontend full list rarr https github com digitalblossom alternative frontends contact i m available on matrix https matrix to ninal matrix org and email mailto aricla protonmail com in case you wish to contact me personally license licensed under gnu agplv3 see license license for full legalese
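the url substitution described under how do i use it above can be sketched in a few lines of python the instance url below is just one example from the instance list any instance works the same way

```python
# Sketch of the "replace imdb.com with an instance" rule described above.
# The default instance is one example from the list; any instance works.
from urllib.parse import urlsplit, urlunsplit

def to_libremdb(url: str, instance: str = "https://libremdb.iket.me") -> str:
    """Rewrite an IMDb URL so it points at a libremdb instance instead."""
    parts = urlsplit(url)
    inst = urlsplit(instance)
    # Keep the original path and query string; swap in the instance's
    # scheme and host.
    return urlunsplit((inst.scheme, inst.netloc, parts.path, parts.query, ""))

print(to_libremdb("https://imdb.com/title/tt1049413"))
# https://libremdb.iket.me/title/tt1049413
```

in practice the redirector extensions listed under miscellaneous do this rewrite for you automatically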
alternative-frontends front-end privacy foss imdb scraping sass typescript
front_end
SecuML
secuml https anssi fr github io secuml https anssi fr github io secuml secuml is a python tool that aims to foster the use of machine learning in computer security it is distributed under the gpl2 license it allows security experts to train detection models easily and comes with a web user interface to visualize the results and interact with the models secuml can be applied to any detection problem it requires as input numerical features representing each instance it supports binary labels malicious vs benign and categorical labels which represent families of malicious or benign behaviours benefits of secuml secuml relies on scikit learn https www scikit learn org stable index html to train the machine learning models and offers the additional features web user interface diagnosis and interaction with machine learning models active learning rare category detection hide some of the machine learning machinery automation of data loading feature standardization and search of the best hyperparameters what you can do with secuml training and diagnosing a detection model before deployment with diadem annotating a dataset with a reduced workload with ilab exploring a dataset interactively with rare category detection clustering projection computing descriptive statistics of each feature see the sphinx documentation https anssi fr github io secuml for more detail papers beaugnon anaël and pierre chifflier machine learning for computer security detection systems practical feedback and solutions https www ssi gouv fr uploads 2018 11 machine learning for computer security abeaugnon pchifflier anssi pdf computer electronics security applications rendez vous c esar 2018 beaugnon anaël pierre chifflier and francis bach end to end active learning for computer security experts https hal archives ouvertes fr hal 01888983 file idea18 paper1 beaugnon pdf kdd workshop on interactive data exploration and analytics idea 2018 extended version of aics 2018 beaugnon anaël pierre chifflier
and francis bach end to end active learning for computer security experts https www ssi gouv fr uploads 2018 02 end to end active learning for computer security experts abeaugnon pchifflier fbach anssi inria pdf aaai workshop on artificial intelligence for computer security aics 2018 beaugnon anaël pierre chifflier and francis bach ilab an interactive labelling strategy for intrusion detection https www ssi gouv fr uploads 2017 09 ilab beaugnonchifflierbach raid2017 pdf international symposium on research in attacks intrusions and defenses raid 2017 french bonneton anaël and antoine husson le machine learning confronté aux contraintes opérationnelles des systèmes de détection https www sstic org media sstic2017 sstic actes le machine learning confront aux contraintes oprat sstic2017 article le machine learning confront aux contraintes oprationnelles des systmes de dtection bonneton husson pdf symposium sur la sécurité des technologies de l information et des communications sstic 2017 phd dissertation beaugnon anaël expert in the loop supervised learning for computer security detection systems https www ssi gouv fr uploads 2018 06 beaugnon a these manuscrit pdf ph d thesis école normale supérieure 2018 presentations french beaugnon anaël appliquer le machine learning de manière pertinente à la détection d intrusion https www cert ist com pub files forum2017 03 anael beaugnon machine learning pdf forum annuel du cert ist cert ist 2017 bonneton anaël machine learning for computer security experts using python scikit learn http pyparis org talks html 39d62c68337f89d3c879fff02b88e23b pyparis 2017 authors anaël beaugnon anael beaugnon ssi gouv fr pierre collet pierre collet ssi gouv fr antoine husson antoine husson ssi gouv fr
machine-learning intrusion-detection interactive-machine-learning active-learning rare-category-detection malware-detection gui
ai
fullstack-backend
fullstack backend for part3 repo for university of helsinki course csm14108 full stack web development node backend app in part3 phonebook application deployed on heroku https uoh fullstack part3 phonebook herokuapp com
server
PET-Exercises
pet exercises exercises in privacy enhancing technologies ucl information security msc course compga17 how to install and run the exercises in order to run the labs you will need an ubuntu linux virtual machine ensure you have a working installation of python 2 7 and pip install packages pytest and petlib git clone this repository and follow the instructions in each of the exercise directories
server
AndroidBaseProject
androidbaseproject a base project to fast track mobile development this project is set up to implement the model view presenter architecture for mobile development it is also suggested to follow the packaging by feature style of structure libraries setup include android support library 25 3 1 https developer android com topic libraries support library revisions html retrofit networking http square github io retrofit butterknife 8 8 1 http jakewharton github io butterknife contribution guide guide to contributing to cottacush android projects https github com cottacush android guidelines blob master project style guidelines md checkstyle h5 existing projects h5 to set up checkstyle for already existing projects kindly follow the instructions below cd to the root directory of your project do mkdir config cd you can also do mkdir config then subsequently cd config or whichever way is convenient for you to create the config directory and cd into it do curl l o https link to raw baseproject checkstyle xml and curl l o https link to raw baseproject prepush checks sh to download the checkstyle and prepush script respectively into your config directory in your app build gradle file add the line apply from config quality gradle at the top just below apply plugin com android application in your project s build gradle file add the checkstyle dependency com puppycrawl tools checkstyle 7 1 2 in your root directory do chmod u x config prepush checks sh then do ln s pwd config prepush checks sh git hooks pre push h5 for fresh projects h5 to set up checkstyle for fresh projects that are offspring of this baseproject s post lint era i e have the scripts in the appropriate directory in your root directory do chmod u x config prepush checks sh then do ln s pwd config prepush checks sh git hooks pre push subsequently git pushes to the master repo will either fail or pass for failed pushes find the issues at path to repo app build reports checkstyle checkstyle xml and handle accordingly
h6 note the appropriate url for the checkstyle xml and prepush check scripts will be updated after both files are hosted on this repository and this note will be removed h6
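the chmod and ln steps above wire the script in as a git pre push hook here is a small self contained demo of that wiring in a scratch directory using a stand in script since the real checkstyle xml and prepush checks sh come from the hosted repository

```shell
# Demo of the pre-push hook wiring in a scratch directory; the stand-in
# script below substitutes for the real config/prepush-checks.sh.
set -e
demo="$(mktemp -d)"
mkdir -p "$demo/config" "$demo/.git/hooks"
cd "$demo"
printf '#!/bin/sh\nexit 0\n' > config/prepush-checks.sh
chmod u+x config/prepush-checks.sh
# Symlink into .git/hooks so git runs the checks before every push.
ln -sf "$(pwd)/config/prepush-checks.sh" .git/hooks/pre-push
.git/hooks/pre-push && echo "hook wired"
```

the symlink means later edits to config prepush checks sh take effect without re installing the hook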
front_end
lab-iac-data-lake
welcome to the your project name here repo template project for new dataops repositories how this repo is organized the structure below is designed to minimize clutter and create separation between the various layers of the project while still ensuring common resources are easy to locate 1 data this is generally speaking the most important folder in the repo it contains all business logic source mappings data transformations and data analyses generally speaking project specific source code should live here but engine code should not 1 data taps logic to extract load data into a data lake s3 or azure using the tap target paradigm from singer https www singer io 2 data transforms transform data using sql using dbt www getdbt com to orchestrate the transforms 3 data analyses any queries reports notebooks or analysis files which may be related to the project but not part of any automated execution schedule storing files here allows them to be checked by the ci cd pipeline to automatically detect any accidental breakages 2 docs a place to store documentation for use by other developers or consumers of the repo 3 infra terraform scripts which manage the cloud infrastructure needed by the project except when making changes to infrastructure you can safely ignore this folder 4 logs this is the default location for local log output 5 tests contains sql queries tests used to validate the integrity of output data 6 tools special tools or scripts that assist in automating certain processes 7 dot folders 1 secrets this is a special folder whose contents are not committed to git you can safely store secrets into this folder if needed for a project do not store secrets anywhere in the repo besides the secrets folder 2 other folders that begin with a dot these folders store temporary files or tool specific configs gradle git github vscode etc quick intro to gradle the gradle www gradle org build tool is a simple and extensible build automation tool which streamlines repetitive 
tasks for more info on how to use gradle with this project see the gradle overview and how to at tools gradle readme md tools gradle readme md
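the tap target paradigm mentioned under data taps moves data as a stream of json messages over stdout a minimal sketch of singer style tap output is below the stream and field names are invented for illustration and the real taps in this project will differ

```python
# Minimal sketch of Singer-style tap output: a SCHEMA message describes a
# stream, RECORD messages carry rows. Stream/field names are illustrative.
import json
import sys

def schema_msg(stream, properties, key_properties):
    return {"type": "SCHEMA", "stream": stream,
            "schema": {"properties": properties},
            "key_properties": key_properties}

def record_msg(stream, row):
    return {"type": "RECORD", "stream": stream, "record": row}

def emit(message):
    # Taps write one JSON message per line to stdout.
    sys.stdout.write(json.dumps(message) + "\n")

emit(schema_msg("users",
                {"id": {"type": "integer"}, "name": {"type": "string"}},
                ["id"]))
emit(record_msg("users", {"id": 1, "name": "ada"}))
```

a target or a loader writing to s3 azure reads these lines from stdin and persists them which is what keeps taps and targets independently swappable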
cloud
UdacityDataWarehouseProject
project 3 sparkify data warehouse context a music streaming startup sparkify has grown their userbase and song database and wants to move their processes and data onto the cloud using amazon s architecture currently their data resides in amazon s3 in a directory of json logs on user activity taken from their app as well as a directory of json metadata on their songs in the app project description this project builds an etl pipeline for a database hosted on redshift the pipeline extracts the song and user data from amazon s3 stages the full data set in redshift and then transforms the data into a set of fact and dimension tables this will give the analytics team the ability to query the data warehouse in an efficient way to garner insights about the app getting started first you will need to update the credentials file with your aws details namely the access key and the secret install python3 and aws sdk you will then need to do the following on aws create clients for ec2 s3 iam and redshift create an iam user iam role create a redshift cluster extract the arn to do the above you will need to follow the instructions and run the jupyter file redshift iac ipynb note that you will only run step 5 clean up your resources after you are finished creating your data warehouse this will ensure that you won t run up unnecessary costs for running your redshift cluster run the below files from the command line create tables py etl py you can also run the jupyter notebook run me ipynb which is a jupyter notebook version which will run the above two files as well as explore the created tables using inline sql the data s3 to understand the data that you have on s3 and to identify which database schema to use you can run the jupyter notebook file data viewer ipynb which will allow you to browse the s3 data in a dataframe the user data has the following 18 fields artist auth first name gender item in session last name length level location method page registration session id song status ts
user agent user id none logged in walter m 0 frye nan free san francisco oakland hayward ca get home 1540919166796 38 none 200 1541105830796 mozilla 5 0 macintosh intel mac os x 10 9 4 39 the song data has the following 10 fields artist id artist latitude artist location artist longitude artist name duration num songs song id title year arvbrgz1187fb4675a nan nan gwen stefani 290 55955 1 sorrzgd12a6310dbc3 harajuku girls 2004 the above data is gleaned from the following aws s3 buckets log data s3 udacity dend log data log jsonpath s3 udacity dend log json path json song data s3 udacity dend song data redshift database the data is first staged into two bigger tables song staging and event staging which includes all fields the dimension and fact tables are then pulled from the 2 comprehensive staging tables to create fact table songplays songplay id start time user id level song id artist id year duration dimension tables songs song id title artist id year duration users user id first name last name gender level artists artist id name location latitude longitude time start time hour day week month year weekday for the data types in the sql tables to remove complexity i just consistently used 3 data types varchar int and bigint for additional performance enhancements distkeys and sortkeys were added to the tables table sizes staging tables table name no rows staging songs 14896 staging events 8056 database tables table name no rows songplays 333 songs 14896 artists 10025 users 104 time 8023
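the staging to star schema step described above boils down to sql run against redshift below is a hedged sketch of what the songplays load might look like column names follow the tables listed above but the actual queries in create tables py and etl py may differ

```python
# Hypothetical sketch of the staging -> fact table load; the real query in
# etl.py may differ. Columns follow the songplays table listed above, and
# the 'NextSong' page filter is an assumption about the event log.
songplay_table_insert = """
    INSERT INTO songplays
        (start_time, user_id, level, song_id, artist_id, year, duration)
    SELECT e.ts, e.user_id, e.level, s.song_id, s.artist_id, s.year, s.duration
    FROM staging_events e
    JOIN staging_songs s
      ON e.song = s.title AND e.artist = s.artist_name
    WHERE e.page = 'NextSong';
"""

def run_query(cursor, query):
    """Execute one pipeline query; cursor would be a psycopg2 cursor
    connected to the Redshift cluster."""
    cursor.execute(query)
```

joining the two staging tables on song title and artist name is what resolves each log event to a song id and artist id before the row lands in the fact table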
server
fuselage
p align center a href https rocket chat title rocket chat img src https github com rocketchat rocket chat artwork raw master logos 2020 png logo horizontal red png alt rocket chat a p h1 align center fuselage monorepo h1 issues https img shields io github issues rocketchat fuselage style flat square pull requests https img shields io github issues pr rocketchat fuselage style flat square github commit activity https img shields io github commit activity m rocketchat fuselage style flat square package description version dependencies rocket chat css in js packages css in js toolset to transpile and use css on runtime npm https img shields io npm v rocket chat css in js style flat square https www npmjs com package rocket chat css in js deps https img shields io librariesio release npm rocket chat css in js style flat square rocket chat css supports packages css supports memoized and ssr compatible facade of css supports api npm https img shields io npm v rocket chat css supports style flat square https www npmjs com package rocket chat css supports deps https img shields io librariesio release npm rocket chat css supports style flat square rocket chat emitter packages emitter event emitter by rocket chat npm https img shields io npm v rocket chat emitter style flat square https www npmjs com package rocket chat emitter deps https img shields io librariesio release npm rocket chat emitter style flat square rocket chat eslint config alt packages eslint config alt eslint configuration for rocket chat repositories npm https img shields io npm v rocket chat eslint config alt style flat square https www npmjs com package rocket chat eslint config alt deps https img shields io librariesio release npm rocket chat eslint config alt style flat square rocket chat fuselage packages fuselage rocket chat s react components library npm https img shields io npm v rocket chat fuselage style flat square https www npmjs com package rocket chat fuselage deps https img shields io 
librariesio release npm rocket chat fuselage style flat square rocket chat fuselage hooks packages fuselage hooks react hooks for fuselage rocket chat s design system and ui toolkit npm https img shields io npm v rocket chat fuselage hooks style flat square https www npmjs com package rocket chat fuselage hooks deps https img shields io librariesio release npm rocket chat fuselage hooks style flat square rocket chat fuselage polyfills packages fuselage polyfills a bundle of useful poly ponyfills used by fuselage npm https img shields io npm v rocket chat fuselage polyfills style flat square https www npmjs com package rocket chat fuselage polyfills deps https img shields io librariesio release npm rocket chat fuselage polyfills style flat square rocket chat fuselage toastbar packages fuselage toastbar fuselage toastbar component npm https img shields io npm v rocket chat fuselage toastbar style flat square https www npmjs com package rocket chat fuselage toastbar deps https img shields io librariesio release npm rocket chat fuselage toastbar style flat square rocket chat fuselage tokens packages fuselage tokens design tokens for fuselage rocket chat s design system npm https img shields io npm v rocket chat fuselage tokens style flat square https www npmjs com package rocket chat fuselage tokens deps https img shields io librariesio release npm rocket chat fuselage tokens style flat square rocket chat icons packages icons rocket chat s icons npm https img shields io npm v rocket chat icons style flat square https www npmjs com package rocket chat icons deps https img shields io librariesio release npm rocket chat icons style flat square rocket chat layout packages layout shared application layout components npm https img shields io npm v rocket chat layout style flat square https www npmjs com package rocket chat layout deps https img shields io librariesio release npm rocket chat layout style flat square rocket chat logo packages logo rocket chat logo package npm 
https img shields io npm v rocket chat logo style flat square https www npmjs com package rocket chat logo deps https img shields io librariesio release npm rocket chat logo style flat square rocket chat memo packages memo memoization utilities npm https img shields io npm v rocket chat memo style flat square https www npmjs com package rocket chat memo deps https img shields io librariesio release npm rocket chat memo style flat square rocket chat message parser packages message parser rocket chat parser for messages npm https img shields io npm v rocket chat message parser style flat square https www npmjs com package rocket chat message parser deps https img shields io librariesio release npm rocket chat message parser style flat square rocket chat mp3 encoder packages mp3 encoder a lame encoder to be used in web workers npm https img shields io npm v rocket chat mp3 encoder style flat square https www npmjs com package rocket chat mp3 encoder deps https img shields io librariesio release npm rocket chat mp3 encoder style flat square rocket chat onboarding ui packages onboarding ui set of components and functions for the onboarding experience on rocket chat npm https img shields io npm v rocket chat onboarding ui style flat square https www npmjs com package rocket chat onboarding ui deps https img shields io librariesio release npm rocket chat onboarding ui style flat square rocket chat peggy loader packages peggy loader peggy loader for webpack npm https img shields io npm v rocket chat peggy loader style flat square https www npmjs com package rocket chat peggy loader deps https img shields io librariesio release npm rocket chat peggy loader style flat square rocket chat prettier config packages prettier config prettier configuration for rocket chat repositories npm https img shields io npm v rocket chat prettier config style flat square https www npmjs com package rocket chat prettier config deps https img shields io librariesio release npm rocket chat 
prettier config style flat square rocket chat string helpers packages string helpers helper functions for string manipulation npm https img shields io npm v rocket chat string helpers style flat square https www npmjs com package rocket chat string helpers deps https img shields io librariesio release npm rocket chat string helpers style flat square rocket chat styled packages styled a simple styled api for react components npm https img shields io npm v rocket chat styled style flat square https www npmjs com package rocket chat styled deps https img shields io librariesio release npm rocket chat styled style flat square rocket chat stylis logical props middleware packages stylis logical props middleware stylis middleware to handle css logical properties and their fallbacks npm https img shields io npm v rocket chat stylis logical props middleware style flat square https www npmjs com package rocket chat stylis logical props middleware deps https img shields io librariesio release npm rocket chat stylis logical props middleware style flat square rocket chat ui kit packages ui kit interactive ui elements for rocket chat apps npm https img shields io npm v rocket chat ui kit style flat square https www npmjs com package rocket chat ui kit deps https img shields io librariesio release npm rocket chat ui kit style flat square
react design-system monorepo rocketchat hacktoberfest
os
php-nltk
natural language processing for php
ai
teensy-3.6-FreeRTOS-template
Teensy 3.6 FreeRTOS project template

## Purpose

A starting point for a Teensy 3.x RTOS project, based on Teensyduino 1.32 and FreeRTOS 9.0.0, to be used without the Arduino IDE and build environment.

## Setup

Install the Teensy udev rule: `sudo cp tools/49-teensy.rules /etc/udev/rules.d/`, then unplug your Teensy and plug it back in.

## Using

1. Put your code in `src/main.cpp`.
2. Put any libraries you need in `libraries/`.
3. Set the `TEENSY` variable in the `Makefile` according to your Teensy version.
4. Build your code: `make`.
5. Upload your code: `make upload`.

## Make targets

- `make` : alias for `make hex`
- `make build` : compiles everything and produces a `.elf`
- `make hex` : converts the `.elf` to an Intel HEX file
- `make post_compile` : opens the launcher with the correct file
- `make upload` : uploads the HEX file to a Teensy board
- `make reboot` : reboots the Teensy

## Where everything came from

- This description and template project are taken from the [Teensy 3.x project template](https://github.com/apmorton/teensy-template).
- The `freertos` sub-folder is taken from [FreeRTOS](http://www.freertos.org).
- The `teensy3` sub-folder is taken from the [Teensy 3 cores](https://github.com/PaulStoffregen/cores/tree/master/teensy3).
- The `tools` sub-folder is taken from [Teensyduino](http://www.pjrc.com/teensy/td_download.html).
- The `src/main.cpp` file is moved (unmodified) from `teensy3/main.cpp`.
- The `49-teensy.rules` file is taken from [PJRC's udev rules](http://www.pjrc.com/teensy/49-teensy.rules).

## Modifications to `Makefile`

- Add support for FreeRTOS.

## Modifications to `teensy3/FreeRTOSConfig.h`

- Disable `configCREATE_LOW_POWER_DEMO` to keep the `delay()` and `micros()` functions in the Arduino files operational.
- Keep `configTICK_RATE_HZ` at 1000; anything other than 1000 Hz causes timing trouble in the Arduino files.
- Disabled several handlers not required for this initial port.

## Modifications to `teensy3/mk20dx128.c`

Add includes:

```c
#include "FreeRTOS.h"
#include "task.h"
```

Add FreeRTOS handler prototypes and replace `systick_isr`:

```c
/* From the FreeRTOS port */
void xPortPendSVHandler(void) __attribute__((naked));
void xPortSysTickHandler(void);
void vPortSVCHandler(void) __attribute__((naked));

/* From the Arduino port */
extern volatile uint32_t systick_millis_count;

__attribute__((weak, naked)) void systick_isr(void)
{
    /* Increment the systick counter */
    systick_millis_count += 1000 / configTICK_RATE_HZ;
    /* Unconditionally branch to the FreeRTOS systick handler */
    asm volatile("b xPortSysTickHandler");
}
```

Comment out ISRs that will be replaced by FreeRTOS routines:

```c
//void usage_fault_isr(void)     __attribute__ ((weak, alias("fault_isr")));
//void svcall_isr(void)          __attribute__ ((weak, alias("unused_isr")));
//void debugmonitor_isr(void)    __attribute__ ((weak, alias("unused_isr")));
//void pendablesrvreq_isr(void)  __attribute__ ((weak, alias("unused_isr")));
//void systick_isr(void)         __attribute__ ((weak, alias("systick_default_isr")));
```

Add the FreeRTOS ISRs to the vector table:

```c
fault_isr,          // 10 --
vPortSVCHandler,    // 11 ARM: Supervisor call (SVCall)
debugmonitor_isr,   // 12 ARM: Debug monitor
fault_isr,          // 13 --
xPortPendSVHandler, // 14 ARM: Pendable req serv (PendableSrvReq)
systick_isr,        // 15 ARM: System tick timer (SysTick)
```

Comment out the SysTick counter initialization (FreeRTOS configures SysTick itself):

```c
// Initialize the SysTick counter. The next 4 lines are commented out
// for the FreeRTOS port; FreeRTOS configures SysTick.
//SYST_RVR = (F_CPU / 1000) - 1;
//SYST_CVR = 0;
//SYST_CSR = SYST_CSR_CLKSOURCE | SYST_CSR_TICKINT | SYST_CSR_ENABLE;
//SCB_SHPR3 = 0x20200000;  // SysTick priority 32
```

## Modifications to `teensy3/mk66fx1m0.ld`

Added a heap memory section (add this to the other `.ld` files when porting to other Teensy 3.x boards):

```
.heap (NOLOAD) : ALIGN(4)
{
    *(.heapsection)
} > RAM
```

## Modifications to `teensy3/pins_teensy.c`

Added a `delay_noSysTick()` function that can safely be called before the SysTick counter is configured, and replaced calls to `delay()` with calls to `delay_noSysTick()` around `analog_init()`. For background about this startup delay, please see these conversations:

- https://forum.pjrc.com/threads/36606-startup-time-400ms?p=113980&viewfull=1#post113980
- https://forum.pjrc.com/threads/31290-teensey-3-2-teensey-loader-1-24-issues?p=87273&viewfull=1#post87273

```c
delay_noSysTick(400);
usb_init();
```

## Sources that were very helpful for this Teensy FreeRTOS port

- Rishi Franklin's [blog](http://rishifranklin.blogspot.com/2014/03/freertos-on-teensy-31.html) and [repo](https://github.com/circuitsenses/Teensy-3.1-FreeRTOS)
- Hydrosense [repo](https://github.com/hydrosense/teensy-freertos)
- Bastl Instruments [repo](https://github.com/bastl-instruments/teensy-rtos-template)

Thank you all for posting your experiences!

## Disclaimer

This FreeRTOS port has only been tested on a Teensy 3.6 board, with no real-world application yet. If you find issues, please post them or create a pull request.
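The advice above to keep `configTICK_RATE_HZ` at 1000 comes down to integer arithmetic: the replacement `systick_isr` advances the Arduino millisecond counter by `1000 / configTICK_RATE_HZ` per RTOS tick, which only works out exactly when the tick rate divides 1000. A minimal host-side sketch of that truncation (the `millis_per_tick` helper is hypothetical, written only to mirror the expression used in the handler; it is not part of the port):

```c
#include <assert.h>
#include <stdint.h>

/* Mirrors the increment in the replaced systick_isr:
 * systick_millis_count += 1000 / configTICK_RATE_HZ.
 * Integer division truncates, so tick rates that do not divide 1000
 * evenly make the millisecond counter drift (333 Hz counts 3 ms per
 * 3.003 ms tick) or freeze entirely (1024 Hz yields 0 ms per tick). */
static uint32_t millis_per_tick(uint32_t tick_rate_hz)
{
    return 1000u / tick_rate_hz;
}
```

At 1000 Hz each tick adds exactly 1 ms, which is why the Arduino `delay()`/`millis()` timing stays correct only at that rate.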
os
iot-cardboard-js
[![Build](https://github.com/microsoft/iot-cardboard-js/workflows/Build/badge.svg?branch=main)](https://github.com/microsoft/iot-cardboard-js)
[![Storybook](https://cdn.jsdelivr.net/gh/storybookjs/brand@master/badge/badge-storybook.svg)](https://main--601c6b2fcd385c002100f14c.chromatic.com)
![npm tag](https://img.shields.io/npm/v/@microsoft/iot-cardboard-js/beta)

## Quick start

### What is iot-cardboard-js?

iot-cardboard-js (or *cardboard*) is an open-source React component library for creating Internet of Things (IoT) web experiences. The components in cardboard are also used for building the experiences in Azure Digital Twins [3D Scenes Studio](https://explorer.digitaltwins.azure.net/3dscenes/demo), and can be leveraged by Azure Digital Twins customers in their own applications. Learn more about leveraging cardboard components for 3D scenes in the [wiki](https://github.com/microsoft/iot-cardboard-js/wiki/Embedding-3D-Scenes).

The 3D visualization components in this library leverage the fantastic [Babylon.js](https://www.babylonjs.com/) library under the hood. If you haven't used it yet, we can't say enough great things about it; definitely check it out!

Viewer mode:

![viewer mode](https://user-images.githubusercontent.com/57726991/173465604-844492d1-89c8-4378-8bd7-131ef966002a.png)

Builder mode:

![builder mode](https://user-images.githubusercontent.com/57726991/173465578-93eb1b54-e1b5-40a6-944c-9185c5fb14ca.png)

## Storybook

This project is developed using Storybook, an open-source tool for building UI components in isolation. Our [hosted Storybook](https://main--601c6b2fcd385c002100f14c.chromatic.com) showcases the current library of iot-cardboard-js components. [Learn more about Storybook](https://storybook.js.org/).

Note: stories which require authentication or API interaction can be found in the local development Storybook.

## Using iot-cardboard-js components

### Installing

Install our beta package from npm with:

`npm install @microsoft/iot-cardboard-js@beta`

### Styles

Import the iot-cardboard-js themes stylesheet at the top level of your application to get theming for cardboard components via CSS custom properties (variables). These variables can be edited if you'd like to change theme colors.

```tsx
import '@microsoft/iot-cardboard-js/themes.css';
```

### Importing components via named exports

```tsx
import {
    StandalonePropertyInspector,
    ADTAdapter,
    MsalAuthService,
    KeyValuePairCard
} from '@microsoft/iot-cardboard-js';
```

This is the easiest method of importing components, and in most cases will allow unused code to be tree-shaken from our library. If, however, you only need a few modules from our library, you can use the direct import pattern to be more explicit about what code is imported.

### Importing components via direct imports

Adapters, classes, constants, hooks, and services each have their own entry point and can be imported as follows.

Adapters:

```tsx
import { ADTAdapter } from '@microsoft/iot-cardboard-js/Adapters';
```

Classes:

```tsx
import { SearchSpan } from '@microsoft/iot-cardboard-js/Classes';
```

Constants:

```tsx
import { IMockAdapter } from '@microsoft/iot-cardboard-js/Constants';
```

Hooks:

```tsx
import { useGuid } from '@microsoft/iot-cardboard-js/Hooks';
```

Services:

```tsx
import { MsalAuthService, getFileType } from '@microsoft/iot-cardboard-js/Services';
```

All cards and components have their own direct import path.

Cards:

```tsx
import { KeyValuePairCard } from '@microsoft/iot-cardboard-js/Cards/KeyValuePairCard';
```

Components:

```tsx
import { StandalonePropertyInspector } from '@microsoft/iot-cardboard-js/Components/StandalonePropertyInspector';
```

### Examples

Storybook stories are the best way to learn how to use our components. Files ending in `.stories.tsx` showcase components set up with mock data, while files ending in `.stories.local.tsx` showcase components which authenticate and communicate with APIs. Stories are a great way to learn about the different ways to consume each of our components; check out the [KeyValuePairCard stories](https://github.com/microsoft/iot-cardboard-js/blob/main/src/Cards/KeyValuePairCard/KeyValuePairCard.stories.tsx) for an example of this.

You can also see the code required to use a component by opening either the [live](https://601c6b2fcd385c002100f14c-exzabxrkak.chromatic.com/?path=/docs/keyvaluepaircard-consume-mock) or local Storybook, selecting the "Docs" tab at the top of a story, and clicking "Show code" at the bottom right of a story panel. This opens a view of the code used to render that story.

## Questions?

If you have a question for one of the project maintainers, please post it [here](https://github.com/microsoft/iot-cardboard-js/discussions/categories/q-a). We'll get back to you as soon as possible!

## Issue templates

- [File a bug](https://github.com/microsoft/iot-cardboard-js/issues/new?assignees=&labels=bug&template=bug_report.md&title=)
- [Request a new feature](https://github.com/microsoft/iot-cardboard-js/issues/new?assignees=&labels=enhancement&template=feature_request.md&title=)

## Contributing

To contribute to this project, head over to our [environment setup wiki](https://github.com/microsoft/iot-cardboard-js/wiki/Environment-setup) to get started. To learn about our codebase philosophy, check out the [design patterns wiki](https://github.com/microsoft/iot-cardboard-js/wiki/Design-patterns). To learn about coding guidelines, check out the [coding guidelines wiki](https://github.com/microsoft/iot-cardboard-js/wiki/Coding-guidelines-&-component-templates). For an overview of our CI/CD and semantic versioning systems, check out our [continuous integration and delivery wiki](https://github.com/microsoft/iot-cardboard-js/wiki/Continuous-delivery).

### Contributor License Agreement

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.

When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g. status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

### Code of conduct

This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). For more information, see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact opencode@microsoft.com with any additional questions or comments.

### Trademarks

This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to, and must follow, [Microsoft's Trademark & Brand Guidelines](https://www.microsoft.com/en-us/legal/intellectualproperty/trademarks/usage/general). Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.
server
AI-resources
Links of resources for learning AI. Feel free to suggest more resources!

## Advanced crash courses

- [Deep Learning by Ruslan Salakhutdinov (KDD 2014)](http://videolectures.net/kdd2014_salakhutdinov_deep_learning/): overview of DL, including DBN, RBM, PGM etc., which are not as popular these days. Very theoretical, dense, and mathematical; maybe not that useful for beginners. Salakhutdinov is another major player in DL.
- [Introduction to Reinforcement Learning with Function Approximation by Rich Sutton (NIPS 2015)](http://research.microsoft.com/apps/video/?id=259577): another intro to RL, but more technical and theoretical. Rich Sutton is the old-school king of RL.
- [Deep Reinforcement Learning by David Silver (RLDM 2015)](http://videolectures.net/rldm2015_silver_reinforcement_learning/): advanced intro to deep RL as used by DeepMind on the Atari games and AlphaGo. Quite technical; requires a decent understanding of RL, TD learning, Q-learning, etc. (see the RL courses below). David Silver is the new-school king of RL and superstar of DeepMind's AlphaGo, which uses deep RL.
- [Monte Carlo Inference Methods by Iain Murray (NIPS 2015)](http://research.microsoft.com/apps/video/?id=259575): good introduction and overview of sampling / Monte Carlo based methods. Not essential for a lot of DL, but good side knowledge to have.
- [How to Grow a Mind: Statistics, Structure and Abstraction by Josh Tenenbaum (AAAI 2012)](http://videolectures.net/aaai2012_tenenbaum_grow_mind/): completely unrelated to current DL, and takes a very different approach (Bayesian hierarchical models). Not much success in the real world yet, but I'm still a fan, as the questions and problems they're looking at feel a lot more applicable to the real world than DL (e.g. one-shot learning and transfer learning, though DeepMind is looking at this with DL as well now).
- [Two Architectures for One-Shot Learning by Josh Tenenbaum (NIPS 2013)](http://videolectures.net/nipsworkshops2013_tenenbaum_learning/): similar to the above, but slightly more recent.
- [Optimal and Suboptimal Control in Brain and Behavior by Nathaniel Daw (RLDM 2015)](http://videolectures.net/rldm2015_daw_brain_and_behavior/): quite unrelated to DL. Looks at human learning, combined with research from psychology and neuroscience, through the computational lens of RL. Requires a decent understanding of RL.

Lots more one-off video lectures at:

- http://videolectures.net/Top/Computer_Science/Artificial_Intelligence/
- http://videolectures.net/Top/Computer_Science/Machine_Learning/

## Massive Open Online Courses (MOOC)

These are concentrated, long-term courses consisting of many video lectures, ordered very roughly in the order that I recommend they are watched.

Foundation:

- Maths (Khan Academy): [Probability](https://www.khanacademy.org/math/probability), [Linear Algebra](https://www.khanacademy.org/math/linear-algebra), [Calculus](https://www.khanacademy.org/math/calculus-home)
- http://research.microsoft.com/apps/video/?id=259574
- Deep Learning by Yann LeCun (SAHD 2014): http://videolectures.net/sahd2014_lecun_deep_learning/
- Computational Reinforcement Learning by Michael Littman (RLDM 2015): http://videolectures.net/rldm2015_littman_computational_reinforcement/

## Resources for beginners

1. [The Nature of Code](http://natureofcode.com/book/introduction/)
2. [Machine Learning Theory](https://mostafa-samir.github.io/ml-theory-pt1/)
3. [Introduction to Computer Science and Programming in Python](https://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-0001-introduction-to-computer-science-and-programming-in-python-fall-2016/)
4. [Seeing Theory](http://students.brown.edu/seeing-theory/)
5. [Udacity: Intro to Artificial Intelligence](https://www.udacity.com/course/intro-to-artificial-intelligence--cs271)
6. [Udacity: Deep Learning Foundations course](https://www.udacity.com/course/deep-learning-nanodegree-foundation--nd101)
7. [Hacker's Guide to Neural Networks](http://karpathy.github.io/neuralnets/)
8. [CS 131: Computer Vision Foundations and Applications](http://vision.stanford.edu/teaching/cs131_fall1617/index.html)
9. [Coursera machine learning courses](https://www.coursera.org/browse/data-science/machine-learning?languages=en)
10. [Introduction to Artificial Neural Networks and Deep Learning](https://github.com/rasbt/deep-learning-book)
11. [Python programming by Harrison](https://pythonprogramming.net/)
12. [Harrison's YouTube channel (from basic Python to machine learning)](https://www.youtube.com/user/sentdex/featured)
13. [MATLAB Neural Network Toolbox](http://in.mathworks.com/help/nnet/examples.html)
14. [MATLAB for deep learning](https://in.mathworks.com/campaigns/products/offer/deep-learning.html)
15. [Learning Circles](https://learningcircles.p2pu.org/en/)
16. [TensorFlow Playground](http://playground.tensorflow.org/)
17. [A.I. Experiments](https://aiexperiments.withgoogle.com/)
18. [Machine Learning Algorithm Cheat Sheet](http://www.lauradhamilton.com/machine-learning-algorithm-cheat-sheet)
19. [Tombone's Computer Vision Blog](http://www.computervisionblog.com/)
20. [Bokeh gallery](http://bokeh.pydata.org/en/latest/docs/gallery.html#gallery)
21. [A Visual Introduction to Machine Learning](http://www.r2d3.us/visual-intro-to-machine-learning-part-1/)
22. [Machine Learning Mastery](http://machinelearningmastery.com/)
23. [Everything I Know About Python](https://jeffknupp.com/python-tutoring/)
24. [TensorFlow and Deep Learning Without a PhD (video)](https://www.youtube.com/watch?v=u4alGiomYP4)
25. [Daniel Nouri's blog](http://danielnouri.org/)
26. [Programming a Perceptron in Python](https://blog.dbrgn.ch/2013/3/26/perceptrons-in-python/)
27. [Improving Our Neural Network by Optimizing Gradient Descent](https://iamtrask.github.io/2015/07/27/python-network-part2/)
28. [Learn TensorFlow and Deep Learning Without a PhD (note)](https://cloud.google.com/blog/big-data/2017/01/learn-tensorflow-and-deep-learning-without-a-phd)
29. [13 Free Self-Study Books on Mathematics, Machine Learning & Deep Learning](http://blog.hackerearth.com/13-free-self-study-books-mathematics-machine-learning-deep-learning)
30. [Python program flow visualizer](http://python-flow-visualizer.herokuapp.com/)
31. [Collaborative open computer science (GitXiv)](http://www.gitxiv.com/)
32. [The Open Cognition Project](http://wiki.opencog.org/)
33. [Hvass Labs TensorFlow tutorials](http://www.hvass-labs.org/)
34. [Introduction to machine learning for arts/music](http://ml4a.github.io/index/)
35. [Stanford CS231n: Convolutional Neural Networks for Visual Recognition, by Prof. Fei-Fei Li](http://cs231n.stanford.edu/)
36. [t-SNE](http://lvdmaaten.github.io/tsne/)
37. [Learning Object Categories](http://people.csail.mit.edu/torralba/shortCourseRLOC/index.html)
38. [Chris Olah's blog](http://colah.github.io/)
39. [CS224d: Deep Learning for Natural Language Processing](http://cs224d.stanford.edu/)
40. [Jake VanderPlas's blog](https://staff.washington.edu/jakevdp/speaking.html)
41. [AIDL blog](http://aidl.io/issues/16#start)
42. [KDnuggets](http://www.kdnuggets.com/)
43. [What is Deep Learning?](https://www.scaler.com/topics/what-is-deep-learning/)

## Resources for the average user

1. [Convolutional Neural Networks for Visual Recognition](http://cs231n.github.io/)
2. [Deep Learning (an MIT Press book)](http://www.deeplearningbook.org/)
3. [Deep Learning in Python (DataCamp)](https://www.datacamp.com/courses/deep-learning-in-python)
4. [TensorFlow: Getting Started (Pluralsight)](https://www.pluralsight.com/courses/tensorflow-getting-started)
5. [Natural Language Processing (Stanford)](http://cs224d.stanford.edu/)
6. [Stanford University Deep Learning Tutorial](http://ufldl.stanford.edu/tutorial/)
7. [A Guide to Deep Learning](http://yerevann.com/a-guide-to-deep-learning/)
8. [Deep learning for self-driving cars](https://www.google.com/webmasters/tools/home?hl=en)
9. [Deep Learning for Self-Driving Cars (website)](http://selfdrivingcars.mit.edu/)
10. [Deep Natural Language Processing (Oxford)](https://github.com/oxford-cs-deepnlp-2017/lectures)
11. [Deep learning documentation](http://deeplearning.net/tutorial/contents.html)
12. [Deep learning tutorial (PDF)](http://deeplearning.net/tutorial/deeplearning.pdf)
13. [Neural Networks and Deep Learning](http://neuralnetworksanddeeplearning.com/)
14. [Deep learning forum](http://deeplearning.net/)
15. [TensorFlow for Deep Learning Research](http://web.stanford.edu/class/cs20si/index.html)
16. [Pylearn2 Vision](http://deeplearning.net/software/pylearn2/index.html)
17. [Siraj Raval's YouTube channel](https://www.youtube.com/channel/UCWN3xxRkmTPmbKwht9FuE5A/playlists)
18. [Tutorial on Deep Learning for Vision](https://sites.google.com/site/deeplearningcvpr2014/)
19. [Mining of Massive Datasets](http://www.mmds.org/)
20. [Accelerate Machine Learning with the cuDNN Deep Neural Network Library](https://devblogs.nvidia.com/parallelforall/accelerate-machine-learning-cudnn-deep-neural-network-library/)
21. [Deep Learning for Computer Vision with Caffe and cuDNN](https://devblogs.nvidia.com/parallelforall/deep-learning-computer-vision-caffe-cudnn/)
22. [Embedded Machine Learning with the cuDNN Deep Neural Network Library and Jetson TK1](https://devblogs.nvidia.com/parallelforall/embedded-machine-learning-cudnn-deep-neural-network-library-jetson-tk1/)
23. [Deep learning in your browser: ConvNetJS](http://cs.stanford.edu/people/karpathy/convnetjs/index.html)
24. [Machine Learning with MATLAB](https://in.mathworks.com/videos/machine-learning-with-matlab-100694.html)
25. [Toronto Deep Learning Demo](http://deeplearning.cs.toronto.edu/)
26. [Fields lectures](http://www.fields.utoronto.ca/video-archive/event/323/2014)
27. [Zipfian Academy](http://www.zipfianacademy.com/)
28. [Machine Learning Recipes with Josh Gordon](https://www.youtube.com/playlist?list=PLOU2XLYxmsIIuiBfYad6rFYQU_jL2ryal)
29. [Microsoft Professional Program (Data Science)](https://academy.microsoft.com/en-us/professional-program/data-science)
30. [Intel for deep learning](https://software.intel.com/en-us/ai/deep-learning)
31. [GPU-Accelerated Computing with Python](https://developer.nvidia.com/how-to-cuda-python)
32. [Import AI newsletter](http://us13.campaign-archive1.com/home/?u=67bd06787e84d73db24fb0aa5&id=6c9d98ff2c)
33. [Traffic Sign Recognition with TensorFlow](https://medium.com/@waleedka/traffic-sign-recognition-with-tensorflow-629dffc391a6)
34. [Understand Backpropagation](https://medium.com/@karpathy/yes-you-should-understand-backprop-e2f06eab496b)
35. [Big Data University](https://bigdatauniversity.com/)
36. [Open-source language understanding for bots](https://rasa.ai/)
37. [Pure Python Decision Trees](http://kldavenport.com/pure-python-decision-trees/)
38. [Top 20 Python Machine Learning Open Source Projects](http://www.kdnuggets.com/2015/06/top-20-python-machine-learning-open-source-projects.html)
39. [Deep Learning, NLP, and Representations](http://colah.github.io/posts/2014-07-NLP-RNNs-Representations/)
40. [Deep Learning Research Review: Natural Language Processing](http://www.kdnuggets.com/2017/01/deep-learning-review-natural-language-processing.html)
41. [Image-to-Image Translation with Conditional Adversarial Nets](https://phillipi.github.io/pix2pix/)
42. [CMUSphinx tutorial for developers](http://cmusphinx.sourceforge.net/wiki/tutorial)
43. [Machine learning in the arts by Gene Kogan](http://genekogan.com/)
44. [The Neural Aesthetic](https://ml4a.github.io/classes/itp-S16/07/)
45. [Visualizing High-Dimensional Space](https://aiexperiments.withgoogle.com/visualizing-high-dimensional-space)
46. [Deep Visualization Toolbox](https://github.com/yosinski/deep-visualization-toolbox)
47. [Picasso CNN Visualizer](https://github.com/merantix/picasso)
48. [Self-Driving Car](http://selfdrivingcars.mit.edu/)
49. [NN for a self-driving car](http://blog.davidsingleton.org/nnrccar/)
50. [Simulate a Self-Driving Car](https://github.com/llSourcell/How_to_simulate_a_self_driving_car)
51. [CS 20SI: TensorFlow for Deep Learning Research](http://web.stanford.edu/class/cs20si/index.html)

## Resources for advanced users and researchers

1. [Recent research](https://arxiv.org/list/cs.RO/recent)
2. [The Morning Paper](https://blog.acolyer.org/)
3. [Most Cited Deep Learning Papers](https://github.com/terryum/awesome-deep-learning-papers)
4. [Arxiv Sanity Preserver](http://www.arxiv-sanity.com/)
5. [Uncertainty in Deep Learning](http://mlg.eng.cam.ac.uk/yarin/blog_2248.html)
6. [Deep Patient](http://www.nature.com/articles/srep26094)
7. [A Space-Time Delay Neural Network](https://www.ncbi.nlm.nih.gov/pubmed/10586990)
8. [Google Cloud Natural Language API](https://cloud.google.com/natural-language/)
9. [FloydHub (Heroku for DL)](https://www.floydhub.com/)
10. [Blue Brain Project](http://www.artificialbrains.com/blue-brain-project)
11. [Whole-genome sequencing resource](http://www.psi.toronto.edu/?q=publications)
12. [Sorta Insightful](http://www.alexirpan.com/archive/)
13. [The Eyescream Project](http://soumith.ch/eyescream/)
14. [Generative Adversarial Networks](http://cs.stanford.edu/people/karpathy/gan/)

## Open-source libraries, repositories, frameworks

1. [TensorFlow](https://www.tensorflow.org/)
2. [Keras](https://keras.io/)
3. [scikit-learn](http://scikit-learn.org/stable/)
4. [Universe](https://openai.com/blog/universe/)
5. [Lua](https://www.lua.org/)
6. [Torch](http://torch.ch/)
7. [Theano](http://deeplearning.net/software/theano/)
8. [Machine Learning Library (MLlib)](http://spark.apache.org/docs/latest/ml-guide.html)
9. [UC Irvine Machine Learning Repository](http://archive.ics.uci.edu/ml/)
10. [The CIFAR-10 dataset](https://www.cs.toronto.edu/~kriz/cifar.html)
11. [NeuPy](http://neupy.com/pages/home.html)
12. [Deeplearning4j](https://deeplearning4j.org/)
13. [ImageNet](http://image-net.org/index)
14. [Seaborn](http://seaborn.pydata.org/)
15. [mldata](http://mldata.org/)
16. [CNTK](https://github.com/Microsoft/CNTK)
17. [Natural Language Toolkit (NLTK)](https://www.nltk.org/)
18. [spaCy](https://spacy.io/)
19. [CoreNLP](http://stanfordnlp.github.io/CoreNLP/)
20. [Requests: HTTP for Humans](http://docs.python-requests.org/en/master/)
21. [Computational Healthcare Library](https://github.com/AKSHAYUBHAT/ComputationalHealthcare)
22. [Blaze](https://blaze.readthedocs.io/en/latest/overview.html)
23. [Dask](http://dask.pydata.org/)
24. [ArrayExpress](http://www.ebi.ac.uk/arrayexpress/)
25. [Pillow](http://pillow.readthedocs.io/en/3.4.x/index.html)
26. [HTM](http://numenta.org/)
27. [PyBrain](https://github.com/pybrain/pybrain/wiki/installation)
28. [Nilearn](http://nilearn.github.io/introduction.html)
29. [Pattern](http://www.clips.ua.ac.be/pages/pattern)
30. [Fuel](https://github.com/mila-udem/fuel)
31. [Pylearn2](http://deeplearning.net/software/pylearn2/)
32. [Bob](https://pythonhosted.org/bob/index.html)
33. [skdata](https://github.com/jaberg/skdata)
34. [MILK](https://github.com/luispedro/milk)
35. [IEPY](https://github.com/machinalis/iepy)
36. [Quepy](https://github.com/machinalis/quepy)
37. [NuPIC](https://github.com/numenta/nupic)
38. [Hebel](https://github.com/mohendra/hebel)
39. [Ramp](http://ramp.readthedocs.io/en/latest/)
40. [Machine Learning Samples](https://github.com/awslabs/machine-learning-samples/tree/master/social-media/step1)
41. [H2O](http://www.h2o.ai/)
42. [Optunity](http://optunity.readthedocs.io/en/latest/)
style font weight 400 42 span a href https github com caesar0301 awesome public datasets span style font weight 400 awesome public datasets span a p p span style font weight 400 43 span a href https github com pytorch tutorials span style font weight 400 pytorch span a p p span style font weight 400 44 span a href http blog kubernetes io 2017 02 run deep learning with paddlepaddle on kubernetes html m 1 span style font weight 400 kubernetes span a p p span style font weight 400 45 span a href https github com reedscot icml2016 span style font weight 400 generative adversarial text to image synthesis span a p p span style font weight 400 46 span a href http pydata org index html span style font weight 400 pydata span a p p span style font weight 400 47 span a href https opendatakit org span style font weight 400 open data kit odk span a p p span style font weight 400 48 span a href http opendetection com span style font weight 400 open detection span a p p span style font weight 400 49 span a href https github com mycroftai span style font weight 400 mycroft span a p p 50 a href http langlotzlab stanford edu projects medical image net target blank rel noopener noreferrer medical image net a p p 51 a href http biorxiv org target blank rel noopener noreferrer biorxiv archive and distribution service for unpublished preprints in the life sciences a p p 52 a href https github com udacity self driving car sim target blank rel noopener noreferrer udacity self driving car simulator a p p 53 a href https github com beamandrew medical data target blank rel noopener noreferrer list of medical datasets and repositories a p p nbsp p p b all video materials b p p span style font weight 400 1 span a href https www youtube com playlist list plou2xlyxmsiiuibfyad6rfyqu jl2ryal span style font weight 400 machine learning recipes with josh gordon span a p p span style font weight 400 2 span a href http on demand gputechconf com gtc 2014 webinar gtc express deep learning caffee evan 
shelhamer mp4 span style font weight 400 deep learning for vision with caffe framework span a p p span style font weight 400 3 span a href https www youtube com watch v uzxylbk2c7e amp list pla89dcfa6adace599 span style font weight 400 stanford university machine learning course by prof andrew ng span a p p span style font weight 400 4 span a href https www youtube com watch v qgx57x0fbda span style font weight 400 deep learning for computer vision by dr rob fergus span a p p span style font weight 400 5 span a href https www youtube com playlist list pld63a284b7615313a span style font weight 400 caltech machine learning course span a p p span style font weight 400 6 span a href http techtalks tv talks machine learning and ai via brain simulations 57862 span style font weight 400 machine learning and ai via brain simulations span a p p span style font weight 400 7 span a href https www youtube com watch feature player embedded amp v 4xsvflnhc 0 span style font weight 400 deep learning of representations google talk span a p p span style font weight 400 8 span a href https www youtube com channel ucnvzaplje2ljpzseqylseyg span style font weight 400 data school span a p p span style font weight 400 9 span a href https www youtube com watch v 8mt9ck4vb70 amp feature youtu be amp app desktop span style font weight 400 how to run neural nets on gpus 8217 by melanie warrick span a p p span style font weight 400 10 span a href https www youtube com watch v u4algiomyp4 amp app desktop span style font weight 400 tensorflow and deep learning without a phd span a p p span style font weight 400 11 span a href https www youtube com user sentdex featured span style font weight 400 youtube channel of harrison from basic python to machine learning span a p p span style font weight 400 12 span a href https www youtube com channel ucwn3xxrkmtpmbkwht9fue5a playlists span style font weight 400 siraj raval youtube channel span a p p span style font weight 400 13 span a href https www 
youtube com watch v ksslgdst2ms span style font weight 400 machine learning prepare data tutorial span a p p 14 a href https www youtube com channel ucbba38v6vcglqvl 8kvvmg target blank rel noopener noreferrer hvass laboratories a p p strong brain computer interfacing strong p p 1 a href https github com neurotechx awesome bci brain visualizations target blank rel noopener noreferrer all bci resources at one place a p p nbsp p p b ai companies organisations b p p span style font weight 400 1 span a href https deepmind com span style font weight 400 deepmind span a p p span style font weight 400 2 span a href https mila umontreal ca en mila span style font weight 400 mila span a p p span style font weight 400 3 span a href https www ibm com watson span style font weight 400 ibm watson span a p p span style font weight 400 4 span a href http www idsia ch span style font weight 400 the swiss ai lab idsia span a p p span style font weight 400 5 span a href http research comma ai span style font weight 400 comma ai span a p p span style font weight 400 6 span a href https indico io hack span style font weight 400 indico span a p p span style font weight 400 7 span a href http www osaro com technology span style font weight 400 osaro span a p p span style font weight 400 8 span a href https www cloudera com span style font weight 400 cloudera span a p p span style font weight 400 9 span a href http geometric ai span style font weight 400 geometric intelligence span a p p span style font weight 400 10 span a href https skymind ai span style font weight 400 skymind span a p p span style font weight 400 11 span a href https metamind io span style font weight 400 metamind span a p p span style font weight 400 12 span a href https iris ai span style font weight 400 iris ai span a p p span style font weight 400 13 span a href https feedzai com span style font weight 400 feedzai span a p p span style font weight 400 14 span a href https www loomai com span style font weight 400 
loomai span a p p span style font weight 400 15 span a href http benevolent ai span style font weight 400 benevolentai span a p p span style font weight 400 16 span a href http research baidu com span style font weight 400 baidu research span a p p span style font weight 400 17 span a href https rasa ai span style font weight 400 rasa ai span a p p span style font weight 400 18 span a href https gym openai com span style font weight 400 ai gym span a p p span style font weight 400 19 span a href https www nervanasys com span style font weight 400 nervana span a p p span style font weight 400 20 span a href https crowdai com span style font weight 400 crowdai span a p p span style font weight 400 21 span a href https www idiap ch span style font weight 400 idiap research institute span a p p span style font weight 400 22 span a href http www maluuba com span style font weight 400 maluuba span a p p span style font weight 400 23 span a href http www neurala com products brains for bots sdk span style font weight 400 neurala span a p p span style font weight 400 24 span a href http ai ucsd edu span style font weight 400 artificial intelligence group at ucsd span a p p span style font weight 400 25 span a href https turi com learn span style font weight 400 turi span a p p span style font weight 400 26 span a href http www enlitic com span style font weight 400 enlitic span a p p span style font weight 400 27 span a href https www elementai com research span style font weight 400 element ai span a p p span style font weight 400 28 span a href http accel ai span style font weight 400 accel ai span a p p span style font weight 400 29 span a href http www datalog ai span style font weight 400 datalog ai span a p p span style font weight 400 30 span a href http www fast ai span style font weight 400 fast ai span a p p span style font weight 400 31 span a href http appliedbrainresearch com span style font weight 400 applied brain research span a p p span style font weight 
400 32 span a href https www neuraldesigner com span style font weight 400 neuraldesigner span span style font weight 400 br span a p p 33 a href http www autox ai target blank rel noopener noreferrer autox a p p 34 a href http www niramai com niramai a p p 35 a href http isenses in isenses a p p 36 a href https www medgenome com company medgenome a p p 37 a href http www recursionpharma com target blank rel noopener noreferrer recursion pharmaceuticals inc a p p 38 a href http geometric ai target blank rel noopener noreferrer geometric intelligence a p p 39 a href https www jukedeck com target blank rel noopener noreferrer jukedeck ai musician a p p 40 a href https www galaxy ai target blank rel noopener noreferrer galaxy ai a p p 41 a href http www atomwise com target blank rel noopener noreferrer atomwise a p p 42 a href https deepart io target blank rel noopener noreferrer deepart a p p nbsp p p nbsp p p b ai personalities b p p a href http daggerfs com span style font weight 400 yangqing jia span a p p a href http imaginarynumber net span style font weight 400 evan shelhamer span a p p a href http demishassabis com publications span style font weight 400 demis hassabis span a p p a href http web mit edu cocosci josh html span style font weight 400 josh tenenbaum span a p p a href http www iro umontreal ca bengioy span style font weight 400 yoshua bengio span a p p a href http www psi toronto edu frey span style font weight 400 brendan j frey span a p p a href http cseweb ucsd edu arunkk span style font weight 400 arun kumar span a p p a href https mostafa samir github io span style font weight 400 mostafa samir span a p p a href http cs stanford edu people karpathy span style font weight 400 andrej karpathy span a p p a href http cs stanford edu people jcjohns span style font weight 400 justin johnson span a p p a href http vision stanford edu feifeili span style font weight 400 fei fei li span a p p a href http www niebles net span style font weight 400 juan 
carlos niebles span a p p a href http learning eng cam ac uk carl span style font weight 400 carl edward rasmussen span a p p a href https www cs toronto edu hinton span style font weight 400 geoffrey e hinton span a p p a href http yann lecun com span style font weight 400 yann lecun span a p p a href https www leadingauthorities com speakers neil jacobstein span style font weight 400 neil jacobstein span a p p a href http www andrewng org span style font weight 400 andrew ng span a p p a href https en wikipedia org wiki jeff dean computer scientist span style font weight 400 jeffrey dean span a p p a href https research google com pubs rajatmonga html span style font weight 400 rajat monga span a p p a href https research fb com people paluri manohar span style font weight 400 manohar paluri span a p p a href https research fb com people candela joaquin quinonero span style font weight 400 joaquin quinonero candela span a p p a href https www microsoft com en us research people bmitra span style font weight 400 bhaskar mitra span a p p a href https www microsoft com en us research people pkohli span style font weight 400 pushmeet kohli span a p p a href http www cs toronto edu ilya span style font weight 400 ilya sutskever span a p p a href https gregbrockman com span style font weight 400 greg brockman span a p reference http aimedicines com 2017 03 17 all ai resources at one place
ai
ember-polaris
ember polaris https github com smile io ember polaris workflows ci badge svg https github com smile io ember polaris actions image https user images githubusercontent com 5737342 26935493 c8c81c76 4c74 11e7 90dd ff8b0fdc434e png ember polaris is an ember cli addon to make shopify s polaris design system https polaris shopify com accessible to ember developers status note this addon is still in development as such not all of the components available in the react component library have been built yet and some features of those which have been built are currently unimplemented check the component list components for a list of those which are currently available compatibility ember js v3 16 or above ember cli v2 13 or above node js v10 or above installation install ember polaris using ember cli sh ember install smile io ember polaris styles this addon will install ember cli sass https github com aexmachina ember cli sass in the host app it will also set up your app s app styles app scss to import ember polaris creating the file if it does not already exist icons for icons to work you will need to copy polaris svg s into a folder in public ex public assets images svg polaris install ember svg jar add the following ember svg jar options to your ember cli build js javascript ember cli build js var app new emberapp defaults svgjar strategy inline inline strippath false optimizer removedimensions true sourcedirs public assets images svg template compiler if your app does not already import ember template compiler you may get an error similar to this one when passing a hash of componentname and props into one of the ember polaris components uncaught typeerror ember default htmlbars compile is not a function if that happens you need to add an import statement to its ember cli build js javascript ember cli build js app import node modules ember source dist ember template compiler js note this setup will be handled by ember polaris in the future usage ember polaris provides a 
set of ember components intended to implement the same behavior and functionality as the shopify polaris react components https github com shopify polaris in general the usage can be inferred from the polaris component documentation https polaris shopify com components get started with some exceptions as described below differences with polaris react components we have tried to keep the components provided by ember polaris as similar to the original polaris react components as possible however there are cases where it makes sense to do things in a more ember friendly way and where this is true we will describe the ember polaris usage and how it differs from the original shopify components general differences children property a large number of the polaris react components have a children property listed in their documentation in these cases the corresponding ember polaris component can be given a text attribute which will take the place of the children property and specify the text to be rendered by the component alternatively these components can be used in block form to achieve the same result note that the block content will take precedence over the text attribute if one is supplied hbs polaris button text attribute text renders a button with the content attribute text polaris button block text polaris button renders a button with the content block text polaris button text attribute text block text polaris button renders a button with the content block text element property some polaris react components accept an element property which changes the tag rendered by the component in ember polaris this is replaced by the tagname attribute unless otherwise noted this attribute cannot be dynamic the following code would cause an error hbs polaris display text tagname displaytexttagname displays fine to start with polaris display text polaris button onclick action mut displaytexttagname h5 but clicking this button will cause an error polaris button actions property 
some polaris react components accept an actions property as a list of actions which can be performed in ember polaris this is renamed to avoid collisions with the ember actions hash the new name will be different based on the component check the documentation for the specific component to find out what attribute to pass the actions list as components below is a categorised list of the components available in ember polaris click an item to see more information and usage examples for that component actions action list docs action list md action list button docs button md button button group docs button group md button group drop zone docs drop zone md drop zone setting toggle docs setting toggle md setting toggle images and icons avatar docs avatar md avatar badge docs badge md badge icon docs icon md icon thumbnail docs thumbnail md thumbnail feedback indicators banner docs banner md banner progress bar docs progress bar md progress bar skeleton body text docs skeleton body text md skeleton body text skeleton display text docs skeleton display text md skeleton display text skeleton page docs skeleton page md skeleton page spinner docs spinner md spinner structure callout card docs callout card md callout card card docs card md card empty state docs empty state md empty state layout docs layout md layout page docs page md page page actions docs page actions md page actions stack docs stack md stack titles and text caption docs caption md caption display text docs display text md display text footer help docs footer help md footer help heading docs heading md heading subheading docs subheading md subheading text style docs text style md text style visually hidden docs visually hidden md visually hidden forms checkbox docs checkbox md checkbox choice list docs choice list md choice list color picker docs color picker md color picker date picker docs date picker md date picker form docs form md form form layout docs form layout md form layout inline error docs inline 
error md inline error radio button docs radio button md radio button range slider docs range slider md range slider select docs select md select tag docs tag md tag text field docs text field md text field lists and tables data table docs data table md data table description list docs description list md description list list docs list md list option list docs option list md option list resource list docs resource list md resource list navigation link docs link md link pagination docs pagination md pagination overlays popover docs popover md popover contributing see the contributing contributing md guide for details release preparation ember polaris has an automated changelog generator when preparing releases run yarn changelog generator to generate changelog for the current branch license this project is licensed under the mit license license md
ember-cli ember-polaris ember shopify-polaris
os
jhipster-lite
logo jhipster lite jhipster image jhipster url jhipster lite jhipster lite version jhipster lite release version jhipster lite release url jhipster lite maven central jhipster lite maven central version jhipster lite maven central url jhipster lite docker hub jhipster lite docker hub jhipster lite docker hub url build status github actions jhlite image github actions url coverage status codecov image codecov url sonarcloud coverage sonarcloud coverage sonarcloud url sonarcloud quality gate sonarcloud quality gate sonarcloud url sonarcloud maintainability sonarcloud maintainability sonarcloud url sonarcloud bugs sonarcloud bugs sonarcloud url sonarcloud vulnerabilities sonarcloud vulnerabilities sonarcloud url sonarcloud security sonarcloud security sonarcloud url sonarcloud code smells sonarcloud code smells sonarcloud url description jhipster jhipster url is a development platform to quickly generate develop deploy modern web applications microservice architectures jhipster lite will help you to start your project by generating step by step only what you need the generated code uses hexagonal architecture documentation hexagonal architecture md the technical code is separated from your business code you will only generate the code you want no additional unused code the best quality as possible 100 coverage 0 code smells no duplication this is a sample application https github com jhipster jhipster lite sample app created with jhipster lite quick start you need to clone this project and go into the folder git clone https github com jhipster jhipster lite cd jhipster lite run the project bash mvnw then you can navigate to http localhost 7471 in your browser some videos what is jhipster lite and why should you care devoxx jhlite by julien dubois jdubois simple webservices with jhipster lite webservices with jhlite by colin damon cdamon jhipster vs jhipster lite jhipster vs jhlite by julien dubois jdubois choosing the original jhipster and jhlite are not the same thing
they are not generating the same code and not serving the same purpose here are some choice elements you can take into account choosing jhipster documentation jhlite choice png prerequisites java you need to have java 17 jdk 17 https openjdk java net projects jdk 17 node js and npm node js https nodejs org we use node to run a development web server and build the project depending on your system you can install node either from source or as a pre packaged bundle after installing node you should be able to run the following command to install development tools bash npm ci you will only need to run this command when dependencies change in package json package json bash npm install test the project to launch tests bash mvnw clean test to launch tests and integration tests bash mvnw clean verify graalvm native support this project has been configured to let you generate either a lightweight container or a native executable it is also possible to run your tests in a native image lightweight container with cloud native buildpacks if you re already familiar with spring boot container images support this is the easiest way to get started docker should be installed and configured on your machine prior to creating the image to create the image run the following goal bash mvnw spring boot build image pnative then you can run the app like any other container bash docker run p 7471 7471 rm docker io library jhlite version executable with native build tools use this option if you want to explore more options such as running your tests in a native image the graalvm native image compiler should be installed and configured on your machine note graalvm 22 3 is required to create the executable run the following goal bash mvnw native compile pnative dskiptests then you can run the app as follows bash target jhlite you can also run your existing tests suite in a native image this is an efficient way to validate the compatibility of your application to run your existing tests in a 
native image run the following goal bash mvnw test pnativetest lint we use multiple linters to check and lint your code eslint https eslint org for javascript typescript prettier https github com prettier prettier for the format prettier java https github com jhipster prettier java for java stylelint https stylelint io for style stylelint scss https github com stylelint scss for scss pug lint https www npmjs com package pug lint for pug to check bash npm run lint ci to lint and fix all code bash npm run lint sonar analysis to launch local sonar analysis bash docker compose f src main docker sonar yml up d then bash mvnw clean verify sonar sonar so you can check the result at http localhost 9001 run the project you can run the project using maven as spring boot run is the default target bash mvnw or first you can package as jar bash mvnw package then run bash java jar target jar so you can navigate to http localhost 7471 in your browser the following profiles are available and you can use them to only display the frameworks you want angular react vue for example you can run bash mvnw dspring boot run profiles vue or bash java jar target jar spring profiles active vue docker podman quickstart to start a local instance of jhipster lite go to your desired application folder and run bash docker run rm pull always p 7471 7471 v pwd tmp jhlite z it jhipster jhipster lite latest or with podman bash podman run rm pull always p 7471 7471 v pwd tmp jhlite z u root it jhipster jhipster lite latest then go to http localhost 7471 http localhost 7471 e2e tests you need to run the project first then you can run the end to end tests bash npm run e2e or in headless mode bash npm run e2e headless generate your project once started go to http localhost 7471 select your option and generate the code you want step by step and only what you need contributing we are honoured by any contributions you may have small or large please refer to our contribution guidelines and instructions document
https github com jhipster jhipster lite blob main contributing md for any information about contributing to the project sponsors support this project by becoming a sponsor become a sponsor https opencollective com generator jhipster or learn more about sponsoring the project https www jhipster tech sponsors thank you to our sponsors gold sponsors table tbody tr td align center valign middle a href https dev entando org jhipster target blank img width 200em src https www jhipster tech images open collective entandoe png a td tr tbody table bronze sponsors bronzesponsors bronze sponsors image bronze sponsors url backers thank you to all our backers backers backers image backers url jhipster lite release version https img shields io github v release jhipster jhipster lite jhipster lite release url https github com jhipster jhipster lite releases jhipster lite maven central version https img shields io maven central v tech jhipster lite jhlite color blue jhipster lite maven central url https repo maven apache org maven2 tech jhipster lite jhlite jhipster lite docker hub https img shields io badge docker 20hub jhipster 2fjhipster lite blue svg style flat jhipster lite docker hub version https img shields io docker v jhipster jhipster lite color 0073ec jhipster lite docker hub url https hub docker com r jhipster jhipster lite github actions jhlite image https github com jhipster jhipster lite workflows build badge svg github actions url https github com jhipster jhipster lite actions codecov image https codecov io gh jhipster jhipster lite branch main graph badge svg codecov url https codecov io gh jhipster jhipster lite jhipster image https raw githubusercontent com jhipster jhipster artwork main logos lite jhipster lite neon blue png jhipster url https www jhipster tech sonarcloud url https sonarcloud io project overview id jhipster jhipster lite sonarcloud quality gate https sonarcloud io api project badges measure project jhipster jhipster lite metric alert status 
sonarcloud maintainability https sonarcloud io api project badges measure project jhipster jhipster lite metric sqale rating sonarcloud bugs https sonarcloud io api project badges measure project jhipster jhipster lite metric bugs sonarcloud vulnerabilities https sonarcloud io api project badges measure project jhipster jhipster lite metric vulnerabilities sonarcloud security https sonarcloud io api project badges measure project jhipster jhipster lite metric security rating sonarcloud code smells https sonarcloud io api project badges measure project jhipster jhipster lite metric code smells sonarcloud coverage https sonarcloud io api project badges measure project jhipster jhipster lite metric coverage backers image https opencollective com generator jhipster tiers backer svg avatarheight 70 width 890 backers url https opencollective com generator jhipster bronze sponsors image https opencollective com generator jhipster tiers bronze sponsor svg avatarheight 120 width 890 bronze sponsors url https opencollective com generator jhipster devoxx jhlite https youtu be rnlgny vzli jdubois https twitter com juliendubois webservices with jhlite https youtu be meecprzjaji jhipster vs jhlite https youtu be t5ga329fmfu cdamon https www linkedin com in colin damon
java spring-boot jhipster generator hexagonal-architecture typescript vuejs hacktoberfest
front_end
dam
dam mobile application development the seminar support materials are available in the wiki section dam schedule 2023 2024 tuesday group 1088 10 30 room 2320 group 1084 12 00 room 2320 thursday group 1097 10 30 room 2013b contact email alexandru dita csie ase ro
front_end
JARVIS-OpenAI-Voice-Assistant
jarvis a real time voice assistant using openai api jarvis is an advanced voice enabled chatbot powered by openai s gpt 3 5 turbo utilizing state of the art natural language processing it delivers intelligent conversational interactions with users demo video https www youtube com watch v 2dpezmbcwpq features real time voice recognition and response customized ai behavior based on iron man s jarvis text to speech and speech to text capabilities using gtts pygame and speech recognition libraries user friendly and interactive experience installation clone the repository git clone https github com yourusername jarvis git install the required libraries pip install r requirements txt add your openai api key to the script openai api key your api key here run your code python main py how it works jarvis leverages the openai api to generate context aware responses based on user input the application uses the following libraries for audio processing gtts converts text to speech using google s text to speech api pygame plays audio files with adjustable speed and volume speech recognition transcribes audio input using google s speech recognition api contributing we welcome your contributions feel free to submit issues feature requests and pull requests to help improve jarvis license this project is licensed under the mit license we hope you enjoy using jarvis and look forward to seeing what you create
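the readme above says jarvis uses the openai api to generate context aware responses from user input a minimal python sketch of how a rolling conversation history could be assembled for a gpt 3 5 turbo chat call is shown below this is an illustration only build_messages and max_turns are hypothetical names not taken from the repository

```python
# Hypothetical sketch (not the repository's actual code) of how a rolling
# conversation history could be assembled for a gpt-3.5-turbo chat call.
# build_messages and MAX_TURNS are illustrative names, not from the project.

MAX_TURNS = 10  # keep only the most recent exchanges to bound token usage


def build_messages(system_prompt, history, user_text):
    """Return the messages list in the shape the chat completions API expects."""
    recent = history[-(MAX_TURNS * 2):]  # each turn is a user + assistant pair
    return (
        [{"role": "system", "content": system_prompt}]
        + recent
        + [{"role": "user", "content": user_text}]
    )


history = [
    {"role": "user", "content": "hello jarvis"},
    {"role": "assistant", "content": "at your service, sir"},
]
messages = build_messages("you are jarvis from iron man", history, "what time is it")
print(len(messages))  # system prompt + 2 history entries + new user message -> 4
```

keeping the system prompt first and trimming only the oldest turns is what lets the customized iron man style persona persist across a long session while staying under the model token limit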
ai
FlappyLearning
flappy learning demo http xviniette github io flappylearning program that learns to play flappy bird by machine learning neuroevolution http www scholarpedia org article neuroevolution alt tag https github com xviniette flappylearning blob gh pages img flappy png raw true neuroevolution js http github com xviniette flappylearning blob gh pages neuroevolution js utilization javascript initialize var ne new neuroevolution options default options values var options network 1 1 1 perceptron structure population 50 population by generation elitism 0 2 best networks kept unchanged for the next generation rate randombehaviour 0 2 new random networks for the next generation rate mutationrate 0 1 mutation rate on the weights of synapses mutationrange 0 5 interval of the mutation changes on the synapse weight historic 0 latest generations saved lowhistoric false only save score not the network scoresort 1 sort order 1 desc 1 asc nbchild 1 number of children per breeding update options at any time ne set options generate first or next generation var generation ne nextgeneration when a network is over save its score ne networkscore generation x score 0 you can see the neuroevolution integration in flappy bird in game js http github com xviniette flappylearning blob gh pages game js
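the elitism randombehaviour and mutation options above determine how each new generation is built a small python sketch of that partitioning follows note this is an illustration in python not the library's javascript and next_generation_counts and mutate_weight are hypothetical names

```python
import random

# Illustrative sketch of how the elitism and randomBehaviour rates could
# partition a generation, and how a synapse weight could be mutated.
# next_generation_counts and mutate_weight are hypothetical helper names.


def next_generation_counts(population, elitism, random_behaviour):
    """Split a generation into elites, fresh random networks, and bred children."""
    elites = round(population * elitism)            # best networks kept unchanged
    randoms = round(population * random_behaviour)  # brand-new random networks
    bred = population - elites - randoms            # the rest come from breeding
    return elites, randoms, bred


def mutate_weight(weight, mutation_rate=0.1, mutation_range=0.5):
    """Perturb a synapse weight with probability mutation_rate, within +/- mutation_range."""
    if random.random() <= mutation_rate:
        weight += random.random() * mutation_range * 2 - mutation_range
    return weight


print(next_generation_counts(50, 0.2, 0.2))  # (10, 10, 30)
```

with the default options a population of 50 keeps the 10 best networks injects 10 random ones and breeds the remaining 30 which is the exploration versus exploitation balance the option comments describe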
neuroevolution machine-learning flappybird
ai
Accelerating-Server-Side-Development-with-Fastify
# Accelerating Server-Side Development with Fastify

<a href="https://www.packtpub.com/product/accelerating-server-side-development-with-fastify/9781800563582"><img src="https://content.packt.com/b16496/cover_image_small.jpg" alt="" height="256px" align="right"></a>

This is the code repository for [Accelerating Server-Side Development with Fastify](https://www.packtpub.com/product/accelerating-server-side-development-with-fastify/9781800563582), published by Packt.

**A comprehensive guide to API development for building a scalable backend for your web apps**

## What is this book about?

This book is a complete guide to server-side app development in Fastify, written by the core contributors of this highly performant, plugin-based web framework. Throughout the book, you'll discover how it fosters code reuse, thereby improving your time to market.

This book covers the following exciting features:

- Explore the encapsulation techniques implemented by Fastify
- Understand how to deploy, monitor, and handle errors in a running Fastify instance
- Organize the project structure and implement a microservices architecture
- Explore Fastify's core features, such as code reuse, runtime speed, and much more
- Discover best practices for implementing Fastify in real-world RESTful apps
- Understand advanced backend development concepts such as performance monitoring and logging

If you feel this book is for you, [get your copy](https://www.amazon.com/dp/1800563582) today!

## Instructions and navigations

All of the code is organized into folders. For example, Chapter02.

The code will look like the following:

```json
{
  "id": 1,
  "name": "foo",
  "hobbies": ["soccer", "scuba"]
}
```

**Following is what you need for this book:**

This book is for mid- to expert-level backend web developers who have already used other backend web frameworks and are familiar with the HTTP protocol and its peculiarities. Developers looking to migrate to Fastify, evaluate its suitability for their next project, avoid architecture pitfalls, and build highly responsive and maintainable API servers will also find this book useful. The book assumes knowledge of JavaScript programming, Node.js, and backend development.

With the following software and hardware list you can run all the code files present in the book (Chapters 1-15).

### Software and hardware list

| Chapter | Software required | OS required |
| ------- | ----------------- | ----------- |
| 1-15 | Fastify 4 | Windows, Mac OS X, and Linux (any) |

We also provide a PDF file that has color images of the screenshots and diagrams used in this book. [Click here to download it](https://packt.link/df1dm).

## Related products

- Supercharging Node.js Applications with Sequelize [[Packt]](https://www.packtpub.com/product/supercharging-nodejs-applications-with-sequelize/9781801811552) [[Amazon]](https://www.amazon.com/dp/1801811555)
- Modern Frontend Development with Node.js [[Packt]](https://www.packtpub.com/product/modern-frontend-development-with-nodejs/9781804618295) [[Amazon]](https://www.amazon.com/dp/1804618292)

## Get to know the authors

**Manuel Spigolon** is a senior backend developer at NearForm and one of the core maintainers on the Fastify team. Manuel has developed and maintained a complex API that serves more than 10 million users worldwide.

**Maksim Sinik** is a senior engineering manager and a core maintainer of the Fastify framework. He has a decade of experience as a Node.js developer with a strong interest in backend scalability. He designed the architecture and led the development of several service-based Software-as-a-Service (SaaS) platforms across multiple industries that process hundreds of thousands of requests.

**Matteo Collina** is the co-founder and CTO of Platformatic.dev, whose goal is to remove all friction from backend development. He is also a prolific open source author in the JavaScript ecosystem, and the modules he maintains are downloaded more than 17 billion times a year. Previously, he was the chief software architect at NearForm, the best professional services company in the JavaScript ecosystem. In 2014, he defended his Ph.D. thesis, titled "Application Platforms for the Internet of Things". Matteo is a member of the Node.js Technical Steering Committee, focusing on streams, diagnostics, and HTTP. He is also the author of the fast logger Pino and of the Fastify web framework. Matteo is a renowned international speaker, having given talks at more than 60 conferences, including OpenJS World, Node.js Interactive, NodeConf.eu, NodeSummit, JSConf.Asia, WebRebels, and JSDay, to name just a few. Since August 2023, he has also served as a community director on the OpenJS Foundation. In the summer, he loves sailing the Sirocco.

## Other books by the authors

- [Node Cookbook, Third Edition](https://www.packtpub.com/product/node-cookbook-third-edition/9781785880087)
front_end
end-to-end-workshop-for-computer-vision
# End-to-End Computer Vision (CV) Training Workshop

This is an end-to-end CV MLOps workshop aimed at helping machine learning (ML) and data science (DS) teams build the relevant AWS and SageMaker competencies for an enterprise-scale solution. The content is derived from a real-world CV use case in which an image classification model is developed and trained on SageMaker and then deployed to edge computing devices.

Here is a diagram overview of the workshop and the learning outcome for each module:

![Workshop overview](statics/cv_workshop_overview.png)

The curriculum consists of the following modules:

1. [Data labeling (optional)](01_groundtruth_optional/README.md)
2. [Preprocessing](02_preprocessing/README.md)
3. [Training on SageMaker](03_training/README.md)
4. [Advanced training on SageMaker](04_advanced_training/README.md)
5. [Model evaluation and model explainability](05_model_evaluation_and_model_explainability/README.md)
6. [SageMaker training pipeline](06_training_pipeline/README.md)
7. [Edge deployment](07_edge_deployment/README.md)
8. [End to end](08_end_to_end/README.md)

To get started, load the provided Jupyter notebooks and associated files into your SageMaker Studio environment.

## Security

See [CONTRIBUTING](CONTRIBUTING.md#security-issue-notifications) for more information.

## License

This library is licensed under the MIT-0 License. See the LICENSE file.
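As a flavor of the preprocessing module, the sketch below shows one common pattern for image-classification datasets: a deterministic, hash-based train/validation split, so that reruns of a preprocessing job assign every image to the same split. The function name and split logic are illustrative assumptions, not the workshop's actual code:

```python
# Hedged sketch of a reproducible dataset split; assign_split is an
# assumed helper name, not part of this workshop's notebooks.
import hashlib

def assign_split(image_key, val_fraction=0.2):
    """Hash the image's object key so the same image always lands in the
    same split, keeping training runs reproducible across reruns."""
    digest = hashlib.md5(image_key.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % 100          # stable pseudo-random 0..99
    return "validation" if bucket < val_fraction * 100 else "train"
```

A split derived from the key itself (rather than from `random.shuffle`) survives dataset growth: adding new images never moves existing ones between train and validation channels.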
ai
machine-learning-nd
# Machine Learning ND

Udacity's Machine Learning Nanodegree: project files and notes.

This repository contains project files and lecture notes for [Udacity's Machine Learning Engineer Nanodegree program](https://www.udacity.com/course/machine-learning-engineer-nanodegree--nd009), which I started working on in September 2016.

The Machine Learning Engineer Nanodegree is an online certification. It involves:

1. Courses in supervised learning, unsupervised learning, and reinforcement learning
2. Six projects (P0-P5 in this directory)

Courses include lecture videos, quizzes, and programming problems. These courses were developed by Georgia Tech, Udacity, Google, and Kaggle.

This directory includes lecture notes (lesson notes) and project code (P0 to P5).

See also my notes for [Udacity's Data Analyst Nanodegree](https://www.udacity.com/course/data-analyst-nanodegree--nd002-v-a).

## Program outline

0. Exploratory project: Titanic Survival Exploration
1. Model Evaluation and Validation (Project 1: Predicting Boston Housing Prices)
2. Supervised Learning (Project 2: Building a Student Intervention System, i.e. predicting whether or not students will fail so schools can intervene to help them graduate)
3. Unsupervised Learning (Project 3: Creating Customer Segments, i.e. segmenting customers based on spending in different categories)
4. Reinforcement Learning (Project 4: Train a Smartcab to Drive, implementing the Q-learning algorithm)
5. Machine learning specialisation of choice
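The Q-learning algorithm behind the smartcab project boils down to one tabular update rule. The sketch below is a generic, illustrative implementation; the variable names and the `alpha`/`gamma` defaults are assumptions, not the project's code:

```python
# One tabular Q-learning step:
#   Q(s, a) <- Q(s, a) + alpha * (r + gamma * max_a' Q(s', a') - Q(s, a))
from collections import defaultdict

def q_update(Q, state, action, reward, next_state, alpha=0.5, gamma=0.9):
    """Update the action-value table Q in place after one transition."""
    best_next = max(Q[next_state].values(), default=0.0)
    Q[state][action] += alpha * (reward + gamma * best_next - Q[state][action])

# A nested defaultdict zero-initializes unseen state/action pairs.
Q = defaultdict(lambda: defaultdict(float))
q_update(Q, "s0", "forward", 1.0, "s1")
```

In the smartcab setting, the state would encode the traffic inputs and the next waypoint, and the agent would pick actions epsilon-greedily from `Q[state]`.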
ai