Dataset columns:
- names: string (lengths 1–98)
- readmes: string (lengths 8–608k)
- topics: string (lengths 0–442)
- labels: string (6 classes)
Data-Engineering
Data Engineering for Data Science — a comprehensive GitHub repository covering all aspects of data engineering in the context of data science, including data ingestion, storage, processing, pipelines, cloud computing, and best practices. Get hands-on examples and resources for building efficient data workflows.

This repository aims to provide a comprehensive collection of resources and examples for data engineering in the context of data science. It covers a wide range of topics and techniques that are essential for effectively managing and processing data in data science projects.

Table of contents:
1. Introduction to data engineering: overview of data engineering; role of data engineering in data science.
2. Data ingestion: the extract-transform-load (ETL) process; batch and real-time data ingestion techniques; introduction to popular data ingestion tools (e.g., Apache Kafka, Apache NiFi).
3. Data storage and management: relational databases (e.g., PostgreSQL, MySQL); NoSQL databases (e.g., MongoDB, Apache Cassandra); data warehousing and OLAP (e.g., Snowflake, Amazon Redshift); introduction to data lake architecture.
4. Data processing: data cleaning and preprocessing techniques; batch processing frameworks (e.g., Apache Spark); stream processing frameworks (e.g., Apache Flink); introduction to the Apache Hadoop ecosystem.
5. Data pipelines and workflow orchestration: introduction to workflow orchestration tools (e.g., Apache Airflow, Luigi); building and managing data pipelines; data pipeline monitoring and error handling.
6. Data quality and governance: data validation and quality checks; data lineage and metadata management; introduction to data governance frameworks.
7. Data integration and APIs: building data integration workflows; working with RESTful and GraphQL APIs; introduction to API design and best practices.
8. Cloud computing and big data platforms: introduction to cloud computing providers (e.g., AWS, Azure, GCP); deploying data engineering workflows on the cloud; big data platforms and technologies (e.g., Apache Hadoop, Apache Spark).
9. Scalable data processing: introduction to distributed computing; parallel processing techniques; scaling data engineering workflows.
10. Data engineering best practices: design patterns for data engineering; performance optimization techniques; testing and monitoring data engineering workflows.
11. Case studies and examples: real-world data engineering use cases; code examples and demos.

Contributing: if you would like to contribute to this repository, please refer to the CONTRIBUTING.md file for guidelines on how to contribute. License: this repository is licensed under the MIT license; please see the LICENSE file for more details. Feel free to customize this repository structure and add more topics based on your specific requirements.
cloud
IssuesLanguageModel
Note: this repo is deprecated; development has continued at https://github.com/kubeflow/code-intelligence (Issue_Embeddings). (Badges: Python 3.7, MIT license, powered by fastai, Weights & Biases.)

A language model trained on 16M GitHub issues, for transfer learning.

Motivation: Issue Label Bot (https://github.com/machine-learning-apps/Issue-Label-Bot) predicts three generic issue labels: bug, feature request, and question. However, it would be nice to predict personalized issue labels instead of generic ones. To accomplish this, we can use the issues that are already labeled in a repository as training data for a model that predicts personalized issue labels. One challenge with this approach is that there is often only a small number of labeled issues in each repository. To mitigate this concern, we utilize transfer learning (http://nlp.fast.ai) by training a language model over 16 million GitHub issues and fine-tuning it to predict issue labels.

End product: an API that returns embeddings from GitHub issue text. The manifest files in the deployment/ folder define a service that returns 2,400-dimensional embeddings given the text of an issue. The API endpoints are hosted on https://gh-issue-labeler.com. All routes expect POST requests with a header containing a token field. Endpoints:

1. https://gh-issue-labeler.com/text expects a JSON payload of `title` and `body` and returns a single 2,400-dimensional vector that represents latent features of the text. For example, this is how you would interact with this endpoint from Python:

```python
import requests
import json
import numpy as np
from passlib.apps import custom_app_context as pwd_context

API_ENDPOINT = 'https://gh-issue-labeler.com/text'
API_KEY = 'YOUR_API_KEY'  # contact maintainers to get this

# a toy example of a GitHub issue title and body
data = {'title': 'fix the issue',
        'body': 'I am encountering an error\n when trying to push the button.'}

# sending a POST request and saving the response as a response object
r = requests.post(url=API_ENDPOINT,
                  headers={'Token': pwd_context.hash(API_KEY)},
                  json=data)

# convert the string back into a numpy array
embeddings = np.frombuffer(r.content, dtype='<f4')
```

2. https://gh-issue-labeler.com/all_issues/owner/repo (still under construction): this will return a numpy array of shape (# of labeled issues in the repo, 2400), as well as a list of all the labels for each issue.

Training the language model: the language model is built with the fastai (http://nlp.fast.ai) library. The notebooks/ folder contains a tutorial of the steps you need to build a language model:

1. 01_AcquireData.ipynb describes how to acquire and pre-process the data using mdparse (https://github.com/machine-learning-apps/mdparse), which parses and annotates markdown files.
2. 02_fastai_DataBunch.ipynb: the fastai library uses an object called a DataBunch (https://docs.fast.ai/basic_data.html) around PyTorch's DataLoader class to encapsulate additional metadata and functionality. This notebook walks through the steps of preparing this data structure, which will be used by the model for training.
3. 03_Create_Model.ipynb walks through the process of instantiating the fastai language model, along with callbacks for early stopping, logging, and saving of artifacts. Additionally, this notebook illustrates how to train the model.
4. 04_Inference.ipynb shows how to use the language model to perform inference in order to extract latent features, in the form of a 2,400-dimension vector, from GitHub issue text. This notebook shows how to load the DataBunch and model, and save only the model for inference. flask_app/inference.py contains utilities that make the inference process easier.

Putting it all together — hyper-parameter tuning: the hyperparam_sweep/ folder contains lm_tune.py, a script used to train the model. Most importantly, we use this script in conjunction with hyper-parameter sweeps in Weights & Biases (https://docs.wandb.com/docs/sweep.html). We were able to try 538 different hyper-parameter combinations, using Bayesian and random grid search concurrently, to choose the best model (parallel-coordinates plot in hyperparam_sweep/images/). The hyperparameter tuning process is described in greater detail in the hyperparam_sweep/ folder.

Files:
- notebooks/: notebooks on how to gather and clean the data and train the language model.
- hyperparam_sweep/: instructions on doing a hyper-parameter sweep with Weights & Biases (https://www.wandb.com).
- flask_app/: code for the Flask app that is the API listening for POST requests.
- script/: this directory contains the entry point for running the REST API server that end users will interface with. script/dev: this bash script pulls the necessary Docker images and starts the API server. script/bootstrap: this re-builds the Docker image and pushes it to DockerHub; it is necessary to re-build the container any time the code for the Flask app or language model is updated.
- deployment/: files that are helpful in deploying the app. deployment/Dockerfile is the definition of the container used to run the Flask app; the build for this container is hosted on DockerHub at hamelsmu/issuefeatures-api-cpu (https://hub.docker.com/r/hamelsmu/issuefeatures-api-cpu). The YAML files relate to a Kubernetes deployment.

Appendix — location of language model artifacts (Google Cloud Storage):
- model for inference (965 MB): https://storage.googleapis.com/issue_label_bot/model/lang_model/models_22zkdqlr/trained_model_22zkdqlr.pkl
- encoder, for fine-tuning with a classifier (965 MB): https://storage.googleapis.com/issue_label_bot/model/lang_model/models_22zkdqlr/trained_model_encoder_22zkdqlr.pth
- fastai DataBunch (27.1 GB): https://storage.googleapis.com/issue_label_bot/model/lang_model/data_save.pkl
- checkpointed model (2.29 GB): https://storage.googleapis.com/issue_label_bot/model/lang_model/models_22zkdqlr/best_22zkdqlr.pth
- Weights & Biases run: https://app.wandb.ai/github/issues_lang_model/runs/22zkdqlr/overview
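The /text endpoint, the title/body payload, and the np.frombuffer decoding above come straight from this README; the sketch below only adds a thin wrapper and a cosine-similarity comparison between two issues, so the get_embedding helper, the example issue text, and the API-key handling are illustrative assumptions rather than part of the original project.

```python
# Minimal sketch: compare two issues via cosine similarity of the 2,400-dim
# embeddings returned by the /text endpoint described above. The endpoint and
# payload format are from the README; get_embedding is an illustrative helper.
import numpy as np
import requests
from passlib.apps import custom_app_context as pwd_context

API_ENDPOINT = "https://gh-issue-labeler.com/text"
API_KEY = "YOUR_API_KEY"  # placeholder; contact the maintainers for a real key


def get_embedding(title: str, body: str) -> np.ndarray:
    """POST an issue's title/body and decode the 2,400-dim float32 vector."""
    r = requests.post(API_ENDPOINT,
                      headers={"Token": pwd_context.hash(API_KEY)},
                      json={"title": title, "body": body})
    r.raise_for_status()
    return np.frombuffer(r.content, dtype="<f4")


a = get_embedding("App crashes on login", "Stack trace attached ...")
b = get_embedding("Crash when signing in", "Happens on Android 11 ...")

# cosine similarity as a rough duplicate/relatedness signal
cos = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
print(f"cosine similarity: {cos:.3f}")
```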
natural-language-processing fastai pytorch deep-learning transfer-learning machine-learning data-science
ai
SH-Mobile-Development-Track-Group-1
Login/Signup — a new Flutter project. Getting started: Flutter sign-in and sign-up page, Group 1, mobile application track, SideHustle Internship 4.0.

Group list: Owadayo Samuel Damilola (SH IT 0069262), Obasohan Alex Osaromaa (SH IT 0091399), Iniubong Roberts (SH IT 006034), Akwasi Adenkyekye Badu Akwasi, Aigbe Osahon Victory (SH IT 0090894), Mayowa Abayomi Alliu (SH IT 0099655), Abdulkarim Moshood (SH IT 0079969), Hilda Anima Offei (SH IT 0063308), Omotoso Oluwapelumi Ayodeji (SH IT 0072410), Happiness Balogun (SH IT 0041818), Oussadi Karim (SH IT 0010110), Umoru Asana Odion (26220), Saka Sheriff Alade (SH IT 0106751), Ezra Obodoh (SH IT 0070905), Desmond Emmanuel Quaye (SH IT 0010371), Agboola Nurudeen Omodunmola (SH IT 0074596), Gedion Ong'era Onsongo (SH IT 010404), Sadiq Faiz (SH IT 0093019), Gedion Onge'ra Onsongo (SH IT 0010404), Violet Eseohe Ehiaguina (SH IT 0056476), Ayilara Damilola Mujeeb (SH IT 0100412), Sorunke Hassan (SH IT 0016037), Olorunda Ayo Isaac (SH IT 0057972), Olorunfemi Sharon (SH IT 0097917), Olayemi Tega Olugbon (SH IT 00765), Chukwudire Divine Daniel (SH IT 0051172), Adeyemo Aishat Folashade (SH IT 0113711), Menyene Moses Thomas (SH IT 005340), Baffoe Eric Junior (SH IT 0050923), Olumide Olayiwola (39722).

Screenshots are included for the group list, register page, login page, and profile page. Issues from group members: if your name is not in this list, or you have corrections to make, kindly edit the list and push back. Thanks.
front_end
AM335X-FreeRTOS-lwip
AM335x FreeRTOS + lwIP netconn API with FDs. A port of the popular FreeRTOS with lwIP netconn, based on the Cortex-A9 port, with replacements for the INTC of the AM335x instead of the GIC. Fully nested interrupts and critical sections are working; the lwIP netconn layer is ported, though not rigorously tested.

Prerequisites: CMake; a GNU ARM toolchain built on gcc-arm 8.2 2019.01 (i686 mingw32 arm-eabi); Python 2 or 3. Optional tools: CCS, VS Code, a terminal emulator.

Steps to build: after a checkout/update of all submodules, navigate to the folder and invoke the am335xfreertos cmake-makefile-args Python script; this should generate your CMake files in the necessary places. Configure the build folder with CMake, using gcc as the toolchain (the win10 sample CMake can be used as a reference if required). Navigate to the build folder and run make.

Dealing with the lwIP netconn and the send/recv minus/plus kerfuffle: the lwIP netconn APIs, while charming, are a bit of a kerfuffle with the netconn events (aka recv plus/minus, send plus/minus). The events themselves are stateless and have to be added to the context of the event to make sense of them. Below is an attempt at a classification of the same for a 3-thread model using the FDs, for a TCP client. Inputs: len, type, API, from, context, write enabled; outputs: resulting context, flags. Columns: context, netconn event, len, type, API, from, write enabled, resulting context, read/write thread enable, lwIP state. Row data as recorded: shutdn r 0 client tcp netconn lwip err tcp 0 shutdn na err shutdn reset 0 client tcp app application 0 idle na idle idle s 0 client tcp all lwip lwip netconn do connected 0 running trc tx handler activate running idle err 0 client tcp app application 0 shutdn na err running err 0 client tcp netconn lwip err tcp 0 shutdn na err running err 0 client tcp netconn lwip lwip netconn do close internal 0 shutdn na err running r 0 client tcp netconn lwip netconn recv data tcp 0 shutdn na close rx running r 0 client tcp netconn lwip lwip netconn do delconn 0 shutdn na close running r 0 client tcp netconn lwip lwip netconn do close internal 0 shutdn na close rx running r 0 client tcp netconn lwip err tcp 0 shutdn na err running s 0 client tcp netconn lwip lwip netconn do writemore 1 running na hold write running s 0 client tcp netconn lwip lwip netconn do writemore 1 running na hold write running s 0 client tcp common poll tcp 0 running trc tx handler activate write more running s 0 client tcp netconn lwip lwip netconn do close internal 1 shutdn na close tx running s 0 client tcp netconn lwip lwip netconn do delconn 1 shutdn na close running s 0 client tcp netconn err tcp 1 shutdn na closed tx running s 1 client tcp netconn lwip sent tcp 0 running trc tx handler activate write more running r 1 client tcp netconn lwip netconn recv data x running trc rx handler activate data arrived running r 1 client tcp netconn lwip recv tcp x running data arrived trc rx handler activate data arrived running r 1 client tcp raw lwip recv raw x running data arrived trc rx handler activate data arrived running r 1 client tcp netconn lwip recv udp x running data arrived trc rx handler activate data arrived running r 0 server tcp socket lwip accept function x err notmapped notmapped running r 0 server tcp socket lwip accept function x err notmapped notmapped running r 0 server tcp netconn lwip netconn accept x err notmapped notmapped running r 0 server tcp netconn lwip netconn accept x accepted notmapped notmapped running r 0 server tcp socket lwwip lwip accept x notmapped notmapped
os
ceu-data-platform-in-the-cloud-class
CEU Data Platform in the Cloud class. Course page: https://courses.ceu.edu/courses/2020-2021/data-engineering-4-building-data-platform-cloud. The files in this repo serve as supplementary materials to the class.
cloud
Python_Natural_Language_Processing
(Badges: maintainer, last commit, commit activity, views, stars, forks, repo size, pull requests, issues, top language, MIT license, contributors; buttons to run the notebooks online on Binder and Google Colab.)

Python Natural Language Processing — this repository contains tutorials on important topics related to natural language processing (NLP), each provided as a Jupyter notebook in the repository:

01. Tokenization
02. Stemming and lemmatization
03. Stopwords
04. Vocabulary and matching
05. POS basics
06. Named entity recognition
07. Sentence segmentation
08. Stemming
09. Bag-of-words / n-grams
10. TF-IDF

These are read-only versions; however, you can run all the code online via Binder (https://mybinder.org/v2/gh/milaan9/Python_Natural_Language_Processing/HEAD).

Frequently asked questions

How can I thank you for writing and sharing this tutorial? You can star and fork it — starring and forking is free for you, but it tells me and other people that it was helpful and that you like this tutorial. Go to the repository (https://github.com/milaan9/Python_Natural_Language_Processing), if you aren't here already, and click the "Star" and "Fork" buttons in the top right corner. You will be asked to create a GitHub account if you don't already have one.

How can I read this tutorial without an internet connection?
1. Go to the repository and click the big green "Code" button in the top right of the page, then click "Download ZIP".
2. Extract the ZIP and open it. Unfortunately I don't have any more specific instructions, because how exactly this is done depends on which operating system you run.
3. Launch IPython Notebook from the folder which contains the notebooks. Open each one of them and use Kernel > Restart & Clear Output; this will clear all the outputs, and now you can understand each statement and learn interactively.

If you have Git and you know how to use it, you can also clone the repository instead of downloading a ZIP and extracting it. An advantage of doing it this way is that you don't need to download the whole tutorial again to get the latest version: all you need to do is pull with Git and run IPython Notebook again.

Authors: I'm Dr. Milaan Parmar and I have written this tutorial. If you think you can add, correct, edit, or enhance this tutorial, you are most welcome — see GitHub's contributors page for details. If you have trouble with this tutorial, please tell me about it by creating an issue on GitHub, and I'll make this tutorial better. This is probably the best choice if you had trouble following the tutorial and something in it should be explained better. You will be asked to create a GitHub account if you don't already have one. If you like this tutorial, please give it a star.

Licence: you may use this tutorial freely at your own risk; see LICENSE. Copyright (c) 2020 Dr. Milaan Parmar.

Connect with me: LinkedIn (https://www.linkedin.com/in/milaanparmar), Instagram (https://www.instagram.com/milaanparmar9), Facebook (https://www.facebook.com/milaanparmar), email (milaanparmar9@gmail.com).
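The notebooks listed above cover tokenization, stopwords, and stemming; as a rough illustration of those three topics (not code taken from the repository's notebooks), here is a minimal NLTK example — the sample sentence is made up:

```python
# Minimal illustration of tokenization, stopword removal, and stemming with NLTK.
# Not taken from the repository's notebooks; the example text is invented.
import nltk
from nltk.tokenize import word_tokenize
from nltk.corpus import stopwords
from nltk.stem import PorterStemmer

# one-time downloads of the required NLTK data
nltk.download("punkt", quiet=True)
nltk.download("stopwords", quiet=True)

text = "Natural language processing turns raw text into features a model can use."

tokens = word_tokenize(text)                        # topic 01: tokenization
stop = set(stopwords.words("english"))
content = [t for t in tokens                        # topic 03: stopword removal
           if t.isalpha() and t.lower() not in stop]
stems = [PorterStemmer().stem(t) for t in content]  # topics 02/08: stemming

print(tokens)
print(stems)
```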
nlp tokenization stemming lemmatization stopwords partofspeech-tagger named-entity-recognition sentence-segmentation bag-of-words termfrequency inversedocumentfrequency tf-idf vocabulary-matching python4datascience python4everybody tutor-milaan9 ipython-notebook
ai
e-tuition
e-Tuition. Project demonstration: https://youtu.be/wwx4x7lme94. This is an e-learning web app built using a React front end, a C# back end, and an MSSQL database. Project members: Inzamam, Alvee, Abir.

Required software to run the project: Microsoft Visual Studio 2019 (16.8.5 or higher), Microsoft SQL Server Management Studio 18.4, Node. The SQL Server needs to run in SQL Server authentication mode with appropriate port settings; then the database should be created using the database.sql file. Edit the appsettings.json file so the default connection string reads Server=DESKTOP-7RRRIQQ; Database=test; User Id=sa; Password=123456 (DESKTOP-7RRRIQQ should be changed to your PC name / SQL Server name). Run npm install in the e-tuition/e-tuition/ClientApp folder to install the node modules; then the e-tuition.sln file should be run.

Project features: there is a home section for teachers and students; they can sign up and sign in separately from home. A teacher can create courses and supply materials in each course; for this, a teacher needs to be verified by an admin. A student can enroll in courses and get the materials from a course. There is an admin panel at the admin login route, which is separate from the main home page; an admin can see various statistics of the website and verify a teacher.
server
pyloom
PyLoom — an event-sourcing framework for building large language model applications. PyLoom is a framework designed to streamline the development of intricate LLM applications. Drawing inspiration from event sourcing (https://martinfowler.com/eaaDev/EventSourcing.html), PyLoom applies this concept to LLM agent development, offering a range of features that enable a better dev experience. Developing intricate and non-deterministic LLM agents involves making multiple LLM calls and intricate control structures, similar to constructing a marble machine; if an error arises, developers are frequently compelled to rerun the entire agent workflow. Developers care not only about the agent's final outcome but also about the steps it takes to arrive there; at times we want to see how tweaks in between, like extra tools or adjusted prompts, can affect the final outcome. PyLoom is crafted to address these challenges.

Features:
- Git for agents: PyLoom tracks state changes and the evolution of the agent at each step.
- Agent replay: with PyLoom, developers can replay and navigate through the agent's flow. Once you encounter an error in the agent flow, you can fix the issue and restart from the exact point of failure.
- Event sourcing: PyLoom employs event sourcing, representing agent actions as events. By replaying the event stream, the agent's state can be reconstructed. Furthermore, developers can apply the same event streams to different agents and compare performance.
- Reproduce production issues: leveraging event sourcing, PyLoom facilitates reproducing production errors by replaying the identical event streams in a development environment.
- PyLoom can be used with other LLM frameworks like LangChain and Guidance.

Installation: you can install PyLoom using pip:

```python
pip install pyloom
```

Quick start: see notebooks/tutorial.ipynb. Contributing: we are extremely open to contributions in various forms — bug fixes, new features, improved documentation, and pull requests. Your input is valuable to us.
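The README above explains the event-sourcing idea (agent actions recorded as events, state rebuilt by replaying them) but does not show PyLoom's concrete API. The sketch below is a generic Python illustration of that pattern under those stated assumptions; the Event, AgentState, apply, and replay names are invented for illustration and are not PyLoom's interface.

```python
# Generic event-sourcing sketch for an LLM-agent state; NOT PyLoom's actual API.
# All names here (Event, AgentState, apply, replay) are illustrative assumptions.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Event:
    kind: str       # e.g. "llm_call" or "tool_call"
    payload: dict


@dataclass
class AgentState:
    messages: List[str] = field(default_factory=list)
    tool_results: List[str] = field(default_factory=list)


def apply(state: AgentState, event: Event) -> AgentState:
    """Fold one event into the state -- the core move of event sourcing."""
    if event.kind == "llm_call":
        state.messages.append(event.payload["completion"])
    elif event.kind == "tool_call":
        state.tool_results.append(event.payload["result"])
    return state


def replay(events: List[Event], upto: Optional[int] = None) -> AgentState:
    """Rebuild the agent state from the event log, optionally stopping early,
    which is what makes 'restart from the exact point of failure' possible."""
    state = AgentState()
    for event in events[:upto]:
        state = apply(state, event)
    return state


log = [
    Event("llm_call", {"completion": "Plan: search the docs"}),
    Event("tool_call", {"result": "3 matching pages"}),
    Event("llm_call", {"completion": "Answer: see page 2"}),
]
print(replay(log, upto=2))  # state just before the final LLM call
```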
ai
StartedQuartus
StartedQuartus: this project consists of 2-bit, 8-bit, and 16-bit adder circuits and a 32-bit ALU with its schematic file, along with a few comparator circuits. The project helps in understanding how these various digital circuits work and provides hands-on experience with Quartus Prime.
quartusprime
os
blockchain-crypto-mpc
Important note: as of September 1, 2021, this repository will be converted to read-only and support will end. Therefore, starting on September 1, 2021, the following changes will occur: (1) Unbound will stop maintenance for this repo; (2) Unbound will not release any future versions of this repo; (3) Unbound will not provide any security updates or hotfixes should an issue arise; (4) Unbound will stop support for this repo. This library was originally created to enable users to experiment with MPC functionality using a subset of the functionality provided in Unbound products. Users are encouraged to explore the new Unbound CORE offerings found on the Unbound Security website (https://www.unboundsecurity.com).

1. Introduction to blockchain-crypto-mpc. blockchain-crypto-mpc is an open-source library released by Unbound Security (https://www.unboundsecurity.com) that provides the cryptographic foundation to resolve one of the hardest challenges associated with crypto-asset and blockchain applications: the protection of cryptographic signing keys and seed secrets. Unbound leverages secure multiparty computation (MPC) for the protection and management of cryptographic keys and secrets, and provides industry-grade MPC-based solutions for key management, key protection, and additional applications. The protocols were designed by Prof. Yehuda Lindell and Dr. Samuel Ranellucci, who also reviewed and approved the code and the implementation. See the Unbound Cryptocurrency Wallet Library white paper in the docs/ folder for more detailed information about the protocols. This README includes an overview of this library: why it is important, what it allows you to achieve, sample use cases, and how to use the library (high-level description).

2. Who should use it? blockchain-crypto-mpc provides 100% of the cryptography needed for strongly securing crypto-asset and blockchain wallets, while being as or more secure than dedicated cryptographic hardware, for free. If you're developing a wallet or a platform for custody or exchange of crypto assets, you're in the right place. Leveraging MPC for the protection of cryptographic keys and secrets, blockchain-crypto-mpc provides the security benefits of no single point of compromise and shared responsibility (like multi-sig), but with a single signature and without any dependency on the ledger. More specifically, it provides three critical security properties:
1. Sensitive keys and secrets are split into two random shares, which are stored on separate, segregated machines ("machine" stands for any computing device). Each of these shares by itself reveals nothing whatsoever about the key material.
2. All cryptographic operations performed throughout the key lifecycle are performed without ever combining these two shares together; this includes signing, derivation, and even generation. Bottom line: there is no complete key material or secret in memory, ever. It is proven mathematically that obtaining the key material requires access to both key shares, and therefore requires compromising both machines. A single machine, even if completely compromised and controlled by an attacker, reveals nothing about the key material, simply because the key material never resides in any single machine.
3. Key-share refresh: the key shares are continually modified without modifying the key itself. This is computationally efficient and can be performed very frequently, thus forcing the attacker to compromise both machines at virtually the same time in order to obtain key material. (Figure: key shares.)

blockchain-crypto-mpc includes a secure MPC implementation of 100% of the functionality required to strongly secure crypto-asset and blockchain wallets. It's pure software, open source, and free to use. It is highly recommended for developers of wallets and blockchain applications that deal with key management. We are delighted to make this contribution to the open-source community, with hopes that it will enable secure, convenient, and easy-to-use blockchain applications for all.

3. What's included. blockchain-crypto-mpc includes a secure MPC implementation of the following algorithms: 2-party ECDSA secp256k1 generation and signing; 2-party EdDSA Ed25519 generation and signing; 2-party BIP32 (based on the BIP32 specification, https://github.com/bitcoin/bips/blob/master/bip-0032.mediawiki) generation, hard derivation, and normal derivation; key-share refresh; zero-knowledge backup. The source code is written in C++ and the external API in C. Detailed documentation, including a whitepaper and security proofs, will be available online soon. It can be compiled on virtually any platform; the only dependency is OpenSSL, which is available on most platforms (instructions on how to remove this dependency will be included in future documentation). The compiled binary is a cryptographic library that has to be deployed on two or more separate machines to provide strong security.

4. What are the typical use cases? blockchain-crypto-mpc can be used to provide security in any blockchain app. In this section we describe typical use cases that are relevant to many applications.

4.1 Endpoint/server use case. This use case is common for wallet service providers. The user has a mobile wallet on their endpoint device, typically a mobile phone or laptop, and the wallet application communicates with a server application. 4.1.1 Suggested setup: the BIP32 seed and all signing keys are always split between the end user's device (participant 1) and the service provider (participant 2); performing any cryptographic operation on the seed or private key requires cooperation of both participants and communication between them. (Figure: endpoint/server use case.) 4.1.2 Use case properties: guaranteed non-repudiation — the application server cannot sign any transaction without cooperation from the endpoint device; no single point of compromise — compromising the seed or key material requires the attacker to compromise both the server and the endpoint simultaneously; no key or seed material ever appears in the clear throughout its lifecycle, including while in use and during generation; resilient to side-channel attacks; a model that empowers a crypto service provider to create an excellent user experience by delivering a wallet service, while maintaining a very high security level and granting the users full control of their crypto assets.

4.2 Mobile/laptop use case. This use case involves two end-user devices that typically belong to the same user, for example a mobile phone and a laptop. Each device runs an app, and both participants collaborate to create a secure blockchain wallet and sign transactions. 4.2.1 Suggested setup: the BIP32 seed and all signing keys are always split between the mobile device (participant 1) and the laptop (participant 2); performing any cryptographic operation on the seed or private key requires cooperation of both participants and communication between them. (Figure: mobile/laptop use case.) 4.2.2 Use case properties: both devices must collaborate and approve any transaction — no single device can approve a transaction; no single point of compromise — compromising the seed or key material requires the attacker to compromise both the laptop and the mobile device simultaneously; no key or seed material ever appears in the clear throughout its lifecycle, including while in use and during generation; resilient to side-channel attacks; a model that empowers a wallet provider to create an excellent user experience while maintaining a very high security level and granting the users full control of their crypto assets.

4.3 Backup. Backup is one of the most challenging aspects of crypto-asset key management. This section briefly describes the backup functionality of blockchain-crypto-mpc and two potential usage scenarios. blockchain-crypto-mpc includes a unique backup mechanism that introduces zero-knowledge backup: an encrypted cold backup that allows public verifiability. This property is significant, as it allows both participants to verify the correctness of the backup at any point in time without decrypting it; it therefore makes this verification secure and prevents a situation where a wrong backup was generated and stored. 4.3.1 Backup use case 1: user-managed. This is a common form of backup, with the role of backup management mostly on the end user. An encrypted backup of the wallet can be stored in multiple locations for redundancy; for example, it can be stored by the service provider, as described in the endpoint/server use case. The private key for this backup should be in the user's sole possession, preferably in a cold backup. The backup recovery process should be used only for disaster recovery. 4.3.2 Backup use case 2: managed backup. This scenario is an expansion of the endpoint/server use case that includes a third-party trustee service. The trustee service is used only when the user's device and/or the service provider have lost their respective key shares. (Figure: managed backup use case.) This model creates a user-transparent backup, effectively similar to a 2-of-3 scenario: each quorum containing 2 of the 3 participants noted above would suffice to perform a cryptographic operation. This is achieved by creating three different random share pairs upon wallet and seed generation. In the diagram, key share A is used by the user's device and the trustee service, key share B is used by the user's device and the wallet service provider, and key share C is used by the wallet service provider and the trustee service. It's important to highlight that each of these pairs is completely independent; each is effectively a backup of the same seed.

5. Benchmarking and performance. This repository includes two different tools for benchmarking the blockchain-crypto-mpc library:
1. mpc-crypto-bench, a native tool (see the README in the bench/ folder for more information).
2. The mpc-crypto Python script (see the README in the python/ folder for more information).

mpc-crypto-bench tests the raw protocols with no networking involved, while the mpc-crypto Python script uses a client and server with actual networking. Using the Python script, each command was run for 20 iterations and resulted in the following performance numbers:

| algorithm | command  | time (seconds) |
|-----------|----------|----------------|
| ECDSA     | generate | 0.945          |
| ECDSA     | sign     | 0.015          |
| EdDSA     | generate | 0.003          |
| EdDSA     | sign     | 0.003          |

The tests were run on a server with an Intel Xeon E5-2686 v4 @ 2.30 GHz and 32 GB RAM.

6. Technical overview and usage guidelines. Unbound's blockchain-crypto-mpc open-source library provides functions that enable you to create, sign, and refresh encryption keys without the whole key ever existing in any location. This library can be used to create a system with two peers for the management of the keys. Each peer uses the library to create and process messages that are sent between the peers (note that the actual communication between peers is not included in this library). (Figure: blockchain-crypto-mpc system.)

6.1 Definitions. blockchain-crypto-mpc utilizes the following three structures:
1. Key share — encryption keys never exist as complete keys in any phase of the process or in any location at any time; a key share is a piece of a key, which can be used by Unbound's MPC technology to sign transactions.
2. Message — data that is passed to the other peer; the message contains information about the action in progress.
3. Context — since each action, such as signing with a key, involves multiple messages between the two peers, the status of the action is preserved in a context.

The key share, message, and context contain varying amounts of information depending on the type of action, and therefore they are structures.

6.2 Actions. The library provides the following actions: 2-party ECDSA secp256k1 generation and signing; 2-party EdDSA Ed25519 generation and signing; 2-party BIP32 generation, hard derivation, and normal derivation; key-share refresh; zero-knowledge backup. The library also provides mechanisms to handle serialization, deserialization, and memory management for the key share, message, and context structures.

6.3 System flow. The system flow is shown in the flow figure. The first step is initialization: during this step, you provide the library with all the relevant information required to complete the desired action; this step takes that information and creates a context. Each peer does its own initialization. The context is then passed through a series of steps; each step takes an input message, does some manipulation, and then creates an output message. You then transfer this output message to the other peer. The other peer receives the message and associates it with a context; it then knows how to handle the incoming message based on the context. When the peer is done with the last step, it sets a finished flag. The peer can then do any necessary cleanup, such as freeing memory or copying an updated key share to storage.

6.3.1 Peer roles. Peer roles are determined by which peer initiates key generation; this peer must be used for any subsequent key operations, such as signing, derivation, and backup. For example, if peer A generates a key and peer B then wants to initiate a signing process, it should make a request to peer A to start the process. When complete, peer A can send the result to peer B, and peer B can verify this result with the verify function.

6.3.2 Detailed flow. A detailed flow is described in the following procedure:
1. Peer A calls the relevant initialization function.
2. Peer A calls the step function with a null input message and gets an output message.
3. Peer A sends the operation details and its output message to peer B. Note that the operation details are sent in this step; this information enables peer B to run the initialization function. Alternatively, this information can be sent any time before the next step.
4. Peer B verifies the operation details and consents to execute it.
5. Peer B calls the initialization function.
6. Peer B calls the step function with the message it received from peer A as the input message.
7. Peer B sends the output message back to peer A.
8. Each peer alternates calling the step function with the input message from the other peer and then sending the output message to the other peer.
9. This ping-pong process continues until both peers are finished, which is determined by output flags from the step function. The "MPC protocol finished" flag denotes that it was the last step on this peer; if it also includes the "MPC share changed" flag, then the local key share changed (such as with a refresh action), and the key share needs to be retrieved from the context using the getShare function and stored for future use. If the output message from the last step is not empty, it must be sent to the other peer. One or both peers may need to call the getResult function, based on the type of operation; for example, a sign action only has a result for one peer, but a refresh action has a new key share for both peers.

Throughout the entire process, the same context should be used. If the context needs to be stored, you can use the serialization function and then read it back in using the deserialization function.

6.3.3 Example action. An example of an ECDSA signing action is shown in the example-flow figure. Each peer starts by calling the MPCCrypto_initEcdsaSign function for initialization. After initialization, each peer calls the MPCCrypto_step function a number of times until the peer is finished with the signing process. The signature, which is the result of the signing process, is received by calling the final function, MPCCrypto_finalEcdsaSign, after which the signing process is done.
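Section 6.3.2 above describes the message "ping-pong" between the two peers. The sketch below illustrates that loop in Python: the flag names mirror the README's "MPC protocol finished" and "MPC share changed" flags, but the numeric flag values, the step/send/recv callables, and the wrapper itself are illustrative assumptions, not the library's actual bindings.

```python
# Sketch of the step "ping-pong" loop from section 6.3.2; NOT the library's
# real Python API. The flag constants and callables are illustrative only.
from typing import Callable, Optional, Tuple

# assumed flag bits reported by the step function (names from the README,
# numeric values are placeholders)
PROTOCOL_FINISHED = 0x01
SHARE_CHANGED = 0x02


def run_peer(step: Callable[[Optional[bytes]], Tuple[Optional[bytes], int]],
             send: Callable[[bytes], None],
             recv: Callable[[], bytes],
             is_initiator: bool) -> None:
    """Drive one peer until its step function reports completion."""
    # the initiator (peer A) starts by stepping with a null input message;
    # the other peer (peer B) first waits for peer A's message
    msg_in: Optional[bytes] = None if is_initiator else recv()
    while True:
        msg_out, flags = step(msg_in)   # one protocol step on this peer
        if msg_out:                     # forward any output to the other peer
            send(msg_out)
        if flags & PROTOCOL_FINISHED:
            if flags & SHARE_CHANGED:
                # a refresh-style action changed the local share; real code
                # would call getShare on the context here and persist it
                pass
            break
        msg_in = recv()                 # wait for the other peer's reply
```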
mpc cryptography blockchain multisig multi-signature multiparty-computation hsm hardware-security-module
blockchain
EmbeddedSystems
Embedded Systems Design: a course on the fundamentals of embedded systems — hardware components, software components, hardware/software interface design, and hardware/software co-design. Projects were developed on an Altera Cyclone IV FPGA using Quartus II and SOPC Builder. Projects: Project 1 — lighting up the DE2-115 board; Project 2 — image rotation, scaling, and interpolation; Project 3 — performance analysis; Project 4 — multiprocessor image transformation; Project 5 — multiprocessor real-time fractal.
os
deepcourse
DeepCourse: Deep Learning for Computer Vision — https://arthurdouillard.com/deepcourse (picture of the website front page with the skill tree).

This is a course I'm giving to the French engineering school EPITA each fall. This course is the new version (as of 2021) of my lecture series. It's given in around 24 hours, with sessions of 4 hours each, made of 2 hours of lessons and 2 hours of coding. If you find a mistake in a lesson, notebook, or quiz, please open an issue here; likewise, if you have ideas to improve this course (should I cover topic X? etc.), please also open an issue.

Roadmap / to do:
- [x] add all major lessons, notebooks, quizzes
- New architectures: [x] lesson (transformer, new kinds of MLP, involution, NAS, what else?); [x] notebook: transformer; [x] notebook: MLP-Mixer; [x] quiz
- Robustness (uncertainty, adversarial attack): lesson, notebook, quiz
- Continual learning, domain generalization, out-of-distribution: lesson, notebooks, quiz
- Graph neural networks: lesson, notebooks, quiz
- Make the website look nice on every kind of device and screen
computer-vision pytorch course research anki deep-learning
ai
laravel-ecommerce-application.
An e-commerce app using Laravel Blade and a Vue.js frontend; frontend and backend app development.
server
DiffEqFlux.jl
DiffEqFlux.jl. (Badges: Julia Zulip chat #sciml-bridged, global docs, codecov, GitHub Actions CI, Buildkite build status, ColPrac contributor's guide, SciML code style.)

DiffEqFlux.jl fuses the world of differential equations with machine learning by helping users put diffeq solvers into neural networks. This package utilizes DifferentialEquations.jl (https://docs.sciml.ai/DiffEqDocs/stable/), Flux.jl (https://docs.sciml.ai/Flux/stable/), and Lux.jl (https://lux.csail.mit.edu/) as its building blocks to support research in scientific machine learning (https://www.stochasticlifestyle.com/the-essential-tools-of-scientific-machine-learning-scientific-ml/), specifically neural differential equations and universal differential equations, to add physical information into traditional machine learning.

Tutorials and documentation: for information on using the package, see the stable documentation (https://docs.sciml.ai/DiffEqFlux/stable/); use the in-development documentation (https://docs.sciml.ai/DiffEqFlux/dev/) for the version that contains the unreleased features.

Problem domain: DiffEqFlux.jl is not just for neural ordinary differential equations. DiffEqFlux.jl is for universal differential equations, which can include delays, physical constraints, stochasticity, events, and all other kinds of interesting behavior that shows up in scientific simulations. Neural networks can be all or part of the model: they can be around the differential equation, in the cost function, or inside of the differential equation. Neural networks representing unknown portions of the model or functions can go anywhere you have uncertainty in the form of the scientific simulator. For an overview of the topic with applications, consult the paper "Universal Differential Equations for Scientific Machine Learning" (https://arxiv.org/abs/2001.04385).

As such, it is the first package to support and demonstrate: stiff and non-stiff universal ordinary differential equations (universal ODEs); universal stochastic differential equations (universal SDEs); universal delay differential equations (universal DDEs); universal partial differential equations (universal PDEs); universal jump stochastic differential equations (universal jump diffusions); and hybrid universal differential equations (universal DEs with event handling) — with high-order, adaptive, implicit, GPU-accelerated, Newton-Krylov, etc. methods. For examples, please refer to the release blog post (https://julialang.org/blog/2019/01/fluxdiffeq). Additional demonstrations, like neural PDEs and neural jump SDEs, can be found in this blog post (https://www.stochasticlifestyle.com/neural-jump-sdes-jump-diffusions-and-neural-pdes/), among many others. Do not limit yourself to the current neuralization: with this package you can explore various ways to integrate the two methodologies — neural networks can be defined where the activations are nonlinear functions described by differential equations; neural networks can be defined where some layers are ODE solves; ODEs can be defined where some terms are neural networks; cost functions on ODEs can define neural networks. (Animation: Flux ODE training.)
neural-ode neural-sde neural-pde neural-dde neural-differential-equations stiff-ode ordinary-differential-equations stochastic-differential-equations delay-differential-equations partial-differential-equations neural-networks differentialequations differential-equations scientific-ml scientific-ai neural-jump-diffusions neural-sdes scientific-machine-learning pinn physics-informed-learning
ai
RPI-pico-FreeRTOS
A Raspberry Pi Pico blinky project with FreeRTOS and a basic GPIO C class.
os
medrec
MedRec — patient-controlled medical records. This project is currently not maintained. https://medrec.media.mit.edu

Components:
- DatabaseManager: an API written in Golang that provides access to an underlying database; read/write access is governed by permissions stored on the blockchain.
- EthereumClient: a pointer to the go-ethereum codebase.
- SmartContracts: the Solidity contracts, and their tests, that are used by other MedRec components.
- UserClient: a front-facing Node app that can be used by any party to interact with the MedRec system.

This project is being developed under the GPLv2 license.

For production:
1. Install Golang and build. Install Golang from the repositories for your operating system (apt, yum, Homebrew, etc.) or from the Golang website (https://golang.org/dl). Then build the Go components of MedRec with `go build`.
2. Install npm (https://nodejs.org/en).
3. Install npm packages. Set up the UserClient: `cd userclient`, `npm install`, `npm run build`, `cd ..`. To resolve an annoying won't-fix bug in the Bitcoin mnemonic library, you also need to run the following in the userclient directory: `rm -r node_modules/bitcore-mnemonic/node_modules/bitcore-lib`. Set up the JavaScript helper files: `cd golangjshelpers`, `npm install`, `cd ..`.
4. Set up your MySQL database. You need to be running a MySQL database instance locally with username/password root/medrecpassword. Run the medrec_v1 SQL script from query_scripts; it will create a schema called medrec_v1 for you to store and retrieve information from (it represents the remote DB). Run the medrecwebapp SQL script for the local DB. If you do not want to be able to look at the example patient records, you can skip this section.
5. Start all of MedRec's components: `medrec EthereumClient`, `medrec DatabaseManager`, `medrec UserClient`.

For development:
1. Set up a PoA blockchain. Refer to this guide (https://hackernoon.com/setup-your-own-private-proof-of-authority-ethereum-network-with-geth-9a0a3750cda8) to set up a proof-of-authority blockchain with go-ethereum as its backend. Change the RPC port in the above tutorial to 8545 while starting geth.
2. Connect to the blockchain. MedRec has to be modified to connect to the provider nodes of this blockchain: edit the medrec genesis JSON and startgeth.js in golangjshelpers to match the genesis and chain parameters of your PoA blockchain.
3. Install Go and Golang libraries. Install Go (https://github.com/golang/go), or `brew install go` if you're on macOS. cd into your GOPATH and run `go get -v`.
4. Install npm (https://nodejs.org/en).
5. Install npm packages: `cd userclient`, `npm install`, `npm run build`. To resolve an annoying won't-fix bug in the Bitcoin mnemonic library, you also need to run the following in the userclient directory: `rm node_modules/bitcore-mnemonic/node_modules/bitcore-lib`.
6. Run the database manager. You need to be running a MySQL database instance locally with username/password root/medrecpassword (create the user on localhost identified by medrecpassword). Run the medrec_v1 SQL script from query_scripts with the mysql client; it will create a schema called medrec_v1 for you to store and retrieve information from (it represents the remote DB). Run the medrecwebapp SQL script for the local DB in the same way. Then `go run main.go DatabaseManager`; it should start the DatabaseManager running on localhost:6337.
7. Deploy the contracts. You will need to install Truffle via npm to deploy contracts: `npm install truffle -g`. Make sure that the parameters in smartcontracts/truffle.js match your PoA chain, then deploy the contracts using `truffle deploy`. The ganache-cli should respond to this command, showing that the contracts have been deployed.
8. Start the UserClient: `cd userclient`, `npm start`. If it throws an error with web3 RequestManager, do `npm install web3@1.0.0-beta.26`.
9. Building a new production version: `go build`, then `cd userclient` and `npm run build`.

Getting started with the MedRec codebase:

1. Link MedRec to an existing database. In order to use MedRec as a medical records provider, MedRec's database manager needs to be linked to the provider's database. This does not require a change in the database itself, but does involve changing MedRec to fit the form of database used. This might involve more or less work — from changing a few table names up to completely rewriting queries — depending on the form and structure of the database used. In order to link MedRec to your database, the data will need to be directly queryable by an external piece of software (e.g., using SQL or other style queries). SQL-type databases: MedRec uses the Golang MySQL driver (github.com/go-sql-driver/mysql) to interface with its test database; there is support for a wide range of common hospital database types, including FHIR (https://golanglibs.com/top?q=fhir). Non-SQL-type databases: there's a pretty exhaustive list (https://github.com/avelino/awesome-go) of Golang database drivers for different forms of database.

Setup guide: all the files that need to be changed are located in the DatabaseManager/remoteRPC directory; this manages remote access to your database by people granted access to view a particular record.

1. Instantiate the database in database.go. SQL (but not MySQL): replace the database driver for MySQL (`import "github.com/go-sql-driver/mysql"`) with the driver for your database format. NoSQL/other: also get rid of `import "database/sql"`. Once the driver and database type have been set, replace the contents of instantiateDatabase with your own server:

```go
func instantiateDatabase() *sql.DB {
	db, err := sql.Open("mysql", "user:password@tcp(127.0.0.1:port)/schema")
	if err != nil {
		log.Fatal(err)
	}
	return db
}
```

2. Setting up the ETH key-value store. The database manager uses a LevelDB key-value store to link the unique patient identifier used by MedRec (the patient's Ethereum address) to some unique identifier in the record provider's database. This is managed in lookup.go. A unique ID from the database needs to be sent to the prospective patient so that they can enter it, along with the creation of an Ethereum address, when they create an account. This creates an entry in the LevelDB lookup table that maps the patient's agentID (some unique patient ID) to their ETH address. Depending on the type of UID used to identify the patient, you might want to change the type of args.AgentID to match that of the stored value:

```go
type AccountArgs struct {
	Account string
	AgentID string
}

err = tab.Put([]byte(args.Account), []byte(args.AgentID), nil)
if err != nil {
	log.Println(err)
}
```

NB: you don't need to add the ETH address of the agent; that gets added automatically when an account is made. To test the functionality of this, navigate to the frontend and create a new patient account with some UID stored in your database; the records returned for that patient should match those associated with that UID.

3. Requesting documents. The management of this part of MedRec will vary with the kind of information the record provider should make available to each patient. The generic function PatientDocuments in documentRequest.go provides an outline for receiving a call from a patient's account, checking that their ETH address is listed in the key-value store, and replying with the contents of that request (reply.Documents). In order to adapt this to specific instances, the names of specific tables must be changed to align the provider database with the records requested in PatientDocuments. These are specified as:

```go
var (
	patientID     string
	documentID    int
	docDateTime   string
	practiceID    string
	recvdDateTime string
)
```

The names of the result params (Document, defined in params.go) remain the same; the entries in the provider database that correspond to them should be changed to match.

4. The rest of the database manager. Overview: instructions 1, 2, and 3 unpack the provider-facing layers of the database manager, which interact directly with the provider database. The rest of this part of the repo (DatabaseManager/remoteRPC) deals with the blockchain interface; in particular, ethereum.go handles authentication and connection to the Ethereum client (the authentication Recover handler takes an *http.Request, RecoverArgs, and RecoverReply and recovers the account).

2. Joining the blockchain as a provider.
3. Write a custom contract.
blockchain
joint-many-task-model
A Joint Many-Task Model: Growing a Neural Network for Multiple NLP Tasks (ICLR 2017): multiple natural language processing tasks in a single deep model.

This is, in my opinion, a very good paper. It demonstrates how a neural model can be trained from low-level to higher-level tasks, such that the lower layers correspond to word-level tasks and the higher layers correspond to sentence-level tasks. The authors also show how to retain the information in the lower layers while training the higher layers, via successive regularization, and that transfer learning is possible: different datasets are exploited simultaneously after jointly pre-training word embeddings. Catastrophic interference is a crucial thing to deal with in this model; it is, basically, interference with another layer's learned parameters while training a particular layer. For example, you want to retain information about POS while later training for, say, chunking.

Model architecture: images/model.png

Data: CoNLL-2000 chunking (http://www.cnts.ua.ac.be/conll2000/chunking/), SICK (http://clic.cimec.unitn.it/composes/sick.html)

Tasks: POS tagging (word level), chunking (word level), semantic relatedness (sentence level), textual entailment (sentence level).

Usage: data.py preprocesses data for the model; run.py runs the main model.

Sample input (task and example):
- pos: this has increased the risk
- chunk: this has increased the risk
- relatedness: "two dogs are wrestling and hugging" / "there is no dog wrestling and hugging"
- entailment: "two dogs are wrestling and hugging" / "there is no dog wrestling and hugging"

Sample output: images/result.png

Note: the original paper contains one more task, dependency parsing, which is currently not incorporated in the model due to the non-availability of good public data. Successive regularization also still needs to be added.

Citation: A Joint Many-Task Model: Growing a Neural Network for Multiple NLP Tasks. Kazuma Hashimoto, Caiming Xiong, Yoshimasa Tsuruoka, Richard Socher. https://arxiv.org/abs/1611.01587
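Since successive regularization is called out above as still to be added, here is a hedged sketch of the idea: an L2 penalty that ties the shared parameters to a snapshot saved after the lower-level task finished training. The function and variable names are mine for illustration, the parameters are assumed to be PyTorch tensors, and the default delta is a placeholder, not the paper's value.

```python
# Hedged sketch of successive regularization (not this repo's code).
# shared_params / snapshot_params: iterables of torch.Tensor of matching shapes.
def loss_with_successive_reg(task_loss, shared_params, snapshot_params, delta=1e-2):
    # Penalize drift of the shared parameters away from the snapshot taken after
    # the lower-level task, limiting catastrophic interference between layers.
    drift = sum(((p - s.detach()) ** 2).sum() for p, s in zip(shared_params, snapshot_params))
    return task_loss + delta * drift

# Usage sketch: after the POS epoch, snapshot the shared parameters
#   snapshot = [p.detach().clone() for p in shared_params]
# then train chunking with
#   loss = loss_with_successive_reg(chunk_loss, shared_params, snapshot)
```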
ai
Introduction-to-Natural-Language-Processing-UMich-Coursera
Introduction to Natural Language Processing, University of Michigan (Coursera).
ai
seldon-server
update january 2018 seldon core open sourced https github com seldonio seldon core seldon core focuses purely on deploying a wide range of ml models on kubernetes allowing complex runtime serving graphs to be managed in production seldon core is a progression of the goals of the seldon server project but also a more restricted focus to solving the final step in a machine learning project which is serving models in production please have a look at the project page https github com seldonio seldon core which includes extensive documentation to investigate further seldon server archived this project is not actively maintained anymore please see seldon core https github com seldonio seldon core seldon server is a machine learning platform that helps your data science team deploy models into production it provides an open source data science stack that runs within a kubernetes http kubernetes io cluster you can use seldon to deploy machine learning and deep learning models into production on premise or in the cloud e g gcp http docs seldon io kubernetes google cloud html aws azure seldon supports models built with tensorflow keras vowpal wabbit xgboost gensim and any other model building tool it even supports models built with commercial tools and services where the model is exportable it includes an api with two key endpoints 1 predict http docs seldon io prediction guide html build and deploy supervised machine learning models created in any machine learning library or framework at scale using containers and microservices http docs seldon io api microservices html 2 recommend http docs seldon io content recommendation guide html high performance user activity and content based recommendation engine with various algorithms ready to run out of the box other features include complex dynamic algorithm configuration and combination http docs seldon io advanced recommender config html with no downtime run a b and multivariate tests cascade algorithms and create ensembles command line interface cli http docs seldon io seldon cli html for configuring and managing seldon server secure oauth 2 0 rest and grpc http docs seldon io grpc html apis to streamline integration with your data and application grafana dashboard for real time analytics http docs seldon io analytics html built with kafka streams fluentd and influxdb seldon is used by some of the world s most innovative organisations it s the perfect machine learning deployment platform for start ups and can scale to meet the demands of large enterprises get started it takes a few minutes to install seldon on a kubernetes cluster visit our install guide http docs seldon io install html and read our tech docs http docs seldon io community support join the seldon users group https groups google com forum forum seldon users register for our newsletter http eepurl com 6x6n1 to be the first to receive updates about our products and events visit our website https www seldon io follow seldon io https twitter com seldon io on twitter and like our facebook page https www facebook com seldonhq if you re in london meet us at tensorflow london https www meetup com tensorflow london a community of over 1200 data scientists that we co organise we also offer commercial support plans and managed services https www seldon io enterprise license seldon is available under apache licence version 2 0 https github com seldonio seldon server blob master readme md
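To make the predict endpoint above concrete, here is a rough Python sketch of what a REST call to a deployed prediction microservice could look like. The host, URL path, payload shape, and bearer-token header are illustrative assumptions only, not the documented Seldon API; the real contract and OAuth 2.0 flow are described at docs.seldon.io.

```python
# Hedged sketch: calling a Seldon prediction endpoint over REST from Python.
# Host, path and JSON payload below are assumptions for illustration only.
import requests

SELDON_HOST = "http://localhost:8080"        # assumed gateway address
ACCESS_TOKEN = "replace-with-oauth2-token"   # obtained via the OAuth 2.0 flow

def predict(features):
    resp = requests.post(
        f"{SELDON_HOST}/api/v1/predictions",            # hypothetical path
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        json={"data": features},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()
```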
machine-learning deep-learning deployment kubernetes docker microservices spark kafka kafka-streams tensorflow python java cloud aws gcp azure seldon recommender-system recommendation-engine prediction
ai
cake
the c programming language 1978 c is a general purpose programming language which features economy of expression modern control flow and data structures and a rich set of operators c is not a very high level language nor a big one and is not specialised to any particular area of application but its absence of restrictions and its generality make it more convenient and effective for many tasks than supposedly more powerful languages in our experience c has proven to be a pleasant expressive and versatile language for a wide variety of programs it is easy to learn and it wears well as one s experience with it grows the c programming language second edition 1988 as we said in the preface to the first edition c wears well as one s experience with it grows with a decade more experience we still feel that way me 2023 as my experience with any language grows more a like c cake cake is a compiler front end written from scratch in c designed from the c23 language specification it allows you to translate newer versions of c such as c23 to c99 additionally cake provides a platform for experimenting with new features for the c language including extensions like lambdas and defer and static ownership ownership html checks web playground this is the best way to try http thradams com cake playground html use cases if you have a project that is distributed with code you don t need to limit the project development at the lower supported language version for instance you can use attributes like nodiscard during the development or defer both features improving the code security then adding a extra step in your build you can distribute a readable c99 source code that compiles everywhere cake can also be used as static analyzer especially the new ownership analysis features c23 preprocessor c23 syntax analysis c23 semantic analysis 58 errors 16 warnings static ownership checks extension sarif output c backend ast build github https github com thradams cake msvc build instructions open the developer command prompt of visual studio go to the src directory and type cl build c build this will build cake exe then run cake on its own source code gcc linux build instructions got to the src directory and type gcc build c o build build to run unit tests windows linux add dtest for instance gcc dtest build c o build build emscripten build instructions web emscripten https emscripten org is required first do the normal build the normal build also generates a file lib c that is the amalgameted version of the core lib then at src dir type call emcc dmockfiles lib c o web cake js s wasm 0 s exported functions compiletext s extra exported runtime methods ccall cwrap this will generate the src web cake js running cake at command line make sure cake is on your system path samples cake source c this will output out source c see manual manual html road map ownership static analysis references how did we get here https www bell labs com usr dmr www chist html https www bell labs com usr dmr www cman pdf https www bell labs com usr dmr www ctut pdf a copy of each c standard draft in included in docs folder https en wikipedia org wiki ansi c c89 https en cppreference com w c https www ibm com docs en xl c aix 13 1 0 topic extensions c99 features https en cppreference com w c compiler support 23 https en cppreference com w c compiler support 99 a very nice introduction was written by al williams c23 programming for everyone https hackaday com 2022 09 13 c23 programming for everyone influenced by typescript small c compilers 
Participating: you can contribute by trying out Cake, reporting bugs, and giving feedback. Have a suggestion for C? Discord server: https://discord.gg/mtfw8qvw

Status: at version 0.5.13 we have 58 types of errors (more messages share the same type) and 14 warnings and infos.

How Cake is tested: I am using the Visual Studio 2022 IDE to write and debug the Cake source. Cake parses itself, using the includes of MSVC, and generates the out dir after the build. I use Visual Studio Code with WSL for testing and compiling the code for Linux. The Cake source code does not use any extension, so the output is the same as the input; this compilation is useful for tracking errors, together with the unit tests.

Differences from Cfront: Cfront was the original compiler for C++, which converted C++ to C. Cfront-generated code was usable only for direct compilation, because it had all macros expanded, making it useless to reuse the generated code on other platforms. Cake has two modes: one is for direct compilation, like Cfront, and the other preserves macros, includes, etc., making it suitable for distribution. The other difference is that C++ is a second branch of evolution, which makes C++ more compatible with C89 than with C99. The idea of Cake is to keep to the main line of evolution of C and always be 100% compatible (Cake == C). The added extensions aim to keep the spirit of the language and to implement proposed features in a way that lets them be experimented with even before standardization.
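As a sketch of how Cake might slot into an existing build, the snippet below shells out to the cake binary to transpile a source file before the normal C99 compiler runs. Per the samples earlier in this README, `cake source.c` writes its output under ./out; no other command-line flags are assumed, and the helper name is mine.

```python
# Hedged sketch: running the cake transpiler as a pre-build step from Python.
import subprocess
from pathlib import Path

def transpile_to_c99(source):
    # e.g. transpile_to_c99("source.c") -> Path("out/source.c")
    subprocess.run(["cake", source], check=True)
    return Path("out") / source   # transpiled file, ready for any C99 compiler
```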
c compiler front-end transpiler c23 cake c2x static-analysis
front_end
PowerGraph
graphlab powergraph v2 2 update for a signficant evolution of this codebase see graphlab create which is available for download at turi com https turi com history in 2013 the team that created graphlab powergraph started the seattle based company graphlab inc the learnings from graphlab powergraph and graphchi projects have culminated into graphlab create a enterprise class data science platform for data scientists and software engineers that can simplify building and deploying advanced machine learning models as a restful predictive service in january 2015 graphlab inc was renamed to turi see turi com https turi com for more information status graphlab powergraph is no longer in active development by the founding team graphlab powergraph is now supported by the community at http forum turi com http forum turi com introduction graphlab powergraph is a graph based high performance distributed computation framework written in c the graphlab powergraph academic project was started in 2009 at carnegie mellon university to develop a new parallel computation abstraction tailored to machine learning graphlab powergraph 1 0 employed shared memory design in graphlab powergraph 2 1 the framework was redesigned to target the distributed environment it addressed the difficulties with real world power law graphs and achieved unparalleled performance at the time in graphlab powergraph 2 2 the warp system was introduced and provided a new flexible distributed architecture around fine grained user mode threading fibers the warp system allows one to easily extend the abstraction to improve optimization for example while also improving usability graphlab powergraph is the culmination of 4 years of research and development into graph computation distributed computing and machine learning graphlab powergraph scales to graphs with billions of vertices and edges easily performing orders of magnitude faster than competing systems graphlab powergraph combines advances in machine learning algorithms asynchronous distributed graph computation prioritized scheduling and graph placement with optimized low level system design and efficient data structures to achieve unmatched performance and scalability in challenging machine learning tasks related is graphchi a spin off project separate from the graphlab powergraph project graphchi was designed to run very large graph computations on just a single machine by using a novel algorithm for processing the graph from disk ssd or hard drive enabling a single desktop computer actually a mac mini to tackle problems that previously demanded an entire cluster for more information see https github com graphchi https github com graphchi license graphlab powergraph is released under the apache 2 license http www apache org licenses license 2 0 html if you use graphlab powergraph in your research please cite our paper inproceedings low al uai10graphlab title graphlab a new parallel framework for machine learning author yucheng low and joseph gonzalez and aapo kyrola and danny bickson and carlos guestrin and joseph m hellerstein booktitle conference on uncertainty in artificial intelligence uai month july year 2010 academic and conference papers joseph e gonzalez yucheng low haijie gu danny bickson and carlos guestrin 2012 powergraph distributed graph parallel computation on natural graphs https www usenix org conference osdi12 technical sessions presentation gonzalez proceedings of the 10th usenix symposium on operating systems design and implementation osdi 12 yucheng low joseph 
gonzalez aapo kyrola danny bickson carlos guestrin and joseph m hellerstein 2012 distributed graphlab a framework for machine learning and data mining in the cloud http vldb org pvldb vol5 p716 yuchenglow vldb2012 pdf proceedings of the vldb endowment pvldb yucheng low joseph gonzalez aapo kyrola danny bickson carlos guestrin and joseph m hellerstein 2010 graphlab a new parallel framework for machine learning http arxiv org pdf 1006 4990v1 pdf conference on uncertainty in artificial intelligence uai li kevin gibson charles ho david zhou qi kim jason buhisi omar brown donald e gerber matthew assessment of machine learning algorithms in cloud computing frameworks http ieeexplore ieee org xpl articledetails jsp reload true arnumber 6549501 systems and information engineering design symposium sieds 2013 ieee pp 98 103 26 26 april 2013 towards benchmarking graph processing platforms http sc13 supercomputing org sites default files postersarchive post152 html by yong guo delft university of technology marcin biczak delft university of technology ana lucia varbanescu university of amsterdam alexandru iosup delft university of technology claudio martella vu university amsterdam theodore l willke intel corporation in super computing 13 aapo kyrola guy blelloch and carlos guestrin 2012 graphchi large scale graph computation on just a pc https www usenix org conference osdi12 technical sessions presentation kyrola proceedings of the 10th usenix symposium on operating systems design and implementation osdi 12 the software stack the graphlab powergraph project consists of a core api and a collection of high performance machine learning and data mining toolkits built on top the api is written in c and built on top of standard cluster and cloud technologies inter process communication is accomplished over tcp ip and mpi is used to launch and manage graphlab powergraph programs each process is multithreaded to fully utilize the multicore resources available on modern cluster nodes it supports reading and writing to both posix and hdfs filesystems graphlab powergraph software stack images gl os software stack png graphlab software stack graphlab powergraph has a large selection of machine learning methods already implemented see toolkits directory in this repo you can also implement your own algorithms on top of the graph programming api a certain degree of c knowledge is required graphlab powergraph feature highlights unified multicore distributed api write once run anywhere tuned for performance optimized c execution engine leverages extensive multi threading and asynchronous io scalable run on large cluster deployments by intelligently placing data and computation hdfs integration access your data directly from hdfs powerful machine learning toolkits tackle challenging machine learning problems with ease building the current version of graphlab powergraph was tested on ubuntu linux 64 bit 10 04 11 04 natty 12 04 pangolin as well as mac os x 10 7 lion and mac os x 10 8 mountain lion it requires a 64 bit operating system dependencies to simplify installation graphlab powergraph currently downloads and builds most of its required dependencies using cmake s external project feature this also means the first build could take a long time there are however a few dependencies which must be manually satisfied on os x g 4 2 or clang 3 0 required required for compiling graphlab on linux g 4 3 or clang 3 0 required required for compiling graphlab nix build tools patch make required should come with most mac linux 
systems by default recent ubuntu version will require to install the build essential package zlib required comes with most mac linux systems by default recent ubuntu version will require the zlib1g dev package open mpi or mpich2 strongly recommended required for running graphlab distributed jdk 6 or greater optional required for hdfs support satisfying dependencies on mac os x installing xcode with the command line tools in xcode 4 3 you have to do this manually in the xcode preferences gt download pane satisfies all of these dependencies satisfying dependencies on ubuntu all the dependencies can be satisfied from the repository sudo apt get update sudo apt get install gcc g build essential libopenmpi dev openmpi bin default jdk cmake zlib1g dev git downloading graphlab powergraph you can download graphlab powergraph directly from the github repository github also offers a zip download of the repository if you do not have git the git command line for cloning the repository is git clone https github com graphlab code graphlab git cd graphlab compiling and running configure in the graphlabapi directory will create two sub directories release and debug cd into either of these directories and running make will build the release or the debug versions respectively note that this will compile all of graphlab including all toolkits since some toolkits require additional dependencies for instance the computer vision toolkit needs opencv this will also download and build all optional dependencies we recommend using make s parallel build feature to accelerate the compilation process for instance make j4 will perform up to 4 build tasks in parallel when building in release mode graphlab does require a large amount of memory to compile with the heaviest toolkit requiring 1gb of ram alternatively if you know exactly which toolkit you want to build cd into the toolkit s sub directory and running make will be significantly faster as it will only download the minimal set of dependencies for that toolkit for instance cd release toolkits graph analytics make j4 will build only the graph analytics toolkit and will not need to obtain opencv eigen etc used by the other toolkits compilation issues if you encounter issues please post the following on the graphlab forum http forum graphlab com detailed description of the problem you are facing os and os version output of uname a hardware of the machine utput of g v and clang v contents of graphlab config log and graphlab configure deps writing your own apps there are two ways to write your own apps to work in the graphlab powergraph source tree recommended install and link against graphlab powergraph not recommended 1 working in the graphlab powergraph source tree this is the best option if you just want to try using graphlab powergraph quickly graphlab powergraph uses the cmake build system which enables you to quickly create a c project without having to write complicated makefiles 1 create your own sub directory in the apps directory for example apps my app 2 create a cmakelists txt in apps my app containing the following lines project graphlab add graphlab executable my app list of cpp files space separated 3 substituting the right values into the square brackets for instance project graphlab add graphlab executable my app my app cpp 4 running make in the apps directory of any of the build directories should compile your app if your app does not show up try running cd the graphlab api directory touch apps cmakelists txt 2 installing and linking against 
graphlab powergraph to install and use graphlab powergraph this way will require your system to completely satisfy all remaining dependencies which graphlab powergraph normally builds automatically this path is not extensively tested and is not recommended you will require the following additional dependencies libevent 2 0 18 libjson 7 6 0 libboost 1 53 libhdfs required for hdfs support tcmalloc optional follow the instructions in the compiling section to build the release version of the library then cd into the release build directory and run make install this will install the following include graphlab hpp the primary graphlab header include graphlab the folder containing the headers for the rest of the graphlab library lib libgraphlab a the graphlab static library once you have installed graphlab powergraph you can compile your program by running g o3 pthread lzookeeper mt lzookeeper st lboost context lz ltcmalloc levent levent pthreads ljson lboost filesystem lboost program options lboost system lboost iostreams lboost date time lhdfs lgraphlab hello world cpp if you have compiled with mpi support you will also need lmpi lmpi tutorials see tutorials tutorials md datasets the following are data sets links we found useful when getting started with graphlab powergraph social graphs stanford large network dataset snap http snap stanford edu data index html laboratory for web algorithms http law di unimi it datasets php collaborative filtering million song dataset http labrosa ee columbia edu millionsong movielens dataset grouplens http grouplens org datasets movielens kdd cup 2012 by tencent inc https www kddcup2012 org university of florida sparse matrix collection http www cise ufl edu research sparse matrices classification airline on time performance http stat computing org dataexpo 2009 sf restaurants http missionlocal org san francisco restaurant health inspections misc amazon web services public datasets http aws amazon com datasets release notes map reduce vertices edges and transform vertices edges are not parallelized on mac os x these operations currently rely on openmp for parallelism on os x 10 6 and earlier gcc 4 2 has several openmp bugs and is not stable enough to use reliably on os x 10 7 the clang compiler does not yet support openmp map reduce vertices edges and transform vertices edges use a lot more processors than what was specified in ncpus this is related to the question above while there is a simple temporary solution omp set num threads we intend to properly resolve the issue by not using openmp at all unable to launch distributed graphlab when each machine has multiple network interfaces the communication initialization currently takes the first non localhost ip address as the machine s ip a more reliable solution will be to use the hostname used by mpi
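For readers new to the graph programming API mentioned above, the sketch below illustrates the gather-apply-scatter style of vertex program described in the PowerGraph paper cited earlier, using a toy single-machine PageRank in Python. It is purely conceptual: the real framework expresses this in C++ and distributes the gather and scatter phases across machines.

```python
# Hedged sketch: the gather-apply-scatter (GAS) pattern behind PowerGraph vertex
# programs, shown as a toy single-machine PageRank. Conceptual only.
def pagerank(edges, num_iters=20, d=0.85):
    out_deg, in_nbrs = {}, {}
    for src, dst in edges:
        out_deg[src] = out_deg.get(src, 0) + 1
        in_nbrs.setdefault(dst, []).append(src)
    vertices = set(out_deg) | set(in_nbrs)
    rank = {v: 1.0 for v in vertices}
    for _ in range(num_iters):
        new_rank = {}
        for v in vertices:
            # gather: sum contributions from in-neighbours
            acc = sum(rank[u] / out_deg[u] for u in in_nbrs.get(v, []))
            # apply: update the vertex value
            new_rank[v] = (1 - d) + d * acc
        rank = new_rank   # scatter would signal neighbours to recompute
    return rank

# pagerank([(1, 2), (2, 3), (3, 1), (1, 3)])
```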
ai
db-capstone-project
This is my assignment as part of the Meta Database Engineer Capstone course. Enjoy!
server
explorer
ark explorer 3 0 p align center img src arkexplorer png p designed and developed from the ground up using lean fast developmental frameworks tailwind css vue js build status https badgen now sh github status arkecosystem explorer develop https github com arkecosystem explorer actions query branch 3adevelop codecov https badgen now sh codecov c github arkecosystem explorer https codecov io gh arkecosystem explorer license mit https badgen now sh badge license mit green https opensource org licenses mit lead maintainer michel kraaijeveld https github com itsanametoo you can access it at https explorer ark io https explorer ark io build setup 1 clone the repository bash git clone https github com arkecosystem explorer 2 install dependencies bash yarn install 3 build for production 3 1 mainnet bash yarn build mainnet 3 2 devnet bash yarn build devnet 3 3 custom bash yarn build network my custom network 3 4 github pages if you are going to host your explorer instance on github pages you will need to specify your base url in most cases as github pages serves repositories from sub directories instead of sub domains bash yarn build base https username github io repository a running instance of the explorer on github pages can be found at https arkecosystem github io this step is not required if you are hosting the explorer on your root repository which is usually your username https username github io 3 5 run express server you can run the explorer as an express server this makes it a little more light weight but not needing to have services such as apache or nginx bash explorer host 127 0 0 1 explorer port 4200 node express server js keep in mind that this requires you to run your own server and a running instance of nginx 4 development 4 1 mainnet bash yarn serve or yarn serve mainnet 4 2 devnet bash yarn serve devnet 4 3 custom bash yarn serve env network custom 5 history mode if you wish to remove the from your urls you can follow those steps https router vuejs org en essentials history mode html 5 1 build bash yarn build mainnet history 5 2 development bash yarn serve env routermode history testing bash yarn test contributing if you find any bugs submit an issue issues or open a pull request pulls helping us catch and fix them engage with other users and developers on the arkecosystem slack https ark io slack join our gitter https gitter im ark developers lobby contribute bounties https github com arkecosystem bounty program security if you discover a security vulnerability within this package please send an e mail to security ark io all security vulnerabilities will be promptly addressed credits this project exists thanks to all the people who contribute contributors license mit license ark ecosystem https ark io
ark cryptocurrency tailwindcss vuejs blockchain
blockchain
solidity-t5
a code t5 based solidity lauguage model for smart contract code generation published model at huggingface see https huggingface co hululuzhu solidity t5 hello world example to utilize the trained model a hello world example to use this model notice the input text includes header solidity version like pragma solidity 0 5 7 ancestor class library info e g public functions and constants from parenta contract library interface declaration header e g helloworld ended with python pip install transformers q from transformers import autotokenizer t5forconditionalgeneration device cuda fallback to cpu if you do not have cuda tokenizer autotokenizer from pretrained hululuzhu solidity t5 model t5forconditionalgeneration from pretrained hululuzhu solidity t5 to device text pragma solidity 0 5 7 context parenta functions helloa hellob constants constanta contract helloworld is parenta input ids tokenizer text return tensors pt truncation true input ids to device need to tune beam topk topp params to get good outcome generated ids model generate input ids max length 256 num beams 5 top p 0 95 top k 50 print tokenizer decode generated ids 0 skip special tokens true expect outcome string public constant name hello world uint256 public constant override returns uint256 return initialsupply function initialsupply public view returns uint256 background base t5 code model https huggingface co salesforce codet5 large source data https huggingface co datasets mwritescode slither audited smart contracts processing steps clean contract level segmentation sepration split in and out after processing input sample pragma solidity 0 5 7 context pauserrole functions ispauser addpauser renouncepauser constants contract pausable is pauserrole after processing output sample notice indentation is bad this is intentional to reduce token size event paused address account event unpaused address account bool private pausableactive bool private paused constructor internal paused false function paused public view returns bool return paused modifier whennotpaused require paused modifier whenpaused require paused function pause public onlypauser whennotpaused whenpausableactive paused true emit paused msg sender function unpause public onlypauser whenpaused whenpausableactive paused false emit unpaused msg sender function setpausableactive bool active internal pausableactive active modifier whenpausableactive require pausableactive source training code see the end to end notebook https github com hululuzhu solidity t5 blob main code solidity t5 data processing and training ipynb at code dir here future todo the model is significantly under trained because of lack of gpu budget need 10x colab resources 100 for full train this is quite limited on how the model is used potentially we could switch to gpt2 decoder only to compare but codet5 has its strong code optimization need more classifiers t5 or bert alike to detect potential defects this is the intention of the project however i found it quite challenging to find the labeled data that points the exact line of code that has defect technically some thoughts 01 2023 as i tested a few examples using this significantly under trained model and compared with chatgpt it seems they perform similarly for code completion that shows the great potential for this model to surpass chatgpt if we have 30x training budget and reliable training pipeline 30x budget i trained 3 hours which only finished 10 of 1 epoch i expect 3 epoches are reasonable size for finetune based on my personal experience 
The data is specially tuned for CodeT5, so it has the limitation of being split into input and output, with an in/out token size limit of 512. Ideally we could change the transformed data so it could be trained in BERT style (to compare with T5-based classifiers) or in fill-missing-span style (adding context before and after, letting the model output the middle code). Training more classifiers (T5 or BERT alike) to detect potential defects is the intention of the project; however, I found it quite challenging to find labeled data that points to the exact line of code that has the defect. The part where I divide code into segments along with their ancestor info could be useful, but it needs more time to evaluate. Technically, it is also hard to tell whether my aggressive approach of removing all comments, and thus relying on the meaning of the code only, is a good approach.
ai
causalml
div align center a href https github com uber causalml img width 380px height 140px src https raw githubusercontent com uber causalml master docs static img logo causalml logo png a div pypi version https badge fury io py causalml svg https pypi org project causalml build status https github com uber causalml actions workflows python test yaml badge svg https github com uber causalml actions workflows python test yaml documentation status https readthedocs org projects causalml badge version latest http causalml readthedocs io en latest badge latest downloads https static pepy tech badge causalml https pepy tech project causalml cii best practices https bestpractices coreinfrastructure org projects 3015 badge https bestpractices coreinfrastructure org projects 3015 disclaimer this project is stable and being incubated for long term support it may contain new experimental code for which apis are subject to change causal ml a python package for uplift modeling and causal inference with ml causal ml is a python package that provides a suite of uplift modeling and causal inference methods using machine learning algorithms based on recent research 1 literature it provides a standard interface that allows user to estimate the conditional average treatment effect cate or individual treatment effect ite from experimental or observational data essentially it estimates the causal impact of intervention t on outcome y for users with observed features x without strong assumptions on the model form typical use cases include campaign targeting optimization an important lever to increase roi in an advertising campaign is to target the ad to the set of customers who will have a favorable response in a given kpi such as engagement or sales cate identifies these customers by estimating the effect of the kpi from ad exposure at the individual level from a b experiment or historical observational data personalized engagement a company has multiple options to interact with its customers such as different product choices in up sell or messaging channels for communications one can use cate to estimate the heterogeneous treatment effect for each customer and treatment option combination for an optimal personalized recommendation system the package currently supports the following methods tree based algorithms uplift tree random forests on kl divergence euclidean distance and chi square 2 literature uplift tree random forests on contextual treatment selection 3 literature uplift tree random forests on ddp 4 literature uplift tree random forests on iddp 5 literature interaction tree 6 literature conditional interaction tree 7 literature causal tree 8 literature work in progress meta learner algorithms s learner 9 literature t learner 9 literature x learner 9 literature r learner 10 literature doubly robust dr learner 11 literature tmle learner 12 literature instrumental variables algorithms 2 stage least squares 2sls doubly robust dr iv 13 literature neural network based algorithms cevae 14 literature dragonnet 15 literature with causalml tf installation see installation installation installation installation with conda is recommended conda environment files for python 3 7 3 8 and 3 9 are available in the repository to use models under the inference tf module e g dragonnet additional dependency of tensorflow is required for detailed instructions see below install using conda install conda with wget https repo anaconda com miniconda miniconda3 py38 23 5 0 3 linux x86 64 sh bash miniconda3 py38 23 5 0 3 linux x86 64 
sh b source miniconda3 bin activate conda init source bashrc install from conda forge directly install from the conda forge channel using conda sh conda install c conda forge causalml install with the conda virtual environment this will create a new conda virtual environment named causalml tf py3x where x is in 6 7 8 9 e g causalml py37 or causalml tf py38 if you want to change the name of the environment update the relevant yaml file in envs bash git clone https github com uber causalml git cd causalml envs conda env create f environment py38 yml for the virtual environment with python 3 8 and causalml conda activate causalml py38 causalml py38 install causalml with tensorflow bash git clone https github com uber causalml git cd causalml envs conda env create f environment tf py38 yml for the virtual environment with python 3 8 and causalml conda activate causalml tf py38 causalml tf py38 pip install u numpy this step is necessary to fix 338 https github com uber causalml issues 338 install from pypi bash pip install causalml install causalml with tensorflow bash pip install causalml tf pip install u numpy this step is necessary to fix 338 https github com uber causalml issues 338 install from source create a clean conda environment conda create n causalml py38 python 3 8 conda activate causalml py38 conda install c conda forge cxx compiler conda install python graphviz conda install c conda forge xorg libxrender then bash git clone https github com uber causalml git cd causalml pip install python setup py build ext inplace with tensorflow bash pip install tf quick start average treatment effect estimation with s t x and r learners python from causalml inference meta import lrsregressor from causalml inference meta import xgbtregressor mlptregressor from causalml inference meta import basexregressor from causalml inference meta import baserregressor from xgboost import xgbregressor from causalml dataset import synthetic data y x treatment e synthetic data mode 1 n 1000 p 5 sigma 1 0 lr lrsregressor te lb ub lr estimate ate x treatment y print average treatment effect linear regression 2f 2f 2f format te 0 lb 0 ub 0 xg xgbtregressor random state 42 te lb ub xg estimate ate x treatment y print average treatment effect xgboost 2f 2f 2f format te 0 lb 0 ub 0 nn mlptregressor hidden layer sizes 10 10 learning rate init 1 early stopping true random state 42 te lb ub nn estimate ate x treatment y print average treatment effect neural network mlp 2f 2f 2f format te 0 lb 0 ub 0 xl basexregressor learner xgbregressor random state 42 te lb ub xl estimate ate x treatment y e print average treatment effect basexregressor using xgboost 2f 2f 2f format te 0 lb 0 ub 0 rl baserregressor learner xgbregressor random state 42 te lb ub rl estimate ate x x p e treatment treatment y y print average treatment effect baserregressor using xgboost 2f 2f 2f format te 0 lb 0 ub 0 see the meta learner example notebook https github com uber causalml blob master examples meta learners with synthetic data ipynb for details interpretable causal ml causal ml provides methods to interpret the treatment effect models trained as follows meta learner feature importances python from causalml inference meta import basesregressor basetregressor basexregressor baserregressor from causalml dataset regression import synthetic data load synthetic data y x treatment tau b e synthetic data mode 1 n 10000 p 25 sigma 0 5 w multi np array treatment a if x 1 else control for x in treatment customize treatment control names slearner 
basesregressor lgbmregressor control name control slearner estimate ate x w multi y slearner tau slearner fit predict x w multi y model tau feature randomforestregressor specify model for model tau feature slearner get importance x x tau slearner tau model tau feature model tau feature normalize true method auto features feature names using the feature importances method in the base learner lgbmregressor in this example slearner plot importance x x tau slearner tau normalize true method auto using eli5 s permutationimportance slearner plot importance x x tau slearner tau normalize true method permutation using shap shap slearner slearner get shap values x x tau slearner tau plot shap values without specifying shap dict slearner plot shap values x x tau slearner tau plot shap values with specifying shap dict slearner plot shap values x x shap dict shap slearner interaction idx set to auto searches for feature with greatest approximate interaction slearner plot shap dependence treatment group treatment a feature idx 1 x x tau slearner tau interaction idx auto div align center img width 629px height 618px src https raw githubusercontent com uber causalml master docs static img shap vis png div see the feature interpretations example notebook https github com uber causalml blob master examples feature interpretations example ipynb for details uplift tree visualization python from ipython display import image from causalml inference tree import uplifttreeclassifier upliftrandomforestclassifier from causalml inference tree import uplift tree string uplift tree plot uplift model uplifttreeclassifier max depth 5 min samples leaf 200 min samples treatment 50 n reg 100 evaluationfunction kl control name control uplift model fit df features values treatment df treatment group key values y df conversion values graph uplift tree plot uplift model fitted uplift tree features image graph create png div align center img width 800px height 479px src https raw githubusercontent com uber causalml master docs static img uplift tree vis png div see the uplift tree visualization example notebook https github com uber causalml blob master examples uplift tree visualization ipynb for details contributing we welcome community contributors to the project before you start please read our code of conduct https github com uber causalml blob master code of conduct md and check out contributing guidelines contributing md first versioning we document versions and changes in our changelog https github com uber causalml blob master docs changelog rst license this project is licensed under the apache 2 0 license see the license https github com uber causalml blob master license file for details references documentation causal ml api documentation https causalml readthedocs io en latest about html conference talks and publications by causalml team talk introduction to causalml at causal data science meeting 2021 https www causalscience org meeting program day 2 talk introduction to causalml at 2021 conference on digital experimentation mit code mit https ide mit edu events 2021 conference on digital experimentation mit codemit talk causal inference and machine learning in practice with econml and causalml industrial use cases at microsoft tripadvisor uber at kdd 2021 tutorials https kdd org kdd2021 tutorials website and slide links https causal machine learning github io kdd2021 tutorial publication causalml white paper causalml python package for causal machine learning https arxiv org abs 2002 11631 publication uplift 
modeling for multiple treatments with cost optimization https ieeexplore ieee org document 8964199 at 2019 ieee international conference on data science and advanced analytics dsaa http 203 170 84 89 idawis33 dsaa2019 preliminary program publication feature selection methods for uplift modeling https arxiv org abs 2005 03447 citation to cite causalml in publications you can refer to the following sources whitepaper causalml python package for causal machine learning https arxiv org abs 2002 11631 bibtex misc chen2020causalml title causalml python package for causal machine learning author huigang chen and totte harinen and jeong yoon lee and mike yung and zhenyu zhao year 2020 eprint 2002 11631 archiveprefix arxiv primaryclass cs cy literature 1 chen huigang totte harinen jeong yoon lee mike yung and zhenyu zhao causalml python package for causal machine learning arxiv preprint arxiv 2002 11631 2020 2 radcliffe nicholas j and patrick d surry real world uplift modelling with significance based uplift trees white paper tr 2011 1 stochastic solutions 2011 1 33 3 zhao yan xiao fang and david simchi levi uplift modeling with multiple treatments and general response types proceedings of the 2017 siam international conference on data mining society for industrial and applied mathematics 2017 4 hansotia behram and brad rukstales incremental value modeling journal of interactive marketing 16 3 2002 35 46 5 jannik r ler richard guse and detlef schoder the best of two worlds using recent advances from uplift modeling and heterogeneous treatment effects to optimize targeting policies international conference on information systems 2022 6 su xiaogang et al subgroup analysis via recursive partitioning journal of machine learning research 10 2 2009 7 su xiaogang et al facilitating score and causal inference trees for large observational studies journal of machine learning research 13 2012 2955 8 athey susan and guido imbens recursive partitioning for heterogeneous causal effects proceedings of the national academy of sciences 113 27 2016 7353 7360 9 k nzel s ren r et al metalearners for estimating heterogeneous treatment effects using machine learning proceedings of the national academy of sciences 116 10 2019 4156 4165 10 nie xinkun and stefan wager quasi oracle estimation of heterogeneous treatment effects arxiv preprint arxiv 1712 04912 2017 11 bang heejung and james m robins doubly robust estimation in missing data and causal inference models biometrics 61 4 2005 962 973 12 van der laan mark j and daniel rubin targeted maximum likelihood learning the international journal of biostatistics 2 1 2006 13 kennedy edward h optimal doubly robust estimation of heterogeneous causal effects arxiv preprint arxiv 2004 14497 2020 14 louizos christos et al causal effect inference with deep latent variable models arxiv preprint arxiv 1705 08821 2017 15 shi claudia david m blei and victor veitch adapting neural networks for the estimation of treatment effects 33rd conference on neural information processing systems neurips 2019 2019 16 zhao zhenyu yumin zhang totte harinen and mike yung feature selection methods for uplift modeling arxiv preprint arxiv 2005 03447 2020 17 zhao zhenyu and totte harinen uplift modeling for multiple treatments with cost optimization in 2019 ieee international conference on data science and advanced analytics dsaa pp 422 431 ieee 2019 related projects uplift https cran r project org web packages uplift index html uplift models in r grf https cran r project org web packages grf index html 
generalized random forests that include heterogeneous treatment effect estimation in r rlearner https github com xnie rlearner a r package that implements r learner dowhy https github com microsoft dowhy causal inference in python based on judea pearl s do calculus econml https github com microsoft econml a python package that implements heterogeneous treatment effect estimators from econometrics and machine learning methods
incubation machine-learning causal-inference uplift-modeling
ai
LuatOS
p align center a href target blank rel noopener noreferrer img width 100 src logo jpg alt luatos logo a p star https gitee com openluat luatos badge star svg theme gvp https gitee com openluat luatos stargazers fork https gitee com openluat luatos badge fork svg theme gvp https gitee com openluat luatos members license https img shields io github license openluat luatos license air101 https pg air32 cn openluat luatos actions workflows air101 yml badge svg https nightly link openluat luatos workflows air101 master air105 https pg air32 cn openluat luatos actions workflows air105 yml badge svg https nightly link openluat luatos workflows air105 master esp32 https pg air32 cn openluat luatos actions workflows esp32 idf5 yml badge svg https nightly link openluat luatos workflows esp32 idf5 master win32 https pg air32 cn openluat luatos actions workflows win32 yml badge svg https nightly link openluat luatos workflows win32 master linux https pg air32 cn openluat luatos actions workflows linux yml badge svg https nightly link openluat luatos workflows linux master ec618 air780e http luat papapoi com 23380 api badges openluat luatos status svg luatos powerful embedded lua engine for iot devices with many components and low memory requirements 16k ram 128k flash lua mcu 16k ram 128k flash 1 air780e 4g cat 1 https item taobao com item htm id 693774140934 air101 mcu https luat taobao com air103 mcu https luat taobao com air105 mcu https luat taobao com air601 wifi https luat taobao com esp32c3 wifi https luat taobao com 2 https wiki luatos com boardguide flash html 3 demo https gitee com openluat luatos tree master demo api https wiki luatos com api index html 30 lua https www bilibili com video bv1vf4y1l7rb spm id from 333 999 0 0 4 wiki luatos https wiki luatos com mit license license lua print luatos print thank you for using luatos
lua rtos wifi nbiot 4g-lte luatos air101 air103 air105 esp32c3 air600e air780e air780eg air601
os
api-practice
REST & GraphQL for FE: companion code for the Inflearn course (https www inflearn com course inst 4227b52f), a React SNS-style practice project covering REST API and GraphQL. Full databases (MySQL, MongoDB, Firebase) are out of scope; the server uses a JSON file (file-system) database instead.

- API core: Node.js, Express, JSON file database
- Code base (optional): React.js / Next.js, GraphQL, axios, react-query, lowdb
- Goal: JavaScript CRUD (create, read, update, delete) over both REST API and GraphQL against the same DB

Chapters:
1. Client
2. Server, REST API: Express, JSON file database, server routes (ch1 -> ch2): https://github.com/roy-jung/api-practice/pull/9/files
3. Client, REST API (ch2 -> ch3): https://github.com/roy-jung/api-practice/pull/10/files
4. Server, GraphQL: schema, resolver, GraphQL Playground (ch3 -> ch4): https://github.com/roy-jung/api-practice/pull/11/files
5. Client, GraphQL (ch4 -> ch5): https://github.com/roy-jung/api-practice/pull/12/files
6. Client, GraphQL: useInfiniteQuery, mutation (ch5 -> ch6): https://github.com/roy-jung/api-practice/pull/13/files
7. lowdb: applied to ch3 (REST): https://github.com/roy-jung/api-practice/pull/14/files and ch6 (GraphQL): https://github.com/roy-jung/api-practice/pull/15/files; json-server vs lowdb for REST: https://github.com/roy-jung/api-practice/pull/16/files
8. TypeScript: REST (https://github.com/roy-jung/api-practice/tree/ts-ch3, ch3 TypeScript: https://github.com/roy-jung/api-practice/pull/17/files) and GraphQL (https://github.com/roy-jung/api-practice/tree/ts-ch6, ch6 TypeScript: https://github.com/roy-jung/api-practice/pull/18/files)
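To make the REST-versus-GraphQL contrast concrete, here is a hedged Python sketch of the same read done both ways against a local Express server. The port, route names, and schema fields are assumptions for illustration, not this repository's actual routes or schema.

```python
# Hedged sketch: one "list messages" read, once via REST and once via GraphQL.
# Port, paths and field names are illustrative assumptions only.
import requests

BASE = "http://localhost:8000"   # assumed local Express server

# REST: the resource lives in the URL; the server decides the response shape.
messages_rest = requests.get(f"{BASE}/messages").json()

# GraphQL: a single POST endpoint; the client states exactly which fields it wants.
query = """
query {
  messages {
    id
    text
    userId
  }
}
"""
resp = requests.post(f"{BASE}/graphql", json={"query": query}).json()
messages_gql = resp["data"]["messages"]
```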
front_end
unlock
unlock unlock protocol com public images unlock word mark dark png gh dark mode only unlock unlock protocol com public images unlock word mark png gh light mode only this repository includes all the code deployed by unlock including smart contracts and the web app running at unlock protocol com https unlock protocol com unlock is a membership protocol built on a blockchain it enables creators to monetize their content or software without relying on a middleman it lets consumers manage all of their subscriptions in a consistent way as well as earn discounts when they share the best content and applications they use read more about why we re building unlock https medium com unlock protocol its time to unlock the web b98e9b94add1 license mit https img shields io badge license mit yellow svg https opensource org licenses mit demo you can try unlock using the ethereum blockchain on our homepage https unlock protocol com we are building this in the open which means you can also run the code locally see instructions below you can try out the staging version which runs the latest in progress code but against the goerli test network contributing thanks for your interest in contributing to unlock we re excited you re here there are a variety of ways to contribute to the project please read more about contributing in our contributor guide https github com unlock protocol unlock wiki getting started please also check our code of conduct https github com unlock protocol unlock blob master code of conduct md for all participants in our community getting started the code unlock uses a monorepo which includes all the services and applications we develop get the code git clone https github com unlock protocol unlock cd unlock you ll need yarn https yarnpkg com installed globally yarn install all dependencies may take a while build all packages yarn build to execute commands inside the repo we use the pattern yarn workspace workspace name command build the contracts yarn workspace unlock protocol smart contracts build validate lint for paywall yarn workspace unlock protocol paywall lint etc the protocol you can run a local version of the protocol using docker https docs docker com install cd docker docker compose up build this will create the required infrastructure database local ethereum test network subgraph and start core services such as the locksmith locksmith api and a wedlocks wedlocks mailing service for debug purposes nb config is defined in both docker compose yml and docker compose override yml deploy and provision the contracts the following script will deploy the contracts create some dummy locks and send you some local tokens for development cd docker docker compose exec eth node yarn provision run one of the app the main dashboard lives in the unlock app folder of this repo to launch it locally install deps yarn start unlock main app yarn workspace unlock protocol unlock app start this will start http localhost 3000 to start using the application and deploy locks locally http localhost 3002 our static landing page site secrets secrets are stored in the unlock labs 1password account in the api services vault they can be loaded in our github actions through the use of dedicated actions here is an example yaml steps name load secrets from 1password uses 1password load secrets action v1 2 0 with export env true env op service account token secrets op service account token username op secrets test api username credential op secrets test api credential more steps that require username and 
credential to be set these reference uris have the following syntax op vault item section field so for example the reference uri op app cicd aws secret access key would be interpreted as vault app cicd item aws field secret access key thank you img src https user images githubusercontent com 624104 52508260 d0daa180 2ba8 11e9 970c 3ef9596f6b4e png alt browserstack logo width 120 https www browserstack com thanks to browserstack https www browserstack com for providing the infrastructure that allows us to test in real browsers thank you to all the members of our lock as well you can easily join this list by clicking on the sponsor button it s free at the top of this page too members https member wall unlock protocol com api members network 137 locks 0xb77030a7e47a5eb942a4748000125e70be598632 maxheight 300
ethereum solidity javascript unlock blockchain protocol memberships nft infrastructure
blockchain
flutter_ddd_todo_firebase
Flutter Firebase DDD: Flutter Domain-Driven Development using Firebase.

Getting started: this project is a starting point for a Flutter application. A few resources to get you started if this is your first Flutter project:
- Lab: Write your first Flutter app (https://flutter.dev/docs/get-started/codelab)
- Cookbook: Useful Flutter samples (https://flutter.dev/docs/cookbook)

For help getting started with Flutter, view our online documentation (https://flutter.dev/docs), which offers tutorials, samples, guidance on mobile development, and a full API reference.
server
Group6_IIT_A3
Group6_IIT_A3: FinancialGen. Introduction to Information Technology, Assessment 3. Class: COSC 2083, Summer 2023, Wednesday 10:30-12:00pm.

Group 6 members:
- Nguyen Kim Long (s4031709)
- Che Minh Quan (s3893635)
- Quan Hung (s3980813)
- Pham Dao Tan Loc (s3924306)

The link to the website: https://longs1709.github.io/Group6_IIT_A3/Group6_IIT_A3.html
server
bfapi
Brainf*ck as a service: a little BF interpreter inspired by modern systems-design trends.

How to run it:
docker-compose up -d
bash hello.sh   (should print "Hello World!")

How does it work? Microservices! There is an nginx gateway that does load balancing across the API instances. Each API instance (see the api directory) handles parsing and loops. Of course, nobody in their right mind would implement a stack manually these days, so we use Redis for stack operations. Pointer movements are controlled by another microservice (ptr); this one uses MongoDB because, well, what could possibly go wrong? Finally, there is the memory-access service (mem), which uses Postgres as memory storage: battle-tested, scalable technology. As a result, a typical "Hello World" script runs in 400 ms on a modern MacBook. Not bad at all! 80% of the code was copied from Stack Overflow, so it should be bug-free. Enjoy!
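For contrast with the microservice tour above, a whole Brainfuck interpreter fits in a few lines of in-process code. The sketch below is a generic reference implementation, not the code in the api service; the bracket pre-matching handles the loops, and the single `ptr` variable plays the role of the pointer microservice.

```python
# Hedged sketch: a plain in-process Brainfuck interpreter, for comparison with
# the distributed version above. Generic reference code, not this repo's service.
import sys

def run_bf(program, tape_size=30_000):
    code = [c for c in program if c in "+-<>[].,"]
    jumps, stack = {}, []
    for i, c in enumerate(code):          # pre-match brackets (the "loops" part)
        if c == "[":
            stack.append(i)
        elif c == "]":
            j = stack.pop()
            jumps[i], jumps[j] = j, i
    tape, ptr, pc = [0] * tape_size, 0, 0
    while pc < len(code):
        c = code[pc]
        if c == "+":   tape[ptr] = (tape[ptr] + 1) % 256
        elif c == "-": tape[ptr] = (tape[ptr] - 1) % 256
        elif c == ">": ptr += 1           # the "ptr" microservice, in one line
        elif c == "<": ptr -= 1
        elif c == ".": sys.stdout.write(chr(tape[ptr]))
        elif c == ",":
            data = sys.stdin.read(1)
            tape[ptr] = ord(data) if data else 0
        elif c == "[" and tape[ptr] == 0: pc = jumps[pc]
        elif c == "]" and tape[ptr] != 0: pc = jumps[pc]
        pc += 1

# run_bf("++++++++[>++++[>++>+++>+++>+<<<<-]>+>+>->>+[<]<-]"
#        ">>.>---.+++++++..+++.>>.<-.<.+++.------.--------.>>+.>++.")  # Hello World!
```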
os
iotdb-client-go
licensed to the apache software foundation asf under one or more contributor license agreements see the notice file distributed with this work for additional information regarding copyright ownership the asf licenses this file to you under the apache license version 2 0 the license you may not use this file except in compliance with the license you may obtain a copy of the license at http www apache org licenses license 2 0 unless required by applicable law or agreed to in writing software distributed under the license is distributed on an as is basis without warranties or conditions of any kind either express or implied see the license for the specific language governing permissions and limitations under the license english readme md readme zh md apache iotdb apache iotdb database for internet of things is an iot native database with high performance for data management and analysis deployable on the edge and the cloud due to its light weight architecture high performance and rich feature set together with its deep integration with apache hadoop spark and flink apache iotdb can meet the requirements of massive data storage high speed data ingestion and complex data analysis in the iot industrial fields apache iotdb client for go e2e tests https github com apache iotdb client go actions workflows e2e yml badge svg https github com apache iotdb client go actions workflows e2e yml github release https img shields io github release apache iotdb client go svg https github com apache iotdb client go releases license https img shields io badge license apache 202 4eb1ba svg https www apache org licenses license 2 0 html https github size badge herokuapp com apache iotdb client go svg https img shields io badge platform win10 20 7c 20macos 20 7c 20linux yellow svg iotdb website https img shields io website up down green red https shields io svg label iotdb website https iotdb apache org overview this is the golang client of apache iotdb apache iotdb website https iotdb apache org apache iotdb github https github com apache iotdb prerequisites golang 1 13 how to use the client quick start with go mod sh export go111module on export goproxy https goproxy io mkdir session example cd session example curl o session example go l https github com apache iotdb client go raw main example session example go go mod init session example go run session example go without go mod sh get thrift 0 15 0 go get github com apache thrift cd gopath src github com apache thrift git checkout 0 15 0 mkdir p gopath src iotdb client go example session example cd gopath src iotdb client go example session example curl o session example go l https github com apache iotdb client go raw main example session example go go run session example go how to use the sessionpool sessionpool is a wrapper of a session set using sessionpool the user do not need to consider how to reuse a session connection if there is no available connections and the pool reaches its max size the all methods will hang until there is a available connection the putback method must be called after use new sessionpool standalone golang config client poolconfig host host port port username user password password sessionpool client newsessionpool config 3 60000 60000 false cluster or doublelive golang config client poolconfig username user password password nodeurls strings split 127 0 0 1 6667 127 0 0 1 6668 sessionpool client newsessionpool config 3 60000 60000 false get session through sessionpool putback after use set storage group golang session err 
sessionpool getsession defer sessionpool putback session if err nil session setstoragegroup sg query statement golang var timeout int64 1000 session err sessionpool getsession defer sessionpool putback session if err nil log print err return sessiondataset err session executequerystatement sql timeout if err nil defer sessiondataset close printdataset1 sessiondataset else log println err developer environment requirements for iotdb client go os linux macos or other unix like os windows bash wsl cygwin git bash command line tools golang 1 13 make 3 0 curl 7 1 1 thrift 0 15 0 troubleshooting thrift version compatibility issues in the branch rel 0 13 and earlier versions the version of apache thrift is v0 14 1 in the latest version apache thrift has been upgraded to v0 15 0 the two versions are not compatible on some interfaces using mismatched version will cause compilation errors the interfaces changed in the two versions are as follows 1 newtsocketconf this function returns two values in the version v0 14 1 and only one value in the version v0 15 0 2 newtframedtransport has been deprecated use newtframedtransportconf instead for more details please take a look at this pr update thrift to 0 15 0 to fit iotdb 0 13 0 https github com apache iotdb client go pull 41 parameter name mismatch with actual usage in function open the implementation of the function client session go open is mismatched with the description the parameter connectiontimeoutinms represents connection timeout in milliseconds however in the older version this function did not implement correctly regarding it as nanosecond instead the bug is now fixed positive value of this parameter means connection timeout in milliseconds set 0 for no timeout
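the snippets above are go, but the same open a session, set a storage group, run a query, close flow can be sketched in python as a rough cross check, this sketch assumes the separate apache iotdb python client rather than this go repository, and the class and method names below are assumptions that may differ between iotdb releases

# rough illustration only -- assumes the apache-iotdb python package, not this go client
from iotdb.Session import Session

session = Session("127.0.0.1", "6667", "root", "root")   # host, port, user, password
session.open(False)                                      # False = no rpc compression

session.set_storage_group("root.sg_example")             # comparable to SetStorageGroup("sg") above

dataset = session.execute_query_statement("SHOW TIMESERIES")
while dataset.has_next():                                # mirrors printing the session data set
    print(dataset.next())
dataset.close_operation_handle()

session.close()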
timeseries database go client
server
iot-uni-dongle
iot uni dongle a project of a universal stick for various iot appliances controlled via uart ready made units are available in the tindie shop https www tindie com products dudanov universal iot dongle 1 images main gif main goals 1 unified design for all types of home devices from different manufacturers 2 use of various connectors with a pitch of 2 54 mm and usb 3 possibility of swapping the middle signal tx rx contacts 4 ir transmission and receiving via one wire connected to tsop output 5 minimalistic design 30 5x19mm pcb size 23 5x19mm 6 using the popular wi fi module esp12 f https docs ai thinker com media esp8266 docs esp 12f product specification en pdf based on the esp8266 soc 1 images view01 jpg 1 images view02 jpg a far from complete list of supported brands 1 midea https www midea com 2 electrolux https www electrolux ru 3 qlima https www qlima com 4 artel https www artelgroup com 5 carrier https www carrier com 6 comfee http www comfee russia ru 7 inventor https www inventorairconditioner com 8 dimstal simando https www simando24 de some socket mounting examples 1 images sockets01 jpg 1 images sockets02 jpg signal lines tx rx to select the location of the signal contacts it is necessary to close the jumper platforms with drops of solder in the photos this is marked in red signal lines are named according to the master that is esp12 f 1 images midea png 2 images haier png sending and receiving ir remote control commands because not all capabilities are implemented in the uart protocol for example indication control and the followme feature it is possible to send ir commands by supplying a demodulated signal duty 100 to pin gpio13 to do this connect the ir pad located on the top side of the stick and the output of the tsop ir demodulator on the display board the pictures below show an example for a tsop1738 ir receiver 2 images tsop stick jpg 2 images tsop display jpg you can also read the ir signal on the gpio12 pin from all remote controls including third party ones this can help in researching protocols and various automation goals without resorting to additional devices thus saving energy smt assembly on jlcpcb the single smt jlcpcb single smt directory contains the files necessary for manufacturing and assembling the board at the jlcpcb https jlcpcb com factory that is the received order will look like this 1 images smt png 2 images smt after png you just have to solder the module and the required connector on the back of the board and select the position of the signal contacts depending on your device frequently asked questions how can i flash my stick i ve written a tutorial flashing md read it how can i tell if my air conditioner is supported or not there is no 100 percent answer to the question but there is a high probability of support if your air conditioner has a usb connector a regular place for a stick where uart is used what firmware would you recommend initially the stick was developed for esphome https esphome io and home assistant https www home assistant io but it is possible to write your own firmware for your tasks and needs if you have the appropriate skills is it possible to purchase a ready made stick yes you can buy it in the tindie shop https www tindie com products dudanov universal iot dongle write to me on telegram https t me dudanov or e mail mailto sergey dudanov gmail com if this project was useful to you you can buy me https paypal me dudan0v a cup of coffee
usb-dongles iot-device home-automation home-assistant esp8266 esp8266-arduino esphome esphome-devices midea-dongle haier-dongle
server
WIFI_BussinessBigDataAnalyseSystem
wifi bussinessbigdataanalysesystem commercial big data analysis system based on wifi probes built with hadoop spark hbase hdfs tomcat echarts and bootstrap on linux english introduction the commercial big data analysis system based on the wifi probe uses the hadoop data analysis platform and the spark framework to quickly analyze the data collected by the probes it adopts the tomcat vertical cluster server construction mode to implement high concurrency processing from the data receiving server to the web interface and uses echarts chart visualization to analyze the trend of changes in the customer flow of the mall and to apply machine learning regression prediction my main responsibility was to build the overall business logic and write the front end and back end code screenshots https github com rainmaple wifi bussinessbigdataanalysesystem raw master images about e9 97 ae e9 a2 98 e5 88 86 e6 9e 90 png https github com rainmaple wifi bussinessbigdataanalysesystem raw master images about e5 95 86 e5 9c ba e6 80 bb e4 bd 93 e6 83 85 e5 86 b5 png https github com rainmaple wifi bussinessbigdataanalysesystem raw master images about e5 95 86 e5 9f 8e e5 8c ba e5 9f 9f e7 83 ad e7 82 b9 png https github com rainmaple wifi bussinessbigdataanalysesystem raw master images about e5 86 b3 e7 ad 96 e6 94 af e6 8c 81 e4 b9 8b e8 90 a5 e9 94 80 e6 96 b9 e6 a1 88 e6 8e a8 e9 80 81 png demo video http v youku com v show id xmjg1ode1ndc2ma html spm a2hzp 8244740 0 0 brochure images https github com rainmaple wifi bussinessbigdataanalysesystem raw master images about e5 ae a3 e4 bc a0 e5 86 8c 00 20 e5 b0 81 e9 9d a2 jpg https github com rainmaple wifi bussinessbigdataanalysesystem raw master images about 01 20 jpg https github com rainmaple wifi bussinessbigdataanalysesystem raw master images about e5 ae a3 e4 bc a0 e5 86 8c 02 20 e6 8a 80 e6 9c af e6 9e b6 e6 9e 84 jpg https github com rainmaple wifi bussinessbigdataanalysesystem raw master images about e5 ae a3 e4 bc a0 e5 86 8c 03 20 e5 8a 9f e8 83 bd e6 a6 82 e8 a7 88 jpg
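the introduction above describes spark aggregating wifi probe records into customer flow trends, the sketch below shows that kind of aggregation with pyspark as an illustration only, the column names mac, shop_id and ts and the csv layout are assumptions and are not taken from this project

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("wifi-probe-flow").getOrCreate()

# hypothetical probe log with columns mac, shop_id, ts (epoch seconds)
probes = spark.read.csv("hdfs:///probe/logs/*.csv", header=True, inferSchema=True)

hourly_flow = (
    probes
    .withColumn("hour", F.date_trunc("hour", F.col("ts").cast("timestamp")))
    .groupBy("shop_id", "hour")
    .agg(F.countDistinct("mac").alias("visitors"))  # distinct devices per hour ~ customer flow
)
hourly_flow.orderBy("shop_id", "hour").show()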
hadoop-cluster spark realtime hbase echarts
os
SmingRTOS
sming rtos beta version warning this project is no longer mainained you can take a look at the original sming https github com sminghub sming project which is actively developed sming based on espressif rtos sdk based on sming develop including commits until sming 2 1 0 release based on espressif rtos sdk 1 3 0 build https travis ci org sminghub smingrtos svg branch master https travis ci org sminghub smingrtos feedback when finding issues during conversion of applications submit an issue on the sming rtos repository requirements espressif rtos sdk 1 3 0 udk 2 0 9 for toolchain and sdk on windows or esp open sdk on linux be sure to have a clean sdk without previously applied updates esp alt sdk provides downloads for all platforms kireevco to supply links usage beware some environment variables names are identical to sming nonos clone this repository set environment variables valid for the location set sdk location in makefile platform or as environment variables sdk base set wifi ssid wifi pwd in environment variables compile flash additional needed software spiffy source included in sming repository esptool2 https github com raburton esptool2 esptool2 completeness all core and network functionality is ported to sming rtos spiffy spiffs filesystem functionality included applications application which includes httpajax hwpwm example source httpajax works hwpwm compiles not tested on hardware neopixel working scanneri2c working websockets working sdcard working hcsr04 ultrasonic working dht11 22 dht11 tested and working dht22 untested ssd1306 screen i2c module tested and working spi module untested ds1820 working systemclock ntp working bmp180 pressure working rboot working pcd8544 lcd nokia 5110 lcd working known rtos nonos differences with application consequences generic no wdt routines soft timer only on milli and not on micro level spi size the espressif sdk is more restrictive on use of flash the spi size needs to be set correct to prevent issues especially spiffs include files the user config h file does not contain any framework includes anymore remove the nonos user config h if not removed there will be unresolved h files the directory structure has changed the only include necessary is the include smingcore h when using libraries use include libraries for case sensitive systems beware of naming spiffs spiffs fs es created on nonos are not supported create a new spiffs using the rtos supplied spiffy when mounting the fs on flash is checked for generic spiffy structure blocksize fs length spiffs mount and spiffs mount manual return bool based on mount result when using spiffs mount the size of the spiffs fs is autodetected when using spiffs mount manual use the actual flash address for conversion of current spiffs mount manual subtract 0x40200000 from start address use filesize is 0 zero for autodetected size of fs see the remark above on spi size if set not correct the spiffs mount and spiffs mount manual will fail rboot works as in sming but see the above on spiffs mount manual hardwarepwm duty cycle is now independent on period values are from 0 1023
os
foundation-sites
p align center a href https get foundation img src https user images githubusercontent com 9939075 38782856 2a64a43e 40fa 11e8 89cd e873af03b3c4 png alt foundation for sites 6 width 448px style max width 100 a p p align center a href https get foundation sites docs installation html b install b a a href https get foundation sites docs documentation a a href https github com foundation foundation sites releases releases a a href contributing md contributing a p build status https github com foundation foundation sites workflows ci badge svg https github com foundation foundation sites actions workflow ci npm version https badge fury io js foundation sites svg https badge fury io js foundation sites jsdelivr hits https data jsdelivr com v1 package npm foundation sites badge style rounded https www jsdelivr com package npm foundation sites netlify status https api netlify com api v1 badges da72b0f9 3d51 4d50 951e 6bbf5fe88601 deploy status https app netlify com sites foundation sites deploys quality gate status https sonarcloud io api project badges measure project foundation foundation sites metric alert status https sonarcloud io dashboard id foundation foundation sites known vulnerabilities https snyk io test github foundation foundation sites badge svg https snyk io test github foundation foundation sites browserstack status https automate browserstack com badge svg badge key zljcvgixaegvafc4twhbz0hfwgtimjbrzew0unfruys3zgdrbmz6tw5lzz0tlu9wzudfv2lmnvd1du9xbwxuq05bogc9pq 915d78e23eeed2ae37ce7a670bc370011a9e4fd9 https automate browserstack com public build zljcvgixaegvafc4twhbz0hfwgtimjbrzew0unfruys3zgdrbmz6tw5lzz0tlu9wzudfv2lmnvd1du9xbwxuq05bogc9pq 915d78e23eeed2ae37ce7a670bc370011a9e4fd9 foundation is the most advanced responsive front end framework in the world quickly go from prototype to production building sites or apps that work on any kind of device with foundation includes a fully customizable responsive grid a large library of sass mixins commonly used javascript plugins and full accessibility support run locally documentation to run the documentation locally on your machine you need node js https nodejs org en installed on your computer your node js version must be 12 or 14 run these commands to set up the documentation bash install git clone https github com foundation foundation sites cd foundation sites yarn start the documentation yarn start testing foundation has three kinds of tests javascript sass and visual regression refer to our testing guide https github com foundation foundation sites wiki testing guide for more details run tests with bash sass unit tests yarn test sass javascript unit tests yarn test javascript units visual tests yarn test visual contributing check out contributing md contributing md to see how to report an issue or submit a bug fix or a new feature and our contributing guide https get foundation get involved contribute html to learn how you can contribute more globally to foundation you can also browse the help wanted https github com foundation foundation sites labels help 20wanted tag in our issue tracker to find things to do testing powered by a target blank href https www browserstack com img width 200 src https www browserstack com images layout browserstack logo 600x315 png a br browserstack open source program https www browserstack com open source training want the guided tour to foundation from the team that built it the zurb team offers comprehensive training courses for developers of all skill levels if you re new to foundation check out 
the introduction to foundation course http zurb com university foundation intro utm source github 20repo utm medium website utm campaign readme utm content readme 20training 20link to kickstart your skills amplify your productivity and get a comprehensive overview of everything foundation has to offer more advanced users should check out the advanced foundation course http zurb com university advanced foundation training utm source github 20repo utm medium website utm campaign readme utm content readme 20training 20link to learn the advanced skills that zurb uses to deliver quality client work in short timeframes copyright 2020 foundation community
front_end
instagramfashionblogger
recommendation system based on deep learning this is an instagram fashion blogger recommendation app built on deep learning and nlp the original idea is to make customized recommendations of instagram fashion bloggers through natural language processing whenever the user types in a fashion keyword they have in mind such as party paris or shoes this app will search the blogger posts database for the best matches a word2vec deep learning neural network has been trained on a 15k word casual corpus and validated by a relational word benchmark a blogger corpus with 0 25 million words has been established by web scraping 1300 bloggers recommended by 500 fashion articles are included in this database basic stats are visualized with d3 using the nvd3 package try the app at https instagram blogger herokuapp com
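a minimal sketch of training and querying a word2vec model like the one described above, gensim is assumed here since the readme does not name a library, and the tiny corpus and parameter values are placeholders

from gensim.models import Word2Vec  # assumption: the readme does not say which word2vec implementation was used

# hypothetical corpus: each blogger post pre-tokenised into a list of lowercase words
corpus = [
    ["party", "dress", "sequin", "heels"],
    ["paris", "street", "style", "trench"],
]

model = Word2Vec(sentences=corpus, vector_size=100, window=5, min_count=1, workers=4)

# recommend content whose words sit closest to the user's keyword in the embedding space
print(model.wv.most_similar("party", topn=3))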
deep-learning social-network recommendation-system flask-application natural-language-processing
ai
AwesomeComputerVision
awesomecomputervision multi object tracking paper list https github com spyderxu multi object tracking paper list awesome object detection https github com hoya012 deep learning object detection awesome image classification https github com weiaicunzai awesome image classification visual tracking paper list https github com foolwood benchmark results awesome semantic segmentation https github com mrgloom awesome semantic segmentation awesome human pose estimation https github com cbsudux awesome human pose estimation awesome face recognition https github com chanchichoi awesome face recognition
ai
Expose-Outsourcing-Information-Technology
expose outsourcing information technology
server
quickfix-messenger
quickfix messenger build status http img shields io travis jramoyo quickfix messenger svg style flat https travis ci org jramoyo quickfix messenger issues http img shields io github issues jramoyo quickfix messenger svg style flat https github com jramoyo quickfix messenger issues state open gittip http img shields io gittip jramoyo svg style flat https www gittip com jramoyo quickfix messenger is a front end messaging application built on top of the quickfix j http www quickfixj org engine it is a tool designed for technical individuals working on the financial information exchange fix http fixprotocol org protocol the fix protocol is an industry driven messaging standard for the electronic communication of trade related messages quickfix messenger is ideal for testing different messages and scenarios on fix infrastructures it provides a simple graphical user interface gui for constructing and sending fix messages across a given fix session visit our user s guide https github com jramoyo quickfix messenger wiki user s guide for more information if you d like to report a bug or request for a new feature please log a ticket on our tracker https github com jramoyo quickfix messenger issues for support questions or general discussions kindly visit our forum https groups google com forum fromgroups forum quickfix messenger discuss because of new features added to version 2 0 java 7 jre 1 7 or later is required to run quickfix messenger downloads version 2 0 bin https code google com p quickfix messenger downloads detail name qfix messenger 2 0 bin zip src https code google com p quickfix messenger downloads detail name qfix messenger 2 0 src zip features quickfix messenger is fully open source and supports the following features fix versions 4 0 to 5 0 including fixt 1 1 add messages to a project export import messages as xml repeating groups components filters required optional fields modify header trailer send messages from text i e log files act as session initiator or acceptor verify data format custom logon a message session management message logging quick links to fixwiki http fixwiki org i e messages fields components screenshots the stand alone application http quickfix messenger googlecode com svn images 2 0 main png the application in project mode http quickfix messenger googlecode com svn images 2 0 project view png technology quickfix messenger is written in java swing and would run on most platforms attribution logo derived from artwork by mateusz dembek http dembsky deviantart com buttons from mateusz dembek http dembsky deviantart com icons by yusuke kamiyamane http p yusukekamiyamane com release notes version 2 0 http code google com p quickfix messenger issues list can 1 q milestone release2 0 sort status priority colspec id 20type 20status 20priority 20milestone 20owner 20summary version 1 3 http code google com p quickfix messenger issues list can 1 q milestone release1 3 sort priority colspec id 20type 20status 20priority 20milestone 20owner 20summary version 1 2 http code google com p quickfix messenger issues list can 1 q milestone release1 2 sort priority colspec id 20type 20status 20priority 20milestone 20owner 20summary version 1 1 http code google com p quickfix messenger issues list can 1 q milestone release1 1 sort priority colspec id 20type 20status 20priority 20milestone 20owner 20summary
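since the readme is about constructing and sending fix messages, here is a small python sketch of how a raw fix message is framed, a soh delimited list of tag value pairs with a body length and a mod 256 checksum, the order fields below are generic fix 4 4 tags chosen for illustration and a real session should of course go through an engine such as quickfix j

SOH = "\x01"  # fix field delimiter

def build_fix_message(msg_type, fields, begin_string="FIX.4.4"):
    # body runs from tag 35 (MsgType) up to, but not including, the checksum field
    body = SOH.join(f"{tag}={value}" for tag, value in [("35", msg_type)] + fields) + SOH
    head = f"8={begin_string}{SOH}9={len(body)}{SOH}"
    checksum = sum((head + body).encode("ascii")) % 256  # mod-256 sum of every byte so far
    return f"{head}{body}10={checksum:03d}{SOH}"

# illustrative new order single (35=D); sender, target and order values are made up
msg = build_fix_message("D", [("49", "SENDER"), ("56", "TARGET"), ("34", "1"),
                              ("55", "MSFT"), ("54", "1"), ("38", "100"), ("40", "1")])
print(msg.replace(SOH, "|"))  # print with visible separators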
front_end
moko-web3
moko web3 https user images githubusercontent com 5010169 128702515 fc3928c7 d391 4234 9caa 15ab265cd7c1 png github license https img shields io badge license apache 20license 202 0 blue svg style flat http www apache org licenses license 2 0 download https img shields io maven central v dev icerock moko web3 https repo1 maven org maven2 dev icerock moko web3 kotlin version https kotlin version aws icerock dev kotlin version group dev icerock moko name web3 the library is being maintained here symbiosis finance moko web3 https github com symbiosis finance moko web3 mobile kotlin web3 this is a kotlin multiplatform library that allow you to interact with ethereum networks by web3 protocol table of contents features features requirements requirements installation installation usage usage samples samples set up locally set up locally contributing contributing license license features requirements gradle version 6 8 android api 16 ios version 11 0 installation root build gradle groovy allprojects repositories mavencentral project build gradle groovy dependencies commonmainapi dev icerock moko web3 0 18 0 usage samples more examples can be found in the sample directory sample set up locally in web3 directory web3 contains web3 library in sample directory sample contains samples on android ios mpp library connected to apps contributing all development both new features and bug fixes is performed in develop branch this way master sources always contain sources of the most recently released version please send prs with bug fixes to develop branch fixes to documentation in markdown files are an exception to this rule they are updated directly in master the develop branch is pushed to master during release more detailed guide for contributers see in contributing guide contributing md license copyright 2021 icerock mag inc licensed under the apache license version 2 0 the license you may not use this file except in compliance with the license you may obtain a copy of the license at http www apache org licenses license 2 0 unless required by applicable law or agreed to in writing software distributed under the license is distributed on an as is basis without warranties or conditions of any kind either express or implied see the license for the specific language governing permissions and limitations under the license
android ios kotlin kotlin-multiplatform kotlin-native moko
front_end
distribution-tools
overview this repository contains some of the scripts we have used for running our airdrop campaigns and other distributions initially all of these were made purely for internal use and there were little to none ui considerations however as we decided to release these publicly we have made some improvements on that front we also created accompanying readme documents that will hopefully allow you to use these in your own projects without issues before using our scripts we strongly recommend that you get to grips with the mechanics of using the tools on solana s testnet devnet https docs solana com cluster rpc endpoints before doing any live mainnet transactions any mistakes you make there are completely harmless and will serve as a learning experience built with everything was written and tested on a linux based 64 bit system running ubuntu 20 04 and python 3 8 5 these tools should also work on other linux distros and macos systems but they will definitely not run on windows if you are on a windows machine we recommend looking into wsl and wsl2 https docs microsoft com en us windows wsl install win10 all tools currently available are built with python3 and no external modules or packages but they are dependant on having local installations of solana cli tools available on the path below are links on the installation steps for solana cli and spl token cli tools solana cli https docs solana com cli install solana cli tools spl token cli https spl solana com token if you haven t used these tools before we would recommend following along with the documentation as it explains some basic usage scenarios that will familiarize you with them available tools tools description address fetcher tools address fetcher fetching addresses of token holders and accounts flat distributor tools flat distributor distribute a fixed amount of tokens proportional distributor tools proportional distributor distribute a proportional amount of tokens the accompanying readme for each tool is located in its directory refer to them for usage examples issues feel free to report any issues you encounter while using these tools or ideas you may have for improvement we will try to help out with any problems when we can but understand that these tools were made for private use and there might be usage scenarios that we never thought of or got around to testing license distributed under the mit license links solana https solana com solana discord https discord com invite pquxpsq contact me on discord praskoo 0892
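as a sketch of how thin these tools can be, the loop below shells out to the spl token cli for a flat distribution, the csv layout, the placeholder mint address and the extra flag are assumptions and this is not the repository's actual code

import csv
import subprocess

TOKEN_MINT = "YourTokenMintAddressHere"  # placeholder mint address
FLAT_AMOUNT = "100"                      # same amount for every recipient

with open("recipients.csv", newline="") as fh:  # hypothetical one-column list of wallet addresses
    for row in csv.reader(fh):
        recipient = row[0]
        # spl-token transfer <mint> <amount> <recipient>; --fund-recipient creates the
        # recipient's token account if it does not exist yet
        result = subprocess.run(
            ["spl-token", "transfer", TOKEN_MINT, FLAT_AMOUNT, recipient, "--fund-recipient"],
            capture_output=True, text=True,
        )
        print(recipient, "ok" if result.returncode == 0 else result.stderr.strip())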
blockchain
Toto
toto toto is a small framework intended to accelerate web service development it is built on top of tornado tornado and can currently use mysql mysql mongodb mongodb postgresql postgres or redis redis as a backing database build status https travis ci org jeremyot toto svg https travis ci org jeremyot toto features uses json or bson or msgpack for easy consumption by clients on any platform easy to add new methods simple authentication built in with hmac sha1 verification for authenticated requests session state persistence for authenticated requests sessions stored in database to simplify scaling across servers installation the simplest way to install toto is with pip simply run pip install e git git github com jeremyot toto git egg toto to install the latest version of the toto module on your machine documentation complete documentation is available here http toto li docs docs usage getting started with toto is easy all you need to do is make a new instance of toto totoserver and call run toto needs a root module to use for method lookup by default a totoserver will look for a module called methods the method module parameter can be used to specify another module by name configuration by default toto is configured to run on port 8888 and connect to a mongodb server running on localhost configuration can be performed in three ways with each overriding the last 1 by passing options as named parameters to the totoserver constructor 2 through a configuration file by passing the path to the config file as the first parameter to the totoserver constructor 3 with command line parameters option string value option 1234 combining the configuration methods can be useful when debugging run your script with help to see a full list of available parameters methods methods are referenced by name in each request a b c or a b c maps to methods a b c to add new methods add modules and packages to the methods or specified package see the account package for reference and ensure that each callable module defines invoke handler parameters where handler is the totohandler subclass of tornado web requesthandler handling the current request handler connection db provides direct access to the database used by the sessions and accounts framework handler session provides access to the current session or none if not authenticated available properties session user id the current user id session expires the unix timestamp when the session will expire session session id the current session id session state a python dict containing the current state you must call session save state to persist any changes the session object acts like a proxy to state so you can use dictionary accessors on it directly to enforce authentication for any method decorate the invoke function with toto invocation authenticated unauthorized attempts to call authenticated methods will return a not authorized error required parameters can be specified by decorating an invoke function with toto invocation requires param1 param2 method modules can take advantage of tornado s tornado non blocking features by decorating an invoke function with toto invocation asynchronous data can be sent to the client with handler respond and handler raw respond optionally modules can implement on connection close to clean up any resources if the client closes the connection see requesthandler on connection close in the tornado tornado documentation for more information it is important to remember that tornado tornado requires that all calls to respond respond raw 
write flush and finish are performed on the main thread you can schedule a function to run on the main thread with ioloop instance add callback callback note any data returned from a call to method invoke will be sent to the client as json data and be used to generate the x toto hmac header for verification this may cause issues with asynchronous methods if method invoke returns none a response will not automatically be sent to the client and no x toto hmac header will be generated requests non authenticated methods 1 call service with json object in the form method a b c parameters parameters instead of passing the method argument in the request body it is also possible to call methods by url the url equivalent to the above call is http service com service a b c 2 parse response json account creation 1 call account create method with user id user id password password 2 verify that the base64 encoded hmac sha1 of the response body with user id as the key matches the x toto hmac header in the response 3 parse response json 4 read and store session id from the response object login 1 call account login method with user id user id password password 2 verify that the base64 encoded hmac sha1 of the response body with user id as the key matches the x toto hmac header in the response 3 parse response json 4 read and store session id from the response object authenticated methods 1 login see above 2 call service with json object in the form method a b c parameters parameters with the x toto session id header set to the session id returned from login and the x toto hmac header set to the base64 encoded hmac sha1 generated with user id as the key and the json request string as the message 3 verify that the base64 encoded hmac sha1 of the response body with user id as the key matches the x toto hmac header in the response 4 parse response json note these instructions assume that method invoke returns an object to be serialized and sent to the client methods that return none can be used the send any data and must be handled accordingly events sometimes you may need to send events from one request to another toto s toto events eventmanager makes this easy to send an event use eventmanager instance send eventname args eventmanager uses python s cpickle module for serialization so you can pass anything cpickle can handle as args to receive an event you must register a handler with totohandler register event handler eventname handler handler is a function that takes one parameters and will be called with args when the eventmanager sends an event with eventname toto s events were primarily designed to be combined with tornado s support for non blocking requests see the chat template for an example toto s event system supports sending events across multiple instances both on the same machine and in a distributed system run your server with help for more configuration options daemonization the toto server can be run as a daemon by passing the argument start to stop any running processes pass stop this will stop any processes that share the specified pid file format default toto pid the processes n option may be used to specify the number of server instances to run multiple instances will be run on sequential ports starting at the port specified by port if 0 is used as the argument to processes toto will run one process per cpu as detected by python s multiprocessing module additional daemonization options can be viewed from help clients to help you get started javascript and ios client libraries are in 
development at https github com jeremyot totoclient js https github com jeremyot totoclient js and https github com jeremyot totoclient ios https github com jeremyot totoclient ios respectively tornado http www tornadoweb org mysql http www mysql com mongodb http www mongodb org docs http toto li docs http toto li docs postgres http www postgresql org redis http redis io
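a minimal sketch of a toto method module and server start up, put together only from what the readme above describes, the invoke handler parameters signature, the authenticated and requires decorators, the session properties and totoserver run, exact import paths and constructor keywords are assumptions

# methods/account/status.py -- hypothetical module, reachable as method "account.status"
import toto.invocation

@toto.invocation.authenticated             # unauthorized calls return a not-authorized error
@toto.invocation.requires('detail_level')  # reject requests missing this parameter
def invoke(handler, parameters):
    # handler.session is available because the call is authenticated
    return {
        'user_id': handler.session.user_id,
        'expires': handler.session.expires,
        'detail_level': parameters['detail_level'],
    }

# server.py -- start the service; per the configuration notes above, named options override defaults
import toto
toto.TotoServer(method_module='methods', port=8888).run()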
front_end
azure-iot-cli-extension
microsoft azure iot extension for azure cli python https img shields io pypi pyversions azure cli svg maxage 2592000 build status https dev azure com azureiotdevxp aziotcli apis build status merge 20 20azure azure iot cli extension branchname dev the azure iot extension for azure cli aims to accelerate the development management and automation of azure iot solutions it does this via addition of rich features and functionality to the official azure cli https docs microsoft com en us cli azure news when upgrading your azure cli core version for the best experience and to avoid breaking changes we recommend updating your azure iot extension to the latest available https github com azure azure iot cli extension releases azure cli 2 24 0 requires an azure iot extension update to 0 10 11 or later for iot hub commands to work properly however we recommend at least azure iot 0 10 14 updating the extension can be done with az extension update name azure iot note a common error that arises when using an older azure iot with azure cli 2 24 0 shows as follows attributeerror iothubresourceoperations object has no attribute config starting with version 0 10 13 of the iot extension you will need an azure cli core version of 2 17 1 or higher iot extension version 0 10 11 remains on the extension index to support environments that cannot upgrade core cli versions the legacy iot extension id azure cli iot ext is deprecated in favor of the new modern id azure iot azure iot is a superset of azure cli iot ext and any new features or fixes will apply to azure iot only also the legacy and modern iot extension should never co exist in the same cli environment uninstall the legacy extension with the following command az extension remove name azure cli iot ext related if you see an error with a stacktrace similar to text azure cli iot ext azext iot common azure py ln 90 in get iot hub connection string client iot hub service factory cmd cli ctx cliextensions azure cli iot ext azext iot factory py ln 29 in iot hub service factory from azure mgmt iothub iot hub client import iothubclient modulenotfounderror no module named azure mgmt iothub iot hub client the resolution is to remove the deprecated azure cli iot ext and install any version of the azure iot extension commands please refer to the official az iot reference on microsoft docs https docs microsoft com en us cli azure ext azure iot iot for a complete list of supported commands you can also find iot cli usage tips on the wiki https github com azure azure iot cli extension wiki tips installation 1 install the azure cli https docs microsoft com en us cli azure install azure cli you must have at least v2 32 0 for the latest versions of azure iot which you can verify with az version 1 add update or remove the iot extension with the following commands add az extension add name azure iot update az extension update name azure iot remove az extension remove name azure iot please refer to the installation troubleshooting guide docs install help md if you run into any issues or the alternative installation methods docs alt install methods md if you d like to install from a github release or local source usage after installing the azure iot extension your cli environment is augmented with the addition of hub central dps dt edge and device commands for usage and help content of any command or command group pass in the h parameter root command group details are shown for the following iot services click a section to expand or collapse details open summary digital twins 
summary text az dt h group az dt manage azure digital twins solutions infrastructure subgroups data history manage and configure data history endpoint manage and configure digital twins instance endpoints job manage and configure jobs for a digital twin instance model manage dtdl models and definitions on a digital twins instance network manage digital twins network configuration including private links and endpoint connections role assignment manage rbac role assignments for a digital twins instance route manage and configure event routes twin manage and configure the digital twins of a digital twins instance commands create create or update a digital twins instance delete delete an existing digital twins instance list list the collection of digital twins instances by subscription or resource group reset preview reset an existing digital twins instance by deleting associated assets currently only supports deleting models and twins show show an existing digital twins instance wait wait until an operation on an digital twins instance is complete details details summary iot central summary text az iot central h group az iot central manage iot central resources iot central is an iot application platform that reduces the burden and cost of developing managing and maintaining enterprise grade iot solutions choosing to build with iot central gives you the opportunity to focus time money and energy on transforming your business with iot data rather than just maintaining and updating a complex and continually evolving iot infrastructure iot central documentation is available at https aka ms iotcentral documentation additional information on cli commands is available at https aka ms azure cli iot ext subgroups api token manage api tokens for your iot central application app manage iot central applications device manage and configure iot central devices device group manage and configure iot central device groups device template manage and configure iot central device templates diagnostics preview perform application and device level diagnostics enrollment group manage and configure iot central enrollment group export preview manage and configure iot central data exports file upload config manage and configure iot central file upload job manage and configure jobs for an iot central application organization manage and configure organizations for an iot central application role manage and configure roles for an iot central application scheduled job manage and configure iot central schedule job user manage and configure iot central users commands query preview query device telemetry or property data with iot central query language details details summary iot device provisioning summary text az iot dps h group az iot dps manage entities in an azure iot hub device provisioning service dps augmented with the iot extension subgroups access policy manage azure iot hub device provisioning service access policies certificate manage azure iot hub device provisioning service certificates connection string manage connection strings for an azure iot hub device provisioning service instance enrollment manage individual device enrollments in an azure iot hub device provisioning service enrollment group manage enrollment groups in an azure iot hub device provisioning service linked hub manage azure iot hub device provisioning service linked iot hubs commands create create an azure iot hub device provisioning service delete delete an azure iot hub device provisioning service list list azure iot hub device provisioning 
services show get the details of an azure iot hub device provisioning service update update an azure iot hub device provisioning service details details summary iot edge summary text az iot edge h group az iot edge manage iot solutions on the edge subgroups deployment manage iot edge deployments at scale commands set modules set edge modules on a single device details details open summary iot hub summary text az iot hub h group az iot hub manage entities in an azure iot hub subgroups certificate manage iot hub certificates configuration manage iot automatic device management configuration at scale connection string manage iot hub connection strings consumer group manage the event hub consumer groups of an iot hub device identity manage iot devices device twin manage iot device twin configuration devicestream manage device streams of an iot hub digital twin manipulate and interact with the digital twin of an iot hub device distributed tracing preview manage distributed settings per device identity manage identities of an azure iot hub job manage iot hub jobs v2 message enrichment manage message enrichments for endpoints of an iot hub module identity manage iot device modules module twin manage iot device module twin configuration policy manage shared access policies of an iot hub route manage routes of an iot hub routing endpoint manage custom endpoints of an iot hub commands create create an azure iot hub delete delete an iot hub generate sas token generate a sas token for a target iot hub device or module invoke device method invoke a device method invoke module method invoke a module method list list iot hubs list skus list available pricing tiers manual failover initiate a manual failover for the iot hub to the geo paired disaster recovery region monitor events monitor device telemetry messages sent to an iot hub monitor feedback monitor feedback sent by devices to acknowledge cloud to device c2d messages query query an iot hub using a powerful sql like language show get the details of an iot hub show connection string deprecated show the connection strings for an iot hub show quota metrics get the quota metrics for an iot hub show stats get the statistics for an iot hub update update metadata for an iot hub details details open summary device update for iot hub summary text az iot du h group az iot du device update for iot hub is a service that enables you to deploy over the air updates ota for your iot devices as organizations look to further enable productivity and operational efficiency internet of things iot solutions continue to be adopted at increasing rates this makes it essential that the devices forming these solutions are built on a foundation of reliability and security and are easy to connect and manage at scale device update for iot hub is an end to end platform that customers can use to publish distribute and manage over the air updates for everything from tiny sensors to gateway level devices to learn more about the device update for iot hub service visit https docs microsoft com en us azure iot hub device update subgroups account device update account management device device update device management instance device update instance management update device update update management details scenario automation please refer to the scenario automation docs scenario automation md page for examples of how to use the iot extension in scripts contributing please refer to the contributing contributing md page for developer setup instructions and contribution guidelines feedback 
we are constantly improving and are always open to new functionality or enhancement ideas submit your feedback in the project issues https github com azure azure iot cli extension issues code of conduct this project has adopted the microsoft open source code of conduct https opensource microsoft com codeofconduct for more information see the code of conduct faq https opensource microsoft com codeofconduct faq or contact opencode microsoft com mailto opencode microsoft com with any additional questions or comments
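everything above is driven through the az cli, so scripting it from python is just a matter of shelling out, the sketch below wraps two of the commands listed above, az extension add and az iot hub device-identity list, the hub name is a placeholder

import json
import subprocess

def az(*args):
    # thin wrapper around the azure cli; requires az plus the azure-iot extension on PATH
    completed = subprocess.run(["az", *args, "--output", "json"],
                               capture_output=True, text=True, check=True)
    return json.loads(completed.stdout) if completed.stdout.strip() else None

# install the modern extension id (mirrors "az extension add --name azure-iot" above)
subprocess.run(["az", "extension", "add", "--name", "azure-iot"], check=False)

# list device identities in a hub; "my-hub" is a placeholder hub name
devices = az("iot", "hub", "device-identity", "list", "--hub-name", "my-hub")
print([d.get("deviceId") for d in devices or []])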
azure-cli azure-iot-edge azure-iot-dps azure-iothub azure-cli-extension azure-iot-automation iot-extension iot-edge azure-iot-devops command-line azure-iot-extension azure-iot-dt iot-cli cicd azure-iot-central azure-iot azure iot azure-device-update
server
learn-to-program
learn to program awesome https cdn rawgit com sindresorhus awesome d7305f38d29fed78fa85652e3a63e154dd8e8829 media badge svg https github com sindresorhus awesome foundation in web development the internet is filled with an ever expanding number of courses books and guides for programmers of all skill levels to improve their skills unfortunately these resources are either hard to find or of low quality this list aims to be a curated set of high quality educational resources the availability of free content on the platform is highlighted along with the primary topics covered beginner codecademy https www codecademy com freemium platform for learning to code in many different programming languages html css javascript jquery python ruby rails php c java khan academy s hour of code https www khanacademy org hourofcode free interactive 1 hour courses to learn the very basics of web development html css javascript sql upleveled bootcamp prep course https learn upleveled io freemium platform for learning the basics of web development html css javascript node js git github treehouse https teamtreehouse com paid platform for courses how to build websites apps web design front end web development rails ios android php learn css layout https learnlayout com free tutorial for how to do layout with css css udemy programming https www udemy com courses search q programming development https www udemy com courses development freemium marketplace of courses from third party providers quality may vary html css javascript ruby rails python ios android code avengers https www codeavengers com freemium platform for basic web and app development courses html css javascript shay howe s learn to code html css https learn shayhowe com free beginner to intermediate guides on web development html css javascript html dog https www htmldog com free beginner and intermediate guides on web development html css javascript freecodecamp https www freecodecamp org free learn to code and help nonprofits at the same time html css javascript databases git github node js react js d3 js python vertabelo academy https academy vertabelo com free sql courses with interactive exercises and quizzes sql database concepts the odin project https www theodinproject com free beginner to intermediate full stack courses with ruby and js learning paths html css javascript ruby rails github skills https github com skills free self paced interactive projects to learn git and github created and maintained by github s training team git github grid garden https cssgridgarden com free game that teaches the css grid system created by thomaspark https github com thomaspark css hexlet io https en hexlet io mixed self paced interactive projects to learn javascript c regular expressions and computer science in general javascript regular expressions bash computer science ansible programming historian https programminghistorian org en lessons free peer reviewed introductory courses for digital humanists python r unity qgis html regular expressions software carpentry https software carpentry org lessons free foundational coding and data science skills for researchers python r openrefine unix shell git hyperskill by jetbrains academy https hi hyperskill org freemium wide range of tracks in popular programming languages and development frameworks python java kotlin sql cratecode https cratecode com free online platform that teaches programming through interactive lessons javascript typescript html p5 js backend web development datacamp https www datacamp 
com freemium interactive platform for learning data science python r sql power bi chatgpt developer roadmaps https roadmap sh free learning roadmap guides with links to educational content fullstack ux design cyber security computer science blockchain devops postgresql intermediate khan academy computer programming https www khanacademy org computing computer programming computer science https www khanacademy org computing computer science free intermediate to advanced courses on how to program drawings animations games and webpages and more advanced computer science topics html css javascript sql algorithms cryptography udacity https www udacity com free platform for computer science and web development courses html css javascript data science python computer science topics learn python the hard way https learnpythonthehardway org book paid book and course for beginner through intermediate python programming python object oriented programming web development michael hartl s ruby on rails tutorial https www railstutorial org book free online book covering all stages of creating a ruby on rails application html css javascript ruby rails linkedin learning web development https www linkedin com learning topics web development web design https www linkedin com learning topics web design paid platform for video courses on web development and design html css javascript web development web design thinkful https www thinkful com paid platform for mentored web and mobile development courses from industry experts web development frontend web development angularjs android ios exercism io https exercism io free crowdsourced mentorship platform of programming exercises and code reviews clojure coffeescript c c elixir erlang f go haskell javascript common lisp lua objective c ocaml perl 5 pl sql python ruby scala swift pluralsight https www pluralsight com browse paid platform for courses in web development software development security and more html css javascript react angularjs java sql codechef courses https www codechef com learn freemium platform with courses in web development and software development html css javascript python java c codechef problems https www codechef com problems easy freemium intermediate to advanced programming problems programming codingbat https codingbat com free practice problems in python and java python java codewars https www codewars com free code challenges compare your solution with those of others javascript coffeescript ruby python clojure haskell java codingame https www codingame com free learn to code and game at the same time c c java javascript python bash c clojure dart f go groovy haskell lua objectivec pascal perl php ruby rust scala swift vb net 1 million women to tech summer of code https github com 1millionwomentotech toolkitten tree master summer of code free programming course material for beginner intermediate and advanced levels python javascript data science artificial intelligence machine learning ar vr wes bos https wesbos com courses mixed guided video courses to build products using new technologies javascript css react node js graphql redux level up tutorials https levelup video library free video tutorials for web developers and designers html css javascript react svelte vue node js graphql typescript deno github figma ruby drupal magento wordpress advanced mit opencourseware https ocw mit edu search t computer 20science free courses from mit on advanced computer science topics varied and extensive computer science topics c c edx https www 
edx org learn computer programming free courses from harvard mit and other universities varied computer science subjects including theory and programming data science algorithms coursera https www coursera org courses categories cs ai cs programming cs systems cs theory infotech mixed platform for courses from universities and organizations worldwide varied computer science subjects including theory and programming data science algorithms awesome cs courses https github com prakhar1989 awesome courses blob master readme md free university level courses scoured from around the internet varied and extensive computer science topics metacademy roadmaps https metacademy org roadmaps course guides https metacademy org course guides free graphs of interconnected topics required to master concepts programming machine learning hackerrank https www hackerrank com free programming challenges and contests artificial intelligence algorithms functional programming machine learning hackerearth https www hackerearth com free programming challenges hackathons and contests dynamic programming artificial intelligence algorithms functional programming machine learning project euler https projecteuler net free mathematical computer programming problems programming mathematics codesignal https codesignal com free programming challenges java c python javascript ruby c php and perl license cc0 https licensebuttons net p zero 1 0 88x31 png https creativecommons org publicdomain zero 1 0 to the extent possible under law karl horky https github com karlhorky has waived all copyright and related or neighboring rights to this work contributing in lieu of a formal style guide take care to maintain the existing style of this list
front_end
LaVIN
assets logo png this repository contains the implementation of the neurips 2023 paper cheap and quick efficient vision language instruction tuning for large language models project page https luogen1996 github io lavin paper https arxiv org pdf 2305 15023 pdf br gen luo https luogen1996 github io sup 1 sup yiyi zhou sup 12 sup tianhe ren https rentainhe github io sup 1 sup shengxin chen sup 1 sup xiaoshuai sun https sites google com view xssun sup 12 sup rongrong ji https mac xmu edu cn rrji sup 12 sup br sup 1 sup media analytics and computing lab department of artificial intelligence school of informatics xiamen university sup 2 sup institute of artificial intelligence xiamen university in this work we propose a novel and affordable solution for vision language instruction tuning namely mixture of modality adaptation mma particularly mma is an end to end optimization regime which connects the image encoder and llm via lightweight adapters meanwhile we also propose a novel routing algorithm in mma which can help the model automatically shifts the reasoning paths for single and multi modal instructions based on mma we develop a large vision language instructed model called lavin which demonstrates superior training efficiency and better reasoning ability than existing multimodal llms in various instruction following tasks div align center img src assets teaser 1 png width 95 div news 2023 09 22 our paper is accepted by neurips 2023 2023 06 30 with very limited training data and cost lavin achieves 5 th place of perception and cognition on mme benchmark https github com bradyfu awesome multimodal large language models tree evaluation outperforming seven existing multimodal llms evaluation codes are available 2023 06 27 4 bit trainings are available now lavin lite can be trained on one 3090 gpu taking around 9g and 15g gpu memory for the scales of 7b and 13b respectively technical details are available in https zhuanlan zhihu com p 638784025 2023 05 29 we released the demo and the pre trained checkpoint llama 13b for multimodal chatbot 2023 05 25 we released the code of lavin large vision language instructed model which achieves 89 4 lavin 7b and 90 8 lavin 13b accuracy on scienceqa with the proposed mixture of modality adaptation the training time and trainable parameters can be reduced to 1 4 hours and 3 8m respectively checkout the paper https arxiv org pdf 2305 15023 pdf todo x release training codes x release checkpoints and demo x 4 bit training support more modalities e g audio and video contents setup setup fine tuning fine tuning demo demo model zoo model zoo setup install package bash conda create n lavin python 3 8 y conda activate lavin install pytorch conda install pytorch 1 12 1 torchvision 0 13 1 torchaudio 0 12 1 c pytorch install dependency and lavin pip install r requirements txt pip install e data preparation for scienceqa please prepare the dataset from the official repo https github com lupantech scienceqa for multimodal chatbot download the images in train2014 split from mscoco http images cocodataset org zips train2014 zip and obtain the prepared 52k text only and 158k text image instruction following data from here https drive google com file d 1gordpruqwxbgy6nymhpdxo7t089yzsg3 view usp share link obtain the weights of llama from this form https forms gle jk851ebvbx1m5tav5 official or download llama 7b https huggingface co nyanko7 llama 7b tree main and llama 13b https huggingface co thebloke llama 13b from huggingface unofficial if you want to use vicuna weights to 
initialize the model please download from here https huggingface co lmsys after that the file structure should look like bash lavin lavin scripts train py eval py data problem json pid splits json captions json all data json images train2014 mscoco 2014 val2014 mscoco 2014 train scienceqa train image val scienceqa val image test scienceqa test image weights tokenizer model 7b params json consolidated 00 pth 13b params json consolidated 00 pth consolidated 01 pth vicuna 7b vicuna 13b config json generation config json pytorch model bin index json special tokens map json tokenizer config json tokenizer model pytorch model 00001 of 00003 bin pytorch model 00002 of 00003 bin pytorch model 00003 of 00003 bin fine tuning scienceqa reproduce the performance of lavin 7b on scienceqa for 7b model we fine tune it on 2x a100 we find that the performance will be affected by the number of gpus we are working to address this problem llama weights bash bash scripts finetuning sqa 7b sh vicuna weights bash bash scripts finetuning sqa vicuna 7b sh lavin lite with llama weights single gpu bash bash scripts finetuning sqa vicuna 7b lite sh reproduce the performance of lavin 13b on scienceqa 2 hours on 8x a100 80g for 13b model we fine tune it on 8x a100 llama weights bash bash scripts finetuning sqa 13b sh vicuna weights bash bash scripts finetuning sqa vicuna 13b sh lavin lite with llama weights single gpu bash bash scripts finetuning sqa vicuna 13b lite sh multimodal chatbot fine tune lavin 13b on 210k instruction following data 75 hours with 15 epochs and 25 hours with 5 epochs on 8x a100 80g llama weights bash bash scripts vl instruction tuning 13b sh vicuna weights bash bash scripts vl instruction tuning vicuna 13b sh to train on fewer gpus you can reduce the number of gpus in the scripts and increase gradient accumulation via accum iter to guarantee the total batch size of 32 setting gradient checkpointing and bits 4bit in the scripts will greatly reduce the requirements of gpu memory demo lavin supports both single and multi modal instruction inputs try your custom instructions in our demo launch a gradio web server on your machine then you can interact with lavin as you like torchrun nproc per node 1 demo py server name 127 0 0 1 div align center img src assets demo gif width 95 div model zoo scienceqa model weights time memory params acc weights lavin 7b lite llama 29 hours single gpu 9g 3 8m 88 35 google drive https drive google com file d 1ovtotgt d9eqmrvic27ozuren9dlclmo view usp sharing lavin 13b lite llama 42 hours single gpu 14g 5 4m 89 44 google drive https drive google com file d 1pyvsap3fnmgxogxfxjysatr75cfypahw view usp sharing lavin 7b llama 1 4 hours 33 9g 3 8m 89 37 google drive https drive google com file d 10x2qcbyrlh1grzohwhrmxluoz s6msgv view usp share link lavin 7b vicuna 1 4 hours 33 9g 3 8m 89 41 google drive https drive google com file d 1numxeiwlnjkxdybcshg8pvgsvlc5dzy8 view usp share link lavin 13b llama 2 hours 55 9g 5 4m 90 54 google drive https drive google com file d 1lkkuy54spzkkexrr7bdmu xmk9yadckm view usp share link lavin 13b llama 4 hours 55 9g 5 4m 90 8 multimodal chatbot model weights time memory params acc weights lavin 13b llama 25 hours 55 9g 5 4m lavin 13b llama 75 hours 55 9g 5 4m google drive https drive google com file d 1rhqnsaigzfhyggsamtyspynd5aw4oe9j view usp share link examples div align center img src assets examples png width 95 div star history star history chart https api star history com svg repos luogen1996 lavin type date https star history com 
luogen1996 lavin date citation if you think our code and paper helpful please kindly cite lavin and repadapter https github com luogen1996 repadapter bibtex article luo2023towards title towards efficient visual adaption via structural re parameterization author luo gen and huang minglang and zhou yiyi and sun xiaoshuai and jiang guangnan and wang zhiyu and ji rongrong journal arxiv preprint arxiv 2302 08106 year 2023 article luo2023cheap title cheap and quick efficient vision language instruction tuning for large language models author luo gen and zhou yiyi and ren tianhe and chen shengxin and sun xiaoshuai and ji rongrong journal arxiv preprint arxiv 2305 15023 year 2023 acknowledgement this repo borrows some data and codes from llama https github com facebookresearch llama stanford alpaca https github com tatsu lab stanford alpaca llava https github com haotian liu llava minigpt 4 https github com vision cair minigpt 4 and llama adapter https github com zrrskywalker llama adapter thanks for their great works
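The fine-tuning section above notes that when training on fewer GPUs you should raise accum_iter (gradient accumulation) to keep the total batch size at 32. A minimal sketch of that arithmetic follows; the per-GPU batch size of 4 is an assumed example value, not a number taken from the released LaVIN scripts.

```python
# Illustrative only: relate gradient-accumulation steps to the effective batch size.
# The per-GPU batch size used below (4) is an assumption, not a value taken from
# the LaVIN training scripts.

def accum_iter(target_total_batch: int, n_gpus: int, per_gpu_batch: int) -> int:
    """Gradient-accumulation steps needed to keep the effective batch size fixed."""
    per_step = n_gpus * per_gpu_batch
    if target_total_batch % per_step:
        raise ValueError("total batch size must be divisible by n_gpus * per_gpu_batch")
    return target_total_batch // per_step

print(accum_iter(32, n_gpus=8, per_gpu_batch=4))  # 8x A100 -> 1 (no accumulation needed)
print(accum_iter(32, n_gpus=2, per_gpu_batch=4))  # 2 GPUs  -> accumulate over 4 steps
```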
ai
mobile-system-design
a simple framework for mobile system design interviews work in progress below is a simple framework for mobile system design interviews as an example we are going to use the design twitter feed question the proposed solution is far from being perfect but it is not the point of a system design interview round no one expects you to build a robust system in just 30 min the interviewer is mostly looking for specific signals from your thought process and communication everything you say or do should showcase your strengths and help the interviewer to evaluate you as a candidate join the mobile system design https discord gg awdnrvqban discord server for general discussions and feedback div align center img height 60 alt discord src https user images githubusercontent com 786644 91370247 cf4cea80 e7db 11ea 8330 dc29c0fe8969 png https discord gg awdnrvqban div disclaimer the framework was heavily inspired by the similar scalable backend design articles learning the framework does not guarantee passing the interview the structure of the interview process depends on the personal style of the interviewer the dialog is really important make sure you understand what the interviewer is looking for make no assumptions and ask clarifying questions this guide does not reflect the interviewing policies from google or any other company interview process 45 60 min 2 5 min acquaintances 5 min defining the task and gathering requirements 10 min high level discussion 20 30 min detailed discussion 5 min questions to the interviewer acquaintances your interviewer tells you about themselves and you tell them about yourself it s better to keep it simple and short for example my name is alex i ve been working on ios android since 2010 mostly on frameworks and libraries for the past 2 5 years i ve been leading a team on an xyz project my responsibilities include system design planning and technical mentoring the only purpose of the introduction is to break the ice and provide a gist of your background the more time you spend on this the less time you will have for the actual interview defining the task the interviewer defines the task for example design twitter feed your first step is to figure out the scope of the task client side only just a client app you have the backend and api available client side api likely choice for most interviews you need to design a client app and api client side api back end less likely choice since most mobile engineers would not have a proper backend experience if the interviewer asks server side questions let them know that you re most comfortable with the client side and don t have hands on experience with backend infrastructure it s better to be honest than trying to fake your knowledge if they still persist let them know that everything you know comes from books youtube videos and blog posts gathering requirements task requirements can be split into functional non functional and out of scope we ll define them in the scope of design twitter feed task functional requirements think about 3 5 features that would bring the biggest value to the business users should be able to scroll through an infinite list of tweets users should be able to like a tweet users should be able to open a tweet and see comments read only non functional requirements not immediately visible to a user but plays an important role for the product in general offline support real time notifications optimal bandwidth and cpu battery usage out of scope features which will be excluded from the task but would still be 
important in a real project login authentication tweet sending followers retweets analytics providing the signal system design questions are made ambiguous the interviewer is more interested in seeing your thought process than the actual solution you produce what assumptions did you make and how did you state them what features did you choose what clarifying questions did you ask what concerns and trade offs did you mention your best bet is to ask many questions and cover as much ground as possible make it clear that you re starting with the bfs approach and ask them which feature or component they want to dig in some interviewers would let you choose which topics you want to discuss pick something you know best clarifying questions here are some of the questions you might ask during the task clarification step do we need to support emerging markets publishing in a developing country brings additional challenges the app size should be as small as possible due to the widespread use of low end devices and higher cost of cellular traffic the app itself should limit the number and the frequency of network requests and heavily rely on caching what number of users do we expect this seems like an odd question for a mobile engineer but it can be very important a large number of clients results in a higher back end load if you don t design your api right you can easily initiate a ddos attack on your own servers make sure to discuss an exponential backoff and api rate limiting with your interviewer how big is the engineering team this question might make sense for senior candidates building a product with a smaller team 2 4 engineers is very different from building it with a larger team 20 100 engineers the major concern is project structure and modularization you need to design your system in a way that allows multiple people to work on it without stepping on each other s toes high level diagram once your interviewer is satisfied with the clarification step and your choice for the system requirements you should ask if they want to see a high level diagram below is a possible solution for the twitter feed question high level diagram images twitter feed high level diagram svg server side components backend represents the whole server sider infrastructure most likely your interviewer won t be interested in discussing it push provider represents the mobile push provider infrastructure receives push payloads from the backend and delivers them to clients cdn content delivery network responsible for delivering static content to clients client side components api service abstracts client server communications from the rest of the system persistence a single source of truth the data your system receives gets persisted on the disk first and then propagated to other components repository a mediator component between api service and persistence tweet feed flow represents a set of components responsible for displaying an infinite scrollable list of tweets tweet details flow represents a set of components responsible for displaying a single tweet s details di graph dependency injection graph tbd image loader responsible for loading and caching static images usually represented by a 3rd party library coordinator organizes flow logic between tweet feed and tweet details components helps decoupling components of the system from each other app module an executable part of the system which glues components together providing the signal the interviewer might be looking for the following signals the candidate can 
present the big picture without overloading it with unnecessary implementation details the candidate can identify the major building blocks of the system and how they communicate with each other the candidate has app modularity in mind and is capable of thinking in the scope of the entire team and not limiting themselves as a single contributor this might be more important for senior candidates frequently asked questions why using a high level diagram is necessary can i skip it altogether or draw a detailed diagram right away a high level diagram is by no means necessary you can take any other approach which seems more appropriate for a specific interview case however there are some advantages in starting with a high level approach time management drawing a 30 000 feet view is quick and brings immediate topics for further discussion modularity each high level component can be potentially isolated in a separate module to allow a team of engineers to work on the project simultaneity without stepping on each other toes best practices this approach is wildly used for the backend system design and closely resembles the c4 model for visualising software architecture https c4model com still not sure ask your interviewer if they want you to draw a high level diagram or skip it and jump to some concrete components deep dive tweet feed flow after a high level discussion your interviewer might steer the conversation towards some specific component of the system let s assume it was tweet feed flow things you might want to talk about architecture patterns mvp mvvm mvi etc mvc is considered a poor choice these days it s better to select a well known pattern since it makes it easier to onboard new hires compared to some less known home grown approaches pagination essential for an infinite scroll functionality for more details see pagination dependency injection helps building an isolated and testable module image loading low res vs full res image loading scrolling performance etc details diagram images tweet details svg components feed api service abstracts twitter feed api client provides the functionality for requesting paginated data from the backend injected via di graph feed persistence abstract cached paginated data storage injected via di graph remote mediator triggers fetching the next prev page of data redirects the newly fetched paged response into a persistence layer feed repository consolidates remote and cached responses into a pager object through remote mediator pager trigger data fetching from the remote mediator and exposes an observable stream of paged data to ui tweet like and tweet details use cases provide delegated implementation for like and show details operations injected via di graph image loader abstracts image loading from the image loading library injected via di graph providing the signal the interviewer might be looking for the following signals the candidate is familiar with most common mvx patterns the candidate achieves a clear separation between business logic and ui the candidate is familiar with dependency injection methods the candidate is capable of designing self contained isolated modules frequently asked questions how much detail should i provide in the deep dive section there s no rule of thumb here work closely with the interviewer ask them if you need to go deeper or move on to the next topic if you have an in person video interview watch their facial expressions for example if you see that the interviewer wants to interrupt you stop talking and ask if they 
have any questions the whole point is to work together that provides a good signal for you as a team player collaborator why didn t you mention specific classes like recyclerview uicollectionview and vendors like room coredate realm etc to make the guide stable and platform agnostic the libraries and the frameworks are constantly evolving picking up a specific vendor can only be relevant for a short amount of time using an abstraction is more robust since you only concentrate on the functionality it provides without digging too much into the implementation details vendor selection is biased and depends on personal experience and current trends big tech companies like faang might not care much about vendors since they build their custom proprietary stacks there are tons of implementation specific details all over the internet already what drawing tool should i use at the time of this writing excalidraw https excalidraw com google jamboard https jamboard google com and google drawings https docs google com drawings could be the most popular choice some interviewers would skip diagramming altogether and prefer a collaborative editor and a verbal discussion due to privacy issues some companies would not let the candidate share the screen and use a tool of personal choice api design the goal is to cover as much ground as possible you won t have enough time to cover every api call just ask the interviewer if they are particulary interested in a specific part or choose something you know best in case they don t have a strong preference real time notification we need to provide real time notifications support as a part of the design bellow are some of the approaches you can mention during the discussion push notifications pros easier to implement compared to a dedicated service can wake the app in the background cons not 100 reliable may take up to a minute to arrive relies on a 3rd party service users can opt out easily http polling polling requires the client to periodically ask the server for updates the biggest concern is the amount of unnecessary network traffic and increased backend load short polling the client samples the server with a predefined time interval pros simple and not as expensive if the time between requests is long no need to keep a persistent connection cons the notification can be delayed for as long as the polling time interval additional overhead due to tls handshake and http headers long polling pros instant notification no additional delay cons more complex and requires more server side resources keeps a persistent connection until the server replies server sent events allows the client to stream events over an http connection without polling pros real time traffic using a single connection cons keeps a persistent connection web sockets provide a bi directional communication between client and server pros can transmit both binary and text data cons more complex to set up compared to polling sse keeps a persistent connection the interviewer would expect you to pick a concrete approach most suitable for the design task at hand one possible solution for the design twitter feed question could be using a combination of sse a primary channel of receiving real time updates on likes with push notifications sent if the client does not have an active connection to the backend protocols rest a text based stateless protocol most popular choice for crud create read update and delete operations pros easy to learn understand and implement easy to cache using built in http caching 
mechanism loose coupling between client and server cons less efficient on mobile platforms since every request requires a separate physical connection schemaless it s hard to check data validity on the client stateless needs extra functionality to maintain a session additional overhead every request contains contextual metadata and headers graphql a query language for working with api allows clients to request data from several resources using a single endpoint instead of making multiple requests in traditional restful apps pros schema based typed queries clients can verify data integrity and format highly customizable clients can request specific data and reduce the amount of http traffic bi directional communication with graphql subscriptions websocket based cons more complex backend implementation leaky abstraction clients become tightly coupled to the backend the performance of a query is bound to the performance of the slowest service on the backend in case the response data is federated between multiple services websocket full duplex communication over a single tcp connection pros real time bi directional communication provides both text based and binary traffic cons requires maintaining an active connection might have poor performance on unstable cellular networks schemaless it s hard to check data validity on the client the number of active connections on a single server is limited to 65k learn more about websockets websocket tutorial how websockets work https www youtube com watch v pnxk8fpkstc grpc remote procedure call framework which runs on top of http 2 supports bi directional streaming using a single physical connection pros lightweight binary messages much smaller compared to text based protocols schema based built in code generation with protobuf provides support of event driven architecture server side streaming client side streaming and bi directional streaming multiple parallel requests cons limited browser support non human readable format steeper learning curve the interviewer would expect you to pick a concrete approach most suitable for the design task at hand since the api layer for the design twitter feed question is pretty simple and does not require much customization we can select an approach based on rest pagination endpoints that return a list of entities must support pagination without pagination a single request could return a huge amount of results causing excessive network and memory usage types of pagination offset pagination provides a limit and an offset query parameters example get feed offset 100 limit 20 pros easiest to implement the request parameters can be passed directly to a sql query stateless on the server cons bad performance on large offset values the database needs to skip offset rows before returning the paginated result inconsistent when adding new rows into the database page drift keyset pagination uses the values from the last page to fetch the next set of items example get feed after 2021 05 25t00 00 00 limit 20 pros translates easily into a sql query good performance with large datasets stateless on the server cons leaky abstraction the pagination mechanism becomes aware of the underlying database storage only works on fields with natural ordering timestamps etc cursor seek pagination operates with stable ids which are decoupled from the database sql queries usually a selected field is encoded using base64 and encrypted on the backend side example get feed after id t1234xzy limit 20 pros decouples pagination from sql database 
consistent ordering when new items are inserted cons more complex backend implementation does not work well if items get deleted ids might become invalid you need to select a single approach after listing the possible options and discussing their pros and cons we ll pick cursor pagination in the scope of the design twitter feed question a sample api request might look like this get v1 feed after id p1234xzy limit 20 authorization bearer token data items id t123 author id a123 title title description description likes 12345 comments 10 attachments media image url https static cdn com image1234 webp thumb url https static cdn com thumb1234 webp created at 2021 05 25t17 59 59 000z cursor count 20 next id p1235xzy prev id null additional information evolving api pagination at slack https slack engineering evolving api pagination at slack everything you need to know about api pagination https nordicapis com everything you need to know about api pagination although we left it out of scope it s still beneficial to mention http authentication you can include an authorization header and discuss how to properly handle 401 unauthorized response scenario also don t forget to talk about rate limiting strategies 429 too many requests make sure to keep it brief and simple without unnecessary details your primary goal during a system design interview is to provide signal and not to build a production ready solution providing the signal the interviewer might be looking for the following signals the candidate is aware of the challenges related to poor network conditions and expensive traffic the candidate is familiar with most common protocols for unidirectional and bi directional communication the candidate is familiar with rest full api design the candidate is familiar with authentication and security best practices the candidate is familiar with network error handling and rate limiting data storage data storage options bellow are the most common options for local device data storage key value storage userdefaults sharedpreferences property list usually baked by xml or binary files allows associating primitive data with string based keys works best for simple unstructured non sensitive data settings flags etc pros easy to use built in api cons insecure android provides encryptedsharedpreferences https developer android com reference androidx security crypto encryptedsharedpreferences 3rd party wrapper libraries available on ios not suitable for storing large data no schema support nor ability to query data no data migration support poor performance database orm sqlite room core data realm etc based on a relational database perfect for large amounts of structured data that needs complex querying logic pros object relational mapping support schema and querying support data migration support cons more complex setup insecure wrapper libraries available on ios android bigger memory footprint custom binary datastore nscoding codable etc handles storing and loading data on a low level works best when you need to customize the data storage pipeline pros highly customizable performant cons no schema migration support lots of manual effort on device secure storage keystore key chain use os encrypted storage for creating storing encryption keys and key value data pros secure not 100 unless provided by the hardware cons not optimized for storing anything but encryption keys encryption decryption performance overhead no schema migration support storage location internal sand boxed by the app and not readable to other 
apps with few exceptions external publicly visible and most likely not deleted when your app is deleted media scoped special type of storage for media files storage type documents automatically backed up user generated data that cannot be easily re generated and will be automatically backed up cache data that can be downloaded again or regenerated can be deleted by user to free up space temp data that is only used temporary and should be deleted when no longer needed best practices store as little sensitive data as possible use encrypted storage if you can t avoid storing sensitive data do not allow your app storage grow uncontrollably make sure that cleaning up cached files won t affect app functionality approach you need to select a single approach after listing options and discussing their pros and cons don t worry about building a complete solution just try to lay out a high level idea without going too deep into details a possible breakdown for the design twitter feed question might look like this feed pagination table create a feed database table for storing paginated feed response item id string author id string title string description string likes int comments int attachments string comma separated list created at date also used for sorting cursor next id string points to the next cursor page cursor prev id string points to the prev cursor page limit the total number of entries to 500 in order to control local storage size flatten attachments into a comma separated list alternatively you can create an attachments table and join it with the feed table on item id explicitly store next prev cursor id with each item to simplify paged navigation attachments storage store attachments as files in internal cached storage use attachment urls as cache keys delete attachments after deleting corresponding items from the feed table limit the cache size to 200 400 mb a possible follow up question might require you to handle sensitive media data private accounts etc in this case you can selectively encrypt image files with encryption keys stored in keystore keychain providing the signal the interviewer might be looking for the following signals the candidate is aware of mobile storage types security limitations and compatibility the candidate is capable of designing a storage solution for most common scenarios additional topics major concerns for mobile development here s a list of concerns to keep in mind while discussing your solution with the interviewer user data privacy leaking customer data can damage your business and reputation security protect your products against reverse engineering more important for android target platform changes each new ios android release may limit an existing functionality and make privacy rules more strict non reversibility of released products assume everything you ship is final and would never change make sure to use staged rollouts and server side feature flags performance stability metered data usage cellular network traffic can be very expensive bandwidth usage constant waking up of the cellular radio results in a faster battery drain cpu usage higher computational load results in a faster battery drain and device overheating memory usage higher memory usage increases the risk of the app being killed in the background startup time doing too much work at the app start creates a poor user experience crashers anrs doing too much work on the main thread can lead to app shutdowns and ui jank app crashes is the leading factor of poor store ratings geo location 
usage don t compromise user privacy while using location services prefer the lowest possible location accuracy progressively ask for increased location accuracy if needed provide location access rationale before requesting permissions 3rd party sdks usage 3rd party sdks might cause performance regressions and or serious outages example https www bugsnag com blog sdks should not crash apps each sdk should be guarded by a feature flag a new sdk integration should be introduced as an a b test or a staged rollout you need to have a support and upgrade plan for the 3rd party sdks in a long term privacy security keep as little of the user s data as possible don t collect things you won t need avoid collecting device ids prefer one time generated pseudo anonymous ids anonymize collected data minimize the use of permissions be prepared for the user to deny permissions and respect the user s choice when they deny permission the second time be prepared for the system to auto reset permissions be prepared for app hibernation of unused apps delegate functionality to the 1st party apps camera photo picker file manager etc assume that on device storage is not secure even while using keystore keychain functionality assume that the backend storage is also not secure discussed possible end to end encryption mechanisms assume that the target platform s ios android security privacy rules will change make critical functionality controllable by remote feature flags user s perception of security is as important as the implemented security measures make sure to discuss how you would educate your customers about data collection storage and transmission cloud vs on device at some point during the interview you might need to choose between running some functionality on a device and moving it to a cloud the approach you select would have the biggest impact when it comes to on device ai but can also be extended to any kind of data processing as well advantages of running things in a cloud device independent your customers are not limited by the characteristics of their devices better usage of client system resources any intensive computation can be executed on the backend to save the device s battery the fast pace of changes you don t need to release an update to all the clients in order to get the desired functionality available much bigger computational resources your backend can autoscale as the load pressure grows better security the client code can be tampered and reverse engineered while the backend applications tend to be more secure easier analytics and offline data analysis you gather all the data you need in your data centers advantages of running things on a device better privacy the data does not leave the user s device and is not stored on the cloud real time functionality certain operations can run much faster on a user s device than being sent to a backend lower bandwidth usage you don t need to send data over the network offline functionality the client does not need a persistent network connection to operate lower backend usage the load is split between all the clients and the backend things you should never run on a device new resource creation generating coupons tickets etc transactions and payment verification unless it s delegated to a 3rd party sdk offline and opportunistic state adding an offline state ensures the app s usability without a network connection user can like comment delete tweets without having a network connection the ui gets updated opportunistically assuming that each request 
will be sent when the network goes back online the app should display cached results when possible the app properly notifies users about its offline state all state changes are batched and sent when the network goes back online request de duplication the client should ensure that retrying the same request won t create a duplication resource on the server idempotence a possible workaround could involve unique client side generated request identifiers and server side de duplication synching local and remote states a follow up question might require you to properly handle state synchronization across multiple devices with the same account merge conflict resolution could be trickly below are a few possible solutions local conflict resolution a local device pulls the remote state from the backend after going online merges it with the local state and upload changes pros easy to implement cons insecure gives a local device authority over the backend won t solve the problem if multiple devices send their updates simultaneously the last update wins any changes in merging logic requires app update remote conflict resolution a local device sends its local state to the backend after going online receives a new state and overwrites the local state pros moves conflict resolution authority to the backend does not require client updates cons more complex backend implementation more information offline functionality for trello s mobile applications airplane mode enabling trello mobile offline https tech trello com sync architecture syncing changes https tech trello com syncing changes sync failure handling https tech trello com sync failure handling the two id problem https tech trello com sync two id problem offline attachments https tech trello com sync offline attachments sync is a two way street https tech trello com sync downloads displaying sync state https tech trello com sync indicators caching tbd quality of service to make your system more energy efficient you can introduce the quality of service classes for your network operations the implementation is quite tricky but you can discuss it on a higher level limit the number of concurrent network operations 4 10 the number itself may depend on the device state battery wall charger wifi cellular doze standby etc assign a quality of service class to each of your network requests user critical should be dispatched as fast as possible fetching the next page of data for the tweet feed requesting tweet details ui critical should be dispatched after user critical requests fetching low res thumb images for tweets on the feed while scrolling canceled if the user scrolls past the target tweet may be delayed in case of fast scrolling ui non critical should be dispatched after ui critical requests fetching high res images for tweets on the feed canceled if the user scrolls past the target tweet may be delayed in case of fast scrolling background should be dispatched after all the above is finished posting likes analytics introduce a priority queue for scheduling network requests dispatch requests in the order of their priority suspend low priority requests if the max number of concurrent operations is reached and a high priority request is scheduled resumable uploads a resumable chunked media upload breaks down a single upload request in three stages upload initialization chunk bytes uploading appending upload finalization resumable uploads images resumable uploads svg advantages of resumable uploads allows you to resume interrupted data transfer operations 
without restarting from the beginning disadvantages of resumable uploads an overhead from additional connections and metadata note resumable uploads are most effective with large uploads videos archives on unstable networks for smaller files images texts and stable networks a single request upload should be sufficient more info youtube resumable uploads https developers google com youtube v3 guides using resumable upload protocol google photos resumable uploads https developers google com photos library guides resumable uploads google cloud resumable uploads https cloud google com storage docs resumable uploads twitter chunked media upload https developer twitter com en docs twitter api v1 media upload media uploading media chunked media upload prefetching prefetching improves app performance by hiding the data transfer latency over slow and unreliable networks the biggest concern while designing your prefetching solution is the increase in battery and cellular data usage tbd more info optimize downloads for efficient network access https developer android com training efficient downloads efficient network access improving performance with background data prefetching https instagram engineering com improving performance with background data prefetching b191acb39898 informed mobile prefetching https dl acm org doi 10 1145 2307636 2307651 understanding prefetching and how facebook uses prefetching https www facebook com business help 1514372351922333 conclusion there s a significant amount of randomness during a system design interview the process and the structure can vary depending on the company and the interviewer things you can control your attitude always be friendly no matter how the interview goes don t be pushy and don t argue with the interviewer this might provide a bad signal your preparation the better your preparation is the bigger the chance of a positive outcome practice mock design interviews with your peers you can find people on teamblind https www teamblind com search mock your knowledge the more knowledge you have the better your chances are gain more experience study popular open source projects ios https github com search q ios type repositories android https github com search o desc q android s stars type repositories read development blogs from tech companies uber https eng uber com category articles mobile instagram ios https instagram engineering com tagged ios android https instagram engineering com tagged android trello https tech trello com meta engineering at meta https engineering fb com tag mobile meta tech podcast https open spotify com show 1nltm7okzmcropfvlqlbmz square https code cash app dropbox https dropbox tech mobile reddit https www redditinc com blog topic technology airbnb https medium com airbnb engineering tagged mobile lyft tech podcast https podcasts apple com us podcast lyft mobile id1453587931 curated list of blog posts blogposts md your resume make sure to list all your accomplishments with measurable impact things you cannot control your interviewer s attitude they might have a bad day or simply dislike you your competition sometimes there s simply a better candidate the hiring committee they make a decision based on the interviewers report and your resume judging the outcome you can influence the outcome but you can t control it don t let minor setbacks determine your self worth frequently asked questions how do you know this approach works why is this necessary there s no guarantee that the suggested approach would work well in many cases 
the structure of the system design round depends on a personal interviewer style having a good interview plan at hand allows both the interviewer and the candidate to concentrate more on the content of the discussion and not organizational aspects of the actual round can you go a bit deeper at x this is not necessary since there might be lots of alternative solutions and the guide does not provide the ground truth the implementation details should depend on the personal experience of the candidate and not on an opinionated approach of some random people from the internet i m an interviewer this ruins the process for all of us now the candidates just memorize the solutions to cheat during the interview the system design is much more broad compared to coding rounds learning a particular solution is not nearly enough to be successful the interviewer can slightly tweak the requirements to make it a brand new question it s really obvious when the candidate memorized a certain approach instead of relying on experience learning some patterns and approaches before the interview might help the candidate to ease the stress and deliver the solution in a more clear and structured way additional information more system design exericses here exercises junior middle senior and staff level interviews the system design experience would be different depending on the candidate s target level an approximate engineering level breakdown can be found here https candor co articles tech careers google promotions the real scoop on leveling up note there s no clear mapping between years of experience and seniority some ranges might exist but it largely depends on the candidate s background junior engineers the system design round of junior engineers is optional since it s pretty unlikely they would have experience designing software systems middle level engineers the middle level engineering design round might be heavy on the implementation side the interviewer and the candidate would mostly talk about building a specific component using platform libraries senior level engineers the senior level engineering design round could be more high level compared to the previous levels the interviewer and the candidate would mostly talk about multiple components and how they communicate with each other the implementation details could be less important unless the candidate needs to make a decision that drastically affects the application performance the candidate should also be able to select a technical stack and describe its advantages and trade offs staff level engineers the staff level engineering design round moves away from technical to strategic decisions the candidate might want to discuss the target audience available computational and human resources expected traffic and deadlines instead of thinking in terms of implementation tasks the candidate should put business needs first for example being able to explain how to reduce product time to market how to safely rollout and support features how to handle omg situations and large scale outages the user privacy topics and their legal implications become extremely important and should be discussed in great detail looking for more content system design exercises check out the collection exercises of mobile system design exercises common system design interview mistakes check the guide here common interview mistakes md mock interviews check out the mock interview archive https www youtube com playlist list plamn jyh50oyafxjepiqtytd gxtf7x9d or sign up https forms gle 
cez2r31wqtgp3rsx9 to be a candidate yourself consider starring the repository to help other people discover the guide thank you
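As a companion to the cursor-pagination sample in the API design section above, here is a minimal Python sketch of a client walking the paginated feed endpoint. The host and token are placeholders, and the after_id / next_id field names are assumed from the flattened sample request and response shown earlier; this is an illustration, not part of the original guide.

```python
# Minimal sketch of a client walking the cursor-paginated feed endpoint described
# in the API design section above. Host, token and page size are placeholder values;
# after_id / next_id follow the sample request and response shown there.
import requests

BASE_URL = "https://api.example.com"   # placeholder host
TOKEN = "<access-token>"               # placeholder bearer token

def fetch_feed_pages(limit=20, max_pages=5):
    """Yield feed items page by page, following next_id cursors until exhausted."""
    after_id = None
    for _ in range(max_pages):
        params = {"limit": limit}
        if after_id:
            params["after_id"] = after_id
        resp = requests.get(
            f"{BASE_URL}/v1/feed",
            params=params,
            headers={"Authorization": f"Bearer {TOKEN}"},
            timeout=10,
        )
        resp.raise_for_status()
        payload = resp.json()
        yield payload["data"]["items"]
        after_id = payload["cursor"]["next_id"]
        if after_id is None:               # no further pages
            break

for page in fetch_feed_pages():
    print(len(page), "tweets on this page")
```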
os
machinelearning-az
repository of the machine learning a to z course r and python for data science https cursos frogamesformacion com courses machine learning az created by kirill eremenko https www udemy com user kirilleremenko and hadelin de ponteves https www udemy com user hadelin de ponteves translated into spanish by juan gabriel gomila salas https www udemy com user juangabriel2 updated in august 2023 in the project structure you will find three folders additional materials with the slides used in the theory lessons of the original course original the python and r files of the original course with which the lessons were recorded update python jupyter notebook and r files updated in august 2023 so you can combine what you see in the videos with the code in its latest version and always have the course 100 up to date we have also prepared the code directly in google colab in case you prefer to follow it online without installing python on your computer you can access the colab materials folder here https drive google com drive folders 1mcmhzmvdrizji jljqqu1rbehpwhq a usp share link are you interested in getting to know the world of machine learning in depth then this course is designed especially for you it has been created by professional data scientists to share our knowledge and help you learn complex theory algorithms and programming libraries in a simple and easy way we will guide you step by step through the world of machine learning with every lesson you will develop new skills and improve your knowledge of this complicated and lucrative subfield of data science course syllabus this course is fun and approachable but at the same time a real challenge because there is a lot of machine learning to learn we have structured it as follows part 1 data preprocessing part 2 regression simple linear regression multiple linear regression polynomial regression svr decision tree regression and random forest regression part 3 classification logistic regression k nn svm kernel svm naive bayes decision tree classification and random forest classification part 4 clustering k means hierarchical clustering part 5 association rule learning apriori eclat part 6 reinforcement learning upper confidence bound thompson sampling part 7 natural language processing bag of words model and nlp algorithms part 8 deep learning artificial neural networks and convolutional neural networks part 9 dimensionality reduction pca lda kernel pca part 10 model selection boosting k fold cross validation parameter tuning grid search xgboost in addition the course is packed with practical exercises based on real life examples so you will not only learn the theory but also put your own models into practice with guided examples and as a bonus the course includes all the code in python and r so you can download it and use it in your own projects you can sign up for the machine learning a to z course at a discount here https cursos frogamesformacion com courses machine learning az what you will learn master machine learning with r and with python build intuition for most machine learning models make precise and accurate predictions produce elaborate analyses build robust and consistent machine learning models create added value for your own business use machine learning for personal projects master specific areas
such as reinforcement learning nlp or deep learning get to know the most advanced techniques such as dimensionality reduction know which machine learning model to use for each type of problem build a whole library of machine learning models and know how to combine them to solve any problem are there any requirements to follow the course properly secondary and high school level mathematics is enough who is this course for any student interested in machine learning students with high school level mathematics who want to get started in machine learning intermediate students with basic machine learning knowledge including classic algorithms such as linear or logistic regression who want to learn more and explore the different fields of machine learning students who are not comfortable programming but are interested in machine learning and want to apply its techniques to the analysis of data sets university students who want to get started in the world of data science any data analyst who wants to improve their machine learning skills people who are not satisfied with their job and want to become a data scientist anyone who wants to add value to their company with the power of machine learning
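As a quick illustration of the kind of model covered in part 2 of the syllabus above (simple linear regression), the Python snippet below fits a regression on a synthetic dataset; the course itself uses its own example datasets and also provides R equivalents.

```python
# Tiny, self-contained example of part 2's simple linear regression on synthetic data.
# The course uses its own example datasets; these values are made up for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(100, 1))                  # e.g. years of experience
y = 3.0 * X.ravel() + 5.0 + rng.normal(0, 1, 100)      # e.g. salary, plus noise

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = LinearRegression().fit(X_train, y_train)
print("test R^2:", round(model.score(X_test, y_test), 3))
```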
ai
Boxing-java-web-game
boxing java web game a project undertaken to learn spring mysql maven and web app back end development it builds the backend of a java web app using maven mysql spring and spring security with login user authentication access restrictions and a database of fighters that can be updated from the web interface the initial version of the game runs on the console and is currently found in boxing java web game src boxingsim on the master branch the backend was built to connect the mysql database to a web interface using spring and to transform the initial console version into a java spring app the spring hibernate branch incorporated hibernate to add mysql database functionality to the app the current work in progress branch is the maven security branch which adds login user roles and maven support to the web app it is currently being merged with the initial console game to complete the prototype of the full web application
server
git-brunching
https raw githubusercontent com wiki se701group3 git brunching images logo png welcome to the git brunching repository git brunching is an open source restaurant manager if you are looking to contribute to the application check out our contribution guide https github com se701group3 git brunching wiki getting started with contribution more detailed module specific readmes are also available in the backend frontend and testing directories
os
SYNwall
synwall zero config iot firewall synwall is a project built for the time being as a linux kernel module to implement a transparent and no config no maintenance firewall you may want to have a look to the use case https github com synwall synwall blob master usecase md for an example of usage basics usually iot devices are out of a central control with low profile hardware tough environmental conditions and we have no time to dedicate to maintain the security so may be we can not patch our iot infrastructure and it will be very hard to maintain a firewall like access control the idea is to create a de centralized one way onetimepassword code to enable the network access to the device all the traffic not containing the otp will be discarded no prior knowledge about who need to access is required we just need a pre shared key to deploy the protection will be completely transparent to the application level because implemented at network protocol level tcp and udp install this repository contains the linux kernel module it has been tested with 3 x 4 x and 5 x version on x86 64 arm mips and aarch64 architectures it requires the current kernel headers for compilation which can be usually installed with the proper package manager for example on debian like distros sudo apt get install linux headers uname r than it should be enough to run the compilation make configuration the module can be loaded in the usual way with insmod or modprobe it has several parameters that allow you to customize the behaviour pre shared key used for the onetimepassword psk the psk must be a sequence of bytes from 32 to 1024 it will be part of the otp so the length of it will influence the size of the otp injected in the packet without this parameter the module will not load enable udp enable udp 0 enable disable the otp for udp protocol by default it is disabled set to 1 to enable it the otp on udp requires the module to be active on both of the communicating devices since the otp must be removed by the module before the packet is forwarded to the application level if this is not true you may experience weird behaviors the udp connection tracking relies on conntrack module so you may have to insert it to use this functionality this depends on the installation an error will be displayed in the kernel log if so note by default port 53 dns and 123 ntp are blacklisted for outgoing connection so the otp is not added if you need to change this look for udp blacklist array i will add a parameter for this in the future time precision parameter precision 10 the otp is computed also with the current device time since the date could be different on the participating devices you can round the time on a specific value to allow time skew default is 10 the precision is expressed in power of two you may argue why it has been a decision to increase performance and have low impact on low end devices 9 1 second 10 8 seconds precision under 8 is probably not going to work if you increase the precision value at 11 or more consider to increase also the max trash value in syngate netfilter c disable the otp for outgoing packets disable out 0 you may want to disable the otp in outgoing packet by settings this to 1 in this case the module will just drop the packets without otp but it will not participate to the communication mesh with other synwall devices it can be useful in case of issues with the outgoing packets on uplink devices enable dos protection enable antidos 0 this option can be enabled by setting this to 1 if set this will limit 
the otp computation on the device to a given number allow otp ms variable set to 1000 by default in this case only one otp computation per second is allowed preserving the cpu time of the device in case of a dos attack enable ip spoofing protection enable antispoof 0 by default the ip is not part of the otp this could lead to some replay attack you can enable the antispoof protection to be fully safe this may break the communication if some nats are in place between the devices delay in starting up the module functionalities ms load delay 10000 you can decide to wait a while before activating the protection after the module load this could be useful in case of issues and after a reboot to gain access to the device the default is 10 seconds list of ports for port knocking failsafe portk 0 0 0 0 0 if the device clock is going bananas it could be difficult to get access one way could be the delay discussed before but you can also set a sequence of port knocking which can disable the module for a while the list if defined must be of 5 tcp ports if the module identify a syn packet on these ports in one second it disable itself for the same time set as load delay note if you actively use the sequence remember to change it since it can be easily sniffed example of usage warning this is going to drop all the traffic to your device so be sure to know how to access with another synwall device or by disabling it remotely port knocking sudo insmod synwall ko psk 123456789012345678901234567890123 precision 10 portk 12 13 14 15 16 load delay 5000 enable udp 1 project structure synwall repository synwall netfilter c and h netfilter main package with hooks and basic process functions synauth c and h authentication functions used to manage hashes and crypt stuff synquark c and h quark hashing implementation directly based on the work done by jean philippe aumasson veorq at https github com veorq quark https github com veorq quark syngate netfilter c and h netfilter package for socks server module it implements only the outgoing packet marking and is able to manage multiple psk and networks synwall distrib repository ansible scripts for automatic distribution see readme md there synwall ataes132 repository poc for secure eeprom usage psk storage see readme md there synwall docs repository some docs videos demos performance everything has been implemented to be used on low end devices with very low resources the choice of quark hashing for the crypto hash has been done for this reason the overhead added by the otp computation is almost invisible in the regular usage low https github com synwall synwall site blob master assets images synwall constant load png whilst you can see a consistent cpu saving when a lot of traffic is sent to the device high https github com synwall synwall site blob master assets images synwall heavy load png syngate as a companion tool the repository has also the syngate module the syngate has been built with the same logic of the base module synwall but it is working for multiple networks and psk you can define a multiple set of networks with the related psk and other options the idea is to install it on a socks server to allow to use it for different protocols and destinations the synwall vm https github com synwall synwall vm repository contains some script to build such a system with a socks server and the module pre installed syngate is working only on outgoing traffic to compile the syngate module just use make syngate 1 syngate configuration the module can be loaded in the 
usual way with insmod or modprobe it has several parameters that allow you to customize the behaviour it is very similar to the synwall configuration but with a different logic parameters are comma separated list of values and all of them have to be specified the first value of a list correspond to the first of the others not all params are available just the ones that make sense remember the syngate will not affect incoming traffic it has only one parameter different from the synwall config the dstnet list destination network dstnet list ip1 mask1 ip2 mask2 list of networks in the ip mask format example 192 168 1 0 24 if an ip is given instead of network address the network will be computed all the ips belonging to this network will have the connection parameters psk precision etc specified in the other lists at the same array index pre shared key used for the onetimepassword psk list pks1 pks2 the psk must be a sequence of bytes from 32 to 1024 it will be part of the otp so the length of it will influence the size of the otp injected in the packet without this parameter the module will not load enable udp enable udp list 0 1 0 1 enable disable the otp for udp protocol set to 0 to disable it or 1 to enable it the otp on udp requires the module to be active on both of the communicating devices since the otp must be removed by the module before the packet is forwarded to the application level if this is not true you may experience weird behaviors the udp connection tracking relies on conntrack module so you may have to insert it to use this functionality this depends on the installation an error will be displayed in the kernel log if so note by default port 53 dns and 123 ntp are blacklisted for outgoing connection so the otp is not added if you need to change this look for udp blacklist array i will add a parameter for this in the future time precision parameter precision list 9 10 9 10 the otp is computed also with the current device time since the date could be different on the participating devices you can round the time on a specific value to allow time skew the precision is expressed in power of two you may argue why it has been a decision to increase performance and have low impact on low end devices 9 1 second 10 8 seconds enable ip spoofing protection enable antispoof list 0 1 0 1 enable disable the antispoof protection set to 0 to disable it or 1 to enable it by default the ip is not part of the otp on synwall this could lead to some replay attack you can enable the antispoof protection to be fully safe this may break the communication if some nats are in place between the devices cross compiling you can easily cross compile by setting the following variables before doing the make arch target architecture cross compile toolchain path kernel kernel source path example of usage example with two networks directly from cli sudo insmod syngate ko dstnet list 10 1 1 0 24 198 168 10 0 24 psk list d41d8cd98f00b204e9800998ecf8427e efebceec0de382839cd38bffcdc6bf0c enable udp list 0 0 precision list 10 9 enable antispoof list 0 1 or using a configuration file sudo cat etc modprobe d syngate conf syngate config file keep it safe options syngate dstnet list 10 1 1 0 24 198 168 10 0 24 options syngate psk list d41d8cd98f00b204e9800998ecf8427e efebceec0de382839cd38bffcdc6bf0c options syngate enable udp list 0 0 options syngate precision list 10 9 options syngate enable antispoof list 0 1 license gpl 2 0 author information sorint lab
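To make the role of the psk and precision parameters more concrete, below is a small Python illustration of the time-rounding idea behind the one-time password. It is emphatically not the module's actual Quark-based computation: a generic HMAC stands in for the hash, and the exact time unit and scaling used inside the kernel module may differ.

```python
# Illustration of the time-rounding idea behind the "precision" parameter.
# NOT the module's actual Quark-based OTP: a generic HMAC stands in for the hash,
# and the real kernel module may scale or round time differently.
import hashlib
import hmac
import time

def time_bucket(now_seconds: float, precision: int) -> int:
    """Round the clock by dropping the low `precision` bits (a power-of-two window)."""
    return int(now_seconds * 1000) >> precision  # millisecond clock is an assumed unit

def illustrative_otp(psk: bytes, precision: int = 10) -> bytes:
    """Two peers sharing the psk and a roughly synced clock derive the same value."""
    bucket = time_bucket(time.time(), precision)
    return hmac.new(psk, bucket.to_bytes(8, "big"), hashlib.sha256).digest()

psk = b"123456789012345678901234567890123"  # example key of 33 bytes, as in the readme
print(illustrative_otp(psk).hex())
```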
linux-kernel firewall driver security c
server
bds
bds logo docs bds logo png readme zh cn md introduction jd cloud blockchain data service bds is a real time data aggregation analysis and visualization service for chain like unstructured data from all kinds of 3rd party blockchains splitter is the key module of blockchain data service bds and provides the data analysis capability splitter is responsible for consuming blockchain data from the kafka message queue and inserting the data into persistent data storage services relational database data warehouse etc for further processing architecture architecture docs bds architecture jpg environment deployment install bds environment initialization before compiling and running bds you must install the go compilation environment locally go install https golang org doc install install splitter steps 1 set the project path gopath src github com jdcloud bds bds 2 run go build v github com jdcloud bds bds cmd bds splitter to compile and obtain the executable file bds splitter 3 create a new configuration file splitter conf see the config splitter example conf configuration file template 4 run the program bds splitter c splitter conf install confluent and kafka install kafka see kafka https kafka apache org quickstart modify config server properties message max bytes 1048576000 install confluent see confluent https docs confluent io current installation installing cp zip tar html prod kafka cli install unzip the confluent package and run the confluent rest proxy modify path to confluent etc kafka rest kafka rest properties max request size 1048576000 buffer memory 1048576000 send buffer bytes 1048576000 database we now support sql server and postgresql you can choose one as the data storage method sql server buy jcs for sql server https www jdcloud com cn products jcs for sql server postgresql buy jcs for postgresql https www jdcloud com cn products jcs for postgresql after you run the database you need to manually create a new database and use its name to initialize splitter conf install grafana see grafana official https grafana com source code splitter modules splitter readme md development steps 1 define the data structure of the kafka messages 2 define the table structure 3 analyze the kafka messages and store the data in the database contributing contributing guide contributing md license apache license 2 0 license project demonstration blockchain data service https bds jdcloud com
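Before wiring up a real blockchain feed, it can be useful to push a single test record through the Confluent REST proxy started above and let the splitter consume it. This is only a sketch: the proxy address (localhost:8082 is the usual REST proxy default), the topic name and the record fields are assumptions, not values defined by BDS.

```python
import json
import requests

# Sketch: publish one test record through the Confluent Kafka REST proxy
# started above so that the splitter has something to consume.
# Proxy address, topic name and record layout are assumptions.
REST_PROXY = "http://localhost:8082"
TOPIC = "test_blocks"

payload = {"records": [{"value": {"height": 1, "hash": "00000000abc"}}]}
resp = requests.post(
    f"{REST_PROXY}/topics/{TOPIC}",
    headers={"Content-Type": "application/vnd.kafka.json.v2+json"},
    data=json.dumps(payload),
)
resp.raise_for_status()
print(resp.json())  # the proxy answers with the partition/offset of the produced record
```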
bitcoin bitcoin-cash bitcoin-abc bitcoin-sv ethereum ethereum-classic xlm dogecoin doge tron litecoin troncoin stellar golang bds kafka sql-server
blockchain
airpub
airpub logo http ww1 sinaimg cn large 61ff0de3gw1ejdogm8gyxj20pt0hamyc jpg http airpub io airpub npm version https img shields io npm v airpub svg style flat airpub http airpub io angular js http duoshuo com api airpub github pages ftp vps http airpub io e5 a6 82 e4 bd 95 e5 ae 89 e8 a3 85 e9 85 8d e7 bd ae e6 8c 87 e5 8d 97 e5 bc 80 e5 8f 91 e6 8c 87 e5 8d 97 e8 b4 a1 e7 8c ae e4 bb a3 e7 a0 81 airpub x markdown x x x x angular directive add on x bower x angular seo x x x todo list x bootstrap css x angular md5 angular base64 acl airpub cli installer https github com airpub installer airpub sudo npm install g airpub installer mkdir my airpub blog cd my airpub blog airpub install airpub installer e9 85 8d e7 bd ae e6 8c 87 e5 8d 97 vi configs js airpub airpub git git clone https github com turingou airpub git cd airpub cp configs sample js configs js vi configs js airpub bower install airpub airpub configs js airpubconfig airpub name description url upyun bucket bucket policy signature policy policy upyun sign signature signature upyun sign form api secret form api secret host endpoint api duoshuoquery short name airpub short name short name short name myname wordpress myname airpub api form api secret form api secret js base64 js md5 airpub bower npm airpub node js npm bower cd airpub bower install npm install npm run dev npm run dev localhost 3000 dist airpub airpub fork pull request airpub add on issue https github com duoshuo airpub issues mit license copyright c 2014 guo yu lt o u turing gmail com gt permission is hereby granted free of charge to any person obtaining a copy of this software and associated documentation files the software to deal in the software without restriction including without limitation the rights to use copy modify merge publish distribute sublicense and or sell copies of the software and to permit persons to whom the software is furnished to do so subject to the following conditions the above copyright notice and this permission notice shall be included in all copies or substantial portions of the software the software is provided as is without warranty of any kind express or implied including but not limited to the warranties of merchantability fitness for a particular purpose and noninfringement in no event shall the authors or copyright holders be liable for any claim damages or other liability whether in an action of contract tort or otherwise arising from out of or in connection with the software or the use or other dealings in the software docor https cdn1 iconfinder com data icons windows8 icons iconpharm 26 doctor png generated using docor https github com turingou docor git 0 1 0 brought to you by turingou https github com turingou
front_end
IoT
iot iot devices and sensors is a collection of libraries for windows 10 iot core
server
generator-jhipster
logo jhipster image jhipster url npm version npm image npm url downloads npmcharts image npmcharts url gitter gitter badge image gitter badge url known vulnerabilities snyk image snyk url package health health image health url generator build status github actions generator image github actions url angular build status github actions angular image github actions url react build status github actions react image github actions url vue build status github actions vue image github actions url revved up by gradle enterprise https img shields io badge revved 20up 20by gradle 20enterprise 06a0ce logo gradle labelcolor 02303a https ge jhipster tech scans greetings java hipster full documentation and information is available on our website at https www jhipster tech jhipster url please read our guidelines contributing md submitting an issue before submitting an issue if your issue is a bug please use the bug template pre populated here issue template for feature requests and queries you can use this template feature template if you have found a potential security issue please read our security policy and contact us privately first https github com jhipster generator jhipster security policy contributing we are honoured by any contributions you may have small or large please refer to our contribution guidelines and instructions document https github com jhipster generator jhipster blob main contributing md for any information about contributing to the project sponsors support this project by becoming a sponsor become a sponsor https opencollective com generator jhipster or learn more about sponsoring the project https www jhipster tech sponsors thank you to our sponsors gold sponsors table tbody tr td align center valign middle a href https dev entando org jhipster target blank img width 200em src https www jhipster tech images open collective entandoe png a td tr tbody table bronze sponsors bronzesponsors bronze sponsors image bronze sponsors url backers thank you to all our backers backers backers image backers url object data https opencollective com generator jhipster tiers backer svg avatarheight 40 width 890 button false type image svg xml object azure builds additional builds at hipster labs jhipster daily builds https github com hipster labs jhipster daily builds pipeline status angular maven sql github angular maven sql github actions angular maven nosql github angular maven nosql github actions angular gradle sql github angular gradle sql github actions angular gradle nosql github angular gradle nosql github actions react maven sql github react maven sql github actions react maven nosql github react maven nosql github actions react gradle sql github react gradle sql github actions react gradle nosql github react gradle nosql github actions vue maven sql github vue maven sql github actions vue maven nosql github vue maven nosql github actions vue gradle sql github vue gradle sql github actions vue gradle nosql github vue gradle nosql github actions elasticsearch github elasticsearch github actions monolith oauth2 github monolith oauth2 github actions no database github no database github actions microservices jwt github ms jwt github actions microservices oauth2 github ms oauth2 github actions docker image github docker image github actions official windows github official windows github actions analysis of the sample jhipster project sonar quality gate sonar quality gate sonar url sonar coverage sonar coverage sonar url sonar bugs sonar bugs sonar url sonar vulnerabilities sonar 
vulnerabilities sonar url github actions https github com hipster labs jhipster daily builds actions github official windows https github com hipster labs jhipster daily builds workflows official 20windows badge svg github angular maven sql https github com hipster labs jhipster daily builds workflows angular 20maven 20sql badge svg github angular maven nosql https github com hipster labs jhipster daily builds workflows angular 20maven 20nosql badge svg github angular gradle sql https github com hipster labs jhipster daily builds workflows angular 20gradle 20sql badge svg github angular gradle nosql https github com hipster labs jhipster daily builds workflows angular 20gradle 20nosql badge svg github react maven sql https github com hipster labs jhipster daily builds workflows react 20maven 20sql badge svg github react maven nosql https github com hipster labs jhipster daily builds workflows react 20maven 20nosql badge svg github react gradle sql https github com hipster labs jhipster daily builds workflows react 20gradle 20sql badge svg github react gradle nosql https github com hipster labs jhipster daily builds workflows react 20gradle 20nosql badge svg github vue maven sql https github com hipster labs jhipster daily builds workflows vue 20maven 20sql badge svg github vue maven nosql https github com hipster labs jhipster daily builds workflows vue 20maven 20nosql badge svg github vue gradle sql https github com hipster labs jhipster daily builds workflows vue 20gradle 20sql badge svg github vue gradle nosql https github com hipster labs jhipster daily builds workflows vue 20gradle 20nosql badge svg github elasticsearch https github com hipster labs jhipster daily builds workflows elasticsearch badge svg github monolith oauth2 https github com hipster labs jhipster daily builds workflows monolith 20oauth 202 0 badge svg github no database https github com hipster labs jhipster daily builds workflows no 20database badge svg github ms jwt https github com hipster labs jhipster daily builds workflows microservices 20jwt badge svg github ms oauth2 https github com hipster labs jhipster daily builds workflows microservices 20oauth 202 0 badge svg github docker image https github com hipster labs jhipster daily builds workflows docker 20image badge svg sonar url https sonarcloud io dashboard id jhipster sample application sonar quality gate https sonarcloud io api project badges measure project jhipster sample application metric alert status sonar coverage https sonarcloud io api project badges measure project jhipster sample application metric coverage sonar bugs https sonarcloud io api project badges measure project jhipster sample application metric bugs sonar vulnerabilities https sonarcloud io api project badges measure project jhipster sample application metric vulnerabilities jhipster image https raw githubusercontent com jhipster jhipster artwork main logos v2 normal v2 20jhipster 20rgb png jhipster url https www jhipster tech npm image https badge fury io js generator jhipster svg npm url https npmjs org package generator jhipster github actions generator image https github com jhipster generator jhipster workflows generator badge svg github actions angular image https github com jhipster generator jhipster workflows angular badge svg github actions react image https github com jhipster generator jhipster workflows react badge svg github actions vue image https github com jhipster generator jhipster workflows vue badge svg github actions url https github com jhipster generator 
jhipster actions backers image https opencollective com generator jhipster tiers backer svg avatarheight 70 width 890 backers url https opencollective com generator jhipster bronze sponsors image https opencollective com generator jhipster tiers bronze sponsor svg avatarheight 120 width 890 bronze sponsors url https opencollective com generator jhipster issue template https github com jhipster generator jhipster issues new template bug report md feature template https github com jhipster generator jhipster issues new template feature request md npmcharts image https img shields io npm dm generator jhipster svg label downloads style flat npmcharts url https npmcharts com compare generator jhipster gitter badge image https badges gitter im jhipster generator jhipster svg gitter badge url https gitter im jhipster generator jhipster utm source badge utm medium badge utm campaign pr badge snyk image https snyk io test npm generator jhipster badge svg snyk url https snyk io test npm generator jhipster health image https snyk io advisor npm package generator jhipster badge svg health url https snyk io advisor npm package generator jhipster
angular spring-boot webpack java docker yeoman-generator generator react kubernetes cloud jhipster hacktoberfest
front_end
Horust
img src https github com federicoponzi horust raw master res horust logo png width 300 align center https github com federicoponzi horust raw master res horust logo png ci https github com federicoponzi horust workflows ci badge svg branch master event push https github com federicoponzi horust actions query workflow 3aci mit licensed https img shields io badge license mit blue svg license gitter chat https badges gitter im gitterhq gitter png https gitter im horust init community horust https github com federicoponzi horust is a supervisor init system written in rust and designed to be run inside containers table of contents goals goals status status usage usage contributing contributing license license goals supervision be a fully featured supervision system designed to be run in containers but not only easy to grasp have code that is easy to understand modify and remove when the situation calls for it completeness be a drop in replacement for your own init system rock solid be your favorite egyptian god to trust across all use cases status at this point this should be considered alpha software as in you can and should use it but under your own discretion horust can be used on macos in development situations due to limitations in the macos api subprocesses of supervised processes may not correctly be reaped when their parent process exits usage assume you d like to create a website health monitoring system you can create one using horust and a small python script 1 create a new directory that will contain our services shell mkdir p etc horust services 2 create your first horust service pro tip you can also bootstrap the creation of a new service by using horust sample service new service toml create a new configuration file for horust under etc horust services healthchecker toml toml command tmp healthcheck py start delay 10s restart strategy never a couple of notes are due here this library uses toml https github com toml lang toml for configuration to go along nicely with rust s chosen configuration language there are many supported https github com federicoponzi horust blob master documentation md properties for your service file but only command is required on startup horust will read this service file and run the command according to the restart strategy never as soon as the service has carried out its task it will not restart and horust will exit as you can see it will run the tmp healthcheck py python script which doesn t exist yet let s create it 3 create your python script create a new python script under tmp healthcheck py python usr bin env python3 import urllib request import sys req urllib request request https www google com method head resp urllib request urlopen req if resp status 200 sys exit 0 else sys exit 1 don t forget to make it executable shell chmod x tmp healthcheck py 4 build horust this step is only required because we don t have a release yet or if you like to live on the edge for building horust you will need rust https www rust lang org learn get started as soon as it s installed you can build horust with rust s cargo shell cargo build release 4 run horust now you can just shell horust by default horust searches for services inside the etc horust services folder which we have created in step 1 every 10 seconds from now on horust will send an http head request to https google it if the response is different than 200 then there is an issue in this case we re just exiting with a different exit code i e a 1 instead of a 0 but in real life you could trigger other 
actions maybe storing this information in a database for long term analysis or sending an e mail to the website s owner 5 finish up use kbd ctrl kbd kbd c kbd to stop horust horust will send a sigterm signal to all the running services and if it doesn t hear back for a while it will terminate them by sending an additional sigkill signal check out the documentation https github com federicoponzi horust blob master documentation md for a complete reference of the options available on the service config file a general overview is available below as well toml command bin bash c echo hello world start delay 2s start after database backend toml stdout stdout stderr var logs hello world svc stderr log user root working directory tmp restart strategy never backoff 0s attempts 0 healthiness http endpoint http localhost 8080 healthcheck file path var myservice up failure successful exit code 0 1 255 strategy ignore termination signal term wait 10s die if failed db toml environment keep env false re export path db pass additional key value contributing thanks for considering contributing to horust to get started have a look on contributing md https github com federicoponzi horust blob master contributing md license horust is provided under the mit license please read the attached license https github com federicoponzi horust blob master license file
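As a concrete version of the "real life" follow-up suggested above, the health check can record every probe in a local sqlite database before exiting, so the exit code still drives Horust's restart and failure handling while a history remains available for later analysis. The database path and table name are arbitrary choices made for this sketch, not anything Horust requires.

```python
#!/usr/bin/env python3
# Sketch of the "store it for long term analysis" idea above: log every probe
# to a local sqlite database, then exit 0/1 so Horust still sees the result.
# The database path and table name are assumptions made for this example.
import sqlite3
import sys
import time
import urllib.request

DB_PATH = "/tmp/healthcheck.db"
URL = "https://www.google.com"

conn = sqlite3.connect(DB_PATH)
conn.execute("CREATE TABLE IF NOT EXISTS checks (ts REAL, url TEXT, status INTEGER)")

try:
    req = urllib.request.Request(URL, method="HEAD")
    status = urllib.request.urlopen(req, timeout=10).status
except Exception:
    status = -1  # network error or a non-2xx response raised by urlopen

conn.execute("INSERT INTO checks VALUES (?, ?, ?)", (time.time(), URL, status))
conn.commit()
conn.close()

# a non-zero exit marks the run as failed, exactly like the basic script above
sys.exit(0 if status == 200 else 1)
```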
rust rust-lang supervisor init init-system docker operating-system
os
udagram-restapi-feed
udagram image filtering microservice udagram is a simple cloud application developed alongside the udacity cloud engineering nanodegree it allows users to register and log into a web client post photos to the feed and process photos using an image filtering microservice the project is split into three parts 1 the simple frontend https github com rajan98 udacity c2 frontend a basic ionic client web application which consumes the restapi backend 2 the restapi backend https github com rajan98 udacity c2 restapi a node express server which can be deployed to a cloud service 3 the image filtering microservice https github com rajan98 udacity c2 image filter the final project for the course it is a node express application which runs a simple python script to process images tasks setup python environment you ll need to set up and use a virtual environment for this project to create a virtual environment run the following from within the project directory 1 install virtualenv dependency pip install virtualenv 2 create a virtual environment virtualenv venv 3 activate the virtual environment source venv bin activate note you ll need to do this every time you open a new terminal 4 install dependencies pip install r requirements txt when you re done working and leave the virtual environment run deactivate setup node environment you ll need to create a new node server open a new terminal within the project directory and run 1 initialize a new project npm init 2 install express npm i express save 3 install typescript dependencies npm i ts node dev tslint typescript types bluebird types express types node save dev 4 look at the package json file from the restapi repo and copy the scripts block into the auto generated package json in this project this will allow you to use shorthand commands like npm run dev create a new server ts file use our basic server as an example to set up this file for this project it s ok to keep all of your business logic in the one server ts file but you can try to use feature directories and app use routing if you re up for it use the restapi structure to guide you add an endpoint to handle post imagetoprocess requests it should accept two post parameter image url string a public url of a valid image file upload image signedurl string optional a url which will allow a put request with the processed image it should respond with 422 unprocessable if either post parameters are invalid it should require a token in the auth header or respond with 401 unauthorized it should be versioned the matching token should be saved as an environment variable tip we broke this out into its own auth router before but you can access headers as part of the req headers within your endpoint block it should respond with the image as the body if upload image signedurl is included in the request it should respond with a success message if upload image signedurl is not included in the request refactor your restapi server add a request to the image filter server within the restapi post feed endpoint it should create new signedurls required for the imagetoprocess post request body it should include a post request to the new server tip keep the server address and token as environment variables it should overwrite the image in the bucket with the filtered image in other words it will have the same filename in s3 deploying your system follow the process described in the course to eb init a new application and eb create a new environment to deploy your image filter service stand out postman integration tests 
try writing a postman collection to test your endpoint be sure to cover
post requests with and without tokens
post requests with valid and invalid parameters
refactor data models try adding another column to your tables to save a separate key for your filtered image remember you ll have to rename the file before adding it to s3
advanced refactor data models try adding a second opencv filter script and add an additional parameter to select which filter to use as a post parameter
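A request-level smoke test can complement (or stand in for) the Postman collection suggested above. Everything in the sketch below is reconstructed from the spec in this readme: the host, the exact route, the header carrying the token and the parameter names are assumptions, so adjust them to match your implementation.

```python
import os
import requests

# Reconstructed smoke test for the POST imagetoprocess spec described above.
# Host, route, header name and parameter names are assumptions.
BASE = os.environ.get("IMAGE_FILTER_HOST", "http://localhost:8080")
TOKEN = os.environ.get("IMAGE_FILTER_TOKEN", "dev-token")
ROUTE = f"{BASE}/api/v0/imagetoprocess"

payload = {"image_url": "https://example.com/photo.jpg"}

# without a token the service should answer 401 unauthorized
r = requests.post(ROUTE, json=payload)
assert r.status_code == 401, r.status_code

# with a token but an invalid parameter it should answer 422 unprocessable
r = requests.post(ROUTE, headers={"token": TOKEN}, json={"image_url": "not-a-url"})
assert r.status_code == 422, r.status_code

# a valid request without upload_image_signedurl should return the image itself
r = requests.post(ROUTE, headers={"token": TOKEN}, json=payload)
assert r.status_code == 200 and len(r.content) > 0
```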
cloud
blockchain.eos
blockchain eos eos io
blockchain
jeecg-boot-iotcp
jeecg https static oschina net uploads img 201905 24164523 xdhg png jeecgboot jeecg boot 2 2 1 2020 07 13 aur https img shields io badge license apache 20license 202 0 blue svg https github com zhangdaiscott jeecg boot blob master license https img shields io badge author orange svg http www jeecg com https img shields io badge version 2 2 1 brightgreen svg https github com zhangdaiscott jeecg boot github stars https img shields io github stars zhangdaiscott jeecg boot svg style social label stars https github com zhangdaiscott jeecg boot github forks https img shields io github forks zhangdaiscott jeecg boot svg style social label fork https github com zhangdaiscott jeecg boot h3 align center java low code platform for enterprise web applications h3 jeecgboot springboot2 x ant design vue mybatis plus shiro jwt jeecgboot online coding merge java 70 jeecgboot no jeecg online coding merge online jeecg boot j2ee saas mis oa erp crm merge 70 http boot jeecg com http boot jeecg com http www jeecg com http www jeecg com http doc jeecg com http doc jeecg com 1273753 jeecgboot http www jeecg com doc video http bbs jeecg com forum php mod viewthread tid 7816 extra page 3d1 http www jeecg com doc log qq 769925425 284271917 issues https github com zhangdaiscott jeecg boot issues new jeecg http www jeecg com doc join online 1 https my oschina net jeecg blog 3083313 jeecg boot 1 springboot mybatis antd 2 3 4 23 5 online 5 1 online 5 2 online 6 7 excel 80 8 excel 9 pdf excel word 10 ui 11 sql 12 13 14 saas saas 15 minio oss 16 mysql postgresql oracle 17 activiti bpm bpm java 18 activiti 19 20 cas 21 select radio checkbox textarea date popup 22 restful swagger ui jwt token 23 24 25 redis tomcat jvm sql 26 27 websocket 28 app 29 30 31 ui 32 33 ie11 34 35 maven 36 37 rbac role based access control java 8 ide java idea eclipse lombok ide webstorm idea maven mysql5 7 oracle 11g sqlserver2017 redis spring boot 2 1 3 release mybatis plus 3 1 2 apache shiro 1 4 0 jwt 3 7 0 druid 1 1 10 redis logback fastjson poi swagger ui quartz lombok vue 2 6 10 https cn vuejs org vuex https vuex vuejs org zh vue router https router vuejs org zh axios https github com axios axios ant design vue https vuecomponent github io ant design vue docs vue introduce cn webpack https www webpackjs com yarn https yarnpkg com zh hans vue cropper https github com xyxiao001 vue cropper antv g2 https antv alipay com zh cn index html alipay antv viser vue https viserjs github io docs html viser guide installation antv g2 eslint vue cli 3 2 1 https cli vuejs org zh guide vue print nb 4 excel excel redis tomcat jvm sql swagger ui tab table table jeditable jeditable pdf jeditabletable code cas app websocket online online online online online jeecg boot https static jeecg com upload test jeecg boot lantu202005 1590912449914 jpg java maven jdk8 mysql redis jeecg boot docs jeecg boot mysql sql admin 123456 node yarn webpack eslint vue cli 3 2 1 ant design vue https github com vuecomponent ant design vue ant design of vue vue cropper https github com xyxiao001 vue cropper antv g2 https antv alipay com zh cn index html alipay antv viser vue https viserjs github io docs html viser guide installation antv g2 jeecg boot angular https gitee com dangzhenghui jeecg boot bash git clone https github com zhangdaiscott jeecg boot git cd jeecg boot ant design jeecg vue 1 node js 2 ant design jeecg vue yarn npm install g yarn yarn install yarn run serve yarn run build lints and fixes files yarn run lint https static oschina net uploads img 201912 25133248 ag1c 
jpg https static oschina net uploads img 201912 25133301 k9kc jpg pc https static oschina net uploads img 201904 14155402 amlv png https static oschina net uploads img 201904 14160657 chwb png https static oschina net uploads img 201904 14160813 kmxs png https static oschina net uploads img 201904 14160935 nibs png https static oschina net uploads img 201904 14161004 bxq4 png https static oschina net uploads img 201908 27095258 m2xq png https static oschina net uploads img 201904 14160957 hn3x png https static oschina net uploads img 201904 14160828 pkfr png https static oschina net uploads img 201904 14160834 lo23 png https static oschina net uploads img 201904 14160842 qk7b png https static oschina net uploads img 201904 14160849 gbm5 png https static oschina net uploads img 201904 14160858 6ram png https static oschina net uploads img 201904 14160623 8fwk png https static oschina net uploads img 201904 14160917 9ftz png https static oschina net uploads img 201904 14160633 u59g png https static oschina net uploads img 201907 05165142 yyq7 png https oscimg oschina net oscnet da543c5d0d57baab0cecaa4670c8b68c521 jpg https oscimg oschina net oscnet fda4bd82cab9d682de1c1fbf2060bf14fa6 jpg pad https oscimg oschina net oscnet e90fef970a8c33790ab03ffd6c4c7cec225 jpg https oscimg oschina net oscnet d78218803a9e856a0aa82b45efc49849a0c jpg https oscimg oschina net oscnet 0404054d9a12647ef6f82cf9cfb80a5ac02 jpg https oscimg oschina net oscnet 59c23b230f52384e588ee16309b44fa20de jpg vue cli3 https cli vuejs org guide cli eslint package json eslintconfig ant design vue config js less ant design https ant design docs react customize theme cn ecmascript 6 css loaderoptions less modifyvars less ant design primary color f5222d link color f5222d border radius base 4px javascriptenabled true ant design vue https vuecomponent github io ant design vue docs vue introduce cn viser vue https viserjs github io demo html viser bar basic bar vue https cn vuejs org v2 guide https github com zhangdaiscott jeecg boot tree master ant design jeecg vue src router readme md antd https github com zhangdaiscott jeecg boot tree master ant design jeecg vue src defaultsettings js vue cli eslint vue rules https static oschina net uploads img 201903 08155608 0efx png
server
mbed-rtos
mbed rtos the mbed rtos together with a skeleton for a makefile in case arm messes up their export tools again to use it just clone and add your code the version is unknown maybe 5 x
os
nlp
license https img shields io badge license apache 202 0 blue svg https opensource org licenses apache 2 0 join cdap community https cdap users herokuapp com badge svg t wrangler https cdap users herokuapp com t 1 overview the following are available in this repository nlp directives nlp transform plugins running integration tests to run the tests the path to a service account key must be provided the service account key can be found on the dashboard in the cloud platform console make sure the account key has permission to access the natural language api mvn clean test dservice account file path to service account key json contact mailing lists cdap user group and development discussions cdap user googlegroups com https groups google com d forum cdap user the cdap user mailing list is primarily for users using the product to develop applications or to build plugins for applications you can expect questions from users release announcements and any other discussions that we think will be helpful to the users slack team cdap users on slack cdap users team https cdap users herokuapp com license and trademarks copyright 2019 cask data inc licensed under the apache license version 2 0 the license you may not use this file except in compliance with the license you may obtain a copy of the license at http www apache org licenses license 2 0 unless required by applicable law or agreed to in writing software distributed under the license is distributed on an as is basis without warranties or conditions of any kind either express or implied see the license for the specific language governing permissions and limitations under the license cask is a trademark of cask data inc all rights reserved apache apache hbase and hbase are trademarks of the apache software foundation used with permission no endorsement by the apache software foundation is implied by the use of these marks
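Before launching the Maven run shown above, it can save a round trip to confirm that the path passed via -Dservice.account.file points at a parseable key. The sketch below only inspects the file locally using the field names normally present in Google Cloud service account JSON keys; it does not call the Natural Language API, and the wrapper around mvn is just a convenience, not part of this project.

```python
import json
import subprocess
import sys

# Sketch: sanity check the service account key file locally, then hand it to
# the Maven test run described above. The checked fields are the standard ones
# found in Google Cloud service account JSON keys.
key_path = sys.argv[1] if len(sys.argv) > 1 else "service-account.json"

with open(key_path) as fh:
    key = json.load(fh)

missing = [f for f in ("type", "project_id", "private_key", "client_email") if f not in key]
if missing:
    sys.exit(f"key file {key_path} is missing fields: {', '.join(missing)}")

# equivalent of: mvn clean test -Dservice.account.file=<path to key>
subprocess.run(["mvn", "clean", "test", f"-Dservice.account.file={key_path}"], check=True)
```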
directives nlp natural-language-processing cdap plugin
ai
blockchainguide
mindcarver 2018 image 20201124130443834 https tva1 sinaimg cn large 0081kckwgy1gl06iaf8eej313g0e8aal jpg image 20201124131619288 https tva1 sinaimg cn large 0081kckwgy1gl06udgxr2j315e0sa79d jpg ethfans https learnblockchain cn 2019 11 08 zkp info 1 https www etherchain org 2 https ethstats net 3 https ethereum github io yellowpaper paper pdf 4 https medium com preethikasireddy how does ethereum work anyway 22d1df506369 5 ethereum improvement proposals https eips ethereum org ethereum patricia tree understanding the ethereum trie https easythereentropy wordpress com 2014 06 04 understanding the ethereum trie ethereum ethereum patricia tree https github com ethereum wiki wiki patricia tree ethereum wiki mpt merkle patricia tree http blog csdn net qq 33935254 article details 55505472 merkle patricia tree mpt merkle http blog csdn net zslomo article details 53434883 t 1498537389197 merkle patricia tree mpt http www cnblogs com fengzhiwu p 5584809 html https github com wanshan1024 ethereum yellowpaper blob master ethereum yellow paper cn pdf solidity http me tryblockchain org getting up to speed on ethereum html solidity http wiki jikexueyuan com project solidity zh http ethfans org posts block chain technology smart contracts and ethereum https github com consensys smart contract best practices https learnblockchain cn manuals erc https github com blockchainguide eip cn git erc makerdao https www wanghaoyi com defi maker dao basis html compound https www wanghaoyi com defi compound profile html synthetix synthetix tokensets tokensets pooltogether pooltogether sablier https www wanghaoyi com defi sablier profile html dex uniswap https www wanghaoyi com defi dex uniswap profile html dex dydx https www wanghaoyi com defi dex dydx profile html nexus mutual https www wanghaoyi com defi insurance profile html https www wanghaoyi com blockchain blockchain app datashare https www wanghaoyi com blockchain blockchain app attestation https www wanghaoyi com blockchain blockchain app traceability https www wanghaoyi com blockchain blockchain app token hyperledger fabric fabric fabric http wutongtree github io translations next consensus architecture proposal zh http www 8btc com wiki bitcoin a peer to peer electronic cash system utxo http 8btc com article 4381 1 html http www 8btc com blockchain tech algorithm http www 8btc com blockchain tech mining 1 http www 8btc com blockchain tech consensus mechanism pow pos dpos pbft http blog csdn net lsttoy article details 61624287 https yq aliyun com articles 60400 http 8btc com article 2238 1 html http blog csdn net jeffrey zhou article details 56672948 https www zhihu com question 53385152 paxosstore paxos http www infoq com cn articles wechat paxosstore paxos algorithm protocol raft http www infoq com cn articles raft paper pos https yq aliyun com articles 60400 pos pow http 8btc com article 1882 1 html dpos http www 8btc com dpos back to satoshi dpos http www 8btc com dpossha dpos https steemit com dpos legendx dpos dpos vs pow https zhuanlan zhihu com p 28127511 pos dpos pow https www zhihu com question 49995385 https github com blockchainguide zkp tech git http ethfans org posts blockchain infrastructure landscape a first principles http www 8btc com elwingao blockchain 6 cto http www 8btc com blockchain architecture http www 8btc com ethereum top 10 app https github com blockchainguide zkp tech git http blog csdn net elwingao article details 53410750 http www 8btc com bytom sidechain rootstock http www 8btc com sidechains drivechains and rsk 2 way peg 
design http www 8btc com enabling blockchain innovations with pegged sidechains abstract introduction off chain http view xiaomiquan com view 59a3e22d2540ed222c6075b8 corda http www 8btc com ln rn corda rootstock http www 8btc com tan90d88 rootstock http www 8btc com tan90d84 btc relay rootstock http www 8btc com btc relay and rootstock btc relay http btcrelay org ipfs ipfs https ipfser org ipfs https zhuanlan zhihu com ipfsguide filecoin https filecoin io zh cn file https spec filecoin io filecoin https research protocol ai groups cryptolab https protocol ai filecoin https docs filecoin io https gguoss github io 2017 05 28 ipfs https github com ipfs papers raw master ipfs cap2pfs ipfs p2p file system pdf
blockchain
pcap2socks-gui
pcap2socks gui pcap2socks gui is a front end interface for pcap2socks https github com zhxie pcap2socks for development convenience this project is currently in simplified chinese only dependencies please refer to dependencies of pcap2socks https github com zhxie pcap2socks dependencies troubleshoot please refer to troubleshoot of pcap2socks https github com zhxie pcap2socks troubleshoot license pcap2socks gui is licensed under the mit license license
front_end
guiadofrontend
p align center a href https github com arthurspk guiadofrontend img src images guia png alt guia do desenvolvedor front end width 160 height 160 a h1 align center guia do desenvolvedor front end h1 p dart o guia para alavancar a sua carreira abaixo voc encontrar conte dos para te guiar e ajudar a se torna um desenvolvedor front end caso voc j atue como front end confere o reposit rio para descobrir novas ferramentas para o seu dia a dia os caminhos que voc pode tomar e as tecnologias para incorporar na sua stack para se tornar um profissional atualizado e diferenciado em front end fa a bom uso do guia e bons estudos sub strong segue nas redes sociais para acompanhar mais conte do strong br img src https img shields io badge github 100000 style for the badge logo github logocolor white https github com arthurspk img src https img shields io badge facebook 1877f2 style for the badge logo facebook logocolor white https www facebook com seixasqlc img src https img shields io badge linkedin 230077b5 svg style for the badge logo linkedin logocolor white https www linkedin com in arthurspk img src https img shields io badge twitter 1da1f2 style for the badge logo twitter logocolor white https twitter com manotoquinho discord badge https img shields io badge discord 5865f2 style for the badge logo discord logocolor white https discord gg nbmqupjhz7 img src https img shields io badge instagram 23e4405f svg style for the badge logo instagram logocolor white https www instagram com guiadevbrasil youtube badge https img shields io badge youtube ff0000 style for the badge logo youtube logocolor white https www youtube com channel uczmxzz vr0li8 yovwn t3g sub doa es ol se voc est lendo isso porque provavelmente j conhece o meu reposit rio no github que oferece conte do gratuito para ajudar desenvolvedores a aprimorarem suas habilidades e se voc est aqui talvez esteja considerando contribuir com uma doa o para apoiar a continua o do projeto clique aqui para realizar realizar uma doa o https beacons ai doacoesguiadev se voc quiser contribuir existem v rias op es dispon veis incluindo paypal pagseguro mercado pago buy me a coffe pic pay e pix qualquer doa o por menor que seja extremamente bem vinda e ser usada com responsabilidade e transpar ncia obrigado por considerar apoiar meu projeto juntos podemos continuar a compartilhar conhecimento e ajudar a criar uma comunidade de desenvolvedores mais forte e colaborativa closed book e book este reposit rio um projeto gratuito para a comunidade de desenvolvedores voc pode me ajudar comprando o e book e front se estiver interessado em aprender ou melhorar suas habilidades de desenvolvimento front end o e book completo e cobre tecnologias essenciais como html css javascript react typescript e mais o valor simb lico e sua compra me ajuda a produzir e fornecer mais conte do gratuito para a comunidade adquira agora e comece sua jornada no desenvolvimento front end efront estudando desenvolvimento front end do zero clique aqui para comprar https hotm art csmobu direitos autorais esse projeto tomou como refer ncia para ser feito os roadmaps feito pelo projeto roadmap sh https roadmap sh roadmaps voc pode conferir todo o projeto original feito pelos autores principais pelos links abaixo desde j agradecendo a todos eles por fornecer esse conte do que serviu de extrema import ncia e de base para a cria o deste reposit rio roadmap sh https roadmap sh roadmaps site do roadmap sh onde voc encontrar diversos roadmaps em ingl s reposit rio oficial do projeto https github com 
kamranahmedse developer roadmap blob master license reposit rio oficial do projeto em ingl s warning nossa proposta a proposta deste guia dar uma ideia sobre o atual panorama e gui lo se voc estiver confuso sobre qual o pr ximo aprendizado n o influenciar voc a seguir os hypes e trendys do momento acreditamos que com um b maior conhecimento das diferentes estruturas e solu es dispon veis poder escolher a ferramenta que melhor se aplica s suas demandas b e lembre se hypes e trendys nem sempre s o as melhores op es beginner para quem est come ando agora n o se assuste com a quantidade de conte do apresentados neste guia mesmo o foco sendo para profissionais j consolidados que desejam se manter atualizados acredito que quem est come ando pode us lo n o como um objetivo mas como um apoio para os estudos b neste momento d enfoque no que te d produtividade e o restante marque como i ver depois i b ao passo que seu conhecimento se torna mais amplo a tend ncia este guia fazer mais sentido e f cil de ser assimilado bons estudos e entre em contato sempre que quiser punch colabore abra pull requests com atualiza es discuta ideias em issues compartilhe o reposit rio com a sua comunidade mande feedbacks no linkedin https www linkedin com in arthurspk tradu o se voc deseja acompanhar esse reposit rio em outro idioma que n o seja o portugu s brasileiro voc pode optar pelas escolhas de idiomas abaixo voc tamb m pode colaborar com a tradu o para outros idiomas e a corre es de poss veis erros ortogr ficos a comunidade agradece img src https i imgur com lpp9v2p png alt guia extenso de programa o width 16 height 15 b english b click here https github com arthurspk guiadevbrasil br img src https i imgur com gprsvje png alt guia extenso de programa o width 16 height 15 b spanish b click here https github com arthurspk guiadevbrasil br img src https i imgur com 4dx1q8l png alt guia extenso de programa o width 16 height 15 b chinese b click here https github com arthurspk guiadevbrasil br img src https i imgur com 6mnaomg png alt guia extenso de programa o width 16 height 15 b hindi b click here https github com arthurspk guiadevbrasil br img src https i imgur com 8t4zbfd png alt guia extenso de programa o width 16 height 15 b arabic b click here https github com arthurspk guiadevbrasil br img src https i imgur com iodztmd png alt guia extenso de programa o width 16 height 15 b french b click here https github com arthurspk guiadevbrasil br img src https i imgur com pilsgao png alt guia extenso de programa o width 16 height 15 b italian b click here https github com arthurspk guiadevbrasil br img src https i imgur com 0lzosiy png alt guia extenso de programa o width 16 height 15 b korean b click here https github com arthurspk guiadevbrasil br img src https i imgur com 3s5pflq png alt guia extenso de programa o width 16 height 15 b russian b click here https github com arthurspk guiadevbrasil br img src https i imgur com i6dqjza png alt guia extenso de programa o width 16 height 15 b german b click here https github com arthurspk guiadevbrasil br img src https i imgur com wwrzmnk png alt guia extenso de programa o width 16 height 15 b japanese b click here https github com arthurspk guiadevbrasil br frontend roadmap frontend roadmap images frontend jpg http o hypertext transfer protocol sigla http um protocolo de comunica o utilizado para sistemas de informa o de hiperm dia distribu dos e colaborativos ele a base para a comunica o de dados da world wide web mdn http https developer mozilla org pt br docs web http 
o mdn web docs o website oficial de mozilla para desenvolvimento de padr es web html html linguagem de marca o de hipertexto o bloco de constru o mais b sico da web define o significado e a estrutura do conte do da web outras tecnologias al m do html geralmente s o usadas para descrever a apar ncia apresenta o css ou a funcionalidade comportamento javascript de uma p gina da web sendo assim html uma linguagem ess ncia para voc que quer se tornar desenvolvedor front end mdn html https developer mozilla org pt br docs web html o mdn web docs o website oficial de mozilla para desenvolvimento de padr es web cursos de html https github com arthurspk guiadevbrasil cursos de html e css cursos de html do reposit rio geral do guia dev brasil css css cascading style sheets um mecanismo para adicionar estilo a um documento web o c digo css pode ser aplicado diretamente nas tags ou ficar contido dentro das tags style tamb m poss vel em vez de colocar a formata o dentro do documento criar um link para um arquivo css que cont m os estilos mdn css https developer mozilla org pt br docs web css o mdn web docs o website oficial de mozilla para desenvolvimento de padr es web cursos de css https github com arthurspk guiadevbrasil cursos de html e css cursos de css do reposit rio geral do guia dev brasil javascript javascript uma linguagem de programa o interpretada estruturada de script em alto n vel com tipagem din mica fraca e multiparadigma juntamente com html e css o javascript uma das tr s principais tecnologias da world wide web mdn javascript https developer mozilla org pt br docs web css o mdn web docs o website oficial de mozilla para desenvolvimento de padr es web cursos de javascript https github com arthurspk guiadevbrasil cursos de javascript cursos de javascript do reposit rio geral do guia dev brasil frameworks framework um conjunto de c digos prontos que podem ser usados no desenvolvimento de aplicativos e sites o objetivo dessa ferramenta aplicar funcionalidades comandos e estruturas j prontas para garantir qualidade no desenvolvimento de um projeto cada linguagem possi diversos frameworks que podem ser utilizado para te ajudar em umafuncionalidade espec fica por configura o durante a programa o de uma aplica o frameworks para html https www webfx com blog web design html5 frameworks 10 frameworks responsivos para ser utilizado no html5 frameworks para css https rockcontent com br talent blog frameworks css 14 frameworks que podem ser utilizados no seu css frameworks no front end https mundodevops com blog framework front end frameworks mais utilizados no front end frameworks e bibliotecas javascript https blog geekhunter com br frameworks javascript e bibliotecas java text um 20framework 20javascript 20 c3 a9 20uma ficar 20reescrevendo 20linhas 20de 20c c3 b3digo frameworks e bibliotecas para ser utilizadas no javascript apis o conceito de api nada mais do que uma forma de comunica o entre sistemas elas permitem a integra o entre dois sistemas em que um deles fornece informa es e servi os que podem ser utilizados pelo outro sem a necessidade de o sistema que consome a api conhecer detalhes de implementa o do software as apis permitem que o usu rio final utilize um aplicativo software ou at uma simples planilha consultando alterando e armazenando dados de diversos sistemas sem que o usu rio precise acess los diretamente mdn apis https developer mozilla org pt br docs web api o mdn web docs o website oficial de mozilla para desenvolvimento de padr es web apis p blicas https github com public 
apis public apis uma lista coletiva de apis gratuitas para uso em software e desenvolvimento web recomenda o de livros clean code c digo limpo https g co kgs 62wx9t refactoring refatora o https g co kgs hf2ey3 clean archtecture arquitertura limpa https g co kgs hf2ey3 o programador pragm tico https g co kgs 5nbqb3 ferramentas sites para desenvolvedor front end text pop 3d https textpop3d web app cria efeitos de texto 3d shape dividers https shapedividers com gera divisores de formas verticais responsivos e animados facilmente com este gerador de divisores de formas svg couleur https couleur io uma ferramenta de cores simples para ajud lo a encontrar uma boa paleta de cores para seu projeto da web baseline css filters https baseline is tools css photo filters 36 belos filtros de fotos com edi o simples e css para copiar ui deck https uideck com modelo de p gina de destino html gratuitos e premium temas de bootstrap modelos de react modelos de tailwind modelos de site html e kits de interface de us ario naevner https naevner com descri o de cores em linguagem natural gerador de c digos em cores hexadecimais templates html gratuitos bootstrap made https bootstrapmade com temas html5 css3 gratuitos w3 layouts https w3layouts com temas html5 css3 gratuitos one page love https onepagelove com temas html5 css3 gratuitos themewagon freebies https themewagon com theme tag free temas html5 css3 gratuitos sites para aprender ou treinar css css grid garden http cssgridgarden com ferramenta online para estudos de grid css flukeout http flukeout github io aplica o online para aprender css de forma pr tica flex box froggy https flexboxfroggy com desafio de programa o front end focados na propriedade flex box ide integrada flexbox defense http www flexboxdefense com aprenda flexbox com um game 100 dias de css https 100dayscss com 100 desafios de css css battle https cssbattle dev batalhas tempor rias de css ide integrada css tricks https css tricks com guides site para treinar css hell https csshell dev cole o de erros comuns de css e como corrigi los geradores de css neumorphism https neumorphism io tend ncia aplica o border radius fancy border radius https 9elements github io fancy border radius gerador de formas com border radius no css wait animate https waitanimate wstone io gerador de anima es de css best css button generator https www bestcssbuttongenerator com gerador de but es do css html css js generator https html css js com css generator gerador de html css js bennettfeely https bennettfeely com clippy criador de clip path ferramentas de desenvolvimento internxt https internxt com internxt drive um armazenamento de arquivos de conhecimento zero servi o baseado na melhor privacidade e seguran a da classe motion https motion dev uma nova biblioteca de anima o constru da na api web animations para o menor tamanho de arquivo e o desempenho mais r pido hokusai https hokusai app conte do sobre nft s hidden tools https hiddentools dev ampla cole o de ferramentas feitas pela comunidade disponive s para uso dev hints https devhints io cole o de cheatsheets bundlephobia https bundlephobia com descubra o custo de adicionar um pacote npm ao seu pacote refactoring guru https refactoring guru pt br design patterns padr es de projetos design patterns devdocs https devdocs io devdocs combina v rias documenta es de api em uma interface r pida organizada e pesquis vel html validator https www freeformatter com html validator html valida o de arquivo html html 5 test https html5test com index html testa arquivos 
html5 image slide maker https imageslidermaker com v2 ferramenta de gera o gratuita do image slider maker net fiddle https dotnetfiddle net codifique e compartilhe projetos c online 1pagerank http www 1pagerank com rankeie seu site nos mecanismos de buscas e aprenda com a concorr ncia any api https any api com diret rio gratuito com apis p blicas autoprefixer css http autoprefixer github io transpile c digo css atual para c digo css com maior cobertura de navegadores antigos browser diet https browserdiet com pt guia de performance para desenvolvimento web can i email https www caniemail com descubra o que do html e css pode ser usado em estruturas de e mail can i use https caniuse com descubra se voc pode usar determinadas tecnologias web cloudflare https www cloudflare com pt br cdn gr tis cmder https cmder net linha de comando simples consegue rodar comands bash e shell alternativa ao hyper codepen https codepen io rede social de desenvolvedores front end codesandbox https codesandbox io caixa de rea para desenvolvedores web connection strings https www connectionstrings com site com strings de conex o para diversas plataformas css formatter https www cleancss com css beautify retire a minifica o e formate o c digo css css minifier https cssminifier com conversor de c digo css para css minificado css w3 org https jigsaw w3 org css validator validar css debuggex online visual regex tester javascript python and pcre https www debuggex com construa e teste express es regulares docsify https docsify js org crie docs incr veis de projetos easyforms https easyforms vercel app api open source que permite cria o formul rios de contato com html puro editor md https pandao github io editor md en html editor markdown online e open source es6console https es6console com compilador de js para ecmascript firebase https firebase google com hl pt br desenvolva aplicativos mobile e web incr veis este servi o da google firefox developer edition https www mozilla org pt br firefox developer navegador web para desenvolvedores web full page screen capture https chrome google com webstore detail full page screen capture fdpohaocaechififmbbbbbknoalclacl hl pt br capture p ginas inteiras com uma extens o para chrome generatedata http www generatedata com gerador de dados para testes git command explorer https gitexplorer com encontre os comandos certos que voc precisa sem vasculhar a web github gist https gist github com fa a pequenas anota es e pequenos c digos no github gist google transparency report https transparencyreport google com safe browsing search verificar seguran a de um site grader https website grader com avalia o de site how to center in css http howtocenterincss com gerador de c digo para divs ou textos que necessitam de centraliza o hyper https hyper is linha de comando simples til e gratuito joomla https www joomla org cms gratuita js bin https jsbin com codifique e compartilhe projetos html css e js jscompress https jscompress com conversor de c digo js para js minificado json editor online https jsoneditoronline org ferramenta para visualizar e editar arquivos json jsfiddle https jsfiddle net codifique projetos js online jsonlint https jsonlint com ferramenta para validar seu json json generator https app json generator com ferramenta para gerar json com base em template keycdn tools https tools keycdn com fa a uma an lise das suas aplica es web liveweave https liveweave com codifique projetos html css e js lorem ipsum https br lipsum com gerador de texto fict cio mapbox https www mapbox 
com mapas e localiza o para desenvolvedores memcached https memcached org melhore o desempenho de seu website com cache mockaroo https www mockaroo com gerador de dados para testes mussum ipsum https mussumipsum com gerador de texto fict cio npm http server https www npmjs com package http server rode um servidor local com um pacote npm password generator https danieldeev github io password generator um gerador de senhas simples com foco na seguran a online c compiler https www onlinegdb com online c compiler ferramenta para compilar c online react dev tools https chrome google com webstore detail react developer tools fmkadmapgofadopljbjfkapdkoienihi ferramenta para debug do reactjs react hook form https react hook form com valide seus formul rios de projetos que utilizam react ou react native relax http dbis uibk github io relax index htm crie express es alg bricas relacionais de consultas responsive http www codeorama com responsive teste a responsividade do seu site shields io https shields io gerador de badges para markdown ssl server test https www ssllabs com ssltest testar ssl de sites streamyard https streamyard com o streamyard um est dio de transmiss es ao vivo no seu navegador swagger https swagger io ferramenta para projetar construir documentar e usar servi os da web restful tabela ascii https web fe up pt ee96100 projecto tabela 20ascii htm tabela completa com caracteres ascii telegram https telegram org mensageiro criptografado tinyjpg https tinyjpg com comprima imagem do formato jpg tinypng https tinypng com comprima imagem do formato png creately https creately com crie e altere lindos diagramas em tempo real com a sua equipe carbon https carbon now sh crie snippets de codigo clean e bonitos dbdiagram https dbdiagram io home crie elegrantes diagramas de banco de dados e gere script ddl sqldesigner https ondras zarovi cz sql demo crie diagramas de banco de dados de maneira rapida e gere script ddl w3 org https validator w3 org validar html wakatime https wakatime com gerencie seu tempo de desenvolvimento web developer https chrome google com webstore detail web developer bfbameneiokkgbdmiekhjnmfkcnldhhm hl pt br extens o para chrome com multi fun es web dev https web dev testar website criado pela google webpagetest https www webpagetest org testar perfomance de site wedsites https wedsites com liste suas atividades e acompanhe seu progresso wordpress https wordpress org cria o de blogs xml sitemaps https www xml sitemaps com criador de sitemaps xml minimamente https www minimamente com project magic efeitos para utilizar no css hamburgers https jonsuh com hamburgers menu de hamburgers para utilizar em css hover effects https ianlunn github io hover hover effects para utilizar no css design front end adobe xd https www adobe com br products xd html software de design para projetos awwwards https www awwwards com inspira o para interfaces e templates com o que h de mais novo em quest o de design de interfaces bootstrap https www getbootstrap com framework css buildbootstrap https buildbootstrap com crie layout responsivo para o framework bootstrap na vers o 3 e 4 bulma css https bulma io estrutura css gratuita baseada no flexbox canva https www canva com ferramenta de design online chart js https www chartjs org biblioteca javascript de cria o de gr ficos colors and fonts https www colorsandfonts com apresenta paletas de cores e tipografia coolors https coolors co palhetas de cores e monte a sua pr pria colors lol https colors lol reposit rio de paletas de cores cruip https 
cruip com recursos de templates css effects snippets https emilkowalski github io css effects snippets anima es css prontas para usar css layout https csslayout io layouts e padr es populares feitos com css css reference https cssreference io guia visual para css com referencias de uso css tricks https css tricks com blog com v rios tutoriais frontend devsamples https www devsamples com exemplos de c digos f ceis de usar para html css e javascript excalidraw https excalidraw com desenhe diagramas como se tivessem sido feitos a m o fancy border radius https 9elements github io fancy border radius gerador de formas com border radius no css figma https www figma com desenhe projetos online de apps softwares e websites flatui color picker http www flatuicolorpicker com paleta de cores interativa de forma harmonizar o front font flipper https fontflipper com ferramenta para testar fontes fontpair https fontpair co ferramenta para combina es de fontes fontspark https fontspark app gera fontes aleat rias de uma lista de fontes famosas usadas na web foundation https foundation zurb com framework responsivo framer https www framer com ferramenta de cria o de interfaces interativas freefrontend https freefrontend com exemplos de c digos tutoriais e artigos de html css javascript angular jquery react vue gravit designer https www designer io ferramenta de design online com suporte a ilustra o vetorial grid layoutit https grid layoutit com gerador de grid para c digo css html dom https htmldom dev gerenciar o dom html com javascript vanilla interfacer https interfacer xyz recursos para cria o de interfaces interfaces pro https interfaces pro inspira o para interfaces invision https www invisionapp com software de design para projetos lottie https lottiefiles com anima es em after effects via json luna https github com officialmarinho luna framework css brasileiro material ui https material ui com um framework de interface de usu rio para react mockup https mockup io about visualize e colabore no design de aplicativos para dispositivos m veis nes css https nostalgic css github io nes css framework css estilo nes neumorphism https neumorphism io tend ncia aplica o border radius normalize css https necolas github io normalize css normaliza estruturas entre navegadores pixilart https www pixilart com draw desenhe pixel arts online pixlr https pixlr com br conjunto de ferramentas e utilit rios de edi o de imagem baseado em nuvem psd to css shadow http psd to css shadows com gera o script para uma sombra box shadow text shadow no css baseado nas configura es de sombra no photoshop pure css https purecss io framework css responsivo remove bg https www remove bg remove fundos de imagens automaticamente sketch https www sketch com desenvolvimento de layouts em alta qualidade squoosh app https squoosh app compressor de imagens e comparador via navegador sweetalert2 https sweetalert2 github io biblioteca javascript de alertas responsivos e customiz veis tailwind css https tailwindcss com framework de estilo css ui gradients https uigradients com ui gradientes para utilizar vectorizer https www vectorizer io converta imagens como pngs bmps e jpegs em gr ficos vetoriais svg eps dxf whimsical https whimsical com flowchart wireframe sticky notes e mind map x icon editor http www xiconeditor com gerador de favicon com alta resolu o a partir de imagens desafios ace front end https www acefrontend com desafios de programa o front end resultados via texto ide integrada adventoofcode https adventofcode com desafios de 
programa o por temporada sem ide integrada valida o manual feita pelo usu rio app ideas https github com florinpop17 app ideas compilado de desafios para voc testar seus conhecimentos e aumentar seu portf lio codepen challenges https codepen io challenges desafios de programa o front end ide integrada codier https codier io challenge desafios de programa o front end an lise dos resultados feita pela comunidade ide integrada css battle https cssbattle dev batalhas tempor rias de css ide integrada devchallenge https www devchallenge com br site com desafios de front end back end e mobile flex box defense http www flexboxdefense com desafio de programa o front end focados na propriedade flex box ide integrada flex box froggy https flexboxfroggy com desafio de programa o front end focados na propriedade flex box ide integrada front end challanged club https piccalil li category front end challenges club bogs com desafios de programa o front end frontend challengens https github com felipefialho frontend challenges reposit rio no github com v rios desafios solicitados reais solicitados por empresas frontend mentor https www frontendmentor io desafios de programa o front end an lise dos resultados feita pela comunidade sem ide integrada code well https codewell cc treine suas habilidade de html e css com alguns templates bibliotecas para javascript lax js https github com alexfoxy lax js swiper https swiperjs com wow https wowjs uk animate https animate style apexcharts https apexcharts com particles js https vincentgarreau com particles js scrollmagic https scrollmagic io ferramentas para hospedar seu site github pages https pages github com hospedado diretamente de seu reposit rio github basta editar enviar e suas altera es entrar o em vigor award space https www awardspace com hospedagem gratuita na web um subdom nio gratuito php mysql instalador de aplicativo envio de e mail e sem an ncios byet https byet host hospedagem gratuita e servi os de hospedagem premium infinity free https infinityfree net free unlimited web hosting 1freehosting http www 1freehosting com hospedagem de sites gr tis com 100gb de largura de banda amazon web services https aws amazon com pt servi o de aluguel de servidores e outros servi os bluehost https www bluehost com empresa americana de hospedagem de sites digitalocean https www digitalocean com aluguel de servidores dedicados e compartilhados dreamhost https www dreamhost com hospedagem de sites de alta disponibilidade embratel https www embratel com br cloud hospedagem de sites hospedagem de sites nacional godaddy https br godaddy com hosting web hosting hospedagem de sites internacional godaddy https br godaddy com empresa de aluguel de servidores compartilhados dedicados e registro de dom nio google cloud https cloud google com solutions smb web hosting servi o de aluguel de servidores da google heroku https www heroku com hospedagem de sites gr tis com suporte nodejs java ruby php python go scala e clojure hostgator https www hostgator com hospedagem compartilhada e dedicada para sites e servi os hostinger https www hostinger com br hospedagem de sites hostoo https hostoo io hospedagem de sites em cloud computing dedicado ipage https www ipage com hospedagem de sites gringa com descontos para an ncios kinghost https king host hospedagem compartilhada e dedicada para sites e servi os de marketing por e mail netlify https www netlify com hospedagem para sites est ticos que combina implanta o global integra o cont nua e https autom tico one com https www one com 
pt br servi os gerais digitais incluindo hospedagem de sites surge https surge sh hospedagem gratuita para p ginas est ticas umbler https www umbler com br hospedagem compartilhada cloud computing sob taxa o de uso vercel https vercel com hospedagem gr tis de sites est ticos e serveless sites para inspirar o seu desenvolvimento product hunt https www producthunt com namify https namify tech ref producthunt dribbble https dribbble com pinterest https br pinterest com deviant art https www deviantart com lapa https www lapa ninja hyper pixel https hyperpixel io one page love https onepagelove com one page love avatars https onepagelove com boring avatars land book https land book com awwwards https www awwwards com best folios https www bestfolios com home sitesee https sitesee co httpster https httpster net 2021 jun builders club https builders club com css nectar https cssnectar com collect ui https collectui com best web site https bestwebsite gallery banco de imagens gratuitas 500px https 500px com creativecommons banco de imagens gratuitas burst https pt shopify com burst plataforma de imagens do servi o de ecommerce shopify cupcake http cupcake nilssonlee se imagens gratuitas para uso comercial banco de imagens com diversidade https github com julianahelena5 bancodeimagenscomdiversidade banco de imagens com pessoas diversas drawkit https www drawkit io ilustra es para qualquer um usar flaticon https www flaticon com banco de cones vetoriais flickr https flickr com rede social de fot grafos freeimages https pt freeimages com banco de imagens gratuitas freepik stories https stories freepik com banco de ilustra es gratuitas freerange https freerangestock com index php banco de imagens gratuitas glaze https www glazestock com banco de ilustra es sem direitos autorais gratisography https gratisography com banco de imagens gratuitas humaaans https www humaaans com ilustra es de human ides icons8 https icons8 com br cones em diversos estilos imgur https imgur com plataforma com milh es de imagens iradesign https iradesign io illustrations ilustra es edit veis de cores e objetos life of pix https www lifeofpix com banco de imagens gratuitas little visuals https littlevisuals co banco de imagens gratuitas lorempixel http lorempixel com banco de imagens para uso como template lukas zadam https lukaszadam com illustrations ilustra es svg em diferentes tamanhos e estilos manypixels https www manypixels co gallery galeria de ilustra es com direito a edi o de cores morguefile https morguefile com banco de imagens gratuitas nappy https www nappy co banco de imagens gratuitas atribui o recomendada nos twnsnd https nos twnsnd co arquivo p blico de fotos antigas openmoji https openmoji org banco de emojis para uso pexels https www pexels com banco de imagens gratuitas photopin http photopin com banco de imagens gratuitas no estilo criativo picjumbo https picjumbo com banco de imagens gratuitas picsum https picsum photos banco de imagens para uso como template pixabay http www pixabay com banco de imagens gratuitas n o requer atribui o public domain archive https www publicdomainarchive com banco de imagens gratuitas remixicon https remixicon com banco de cones para uso gratuito stocksnap https stocksnap io banco de imagens gratuitas n o requer atribui o undraw https undraw co ilustra es livres para usar unsplash https unsplash com banco de imagens gratuitas visual hunt https visualhunt com banco de imagens gratuitas wikimedia commons https commons wikimedia org wiki main page banco de imagens mundial 
sites para baixar e encontrar fontes adobe fonts https fonts adobe com google fonts https fonts google com dafont https www dafont com pt netfontes https www netfontes com br urbanfonts https www urbanfonts com pt befonts https befonts com fonts space https www fontspace com 1001 fonts https www 1001fonts com abstract fonts https www abstractfonts com fontget https www fontget com material design icons https materialdesignicons com sites de paletas de cores paletton https paletton com adobe color https color adobe com pt create color wheel color hunt https colorhunt co happy hues https www happyhues co coolors https coolors co gradient hunt https gradienthunt com flat ui colors https flatuicolors com grabient https www grabient com pigment https pigment shapefactory co webgradient https webgradients com color lol https colors lol lista de ilustra es drawkit https www drawkit io humaaans https www humaaans com open doodle https www opendoodles com storyset https storyset com undraw https undraw co 404 illustrations by kapwing https www kapwing com 404 illustrations 404 illustrations https error404 fun ouch https icons8 com br illustrations delesing https delesign com free designs graphics pixeltru https www pixeltrue com free illustrations iconscout https iconscout com site de icones drawkit https www drawkit free icons eva icons https akveo github io eva icons feather icons https feathericons com font awesome https fontawesome com heroicons https heroicons dev iconsvg https iconsvg xyz icons8 line awesome https icons8 com line awesome icons8 https icons8 com br shape https shape so flaticon https www flaticon com br bootstrap icons https icons getbootstrap com devicon https devicon dev extens es para o seu navegador file icons for github and gitlab https chrome google com webstore detail file icons for github and ficfmibkjjnpogdcfhfokmihanoldbfe gofullpage https chrome google com webstore detail gofullpage full page scre fdpohaocaechififmbbbbbknoalclacl web developer https chrome google com webstore detail web developer bfbameneiokkgbdmiekhjnmfkcnldhhm hl pt br react developer tools https chrome google com webstore detail react developer tools fmkadmapgofadopljbjfkapdkoienihi window resizer https chrome google com webstore detail window resizer kkelicaakdanhinjdeammmilcgefonfh hl pt br vue devtools https chrome google com webstore detail vuejs devtools nhdogjmejiglipccpnnnanhbledajbpd hl pt br dark reader https chrome google com webstore detail dark reader eimadpbcbfnmbkopoojfekhnkhdbieeh hl pt br
css javascript html css3 frontend front-end-development javascript-library javascript-framework javascript-applications html5 html-css-javascript roadmap roadmaps guia guia-de-estudos
front_end
esp-ac-dimmer
esp ac dimmer library for esp open rtos https github com superhouse esp open rtos to control ac dimmer license mit licensed see the bundled license https github com maximkulkin esp ac dimmer blob master license file for more details
os
mks-restaurant-project
restaurant project assignment your assignment is to recreate this restaurant site img src mockup jpg fork git repo fork this git repo on github then clone it into your makersquare projects folder not your github io folder just makersquare remember we don t want git repos inside git repos you can do this via command line or via the github app command line cd into your makersquare projects folder run git clone link to repo cd mks restaurant project github go to the repositories view click on your username in the sidebar find the mks restaurant project repo in the list click clone to computer and install it into the makersquare projects folder assignment detail everywhere there is something you need to do there is a todo comment for you to find and write code there use them to guide you as you go through these steps in order 1 install foundation http foundation zurb com develop download html download all their css and js and put it in the appropriate css and js folders 2 link to the foundation normalize css and foundation min css in your index html you should see the typography change once they are linked verify they are working with dev inspector 3 write html for the layout we re using foundation so we can use their grid system use rows and columns make it responsive remember that when you use the small grid classes they apply up to large sizes too 4 style the sections with typography colors links etc add the google font bitter in your index html and then in your css in application css write css to give the different page sections the appropriate background colors fonts etc be sure you have done the html first 5 make the slideshow work download and install slides js into your js folder http www slidesjs com in your index html jquery is already linked and now we need to link to the slides js file with a script tag now look at the slides js site to see how to use it http slidesjs com you ll write some jquery to make the slides plugin work you can write it in index html after your link to slides min jquery js we want it to play automatically with the effect fade and an interval of 7000 have pagination false not show the pagination numbers have the navigation use the fade effect the css for the slideshow is already linked in index html for you just make the js work next class more details tbd 6 add the menu items via a json feed 7 write js to make the menu sections show hide on click
front_end
IT_PSRU59
it psru59 information technology chittrapon iwchaona
server
front-end-community
front end language community what is this community for this is a space where women can come together and help each other learn and understand web development we welcome devs of all levels and encourage those with more experience to help out those that are just starting there s always at least one mentor leader present to help you with best practices or to guide you through if you are stuck with a nasty problem every study group will have the following format 6 30pm begin 6 40 6 50pm quick introductions 6 50 8 00pm code 8 00 8 20pm lightning talk a quick presentation or talk if planned 8 20 8 30pm wrap up all events are scheduled via our meetup http www meetup com women who code dc page we meet every monday 6 30 08 30 pm what if i m a first timer that s never coded before check out our guide to front end for first timers https github com womenwhocodedc front end community blob master first timers guides first timers guide md that also contains information on how to learn about coding go through the recommended resources and ask lots of questions what if i m a web development pro we super encourage you to attend the study group work on a personal project or take one of the recommended courses to further your skills help others with questions they might have consider doing a lighting talk consider being a lead resources study guides please refer to these guidelines first timer s guide http bit ly fehn first timers html study guide http bit ly fehn html guide javascript study guide http bit ly fehn js guide lightning talk possible topics present an app you re working on tools editors that help productivity html5 and css tricks bootstrap vs foundation mvc different js frameworks node js databases that work well with javascript like mongodb and parse networking internet technologies what do leads do leads are present to facilitate the study group they help find hosts schedule meetups come up with curriculum and are also available to ask questions can i be a lead yes we are always looking for more leads and if you re interested in getting involved in women who code in any capacity talk to one of the leads present how can i contribute without being a lead we need lots of help coordinating events if you want to help out please let us know slack the wwc dc community uses slack to keep in touch share resources and chat about all things tech you can use slack to ask troubleshooting questions or general questions about coding in any language and the tech industry we encourge you to sign up here http bitly com womenwhocodedcslack and read our guide if you need more help http bit ly slackguide building access certain building managements requires meetup hosts to let people in and do not encourage them being contacted please respect the policy in case the location requires access to the building contact the group lead if information is provided or drop a comment on the meetup page we ll come and get you all women welcome this study group is intended to be exclusively open for any who identify as women we are transgender friendly happy coding alt tag http www barbarianmeetscoding com images i know javascript jpg
front_end
NLPBench
nlpbench evaluating nlp related problem solving ability in large language models nlpbench is a novel benchmark for natural language processing problems consisting of 378 questions sourced from the nlp course final exams at yale university data our example questions example questions assets q eg png our dataset is licensed under the cc by nd https creativecommons org licenses by nd 4 0 deed en you can download our dataset through this link https drive google com drive folders 1haglwzdz fejn7s nbpdlz8gzpowzszn usp sharing environment preparation you can import our environment from the environment yml by bash conda env create f environment yml then activate our conda environment by bash conda activate nlpbench evaluation our evaluations are based on both online gpt 3 5 gpt 4 and palm 2 and open sourced llama 2 falcon bloom etc llms for online llm online llm often requires an api key before access if you want to access the openai model you need to add the openai api key to the system environment as follows bash export openai api key your openai api key and for palm 2 you need to add the palm api key to your system environment as follows bash export palm api key your palm api for open sourced llm we use vllm https github com vllm project vllm to start an openai like endpoint for evaluation all configurations are summarized in utils utils py oai llm config check this list https vllm readthedocs io en latest models supported models html for information on the supported open source model basically if you want to evaluate other open sourced models add your model s configuration in the following format into the oai llm config json huggingface repo model huggingface repo api key empty api base your endpoint host default http 127 0 0 1 8000 v1 then start the endpoint with the following script bash bash scripts serving sh m huggingface repo n number of gpus a host address default 127 0 0 1 p port default 8000 run evaluation we have two steps for evaluation 1 solving the problems and 2 calculating the accuracy we adopt sacred https github com idsia sacred to manage our configurations all configs can be found in configs you can also add your config by creating a specific yaml file as an example you can run the following code to let gpt 3 5 with only zero shot prompting answer the questions without context bash python run py with configs zero shot yaml model name gpt 3 5 turbo ctx false the answer results will be saved in res seed no ctx zero shot gpt 3 5 turbo json you can evaluate the above result by running the following code python evaluate py then the result will be saved in res seed prompt all the prompts in our evaluation can be found in prompts including prompt for question answering qa prompt py system prompt sys prompt py and prompt for tree of thought tot prompt py you can customize your prompt by modifying the above three files citation if you think our repository and result is useful please cite our paper by misc song2023nlpbench title nlpbench evaluating large language models on solving nlp problems author linxin song and jieyu zhang and lechao cheng and pengyuan zhou and tianyi zhou and irene li year 2023 eprint 2309 15630 archiveprefix arxiv primaryclass cs cl
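The flattened readme above describes both the environment variables for the hosted models and the `oai_llm_config` entry needed to point NLPBench at a self-hosted endpoint. The sketch below is a hedged reconstruction of that setup in Python; the exact key names, file paths, and command-line syntax are inferred from the flattened text rather than checked against `utils/utils.py`, so treat them as assumptions.

```python
import os

# Keys for the hosted models, as described above (values are placeholders).
os.environ["OPENAI_API_KEY"] = "your-openai-api-key"
os.environ["PALM_API_KEY"] = "your-palm-api-key"

# Hypothetical reconstruction of one entry of the oai_llm_config described in
# utils/utils.py: an open-source model served through the vLLM OpenAI-like
# endpoint started by scripts/serving.sh. Key names and nesting are inferred
# from the flattened readme text, not copied from the repository.
oai_llm_config = [
    {
        "model": "huggingface/repo",             # the Hugging Face repo id you serve
        "api_key": "EMPTY",                      # the local endpoint does not check keys
        "api_base": "http://127.0.0.1:8000/v1",  # default host/port of the endpoint
    },
]

# The evaluation itself is driven from the command line (paths reconstructed
# from the flattened text), roughly:
#   python run.py with configs/zero_shot.yaml model_name=gpt-3.5-turbo ctx=False
#   python evaluate.py
```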
ai
mortgage-blockchain-demo
mortgage blockchain demo an illustrative dapp for a potential use case in the mortgage space smart contracts can automate mortgage agreements by connecting the parties directly making the process frictionless and less error prone the smart contract can automatically process payments and release liens from land records when the loan is paid smart contracts also improve record visibility for all parties facilitate payment tracking and verification and reduce the errors and costs associated with manual processes setup npm install g ethereumjs testrpc npm install g truffle mkdir demo cd demo git clone https github com rajivjc mortgage blockchain demo git cd mortgage blockchain demo npm install truffle build truffle serve access application http localhost 8080 alternatively you can run the following docker images docker run p 8545 8545 d rajivjc testrpc docker run p 8080 8080 d rajivjc mortgage blockchain demo additional documentation check out this blog https medium com rajiv cheriyan lets get started with your first ethereum dapp f09feb59dd78 for more details
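The setup above leaves a testrpc node listening on port 8545. As a quick sanity check that the local chain is up (this is an addition for illustration, not part of the project, which uses Truffle and JavaScript), one could query the JSON-RPC endpoint with web3.py:

```python
# Not part of the demo itself (the dapp uses Truffle and web3.js); just a quick
# way to confirm the local testrpc chain started above is reachable.
# Requires: pip install web3
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("http://localhost:8545"))  # testrpc's default JSON-RPC port

print("connected:", w3.is_connected())    # called isConnected() in older web3.py releases
print("test accounts:", w3.eth.accounts)  # testrpc pre-funds a handful of accounts
print("latest block:", w3.eth.block_number)
```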
truffle ethereum smart-contracts blockchain mortgage
blockchain
Spring18_Adv_MAD_examples
spring18 adv mad examples atls 4320 5320 mobile app development advanced topics class examples ios 11
front_end
blockchain
java blockchain demo readme the original chinese text was lost in extraction leaving only fragments what survives a companion article https blog csdn net junmoxi article details 79740930 a block structure diagram https github com pibigstar blockchain raw master 01 png and keywords around block hashes block data merkle roots a stringutils helper and a toy transaction example involving bob sally and john
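Most of this readme was lost, but the surviving fragments center on block hashes and merkle roots. As a reminder of what those two ideas look like in code, here is a minimal sketch; it is written in Python for brevity even though the project itself is Java, and the field names and toy transactions are illustrative, not taken from the repository.

```python
import hashlib

def sha256(text):
    # Hex digest of the UTF-8 encoded text.
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def merkle_root(tx_hashes):
    # Pair up hashes level by level until a single root remains.
    level = list(tx_hashes)
    while len(level) > 1:
        if len(level) % 2 == 1:      # duplicate the last hash on odd-sized levels
            level.append(level[-1])
        level = [sha256(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

# Toy transactions loosely echoing the bob/sally/john example in the fragments above.
txs = ["bob->sally:2", "john->bob:1"]
root = merkle_root([sha256(t) for t in txs])

# A block's hash covers its data (here represented by the merkle root) plus the
# previous block's hash, which is what chains blocks together.
prev_hash = "0" * 64
block_hash = sha256(prev_hash + root)
print(root)
print(block_hash)
```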
java blockchain
blockchain
Students
lvcc a y 2016 2017 1st semester act students names github url facebook url 1 acao indira indiraacao https github com indiraacao indira acao 7 https www facebook com indira acao 7 2 adao evan joshua joshua adao https github com joshua adao joshua adao https facebook com joshua adao 3 alviar stephanie grace stephanie101 https github com stephanie101 stephaniegracealviar https facebook com stephaniegracealviar 4 artuza janbert janbertartuza23 https github com janbertartuza23 artuzajanbert https facebook com artuzajanbert 5 berce ruth maruthberce20 https github com maruthberce20 100008426380954 https facebook com profile php id 100008426380954 6 boado jerome boadojerome https github com boadojerome traxex0123 https facebook com traxex0123 7 briones ysrael bibleaffliction11 https github com bibleaffliction11 ysrael o2 https facebook com ysrael o2 8 cabangon karl cabangonkulas https github com cabangonkulas karl cabangon 1 https facebook com karl cabangon 1 9 casipagan brigette xxbrigettexx https github com xxbrigettexx senseibrigette https facebook com senseibrigette 10 de guzman reinerio reineriodeguzman https github com reineriodeguzman reinerio deguzman1 https facebook com reinerio deguzman1 11 del rosario dave dabe2106 https github com dabe2106 dave l delrosario 9 https facebook com dave l delrosario 9 12 diverson christian christiandiverson https github com christiandiverson enixer https facebook com enixer 13 enriquez jessa isang11jessa https github com isang11jessa jessal enriquez https facebook com jessal enriquez 14 espinosa elsie elsieespinosa10 https github com elsieespinosa10 elsie espinoa 180 http www facebook com elsie espinoa 180 15 gaviola mariella mariella18 https github com mariella18 myca gaviola 3 https facebook com myca gaviola 3 16 linao rose ann roseannlinao https github com roseannlinao roseanne linao https facebook com roseanne linao 17 macapagal rangel angelo rangel cruz macapagal https github com rangel cruz macapagal angelo macapagal 94 https facebook com angelo macapagal 94 18 napilot noella amor amornapilot https github com amornapilot noellaamor napilot https www facebook com noellaamor napilot 19 otadoy cherry chery02 https github com chery02 cherryann otadoy https facebook com cherryann otadoy 20 ponce sarah mae saranghae08 https github com saranghae08 sarahnghaeyyooo https facebook com sarahnghaeyyooo 21 quial paul reynald paulreynald https github com paulreynald quialpaul https facebook com quialpaul 22 ramos jaison jan jaison25 https github com jan jaison25 jaison ramos 98 https facebook com jaison ramos 98 23 reyes nicko samcoii22 https github com samcoii22 nicko reyes 921 https facebook com nicko reyes 921 24 ricohermoso russel ixrussel https github com ixrussel russelricohermoso http facebook com russelricohermoso 25 saludo dan mhark dhansaludo https github com dhansaludo danmhark saludo https facebook com danmhark saludo 26 salvador al john john1509 https github com john1509 aljohn1509 https facebook com aljohn1509 27 tan taiki taikitan09 https github com taikitan09 taikihapon https facebook com taikihapon 28 vibar leo leovibar https github com leovibar leo vibar 7 https facebook com leo vibar 7 29 villafuerte adrian rudolf upethegreat2024 https github com upethegreat2024 upethegreat http facebook com upethegreat 30 wabinga jessa mae jessamae24 https github com jessamae24 jessamae wabinga https facebook com jessamae wabinga bsis students names github url facebook url 1 alquiroz lynyrd ross lynyrdross https github com lynyrdross mr lyrhen17 https facebook 
com mr lyrhen17 2 aranas michaela michaela4 https github com michaela4 michaela aranas 9 https facebook com michaela aranas 9 3 avila ronnel nelronel avila19 https github com nelronel avila19 ronnel avila 73 https facebook com ronnel avila 73 4 bernardino eldrin ldrin01 https github com ldrin01 ldrin01 https facebook com ldrin01 5 calingasan ronmar ronmaru009 https github com ronmaru009 ronsky calingasan https facebook com ronsky calingasan 6 ca eda mark paul markpaulcan https github com markpaulcan markpaul caneda 7 https facebook com markpaul caneda 7 7 cao john michael leahmic https github com leahmic jmichcao https facebook com jmichcao 8 cortez chyrine cortezchyrine https github com cortezchyrine cortez chyrine https facebook com cortez chyrine 9 crisostomo rhez sean rhezsean https github com rhezsean rhez sean https facebook com rhez sean 10 dagdag ruthille ruthille https github com ruthille ruthille tan https www facebook com ruthille tan 11 dayrit kent kdpogz https github com kdpogz kenken19xd https facebook com kenken19xd 12 de guzman gerald igooop https github com igooop gherald17 https facebook com gherald17 13 de leon john johndeleon https github com johndeleon huynhthilan deleon https facebook com huynhthilan deleon 14 de mesa josel charille chadenggg https github com chadenggg chadenggg https facebook com chadenggg 15 dimabuyo john kim kenneth kennethdimabuyo https github com kennethdimabuyo kenethdimabuyu https facebook com kenethdimabuyu 16 florentino lemuel rafael lemlemz22 https github com lemlemz22 lemlemz22 https facebook com lemlemz22 17 florentino lorenz lorenzhahaha https github com lorenzhahaha lorenzhahaha https facebook com lorenzhahaha 18 guevarra carlo trunks07 https github com trunks07 carlo guevarra 923 https facebook com carlo guevarra 923 19 inductivo marella xxxmain https github com xxxmain xxmain02 https facebook com xxmain02 20 isip michael john mj isip23 https github com mj isip23 100005215443628 https facebook com profile php id 100005215443628 21 llamado lenny lengg https github com lengg lhenny moraleda https facebook com lhenny moraleda 22 loayon dan avery danavery19 https github com danavery19 dloayon03 https facebook com dloayon03 23 maunes joane pauline paulinemaunes https github com paulinemaunes pau o03 https facebook com pau o03 24 nedia javee javened https github com javened javee nedia 3 https facebook com javee nedia 3 25 orozco kimberly kimorozco https github com kimorozco kimberly orozco 50 https facebook com kimberly orozco 50 26 pangan sarah saaraah22 https github com saaraah22 spangan3 https facebook com spangan3 27 pate a joseph sephatena https github com sephatena joseppatena http facebook com joseppatena 28 pineda aldrin aldriiiiin https github com aldriiiiin aldrin pineda 54 https facebook com aldrin pineda 54 29 quiza charlie kizaken https github com kizaken charlie quiza https facebook com charlie quiza 30 quiza cristian kirisaite https github com kirisaite cristian quiza https facebook com cristian quiza 31 requinto angela mikaela angelarequinto https github com angelarequinto angela requinto https facebook com angela requinto 32 rigo cristine amie amierigo https github com amierigo cristine rigo 7 https facebook com cristine rigo 7 33 san buenaventura reymart mdash131 https github com mdash131 mdash131 https facebook com mdash131 34 sedurante norlieta norlietasedurante https github com norlietasedurante norlieta sedurante https facebook com norlieta sedurante 35 simbulan john rian von wagpokoya https github com wagpokoya 
totoyshabz https facebook com totoyshabz 36 sison ramil princeramil https github com princeramil princeramil sison 06 https facebook com princeramil sison 06 37 tan kyle vincent elyk18 https github com elyk18 kylevincent q tan https facebook com kylevincent q tan 38 tubtub allyson mae allysonmaetubtub https github com allysonmaetubtub tubtuballysonmae http facebook com tubtuballysonmae 39 tuno danalene dandanalene https github com dandanalene dhantuno https facebook com dhantuno 40 visaya christian kulitfreez https github com kulitfreez khuletutzmappetutz https facebook com khuletutzmappetutz 41 zabala abner zabalaabner https github com zabalaabner abner zabala 3 https facebook com abner zabala 3
front_end
HAL_FreeRTOS_Modbus
armink freemodbus slave master rtt stm32 freemodbus modbus modbus freemodbus freemodbus v1 6 modbus 1 1 freemodbus modbus mb c modbus freemodbus modbus mb m c modbus freemodbus modbus ascii mbascii c ascii freemodbus modbus functions mbfunccoils c freemodbus modbus functions mbfunccoils m c freemodbus modbus functions mbfuncdisc c freemodbus modbus functions mbfuncdisc m c freemodbus modbus functions mbfuncholding c freemodbus modbus functions mbfuncholding m c freemodbus modbus functions mbfuncinput c freemodbus modbus functions mbfuncinput m c freemodbus modbus functions mbfuncother c modbus freemodbus modbus functions mbutils c freemodbus modbus rtu mbcrc c crc freemodbus modbus rtu mbrtu c rtu freemodbus modbus rtu mbrtu m c rtu freemodbus modbus tcp mbtcp c tcp freemodbus port port c freemodbus port portevent c freemodbus port portevent m c freemodbus port portserial c freemodbus port portserial m c freemodbus port porttimer c freemodbus port porttimer m c freemodbus port user mb app c modbus freemodbus port user mb app m c modbus m modbus 2 1 2 1 1 rt thread 1 ucos freertos freemodbus port portevent m c xmbmasterporteventinit xmbmasterporteventpost xmbmasterporteventget vmbmasterosresinit xmbmasterrunrestake vmbmasterrunresrelease vmbmastererrorcbrespondtimeout vmbmastererrorcbreceivedata vmbmastererrorcbexecutefunction modbus vmbmastercbrequestscuuess embmasterwaitrequestfinish modbus modbus modbus modbus poll 1 2 1 2 freemodbus port user mb app m c 4 freemodbus id usmregholdbuf 2 1 id 3 1 2 1 3 modbus modbus 4 modbus modbus modbus embmasterreginputcb embmasterregholdingcb embmasterregcoilscb embmasterregdiscretecb easydatamanager https github com armink easydatamanager 2 2 freemodbus port cpu stm32f103x rt thread 1 rt thread bsp mcu 2 2 1 freemodbus port portserial m c vmbmasterportserialenable 485 vmbmasterportclose xmbmasterportserialinit 485 xmbmasterportserialputbyte xmbmasterportserialgetbyte prvvuarttxreadyisr pxmbmasterframecbtransmitterempty prvvuartrxisr pxmbmasterframecbbytereceived cpu 2 2 2 freemodbus port porttimer m c xmbmasterporttimersinit t3 5 usprescalervalue ust35timeout50us vmbmasterporttimerst35enable t3 5 vmbmasterporttimersconvertdelayenable vmbmasterporttimersrespondtimeoutenable vmbmasterporttimersdisable prvvtimerexpiredisr pxmbmasterportcbtimerexpired 1 usprescalervalue ust35timeout50us 2 freemodbus modbus include mbconfig h cpu 3 api modbus api mb mre no err mb mre no reg mb mre ill arg mb mre rev data mb mre timedout mb mre master busy mb mre exe fun modbus c master station usmregholdbuf 0 i 1 3 1 c embmasterreqerrcode embmasterreqwriteholdingregister uchar ucsndaddr ushort usregaddr ushort usregdata long ltimeout ucsndaddr 0 usregaddr usregdata ltimeout 3 2 c embmasterreqerrcode embmasterreqwritemultipleholdingregister uchar ucsndaddr ushort usregaddr ushort usnregs ushort pusdatabuffer long ltimeout ucsndaddr 0 usregaddr usnregs pusdatabuffer ltimeout 3 3 c embmasterreqerrcode embmasterreqreadholdingregister uchar ucsndaddr ushort usregaddr ushort usnregs long ltimeout ucsndaddr 0 usregaddr usregdata ltimeout 3 4 c embmasterreqerrcode embmasterreqreadwritemultipleholdingregister uchar ucsndaddr ushort usreadregaddr ushort usnreadregs ushort pusdatabuffer ushort uswriteregaddr ushort usnwriteregs long ltimeout ucsndaddr 0 usreadregaddr usnreadregs pusdatabuffer uswriteregaddr usnwriteregs ltimeout 3 5 c embmasterreqerrcode embmasterreqreadinputregister uchar ucsndaddr ushort usregaddr ushort usnregs long ltimeout ucsndaddr 0 usregaddr usregdata 
ltimeout 3 6 c embmasterreqerrcode embmasterreqwritecoil uchar ucsndaddr ushort uscoiladdr ushort uscoildata long ltimeout ucsndaddr 0 uscoiladdr uscoildata ltimeout 3 7 c embmasterreqerrcode embmasterreqwritemultiplecoils uchar ucsndaddr ushort uscoiladdr ushort usncoils uchar pucdatabuffer long ltimeout ucsndaddr 0 uscoiladdr usncoils pucdatabuffer ltimeout 3 8 c embmasterreqerrcode embmasterreqreadcoils uchar ucsndaddr ushort uscoiladdr ushort usncoils long ltimeout ucsndaddr 0 uscoiladdr usncoils ltimeout 3 9 c embmasterreqerrcode embmasterreqreaddiscreteinputs uchar ucsndaddr ushort usdiscreteaddr ushort usndiscretein long ltimeout ucsndaddr 0 usdiscreteaddr usndiscretein ltimeout 4 1 freemodbus modbus include mbconfig h modbus rtu modbus ascii modbus tcp 3 modbus rtu 1 4 2 1 embmasterinit modbus 2 embmasterenable modbus 3 embmasterpoll 4 api modbus 4 3 api freemodbus port portevent m c bsd license 1 http www rt thread org
os
memery
memery use human language to search your image folders what is memery meme about having too many memes images e2goemyweaakclz jpeg the problem you have a huge folder of images memes screenshots datasets product photos inspo albums anything you know that somewhere in that folder is the exact image you want but you can t remember the filename or what day you saved it there s nothing you can do but scroll through the folder skimming hundreds of thumbnails hoping you don t accidentally miss it and that you ll recognize it when you do see it humans do this amazingly well but even with computers local image search is still a manual effort you re still sorting through folders of images like an archivist of old now there s memery the memery package provides natural language search over local images you can use it to search for things like a line drawing of a woman facing to the left and get reasonably good results you can do this over thousands of images it s not optimized for performance yet but search times scale well under o n you can view the images in a browser gui or pipe them through command line tools you can use memery or its modules in jupyter notebooks including gui functions under the hood memery makes use of clip the contrastive language image pretraining transformer https github com openai clip released by openai in 2021 clip trains a vision transformer and a language transformer to find the same latent space for images and their captions this makes it perfect for the purpose of natural language image search clip is a giant achievement and memery stands on its shoulders outline usage install locally use gui use cli use the library development contributing who works on this project quickstart windows run the windows install bat file run the windows run bat file installation with python 3 9 or greater from github recommended pip install git https github com deepfates memery git or git clone https github com deepfates memery git cd memery poetry install from pypi pip install memery pip install git https github com openai clip git currently memery defaults to gpu installation this will probably be switched in a future version for now if you want to run cpu only run the following command after installing memery pip install torch 1 7 1 cpu torchvision 0 8 2 cpu torchaudio 0 7 2 f https download pytorch org whl torch stable html someday memery will be packaged in an easy to use format but since this is a python project it is hard to predict when that day will be if you want to help develop memery you ll need to clone the repo see below usage what s your use case i have images and want to search them with a gui app use the browser gui i have a program workflow and want to use image search as one part of it use as a python module use from command line or shell scripts i want to improve on and or contribute to memery development start by cloning the repo use gui currently memery has a rough browser based gui to launch it run the following in a command line memery serve or set up a desktop shortcut that points to the above command optionally you can pass a directory to open on startup like so memery serve home user pictures memes relative directories will also work cd pictures memery serve memes the default directory passed will be images which is memery s example meme directory memery will open in a browser window the interface is pretty straightforward but it has some quirks screenshot of memery gui displaying wholesome memes images streamlit screenshot png the sidebar on the left controls 
the location and query for the search the directory box requires a full directory path unfortunately streamlit does not yet have a folder picker component the path is relative to your current working directory when you run memery serve the search will run once you enter a text or image query if you enter both text and image queries memery will search for the combination beneath these widgets is the output area for temporary messages displayed with each search mostly this can be ignored the right hand panel displays the images and associated options major errors will appear here as giant stack traces sometimes changing variables in the other widgets will fix these errors live if you get a large error here it s helpful to take a screenshot and share it with us in github issues use cli the memery command line matches the core functionality of memery use the recall command to search for images passing the path and optionally passing the n flag to control how many images are returned default 10 use the t flag to pass a text query and the i flag to pass an image query or both memery recall path to image folder t text query i path to image jpg n 20 you can encode and index all the images with the build command optionally specifying the number of workers to build the dataset with default 0 memery build path to image folder workers 4 clear out the encodings and index using the purge command memery purge path to image folder use as a library the core functionality of memery is wrapped into the memery class from memery core import memery memery memery the function currently called query flow accepts a folder name and a query and returns a ranked list of image files you can query with text or a filepath to an image or both ranked memery query flow images dad joke print ranked 5 converting query searching 82 images done in 4 014755964279175 seconds images memes wholesome meme 68 jpg images memes wholesome meme 74 jpg images memes wholesome meme 88 jpg images memes wholesome meme 78 jpg images memes wholesome meme 23 jpg here s the first result from that list images memes wholesome meme 68 jpg so that s how to use memery let s look at how you can help make it better development pull the repo clone this repository from github git clone https github com deepfates memery git install dependencies and memery enter the memery folder and install requirements cd memery poetry install and finally install your local editable copy of memery with pip install e contributing memery is open source and you can contribute see contributing md contributing md for guidelines on how you can help who works on this project memery was first written by max anton brewer aka deepfates in the summer of 2021 some commits are listed from robotface io but that was just me using the wrong account when i first started many ui and back end improvements were added by wkrettek in 2022 i wrote this to solve my own needs and learn notebook based development i hope it helps other people too if you can help me make it better please do i welcome any contribution guidance or criticism the ideal way to get support is to open an issue on github however the fastest way to get a response from me is probably to direct message me on twitter twitter com deepfates
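The library-usage snippet above arrives flattened by the extraction, so here is a best-effort reconstruction of it as runnable code. The class and method names (`Memery` in `memery.core`, `query_flow`) are read off the flattened text and should be double-checked against the installed version.

```python
# Best-effort reconstruction of the library usage described above; class and
# method names are taken from the flattened readme text, so verify them against
# your installed memery version.
from memery.core import Memery

memery = Memery()

# Text query over a local image folder; returns a ranked list of image file paths.
ranked = memery.query_flow("images", "dad joke")
print(ranked[:5])

# The readme also describes combining a text query with an image query via the
# CLI (memery recall <path> -t <text> -i <image>); the GUI exposes the same flow.
```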
natural-language computer-vision local-search image-search python
ai
Prompt-Development-with-Vertex-AI
prompt development with vertex ai with the current ways of generating new and unique content using llm this section will introduce the vertex ai platform and the generative ai studio tool it will also provide an overview of foundational models and how they can be used to build tune and deploy generative ai applications intro in this section you will learn about the vertex ai platform generative ai studio and foundation models you will also learn how to use these tools to build tune and deploy generative ai applications here are some of the key points that will be covered in this section what is vertex ai what is generative ai studio what are foundation models how to use vertex ai and generative ai studio to build tune and deploy generative ai applications what is vertex ai vertex ai is a machine learning platform that helps you build deploy and scale machine learning models it provides a unified experience for data scientists machine learning engineers and operations engineers making it easier to collaborate and manage your ml projects here are some of the key features of vertex ai automl automl makes it easy to train machine learning models without writing any code you simply provide your data and automl will automatically select the best algorithm and hyperparameters for your model explainable ai vertex ai explainable ai helps you understand how your machine learning models make predictions this can be helpful for debugging models and ensuring that they are making fair and unbiased predictions continuous monitoring vertex ai continuous monitoring helps you track the performance of your machine learning models over time this can help you identify problems early on and take corrective action before your models start to degrade deployment vertex ai makes it easy to deploy your machine learning models into production you can deploy models to a variety of endpoints including google cloud functions app engine and cloud run vertex ai is a powerful tool for building and deploying machine learning models if you re looking for a way to simplify your ml workflow and improve the performance of your models then vertex ai is a great option what is generative ai studio generative ai studio is a google cloud console tool for rapidly prototyping and testing generative ai models it provides a variety of features that make it easy to experiment with generative ai including a library of pre trained models generative ai studio includes a library of pre trained models that you can use to generate text images and other content a chat interface you can interact with generative ai studio using a chat interface which makes it easy to experiment with different prompts and parameters a prompt designer the prompt designer allows you to create custom prompts for your generative ai models a tuning tool the tuning tool allows you to fine tune your generative ai models to improve their performance generative ai studio is a powerful tool for exploring the possibilities of generative ai it can be used to create a variety of different types of content including text generative ai studio can be used to generate text such as poems code scripts and musical pieces images generative ai studio can be used to generate images such as paintings sketches and photographs audio generative ai studio can be used to generate audio such as music speech and sound effects generative ai studio is still under development but it is a valuable tool for anyone who is interested in exploring the possibilities of generative ai here are some 
specific examples of what generative ai studio can do generate text that is similar to the style of a particular author create images that are based on a specific description generate music that is in a particular genre translate text from one language to another write different kinds of creative content such as poems code scripts musical pieces emails letters etc generative ai studio is a powerful tool that can be used to create a wide variety of content it is still under development but it is already a valuable resource for anyone who is interested in exploring the possibilities of generative ai what are foundation models foundation models are large pre trained machine learning models trained on a massive dataset of unlabeled data they are designed to be adapted to a wide range of downstream tasks such as natural language processing computer vision and question answering foundation model images foundationmodel png foundation models are trained using a technique called self supervised learning in self supervised learning the model is not given any labels for the data that it is trained on instead the model learns to predict relationships between different parts of the data this allows the model to learn more general representations of the data which makes it easier to adapt to new tasks foundation models have been shown to be very effective at a variety of tasks for example the gpt 3 foundation model has been shown to be able to generate text translate languages write different kinds of creative content and answer your questions in an informative way foundational models are still under development but they have the potential to revolutionize the way that ai is used they could be used to create new applications that are not possible with today s ai technology here are some examples of foundation models gpt 3 gpt 3 is a foundation model developed by openai it is a large language model that can be used to generate text translate languages write different kinds of creative content and answer your questions in an informative way bert bert is a foundation model developed by google ai it is a language model that can be used for a variety of natural language processing tasks such as text classification question answering and sentiment analysis dall e 2 dall e 2 is a foundation model developed by openai it is a generative model that can be used to create images from text descriptions foundation models are a powerful new tool for ai developers they have the potential to revolutionize the way that ai is used we can expect to see many new and exciting applications of foundation models in the years to come hands on tutorial now let s dive into using vertex ai for a hands on tutorial 1 create a google cloud account 2 then open google console button after a few moments the cloud console opens in this tab click on the drop down button on the right of google cloud google cloud dropdown images googleclouddropdown png 3 next type the name of the project you want to create create account images createaccount png enable the vertex ai api 1 in the google cloud console type vertex ai api in the top search bar vaertex ai api images vertexaiapi png 2 click on the result for vertex ai api under marketplace vertex ai marketplace images vertexaipaimerketplace png 3 click enable task 1 text prompts text prompt lets you designs prompts for tasks relevant to your business use case including code generation 1 in the google cloud console enter artificial intelligence 2 click on vertex ai 3 in the vertex ai menu under 
generative ai studio click language 4 click on the text prompt icon text prompt images textprompt png 5 you will be redirected to the following page prompt design images promptdesign png prompt design is the process of figuring out and designing the best input text prompt to get the desired response back from a language model the model will provide a response based on how you structure your prompt for example if you ask a question the model will try to answer it there is no one best way to design prompts but there are three general methods that you can use to shape the model s response in a way that you desire prompting images prompting png 1 zero shot prompting zero shot prompting images zeroshotprompting png can be used the llm can be prompted to generate creative text such as poems code scripts musical pieces emails letters etc for example you could prompt the llm with write a poem about a cat who goes on an adventure and it would generate a poem about a cat who goes on an adventure 2 one shot prompting one shot prompting images oneshotprompting png can be used one shot prompting is a powerful technique that can be used to train llms to perform a variety of tasks it is beneficial when there is limited data available for the task however it is important to note that one shot prompting is not always perfect and the llm may not always be able to perform the task correctly the quality of the results will depend on the quality of the prompt and the example 3 few shot prompting few shot prompting images fewshotprompting png can be used to train an llm to write poems about cats you could provide the llm with a prompt like write a poem about a cat and a few examples of poems about cats to train an llm to translate languages you could provide the llm with a prompt like translate this sentence from english to french and a few examples of sentences that have been translated from english to french to train an llm to answer questions you could provide the llm with a prompt like what is the capital of france and a few examples of questions that have been answered correctly on the language ui image above you ll notice the free form and structured tabs those are the two modes that you can use when designing your prompt free form this mode provides a free and easy approach to designing your prompt it is suitable for small and experimental prompts with no additional examples you will be using this to explore zero shot prompting structured this mode provides an easy to use template approach to prompt design context and multiple examples can be added to the prompt in this mode this is especially useful for one shot and few shot prompting methods which you will be exploring later free form mode let s try zero shot prompting you can use this mode to design prompts i will be using the following prompt to explore zero shot you can follow along with me copy and paste the prompt below into the text field and then click on the submit button copy what is vertex ai freeform images freeform png the model will be able to respond accordingly to the term in the prompt gallery examples to explore when you adjust the parameters on the right 1 adjust the token limit parameter to 1 and click the submit button token limit1 images tokenlimit1 png 2 adjust the token limit parameter to 1024 and click the submit button token limit2 images tokenlimit2 png 3 adjust the temperature parameter to 0 5 and click the submit button temperature limit1 images temperaturelimit1 png 4 adjust the temperature parameter to 1 0 and click the 
submit button temperature limit2 images temperaturelinit2 png as you can see the responses change as the parameters change structured mode using the structured mode we can learn one shot and few shot prompting your prompts can be designed in a more organized way by providing context and examples in their input fields we will be asking the model to complete a sentence 1 click on the structured tab 2 under the test field write the following in the input field 3 click on the submit button structured mode images structuredmode png 4 you will be able to see the result in the output now let s try to influence the model with one shot prompting by adding an example for the model to learn in the examples field 1 copy the following to the input field copy the color of the grass is 2 copy the following to the output field copy green 3 click on the submit button examples field images examplesfield png congratulations you ve successfully influenced the response of the model sentiment analysis we would now perform sentiment analysis on the generated responses such as determining the reviews of a meal in a restaurant as positive or negative 1 under the examples field delete the previous prompts for input and output for green grass sentimental analysis images sentimentanalysis png the model did not have enough data to know it was supposed to use sentiment analysis so we fix this by providing the model with a few examples of sentiment analysis 2 copy the following prompt to the input field under the test field input output test images inputoutputtest png 3 click on the submit button sentiment analysis 2 images sentimentanalysis2 png as you can see the model is able to use sentiment analysis the input text it was a time well spent got labeled as positive to save the prompt you designed click on the save button and name it what you like such as the sentiment analysis test sentimental analysis test images sentimentanalysistest png you can access the saved prompt under the my prompts tab my prompts tab images mypromptstab png task 2 chat prompt the chat prompt creates conversations between the user and the model tracking the user input and the model s output 1 return to the language page chat prompt images chatprompt png 2 click on the text chat button to create a new chat prompt 3 a new chat prompt page will open chat prompt page images chatpromptpage png great we will now steer the model s responses in the chat prompt by adding context for the model to use when producing a response 1 add these contexts to the context field copy your name is roger copy you are a sous chef of a restaurant copy you only respond with have you tried the soup du jour to any queries 2 copy the following to the chatbox under responses copy my meal was not great 3 click the enter key or click the send message button right arrow head button responses images responses png congratulations you learned how to create and test a prompt create a conversation and explore the prompt gallery you have taken the first step to start your journey using generative ai studio and other generative ai development tools
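The walkthrough above is entirely console-based. For readers who prefer code, the sketch below shows a rough programmatic counterpart using the Vertex AI Python SDK; the project id and model names are placeholders, and the `vertexai.language_models` surface shown is the PaLM-era API (older releases exposed it under `vertexai.preview.language_models`, newer ones favor other interfaces), so adapt as needed.

```python
# Rough programmatic counterpart to the console walkthrough above, using the
# PaLM-era Vertex AI SDK (pip install google-cloud-aiplatform). Project id and
# model names are placeholders.
import vertexai
from vertexai.language_models import ChatModel, TextGenerationModel

vertexai.init(project="your-project-id", location="us-central1")

# Zero-shot text prompt, playing with the same parameters as the tutorial
# (token limit and temperature).
text_model = TextGenerationModel.from_pretrained("text-bison@001")
response = text_model.predict(
    "What is Vertex AI?",
    max_output_tokens=1024,
    temperature=0.5,
)
print(response.text)

# Chat prompt with context, mirroring Task 2 (Roger the sous chef).
chat_model = ChatModel.from_pretrained("chat-bison@001")
chat = chat_model.start_chat(
    context=(
        "Your name is Roger. You are a sous chef of a restaurant. "
        "You only respond with 'Have you tried the soup du jour?' to any queries."
    ),
)
print(chat.send_message("My meal was not great.").text)
```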
cloud
punitkatiyar.github.io
start full stack development img src web application development png hi i am punit full stack developer and corporate trainer i m currently working on ducat india pvt ltd i m currently learning react and node i m looking to collaborate on youtube i m looking for help with i am a fullstack developer with php and node how to reach me https www techunitbook com pronouns fun fact 1 before start development system software application software desktop application web application mobile application web development module 1 client side server side domain name dns hosting client side module 2 html and html5 css and css3 javascript jquery bootstrap server side module 3 php node java net python favicon generator https favicon io font awesome https use fontawesome com releases v5 15 4 css all css a full stack freelancer and consultant is a professional who possesses a wide range of skills and expertise in both frontend and backend development they have the ability to work independently or as part of a team to deliver end to end solutions for clients or businesses here is a description that outlines the key responsibilities and skills of a full stack freelancer and consultant responsibilities developing and maintaining websites web applications or software solutions collaborating with clients or project stakeholders to understand their requirements and translate them into technical specifications designing and implementing user interfaces ui and user experiences ux that are visually appealing intuitive and responsive creating and managing databases ensuring efficient data storage and retrieval building and integrating apis application programming interfaces for seamless communication between different components of a system implementing security measures to protect applications and user data writing clean well documented and maintainable code conducting thorough testing and debugging to identify and fix issues providing technical guidance recommendations and support to clients or project teams staying updated with the latest industry trends technologies and best practices skills proficiency in front end technologies such as html css javascript and frameworks like react angular or vue js strong understanding of back end development using languages such as python php ruby or node js experience with server side frameworks like express django flask or ruby on rails knowledge of databases such as mysql postgresql mongodb or redis familiarity with version control systems particularly git ability to work with apis and integrate third party services understanding of responsive design principles and mobile first development knowledge of web performance optimization techniques strong problem solving and debugging skills effective communication and collaboration abilities to work with clients and project teams in addition to technical skills a full stack freelancer and consultant should also possess good project management skills business acumen and the ability to work independently and meet deadlines they should be adaptable continuously learning and capable of providing valuable insights and guidance to clients or businesses
html-css-javascript php node npm react
front_end
RoadToFAANG_CV
the hitchhiker s guide in getting a faang job computer vision note faang is now mamaa meta amazon microsoft alphabet apple check wikipedia https en wikipedia org wiki big tech fang faang and mamaa short term learning list i e i have an interview very soon what should i do youtube talks how to get a job at google amazon facebook or microsoft in 2019 university of southern california https www youtube com watch v 6nodotyhsbc chip huyen on machine learning interviews full stack deep learning november 2019 https www youtube com watch v pli1k75psa8 playlist mock interviews data science jay https youtube com playlist list plxxms4piug2gzxeeqrxxzkbpxvqlksxat online reading materials chip huyen introduction to machine learning interviews https huyenchip com ml interviews book machine learning glossary https ml cheatsheet readthedocs io en latest index html cheatsheet cs 229 machine learning https github com afshinea stanford cs 229 machine learning blob master en super cheatsheet machine learning pdf data science cheatsheet 2 0 https github com aaronwangy data science cheatsheet blob main data science cheatsheet pdf cheatsheet for machine learning and data science https sites google com view datascience cheat sheets explore interviewcake webpages https www interviewcake com scan tech interview handbook https www techinterviewhandbook org long term learning list i e i am searching for a job in next few months what should i do free courses cs229 machine learning stanford university https cs229 stanford edu cs230 deep learning stanford university https cs230 stanford edu cs231n deep learning for computer vision stanford university http cs231n stanford edu cs 329s machine learning systems design stanford university https stanford cs329s github io non free courses grokking the system design interview educative https www educative io courses grokking the system design interview grokking the machine learning interview educative https www educative io courses grokking the machine learning interview ace the python coding interview educative https www educative io path ace python coding interview mlexpert algoexpert https www algoexpert io machine learning product algoexpert algoexpert https www algoexpert io product systemsexpert algoexpert https www algoexpert io systems product books introduction to algorithms fourth edition goodreads https www goodreads com book show 60869154 introduction to algorithms fourth edition cracking the coding interview 6th edition goodreads https www goodreads com book show 55014663 cracking the coding interview system design interview an insider s guide vol i goodreads https www goodreads com book show 54109255 system design interview an insider s guide system design interview an insider s guide vol ii goodreads https www goodreads com book show 60631342 system design interview an insider s guide designing data intensive applications the big ideas behind reliable scalable and maintainable systems goodreads https www goodreads com book show 34626431 designing data intensive applications designing machine learning systems an iterative process for production ready applications goodreads https www goodreads com book show 60715378 designing machine learning systems ace the data science interview goodreads https www goodreads com book show 58949285 ace the data science interview solve as much as possible problems using leetcode recommended https leetcode com or hackerearth https www hackerearth com practice or freecodecamp beginner level https www freecodecamp org learn coding 
interview prep do mock interviews with experts using interviewquery https www interviewquery com interviewing io https interviewing io or igotanoffer https app igotanoffer com coaching tech
computer-science data-science deep-learning machine-learning computer-vision faang image-processing jobs
ai
last-watch-ai
last watch ai build status https travis ci com akmolina28 last watch ai svg branch master https travis ci com akmolina28 last watch ai styleci https github styleci io repos 296938596 shield style flat branch master https github styleci io repos 296938596 last watch ai is a locally hosted application for creating if then automations based on computer vision events last watch can be used in tandem with ip camera software to trigger actions if the ai detects certain objects like cars people or animals this project was heavily inspired by gentlepumpkin bi aidetection https github com gentlepumpkin bi aidetection detection event3 previews detection event3 jpg how it works last watch watches for new image files in a configurable directory then checks each image for a range of objects such as cars or people different automations can be set up to trigger if the ai detects relevant objects in the images a primary use case for last watch is to enhance video management systems such as blue iris for example last watch can be set up to trigger security recordings https kleypot com last watch ai blue iris integration only when a person is detected on security camera thereby eliminating irrelevant motion events last watch can also be integrated with automation platforms suchs as home assistant node red or openhab for instance you can create virtual sensors https kleypot com vehicle presence sensor with home assistant and last watch ai which are set when a person is in a room or when a car is parked in the garage features platform independence everything runs in docker containers web interface desktop and mobile friendly web api https github com akmolina28 last watch ai blob dev docs api md everything can be managed with rest offline and locally hosted supported automations telegram send images to bot folder copy copy images to a local folder smb cifs upload images to a samba share synology windows share home assistant web request make http get and post requests mqtt publish out mqtt messages installation last watch has been tested on windows debian and arch linux dual core cpu and 2gb of memory are recommended detailed windows setup guide https kleypot com last watch ai windows setup detailed ubuntu setup guide https kleypot com last watch ai ubuntu installation and upgrading install from source recommended 1 clone the repo git clone https github com akmolina28 last watch ai git cd last watch ai 2 create the env file from one of the examples cp configs env linux env nano env 3 build the application cp src env example src env sudo docker compose up d mysql sudo docker compose run rm composer install optimize autoloader no dev sudo docker compose run rm artisan route cache sudo docker compose run rm artisan key generate force sudo docker compose run rm artisan storage link sudo docker compose run rm artisan migrate force sudo docker compose run rm npm install verbose sudo docker compose run rm npm run prod verbose 4 start the containers sudo docker compose up d build site upgrade from source 1 stop the containers during the upgrade cd path to last watch ai sudo docker compose down 2 pull latest source code git pull 3 re build application and migrate database sudo docker compose run rm composer install optimize autoloader no dev sudo docker compose run rm artisan route cache sudo docker compose run rm artisan migrate force sudo docker compose run rm npm install verbose sudo docker compose run rm npm rebuild sudo docker compose run rm npm run prod verbose 4 restart the containers sudo docker compose up d 
build site install from release build 1 install docker https docs docker com docker for windows install and docker compose https docs docker com compose install 2 download the latest release https github com akmolina28 last watch ai releases un zip and extract the files 3 set the watch folder and web port by editing the env file 4 start the containers with docker compose docker compose up d build site user guide and documentation user guide https kleypot com last watch ai user guide full walkthrough with blue iris https kleypot com last watch ai blue iris integration api reference https github com akmolina28 last watch ai blob dev docs api md automation examples car presence sensor with home assistant and last watch ai https kleypot com vehicle presence sensor with home assistant and last watch ai contributing contributions are welcome and will be credited development and testing last watch is written in php on the laravel framework https laravel com and the interface is written in vue js https vuejs org the laravel application also functions as a headless api such that the interface can be completely bypassed if needed the application is made up of several docker containers each container serves a different purpose the nginx mysql and php containers make up the lemp stack to host the laravel app the watcher is a node js app that scans for image files and sends webhook messages to the laravel app the deepstack container hosts the object detection api the queue container handles queued jobs https laravel com docs 8 x queues from the laravel app using supervisor the scheduler is a cron like container that handles scheduled tasks https laravel com docs 8 x scheduling from the laravel app npm artisan composer and testing containers are also defined to help with development and building to get started with development follow the steps to install the project from source use your favorite ide to edit the source files and recompile the front end using webpack you can run docker compose run rm npm run watch to automatically recompile when scripts are changed to execute tests use the built in phpunit container by running docker compose run rm phpunit this container will also stand up separate testing containers for mysql and deepstack to run the feature tests a href https www buymeacoffee com akmolina28 target blank img src https cdn buymeacoffee com buttons v2 default yellow png alt buy me a coffee style height 20px important width 72px important a
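the detection flow described above (a watcher spots new image files, the ai checks them for relevant objects, an automation fires) can be illustrated with a small conceptual sketch note that this is not the project's actual implementation, which uses a node.js watcher, a laravel app and queued jobs; it is a hedged python approximation that assumes a deepstack server on localhost:5000 with its documented /v1/vision/detection endpoint, and the watch folder, webhook url and confidence threshold are placeholder values

# Conceptual sketch only: approximates the Last Watch flow in Python.
# Assumes a DeepStack server at http://localhost:5000 (its standard
# /v1/vision/detection endpoint), a watch folder, and a webhook URL;
# all of these values are placeholders, not Last Watch configuration.
import time
from pathlib import Path
import requests

WATCH_DIR = Path("/tmp/aiinput")                      # placeholder watch folder
DEEPSTACK = "http://localhost:5000/v1/vision/detection"
WEBHOOK = "http://homeassistant.local:8123/api/webhook/person-detected"  # placeholder

seen = set()
while True:
    for image in WATCH_DIR.glob("*.jpg"):
        if image in seen:
            continue
        seen.add(image)
        with image.open("rb") as f:
            result = requests.post(DEEPSTACK, files={"image": f}).json()
        # Fire the automation only if a person is detected with enough confidence.
        if any(p["label"] == "person" and p["confidence"] > 0.6
               for p in result.get("predictions", [])):
            requests.post(WEBHOOK, json={"file": image.name})
    time.sleep(1)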
docker home-security home-automation computer-vision deepstack laravel
ai
shapash
p align center img src https raw githubusercontent com maif shapash master docs static shapash resize png width 300 title shapash logo p p align center tests a href https github com maif shapash workflows build 20 26 20test badge svg img src https github com maif shapash workflows build 20 26 20test badge svg alt tests a pypi a href https img shields io pypi v shapash img src https img shields io pypi v shapash alt pypi a downloads a href https static pepy tech personalized badge shapash period total units international system left color grey right color orange left text downloads img src https static pepy tech personalized badge shapash period total units international system left color grey right color orange left text downloads alt downloads a python version a href https img shields io pypi pyversions shapash img src https img shields io pypi pyversions shapash alt pyversion a license a href https img shields io pypi l shapash img src https img shields io pypi l shapash alt license a doc a href https shapash readthedocs io en latest img src https readthedocs org projects shapash badge version latest alt doc a p what s new version new feature description tutorial 2 3 x additional dataset columns br new demo https shapash demo ossbymaif fr br article https pub towardsai net shapash 2 3 0 comprehensive model interpretation 40b50157c2fb in webapp target and error columns added to dataset and possibility to add features outside the model for more filtering options img src https raw githubusercontent com maif shapash master docs static add column icon png width 50 title add column https github com maif shapash blob master tutorial generate webapp tuto webapp01 additional data ipynb 2 3 x identity card br new demo https shapash demo ossbymaif fr br article https pub towardsai net shapash 2 3 0 comprehensive model interpretation 40b50157c2fb in webapp new identity card to summarize the information of the selected sample img src https raw githubusercontent com maif shapash master docs static identity card png width 50 title identity https github com maif shapash blob master tutorial generate webapp tuto webapp01 additional data ipynb 2 2 x picking samples br article https www kdnuggets com 2022 11 picking examples understand machine learning model html new tab in the webapp for picking samples the graph represents the true values vs predicted values img src https raw githubusercontent com maif shapash master docs static picking png width 50 title picking https github com maif shapash blob master tutorial plots and charts tuto plot06 prediction plot ipynb 2 2 x dataset filter br new tab in the webapp to filter data and several improvements in the webapp subtitles labels screen adjustments img src https raw githubusercontent com maif shapash master docs static webapp png width 50 title webapp https github com maif shapash blob master tutorial tutorial01 shapash overview launch webapp ipynb 2 0 x refactoring shapash br refactoring attributes of compile methods and init refactoring implementation for new backends img src https raw githubusercontent com maif shapash master docs static modular png width 50 title modular https github com maif shapash blob master tutorial explainer and backend tuto expl06 shapash custom backend ipynb 1 7 x variabilize colors br giving possibility to have your own colour palette for outputs adapted to your design img src https raw githubusercontent com maif shapash master docs static variabilize colors png width 50 title variabilize colors https github com maif shapash 
blob master tutorial common tuto common02 colors ipynb 1 6 x explainability quality metrics br article https towardsdatascience com building confidence on explainability methods 66b9ee575514 to help increase confidence in explainability methods you can evaluate the relevance of your explainability using 3 metrics stability consistency and compacity img src https raw githubusercontent com maif shapash master docs static quality metrics png width 50 title quality metrics https github com maif shapash blob master tutorial explainability quality tuto quality01 builing confidence explainability ipynb 1 5 x acv backend br a new way of estimating shapley values using acv more info about acv here https towardsdatascience com the right way to compute your shapley values cfea30509254 img src https raw githubusercontent com maif shapash master docs static wheel png width 50 title wheel acv backend tutorial explainer and backend tuto expl03 shapash acv backend ipynb 1 4 x groups of features br demo https shapash demo2 ossbymaif fr you can now regroup features that share common properties together br this option can be useful if your model has a lot of features img src https raw githubusercontent com maif shapash master docs static groups features gif width 120 title groups features https github com maif shapash blob master tutorial common tuto common01 groups of features ipynb 1 3 x shapash report br demo https shapash readthedocs io en latest report html a standalone html report that constitutes a basis of an audit document img src https raw githubusercontent com maif shapash master docs static report icon png width 50 title shapash report https github com maif shapash blob master tutorial generate report tuto shapash report01 ipynb overview shapash is a python library which aims to make machine learning interpretable and understandable by everyone it provides several types of visualization that display explicit labels that everyone can understand data scientists can understand their models easily and share their results end users can understand the decision proposed by a model using a summary of the most influential criteria shapash also contributes to data science auditing by displaying usefull information about any model and data in a unique report readthedocs documentation badge https readthedocs org projects shapash badge version latest https shapash readthedocs io en latest presentation video for french speakers https www youtube com watch v r1r a9b9apk medium understand your model with shapash towards ai https pub towardsai net shapash making ml models understandable by everyone 8f96ad469eb3 model auditability towards ds https towardsdatascience com shapash 1 3 2 announcing new features for more auditable ai 64a6db71c919 group of features towards ai https pub towardsai net machine learning 6011d5d9a444 building confidence on explainability towards ds https towardsdatascience com building confidence on explainability methods 66b9ee575514 picking examples to understand machine learning model https www kdnuggets com 2022 11 picking examples understand machine learning model html enhancing webapp built in features for comprehensive machine learning model interpretation https pub towardsai net shapash 2 3 0 comprehensive model interpretation 40b50157c2fb p align center img src https raw githubusercontent com maif shapash master docs static shapash global gif width 800 p contributors div align left div style display flex align items flex start img align middle src https github com maif shapash blob 
master docs static logo maif png width 18 img align middle src https github com maif shapash blob master docs static logo quantmetry png width 18 img align middle src https github com maif shapash blob master docs static logo societe generale png width 18 img align middle src https github com maif shapash blob master docs static logo groupe vyv png width 18 img align middle src https github com maif shapash blob master docs static logo sixfoissept png width 18 div div awards a href https raw githubusercontent com maif shapash master docs static awards argus or png img align left src https raw githubusercontent com maif shapash master docs static awards argus or png width 180 a a href https www kdnuggets com 2021 04 shapash machine learning models understandable html img src https www kdnuggets com images tkb 2104 g png raw true width 65 a features display clear and understandable results plots and outputs use explicit labels for each feature and its values p align center img align left src https github com maif shapash blob master docs static shapash grid images 02 png raw true width 28 img src https github com maif shapash blob master docs static shapash grid images 06 png raw true width 28 img align right src https github com maif shapash blob master docs static shapash grid images 04 png raw true width 28 p p align center img align left src https github com maif shapash blob master docs static shapash grid images 01 png raw true width 28 img src https github com maif shapash blob master docs static shapash resize png raw true width 18 img align right src https github com maif shapash blob master docs static shapash grid images 13 png raw true width 28 p p align center img align left src https github com maif shapash blob master docs static shapash grid images 12 png raw true width 33 img src https github com maif shapash blob master docs static shapash grid images 03 png raw true width 28 img align right src https github com maif shapash blob master docs static shapash grid images 10 png raw true width 25 p allow data scientists to quickly understand their models by using a webapp to easily navigate between global and local explainability and understand how the different features contribute live demo shapash monitor https shapash demo ossbymaif fr summarize and export the local explanation shapash proposes a short and clear local explanation it allows each user whatever their data background to understand a local prediction of a supervised model thanks to a summarized and explicit explanation evaluate the quality of your explainability using different metrics easily share and discuss results with non data users select subsets for further analysis of explainability by filtering on explanatory and additional features correct or wrong predictions picking examples to understand machine learning model https www kdnuggets com 2022 11 picking examples understand machine learning model html deploy interpretability part of your project from model training to deployment api or batch mode contribute to the auditability of your model by generating a standalone html report of your projects report example https shapash readthedocs io en latest report html we hope that this report will bring a valuable support to auditing models and data related to a better ai governance data scientists can now deliver to anyone who is interested in their project a document that freezes different aspects of their work as a basis of an audit report this document can be easily shared across teams internal audit dpo risk 
compliance p align center img src https raw githubusercontent com maif shapash master docs static shapash report demo gif width 800 p how shapash works shapash is an overlay package for libraries dedicated to the interpretability of models it uses shap or lime backend to compute contributions shapash builds on the different steps necessary to build a machine learning model to make the results understandable p align center img src https raw githubusercontent com maif shapash master docs static shapash diagram png width 700 title diagram p shapash works for regression binary classification or multiclass problem br it is compatible with many models catboost xgboost lightgbm sklearn ensemble linear models svm br shapash can use category encoders object sklearn columntransformer or simply features dictionary br category encoder onehotencoder ordinalencoder basenencoder binaryencoder targetencoder sklearn columntransformer onehotencoder ordinalencoder standardscaler quantiletransformer powertransformer installation shapash is intended to work with python versions 3 8 to 3 10 installation can be done with pip pip install shapash in order to generate the shapash report some extra requirements are needed you can install these using the following command pip install shapash report if you encounter compatibility issues you may check the corresponding section in the shapash documentation here https shapash readthedocs io en latest installation instructions index html quickstart the 4 steps to display results step 1 declare smartexplainer object there 1 mandatory parameter in compile method model you can declare features dict here to specify the labels to display from shapash import smartexplainer xpl smartexplainer model regressor features dict house dict optional parameter preprocessing encoder optional compile step can use inverse transform method postprocessing postprocess optional see tutorial postprocessing step 2 compile dataset there 1 mandatory parameter in compile method dataset xpl compile x xtest y pred y pred optional for your own prediction by default model predict y target ytest optional allows to display true values vs predicted values additional data x additional optional additional dataset of features for webapp additional features dict features dict additional optional dict additional data step 3 display output there are several outputs and plots available for example you can launch the web app app xpl run app live demo shapash monitor https shapash demo ossbymaif fr step 4 generate the shapash report this step allows to generate a standalone html report of your project using the different splits of your dataset and also the metrics you used xpl generate report output file path to output report html project info file path to project info yml x train xtrain y train ytrain y test ytest title story house prices report title description this document is a data science report of the kaggle house prices tutorial project it was generated using the shapash library metrics name mse path sklearn metrics mean squared error report example https shapash readthedocs io en latest report html step 5 from training to deployment smartpredictor object shapash provides a smartpredictor object to deploy the summary of local explanation for the operational needs it is an object dedicated to deployment lighter than smartexplainer with additional consistency checks smartpredictor can be used with an api or in batch mode it provides predictions detailed or summarized local explainability using appropriate 
wording predictor xpl to smartpredictor see the tutorial part to know how to use the smartpredictor object tutorials this github repository offers many tutorials to allow you to easily get started with shapash details summary b overview b summary launch the webapp with a concrete use case tutorial tutorial01 shapash overview launch webapp ipynb jupyter overviews the main outputs and methods available with the smartexplainer object tutorial tutorial02 shapash overview in jupyter ipynb shapash in production from model training to deployment api or batch mode tutorial tutorial03 shapash overview model in production ipynb use groups of features tutorial common tuto common01 groups of features ipynb deploy local explainability in production with smartpredictor tutorial predictor to production tuto smartpredictor introduction to smartpredictor ipynb details details summary b charts and plots b summary shapash features importance tutorial plots and charts tuto plot03 features importance ipynb contribution plot to understand how one feature affects a prediction tutorial plots and charts tuto plot02 contribution plot ipynb summarize display and export local contribution using filter and local plot method tutorial plots and charts tuto plot01 local plot and to pandas ipynb contributions comparing plot to understand why predictions on several individuals are different tutorial plots and charts tuto plot04 compare plot ipynb visualize interactions between couple of variables tutorial plots and charts tuto plot05 interactions plot ipynb display true values vs predicted values tutorial plots and charts tuto plot06 prediction plot ipynb customize colors in webapp plots and report tutorial common tuto common02 colors ipynb details details summary b different ways to use encoders and dictionaries b summary use category encoder inverse transformation tutorial use encoders tuto encoder01 using category encoder ipynb use columntransformers tutorial use encoders tuto encoder02 using columntransformer ipynb use simple python dictionnaries tutorial use encoders tuto encoder03 using dict ipynb details details summary b displaying data with postprocessing b summary using postprocessing parameter in compile method tutorial postprocess tuto postprocess01 ipynb details details summary b using different backends b summary compute shapley contributions using shap tutorial explainer and backend tuto expl01 shapash viz using shap contributions ipynb use lime to compute local explanation summarize it with shapash tutorial explainer and backend tuto expl02 shapash viz using lime contributions ipynb use acv backend to compute active shapley values and sdp global importance tutorial explainer and backend tuto expl03 shapash acv backend ipynb compile faster lime and consistency of contributions tutorial explainer and backend tuto expl04 shapash compute lime faster ipynb use fasttreeshap or add contributions from another backend tutorial explainer and backend tuto expl05 shapash using fasttreeshap ipynb use class shapash backend tutorial explainer and backend tuto expl06 shapash custom backend ipynb details details summary b evaluating the quality of your explainability b summary building confidence on explainability methods using stability consistency and compacity metrics tutorial explainability quality tuto quality01 builing confidence explainability ipynb details details summary b generate a report of your project b summary generate a standalone html report of your project with generate report tutorial generate report tuto 
shapash report01 ipynb details details summary b analysing your model via shapash webapp b summary add features outside of the model for more exploration options tutorial generate webapp tuto webapp01 additional data ipynb details
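the quickstart steps listed above are easier to read as runnable code the sketch below follows the parameter names given in this readme (model, features_dict, x, y_pred, y_target, run_app, to_smartpredictor) and assumes shapash and scikit-learn are installed; the toy dataset and the features dictionary are illustrative, not part of the readme

# Runnable restatement of the quickstart above, assuming shapash and
# scikit-learn are installed. The toy dataset and features_dict are
# illustrative; parameter names follow the readme.
import pandas as pd
from sklearn.datasets import fetch_california_housing
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from shapash import SmartExplainer

data = fetch_california_housing(as_frame=True)
X, y = data.data, data.target
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

regressor = RandomForestRegressor(n_estimators=50).fit(X_train, y_train)

# Step 1: declare the SmartExplainer (features_dict gives readable labels).
house_dict = {col: col.replace("_", " ") for col in X.columns}
xpl = SmartExplainer(model=regressor, features_dict=house_dict)

# Step 2: compile on the test set (y_pred is optional; y_target enables the
# true-vs-predicted views mentioned above).
y_pred = pd.Series(regressor.predict(X_test), index=X_test.index)
xpl.compile(x=X_test, y_pred=y_pred, y_target=y_test)

# Step 3: launch the web app (plots such as xpl.plot.features_importance()
# are also available directly).
app = xpl.run_app()

# Step 5: export a lighter SmartPredictor object for deployment.
predictor = xpl.to_smartpredictor()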
python machine-learning explainability explainable-ml transparency ethical-artificial-intelligence shap lime interpretability
ai
espDrone-IDF
why esp8266drone rtos sdk 1 run multiple tasks independently 2 faster 3 learn more about socs technology the components 1 the pid component from https github com ussserrr pid controller server git 2 the mpu6050 component from http www i2cdevlib com 3 the tcp ip adapter component from the esp8266 sdk included by default in the project makefile we are going to set up an http server or use mqtt modbus still looking for the best solution environment setup 1 follow the https docs espressif com projects esp8266 rtos sdk en latest get started programming guide 2 clone this repository into the esp directory 3 on the terminal esp path type make menuconfig you can also set the toolchain path here but it is recommended to do this in the bashrc profile 4 on the same terminal type make flash monitor
os
TTK4155-Byggern
ping pong game this is the term project for embedded and industrial computer systems design at ntnu fall 2019 the goal has been to make a ping pong game that shoots a ping pong ball with a solenoid and controls a servo and motor so the solenoid can be moved main breadboard breadboard bird documentation images breadboard bird png connections breadboard fritzing documentation breadboard bb png schematic breadboard schematic documentation breadboard schem png solenoid and ir diode breadboard solenoid documentation solenoid bb png schematic breadboard solenoid documentation solenoid schem png
os
Azure-Data-Engineering-Project
azure data engineering project create a real time azure cloud data engineering project using the olympics dataset architecture diagram azure architecture diagram https github com mehmaam99 azure data engineering project assets 45142408 b1d95d82 7cd1 42f4 97c6 4a39100b9bb4 data source download the dataset from kaggle and store it on github access the dataset using azure data factory dataset link olympics dataset https www kaggle com datasets arjunprasadsarkhel 2021 olympics in tokyo azure data factory create a pipeline using azure data factory adf to extract data via http and store it in azure data lake storage gen 2 azure databricks retrieve data from azure data lake storage gen 2 perform necessary transformations using azure databricks and then store the transformed data back in azure data lake storage gen 2 azure synapse analytics utilize azure synapse analytics for data warehousing execute sql commands to uncover insights within the data establish a linked service to connect to power bi for data visualization important notes encountering issues here s where to find solutions data source and data factory components are functioning smoothly integrating with databricks is presenting some challenges if you encounter an error while creating an account for the synapse analytics service you can refer to this link for assistance azure synapse analytics account creation error https learn microsoft com en us answers questions 739391 permission on subscription when creating azure syn connecting with power bi requires a workspace name to obtain this you need an organizational account and a power bi pro version
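the databricks step above (read the raw olympics files from azure data lake storage gen 2, transform, write the result back) can be sketched in pyspark the storage account, container and file names below are placeholder assumptions, as is the country column, and the authentication setup a real workspace needs (service principal or mount) is omitted for brevity

# Illustrative PySpark sketch of the Databricks transformation step.
# "tokyoolympicsa" (storage account), the container names, athletes.csv and
# the "country" column are placeholders; auth setup is omitted. The `spark`
# session is provided by the Databricks notebook environment.
from pyspark.sql import functions as F

raw_path = "abfss://raw-data@tokyoolympicsa.dfs.core.windows.net/athletes.csv"
out_path = "abfss://transformed-data@tokyoolympicsa.dfs.core.windows.net/athletes"

athletes = (spark.read
            .option("header", True)
            .option("inferSchema", True)
            .csv(raw_path))

# Example transformation: tidy column names and count athletes per country.
athletes = athletes.toDF(*[c.strip().lower().replace(" ", "_") for c in athletes.columns])
per_country = (athletes
               .groupBy("country")
               .agg(F.count("*").alias("athlete_count"))
               .orderBy(F.desc("athlete_count")))

# Write the transformed data back to ADLS Gen2 for Synapse to query.
per_country.write.mode("overwrite").parquet(out_path)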
cloud
riscv-fw-infrastructure
comment n solid https content riscv org wp content uploads 2018 09 unnamed png https nodesource com products nsolid n solid http riscv net wp content uploads 2015 01 riscv logo retina png wd risc v firmware package this repository is wd risc v firmware package holds wd firmware rtos al comrv demos for swerv cores from web download gcc 11 toolchain for risc v llvm clang 12 0 0 toolchain for risc v along with custom gcc binutils comrv eclipse mcu getting the sources this repository uses submodules you need the recursive option to fetch the submodules automatically git clone recursive https github com chipsalliance riscv fw infrastructure git alternatively git clone https github com chipsalliance riscv fw infrastructure git cd riscv fw infrastructure git submodule update init recursive toolchain binary to get the built toolchains please use the following links latest gnu debian download link https wdc box com shared static yah1vncrsv7iebcsq2a80wv2wfpyn0s7 gz llvm debian download link https wdc box com shared static f9mn68lddc91ssw9vumj3hp9pdko3zd7 gz folders for additional releases gnu gnu release folders https wdc box com s lfam8pwhghwshkmjf1yc542ghzfbyn7y llvm llvm release folders https wdc box com s v562eei6d01bhzqcc4si76z1vq0w4ibu nbsp using gcc toolchain from the repo root folder unzip riscv gnu toolchain debian tar gz to the wd firmware demo build toolchain directory tar xvf riscv gnu toolchain debian tar gz c wd firmware demo build toolchain using llvm toolchain from the repo root folder unzip riscv llvm toolchain debian tar gz to the wd firmware demo build toolchain directory tar xvf riscv llvm toolchain debian tar gz c wd firmware demo build toolchain notes comrv demo will work only with the llvm toolchain gcc is not supported toolchains are build for debian and centos7 nbsp code convention see the coding convention coding convention asciidoc wd firmware the constitutes an sdk fw it contains firmware applications and processor support package psp for various cores alongside demos that support all features the following readme file describes how to setup a build environment for the wd risc v firmware it guides how to build the program downloading it and debugging it on the supported platforms and cores the fw infra was verified with vmware player v 15 hosting debian 9 6 current fw support rtos al abstraction layer al for small footprint embedded real time operation systems rtos the target is to provide a homogeneous api for the developer usage so the kernel can be replaced for several different rtoss platforms boards and new firmware features comrv risc v overlay repo https github com riscv riscv overlay cacheable overlay manager risc v comrv a software technique to load code in real time at the moment it is needed for execution comrv is a full sw solution suite and it does not require any hw support swerv cores psp and demos this repo includes the processor support package suite it is a full sw package to support swerv cores this repo also provides demos that run the psp suite and demonstrate the core features current platform and core support hifive1 swervolf running on nexys a7 fpga eh1 eh2 and el2 with full soc link to source https github com chipsalliance cores swervolf whisper iss tool running eh1 eh2 el2 link to source https github com chipsalliance swerv iss source tree structure javascript wd firmware board supported boards hifive 1 nexys a7 eh1 support for swerv eh1 running on swervolf nexys a7 eh2 support for swerv eh2 nexys a7 el2 support for swerv el2 running on swervolf 
whisper swerv iss support for swerv eh1 eh2 el2 common common source comrv comrv cacheable overlay mangager for risc v core source demo demos source build example build scripts toolchain container for the unzip toolchains demo rtosal c abstruction layer al demo on freertos main c the main of all demos psp psp functionality rtos rtosal rtos abstraction layer rtos core specific rtos source code freertos additional downloads eclipse mcu download eclipse mcu with risc v support from two options option 1 eclipse offical https projects eclipse org projects iot embed cdt downloads option 2 wd offical eclipse tested and supported https wdc box com shared static 4bpcd5cn07lzicyadjkd0lcimtdl1d8t gz nbsp from the repo unzip eclipse mcu 2019 12 tar gz to your designated directory for the eclipse mcu tar xvf eclipse mcu 2019 12 tar gz c eclipse mcu root other download standard packages are required the following command can install those sudo apt get install scons libftdi1 2 libmpfr4 note if libmpfr4 can not be installed in cases of newer versions 6 on the host machine you can create a symbolic link to libmpfr so 6 sudo ln s usr lib x86 64 linux gnu libmpfr so 6 usr lib x86 64 linux gnu libmpfr so 4 download and install java se runtime environment for risc v openocd you will need the following depended libs libusb 0 1 libusb 1 0 0 dev libusb dev sudo apt get install libusb 0 1 libusb 1 0 0 dev libusb dev shortcut bootstraping if you want to download all tools from wd box llvm gnu eclipse in one shoot use python bootstrap py building for source preparations launch eclipse mcu eclipse mcu root eclipse eclipse import wd firmware code from eclipse mcu menu bar select file import in the import window select general existing projects into workspace next in the next import window select root directory browse and choose the infra riscv fw wd firmware you ve downloaded in getting the firmware sources section press finish button nbsp build compile and link you will need to choose a specific demo for building a full solution from the eclipse terminal or console cd wd firmware root wd firmware demo build config sh then you will be asked to choose a demo for more explanation on adding new demos please read the readme file on demos note to run the script you will need python we support python 2 7 only from eclipse mcu menu bar select project build all note that you can select which platform to build for platforms downloading debugging we provide several platforms to work with please follow the instructions for the one you preferred setting up hifive1 ftdi over usb taken from sifive freedom studio manual v1p6 connect to your hifive1 debug interface and type lsusb to see if ft2232c is connected lsusb bus device id 0403 6010 future technology devices international ltd ft2232c dual usb uart fifo ic set the udev rules to allow the device to be accessed by the plugdev group sudo cp wd firmware root wd firmware board hifive1 99 openocd rules etc udev rules d add and verify the current user to plugdev group sudo usermod a g plugdev user groups plugdev power off on debian station nbsp setting up nexys a7 for swerv swervolf eh1 and el2 swervolf are fpga files create by olof kindgren under chips alliance if you wish to know more please use this link cores swervolf https github com chipsalliance cores swervolf for the fpga bit file loading do the following steps fpga image file loading using usd device copy the fpga bit file to a usd device from the following path 1 for swerv eh1 wd firmware board nexys a7 eh1 eh1 reference 
design bit 2 for swerv eh2 wd firmware board nexys a7 eh2 swervolf eh2 bit 3 for swerv el2 wd firmware board nexys a7 el2 swervolf el2 bit connect the usd to the nexys a7 board usd slot is on board s bottom set the following jumpers jp1 usb sd pins jp2 connect the 2 pins on sd side slide switch sw0 to off and all others to on prog mode png nexys a7 png at power on the fpga bit file is loaded to the fpga led busy should be orange while flushing is done wait for orange led to be off once off the board is ready to be used fpga image file loading using eclipse mcu from eclipse ide menu bar open the external tools configurations run external tools external tools configurations under the program list select 1 for swerv eh1 nexys a7 eh1 flush and press the run button 2 for swerv eh2 nexys a7 eh2 flush and press the run button 3 for swerv el2 nexys a7 el2 flush and press the run button the eclipse ide console will display shutdown command invoked upon completion nbsp setting up iss works as a simulator for swerv eh1 eh2 and el2 there is nothing to set for swerv iss just select debugger luncher following next nbsp platforms debug on eclipse mcu debug select from the eclipse mcu menu bar run debug configurations choose the platform you wish to run on from left main windows menu current support javascript hifive1 hifive eval board nexys a7 eh1 swerolf nexys a7 digilent fpga board running swerv eh1 with full system on chip support from chipsalliance cores swervolf nexys a7 el2 swerolf nexys a7 digilent fpga board running swerv el2 with full system on chip support from chipsalliance cores swervolf nexys a7 eh2 swerolf nexys a7 digilent fpga board running swerv eh2 with full system on chip support whisper eh1 connect and debug iss simulator for swerv eh1 whisper eh2 connect and debug iss simulator for swerv eh2 whisper el2 connect and debug iss simulator for swerv el2 whisper eh2 connect and debug multi hart same as whisper eh2 connect and debug but with 2 hw threads harts running simultaneously adding new source modules the folder wd firmware demo build contains a template file sconscript template which can be used nbsp supporting gcc releases riscv gcc 10 2 0 riscv official 10 2 0 gcc release official gdb 9 2 0 check gcc hash txt for the precise commits of toolchain build supporting llvm releases riscv llvm clang 12 0 0 initial llvm clang official 12 0 0 release comrv support modules check llvm hash txt for the precise commits of toolchain build notes and status this repo is always under work the following are notes and status for items that is still missing or under work 05 oct 2020 all eh2 demos are working only on whisper no fpga currently software interrupts are not supported in el2 fpga until it will be supported you can use whisper for this demo bitmanip is supported only on llvm 28 feb 2021 all eh2 demos are working including fpga support supporting smp debugging for eh2 support comrv data overlay el2 includes dccm added getchar demo july 2021 adding asciidoc for psp psp reference manual adoc https github com chipsalliance riscv fw infrastructure tree master wd firmware psp docs psp reference manual adoc move to gcc 11
os
ml-course-msu
img src http www machinelearning ru wiki images 2 28 ml surfaces png width 280 https docs google com spreadsheets d 1t0agj2eiwick0n9dyk3c fmuvshzxxpng7zvi28oqx8 edit gid 2044373835 ml cmc msu gmail com https docs google com forms d 1j8zmremtl bceavisxx v42 y8gaveoloffuahqjhbc viewform wiki http wiki cs hse ru 1 https github com esokolov ml course hse 1 2 80 3 1 3 4 7 1 ul li li li li ul ml16 lecture notes sem01 intro pdf 14 2 ul li li li li ul 1 https github com esokolov ml course hse blob master 2017 fall lecture notes lecture02 linregr pdf br 2 https github com esokolov ml course hse blob master 2017 fall seminars sem02 linregr part1 pdf ml17 homeworks theory homework theory 1 differentiation pdf 28 3 ul li li li li li li ml16 lecture notes sem02 knn pdf 5 4 ul li locality sensitive hashing li ml16 lecture notes sem03 knn pdf ml16 homeworks sem03 knn hw pdf 12 5 ul li li li li ul 1 https github com esokolov ml course hse blob master 2017 fall lecture notes lecture02 linregr pdf 4 br 2 https github com esokolov ml course hse blob master 2017 fall lecture notes lecture03 linregr pdf 5 https github com esokolov ml course hse blob master 2017 fall homeworks theory homework theory 01 linregr pdf 5 19 6 ul li li li li ul https github com esokolov ml course hse blob master 2017 fall lecture notes lecture04 linclass pdf https github com esokolov ml course hse blob master 2017 fall seminars sem04 linclass metrics pdf https github com esokolov ml course hse blob master 2017 fall homeworks theory homework theory 03 linclass metrics pdf 26 7 ul li li li li ul 1 https github com esokolov ml course hse blob master 2017 fall lecture notes lecture05 linclass pdf br 2 https www dropbox com s 6wpcznye5otzeu7 sem05 pdf dl 0 https github com esokolov ml course hse blob master 2017 fall homeworks theory homework theory 04 linclass pdf svm 2 9 1 https github com esokolov ml course msu blob master ml16 lecture notes sem10 linear pdf 2 https github com esokolov ml course msu blob master ml16 lecture notes sem11 linear pdf 1 https github com esokolov ml course msu blob master ml16 homeworks sem10 linear hw pdf 2 https github com esokolov ml course msu blob master ml16 homeworks sem11 linear hw pdf 9 10 16 11 https github com esokolov ml course hse blob master 2017 fall lecture notes lecture07 trees pdf https github com esokolov ml course hse blob master 2017 fall homeworks theory homework theory 05 knn trees pdf knn 30 12 bias variance decomposition https github com esokolov ml course hse blob master 2017 fall seminars sem08 bvd pdf https github com esokolov ml course hse blob master 2017 fall homeworks theory homework theory 06 bvd pdf 14 13 ul li li li li ul 1 https github com esokolov ml course hse blob master 2017 fall lecture notes lecture08 ensembles pdf br 2 https github com esokolov ml course hse blob master 2017 fall seminars sem09 gbm part1 pdf 8 14 1 https github com esokolov ml course hse blob master 2017 fall lecture notes lecture11 dl pdf br 2 https github com esokolov ml course hse blob master 2017 fall seminars sem10 nn part1 pdf 15 15 1 https github com esokolov ml course hse blob master 2017 spring lecture notes lecture14 kernels pdf br 2 https github com esokolov ml course hse blob master 2017 spring lecture notes lecture16 kernels pdf br https github com esokolov ml course hse blob master 2017 spring seminars sem12 kernels pdf https github com esokolov ml course hse blob master 2017 spring homeworks theory homework theory 09 kernels pdf 1 5 22 16 1 https github com esokolov ml course hse blob 
master 2017 spring seminars sem15 flda pdf br https github com esokolov ml course hse blob master 2017 spring seminars sem14 kernels pdf 1 17 https github com esokolov ml course msu blob master ml15 lecture notes sem13 bayes pdf 15 18 em https github com esokolov ml course hse blob master 2017 spring lecture notes lecture17 em pdf https github com esokolov ml course hse blob master 2017 spring seminars sem16 em pdf https github com esokolov ml course hse blob master 2017 spring homeworks theory homework theory 10 unsupervised pdf 1 4 22 19 https github com esokolov ml course hse blob master 2016 spring seminars sem21 gp pdf 29 20 ul li li li li ul https github com esokolov ml course msu blob master ml17 contests contest 1 rudnev solotky pdf 5 21 https github com esokolov ml course msu blob master ml17 lecture notes sem20 glm pdf https github com esokolov ml course msu blob master ml17 homeworks theory homework theory 20 glm pdf 12 22 1 https www kaggle com c jigsaw toxic comment classification challenge 2 https www kaggle com c cryptocurrency trading cmc msu cs hse 2018 leaderboard 1 1 ml17 homeworks practice homework practice 01 homework practice 01 ipynb 18 09 2017 01 10 2017 23 59 msk 08 10 2017 23 59 msk 2 ml17 homeworks practice homework practice 02 ipynb 04 11 2017 19 11 2017 23 59 msk 26 11 2017 23 59 msk 3 ml17 homeworks practice homework practice 03 homework practice 03 ipynb 07 12 2017 21 12 2017 23 59 msk 31 12 2017 23 59 msk 4 ml17 homeworks practice homework practice 04 homework practice 04 ipynb 10 03 2018 25 03 2018 23 59 msk 01 04 2018 23 59 msk 5 ml17 homeworks practice homework practice 05 homework practice 05 ipynb 07 05 2018 20 05 2018 23 59 msk 30 05 2018 23 59 msk 6 ml17 homeworks practice homework practice 06 ipynb 30 05 2018 23 59 msk
ai
TEAM-FALCON-Frontend
team falcon frontend kindly ensure you work from your develop branch and create all pull requests against the develop branch this is to ensure you do not experience conflicts or issues when creating pull requests any pull request made to the master branch will be closed tips use git branch to see the branch you are currently working on if for example you want to work on the about page in design two create an about html file inside the design two folder if it s the header and footer name your file header footer html building and running on localhost first install dependencies sh yarn install to create a development build sh yarn dev to create a production build sh yarn build deploying the dist folder is the webroot directory
front_end
Blog-App-Backend
blog app online blog application using mern stack
mongodb mongoose nodejs expressjs
server
vimrc
slava s vim setup this is my setup i ve been using for the last 6 month or so this configuration wasn t meant to be used on remote hosts where you edit your configs over ssh because nobody does it likely this config can be used for day to day development if you are brave enough to use someone s config on your vim i am not however it can act as a good learning material to someone features syntax highlighting for common things in webdev javascript https github com pangloss vim javascript less https github com groenewege vim less json https github com elzr vim json typescript https github com leafgarland typescript vim coffee https github com kchmck vim coffee script markdown https github com tpope vim markdown unite vim https github com shougo unite vim with awesome fuzzy search features c n to open a file in the current directory leader c n to open a file in some subdirectory recursively using an external program for speed uses find by default install ag for speed will be used if available c p to open a buffer leader d to change the current directory vimshell https github com shougo vimshell vim to open a shell written in vimscript it is not the best shell ever but is good enough to run tests quickly you will probably need to change the default vim command from mvim to what you want gvim or vim leader p to paste a segment from yank history leader j to open a list of other menu items editable in vimrc gundo vim https github com sjl gundo vim to jump between file s edit versions surround vim https github com tpope vim surround to surround text with tags brackets parentheses or quotes fugitive vim https github com tpope vim fugitive git wrapper tern js http ternjs net plugin with meteor support https github com slava tern meteor like intellisense for javascript weird visual things you might want to change vim airline https github com bling vim airline status bar needs a customized font for macvim xxx add this to install script tomorrow night theme https github com slava vim tomorrow js specifically a fork extended for js vim signature https github com kshenoy vim signature to visually see marks new splits are added on the bottom not on the top new visual splits appear on the right not on the left weird bindings you might want to change leader is mapped to comma c hjkl mapped to movement between splits leader to kill current search leader s to enable disable the spellchecker leader l to highlight non printing characters leader m and leader n to switch between tabs leader w saves the file maps to w enter leader j to open a list of menu items in the normal mode enters the command mode just like the original action of repeat the last t is not preserved quick jk in insert mode is mapped to esc to avoid pressing esc what sucks i will be honest the following things suck and i didn t fix them yet formatting is broken for a lot of things plugins are written in javascript python and vimscript 3 different languages everything is c n and c p centric because i mapped them to my thumb on kinesis doesn t mean it is great for everyone feature requests fixed identation for js handlebars and css no dependency on fonts install script should take into consideration the existing vimrc and vim folders not to override someone s setup accidentally dependencies latest macvim works well as of 7 4 patch around august 2013 vim built with python for ternjs and gundo vim built with lua for neocomplete vim npm and node js for ternjs c compiler for unite vim make for vimproc git to fetch dependencies this sucks i know ag 
optional for the speedy subdirectory search see the repo https github com ggreer the silver searcher installation first of all be sure to have a compatible version of vim the easiest way to install a full featured vim on mac os x with homebrew is to run brew install macvim v override system vim with lua with luajit in your terminal instructions for other platforms satisfying the dependencies dependencies are welcome after you can run the bash script from this repo which will take care of everything but will do something terrible if you already have any of vim or vimrc but it works great on a clean set up curl https raw github com slava vimrc master install script sh install script sh bash install script sh manual installation download vimrc file curl https raw github com slava vimrc master vimrc vimrc install neobundle mkdir p vim bundle git clone https github com shougo neobundle vim vim bundle neobundle vim open vim install all the packages quit vim vim c neobundleinstall c q finish installation by installing tern meteor cd vim bundle tern for vim npm install curl https raw github com slava tern meteor master meteor js node modules tern plugin meteor js
front_end
zephyr-os-bluepill-playground
getting started with zephyr rtos on bluepill board project initialization create a new project using platformio with the following settings board bluepill f103c8 generic framework zephyr rtos upload using st link there are several ways to upload the firmware to this board but the default way is to use the st link debugger https stm32 base org guides connecting your debugger html in linux you can use the following commands to install the st link drivers console sudo apt y install stlink tools sudo systemctl restart udev modify platformio ini file to set stlink as default upload method upload protocol stlink now you can upload the firmware to the board using plaformio https platformio org you can also use st flash tool for programming using st link in linux console st flash write firmware bin 0x8000000 using uart and embedded bootloader all the stm32 microcontrollers come with built in bootloaders that burned in during production memory mapping https github com m3y54m zephyr os bluepill playground assets 1549028 9bf5569e b50e 4b1a a864 6f612b9ceae4 a couple of special mcu pins has to be set up to proper logical values to enter the bootloader the pins are named boot0 and boot1 on the stm32 microcontroller boot pins can select several modes of bootloader operation boot1 boot0 boot mode aliasing x 0 main flash memory main flash memory is selected as boot space 0 1 system memory system memory is selected as boot space 1 1 embedded sram embedded sram is selected as boot space boot mode https github com m3y54m zephyr os bluepill playground assets 1549028 61dce9f7 215b 49e4 8745 91ad020d334f as you can see there are three cases the first case is when the boot0 pin is tied to the ground and boot1 is open or at a logical state of 0 after the reset program is executed from main flash memory grounded boot pins are a standard configuration when executing programs in the field the second case boot1 0 boot0 1 means that after reset execution starts at system memory were built into bootloader resides this is the case when we need to upload binaries via usart1 the third case means that program execution is performed in sram read this article https scienceprog com flashing programs to stm32 embedded bootloader to understand how you can use uart1 for programming this board supported features in zephyr the stm32 min dev board configuration supports the following hardware features interface controller driver component nvic on chip nested vectored interrupt controller systick on chip system clock uart on chip serial port gpio on chip gpio i2c on chip i2c pwm on chip pwm spi on chip spi usb on chip usb device adc on chip adc other hardware features are not supported by the zephyr kernel connections and ios default zephyr peripheral mapping uart 1 tx rx pa9 pa10 uart 2 tx rx pa2 pa3 uart 3 tx rx pb10 pb11 i2c 1 scl sda pb6 pb7 i2c 2 scl sda pb10 pb11 pwm 1 ch1 pa8 spi 1 nss oe sck miso mosi pa4 pa5 pa6 pa7 spi 2 nss oe sck miso mosi pb12 pb13 pb14 pb15 usb dc dm dp pa11 pa12 adc 1 pa0 system clock the on board 8mhz crystal is used to produce a 72mhz system clock with pll serial port stm32 minimum development board has 3 u s arts the zephyr console output is assigned to uart 1 default settings are 115200 8n1 on board leds the board has one on board led that is connected to pc13 pinout bluepill pinout https github com m3y54m zephyr os bluepill playground assets 1549028 9ed22ff4 b452 4d14 81eb e22531671370 schematics bluepill schematic https github com m3y54m zephyr os bluepill playground assets 1549028 5cc9dd1b 9f18 
4a7a 9086 c50335ad85d4 usb to serial cable pl2303hxd cable https github com m3y54m zephyr os bluepill playground assets 1549028 148db7e0 329e 42b2 8abb eb73640658fc usb to serial uart cable is used to connect the board to a pc blue pill usb to serial a9 tx1 rxd a10 rx1 txd g gnd access to serial port in linux in order to upload the compiled program to your board you should have access to serial ports this is done by adding your user to dialout and tty groups console sudo usermod a g dialout user sudo usermod a g tty user you can verify if your user is added to dialout and tty groups using this command console groups user note you should log out and log in or reboot your computer to apply the changes resources blue pill stm32f103c8t6 https stm32 base org boards stm32f103c8t6 blue pill html getting started with stm32f103c8t6 blue pill https www electronicshub org getting started with stm32f103c8t6 blue pill platformio bluepill f103cb https docs platformio org en latest boards ststm32 bluepill f103c8 html connecting st link debugger https stm32 base org guides connecting your debugger html how to fix platformio stm32 error libusb open failed with libusb error access https techoverflow net 2021 09 22 how to fix platformio stm32 error libusb open failed with libusb error access zephyr for bluepill documentation https docs zephyrproject org latest boards arm stm32 min dev doc index html zephyr blinky sample application https github com zephyrproject rtos zephyr blob main samples basic blinky src main c zephyr tutorial for beginners https github com maksimdrachov zephyr rtos tutorial iot rtos zephyr on cheap stm32 minimum development board https embedjournal com iot rtos zephyr stm32 minimum system development board bootloader for stm32f103 boards to use with the arduino stm32 repo and the arduino ide https github com rogerclarkmelbourne stm32duino bootloader blob master bootloader only binaries generic boot20 pc13 bin dfu bootloader for stm32 chips https github com devanlai dapboot an2606 application note stm32 microcontroller system memory boot mode https www st com content ccc resource technical document application note b9 9b 16 3a 12 1e 40 0c cd00167594 pdf files cd00167594 pdf jcr content translations en cd00167594 pdf accessing devices without sudo https elinux org accessing devices without sudo programming stm32f103 blue pill using usb bootloader and platformio https coytbarringer com programming stm32f103 blue pill using usb bootloader platformio
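once the board is flashed and your user is in the dialout group, a quick sanity check is to read the zephyr console (uart 1, 115200 8n1, wired through the usb to serial cable above) from a short script. the following is a minimal sketch, not part of the original project, using the third party pyserial package; the device name /dev/ttyUSB0 is an assumption, check dmesg after plugging in the cable.

```python
# minimal sketch: read the zephyr console over the usb-to-serial cable
# assumes pyserial is installed (pip install pyserial) and that the pl2303
# cable enumerates as /dev/ttyUSB0 (an assumption - adjust for your machine)
import serial

PORT = "/dev/ttyUSB0"   # hypothetical device name
BAUD = 115200           # matches the 115200 8n1 console settings noted above

with serial.Serial(PORT, BAUD, timeout=1) as console:
    for _ in range(20):                 # read a handful of lines, then stop
        line = console.readline()       # returns b"" on timeout
        if line:
            print(line.decode(errors="replace").rstrip())
```

you should see the boot banner and any printk output from the running sample, which confirms both the flashing step and the serial port permissions.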
stm32 embedded stm32f103c8t6 zephyr-rtos rtos bluepill-board platformio
os
Programming-In-Python-Course
programming in python course 1 week 1 goals recognize common applications of the python programming language explain foundational software engineering concepts use operators to program a simple output in python use control flow and loops to solve a problem 2 week 2 goals explain the core concepts that underpin the python programming language work with variables and different data types in python use control flow and loops to execute code under specific conditions work with functions and data structures in python recognize possible errors their causes and how to handle them create read and write data in files 3 week 3 goals use functions to explore algorithmic thinking use the logical concepts associated with procedural program flow identify and explain the paradigms of procedural programming instantiate and work with objects classes and methods in python explain the object oriented programming concepts that underpin python 4 week 4 goals find import and use popular python modules and packages leverage powerful tools to optimize the programming workflow explain the types of testing and their features use testing tools to write a test
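the week 4 goal of writing tests is the easiest one to show in a few lines. the snippet below is not course material, just a minimal standard library illustration of a function plus a unittest test case.

```python
# minimal illustration of the week 4 goal "use testing tools to write a test"
# not taken from the course; standard library only
import unittest


def add(a, b):
    """return the sum of two numbers"""
    return a + b


class TestAdd(unittest.TestCase):
    def test_add_integers(self):
        self.assertEqual(add(2, 3), 5)

    def test_add_floats(self):
        self.assertAlmostEqual(add(0.1, 0.2), 0.3, places=7)


if __name__ == "__main__":
    unittest.main()
```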
backend fundamentals programming python
server
aws-iot-analytics-workshop
aws iot analytics workshop in this workshop you will learn about the different components of aws iot analytics you will configure aws iot core to ingest stream data from the aws device simulator process batch data using amazon ecs build an analytics pipeline using aws iot analytics visualize the data using amazon quicksight and perform machine learning using jupyter notebooks join us and build a solution that helps you perform analytics on appliance energy usage in a smart building and forecast energy utilization to optimize consumption table of contents prerequisites prerequisites solution architecture overview solution architecture overview step 1a build the streaming data workflow step 1a build the streaming data workflow step 1b create stream analytics pipeline step 1b create stream analytics pipeline step 1c analyse the data step 1c analyse the data step 2a build the batch analytics workflow step 2a build the batch analytics workflow step 2b create batch analytics pipeline step 2b create batch analytics pipeline step 2c analyse stream and batch data step 2c analyse stream and batch data recap and review so far recap and review what have we done so far step 3 visualize your data using quicksight step 3 visualize your data using quicksight step 4 machine learning and forecasting with jupyter notebooks step 4 machine learning and forecasting with jupyter notebooks recap and review what did we learn in this workshop recap and review what did we learn in this workshop clean up resources in aws clean up resources in aws troubleshooting troubleshooting prerequisites to conduct the workshop you will need the following tools setup knowledge aws account laptop secure shell ssh to log into your docker instance ec2 mac os linux command line tools are installed by default windows putty ssh client http www putty org http www putty org manual connect ssh to an ec2 instance from windows with putty http docs aws amazon com awsec2 latest userguide putty html http docs aws amazon com awsec2 latest userguide putty html you need to have an ssh key pair an ssh key pair can be generated or imported in the aws console under ec2 key pairs download the pem file locally to log into the ec2 docker instance later in the workshop for mac unix change permissions chmod 400 paste your keypair filename you are in one of the following regions us east 1 n virginia us east 2 ohio us west 2 oregon eu west 1 ireland you do not have more than 3 vpcs already deployed in the active region in the workshop you will create 2 vpcs and the limit for vpcs per region is 5 you do not have more than 3 elastic ip addresses already deployed in the active region in the workshop you will create 2 elastic ip addresses for the device simulator and the limit of elastic ips per region is 5 solution architecture overview alt text https github com aws samples aws iot analytics workshop blob master images arch png architecture step 1a build the streaming data workflow launch aws iot device simulator with cloudformation the iot device simulator allows you to simulate real world devices by creating device types and data schemas via a web interface and allowing them to connect to the aws iot message broker by choosing one of the links below you will be automatically redirected to the cloudformation section of the aws console where your iot device simulator stack will be launched launch cloudformation stack in us east 1 https console aws amazon com cloudformation home region us east 1 stacks create review stackname iotdevicesimulator templateurl
https s3 amazonaws com solutions reference iot device simulator latest iot device simulator template n virginia launch cloudformation stack in us west 2 https console aws amazon com cloudformation home region us west 2 stacks create review stackname iotdevicesimulator templateurl https s3 amazonaws com solutions reference iot device simulator latest iot device simulator template oregon launch cloudformation stack in us east 2 https console aws amazon com cloudformation home region us east 2 stacks create review stackname iotdevicesimulator templateurl https s3 amazonaws com solutions reference iot device simulator latest iot device simulator template ohio launch cloudformation stack in eu west 1 https console aws amazon com cloudformation home region eu west 1 stacks create review stackname iotdevicesimulator templateurl https s3 amazonaws com solutions reference iot device simulator latest iot device simulator template ireland after you have been redirected to the aws cloudformation console take the following steps to launch your stack 1 parameters input administrator name email an id and password will be emailed to you for the iot device simulator 2 capabilities check i acknowledge that aws cloudformation might create iam resources at the bottom of the page 3 create stack 4 wait until the stack creation is complete the cloudformation creation may take between 10 25 mins to complete in the outputs section of your cloudformation stack you will find the management console url for the iot simulator please copy the url to use in the next section top top connect your smart home to aws iot core you will provision the smart home endpoint to publish telemetric data points to aws iot alt text https github com aws samples aws iot analytics workshop blob master images arch png architecture please login to the iot device simulator management console link copied from the earlier step with the provided credentials credentials for the device simulator will be mailed to the email address provided during cloudformation stack creation create the simulated device 1 navigate to modules device types click add device type device type name smart home data topic smarthome house1 energy appliances data transmission duration 7200000 data transmission interval 3000 message payload click add attribute and add the following attributes attribute name data type float precision integer minimum value integer maximum value sub metering 1 float 2 10 100 sub metering 2 float 2 10 100 sub metering 3 float 2 10 25 global active power float 2 1 8 global reactive power float 2 5 35 voltage float 2 10 250 timestamp utc timestamp choose default 3 once the sample message payload shows all the attributes above click save 4 navigate to modules widgets add widget select smart home as the device type number of widgets 1 submit we have now created a simulated smart home device which is collecting power usage data and publishing that data to aws iot core on the smarthome house1 energy appliances topic verify that the data is being published to aws iot note you will use the aws console for the remainder of the workshop sign in to the aws console https aws amazon com console we will verify that the smart home device is configured and publishing data to the correct topic 1 from the aws console choose the iot core service 2 navigate to test on the left pane 3 under subscription input the following subscription topic smarthome house1 energy appliances click subscribe to topic after a few seconds you should see your simulated devices s data 
that is published on the smarthome house1 energy appliances mqtt topic top top step 1b create stream analytics pipeline in this section we will create the iot analytics components analyze data and define different pipeline activities alt text https github com aws samples aws iot analytics workshop blob master images arch png architecture create your iot analytics s3 storage buckets first you will need to create 3 s3 buckets one for your iot analytics channel one for the data store that holds your transformed data and one for the data set that is resulted from an iot analytics sql query 1 navigate to the s3 management console 2 choose create bucket bucket name give your bucket a unique name must be globally unique and append it with channel for example my iot analytics channel region the region should be the same as where you launched the device simulator cloud formation template 3 click next and keep all options default click on create bucket to finish the creation 4 repeat steps 1 3 twice more to finish creating the required buckets use the appendices datastore and dataset to differentiate the buckets you will also need to give appropriate permissions to iot analytics to access your data store bucket 1 navigate to the s3 management console 2 click on your data store bucket ending with datastore 3 navigate to the permissions tab 4 click on bucket policy and enter the following json policy be sure to edit to include your s3 bucket name version 2012 10 17 id iotadatastorepolicy statement sid iotadatastorepolicyid effect allow principal service iotanalytics amazonaws com action s3 getbucketlocation s3 getobject s3 listbucket s3 listbucketmultipartuploads s3 listmultipartuploadparts s3 abortmultipartupload s3 putobject s3 deleteobject resource arn aws s3 your bucket name here arn aws s3 your bucket name here 5 click save create the iot analytics channel next we will create the iot analytics channel that will consume data from the iot core broker and store the data into your s3 bucket 1 navigate to the aws iot analytics console 2 in the left navigation pane navigate to channels 3 create a new channel id streamchannel choose the storage type customer managed s3 bucket and choose your channel s3 bucket created in the previous step iam role create new and give your new iam role a name this will give iot analytics the correct iam policies to access your s3 bucket 7 click next and input the following this step will create an iot rule that consumes data on the specified topic iot core topic filter smarthome house1 energy appliances iam role create new and give your new iam role a name this will give iot analytics the correct iam policies to access your aws iot core topic click on see messages to see the messages from your smartbuilding device arriving on the topic ensure your device is still running in device simulator if you do not see any messages 8 click create channel create the iot analytics data store for your pipeline 1 navigate to the aws iot analytics console 2 in the left navigation pane navigate to data stores 3 create a new data store id iotastore choose the storage type customer managed s3 bucket and choose your data store s3 bucket created in the previous step iam role create new and give your new iam role a name this will give iot analytics the correct iam policies to access your s3 bucket 4 click next and then create data store create the iot analytics pipeline 1 navigate to the aws iot analytics console 2 in the left navigation pane navigate to pipelines 3 create a new pipeline id 
streampipeline pipeline source streamchannel 6 click next 7 iot analytics will automatically parse the data coming from your channel and list the attributes from your simulated device by default all messages are selected 8 click next 9 under pipeline activities you can transform the data in your pipeline add or remove attributes 10 click add activity and choose calculate a message attribute as the type attribute name cost formula sub metering 1 sub metering 2 sub metering 3 1 5 13 test your formula by clicking update preview and the cost attribute will appear in the message payload below 14 add a second activity by clicking add activity and remove attributes from a message attribute name id and click next the id attribute is a unique identifier coming from the device simulator but adds noise to the data set 16 click update preview and the id attribute will disappear from the message payload 17 click next 18 pipeline output click edit and choose iotastore 19 click create pipeline your iot analytics pipeline is now set up top top step 1c analyse the data in this section you will learn how to use iot analytics to extract insights from your data set using sql over a specified time period alt text https github com aws samples aws iot analytics workshop blob master images arch png architecture create a data set 1 navigate to the aws iot analytics console 2 in the left navigation pane navigate to data sets 3 choose create a data set 4 select create sql id streamdataset select data store source iotastore this is the s3 bucket containing the transformed data created in step 1b 7 click next 8 keep the default sql statement which should read select from iotastore and click next 9 input the following data selection window delta time offset 5 seconds the 5 second offset is to ensure all in flight data is included into the data set at time of execution timestamp expression from iso8601 timestamp timestamp we need to convert the iso8601 timestamp coming from the streaming data to a standard timestamp 12 keep all other options as default and click next until you reach configure the delivery rules of your analytics results 13 click add rule 14 choose deliver result to s3 s3 bucket select the s3 bucket that ends with dataset 13 click create data set to finalise the creation of the data set execute and save the dataset 1 navigate to data sets on the lefthand navigation pane of the aws iot analytics console 2 click on streamdataset 3 click on actions and in the dropdown menu choose run now 4 on the left navigation menu choose content and monitor the status of your data set creation 5 the results will be shown in the preview pane and saved as a csv in the dataset s3 bucket top top step 2a build the batch analytics workflow in this section we will create an ec2 instance and docker image to batch a public dataset from s3 to an iot analytics data store using containers alt text https github com aws samples aws iot analytics workshop blob master images arch png architecture launch docker ec2 instance with cloudformation by choosing one of the links below you will be automatically redirected to the cloudformation section of the aws console where your stack will be launched before launching the cloudformation you will need an ssh key pair to log into the ec2 instance if you don t have an ssh key pair you can create one by 1 navigate to the ec2 console 2 click on key pairs 3 click on create key pair and input a name 4 save the pem file in a directory accessible on your computer 5 if you are running mac or linux set
the appropriate permissions on the keypair chmod 400 myec2keypair pem to launch the cloudformation stack choose one of the following links for your region and follow the steps below launch cloudformation stack in us east 1 https console aws amazon com cloudformation home region us east 1 stacks create review stackname iotanalyticsstack templateurl https s3 amazonaws com iotareinvent18 iota reinvent cfn east json n virginia launch cloudformation stack in us west 2 https console aws amazon com cloudformation home region us west 2 stacks create review stackname iotanalyticsstack templateurl https s3 amazonaws com iotareinvent18 iota reinvent cfn west json oregon launch cloudformation stack in us east 2 https console aws amazon com cloudformation home region us east 2 stacks create review stackname iotanalyticsstack templateurl https s3 amazonaws com iotareinvent18 iota reinvent cfn east 2 json ohio launch cloudformation stack in eu west 1 https console aws amazon com cloudformation home region eu west 1 stacks create review stackname iotanalyticsstack templateurl https s3 amazonaws com iotareinvent18 iota reinvent cfn west 2 json ireland 1 navigate to parameters sshkeyname select the ssh key pair you will use to log in to the ec2 instance 2 check the box i acknowledge that aws cloudformation might create iam resources 3 click create stack the cloudformation stack will take approximately 5 7 minutes to complete launching all the necessary resources once the cloudformation has completed navigate to the outputs tab and see the sshlogin parameter copy this string to use when sshing to the ec2 instance setup the docker image on ec2 1 ssh to the ec2 instance using the sshlogin string copied from the above step example ssh i iotaworkshopkeypair pem ec2 user ec2 my ec2 instance eu west 1 compute amazonaws com 2 move to the docker setup folder cd home ec2 user docker setup 3 update your ec2 instance sudo yum update 4 build the docker image docker build t container app ia 5 verify the image was built docker image ls grep container app ia you should see an output similar to container app ia latest ad81fed784f1 2 minutes ago 534mb 6 create a new repository in amazon elastic container registry ecr using the aws cli pre built on your ec2 instance aws ecr create repository repository name container app ia the output should include a json object which includes the item repositoryuri copy this value into a text editor for later use 7 log in to your docker environment by running the output of the following command aws ecr get login no include email 8 tag the docker image with the ecr repository uri docker tag container app ia latest your repositoryuri here latest 9 push the image to ecr docker push your repositoryuri here top top step 2b create batch analytics pipeline alt text https github com aws samples aws iot analytics workshop blob master images arch png architecture in this section we will create the iot analytics pipeline for your public data set analyze data and define different pipeline activities create the iot analytics channel next we will create the iot analytics channel that will receive the batch data and store it in a service managed s3 bucket 1 navigate to the aws iot analytics console 2 in the left navigation pane navigate to channels 3 create a new channel id batchchannel choose the storage type service managed store in this step we will use an iot analytics managed s3 bucket but you may specify a customer managed bucket as in step 1b if you wish 4 iot core topic filter leave this blank as
the data source for this channel will not be from aws iot core 5 leave all other options as default and click next 6 click create channel create the iot analytics pipeline 1 navigate to the aws iot analytics console 2 in the left navigation pane navigate to pipelines 3 create a new pipeline id batchpipeline pipeline source batchchannel 4 click next 5 see attributes of messages you will not see any data on this screen as the data source has not been fully configured yet 6 click next 7 under pipeline activities you can transform the data in your pipeline add or remove attributes 8 click add activity and choose calculate a message attribute as the type attribute name cost formula sub metering 1 sub metering 2 sub metering 3 1 5 9 click next 10 pipeline output click edit and choose iotastore 11 click create pipeline now we have created the iot analytics pipeline we can load the batch data create container data set a container data set allows you to automatically run your analysis tools and generate results it brings together a sql data set as input a docker container with your analysis tools and needed library files input and output variables and an optional schedule trigger the input and output variables tell the executable image where to get the data and store the results 1 navigate to the iot analytics console 2 click on data sets on the left hand navigation pane 3 create a new data set 4 choose create container next to container data sets id container dataset 5 click next 6 click create next to create an analysis without a data set frequency not scheduled click next 7 select analysis container and map variables choose select from your elastic container registry repository select your custom analysis container from elastic container registry container app ia select your image latest 8 under configure the input variables of your container add the following variables which will be passed to your docker container and the python script running in the instance name type value inputdatas3bucketname string iotareinvent18 inputdatas3key string inputdata csv iotchannel string batchchannel 9 click next 10 under iam role click edit and add the role iotacontainerrole this role was created as part of the cloudformation stack 11 configure the resources for container execution compute resource 4 vcpus and 16 gib memory volume size gb 2 12 click next 13 under configure the results of your analytics keep all options as default 14 click next 15 leave configure delivery rules for analysis results as default and click next on this menu you can configure the data set to be delivered to an s3 bucket if you wish 16 finalize the creation of the data set by clicking create data set execute the data set 1 navigate to the aws iot analytics console 2 in the left navigation pane navigate to data sets 3 click on container dataset 4 choose actions and run now 5 the container dataset can take up to 10 minutes to run if there are no errors you will see a succeeded message if it fails with an error please see the troubleshooting section below to enable logging top top step 2c analyse stream and batch data alt text https github com aws samples aws iot analytics workshop blob master images arch png architecture now that we have two data sets in the same data store we can analyse the result of combining both the container data and simulated device data create and execute the combined data set 1 navigate to the aws iot analytics console 2 in the left navigation pane navigate to data sets id batchdataset select data store source
iotastore 3 click next 4 sql query select from iotastore limit 5000 5 click next 6 leave the rest of the options as default and click next 7 optionally you can choose to have your dataset be placed into an s3 bucket on the configure delivery rules for analysis results pane 8 click create data set 9 click on your newly created batchdataset 10 click on actions and then run now 11 the data set will take a few minutes to execute you should see the results of the executed data set in the result preview as well as an outputted csv file which includes the complete data set top top recap and review what have we done so far in the workshop so far you have accomplished the following launched an iot device simulator using cloudformation used the iot device simulator to simulate data coming from a home energy monitoring device created an iot analytics pipeline that consumes cleans and transforms that data created a custom data set with sql that can be used for further analytics used a second cloudformation template to launch an ec2 instance with a script to simulate a public data set used amazon elastic container registry to host your docker image and iot analytics to launch that docker container to simulate a custom data analytics workload combined that simulated public data analytics workload with the simulated device data to make a meaningful data set the purpose of the workshop has been to show you how you can use iot analytics for your various data analytics workloads whether that be from external data sets a custom analytics workload using external tools or consuming data directly from iot devices in real time top top step 3 visualize your data using quicksight in this section we will visualize the time series data captured from your smart home alt text https github com aws samples aws iot analytics workshop blob master images arch png architecture sign up for aws quicksight if you have used aws quicksight in the past you can skip these steps 1 navigate to the aws quicksight console 2 if you have never used aws quicksight before you will be asked to sign up be sure to choose the standard tier and choose the correct region for your locale 3 during the sign up phase give quicksight access to your amazon s3 buckets and aws iot analytics via the sign up page visualise your data set with aws quicksight 1 navigate to the aws quicksight console 2 click on new analysis 3 click on new data set 4 choose aws iot analytics under from new data sources 5 configure your aws iot analytics data source data source name smarthome dashboard select an aws iot analytics data set to import batchdataset 6 click on create data source to finalise the quicksight dashboard data source configuration you should see the data has also been imported into spice which is the analytics engine driving aws quicksight 7 click on visualize and wait a few moments until you see import complete in the upper right hand of the console you should see all 5000 rows have been imported into spice 8 under fields list choose timestamp to set the x axis for the graph 9 click on sub metering 1 sub metering 2 and sub metering 3 to add them to the value column 10 for each sub metering value choose the drop down menu and set aggregate to average the graphs will look similar to below alt text https github com aws samples aws iot analytics workshop blob master images quicksight png quicksight you can experiment with different fields or visual types for visualizing other smart home related information top top step 4 machine learning and forecasting with jupyter
notebooks alt text https github com aws samples aws iot analytics workshop blob master images arch png architecture in this section we will configure an amazon sagemaker instance to forecast energy utilisation in the home 1 navigate to the aws iot analytics console 2 select notebooks from the left hand navigation pane 3 click create to begin configuring a new notebook 4 click blank notebook and input the following name smarthome notebook select data set sources batchdataset select a notebook instance iotaworkshopsagemaker this is the name of the notebook created with cloudformation 5 create notebook 6 click on iotaworkshopsagemaker choose iot analytics in the drop down menu next to smarthome notebook ipynb choose open in jupyter 7 a new amazon sagemaker window should pop up and a message that says kernel not found click continue without kernel 8 download the following jupyter notebook https s3 amazonaws com iotareinvent18 smarthomenotebook ipynb 9 in the jupyter notebook console click file open 10 choose upload and select the smarthomenotebook ipynb notebook downloaded in step 8 11 click on the smarthomenotebook ipynb to be taken back to the jupyter notebook 12 select conda mxnet p36 as the kernel and click set kernel 13 you should see some pre filled python code steps in the jupyter notebook 14 click run for each step follow the documentation presented in the notebook the notebook includes information about how the machine learning process works for each step 15 run through all the steps in the smarthomenotebook to understand how the machine learning training process works with jupyter notebooks note wait until the asterisk disappears after running each cell if you click run before the previous step is completed this could cause some code to fail to complete and the algorithm will fail top top recap and review what did we learn in this workshop in steps 3 and 4 we used aws quicksight and jupyter notebooks to visualize and use machine learning to gather additional insights into our data during the workshop we saw how you can combine raw iot data coming in real time from devices and public data sets analysed by third party tools using pipelines you can clean and standardise this data so you can then perform advanced visualisations and analytics on the data iot analytics is designed to solve the challenge of cleaning organising and making usable data out of hundreds thousands or even millions of data points clean up resources in aws in order to prevent incurring additional charges please clean up the resources created in the workshop 1 ssh to the ec2 docker instance from step 2c and execute clean up sh this script will delete the iot analytics channels pipelines and datasets example ssh i iotaworkshopkeypair pem ec2 user ec2 my ec2 instance eu west 1 compute amazonaws com cd home ec2 user clean up sh clean up sh 2 navigate to the aws cloudformation console to delete the cloudformation stacks and their associated resources click on iotanalyticsstack and click delete click on iotdevicesimulator and click delete note deleting the cloudformation stacks can take several minutes 3 navigate to the aws quicksight console click on manage data click on batchdataset and then delete data set then delete 4 navigate to the amazon ecs console click on repositories under amazon ecr select container app ia and click delete 5 navigate to the aws ec2 console click on key pairs in the left navigation pane choose the ec2 keypair you created to ssh to your docker instance and click delete 6 navigate to the s3
management console and delete the following the 3 buckets you created in step 1b ending with channel dataset and datastore each bucket with the prefix iotdevicesimulator each bucket with the prefix sagemaker your region 7 navigate to the aws iot analytics console check that all of your pipelines data sets and channels have been deleted if not you can manually delete them from the console top top troubleshooting to aid in troubleshooting you can enable logs for iot core and iot analytics that can be viewed in aws cloudwatch aws iot core 1 navigate to the aws iot core console 2 click on settings in the left hand pane 3 under logs click on edit level of verbosity debug most verbose set role click create new name iotcoreloggingrole the log files from aws iot are sent to amazon cloudwatch the aws console can be used to look at these logs for additional troubleshooting refer to here iot core troubleshooting https docs aws amazon com iot latest developerguide iot troubleshooting html top top aws iot analytics 1 navigate to the aws iot analytics console 2 click on settings in the left hand pane 3 under logs click on edit level of verbosity debug most verbose set role click create new name iotanalyticsloggingrole create role 4 click on update the log files from aws iot analytics will be sent to amazon cloudwatch the aws console can be used to look at these logs for additional troubleshooting refer to here iot analytics troubleshooting https docs aws amazon com iotanalytics latest userguide troubleshoot html top top
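a closing note on the console steps in this workshop: everything done by hand in the iot analytics console can also be scripted. the snippet below is a hedged sketch, not part of the workshop, using boto3; it recreates the channel, the pipeline with the cost calculation, and an on demand dataset run. the role arn and bucket name are placeholders, the attribute names are assumed to use underscores, and the datastore here falls back to service managed storage for brevity.

```python
# sketch only: scripted equivalents of a few console steps above, using boto3
# assumes aws credentials with iot analytics permissions; ROLE_ARN and the bucket
# name are placeholders, and attribute names are assumed to use underscores
import boto3

iota = boto3.client("iotanalytics")
ROLE_ARN = "arn:aws:iam::123456789012:role/your-iotanalytics-role"  # placeholder

# channel backed by a customer-managed s3 bucket (as in step 1b)
iota.create_channel(
    channelName="streamchannel",
    channelStorage={
        "customerManagedS3": {"bucket": "my-iot-analytics-channel", "roleArn": ROLE_ARN}
    },
)

# datastore (service-managed here for brevity) plus a pipeline that adds
# the "cost" attribute and drops "id", mirroring the console activities
iota.create_datastore(datastoreName="iotastore")
iota.create_pipeline(
    pipelineName="streampipeline",
    pipelineActivities=[
        {"channel": {"name": "src", "channelName": "streamchannel", "next": "calc_cost"}},
        {"math": {
            "name": "calc_cost",
            "attribute": "cost",
            "math": "(sub_metering_1 + sub_metering_2 + sub_metering_3) * 1.5",
            "next": "drop_id",
        }},
        {"removeAttributes": {"name": "drop_id", "attributes": ["id"], "next": "store"}},
        {"datastore": {"name": "store", "datastoreName": "iotastore"}},
    ],
)

# on-demand run of an existing sql dataset (the console's "run now")
iota.create_dataset_content(datasetName="streamdataset")
```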
server
ML-Tutorial-Experiment
ml tutorial experiment coding the machine learning tutorial for learning to learn tensorflow https www jiqizhixin com articles 2017 08 29 14 https github com jiqizhixin ml tutorial experiment blob master experiments tf cnn tutorial ipynb img align right height 22 src https beta deepnote org buttons launch in deepnote svg https beta deepnote org launch template data science url https 3a 2f 2fgithub com 2fjiqizhixin 2fml tutorial experiment 2fblob 2fmaster 2fexperiments 2ftf cnn tutorial ipynb https github com jiqizhixin ml tutorial experiment blob master experiments tf trial 1 ipynb img align right height 22 src https beta deepnote org buttons launch in deepnote svg https beta deepnote org launch template data science url https 3a 2f 2fgithub com 2fjiqizhixin 2fml tutorial experiment 2fblob 2fmaster 2fexperiments 2ftf trial 1 ipynb keras cnn https github com jiqizhixin ml tutorial experiment blob master experiments tf keras cnn ipynb img align right height 22 src https beta deepnote org buttons launch in deepnote svg https beta deepnote org launch template data science url https 3a 2f 2fgithub com 2fjiqizhixin 2fml tutorial experiment 2fblob 2fmaster 2fexperiments 2ftf keras cnn ipynb tensorflow lenet 5 https github com jiqizhixin ml tutorial experiment blob master experiments tf lenet5 ipynb img align right height 22 src https beta deepnote org buttons launch in deepnote svg https beta deepnote org launch template data science url https 3a 2f 2fgithub com 2fjiqizhixin 2fml tutorial experiment 2fblob 2fmaster 2fexperiments 2ftf lenet5 ipynb densnet cliquenet https www jiqizhixin com articles 2018 05 23 6 gan https www jiqizhixin com articles 2017 10 1 1 https github com jiqizhixin ml tutorial experiment blob master experiments keras gan ipynb img align right height 22 src https beta deepnote org buttons launch in deepnote svg https beta deepnote org launch template data science url https 3a 2f 2fgithub com 2fjiqizhixin 2fml tutorial experiment 2fblob 2fmaster 2fexperiments 2fkeras gan ipynb gan tensorflow https github com jiqizhixin ml tutorial experiment blob master experiments tf gan ipynb img align right height 22 src https beta deepnote org buttons launch in deepnote svg https beta deepnote org launch template data science url https 3a 2f 2fgithub com 2fjiqizhixin 2fml tutorial experiment 2fblob 2fmaster 2fexperiments 2ftf gan ipynb capsnet https www jiqizhixin com articles 2017 11 05 https github com jiqizhixin ml tutorial experiment blob master experiments tf orginal capsnet ipynb img align right height 22 src https beta deepnote org buttons launch in deepnote svg https beta deepnote org launch template data science url https 3a 2f 2fgithub com 2fjiqizhixin 2fml tutorial experiment 2fblob 2fmaster 2fexperiments 2ftf orginal capsnet ipynb https www jiqizhixin com articles capsule implement sara sabour feb02 rnn cnn https www jiqizhixin com articles 2018 04 12 3 lstm https github com jiqizhixin ml tutorial experiment blob master experiments lstm ptb ipynb tcn https github com locuslab tcn tcn colaboratory https colab research google com drive 1gaxc0j9qzlyqu8g9 p ehi ttym7uhxf img align right height 22 src https beta deepnote org buttons launch in deepnote svg https beta deepnote org launch template data science url https 3a 2f 2fgithub com 2fjiqizhixin 2fml tutorial experiment 2fblob 2fmaster 2fexperiments 2flstm ptb ipynb transformer https www jiqizhixin com articles synced github implement project machine translation by transformer colaboratory https colab research google com drive 
1wt9jwynnki6lipwucy0sz5wkg7mysgs0 github issue https www jiqizhixin com articles 2017 08 07 2 https www jiqizhixin com articles 2017 09 28 python python https www jiqizhixin com articles 2017 10 13 python https www jiqizhixin com articles 062403 numpy numpy https www jiqizhixin com articles 070101 numpy https www jiqizhixin com articles 2017 10 28 numpy https www jiqizhixin com articles 2018 04 21 7 tensorflow https www jiqizhixin com articles 2017 05 14 2 python 8 https www jiqizhixin com articles 2018 01 01 python https www jiqizhixin com articles 03132 logistic logistics https www jiqizhixin com articles 2018 05 13 3 python logistic https www jiqizhixin com articles 2017 02 17 5 https www jiqizhixin com articles 033088 k pca t sne staking bagging boosting adaboost entropy based graph based pixelrnn pixelcnn vae gan
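the linked notebooks contain the full implementations; purely as a flavor of what the keras cnn tutorial covers, here is a minimal standalone mnist convnet sketch. it is not taken from the notebooks and only uses the standard tf.keras api.

```python
# minimal mnist convnet sketch in tf.keras; illustrative only, not from the linked notebooks
from tensorflow import keras

(x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
x_train = x_train[..., None] / 255.0   # add channel dim, scale to [0, 1]
x_test = x_test[..., None] / 255.0

model = keras.Sequential([
    keras.layers.Conv2D(32, 3, activation="relu", input_shape=(28, 28, 1)),
    keras.layers.MaxPooling2D(),
    keras.layers.Conv2D(64, 3, activation="relu"),
    keras.layers.MaxPooling2D(),
    keras.layers.Flatten(),
    keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=1, batch_size=128,
          validation_data=(x_test, y_test))
```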
ai
Mobius
mobius onem2m iot server platform version 2 5 x 2 5 13 introduction mobius is the open source iot server platform based on the onem2m http www onem2m org standard as onem2m specifies mobius provides common services functions e g registration data management subscription notification security as middleware to iot applications of different service domains not just onem2m devices but also non onem2m devices i e by onem2m interworking specifications and keti tas can connect to mobius certification mobius has received certification of the onem2m standard from tta telecommunications technology association onem2m certification guarantees that onem2m products meet onem2m specification and test requirements which ensure interoperability as mobius is certified it will be used as a golden sample to validate test cases and testing system div align center img src https user images githubusercontent com 29790334 40639101 e9ecd06c 6349 11e8 9fc2 0806d9bf5dc7 png width 800 div trsl test requirements status list is available on onem2m certification website http www onem2mcert com sub sub05 01 php system structure in onem2m architecture mobius implements the in cse which is the cloud server in the infrastructure domain iot applications communicate with field domain iot gateways devices via mobius div align center img src https user images githubusercontent com 29790334 28322739 d7fddbc4 6c11 11e7 9180 827be6d997f0 png width 800 div connectivity structure to enable internet of things things are connected to cube via tas thing adaptation software then cube communicates with mobius over onem2m standard apis also iot applications use onem2m standard apis to retrieve thing data control things of mobius div align center img src https user images githubusercontent com 29790334 28322868 33e97f4c 6c12 11e7 97fc 6de66c06add7 png width 800 div software architecture div align center img src https user images githubusercontent com 29790334 28245393 a1159d5e 6a40 11e7 8948 4262bf29c371 png width 800 div supported protocol bindings http coap mqtt websocket installation mobius is based on the node js framework and uses mysql for its database div align center img src https user images githubusercontent com 29790334 28322607 7be7d916 6c11 11e7 9d20 ac07961971bf png width 600 div br mysql server https www mysql com downloads br mysql is an open source rdb so it is free and light and rdb is very suitable for storing tree data just like the onem2m resource structure most of ncube rosemary will work in a restricted hardware environment and mysql can work in most embedded devices node js https nodejs org en br node js is a javascript runtime built on chrome s v8 javascript engine node js uses an event driven non blocking i o model that makes it lightweight and efficient node js package ecosystem npm is the largest ecosystem of open source libraries in the world node js is very powerful for service implementation because it provides a rich and free web service api so we use it to build the restful api based on the onem2m standard mosquitto https mosquitto org br eclipse mosquitto is an open source epl edl licensed message broker that implements the mqtt protocol versions 3 1 and 3 1 1 mqtt provides a lightweight method of carrying out messaging using a publish subscribe model this makes it suitable for internet of things messaging such as with low power sensors or mobile devices such as phones embedded computers or microcontrollers like the arduino mobius https github com iotketi mobius archive master zip br mobius source
codes are written in javascript so they don t need any compilation or installation before running mobius docker version we deploy mobius as a docker image using the virtualization open source tool docker mobius docker https github com iotketi mobius docker br configuration import sql script br after installation of mysql server you need the db schema for storing onem2m resources in mobius you can find this file in the following mobius source directory mobius home mobius mobiusdb sql run mosquitto mqtt broker br mosquitto v open the mobius source home directory install dependent libraries as below npm install modify the configuration file conf json per your setting csebaseport 7579 mobius http hosting port dbpass mysql root password run use node js application execution command as below node mobius js div align center img src https user images githubusercontent com 29790334 28245526 c9db7850 6a43 11e7 9bfd f0b4fb20e396 png width 700 div br library dependencies this is the list of library dependencies for mobius body parser cbor coap crypto events express file stream rotator fs http https ip js2xmlparser merge morgan mqtt mysql shortid url util websocket xml2js xmlbuilder document if you want more details please download the full installation guide document https github com iotketi mobius raw master doc installation 20guide mobius v2 0 0 en 170718 pdf author jaeho kim jhkim keti re kr il yeup ahn iyahn keti re kr
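once mobius is running, a quick smoke test is to retrieve the csebase over the onem2m http binding. below is a minimal sketch with the python requests package; the csebase resource name Mobius is an assumption taken from the default configuration, the port 7579 comes from the conf json shown above, and the x-m2m-origin value is a placeholder originator.

```python
# sketch: retrieve the csebase from a locally running mobius over the onem2m http binding
# the resource name "Mobius" is an assumption from the default configuration;
# the port comes from csebaseport in conf.json, the originator is a placeholder
import requests

resp = requests.get(
    "http://localhost:7579/Mobius",
    headers={
        "Accept": "application/json",
        "X-M2M-RI": "smoke-test-12345",   # request identifier, any unique string
        "X-M2M-Origin": "S_placeholder",  # placeholder originator id
    },
    timeout=5,
)
print(resp.status_code)
print(resp.json())
```

a 200 response with a csebase resource in the body indicates the server, its database connection and the http binding are all up.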
onem2m iot server nodejs mobius
server
Hyperion
div align center img src assets logo svg width 250 hyperion a modern alternate youtube front end for android zt64 hyperion https img shields io static v1 label zt64 message hyperion color teal logo github style for the badge https github com zt64 hyperion go to github repo stars hyperion https img shields io github stars zt64 hyperion style for the badge https github com zt64 hyperion forks hyperion https img shields io github forks zt64 hyperion style for the badge https github com zt64 hyperion div features art dynamic color no entry sign ad free x sponsorblock thumbsdown return youtube dislikes arrow down download videos vhs picture in picture black medium square white medium square dark light theme with without following system color mode a multi language more to come bust in silhouette google account login art custom themes installation bugs are to be expected at this state of development br the minimum supported android version is android 5 0 api level 21 br the latest apk can be downloaded from here https nightly link zt64 hyperion workflows build debug main app debug zip br github releases will be added once hyperion is in a more stable state translation please contribute to translations on crowdin https crowdin com project hyperion app acknowledgements sponsorblock https sponsor ajay app return youtube dislikes https github com anarios return youtube dislike license hyperion is licensed under the gnu general public license license gpl v3 https img shields io badge license gpl 20v3 blue svg style for the badge https www gnu org licenses gpl 3 0
kotlin android youtube jetpack-compose kotlin-android kotlin-coroutines kotlin-multiplatform media3 youtube-api
front_end
FE-Summary
html css javascript web just star front thinking https github com front thinking natumsol https github com natumsol tommy feng https github com tommy feng
front_end
awdwr
introduction agile web development with rails awdwr is a test suite for the scenario found in the book by the same name https pragprog com titles rails4 agile web development with rails 4th edition published by pragmatic programmers it was originally developed out of self defence to keep up with the rapid pace of change in rails itself and has also proven valuable as a system test https github com rails rails blob master releasing rails md is sam ruby happy if not make him happy for rails itself this code has been developed over a long period of time and still has accommodations for things like the days before puma was the default the days before scss and coffeescript the days before sprockets when images were placed in public images the days before bundler existed ruby 1 8 7 s hash syntax and the requirement to require rubygems if you wanted to use it it works natively on mac os x el capitan and ubuntu 14 04 it works with either rbenv or rvm the dashboard can be run as cgi under apache httpd using passenger on nginx or simply with webbrick instructions are provided separately for usage under vagrant vagrant readme and cloud9 cloud9 md or install rails dev box https github com rails rails dev box readme ssh into the vagrant machine and run the following command eval curl https raw githubusercontent com rubys awdwr master rdb setup control over directory locations and versions to be tested is provided by dashboard yml dashboard yml and testrails yml testrails yml installation installation of all necessary dependencies from a fresh install of ubuntu or mac os x ruby setup rb see comments if dependencies aren t met execution instructions ruby makedepot rb version restore range save work dir port n description of the options restore restore from snapshot before resuming execution version specifies the rails version to test examples edge 2 2 2 git range specifies a set of sections to execute examples 6 2 6 5 7 1 9 5 16 save save snapshot after execution completes work dir name of work directory to use default work port n port number to use for the test default 3000 output will be produced as makedepot html tests against makedepot html can also be run separately ruby checkdepot rb output will be produced as checkdepot html automation tools setup rb initial setup and verification testrails rb front end to makedepot that manages the environment dashboard rb cgi to monitor status start jobs sample configuration data testrails yml provides mappings for edition rails and ruby versions dashboard yml lists test configurations sample output http intertwingly net projects dashboard html m1 mac setup brew install svn brew install mysql brew services restart mysql brew install chromedriver install google chrome add to zshrc export puppeteer executable path applications google chrome app contents macos google chrome brew install gpg brew install postgresql brew install zstd brew install openjdk 11 sudo ln sfn opt homebrew opt openjdk 11 libexec openjdk jdk library java javavirtualmachines openjdk 11 jdk brew install node brew install imagemagick mkdir p git cd git git clone git github com rubys awdwr git cd awdwr ruby setup rb npm install sudo launchctl load w system library launchdaemons org apache httpd plist later in work directory bundle config local build mysql2 with ldflags l opt homebrew cellar zstd 1 5 0 lib bundle config build eventmachine with cppflags i brew prefix openssl include bundle install rvm wrapper users rubys git awdwr rproxy rb cp awdwr rproxy plist library launchagents launchctl load 
w library launchagents awdwr rproxy plist
front_end
InnerEye-DeepLearning
innereye deeplearning build status https innereye visualstudio com innereye apis build status innereye deeplearning innereye deeplearning pr branchname main https innereye visualstudio com innereye build definitionid 112 branchname main innereye deeplearning ie dl is a toolbox for easily training deep learning models on 3d medical images simple to run both locally and in the cloud with azureml https docs microsoft com en gb azure machine learning it allows users to train and run inference on the following segmentation models classification and regression models any pytorch lightning model via a bring your own model setup docs source md bring your own model md in addition this toolbox supports cross validation using azureml where the models for individual folds are trained in parallel this is particularly important for the long running training jobs often seen with medical images hyperparameter tuning using hyperdrive https docs microsoft com en us azure machine learning how to tune hyperparameters building ensemble models easy creation of new models via a configuration based approach and inheritance from an existing architecture documentation for all documentation including setup guides and apis please refer to the ie dl read the docs site https innereye deeplearning readthedocs io quick setup this quick setup assumes you are using a machine running ubuntu with git git lfs conda and python 3 7 installed please refer to the setup guide docs source md environment md for more detailed instructions on getting innereye set up with other operating systems and installing the above prerequisites 1 clone the innereye deeplearning repo by running the following command shell git clone recursive https github com microsoft innereye deeplearning cd innereye deeplearning 2 create and activate your conda environment shell conda env create file environment yml conda activate innereye 3 verify that your installation was successful by running the helloworld model no gpu required shell python innereye ml runner py model helloworld if the above runs with no errors congratulations you have successfully built your first model using the innereye toolbox if it fails please check the troubleshooting page on the wiki https github com microsoft innereye deeplearning wiki issues with code setup and the helloworld model full innereye deployment we offer a companion set of open sourced tools that help to integrate trained ct segmentation models with clinical software systems the innereye gateway https github com microsoft innereye gateway is a windows service running in a dicom network that can route anonymized dicom images to an inference service the innereye inference https github com microsoft innereye inference component offers a rest api that integrates with the innereye gateway to run inference on innereye deeplearning models details can be found here docs source md deploy on aml md docs deployment png docs source images deployment png benefits of innereye deeplearning in combiniation with the power of azureml innereye provides the following benefits traceability azureml keeps a full record of all experiments that were executed including a snapshot of the code tags are added to the experiments automatically that can later help filter and find old experiments transparency all team members have access to each other s experiments and results reproducibility two model training runs using the same code and data will result in exactly the same metrics all sources of randomness are controlled for cost reduction using 
azureml all compute resources virtual machines vms are requested at the time of starting the training job and freed up at the end idle vms will not incur costs azure low priority nodes can be used to further reduce costs up to 80 cheaper scalability large numbers of vms can be requested easily to cope with a burst in jobs despite the cloud focus innereye is designed to be able to run locally too which is important for model prototyping debugging and in cases where the cloud can t be used therefore if you already have gpu machines available you will be able to utilize them with the innereye toolbox licensing mit license license you are responsible for the performance the necessary testing and if needed any regulatory clearance for any of the models produced by this toolbox acknowledging usage of project innereye oss tools when using project innereye open source software oss tools please acknowledge with the following wording this project used microsoft research s project innereye open source software tools https aka ms innereyeoss https aka ms innereyeoss contact if you have any feature requests or find issues in the code please create an issue on github https github com microsoft innereye deeplearning issues please send an email to innereyeinfo microsoft com if you would like further information about this project publications oktay o nanavati j schwaighofer a carter d bristow m tanno r jena r barnett g noble d rimmer y glocker b o hara k bishop c alvarez valle j nori a evaluation of deep learning to augment image guided radiotherapy for head and neck and prostate cancers jama netw open 2020 3 11 e2027426 doi 10 1001 jamanetworkopen 2020 27426 https pubmed ncbi nlm nih gov 33252691 bannur s oktay o bernhardt m schwaighofer a jena r nushi b wadhwani s nori a natarajan k ashraf s alvarez valle j castro d c hierarchical analysis of visual covid 19 features from chest radiographs icml 2021 workshop on interpretable machine learning in healthcare https arxiv org abs 2107 06618 https arxiv org abs 2107 06618 bernhardt m castro d c tanno r schwaighofer a tezcan k c monteiro m bannur s lungren m nori s glocker b alvarez valle j oktay o active label cleaning for improved dataset quality under resource constraints https www nature com articles s41467 022 28818 3 https www nature com articles s41467 022 28818 3 accompanying code innereye dataquality https github com microsoft innereye deeplearning blob 1606729c7a16e1bfeb269694314212b6e2737939 innereye dataquality readme md contributing this project welcomes contributions and suggestions most contributions require you to agree to a contributor license agreement cla declaring that you have the right to and actually do grant us the rights to use your contribution for details visit https cla opensource microsoft com https cla opensource microsoft com when you submit a pull request a cla bot will automatically determine whether you need to provide a cla and decorate the pr appropriately e g status check comment simply follow the instructions provided by the bot you will only need to do this once across all repos using our cla this project has adopted the microsoft open source code of conduct https opensource microsoft com codeofconduct for more information see the code of conduct faq https opensource microsoft com codeofconduct faq or contact opencode microsoft com mailto opencode microsoft com with any additional questions or comments maintenance this toolbox is maintained by the microsoft medical image analysis team https www microsoft com en us 
research project medical image analysis
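for the bring your own model route mentioned above, the object being brought is a standard pytorch lightning module. the sketch below shows a minimal such module on its own; it is illustrative only and does not show innereye's container or configuration classes, which wrap a module like this.

```python
# minimal pytorch lightning module sketch; illustrative only and independent of
# innereye's own container/config classes, which wrap modules like this one
import torch
from torch import nn
import pytorch_lightning as pl


class TinyRegressor(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 1))

    def forward(self, x):
        return self.net(x)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = nn.functional.mse_loss(self(x), y)
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)
```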
azure medical-imaging healthcare deep-learning
ai
inferoxy
div align center inferoxy docs inferoxy dark png div codecov https codecov io gh eora ai inferoxy branch master graph badge svg token hv6znpufze https codecov io gh eora ai inferoxy what is it inferoxy is a service for quick deploying and using dockerized computer vision models it s a core of eora s computer vision platform vision hub https www visionhub ru that runs on top of aws eks why use it you should use it if you want to simplify deploying computer vision models with an appropriate data science stack to production all you need to do is to build a docker image with your model including any pre and post processing steps and push it into an accessible registry you have only one machine or cluster for inference cpu gpu you want automatic batching for multi gpu multi node setup model versioning architecture overall architecture docs inferoxy general png inferoxy is built using message broker pattern roughly speaking it accepts user requests through different interfaces which we call bridges multiple bridges can run simultaneously current supported bridges are rest api grpc and zeromq the requests are carefully split into batches and processed on a single multi gpu machine or a multi node cluster the models to be deployed are managed through model manager that communicates with redis to store retrieve models information such as docker image url maximum batch size value etc batching batching docs inferoxy batching png one of the core inferoxy s features is the batching mechanism for batch processing it s taken into consideration that different models can utilize different batch sizes and that some models can process a series of batches from a specific user e g for video processing tasks the latter models are called stateful models while models which don t depend on user state are called stateless multiple copies of the same model can run on different machines while only one copy can run on the same gpu device so to increase models efficiency it s recommended to set batch size for models to be as high as possible a user of the stateful model reserves the whole copy of the model and releases it when his task is finished users of the stateless models can use the same copy of the model simultaneously numpy tensors of rgb images with metadata are all going through zeromq to the models and the results are also read from zeromq socket cluster management cluster docs inferoxy cluster png the cluster management consists of keeping track of the running copies of the models load analysis health checking and alerting requirements you can run inferoxy locally on a single machine or k8s https kubernetes io cluster to run inferoxy you should have a minimum of 4gb ram and cpu or gpu device depending on your speed cost trade off basic commands local run to run locally you should use inferoxy docker image the last version you can find here https github com eora ai inferoxy releases bash docker pull public registry visionhub ru inferoxy v1 0 4 after image is pulled we need to make basic configuration using env file env env cloud client docker task manager docker config network inferoxy task manager docker config registry task manager docker config login task manager docker config password model storage database host redis model storage database port 6379 model storage database number 0 logging level info the next step is to create inferoxy docker network bash docker network create inferoxy now we should run redis in this network redis is needed to store information about your models bash docker run network 
Cluster management
Cluster diagram: docs/inferoxy-cluster.png
Cluster management consists of keeping track of the running copies of the models, load analysis, health checking, and alerting.

Requirements
You can run Inferoxy locally on a single machine or on a Kubernetes (https://kubernetes.io) cluster. To run Inferoxy you need at least 4 GB of RAM and a CPU or GPU device, depending on your speed/cost trade-off.

Basic commands: local run
To run locally, use the Inferoxy Docker image; the latest version is listed at https://github.com/eora-ai/inferoxy/releases

docker pull public.registry.visionhub.ru/inferoxy:v1.0.4

After the image is pulled, create a basic configuration in a .env file:

cloud_client=docker
task_manager_docker_config_network=inferoxy
task_manager_docker_config_registry=
task_manager_docker_config_login=
task_manager_docker_config_password=
model_storage_database_host=redis
model_storage_database_port=6379
model_storage_database_number=0
logging_level=info

The next step is to create the inferoxy Docker network:

docker network create inferoxy

Now run Redis in this network; Redis is needed to store information about your models:

docker run --network inferoxy --name redis redis:latest

Create a models.yaml file with a simple set of models (you can read about models.yaml in the documentation, https://github.com/eora-ai/inferoxy/wiki):

stub:
  address: public.registry.visionhub.ru/models/stub:v5
  batch_size: 256
  run_on_gpu: false
  stateless: true

Now you can start Inferoxy:

docker run --env-file .env \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -p 7787:7787 -p 7788:7788 -p 8000:8000 -p 8698:8698 \
  --name inferoxy --rm --network inferoxy \
  -v $(pwd)/models.yaml:/etc/inferoxy/models.yaml \
  public.registry.visionhub.ru/inferoxy:<inferoxy_version>

Documentation
The full documentation is available at https://github.com/eora-ai/inferoxy/wiki

Discord
Join our community on the Discord server (https://discord.gg/asbwzuzuvp) to discuss anything related to Inferoxy usage and development.
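Before starting the container, it can help to sanity-check models.yaml against the fields used in the example above. The schema below (address, batch_size, run_on_gpu, stateless) is inferred from that snippet only; treat it as an assumption and consult the Inferoxy wiki for the authoritative format.

```python
# Minimal sketch: validate the models.yaml fields used in the example above.
# The schema (address / batch_size / run_on_gpu / stateless) is assumed from
# that snippet; check the Inferoxy wiki for the authoritative format.
import sys
import yaml  # pip install pyyaml

REQUIRED = {"address": str, "batch_size": int, "run_on_gpu": bool, "stateless": bool}

def validate(path: str) -> None:
    with open(path) as fh:
        models = yaml.safe_load(fh) or {}
    for name, spec in models.items():
        for field, expected in REQUIRED.items():
            if field not in spec:
                sys.exit(f"model '{name}' is missing '{field}'")
            if not isinstance(spec[field], expected):
                sys.exit(f"model '{name}': '{field}' should be {expected.__name__}")
        print(f"{name}: ok (batch_size={spec['batch_size']})")

if __name__ == "__main__":
    validate(sys.argv[1] if len(sys.argv) > 1 else "models.yaml")
```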
mlops python computer-vision machine-learning pipelines
ai
2016s
2016s: web application development by the agile method, in class, 2016.

Template for each team:
team name / service name
description:
source: https://github.com/...
service: http://your-service-name.herokuapp.com
ci test: Travis CI URL
backlog: backlog URL
members: GitHub account name (real name)
business hours: e.g. Monday 20:10-21:40, Saturday 13:00-14:30
notes: notes
Please update your service entry yourselves. Curriculum (in Japanese): http://enpit.aiit.ac.jp/

puzzveg
description:
source: https://github.com/terra-yucco/ruthenium
service: https://puzzveg.herokuapp.com
ci test: https://travis-ci.org/terra-yucco/ruthenium
backlog: https://trello.com/b/xzewhf6m (aiit-enpit-2016-ruthenium)
business hours: Saturday 13:30-18:00
members: minimaru (Nakayama Hiroyuki, https://github.com/minimaru), terra-yucco (Terashima Yuko, https://github.com/terra-yucco), cloudliner (Ohno Kiyoshi, https://github.com/cloudliner), b1621ry (Yamada Ryoma, https://github.com/b1621ry), dtakeshima (Takeshima Daichi, https://github.com/dtakeshima)
notes:

nyamco / tochislide
description: share information about landslides and weather forecasts
source: https://github.com/nyamco/enpit-tochislide
service: https://young-citadel-26247.herokuapp.com/posts and https://polar-taiga-41162.herokuapp.com (tochi slide index)
ci test: https://travis-ci.org/nyamco/enpit-tochislide
backlog: https://trello.com/b/8wf5mqep (aiit-enpit-2016-nyamco)
members: a1624 z13052hn, meeea (makotomotegi), yamdayk (hiroak)
business hours: Saturday 12:00-14:30
notes:

himeji
description:
source: https://github.com/moritoh-ruby/himeji
service: https://rocky-waters-21208.herokuapp.com
ci test: https://travis-ci.org/moritoh-ruby/himeji
backlog: https://trello.com/b/bfqieieh (himeji)
business hours: Saturday 12:30-14:30, Thursday 20:00-22:00
members: tomomi-rivers (Tomomi Kawabe, https://github.com/tomomi-rivers), a-rom (Ayumu Tanaka, https://github.com/a-rom), a1630ty (Takashi Yokogawa, https://github.com/a1630ty), moritoh-ruby (Aritomo Moritoh, https://github.com/moritoh-ruby), kurasawanko (Akane Kurasawa, https://github.com/kurasawanko)
notes:

sdd3
description:
source: https://github.com/ishidahiro/sdd3
service: https://secure-retreat-68116.herokuapp.com
ci test: https://travis-ci.org/ishidahiro/sdd3
backlog: https://github.com/ishidahiro/sdd3/projects/3
business hours: Saturday 13:00-18:00
members: takahashiko (Kosei Takahashi), ishidahiro (Hiroyuki Ishida), gittkrd (Kenichiro Hohda), leyosh (Yoshihiro Yamabe), hirokitakenaka (Takenaka), tantan5 (Murase)
notes:

codebelle
description:
source: https://github.com/tantan5/codebelle
service: https://tabi-cierge.herokuapp.com
ci test: https://travis-ci.org/tantan5/codebelle
backlog: coming soon
members: tkhgsh (Higashi), hirokitakenaka (Takenaka), tantan5 (Murase), nerimarentai (Shima)
business hours: Saturday 15:00-19:00
notes:

ryupit
description:
source: https://github.com/ryupitbros4/itswitter
service: https://suitell.herokuapp.com
ci test: https://travis-ci.org/ryupitbros4/itswitter
backlog: https://trello.com/b/ohgw1xdj, KPT: https://trello.com/b/9rm4lbot (kpt)
business hours: Monday 13:00-16:00, Thursday 10:00-12:00, Saturday 09:00-10:00
members: itomasameteo (Itosu Masataka, https://github.com/itomasameteo), akio-m (Higa Akitika, https://github.com/akio-m), 168572nogihu (Taguti Yuuiti, https://github.com/168572nogihu), e125762 (Gibo Keito, https://github.com/e125762), e125743 (Itou Takumi, https://github.com/e125743)
notes: commits are made between 8:00 and 22:00
front_end
Billing-Software
Billing Software: a Java application with a database management system (software engineering project work).
Team: 201401137 Mohit Savaliya, 201401136 Dhruv Koshiyar.

We built a billing system as a team of two students for a company, according to its requirements. First we analyzed the products and the basics such as the tax system, market value, and so on. We used the Java application tools in NetBeans to create the software and a Microsoft Access database to store and access the system's data.

We started by writing the SRS. For the system design we created an ER (entity-relationship) model with entities such as customer, item, bill, and system, followed by a context-level diagram and level 1 and level 2 data-flow diagrams; the level 1 and 2 diagrams divide the system into subsystems for better understanding. We also prepared a data dictionary for all entities and sub-entities.

We then designed the forms: login, add customer, add item, edit customer, edit item, product detail, edit product detail, customer detail, bill invoice, search item, search customer, credit and debit note, change username and password, add note, search invoice, reminder, calculator, about system, help, stock calculator, statement for each customer, check credit, check debit, daily sold product information, due date report, master report, master GST, print customer details, print item details, print check details, availability of item, and weekly/monthly/annual reports. These forms are connected to the Microsoft Access database to access the data for all entities.

To make searching easy and reduce time complexity we used a more efficient data structure, a trie, so customers can be searched by name and code, items by name and code, and reports by date, character by character. For security, credentials are kept in a hashtable and passwords must contain at least one lowercase letter, one uppercase letter, one digit, and one special character. We also implemented an automatic backup option that creates a duplicate of the Microsoft Access database file and saves it to a separate, allocated location. After generating a tax/retail invoice, we save it as a PDF by importing com.itextpdf.text and using its classes. Finally, we completed the project as per expectations.
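To make the character-by-character search idea concrete, here is a minimal illustrative sketch of trie-based prefix lookup over customer records. The project itself is written in Java against a Microsoft Access database; this stand-alone Python sketch uses invented class and field names and only demonstrates the technique.

```python
# Minimal illustration of the character-by-character (prefix) search idea.
# The real project is a Java/NetBeans application; names here are invented.

class TrieNode:
    def __init__(self):
        self.children = {}  # next character -> TrieNode
        self.records = []   # customer records whose key ends at this node

class CustomerTrie:
    def __init__(self):
        self.root = TrieNode()

    def insert(self, key: str, record: dict) -> None:
        node = self.root
        for ch in key.lower():
            node = node.children.setdefault(ch, TrieNode())
        node.records.append(record)

    def search_prefix(self, prefix: str) -> list:
        """Return every record whose key starts with the typed characters."""
        node = self.root
        for ch in prefix.lower():
            if ch not in node.children:
                return []
            node = node.children[ch]
        # Collect all records below this node (depth-first).
        found, stack = [], [node]
        while stack:
            cur = stack.pop()
            found.extend(cur.records)
            stack.extend(cur.children.values())
        return found

if __name__ == "__main__":
    trie = CustomerTrie()
    trie.insert("mohit", {"code": "C001", "name": "Mohit"})
    trie.insert("mona", {"code": "C002", "name": "Mona"})
    print(trie.search_prefix("mo"))  # both customers match after two keystrokes
```

Item and report lookups described above would work the same way, keyed by item name/code or by a date string.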
server