names | readmes | topics | labels
---|---|---|---
natural-language-processing | # Natural Language Processing

Natural language processing (NLP) is an area of computer science and artificial intelligence concerned with the interactions between computers and human (natural) languages, in particular how to program computers to fruitfully process large amounts of natural language data. Challenges in natural language processing frequently involve speech recognition, natural language understanding, and natural language generation.

[Images in the original README: "Linguistics and Reality" (images/linguistics-and-reality-640x300.png), a Spinoza portrait (neural-language-model-and-spinoza/images/spinoza.jpg), and a sentiment-analysis illustration (sentiment-analysis/images/sanalysis.jpg).]

[Notebooks and descriptions](#nd) | [Contact information](#ci)

<a id="nd"></a>
## Notebooks and descriptions

* [neural-language-model-and-spinoza](http://nbviewer.jupyter.org/github/marcotav/natural-language-processing/blob/master/neural-language-model-and-spinoza/notebooks/language-model-spinoza-new.ipynb): Spinoza's *Ethics* is used to build a language model for text generation with recurrent neural nets.
* [sentiment-analysis](http://nbviewer.jupyter.org/github/marcotav/natural-language-processing/blob/master/sentiment-analysis/notebooks/sentiment-analysis.ipynb): a "reverse" sentiment analysis using Bernoulli Naive Bayes was performed on movie reviews (already classified) to identify which words appear more frequently in reviews from each class.
* [topic-identification](http://nbviewer.jupyter.org/github/marcotav/natural-language-processing/blob/master/topic-identification-and-entity-recognition/topic-identification/notebooks/topic-identification.ipynb): tutorial about topic identification (in progress).
* [alphabet-human-thought/meaning-of-sentences](http://nbviewer.jupyter.org/github/marcotav/natural-language-processing/blob/master/alphabet-human-thought/meaning-nlu-logic/notebooks/meaning-of-sentences.ipynb): in this notebook it will be shown that, using logic formalisms, one can find more generic translation mechanisms (in progress).
* [alphabet-human-thought/sentence-structure](http://nbviewer.jupyter.org/github/marcotav/natural-language-processing/blob/master/alphabet-human-thought/sentence-structure/notebooks/sentence-structure.ipynb): we will show how to develop formal models for patterns in sequences of words using grammars and parsers (in progress).

<a id="ci"></a>
## Contact information

Feel free to contact me:

* Email: [marcotav65@gmail.com](mailto:marcotav65@gmail.com)
* GitHub: [marcotav](https://github.com/marcotav)
* LinkedIn: [marco-tavora](https://www.linkedin.com/in/marco-tavora)
* Website: [marcotavora.me](http://www.marcotavora.me) | natural-language-processing language-model neural-networks philosophers statistical-learning | ai |
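The sentiment-analysis notebook in the row above pairs Bernoulli Naive Bayes with bag-of-words features to find which words each review class favors. A minimal sketch of that idea with scikit-learn, where the library choice and the toy reviews are my assumptions and the repo's own notebook may differ:

```python
# Hedged sketch: "reverse" sentiment analysis with Bernoulli Naive Bayes (not the repo's exact code).
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import BernoulliNB

reviews = ["a wonderful, moving film", "dull plot and flat acting"]  # toy stand-ins for movie reviews
labels = [1, 0]  # 1 = positive, 0 = negative (already classified, as in the notebook)

vectorizer = CountVectorizer(binary=True)  # Bernoulli NB models word presence/absence
X = vectorizer.fit_transform(reviews)
clf = BernoulliNB().fit(X, labels)

# Rank words by how strongly they indicate the positive class over the negative one.
words = np.array(vectorizer.get_feature_names_out())  # requires scikit-learn >= 1.0
log_ratio = clf.feature_log_prob_[1] - clf.feature_log_prob_[0]
print(words[np.argsort(log_ratio)[-3:]])  # words most associated with positive reviews
```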
datamodelling-postgres | # Data Modelling and ETL in Postgres & Python

This is the first project submission for the Udacity Data Engineering Nanodegree. The project consists of putting into practice the following concepts:

* data modeling: creating a star-schema database with Postgres using the psycopg2 Python driver
* building an ETL pipeline using Python

## Context / project specification

A startup called Sparkify wants to analyze the data they've been collecting on songs and user activity on their new music streaming app. Currently they don't have an easy way to query their data, which consists of (i) a directory of JSON logs on user activity on the app (data/log_data) and (ii) a directory with JSON metadata on the songs in their app (data/song_data). The minimum project specification is to create a database schema and ETL pipeline for this analysis.

## Data

Song datasets: all JSON files are nested in subdirectories under data/song_data. A sample (a single row of one file) is:

```json
{"num_songs": 1, "artist_id": "ARJIE2Y1187B994AB7", "artist_latitude": null, "artist_longitude": null, "artist_location": "", "artist_name": "Line Renaud", "song_id": "SOUPIRU12A6D4FA1E1", "title": "Der Kleine Dompfaff", "duration": 152.92036, "year": 0}
```

Log datasets: all JSON files are nested in subdirectories under data/log_data. A sample of a single row of one file is:

```json
{"artist": "Sydney Youngblood", "auth": "Logged In", "firstName": "Jacob", "gender": "M", "itemInSession": 53, "lastName": "Klein", "length": 238.07955, "level": "paid", "location": "Tampa-St. Petersburg-Clearwater, FL", "method": "PUT", "page": "NextSong", "registration": 1.540558e+12, "sessionId": 954, "song": "Ain't No Sunshine", "status": 200, "ts": 1543449657796, "userAgent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10.9.4 ...)", "userId": 73}
```

## Database schema

There is one main fact table containing all the measures associated with each event (songplays), and four dimensional tables, each with a primary key that is referenced from the fact table. The database follows a star schema.

On why to use a relational database for this case:

* the ability to use SQL queries is sufficient for this type of analysis, and the ability to join tables is necessary
* there are no concerns around data availability, query latency, or horizontal scalability
* the data types are structured
* the amount of data we need to analyze is not big enough to require big-data solutions

### Fact table

songplays: records in log data associated with song plays, i.e. records with page = NextSong.

* songplay_id (varchar) PRIMARY KEY: ID of each user song play
* start_time (timestamp): timestamp of beginning of user activity
* user_id (int) NOT NULL: ID of user
* level (varchar): user level (free/paid)
* song_id (varchar) NOT NULL: ID of song played
* artist_id (varchar) NOT NULL: ID of artist of the song played
* session_id (int): ID of the user session
* location (varchar): user location
* user_agent (varchar): agent used by user to access the Sparkify platform

### Dimension tables

users: data on individual Sparkify users. user_id (int) PRIMARY KEY; first_name (varchar) NOT NULL; last_name (varchar) NOT NULL; gender (varchar, M/F); level (varchar, free/paid).

songs: data on individual songs in the Sparkify database. song_id (varchar) PRIMARY KEY; title (varchar) NOT NULL; artist_id (varchar) NOT NULL; year (int); duration (float) NOT NULL: song duration in milliseconds.

artists: data on artists in the Sparkify database. artist_id (varchar) PRIMARY KEY; name (varchar) NOT NULL; location (varchar): name of artist city; latitude (float); longitude (float).

time: timestamps of records in songplays, broken down into specific units. start_time (date) PRIMARY KEY: timestamp of row; hour (int); day (int); week (int): week of year; month (int); year (int); weekday (varchar): name of weekday (e.g. Monday, Tuesday, etc.), each associated to start_time.

## Project structure

Files used on the project:

1. data folder, nested at the home of the project
2. sql_queries.py: contains all SQL queries needed to initiate the tables; imported into the files below
3. create_tables.py: drops and creates tables; this file is run to reset tables each time before the ETL scripts are run
4. test.ipynb: displays the first few rows of each table to inspect the database
5. etl.ipynb: reads and processes a single file from song_data and log_data and loads the data into the tables
6. etl.py: reads and processes all files from song_data and log_data and loads them into your tables
7. README.md: current file; provides an overview of the project

## Step-by-step overview of the project

1. Write DROP, CREATE, and INSERT query statements in sql_queries.py.
2. Run `python create_tables.py` in a terminal.
3. Use the test.ipynb Jupyter notebook to interactively verify that all tables were created correctly.
4. Follow the instructions and complete the etl.ipynb notebook to create the blueprint of the pipeline to process and insert all data into the tables.
5. Verify that the base steps are correct by checking with test.ipynb.
6. Fill in the etl.py script using etl.ipynb as a base.
7. Run the ETL in a terminal (`python etl.py`) and verify the results.

## ETL pipeline

An in-depth breakdown of each step is available in etl.ipynb, but a summary of the overall logic:

1. Connect to the Sparkify database.
2. Cycle through the files under data/song_data, and for each JSON file encountered, send the file to a function called process_song_file.
3. This function loads each song file into a dataframe and selects the fields needed to generate the tables before inserting into the respective tables: song_data = [song_id, title, artist_id, year, duration]; artist_data = [artist_id, artist_name, artist_location, artist_longitude, artist_latitude].
4. Cycle through the files under data/log_data, and for each JSON file encountered, send the file to a function called process_log_file.
5. This function loads each log file where page == NextSong into a dataframe and selects the fields needed to generate the tables before inserting into the respective tables. We convert the ts attribute from unix time to a timestamp; this is then used as the start_time field in the time data, and all additional fields are generated using the datetime library: time_df = [timestamp, hour, day, week_of_year, month, year, weekday]; user_df = [userId, firstName, lastName, gender, level].
6. Look up the songid and artistid from their respective tables by song name, artist name, and song duration in order to complete the songplay fact table. The query used is the following:

```sql
-- song_select
SELECT songs.song_id, artists.artist_id
FROM songs JOIN artists ON songs.artist_id = artists.artist_id
WHERE songs.title = %s AND artists.name = %s AND songs.duration = %s
```

7. The last step is inserting everything into the songplay fact table.

## Project checklist

- [x] Table creation script (create_tables.py) runs without errors
- [x] Fact and dimensional tables for a star schema are properly defined
- [x] ETL script (etl.py) runs without errors
- [x] ETL script properly processes transformations in Python
- [x] The project shows proper use of documentation
- [x] The project code is clean and modular
- [ ] Bonus: insert data using the COPY command to bulk-insert log files instead of using INSERT on one row at a time
- [ ] Bonus: add data quality checks
- [ ] Bonus: create a dashboard for analytic queries on your new database | database postgres python sql | server |
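The lookup-and-insert step described in the ETL pipeline above can be sketched with psycopg2 roughly as follows. The connection string is an assumption, the sample values are taken loosely from the log sample in the row, and the songplay_id scheme is hypothetical; the project's actual code lives in etl.py and sql_queries.py:

```python
# Hedged sketch of the songplay lookup + insert step (assumed DSN and values, not the project's etl.py).
import psycopg2

conn = psycopg2.connect("host=127.0.0.1 dbname=sparkifydb user=student password=student")
cur = conn.cursor()

# Look up song_id and artist_id by song title, artist name, and duration (song_select from above).
song_select = """
    SELECT songs.song_id, artists.artist_id
    FROM songs JOIN artists ON songs.artist_id = artists.artist_id
    WHERE songs.title = %s AND artists.name = %s AND songs.duration = %s
"""
cur.execute(song_select, ("Ain't No Sunshine", "Sydney Youngblood", 238.07955))
match = cur.fetchone()  # None when the log row has no match in the song metadata

if match:
    song_id, artist_id = match
    cur.execute(
        """INSERT INTO songplays (songplay_id, start_time, user_id, level, song_id,
                                  artist_id, session_id, location, user_agent)
           VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s)""",
        ("954-53", "2018-11-29 00:00:57", 73, "paid", song_id,          # "954-53" is a hypothetical ID
         artist_id, 954, "Tampa-St. Petersburg-Clearwater, FL", "Mozilla/5.0"),
    )
conn.commit()
cur.close()
conn.close()
```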
ionic | # Ionic

[Ionic](https://ionic.xgqfrms.xyz) ❤️ advanced HTML5 mobile development framework and SDK. CN Ionic website: https://xgqfrms.github.io/ionic. Demo: https://creator.ionic.io/share/a8419c08802c. Publish: https://creator.ionic.io/app/dashboard/projects | | front_end |
STM32PlayBoard_v1 | # STM32PlayBoard_v1

## Project overview

An STM32-based development board for boundless exploration in embedded systems.

Introducing the STM32PlayBoard. As the name suggests, the main object of developing this board is to expand my knowledge and provide an entry gate for others to learn embedded systems with a well-known and well-documented microcontroller by STMicroelectronics. This project opens the door to immersive learning experiences and practical experimentation.

[Board photos: front (images/BoardFront.png) and back (images/BoardBack.png).]

## Why I designed the STM32PlayBoard: a catalyst for growth and innovation

The development of the STM32PlayBoard was rooted in a dual ambition: to push the boundaries of my expertise in PCB design while creating a versatile tool that resonates with my journey in embedded systems.

1. Elevating PCB design expertise: this project moved me into a relatively complex PCB design. It also provided a stage to experiment with diverse PCB manufacturing and assembly services, broadening my understanding of the complexities of hardware production.
2. Fulfilling a learning trip: while learning about ARM Cortex-M architectures I have always used a variety of STMicroelectronics boards; recently I wanted to develop a basic yet peripheral-rich custom board.

## Design philosophy

I pictured a compact design for the STM32PlayBoard that prioritized accessibility to a wide range of communication protocols while offering dedicated pins for the ADC, timers, and some general-purpose I/Os. This careful pin allocation aimed to balance versatility, functionality, and ease of PCB design. By providing exposure to commonly used communication interfaces and essential GPIO options, the board becomes an adaptable platform for various applications.

Additionally, the compact nature of the design wasn't just about efficient use of space; it also served a practical purpose. I aimed to create a development board that could seamlessly integrate into my other projects as a convenient replacement. This design approach allows for easy adaptability across different projects, simplifying development and minimizing the need for significant redesigns.

## System architecture

[Block diagram: images/BlockDiagram.png]

## Board component selection

[Board components: images/BoardComponents.png]

Microcontroller:

1. The board's core is an STM32F103C8T6 microcontroller from STMicroelectronics, running with a 16 MHz high-speed external oscillator.
2. The board has a configurable BOOT0 for different operational modes.
3. A physical reset button allows manually resetting the board.

Power supply:

1. The board can be powered by an external voltage supply and through micro-B USB; both are reverse-polarity protected and can coexist.
2. Voltage regulation is accomplished using an AMS1117-3.3 linear voltage regulator from Advanced Monolithic Systems. Since the main component on the board is the microcontroller IC, I went with this standard, less expensive, and easily available LDO.
3. The maximum voltage that can be provided externally is 18 V, and the maximum dropout voltage of this LDO is 1.2 V at 1 A; we are going to be far from this current consumption using this board.
4. The silkscreen on the back side of the PCB shows 15 V; this is just an additional safety measure not to exceed 15 V, even though up to 18 V is acceptable.

USB:

1. The board supports the USB 2.0 full-speed interface using a micro-B connector.
2. The board provides ESD protection on USB using a USBLC6 IC from STMicroelectronics.

Onboard LEDs:

1. D3 is the onboard status LED and is connected to pin PA1; the cathode of the LED is permanently grounded.
2. D2 is the power LED, indicating a healthy power supply.

## PCB design and schematics

Schematics: https://github.com/lalitk-space/STM32PlayBoard_v1/blob/main/STM32PlayBoard-Schematics/STM32PlayBoard_v1.pdf

Board layout considerations:

1. The PCB is a two-layer board; the reason for two layers is that I wanted a simple board design, which also minimizes the production cost.
2. The entire bottom layer of the board is grounded, and the top layer is the signal layer.
3. A wide trace of 0.5 mm is used for power and a 0.3 mm trace for signals.
4. The board dimensions are 56 x 36 mm.

[Copper layers: front (images/Front_Cu.png) and back (images/Back_Cu.png).]

## Interfacing and capabilities

1. This board functions as a development board with GPIO and communication protocols exposed. The board exposes:
   * 1 SPI connection (SPI1)
   * 1 I2C connection (I2C2)
   * 1 USART connection (USART1)
   * SWD for programming and debugging
   * 1 ADC (ADC1_IN0)
   * 2 timers (TIM1_CH1 and TIM1_CH2)
   * 4 GPIOs from port B (PB3, PB4, PB5, PB6)
   * an onboard LED on pin PA1
2. The voltage level for all the pins is 3.3 V; for interfacing with other 5 V-level-compatible boards (for example, Arduino), a logic level shifter/converter is required.

## Software development

The board can be programmed using an external ST-Link programmer via the SWD interface.

## Technical specification

Input voltages (max values): through USB connector, 5 V; through power connector, 18 V.

GPIO tolerances (max values): all pins, 3.3 V.

Memory sizes: flash 64 Kbytes; SRAM 20 Kbytes.

Exposed peripherals and pin assignments: I2C2_SDA (PB11), I2C2_SCL (PB10), USART1_RX (PB7), USART1_TX (PB6), SPI1_NSS (PA4), SPI1_SCK (PA5), SPI1_MISO (PA6), SPI1_MOSI (PA7), status LED D3 (PA1), ADC1_IN0 (PA0), TIM1_CH1 (PA8), TIM1_CH2 (PA9), GPIO1/USART1_TX (PB6), GPIO2 (PB5), GPIO3 (PB4), GPIO4/SWO (PB3).

## Board bring-up

[Manufactured and assembled boards: images/AssembledBoards.jpg]

Powering up and blinking the onboard LED: it's alive! [images/Board_BringUp_Blink.gif]

## My learnings

Throughout the development of this board I discovered a lot about designing a PCB with a microcontroller on it, and about various design practices like decoupling and bulk capacitors. I also wanted to learn to program an STMicroelectronics microcontroller using an external programmer, and I achieved this using this board. As for software development skills, I am writing my own drivers for various peripherals and example code for this board, which are available in this GitHub repository.

## Foundations for the future

Keeping this project as a base, I would like to design and develop another STM32-based development board with a different, more powerful microcontroller and various onboard peripherals, like flash memory and some onboard sensors, communicating with the main IC using multiple communication protocols. | | os |
GADS-2020-Phase2-Project | # GADS 2020 Phase 2 Project: codifying two labs' instructions from Qwiklabs

This project is part of the cloud track of the GADS 2020 program. It involves codifying the instructions found in the Qwiklabs labs on using Google Cloud Platform resources covered during the program. As part of this project, I codified the following instructions using Markdown:

* Creating Virtual Machines lab: translations/Creating_VMs.md
* Working with Virtual Machines lab: translations/Working_with_VMs.md

Built with: Markdown page, Qwiklabs instructions.

Usage: just navigate through the page.

Author: Bello Babakolo. GitHub: [belsman](https://github.com/belsman); Twitter: [@d_belsman](https://twitter.com/d_belsman); LinkedIn: [Bello Babakolo](https://www.linkedin.com/in/bello-babakolo-b23b17145).

Contributing: contributions, issues, and feature requests are welcome! Feel free to check the issues page.

Show your support: give a ⭐ if you like this project.

Acknowledgments: [Google Africa Developer Scholarship](https://www.opportunitiesforafricans.com/the-google-africa-developer-scholarship/), [Pluralsight](https://app.pluralsight.com/library). | | cloud |
parity-ethereum | # Parity Ethereum

## The fastest and most advanced Ethereum client

[Download the latest release](https://github.com/paritytech/parity-ethereum/releases/latest)

[Logo (docs/logo-parity-ethereum.svg), GitLab build badge, and GPL v3 license badge removed in this cleanup.]

Table of contents: 1. Description; 2. Technical overview; 3. Building (3.1 build dependencies, 3.2 building from source code, 3.3 simple one-line installer for Mac and Linux, 3.4 starting Parity Ethereum); 4. Testing; 5. Documentation; 6. Toolchain; 7. Community; 8. Contributing; 9. License.

## 1. Description

Built for mission-critical use: miners, service providers, and exchanges need fast synchronisation and maximum uptime. Parity Ethereum provides the core infrastructure essential for speedy and reliable services.

* clean, modular codebase for easy customisation
* advanced CLI-based client
* minimal memory and storage footprint
* synchronise in hours, not days, with warp sync
* modular for light integration into your service or product

## 2. Technical overview

Parity Ethereum's goal is to be the fastest, lightest, and most secure Ethereum client. We are developing Parity Ethereum using the sophisticated and cutting-edge Rust programming language. Parity Ethereum is licensed under the GPLv3 and can be used for all your Ethereum needs.

By default, Parity Ethereum runs a JSON-RPC HTTP server on port 8545 and a WebSockets server on port 8546. This is fully configurable and supports a number of APIs.

If you run into problems while using Parity Ethereum, check out the [wiki for documentation](https://wiki.parity.io), feel free to [file an issue in this repository](https://github.com/paritytech/parity-ethereum/issues/new), or hop on our [Gitter](https://gitter.im/paritytech/parity) or [Riot](https://riot.im/app/#/group/+parity:matrix.parity.io) chat room to ask a question. We are glad to help! For security-critical issues, please refer to the security policy outlined in [SECURITY.md](SECURITY.md).

Parity Ethereum's current beta release is 2.6. You can download it at the [releases page](https://github.com/paritytech/parity-ethereum/releases) or follow the instructions below to build from source. Please mind the [CHANGELOG.md](CHANGELOG.md) for a list of all changes between different versions.

## 3. Building

### 3.1 Build dependencies

Parity Ethereum requires the latest stable Rust version to build. We recommend installing Rust through [rustup](https://www.rustup.rs). If you don't already have rustup, you can install it like this:

Linux:
```bash
curl https://sh.rustup.rs -sSf | sh
```
Parity Ethereum also requires the gcc, g++, pkg-config, file, make, and cmake packages to be installed.

macOS:
```bash
curl https://sh.rustup.rs -sSf | sh
```
clang is required; it comes with the Xcode command-line tools or can be installed with Homebrew.

Windows: make sure you have Visual Studio 2015 with C++ support installed. Next, download and run the rustup installer from https://static.rust-lang.org/rustup/dist/x86_64-pc-windows-msvc/rustup-init.exe, start the "VS2015 x64 Native Tools Command Prompt", and use the following command to install and set up the MSVC toolchain:
```bash
rustup default stable-x86_64-pc-windows-msvc
```
Once you have rustup installed, you need to install [Perl](https://www.perl.org) and [Yasm](https://yasm.tortall.net). Make sure that these binaries are in your PATH. After that, you should be able to build Parity Ethereum from source.

### 3.2 Build from source code

```bash
# download Parity Ethereum code
git clone https://github.com/paritytech/parity-ethereum
cd parity-ethereum

# build in release mode
cargo build --release --features final
```
This produces an executable in the ./target/release subdirectory.

Note: if cargo fails to parse the manifest, try:
```bash
~/.cargo/bin/cargo build --release
```
Note: when compiling a crate and you receive errors, it's in most cases your outdated version of Rust, or some of your crates have to be recompiled. Cleaning the repository will most likely solve the issue:
```bash
cargo clean
```
This always compiles the latest nightly builds. If you want to build stable or beta, do:
```bash
git checkout stable
```
or
```bash
git checkout beta
```

### 3.3 Simple one-line installer for Mac and Linux

```bash
bash <(curl https://get.parity.io -L)
```
The one-line installer always defaults to the latest beta release. To install a stable release, run:
```bash
bash <(curl https://get.parity.io -L) -r stable
```

### 3.4 Starting Parity Ethereum

Manually: to start Parity Ethereum manually, just run
```bash
./target/release/parity
```
and Parity Ethereum begins syncing the Ethereum blockchain.

Using a systemd service file, to start Parity Ethereum as a regular user using systemd init:

1. Copy ./scripts/parity.service to your systemd user directory (usually ~/.config/systemd/user).
2. Copy the release to the bin folder: `sudo install ./target/release/parity /usr/bin/parity`
3. To configure Parity Ethereum, write a /etc/parity/config.toml config file; see [Configuring Parity Ethereum](https://paritytech.github.io/wiki/Configuring-Parity) for details.

## 4. Testing

Download the required test files:
```bash
git submodule update --init --recursive
```
You can run tests with the following commands:

* all packages: `cargo test --all`
* specific package: `cargo test --package <spec>`, replacing `<spec>` with one of the packages from the package list below, e.g. `cargo test --package evmbin`

You can show your logs in the test output by passing `--nocapture`, i.e. `cargo test --package evmbin -- --nocapture`.

## 5. Documentation

Official website: https://parity.io. Be sure to check out [our wiki](https://wiki.parity.io) for more information.

Viewing documentation for Parity Ethereum packages: you can generate documentation for Parity Ethereum Rust packages, automatically opened in your web browser, using [rustdoc with cargo](https://doc.rust-lang.org/rustdoc/what-is-rustdoc.html#using-rustdoc-with-cargo):

* all packages: `cargo doc --document-private-items --open`
* specific package: `cargo doc --package <spec> --document-private-items --open`

Use `--document-private-items` to also view private documentation and `--no-deps` to exclude building documentation for dependencies, replacing `<spec>` with one of the packages below, e.g. `cargo doc --package parity-ethereum --open`.

Package list:

* parity-ethereum: Parity Ethereum (ethcore) client application
* ethcore-accounts, ethkey-cli, ethstore, ethstore-cli: Parity Ethereum account management, key management tool, and keys generator
* chainspec: Parity chain specification
* cli-signer, parity-rpc-client: Parity CLI signer tool and RPC client
* ethash: Parity Ethereum Ethash and ProgPoW implementations
* ethcore: Parity ethcore library
* ethcore-blockchain: blockchain database, test generator, configuration, caching, importing blocks, and block information
* ethcore-call-contract: contract calls and blockchain service/registry information
* ethcore-db: database access utilities, database cache manager
* evm: Ethereum Virtual Machine (EVM) Rust implementation
* ethcore-light: light client implementation
* node-filter: smart-contract-based node filter; manages permissions of network connections
* ethcore-private-tx: Parity private transactions
* ethcore-service: client network service creation and registration with the I/O subsystem
* ethcore-sync: blockchain synchronization
* common-types: common types
* vm: virtual machines (VM) support library
* wasm: WASM interpreter
* pwasm-run-test: WASM test runner
* evmbin: Parity EVM implementation
* parity-ipfs-api: IPFS-compatible API
* ethjson: JSON deserialization
* parity-machine: state machine generalization for consensus engines
* ethcore-miner: miner interface (local-store, price-info, ethcore-stratum, using-queue)
* ethcore-logger: logger implementation
* parity-clib: C bindings library for the Parity Ethereum client
* parity-rpc: JSON-RPC servers
* ethcore-secretstore: secret store
* parity-updater, parity-hash-fetch: updater service and hash fetch
* Parity core libraries (parity-util): ethcore-bloom-journal, blooms-db, dir, eip-712, fake-fetch, fastmap, fetch, ethcore-io, journaldb, keccak-hasher, len-caching-lock, macros, memory-cache, memzero, migration-rocksdb, ethcore-network, ethcore-network-devp2p, panic-hook, patricia-trie-ethereum, registrar, rlp-compress, rlp-derive, parity-runtime, stats, time-utils, triehash-ethereum, unexpected, parity-version

Contributing to documentation for Parity Ethereum packages: [document source code](https://doc.rust-lang.org/1.9.0/book/documentation.html) for Parity Ethereum packages by annotating the source code with documentation comments. Example (generic documentation comment):

```markdown
/// Summary
///
/// Description
///
/// # Panics
/// # Errors
/// # Safety
/// # Examples
/// Summary of example 1
/// ```rust
/// // insert example 1 code here for use with documentation as tests
/// ```
```

## 6. Toolchain

In addition to the Parity Ethereum client, there are additional tools in this repository:

* [evmbin](evmbin): Parity Ethereum EVM implementation
* [ethstore](accounts/ethstore): Parity Ethereum key management
* [ethkey](accounts/ethkey): Parity Ethereum keys generator

The following tools are available in separate repositories:

* [ethabi](https://github.com/paritytech/ethabi): Parity Ethereum encoding of function calls; [docs here](https://crates.io/crates/ethabi)
* [whisper](https://github.com/paritytech/whisper): Parity Ethereum Whisper-v2 PoC implementation

## 7. Community

Join the chat! Questions? Get in touch with us on Gitter: [parity](https://gitter.im/paritytech/parity), [parity.js](https://gitter.im/paritytech/parity.js), [parity-miners](https://gitter.im/paritytech/parity-miners), [parity-poa](https://gitter.im/paritytech/parity-poa). Alternatively, join our community on Matrix: [Riot +parity:matrix.parity.io](https://riot.im/app/#/group/+parity:matrix.parity.io).

## 8. Contributing

An introduction has been provided in the "So You Want to be a Core Developer" presentation slides by Hernando Castano: http://tiny.cc/contrib-to-parity-eth. Additional guidelines are provided in [CONTRIBUTING.md](.github/CONTRIBUTING.md). See also the contributor code of conduct: [CODE_OF_CONDUCT.md](.github/CODE_OF_CONDUCT.md).

## 9. License

See the [LICENSE](LICENSE) file. | ethereum blockchain rust client node | blockchain |
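Once a node from the row above is running, the JSON-RPC server on port 8545 can be exercised from any HTTP client. A quick Python check; `eth_blockNumber` is standard Ethereum JSON-RPC, while the locally running node and the default port are assumptions:

```python
# Ping a locally running node over JSON-RPC (assumes the default HTTP port 8545 is open).
import json
import urllib.request

payload = {"jsonrpc": "2.0", "method": "eth_blockNumber", "params": [], "id": 1}
req = urllib.request.Request(
    "http://127.0.0.1:8545",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    reply = json.load(resp)

print(int(reply["result"], 16))  # latest block number; the node returns it hex-encoded
```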
AIT-MSc-Databases | # MSc in Applied Software Engineering: Data Architecture and Database Systems

Joe O'Regan (A00258304), Athlone Institute of Technology, 2018. Weekly labs and assignments for the AIT MSc in Applied Software Engineering, Data Architecture and Database Systems module. | | server |
Hands-On-Mobile-and-Embedded-Development-with-Qt-5 | # Hands-On Mobile and Embedded Development with Qt 5

This is the code repository for [Hands-On Mobile and Embedded Development with Qt 5](https://www.packtpub.com/application-development/hands-mobile-and-embedded-development-qt-5), published by Packt: build apps for Android, iOS, and Raspberry Pi with C++ and Qt. [Cover image removed in this cleanup.]

## What is this book about?

Qt is a world-class framework helping you to develop rich graphical user interfaces (GUIs) and multi-platform applications that run on all major desktop platforms and most mobile or embedded platforms. The framework helps you connect the dots across platforms and between online and physical experience.

This book covers the following exciting features:

* explore the latest features of Qt, such as the preview of Qt for Python and Qt for WebAssembly
* create fluid UIs with a dynamic layout for different-sized screens
* deploy embedded applications on Linux systems using Yocto
* design Qt APIs for building applications for embedded and mobile devices
* utilize connectivity for networked and machine-automated applications

If you feel this book is for you, get your [copy](https://www.amazon.com/dp/1789614813) today!

## Instructions and navigations

All of the code is organized into folders, for example, Chapter02. The code will look like the following:

```cpp
if (QTouchScreen::devices().isEmpty())
    qApp->setStyleSheet("QButton { padding: 10px; }");
```

Following is what you need for this book: the book is ideal for mobile developers, embedded systems engineers, and enthusiasts who are interested in building cross-platform applications with Qt. Prior knowledge of C++ is required.

With the following software and hardware list you can run all code files present in the book (Chapters 1-15):

* Chapter 1: Qt 5.12, Qt Creator (Windows, Mac OS X, and Linux, any)
* Chapter 2: Qt 5.12 (Windows, Mac OS X, and Linux, any)
* Chapter 3: Qt, QtGraphicalEffects 5.12 (Windows, Mac OS X, and Linux, any)
* Chapter 4: Qt, Qt Virtual Keyboard (Windows, Mac OS X, Linux, iOS, or Android)
* Chapter 5: Qt (Windows, Mac OS X, and Linux, any)
* Chapter 6: Qt, QtConnectivity
* Chapter 7: Qt, QtSensors, QtWebSockets, QtMqtt (Windows, Mac OS X, Linux, iOS, or Android)
* Chapter 8: Qt, QtLocation (Windows, Mac OS X, and Linux, any)
* Chapter 9: Qt, QtMultimedia (Windows, Mac OS X, and Linux, any)
* Chapter 10: Qt, SQLite, MySQL (Windows, Mac OS X, and Linux, any)
* Chapter 11: Qt, QtPurchasing (iOS or Android)
* Chapter 12: Qt, Boot 2 Qt, Yocto, Buildroot, crosstool-NG, Xcode, Android Studio (Windows, Mac OS X, and Linux, any)
* Chapter 13: Qt, Sailfish OS SDK, UBports SDK (Windows, Mac OS X, and Linux, any)
* Chapter 14: Qt 5.13, Emscripten, web browser (Linux)
* Chapter 15: Qt, Boot 2 Qt, Yocto (Linux)

We also provide a PDF file that has color images of the screenshots/diagrams used in this book. [Click here to download it](https://www.packtpub.com/sites/default/files/downloads/9781789614817_colorimages.pdf).

## Errata

There is a spacing issue in the code snippets present in the book. We advise you to use the code files present on GitHub or in the code bundle of the book.

## Related products

* Mastering Qt 5: [Packt](https://www.packtpub.com/application-development/mastering-qt-5), [Amazon](https://www.amazon.com/dp/1786467127)
* Qt 5 Projects: [Packt](https://www.packtpub.com/application-development/qt-5-projects), [Amazon](https://www.amazon.com/dp/1788293886)

## Get to know the author

Lorn Potter is a software developer specializing in Qt and QML on mobile and embedded devices with his company, LLornKcor Technologies. He has worked for Trolltech, Nokia, and Canonical, and was a freelance contractor for Jolla, Intopalo, and The Qt Company. He is the official maintainer of Qt Sensors, for which he developed the QSensorGestures API. He maintains the unsupported QtSystemInfo for the open source Qt Project and also works on Qt Bearer Management and Qt for WebAssembly. He has written blogs and articles for the Linux Journal. He started his career in tech as Trolltech's Qtopia community liaison. He currently resides in Australia and spends his spare time recording electronic psybient music for the project Brog on his website.

## Suggestions and feedback

[Click here](https://docs.google.com/forms/d/e/1faipqlsdy7datc6qmel81fiuuymz0wy9vh1jhkvpy57oimekgqib-ow/viewform) if you have any feedback or suggestions.

### Download a free PDF

If you have already purchased a print or Kindle version of this book, you can get a DRM-free PDF version at no cost. Simply click on the link to claim your free PDF: https://packt.link/free-ebook/9781789614817 | | front_end |
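The C++ snippet in the row above adjusts button padding when a touch device is present. A rough Python/PyQt5 analogue of the same idea, offered purely as my substitution (the book itself works in C++, and class names vary across Qt versions):

```python
# PyQt5 analogue of the book's touch-aware stylesheet snippet (my assumption; the book uses C++).
import sys
from PyQt5.QtGui import QTouchDevice
from PyQt5.QtWidgets import QApplication, QPushButton

app = QApplication(sys.argv)
if QTouchDevice.devices():  # a non-empty list means a touch device is attached
    app.setStyleSheet("QPushButton { padding: 10px; }")  # bigger tap targets on touch screens

button = QPushButton("Tap me")
button.show()
sys.exit(app.exec_())
```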
com.nimbits | # nimbits.com

The Nimbits project has been archived. Nimbits was an open source project founded in 2001 and has gone through many iterations over the last two decades. It was originally a data acquisition system meant for process control and automation, and it evolved into an IoT API. It has been a lot of fun for me personally to develop the platform, and it made me the software engineer I am today. Today there are many options for doing similar things to what Nimbits did; I recommend looking at https://nodered.org and https://www.stringify.com. Thank you for your support over the years. Ben | | server |
Blockchain | blockchain web3 blockchain intro which blockchain tokens cryptoeconomics smart contracts dapps daos icos infographic https coinhooked com introduction e gov blockchain https blockchainhub net blockchain intro free course already registered https academy b9lab com courses b9lab x16 0 2016 courseware 350259f977104a77a5708ac18c38824a blockchain good introduction https blog craftworkz co dip your toe into smart contracts with ethereum cf8aa476d0d7 website vs dapp front end api database a dapp is very similar to a traditional web application the front end uses the exact same technology to render the page the one critical difference is that instead of an api connecting to a database you have a smart contract connecting to a blockchain you can think of a dapp like this front end smart contract blockchain a dapp is a blockchain enabled website where the smart contract is what allows it to connect to the blockchain smart contract is like a character dapp is like a word the difference is not in the length but in the depth of meaning the later has a complete one makes sense as is the former doesn t it is supposed to be an actor in the system but at the different hierarchy level dapps is decentralized applications built on top of blockchain technology some dapps are built on it s own blockchain while most are based on popular blockchain networks like bitcoin ethereum use the tools in the sidebar to find dapps with specific features built on top of certain blockchains https medium com micheledaliessi how does the blockchain work 98c8cd01d2ae dapp ecosystem decentralized data decentralized wealth decentralized identity decentralized computing decentralized bandwidth decentralized markets for decentralized assets practical decentralization dapp economics an introduction to ethereum and smart contracts a programmable blockchain https auth0 com blog an introduction to ethereum and smart contracts part 2 an introduction to ethereum and smart contracts bitcoin the blockchain https auth0 com blog an introduction to ethereum and smart contracts ethereum https medium com consensys very deep dive on ethereum reading list f5b1122e5990 https ethereum github io go ethereum https ethereum org greeter https ethereum org token https ethereum org crowdsale https ethereum org dao pyethereum https github com ethereum pyethereum https medium com wearetheledger live from eventhorizon blockchain energy 644149f8331b semantic ethereum https media consensys net ethon introducing semantic ethereum 15f1f0696986 smart contracts dapp frameworks and tools https dappsforbeginners wordpress com solidity ethereum clients ethereum has several different client implementations meaning ways to run a node to interact with the ethereum network including c go python java haskell etc a gui based client in development alethzero the go language one go ethereum http ethereum github io go ethereum on other days a tool called testrpc that uses the python client pyethereum https github com ethereum pyethereum interactive console https github com ethereum go ethereum wiki javascript console json rpc https github com ethereum wiki wiki json rpc running a node on a test network if you install a client like geth and run it on the live network it will take a while to download the entire blockchain and sync with the network testrpc you can run a test network using geth or another fast way of getting a testnet running is using testrpc smart contract languages to write smart contracts there are a few different languages solidity which is like 
javascript and has sol as a file extension serpent python like with extension se solc compiler after writing a contract in solidity use solc to compile it it s from the c libraries different implementations complementing each other again which can be installed here web3 js api https github com consensys smart contract best practices dapp building frameworks trufle and embark the one that got me started is truffle before truffle i watched a group of smart student interns last summer code stuff for a sleepless hackathon albeit with terrific results and shrank back in fear then truffle came along and did a lot of the nitty gritty stuff for you so you can start writing compiling deploying testing building dapps right away another very similar framework for building and testing dapps is embark between those two i ve only used truffle but there are very successful dapp devs in both camps meteor another stack a lot of dapp devs use include web3 js meteor which is a general webapp framework the ethereum meteor wallet repo has a good starter example and silentciero is building a lot of meteor integrations with web3 js and dapp boilerplates apis blockapps net is creating a restful api for dapps based on a haskell node they run as a centralized service to save you the trouble of running a local ethereum node this departs from the completely decentralized model of dapps but is useful when running an ethereum node locally isn t realistic for example if you want to serve your dapp to users who won t be running local nodes either and reach a wider audience with just a web browser or mobile device blockapps has a command line tool called bloc in the works that can be used after creating a developer account with them smart contract ide ides there s a mix ide for writing contracts put out by ethereum haven t tried it but will soon browser based ides the solidity real time compiler and cosmo are both a fast way to get started compiling your smart contracts right away in a browser you can even point your local node at these hosted instances by opening up a port you should trust the site and not have your life savings in ether on your local node for that see the cosmo ui for instructions on how to do this with geth but once your contract is working ok it s nice to use a framework for adding a ui and packaging it all up as a dapp which is what truffle does and will be explained in the programming part later another powerful enterprise y browser ide is in the works by ether camp their ide comes with a sandbox test network with an auto generated gui for testing instead of writing tests manually as shown in the tutorial later as well as a sandbox transaction explorer at test ether camp when you re ready to deploy your contract for semi real using their testnet can a good way to confirm your smart contract s working as expected on a closer to real testbed the same explorer for the live ethereum network is at frontier ether camp and it shows details about every transaction ever ether camp s ide is invite only for eager guinea pigs at time of writing but will be launched soon workflow for deploying smart contracts the workflow is 1 start an ethereum node e g geth or testrpc 2 compile your solidity smart contract using solc get back the binary 3 deploy your compiled contract to the network this step costs ether and signs the contract using your node s default wallet address or you can specify another address get back the contract s blockchain address and abi a json ified representation of your compiled contract s 
variables events and methods that you can call 4 call stuff in the contract using web3 js s javascript api to interact with it this step may cost ether depending on the type of invocation 1 https consensys github io developers images front end png truffle for many types of dapps distributed apps truffle does everything you could want it compiles your blockchain contracts injects them into your web app and can even run a test suite against them http truffle readthedocs io en latest https metamask io metamask with metamask all your users need to do is install the chrome plugin and they will have their own secure blockchain accounts right there in the convenience of their browsers https medium com metamask developing ethereum dapps with truffle and metamask aa8ad7e363ba truffle tricks for ethereum development dispelling 8 myths first impressions https media consensys net truffle tricks for ethereum development dispelling 8 myths first impressions ecb3edf88207 sample app ethereum wallet app https github com ethereum meteor dapp wallet for net development nethereum http www nethereum com bringing the love of ethereum to net solidity integration with visual studio https media consensys net solidity integration with visual studio 7f25ea1bde71 openbazaar https openbazaar org why make openbazaar what is openbazaar how does openbazaar work how to install openbazaar what could openbazaar have done better la zooz http lazooz org what is la zooz ux hyperledger fabric vs ethereum vs ripple vs bitcoin https goupadhyblog wordpress com 2017 01 16 hyperledger vs ehtereum vs ripple vs bitcoin https goupadhyblog wordpress com 2017 02 10 consensus in hyperledger based bluemix blockchain apps https goupadhyblog wordpress com 2017 02 08 setup blockchain network in minutes using ibm bluemix https goupadhyblog wordpress com 2017 02 07 how to write smart contract chaincode in hyperledger based blockchain applications https goupadhyblog wordpress com 2017 02 02 hyperledger based bluemix blockchain application architecture and components https goupadhyblog wordpress com 2017 01 31 what blockchain is not https goupadhyblog wordpress com 2017 01 16 hyperledger vs ehtereum vs ripple vs bitcoin https goupadhyblog wordpress com 2017 01 15 potential blockchain use cases video bitcoin and cryptocurrency technologies https www coursera org learn cryptocurrency home welcome ibm blockchain for developers https developer ibm com courses all courses blockchain for developers 3 free courses http courses blockgeeks com collections bitcoin the blockchain https onemonth com courses blockchain video lectures and courses https github com digital dreamer blockchain programming wiki video lectures and courses welcome to ethereum 101 https academy b9lab com courses b9lab x16 0 2016 info blockchain university courses http blockchainu co upcoming investigating the potential of blockchains http blockchain open ac uk blockchain academy http www blockchain org in microsoft blockchain as a service https mva microsoft com en us training courses microsoft blockchain as a service 17104 l azrqbg3sd 3206218965 free videos http www blockchainworkspace com training free blockchain university https www youtube com channel ucj5uhx90mzglk0lc gsmtzw ethereum https www youtube com user ethereumproject lighthouse https bravenewcoin com news lighthouse tackles decentralized crowdfunding http bitcoinist com decentralized crowdfunding app lighthouse getting traction https techcrunch com 2014 05 23 lighthouse is a crowdfunding platform built on top of bitcoin 
functionality spv wallets identity url a 101 noob intro to programming smart contracts on ethereum https consensys github io developers articles 101 noob intro free course http courses blockgeeks com courses blockchain faqs answered in 1 hour http courses blockgeeks com courses blockchain glossary learn blockchain frequently used terms https github com oreillymedia decentralized applications the supply circle how blockchain technology disintermediates the supply chain https media consensys net the supply circle how blockchain technology disintermediates the supply chain 6a19f61f8f35 what is ethereum and how does it work https dinardirham com blog what is ethereum and how does it work dapps decentralized apps https cryptojunction com dapps course https academy b9lab com courses course v1 b9lab eth 11 2017 04 about https academy b9lab com courses b9lab x16 0 2016 about https academy b9lab com courses b9lab cto1 2017 03 about https academy b9lab com courses course v1 b9lab eth 12 2017 05 about https academy b9lab com courses course v1 b9lab eth 13 2017 06 about https academy b9lab com courses course v1 b9lab cto3 2017 05 about https academy b9lab com courses course v1 b9lab eth 14 2017 07 about http courses blockgeeks com http courses blockgeeks com courses ultimate blockchain course building blockchain application aws http courses blockgeeks com courses best solidity tutorial for ethereum smart contracts on internet http courses blockgeeks com courses ethereum developer https www udemy com ethereum developer https www udemy com the basics of blockchain https www udemy com blockchain101 https www udemy com ethereum https www udemy com blockchain developer https www udemy com best solidity tutorial course ethereum blockchain development free course http courses blockgeeks com courses take blockchain faqs answered in 1 hour lessons 990989 1 introduction overview http courses blockgeeks com courses take blockchain glossary learn blockchain frequently used terms lessons 991023 1 introduction overview http courses blockgeeks com courses take bitcoin and cryptocurrency technologies online course texts 893555 intro to crypto and cryptocurrencies https academy b9lab com courses b9lab x16 0 2016 courseware 350259f977104a77a5708ac18c38824a enterprise blockchain consortiums http bankinnovation net 2017 03 enterprise blockchain consortiums part 1 blockchain a distributed ledger https mastanbtc github io blockchainnotes blockchainintro what does 100 ether mean https medium com humanizing the singularity what does ether 100 mean bb58522f781e ethereum growing exponentially in china https medium com andrewkeys 88339 ethereum growing exponentially in china 31f1d24c8ee9 https medium com tag ethereum https medium com tag blockchain how does the blockchain work for dummies explained simply https medium com the intrepid review how does the blockchain work for dummies explained simply 9f94d386e093 blockchain tokens and the dawn of the decentralized business model https blog coinbase com app coins and the dawn of the decentralized business model 8b8c951e734f how to raise money on a blockchain with a token https blog gdax com how to raise money on a blockchain with a token 510562c9cdfa blockchain the pro s and con s of a technology that will affect our future https medium com totvslabs blockchain the pros and con s of a technology that will affect our future f67037da7d64 blockchain can revolutionize ehrs with optimum security and interoperability https dzone com articles blockchain can revolutionize ehrs with 
optimum sec blockchain everything you need to know https dzone com articles blockchain everything you need to know fromrel true smart contract security how to never break the blockchain https dzone com articles smart contract security how to never break the blo fromrel true berlin the blockchain capital of the world https dzone com articles berlin the blockchain capital of the world fromrel true https assets weforum org editor 6psx2aynmuhnu lu7q8wx2ainuqbluejm2guq1sfps png http arch doc rchain coop en latest images comparison table png | hyperledger blockchain ethereum dapp solidity truffle blockchain-contracts blockchain-website openbazaar | blockchain |
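The four-step deployment workflow in the notes above (start a node, compile with solc, deploy, call) can be sketched in Python with web3.py. The notes themselves use web3.js, so web3.py is my substitution here; the contract artifact filenames are hypothetical outputs of a prior solc run, and `greet()` presumes the classic Greeter example contract:

```python
# Sketch of the compile -> deploy -> call workflow using web3.py (substituted for the web3.js
# mentioned above). Assumes a local node on port 8545 and solc output already saved to disk.
import json
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("http://127.0.0.1:8545"))  # step 1: the node is already running

with open("greeter.abi") as f:    # step 2 output: ABI produced by solc (hypothetical filename)
    abi = json.load(f)
with open("greeter.bin") as f:    # step 2 output: compiled bytecode (hypothetical filename)
    bytecode = f.read().strip()

# Step 3: deploy. This transaction costs ether and is signed by the node's default account.
contract = w3.eth.contract(abi=abi, bytecode=bytecode)
tx_hash = contract.constructor().transact({"from": w3.eth.accounts[0]})
receipt = w3.eth.wait_for_transaction_receipt(tx_hash)  # web3.py v6 naming; older versions use camelCase

# Step 4: interact with the deployed contract at its new address.
greeter = w3.eth.contract(address=receipt.contractAddress, abi=abi)
print(greeter.functions.greet().call())  # read-only call: free, no gas spent
```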
HoloLensForCV | ---
page_type: sample
name: HoloLens2ForCV samples
description: HoloLens research mode samples for building and deploying projects with OpenCV, with sensor data streaming and recording.
languages: cpp
products: windows-mixed-reality, hololens
---

# HoloLensForCV samples

We want to help people use the HoloLens as a computer vision and robotics research device. The project was launched at CVPR 2017, and we intend to extend it as new capabilities become available on the HoloLens.

## Contents

This repository contains reusable components, samples, and tools aimed to make it easier to use HoloLens as a tool for computer vision and robotics research:

* Take a quick look at the HoloLensForCV UWP component (Shared/HoloLensForCV). This component is used by most of our samples and tools to access, stream, and record HoloLens sensor data.
* Learn how to [build](https://github.com/Microsoft/HoloLensForCV/wiki/Building-the-project) and [deploy](https://github.com/Microsoft/HoloLensForCV/wiki/Running-the-samples) the samples.
* Learn how to use OpenCV on HoloLens (Samples/ComputeOnDevice).
* Learn how to stream (Tools/Streamer) sensor data and how to process it online (Samples/ComputeOnDesktop) on a companion PC.
* Learn how to record (Tools/Recorder) sensor data and how to process it offline (Samples/BatchProcessing) on a companion PC.

## Universal Windows Platform development

All of the samples require Visual Studio 2017 Update 3 and the Windows Software Development Kit (SDK) for Windows 10 to build, test, and deploy your Universal Windows Platform apps. In addition, samples specific to Windows 10 Holographic require a Windows Holographic device to execute. Windows Holographic devices include the Microsoft HoloLens and the Microsoft HoloLens Emulator.

* [Get a free copy of Visual Studio 2017 Community Edition with support for building Universal Windows Platform apps](http://go.microsoft.com/fwlink/?p=LinkID=280676)
* [Install the Windows Holographic tools](https://developer.microsoft.com/windows/mixed-reality/install_the_tools)
* [Learn how to build great apps for Windows by experimenting with our sample apps](https://developer.microsoft.com/en-us/windows/samples)
* [Find more information on the Microsoft developer site](http://go.microsoft.com/fwlink/?LinkID=532421)

Additionally, to stay on top of the latest updates to Windows and the development tools, become a Windows Insider by joining the [Windows Insider Program](https://insider.windows.com).

## Using the samples

The easiest way to use these samples without using Git is to download the ZIP file containing the current version, using the link or by clicking the "Download ZIP" button on the repo page. You can then unzip the entire archive and use the samples in Visual Studio 2017. Notes:

* Before you unzip the archive, right-click it, select Properties, and then select Unblock.
* Be sure to unzip the entire archive, and not just individual samples; the samples all depend on the Shared folder in the archive.
* In Visual Studio 2017, the platform target defaults to ARM, so be sure to change that to x64 or x86 if you want to test on a non-ARM device.

## Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com. When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). For more information, see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments. | | ai |
nlp-in-python-tutorial | welcome to the natural language processing in python tutorial we will be going through several jupyter notebooks during the tutorial and use a number of data science libraries along the way the easiest way to get started is to download anaconda which is free and open source when you download this it comes with the jupyter notebook ide and many popular data science libraries so you don t have to install them one by one here are the steps you ll need to take before the start of the tutorial 1 download anaconda i highly recommend that you download the python 3 7 version https www anaconda com download 2 download the jupyter notebooks clone or download this github repository https github com adashofdata nlp in python tutorial so you have access to all the jupyter notebooks ipynb extension in the tutorial note the green button on the right side of the screen that says clone or download if you know how to use github go ahead and clone the repo if you don t know how to use github you can also just download the zip file and unzip it on your laptop 3 launch anaconda and open a jupyter notebook windows open the anaconda navigator program you should see the jupyter notebook logo below the logo click launch a browser window should open up in the browser window navigate to the location of the saved jupyter notebook files and open 0 hello world ipynb follow the instructions in the notebook mac linux open a terminal type jupyter notebook a browser should open up in the browser window navigate to the location of the saved jupyter notebook files and open 0 hello world ipynb follow the instructions in the notebook 4 install a few additional packages there are a few additional packages we ll be using during the tutorial that are not included when you download anaconda wordcloud textblob and gensim windows open the anaconda prompt program you should see a black window pop up type conda install c conda forge wordcloud to download wordcloud you will be asked whether you want to proceed or not type y for yes once that is done type conda install c conda forge textblob to download textblob and y to proceed and type conda install c conda forge gensim to download gensim and y to proceed mac linux your terminal should already be open type command t to open a new tab type conda install c conda forge wordcloud to download wordcloud you will be asked whether you want to proceed or not type y for yes once that is done type conda install c conda forge textblob to download textblob and y to proceed and type conda install c conda forge gensim to download gensim and y to proceed if you have any issues please email me at adashofdata gmail com or come talk to me before the start of the tutorial | ai |
|
mini-amazon-go | mini amazon go computer vision hackathon mini amazon go setup hardware raspberry pi 4 https www amazon com gp product b07txmdvpq ref ppx yo dt b asin title o00 s00 ie utf8 psc 1 img height 200 alt portfolio view src https user images githubusercontent com 3307514 66869709 c7ce7b00 ef54 11e9 8824 32cbcd40100d png camera module https www amazon com gp product b07pq63d2s ref ppx yo dt b asin title o01 s00 ie utf8 psc 1 img height 200 alt portfolio view src https user images githubusercontent com 3307514 66869685 b8e7c880 ef54 11e9 969c 2ed21178d7ca png touch screen protection case https www amazon com gp product b07wrv48zw ref ppx yo dt b asin title o01 s00 ie utf8 psc 1 aws kinesis video stream on raspberry pi tutorial https docs aws amazon com kinesisvideostreams latest dg producersdk cpp rpi html notes the default region is us west 2 change to nearest region if applicable raspbian os can use apt get to install shared libs to accelerate compilation of kinesis producer c sdk https github com awslabs amazon kinesis video streams producer sdk cpp blob master install instructions linux md install steps for ubuntu 17x and raspbian stretch using apt get setup iam role for kinesis video stream and paste the secret key to raspberry there s a sample app to upload videos to kvs kinesis video gstreamer sample app w width h height f framerate b bitrateinkbps stream name note that not all resolution works 640x480 works on raspberry pi 4 for me or you can use src start upload sh src start upload sh trouble shooting there s potential fix for large latency https docs aws amazon com kinesisvideostreams latest dg troubleshooting html troubleshooting general sagemaker notebook access permission for sagemaker notebook to access kvs it must have permission amazonkinesisvideostreamsfullaccess added to the sagemaker execution role when creating the notebook instance example notebook for processing kvs see videostream example src videostream ipynb example of training and test see src training object detector yolo3 ipynb src training object detector yolo3 ipynb and src test object detector yolo3 ipynb src test object detector yolo3 ipynb yolo3 recommended or see src training object detector ipynb src training object detector ipynb and src test object detector ipynb src test object detector ipynb ssd or see beijing cv hackathon https github com hetong007 d2l 1day cv hackathon | ai |
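One way a SageMaker notebook can consume the stream described above is via an HLS playback URL from Kinesis Video Streams. A minimal sketch, assuming the us-west-2 default region mentioned earlier, a hypothetical stream name, and an execution role that already has AmazonKinesisVideoStreamsFullAccess attached:

```python
# Minimal sketch: fetch an HLS playback URL for the KVS stream so the
# notebook (or OpenCV) can read frames. Stream name is a placeholder.
import boto3

STREAM = "mini-amazon-go"  # hypothetical stream name
kvs = boto3.client("kinesisvideo", region_name="us-west-2")

endpoint = kvs.get_data_endpoint(
    StreamName=STREAM, APIName="GET_HLS_STREAMING_SESSION_URL"
)["DataEndpoint"]

media = boto3.client(
    "kinesis-video-archived-media",
    endpoint_url=endpoint,
    region_name="us-west-2",
)
url = media.get_hls_streaming_session_url(
    StreamName=STREAM, PlaybackMode="LIVE"
)["HLSStreamingSessionURL"]
print(url)  # e.g. pass to cv2.VideoCapture(url) to read frames
```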
|
sync_intern_product_landing_page | welcome to the hacktoberfest 2023 repository hacktoberfest logo assets hacktoberfest wall png iphone 15 selling product webpage readme introduction welcome to the readme for our iphone 15 selling product webpage this readme provides an overview of our product and its accompanying webpage features iphone 15 a brief description of the iphone 15 its key features and what sets it apart from other models product images high quality images of the iphone 15 to showcase its design specifications detailed technical specifications and hardware information pricing information about pricing discounts and available configurations how to purchase guidance on how customers can buy the iphone 15 contact information a way for customers to get in touch with any questions or concerns getting started here you can provide information on how to access and use your webpage prerequisites list any prerequisites users might need like a modern web browser installation if applicable provide instructions on how to install or access the webpage quick start offer a step by step guide for first time users to navigate your webpage usage navigating the webpage describe the user interface and how customers can browse through your product listings making a purchase walk users through the process of making a purchase if applicable contact us explain how customers can reach out for inquiries or support contributing if you d like to contribute to our project we welcome your contributions please follow these steps 1 fork this repository 2 create a new branch for your feature or bug fix git checkout b feature new feature 3 make your changes and test thoroughly 4 commit your changes with clear messages git commit m add new feature xyz 5 push your branch to your fork git push origin feature new feature 6 submit a pull request to the main repository support if you have any questions need assistance or want to report an issue please contact us | hacktoberfest hacktoberfest2023 | front_end |
scrooge | scrooge enable engineering finance product and executive audiences to observe analyze and optimize cloud spending | cloud |
|
MD | md sehatyuk mobile app | front_end |
|
deep-learning-for-natural-language-processing | apress source code this repository accompanies deep learning for natural language processing https www apress com 9781484236840 by palash goyal sumit pandey and karan jain apress 2018 comment cover cover image 9781484236840 jpg download the files as a zip using the green button or clone the repository to your machine using git releases release v1 0 corresponds to the code in the published book without corrections or updates contributions see the file contributing md for more information on how you can contribute to this repository | ai |
|
Simulation-and-modelling-of-natural-processes | simulation and modelling of natural processes | ai |
|
android | header image imgs head en png okcaros readme readme cn md okcaros https www okcaros com is an open source system built on top of lineageos https github com lineageos android 13 specifically customized for automotive use it operates on mainstream android smartphones and facilitates audio video and touch data communication with the in car entertainment by carplay protocol it offers several advantages 1 fast connection speed 4 10 seconds 2 high display compatibility 1 1 resolution matching with the in car entertainment 3 support for video transmission at 60fps frame rate and 25mb bitrate 4 support for lossless pcm audio transmission at 48000 44100hz 16 bit stereo how to use if your in car entertainment supports carplay all you need is a compatible android smartphone download the okcaros rom package flash it onto the phone and connect it to your car 1 official website https www okcaros com 2 list of supported devices https wiki okcaros com devices 3 rom package download https download okcaros com real life demonstrations gesture control gif imgs gesture gif smooth screen sharing gif imgs demo gif building the system yourself if you wish to compile okcaros on your own you should be familiar with source control tools https source android com setup develop start by fetching the okcaros source code locally bash repo init u https github com okcar os android git b okcar 1 0 git lfs this is a lengthy process the entire source tree is nearly 183gb when fully downloaded repo sync once the source code is downloaded you can proceed with compiling okcaros following the okcaros build guide compilation is also a time consuming process on a computer equipped with an amd 7950x cpu and 64gb of ram it takes approximately 50 minutes computers with lower specifications may require several hours or even days for the compilation process | android-auto carplay carplay-reverse in-car-entertainment okcaros | os |
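The audio spec above is easy to sanity-check with back-of-the-envelope arithmetic: raw 16-bit stereo PCM at 48 kHz is a modest, fixed bandwidth, which is why lossless transport over the CarPlay link is practical. (My own arithmetic, interpreting the spec literally; not a figure published by the project.)

```python
# Raw bandwidth of 48 kHz / 16-bit / stereo PCM (my own arithmetic,
# not a figure from the OkCarOS project).
sample_rate = 48_000      # Hz
bytes_per_sample = 2      # 16-bit
channels = 2              # stereo

bytes_per_second = sample_rate * bytes_per_sample * channels
print(bytes_per_second)               # 192000 bytes/s
print(bytes_per_second * 8 / 1e6)     # ~1.54 Mbit/s on the wire
```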
have-fun-with-machine-learning | have fun with machine learning a guide for beginners also available in chinese traditional readme zh tw md also available in korean readme ko kr md preface this is a hands on guide to machine learning for programmers with no background in ai using a neural network doesn t require a phd and you don t need to be the person who makes the next breakthrough in ai in order to use what exists today what we have now is already breathtaking and highly usable i believe that more of us need to play with this stuff like we would any other open source technology instead of treating it like a research topic in this guide our goal will be to write a program that uses machine learning to predict with a high degree of certainty whether the images in data untrained samples data untrained samples are of dolphins or seahorses using only the images themselves and without having seen them before here are two example images we ll use a dolphin data untrained samples dolphin1 jpg raw true dolphin a seahorse data untrained samples seahorse1 jpg raw true seahorse to do that we re going to train and use a convolutional neural network cnn https en wikipedia org wiki convolutional neural network we re going to approach this from the point of view of a practitioner vs from first principles there is so much excitement about ai right now but much of what s being written feels like being taught to do tricks on your bike by a physics professor at a chalkboard instead of your friends in the park i ve decided to write this on github vs as a blog post because i m sure that some of what i ve written below is misleading naive or just plain wrong i m still learning myself and i ve found the lack of solid beginner documentation an obstacle if you see me making a mistake or missing important details please send a pull request with all of that out the way let me show you how to do some tricks on your bike overview here s what we re going to explore setup and use existing open source machine learning technologies specifically caffe http caffe berkeleyvision org and digits https developer nvidia com digits create a dataset of images train a neural network from scratch test our neural network on images it has never seen before improve our neural network s accuracy by fine tuning existing neural networks alexnet and googlenet deploy and use our neural network this guide won t teach you how neural networks are designed cover much theory or use a single mathematical expression i don t pretend to understand most of what i m going to show you instead we re going to use existing things in interesting ways to solve a hard problem q i know you said we won t talk about the theory of neural networks but i m feeling like i d at least like an overview before we get going where should i start there are literally hundreds of introductions to this from short posts to full online courses depending on how you like to learn here are three options for a good starting point this fantastic blog post https jalammar github io visual interactive guide basics neural networks by j alammar which introduces the concepts of neural networks using intuitive examples similarly this video https www youtube com watch v fmpdiaimiea introduction by brandon rohrer https www youtube com channel ucsbktrp45ltfha p49i2aeq is a really good intro to convolutional neural networks like we ll be using if you d rather have a bit more theory i d recommend this online book http neuralnetworksanddeeplearning com chap1 html by michael nielsen http 
michaelnielsen org setup installing the software we ll use caffe and digits can be frustrating depending on your platform and os version by far the easiest way to do it is using docker below we examine how to do it with docker as well as how to do it natively option 1a installing caffe natively first we re going to be using the caffe deep learning framework http caffe berkeleyvision org from the berkely vision and learning center bsd licensed q wait a minute why caffe why not use something like tensorflow which everyone is talking about these days there are a lot of great choices available and you should look at all the options tensorflow https www tensorflow org is great and you should play with it however i m using caffe for a number of reasons it s tailormade for computer vision problems it has support for c python with node js support https github com silklabs node caffe coming it s fast and stable but the number one reason i m using caffe is that you don t need to write any code to work with it you can do everything declaratively caffe uses structured text files to define the network architecture and using command line tools also you can use some nice front ends for caffe to make training and validating your network a lot easier we ll be using nvidia s digits https developer nvidia com digits tool below for just this purpose caffe can be a bit of work to get installed there are installation instructions http caffe berkeleyvision org installation html for various platforms including some prebuilt docker or aws configurations note when making my walkthrough i used the following non released version of caffe from their github repo https github com bvlc caffe commit 5a201dd960840c319cefd9fa9e2a40d2c76ddd73 on a mac it can be frustrating to get working with version issues halting your progress at various steps in the build it took me a couple of days of trial and error there are a dozen guides i followed each with slightly different problems in the end i found this one https gist github com doctorpangloss f8463bddce2a91b949639522ea1dcbe4 to be the closest i d also recommend this post https eddiesmo wordpress com 2016 12 20 how to set up caffe environment and pycaffe on os x 10 12 sierra which is quite recent and links to many of the same discussions i saw getting caffe installed is by far the hardest thing we ll do which is pretty neat since you d assume the ai aspects would be harder don t give up if you have issues it s worth the pain if i was doing this again i d probably use an ubuntu vm instead of trying to do it on mac directly there s also a caffe users https groups google com forum forum caffe users group if you need answers q do i need powerful hardware to train a neural network what if i don t have access to fancy gpus it s true deep neural networks require a lot of computing power and energy to train if you re training them from scratch and using massive datasets we aren t going to do that the secret is to use a pretrained network that someone else has already invested hundreds of hours of compute time training and then to fine tune it to your particular dataset we ll look at how to do this below but suffice it to say that everything i m going to show you i m doing on a year old macbook pro without a fancy gpu as an aside because i have an integrated intel graphics card vs an nvidia gpu i decided to use the opencl caffe branch https github com bvlc caffe tree opencl and it s worked great on my laptop when you re done installing caffe you should have or be able to do all of the 
following a directory that contains your built caffe if you did this in the standard way there will be a build dir which contains everything you need to run caffe the python bindings etc the parent dir that contains build will be your caffe root we ll need this later running make test make runtest should pass after installing all the python deps doing pip install r requirements txt in python running make pycaffe make pytest should pass you should also run make distribute in order to create a distributable version of caffe with all necessary headers binaries etc in distribute on my machine with caffe fully built i ve got the following basic layout in my caffe root dir caffe build python lib tools caffe this is our main binary distribute python lib include bin proto at this point we have everything we need to train test and program with neural networks in the next section we ll add a user friendly web based front end to caffe called digits which will make training and testing our networks much easier option 1b installing digits natively nvidia s deep learning gpu training system or digits https github com nvidia digits is bsd licensed python web app for training neural networks while it s possible to do everything digits does in caffe at the command line or with code using digits makes it a lot easier to get started i also found it more fun due to the great visualizations real time charts and other graphical features since you re experimenting and trying to learn i highly recommend beginning with digits there are quite a few good docs at https github com nvidia digits tree master docs including a few installation https github com nvidia digits blob master docs builddigits md configuration https github com nvidia digits blob master docs configuration md and getting started https github com nvidia digits blob master docs gettingstarted md pages i d recommend reading through everything there before you continue as i m not an expert on everything you can do with digits there s also a public digits user group https groups google com forum forum digits users if you have questions you need to ask there are various ways to install and run digits from docker to pre baked packages on linux or you can build it from source i m on a mac so i built it from source note in my walkthrough i ve used the following non released version of digits from their github repo https github com nvidia digits commit 81be5131821ade454eb47352477015d7c09753d9 because it s just a bunch of python scripts it was fairly painless to get working the one thing you need to do is tell digits where your caffe root is by setting an environment variable before starting the server bash export caffe root path to caffe digits devserver note on mac i had issues with the server scripts assuming my python binary was called python2 where i only have python2 7 you can symlink it in usr bin or modify the digits startup script s to use the proper binary on your system once the server is started you can do everything else via your web browser at http localhost 5000 which is what i ll do below option 2 caffe and digits using docker install docker https www docker com if not already installed then run the following command in order to pull and run a full caffe digits container a few things to note make sure port 8080 isn t allocated by another program if so change it to any other port you want change path to this repository to the location of this cloned repo and data repo within the container will be bound to this directory this is useful for 
accessing the images discussed below bash docker run name digits d p 8080 5000 v path to this repository data repo kaixhin digits now that we have our container running you can open up your web browser and open http localhost 8080 everything in the repository is now in the container directory data repo that s it you ve now got caffe and digits working if you need shell access use the following command bash docker exec it digits bin bash training a neural network training a neural network involves a few steps 1 assemble and prepare a dataset of categorized images 2 define the network s architecture 3 train and validate this network using the prepared dataset we re going to do this 3 different ways in order to show the difference between starting from scratch and using a pretrained network and also to show how to work with two popular pretrained networks alexnet googlenet that are commonly used with caffe and digits for our training attempts we ll use a small dataset of dolphins and seahorses i ve put the images i used in data dolphins and seahorses data dolphins and seahorses you need at least 2 categories but could have many more some of the networks we ll use were trained on 1000 image categories our goal is to be able to give an image to our network and have it tell us whether it s a dolphin or a seahorse prepare the dataset the easiest way to begin is to divide your images into a categorized directory layout dolphins and seahorses dolphin image 0001 jpg image 0002 jpg image 0003 jpg seahorse image 0001 jpg image 0002 jpg image 0003 jpg here each directory is a category we want to classify and each image within that category dir an example we ll use for training and validation q do my images have to be the same size what about the filenames do they matter no to both the images sizes will be normalized before we feed them into the network we ll eventually want colour images of 256 x 256 pixels but digits will crop or squash we ll squash our images automatically in a moment the filenames are irrelevant it s only important which category they are contained within q can i do more complex segmentation of my categories yes see https github com nvidia digits blob digits 4 0 docs imagefolderformat md we want to use these images on disk to create a new dataset and specifically a classification dataset create new dataset images create new dataset png raw true create new dataset we ll use the defaults digits gives us and point training images at the path to our data dolphins and seahorses data dolphins and seahorses folder digits will use the categories dolphin and seahorse to create a database of squashed 256 x 256 training 75 and testing 25 images give your dataset a name dolphins and seahorses and click create new image classification dataset images new image classification dataset png raw true new image classification dataset this will create our dataset which took only 4s on my laptop in the end i have 92 training images 49 dolphin 43 seahorse in 2 categories with 30 validation images 16 dolphin 14 seahorse it s a really small dataset but perfect for our experimentation and learning purposes because it won t take forever to train and validate a network that uses it you can explore the db if you want to see the images after they have been squashed explore the db images explore dataset png raw true explore the db training attempt 1 from scratch back in the digits home screen we need to create a new classification model create classification model images create classification model png raw true 
create classification model we ll start by training a model that uses our dolphins and seahorses dataset and the default settings digits provides for our first network we ll choose to use one of the standard network architectures alexnet pdf http papers nips cc paper 4824 imagenet classification with deep convolutional neural networks pdf alexnet s design http vision stanford edu teaching cs231b spring1415 slides alexnet tugce kyunghee pdf won a major computer vision competition called imagenet in 2012 the competition required categorizing 1000 image categories across 1 2 million images new classification model 1 images new image classification model attempt1 png raw true model 1 caffe uses structured text files to define network architectures these text files are based on google s protocol buffers https developers google com protocol buffers you can read the full schema https github com bvlc caffe blob master src caffe proto caffe proto caffe uses for the most part we re not going to work with these but it s good to be aware of their existence since we ll have to modify them in later steps the alexnet prototxt file looks like this for example https github com bvlc caffe blob master models bvlc alexnet train val prototxt we ll train our network for 30 epochs which means that it will learn with our training images then test itself using our validation images and adjust the network s weights depending on how well it s doing and repeat this process 30 times each time it completes a cycle we ll get info about accuracy 0 to 100 where higher is better and what our loss is the sum of all the mistakes that were made where lower is better ideally we want a network that is able to predict with high accuracy and with few errors small loss note some people have reported hitting errors in digits https github com humphd have fun with machine learning issues 17 doing this training run for many the problem related to available memory the process needs a lot of memory to work if you re using docker you might want to try increasing the amount of memory available to digits in docker preferences advanced memory initially our network s accuracy is a bit below 50 this makes sense because at first it s just guessing between two categories using randomly assigned weights over time it s able to achieve 87 5 accuracy with a loss of 0 37 the entire 30 epoch run took me just under 6 minutes model attempt 1 images model attempt1 png raw true model attempt 1 we can test our model using an image we upload or a url to an image on the web let s test it on a few examples that weren t in our training validation dataset model 1 classify 1 images model attempt1 classify1 png raw true model 1 classify 1 model 1 classify 2 images model attempt1 classify2 png raw true model 1 classify 2 it almost seems perfect until we try another model 1 classify 3 images model attempt1 classify3 png raw true model 1 classify 3 here it falls down completely and confuses a seahorse for a dolphin and worse does so with a high degree of confidence the reality is that our dataset is too small to be useful for training a really good neural network we really need 10s or 100s of thousands of images and with that a lot of computing power to process everything training attempt 2 fine tuning alexnet how fine tuning works designing a neural network from scratch collecting data sufficient to train it e g millions of images and accessing gpus for weeks to complete the training is beyond the reach of most of us to make it practical for smaller amounts of 
data to be used we employ a technique called transfer learning or fine tuning fine tuning takes advantage of the layout of deep neural networks and uses pretrained networks to do the hard work of initial object detection imagine using a neural network to be like looking at something far away with a pair of binoculars you first put the binoculars to your eyes and everything is blurry as you adjust the focus you start to see colours lines shapes and eventually you are able to pick out the shape of a bird then with some more adjustment you can identify the species of bird in a multi layered network the initial layers extract features e g edges with later layers using these features to detect shapes e g a wheel an eye which are then feed into final classification layers that detect items based on accumulated characteristics from previous layers e g a cat vs a dog a network has to be able to go from pixels to circles to eyes to two eyes placed in a particular orientation and so on up to being able to finally conclude that an image depicts a cat what we d like to do is to specialize an existing pretrained network for classifying a new set of image classes instead of the ones on which it was initially trained because the network already knows how to see features in images we d like to retrain it to see our particular image types we don t need to start from scratch with the majority of the layers we want to transfer the learning already done in these layers to our new classification task unlike our previous attempt which used random weights we ll use the existing weights of the final network in our training however we ll throw away the final classification layer s and retrain the network with our image dataset fine tuning it to our image classes for this to work we need a pretrained network that is similar enough to our own data that the learned weights will be useful luckily the networks we ll use below were trained on millions of natural images from imagenet http image net org which is useful across a broad range of classification tasks this technique has been used to do interesting things like screening for eye diseases from medical imagery identifying plankton species from microscopic images collected at sea to categorizing the artistic style of flickr images doing this perfectly like all of machine learning requires you to understand the data and network architecture you have to be careful with overfitting of the data might need to fix some of the layers might need to insert new layers etc however my experience is that it just works much of the time and it s worth you simply doing an experiment to see what you can achieve using our naive approach uploading pretrained networks in our first attempt we used alexnet s architecture but started with random weights in the network s layers what we d like to do is download and use a version of alexnet that has already been trained on a massive dataset thankfully we can do exactly this a snapshot of alexnet is available for download https github com bvlc caffe tree master models bvlc alexnet we need the binary caffemodel file which is what contains the trained weights and it s available for download at http dl caffe berkeleyvision org bvlc alexnet caffemodel while you re downloading pretrained models let s get one more at the same time in 2014 google won the same imagenet competition with googlenet https research google com pubs pub43022 html codenamed inception a 22 layer neural network a snapshot of googlenet is available for download as well see 
https github com bvlc caffe tree master models bvlc googlenet again we ll need the caffemodel file with all the pretrained weights which is available for download at http dl caffe berkeleyvision org bvlc googlenet caffemodel with these caffemodel files in hand we can upload them into digits go to the pretrained models tab in digits home page and choose upload pretrained model load pretrained model images load pretrained model png raw true load pretrained model for both of these pretrained models we can use the defaults digits provides i e colour squashed images of 256 x 256 we just need to provide the weights caffemodel and model definition original prototxt click each of those buttons to select a file for the model definitions we can use https github com bvlc caffe blob master models bvlc googlenet train val prototxt for googlenet and https github com bvlc caffe blob master models bvlc alexnet train val prototxt for alexnet we aren t going to use the classification labels of these networks so we ll skip adding a labels txt file upload pretrained model images upload pretrained model png raw true upload pretrained model repeat this process for both alexnet and googlenet as we ll use them both in the coming steps q are there other networks that would be good as a basis for fine tuning the caffe model zoo http caffe berkeleyvision org model zoo html has quite a few other pretrained networks that could be used see https github com bvlc caffe wiki model zoo fine tuning alexnet for dolphins and seahorses training a network using a pretrained caffe model is similar to starting from scratch though we have to make a few adjustments first we ll adjust the base learning rate to 0 001 from 0 01 since we don t need to make such large jumps i e we re fine tuning we ll also use a pretrained network and customize it new image classification images new image classification model attempt2 png raw true new image classification in the pretrained model s definition i e prototext we need to rename all references to the final fully connected layer where the end result classifications happen we do this because we want the model to re learn new categories from our dataset vs its original training data i e we want to throw away the current final layer we have to rename the last fully connected layer from fc8 to something else fc9 for example finally we also need to adjust the number of categories from 1000 to 2 by changing num output to 2 here are the changes we need to make diff 332 8 332 8 layer name fc8 name fc9 type innerproduct bottom fc7 top fc8 top fc9 param lr mult 1 345 5 345 5 inner product param num output 1000 num output 2 weight filler type gaussian 359 5 359 5 name accuracy type accuracy bottom fc8 bottom fc9 bottom label top accuracy 367 5 367 5 name loss type softmaxwithloss bottom fc8 bottom fc9 bottom label top loss 375 5 375 5 name softmax type softmax bottom fc8 bottom fc9 top softmax include stage deploy i ve included the fully modified file i m using in src alexnet customized prototxt src alexnet customized prototxt this time our accuracy starts at 60 and climbs right away to 87 5 then to 96 and all the way up to 100 with the loss steadily decreasing after 5 minutes we end up with an accuracy of 100 and a loss of 0 0009 model attempt 2 images model attempt2 png raw true model attempt 2 testing the same seahorse image our previous network got wrong we see a complete reversal 100 seahorse model 2 classify 1 images model attempt2 classify1 png raw true model 2 classify 1 even a children s drawing 
of a seahorse works model 2 classify 2 images model attempt2 classify2 png raw true model 2 classify 2 the same goes for a dolphin model 2 classify 3 images model attempt2 classify3 png raw true model 2 classify 3 even with images that you think might be hard like this one that has multiple dolphins close together and with their bodies mostly underwater it does the right thing model 2 classify 4 images model attempt2 classify4 png raw true model 2 classify 4 training attempt 3 fine tuning googlenet like the previous alexnet model we used for fine tuning we can use googlenet as well modifying the network is a bit trickier since you have to redefine three fully connected layers instead of just one to fine tune googlenet for our use case we need to once again create a new classification model new classification model images new image classification model attempt3 png raw true new classification model we rename all references to the three fully connected classification layers loss1 classifier loss2 classifier and loss3 classifier and redefine the number of categories num output 2 here are the changes we need to make in order to rename the 3 classifier layers as well as to change from 1000 to 2 categories diff 917 10 917 10 exclude stage deploy layer name loss1 classifier name loss1a classifier type innerproduct bottom loss1 fc top loss1 classifier top loss1a classifier param lr mult 1 decay mult 1 930 7 930 7 decay mult 0 inner product param num output 1000 num output 2 weight filler type xavier std 0 0009765625 945 7 945 7 layer name loss1 loss type softmaxwithloss bottom loss1 classifier bottom loss1a classifier bottom label top loss1 loss loss weight 0 3 954 7 954 7 layer name loss1 top 1 type accuracy bottom loss1 classifier bottom loss1a classifier bottom label top loss1 accuracy include stage val 962 7 962 7 layer name loss1 top 5 type accuracy bottom loss1 classifier bottom loss1a classifier bottom label top loss1 accuracy top5 include stage val 1705 10 1705 10 exclude stage deploy layer name loss2 classifier name loss2a classifier type innerproduct bottom loss2 fc top loss2 classifier top loss2a classifier param lr mult 1 decay mult 1 1718 7 1718 7 decay mult 0 inner product param num output 1000 num output 2 weight filler type xavier std 0 0009765625 1733 7 1733 7 layer name loss2 loss type softmaxwithloss bottom loss2 classifier bottom loss2a classifier bottom label top loss2 loss loss weight 0 3 1742 7 1742 7 layer name loss2 top 1 type accuracy bottom loss2 classifier bottom loss2a classifier bottom label top loss2 accuracy include stage val 1750 7 1750 7 layer name loss2 top 5 type accuracy bottom loss2 classifier bottom loss2a classifier bottom label top loss2 accuracy top5 include stage val 2435 10 2435 10 layer name loss3 classifier name loss3a classifier type innerproduct bottom pool5 7x7 s1 top loss3 classifier top loss3a classifier param lr mult 1 decay mult 1 2448 7 2448 7 decay mult 0 inner product param num output 1000 num output 2 weight filler type xavier 2461 7 2461 7 layer name loss3 loss type softmaxwithloss bottom loss3 classifier bottom loss3a classifier bottom label top loss loss weight 1 2470 7 2470 7 layer name loss3 top 1 type accuracy bottom loss3 classifier bottom loss3a classifier bottom label top accuracy include stage val 2478 7 2478 7 layer name loss3 top 5 type accuracy bottom loss3 classifier bottom loss3a classifier bottom label top accuracy top5 include stage val 2489 7 2489 7 layer name softmax type softmax bottom loss3 classifier bottom loss3a 
classifier top softmax include stage deploy i ve put the complete file in src googlenet customized prototxt src googlenet customized prototxt q what about changes to the prototext definitions of these networks we changed the fully connected layer name s and the number of categories what else could or should be changed and in what circumstances great question and it s something i m wondering too for example i know that we can fix certain layers https github com bvlc caffe wiki fine tuning or training certain layers exclusively so the weights don t change doing other things involves understanding how the layers work which is beyond this guide and also beyond its author at present like we did with fine tuning alexnet we also reduce the learning rate by 10 from 0 01 to 0 001 q what other changes would make sense when fine tuning these networks what about different numbers of epochs batch sizes solver types adam adadelta adagrad etc learning rates policies exponential decay inverse decay sigmoid decay etc step sizes and gamma values great question and one that i wonder about as well i only have a vague understanding of these and it s likely that there are improvements we can make if you know how to alter these values when training this is something that needs better documentation because googlenet has a more complicated architecture than alexnet fine tuning it requires more time on my laptop it takes 10 minutes to retrain googlenet with our dataset achieving 100 accuracy and a loss of 0 0070 model attempt 3 images model attempt3 png raw true model attempt 3 just as we saw with the fine tuned version of alexnet our modified googlenet performs amazing well the best so far model attempt 3 classify 1 images model attempt3 classify1 png raw true model attempt 3 classify 1 model attempt 3 classify 2 images model attempt3 classify2 png raw true model attempt 3 classify 2 model attempt 3 classify 3 images model attempt3 classify3 png raw true model attempt 3 classify 3 using our model with our network trained and tested it s time to download and use it each of the models we trained in digits has a download model button as well as a way to select different snapshots within our training run e g epoch 30 trained models images trained models png raw true trained models clicking download model downloads a tar gz archive containing the following files deploy prototxt mean binaryproto solver prototxt info json original prototxt labels txt snapshot iter 90 caffemodel train val prototxt there s a nice description https github com bvlc caffe wiki using a trained network deploy in the caffe documentation about how to use the model we just built it says a network is defined by its design prototxt and its weights caffemodel as a network is being trained the current state of that network s weights are stored in a caffemodel with both of these we can move from the train test phase into the production phase in its current state the design of the network is not designed for deployment before we can release our network as a product we often need to alter it in a few ways 1 remove the data layer that was used for training as for in the case of classification we are no longer providing labels for our data 2 remove any layer that is dependent upon data labels 3 set the network up to accept data 4 have the network output the result digits has already done the work for us separating out the different versions of our prototxt files the files we ll care about when using this network are deploy prototxt the definition of our 
network ready for accepting image input data mean binaryproto our model will need us to subtract the image mean from each image that it processes and this is the mean image labels txt a list of our labels dolphin seahorse in case we want to print them vs just the category number snapshot iter 90 caffemodel these are the trained weights for our network we can use these files in a number of ways to classify new images for example in our caffe root we can use build examples cpp classification classification bin to classify one image bash cd caffe root build examples cpp classification classification bin deploy prototxt snapshot iter 90 caffemodel mean binaryproto labels txt dolphin1 jpg this will spit out a bunch of debug text followed by the predictions for each of our two categories 0 9997 dolphin 0 0003 seahorse you can read the complete c source https github com bvlc caffe tree master examples cpp classification for this in the caffe examples https github com bvlc caffe tree master examples for a classification version that uses the python interface digits includes a nice example https github com nvidia digits tree master examples classification there s also a fairly well documented python walkthrough https github com bvlc caffe blob master examples 00 classification ipynb in the caffe examples python example let s write a program that uses our fine tuned googlenet model to classify the untrained images we have in data untrained samples data untrained samples i ve cobbled this together based on the examples above as well as the caffe python module s source https github com bvlc caffe tree master python which you should prefer to anything i m about to say a full version of what i m going to discuss is available in src classify samples py src classify samples py let s begin first we ll need the numpy http www numpy org module in a moment we ll be using numpy http www numpy org to work with ndarray s https docs scipy org doc numpy reference generated numpy ndarray html which caffe uses a lot if you haven t used them before as i had not you d do well to begin by reading this quickstart tutorial https docs scipy org doc numpy dev user quickstart html second we ll need to load the caffe module from our caffe root dir if it s not already included in your python environment you can force it to load by adding it manually along with it we ll also import caffe s protobuf module python import numpy as np caffe root path to your caffe root sys path insert 0 os path join caffe root python import caffe from caffe proto import caffe pb2 next we need to tell caffe whether to use the cpu or gpu https github com bvlc caffe blob 61944afd4e948a4e2b4ef553919a886a8a8b8246 python caffe caffe cpp l50 l52 for our experiments the cpu is fine python caffe set mode cpu now we can use caffe to load our trained network to do so we ll need some of the files we downloaded from digits namely deploy prototxt our network file the description of the network snapshot iter 90 caffemodel our trained weights we obviously need to provide the full path and i ll assume that my files are in a dir called model python model dir model deploy file os path join model dir deploy prototxt weights file os path join model dir snapshot iter 90 caffemodel net caffe net deploy file caffe test weights weights file the caffe net constructor https github com bvlc caffe blob 61944afd4e948a4e2b4ef553919a886a8a8b8246 python caffe caffe cpp l91 l117 takes a network file a phase caffe test or caffe train as well as an optional weights filename when we 
provide a weights file the net will automatically load them for us the net has a number of methods and attributes https github com bvlc caffe blob master python caffe pycaffe py you can use note there is also a deprecated version of this constructor https github com bvlc caffe blob 61944afd4e948a4e2b4ef553919a886a8a8b8246 python caffe caffe cpp l119 l134 which seems to get used often in sample code on the web it looks like this in case you encounter it python net caffe net str deploy file str model file caffe test we re interested in loading images of various sizes into our network for testing as a result we ll need to transform them into a shape that our network can use i e colour 256x256 caffe provides the transformer class https github com bvlc caffe blob 61944afd4e948a4e2b4ef553919a886a8a8b8246 python caffe io py l98 for this purpose we ll use it to create a transformation appropriate for our images network python transformer caffe io transformer data net blobs data data shape set transpose https github com bvlc caffe blob 61944afd4e948a4e2b4ef553919a886a8a8b8246 python caffe io py l187 transformer set transpose data 2 0 1 set raw scale https github com bvlc caffe blob 61944afd4e948a4e2b4ef553919a886a8a8b8246 python caffe io py l221 transformer set raw scale data 255 set channel swap https github com bvlc caffe blob 61944afd4e948a4e2b4ef553919a886a8a8b8246 python caffe io py l203 transformer set channel swap data 2 1 0 we can also use the mean binaryproto file digits gave us to set our transformer s mean python this code for setting the mean from https github com nvidia digits tree master examples classification mean file os path join model dir mean binaryproto with open mean file rb as infile blob caffe pb2 blobproto blob mergefromstring infile read if blob hasfield shape blob dims blob shape assert len blob dims 4 shape should have 4 dimensions shape is s blob shape elif blob hasfield num and blob hasfield channels and blob hasfield height and blob hasfield width blob dims blob num blob channels blob height blob width else raise valueerror blob does not provide shape or 4d dimensions pixel np reshape blob data blob dims 1 mean 1 mean 1 transformer set mean data pixel if we had a lot of labels we might also choose to read in our labels file which we can use later by looking up the label for a probability using its position e g 0 dolphin 1 seahorse python labels file os path join model dir labels txt labels np loadtxt labels file str delimiter n now we re ready to classify an image we ll use caffe io load image https github com bvlc caffe blob 61944afd4e948a4e2b4ef553919a886a8a8b8246 python caffe io py l279 to read our image file then use our transformer to reshape it and set it as our network s data layer python load the image from disk using caffe s built in i o module image caffe io load image fullpath preprocess the image into the proper format for feeding into the model net blobs data data transformer preprocess data image q how could i use images i e frames from a camera or video stream instead of files great question here s a skeleton to get you started python import cv2 get the shape of our input data layer so we can resize the image input shape net blobs data data shape webcamcap cv2 videocapture 0 could also be a url filename if webcamcap isopened rval frame webcamcap read else rval false while rval rval frame webcamcap read net blobs data data transformer preprocess data frame webcamcap release back to our problem we next need to run the image data through our network and 
read out the probabilities from our network s final softmax layer which will be in order by label category python run the image s pixel data through the network out net forward extract the probabilities of our two categories from the final layer softmax layer out softmax here we re converting to python types from ndarray floats dolphin prob softmax layer item 0 seahorse prob softmax layer item 1 print the results i m using labels just to show how it s done label labels 0 if dolphin prob seahorse prob else labels 1 filename os path basename fullpath print s is a s dolphin 3f seahorse 3f filename label dolphin prob 100 seahorse prob 100 running the full version of this see src classify samples py src classify samples py using our fine tuned googlenet network on our data untrained samples data untrained samples images gives me the following output truncated caffe network output dolphin1 jpg is a dolphin dolphin 99 968 seahorse 0 032 dolphin2 jpg is a dolphin dolphin 99 997 seahorse 0 003 dolphin3 jpg is a dolphin dolphin 99 943 seahorse 0 057 seahorse1 jpg is a seahorse dolphin 0 365 seahorse 99 635 seahorse2 jpg is a seahorse dolphin 0 000 seahorse 100 000 seahorse3 jpg is a seahorse dolphin 0 014 seahorse 99 986 i m still trying to learn all the best practices for working with models in code i wish i had more and better documented code examples apis premade modules etc to show you here to be honest most of the code examples i ve found are terse and poorly documented caffe s documentation is spotty and assumes a lot it seems to me like there s an opportunity for someone to build higher level tools on top of the caffe interfaces for beginners and basic workflows like we ve done here it would be great if there were more simple modules in high level languages that i could point you at that did the right thing with our model someone could should take this on and make using caffe models as easy as digits makes training them i d love to have something i could use in node js for example ideally one shouldn t be required to know so much about the internals of the model or caffe i haven t used it yet but deepdetect https deepdetect com looks interesting on this front and there are likely many other tools i don t know about results at the beginning we said that our goal was to write a program that used a neural network to correctly classify all of the images in data untrained samples data untrained samples these are images of dolphins and seahorses that were never used in the training or validation data untrained dolphin images dolphin 1 data untrained samples dolphin1 jpg raw true dolphin 1 dolphin 2 data untrained samples dolphin2 jpg raw true dolphin 2 dolphin 3 data untrained samples dolphin3 jpg raw true dolphin 3 untrained seahorse images seahorse 1 data untrained samples seahorse1 jpg raw true seahorse 1 seahorse 2 data untrained samples seahorse2 jpg raw true seahorse 2 seahorse 3 data untrained samples seahorse3 jpg raw true seahorse 3 let s look at how each of our three attempts did with this challenge model attempt 1 alexnet from scratch 3rd place image dolphin seahorse result dolphin1 jpg data untrained samples dolphin1 jpg 71 11 28 89 expressionless dolphin2 jpg data untrained samples dolphin2 jpg 99 2 0 8 sunglasses dolphin3 jpg data untrained samples dolphin3 jpg 63 3 36 7 confused seahorse1 jpg data untrained samples seahorse1 jpg 95 04 4 96 disappointed seahorse2 jpg data untrained samples seahorse2 jpg 56 64 43 36 confused seahorse3 jpg data untrained samples seahorse3 jpg 7 06 92 94 
grin model attempt 2 fine tuned alexnet 2nd place image dolphin seahorse result dolphin1 jpg data untrained samples dolphin1 jpg 99 1 0 09 sunglasses dolphin2 jpg data untrained samples dolphin2 jpg 99 5 0 05 sunglasses dolphin3 jpg data untrained samples dolphin3 jpg 91 48 8 52 grin seahorse1 jpg data untrained samples seahorse1 jpg 0 100 sunglasses seahorse2 jpg data untrained samples seahorse2 jpg 0 100 sunglasses seahorse3 jpg data untrained samples seahorse3 jpg 0 100 sunglasses model attempt 3 fine tuned googlenet 1st place image dolphin seahorse result dolphin1 jpg data untrained samples dolphin1 jpg 99 86 0 14 sunglasses dolphin2 jpg data untrained samples dolphin2 jpg 100 0 sunglasses dolphin3 jpg data untrained samples dolphin3 jpg 100 0 sunglasses seahorse1 jpg data untrained samples seahorse1 jpg 0 5 99 5 sunglasses seahorse2 jpg data untrained samples seahorse2 jpg 0 100 sunglasses seahorse3 jpg data untrained samples seahorse3 jpg 0 02 99 98 sunglasses conclusion it s amazing how well our model works and what s possible by fine tuning a pretrained network obviously our dolphin vs seahorse example is contrived and the dataset overly limited we really do want more and better data if we want our network to be robust but since our goal was to examine the tools and workflows of neural networks it s turned out to be an ideal case especially since it didn t require expensive equipment or massive amounts of time above all i hope that this experience helps to remove the overwhelming fear of getting started deciding whether or not it s worth investing time in learning the theories of machine learning and neural networks is easier when you ve been able to see it work in a small way now that you ve got a setup and a working approach you can try doing other sorts of classifications you might also look at the other types of things you can do with caffe and digits for example finding objects within an image or doing segmentation have fun with machine learning | neural-network image-classification machine-learning caffe tutorial | ai |
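The inline Python in this copy of the walkthrough lost its formatting, so here is the whole classification flow restated in runnable form, condensed from the steps described above. It is a sketch under the tutorial's own assumptions: placeholder paths, the two-category dolphin/seahorse model downloaded from DIGITS, and, for brevity, the mean.binaryproto subtraction step omitted (see src/classify-samples.py for the full version):

```python
# Condensed, runnable restatement of the classification flow above.
# Paths are placeholders; mean subtraction (mean.binaryproto) is omitted
# here for brevity -- see src/classify-samples.py for that part.
import os
import sys

CAFFE_ROOT = "/path/to/caffe"  # placeholder
sys.path.insert(0, os.path.join(CAFFE_ROOT, "python"))
import caffe

caffe.set_mode_cpu()

MODEL_DIR = "model"  # the unpacked DIGITS download
net = caffe.Net(
    os.path.join(MODEL_DIR, "deploy.prototxt"),
    caffe.TEST,
    weights=os.path.join(MODEL_DIR, "snapshot_iter_90.caffemodel"),
)

# Reshape incoming images to the network's input: channels first,
# 0-255 raw scale, BGR channel order -- exactly as in the walkthrough.
transformer = caffe.io.Transformer({"data": net.blobs["data"].data.shape})
transformer.set_transpose("data", (2, 0, 1))
transformer.set_raw_scale("data", 255)
transformer.set_channel_swap("data", (2, 1, 0))

image = caffe.io.load_image("data/untrained-samples/dolphin1.jpg")
net.blobs["data"].data[...] = transformer.preprocess("data", image)
out = net.forward()

dolphin_prob, seahorse_prob = out["softmax"].flatten()[:2]
print("dolphin %.3f%% / seahorse %.3f%%"
      % (dolphin_prob * 100, seahorse_prob * 100))
```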
dmn-tensorflow | dynamic memory network hb research https img shields io badge hb research experiment green svg style flat colora 448c57 colorb 555555 https github com hb research tensorflow implementation of ask me anything dynamic memory networks for natural language processing https arxiv org pdf 1506 07285 pdf images images ask me anything figure 3 png requirements python 3 6 tensorflow 1 8 hb config https github com hb research hb config singleton config nltk tokenizer and blue score tqdm progress bar project structure init project by hb base https github com hb research hb base config config files yml json using with hb config data dataset path notebooks prototyping with numpy or tf interactivesession dynamic memory dmn architecture graphs from input to output init py graph logic encoder py encoder episode py episode and attentiongate data loader py raw date precossed data generate batch using dataset hook py training or test hook feature eg print variables main py define experiment fn model py define estimatorspec reference hb config https github com hb research hb config dataset https www tensorflow org api docs python tf data dataset from generator experiments fn https www tensorflow org api docs python tf contrib learn experiment estimatorspec https www tensorflow org api docs python tf estimator estimatorspec todo implements dmn dynamic memory networks for visual and textual question answering https arxiv org pdf 1603 01417 pdf 2016 by c xiong config example babi task1 yml yml data base path data task path en 10k task id 1 pad id 0 model batch size 16 use pretrained true true or false embed dim 50 if use pretrained only available 50 100 200 300 encoder type uni uni bi cell type gru lstm gru layer norm lstm nas num layers 1 num units 32 memory hob 3 dropout 0 0 reg scale 0 001 train learning rate 0 0001 optimizer adam adagrad adam ftrl momentum rmsprop sgd train steps 100000 model dir logs babi task1 save checkpoints steps 1000 check hook n iter 1000 min eval frequency 1000 print verbose false debug false usage install requirements pip install r requirements txt then prepare dataset and pre trained glove sh scripts fetch babi data sh sh scripts fetch glove data sh finally start trand and evalueate model python main py config babi task1 mode train and evaluate experiments modes white check mark working white medium small square not tested yet white check mark evaluate evaluate on the evaluation data white medium small square extend train hooks extends the hooks for training white medium small square reset export strategies resets the export strategies with the new export strategies white medium small square run std server starts a tensorflow server and joins the serving thread white medium small square test tests training evaluating and exporting the estimator for a single step white check mark train fit the estimator using the training data white check mark train and evaluate interleaves training and evaluation tensorboar tensorboard logdir logs reference implementing dynamic memory networks https yerevann github io 2016 02 05 implementing dynamic memory networks arxiv ask me anything dynamic memory networks for natural language processing https arxiv org abs 1506 07285 2015 6 by a kumar arxiv dynamic memory networks for visual and textual question answering https arxiv org abs 1603 01417 2016 3 by c xiong author dongjun lee humanbrain djlee gmail com | tensorflow natural-language-processing dynamic-memory-network babi-tasks question-answering nlp hb-experiment | ai |
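The episodic memory at the heart of the DMN is a soft attention-gated GRU: on each pass over the facts, the episode state only moves toward GRU(c_t, h_{t-1}) to the degree the attention gate g_t says fact c_t matters. A NumPy sketch of that update from the Kumar et al. paper -- illustrative, not this repo's exact TensorFlow code:

```python
# Attention-gated episode update from the DMN paper (Kumar et al. 2015):
#   h_t = g_t * GRU(c_t, h_{t-1}) + (1 - g_t) * h_{t-1}
# Illustrative NumPy sketch, not this repository's exact implementation.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(c, h, W_z, U_z, W_r, U_r, W_h, U_h):
    # standard GRU cell; c = current fact vector, h = previous state
    z = sigmoid(c @ W_z + h @ U_z)          # update gate
    r = sigmoid(c @ W_r + h @ U_r)          # reset gate
    h_tilde = np.tanh(c @ W_h + (r * h) @ U_h)
    return z * h_tilde + (1 - z) * h

def episode(facts, gates, h0, gru_params):
    h = h0
    for c, g in zip(facts, gates):
        h = g * gru_step(c, h, *gru_params) + (1 - g) * h
    return h  # episode vector fed into the memory update
```

the gates g come from the attention mechanism, which scores each fact against the question and the current memory; running several such passes (the memory hob: 3 setting in the config above) lets the network chain facts together across hops.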
Vision | vision a eye gaze tracking library based on computer vision and neural network for neuralaction this repository is part of project neuralaction current gaze tracking model s mean error is 3 2 cm in 50 cm far without any calibration with calibration mean error is 1 8 cm demo web gazedemo gif gaze tracking with calibrations todo neuralaction 0 2 2019 10 research more accurate eye gaze tracker more accurate eye blink classification being more robust to a person s appearance difference less calibration tries develop support arm64 woa support winml onnx models optimize for low powered devices low battery slow cpu or gpu windows eye tracking accessibility api integration eye gaze model x channel merged input neural net based calibrator mobilenet v3 training support ir camera from windows hello auto detect and load calibration data arm64 woa support opencvsharp native recompile sharpface native recompile winml features x winml backend x onnx model runner x onnx formatted gaze model x onnx formatted eye blink model gpu support fp16 computation data augmentation x grayscale for ir camera styleaug cyclegan for race appearance transfer eye blink modle apply transfer learning windows eye tracking accessibility api intergration chrome ui accessibility expose neuralaction 0 1 2018 05 gaze tracking calibration codes put more various data into gaze tracking model main features x single camera gaze tracking x gaze tracking service x abstractions around opencv x abstractions around tensorflow x platform abstraction layer files audio video etc opencv features x face tracking tadas openface https github com tadasbaltrusaitis openface x cascade object detection x some examples of opencv x cross platform webcam i o tensorflow features x data sharing between opencv x input image normalization x gpu acceleration supports x model imports | tensorflow opencv computer-vision neural-network eye-tracking face-tracking eye-gaze onnx-models vision calibration gpu-support | ai |
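To put the headline numbers above in more comparable units: gaze accuracy is usually quoted in degrees of visual angle, and at the stated 50 cm distance the centimeter errors convert directly. (My own arithmetic, not figures published by the repo.)

```python
# Convert the reported on-screen errors at 50 cm into degrees of visual
# angle (my own arithmetic, not a number published by the repository).
import math

distance_cm = 50.0
for err_cm in (3.2, 1.8):
    deg = math.degrees(math.atan(err_cm / distance_cm))
    print(f"{err_cm} cm @ {distance_cm:.0f} cm ~= {deg:.2f} deg")
# ~3.66 deg uncalibrated, ~2.06 deg after calibration
```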
PyDatSet | pydatset load various datasets in python description this repo contains pydatset a package for loading and optionally augmenting datasets in python you can check the source of each function in their pydocs currently supported datasets are mnist http yann lecun com exdb mnist kaggle mnist https www kaggle com c digit recognizer cifar 10 https www cs toronto edu kriz cifar html tiny imagenet http cs231n stanford edu project html gtsrb http benchmark ini rub de section gtsrb subsection news sfddd https www kaggle com c state farm distracted driver detection pull requests are welcome requirements numpy www numpy org scipy www scipy org used to load tiny imagenet cv2 opencv org used to load gtsrb scikit image scikit image org used for data augmentation installation get pip https pypi python org pypi pip and run pip install git git github com dnlcrl pydatset git usage download the required dataset e g cifar10 and call the respective load path function for example python from pydatset import cifar10 data cifar10 load path to cifar10 apply data augmentation to a given batch by doing something like from pydatset data augmentation import random contrast random flips random crops random rotate random tint batch random tint batch batch random contrast batch batch random flips batch batch random rotate batch 10 batch random crops batch 32 32 pad 2 check pydatset readme md for more info about the package contents | ai |
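Putting the two usage snippets together, a typical load-then-augment pipeline might look as follows. The signatures are taken as shown in the readme above (e.g. random crops of 32x32 with pad 2); verify them against the package's pydocs before relying on them.

```python
from pydatset import cifar10
from pydatset.data_augmentation import (random_contrast, random_crops,
                                        random_flips, random_rotate,
                                        random_tint)

# placeholder path: point this at wherever you unpacked CIFAR-10
data = cifar10.load('path/to/cifar10')

def augment(batch):
    # chain the augmentations exactly as the usage section suggests
    batch = random_tint(batch)
    batch = random_contrast(batch)
    batch = random_flips(batch)
    batch = random_rotate(batch, 10)
    batch = random_crops(batch, (32, 32), pad=2)
    return batch
```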
chinese-hershey-font | doc screen000 png chinese hershey font convert chinese characters to single line fonts using computer vision check out the live demo https lingdong github io chinese hershey font download the hershey font heiti hf txt dist hershey heiti hf txt 20975 unicode characters mingti hf txt dist hershey mingti hf txt 73874 unicode characters single line fonts such as the hershey fonts https en wikipedia org wiki hershey fonts are massively useful for making cool procedural graphics and for engraving they re also arguably easier to learn for neural nets this tool automatically generates chinese single line fonts given a regular true type font file ttf ttc it can output either the classic hershey format or a json file containing all the polylines the algorithm scans a raster rendering of a character at many different angles to find line segments that are most likely part of a stroke it then estimates the strokes by connecting merging and cleaning up the line segments doc screen002 png dependencies python 2 pil pillow pip install pillow file formats the following are types of files this software can produce to encode single line fonts stroke files a json file with one object the keys of the object are unicode indices of characters each key maps to an array of polylines a polyline is an array of points a point is a 2 element array containing x and y coordinates a coordinate is a float between 0 0 and 1 0 and 0 0 is the upper left corner for example json u 4e00 0 0 0 55 1 0 0 55 u 4e01 0 02 0 02 0 99 0 02 0 51 0 02 0 53 0 925 0 31 1 0 the above encoding contains the first two chinese characters in unicode and hershey fonts the hershey fonts are a collection of vector fonts developed c 1967 by dr allen vincent hershey at the naval weapons laboratory originally designed to be rendered using vectors on early cathode ray tube displays the fonts are publicly available and have few usage restrictions vector fonts are easily scaled and rotated in two or three dimensions consequently the hershey fonts have been widely used in computer graphics computer aided design programs and more recently also in computer aided manufacturing applications like laser engraving wikipedia https en wikipedia org wiki hershey fonts this link http paulbourke net dataformats hershey gives an overview on how to parse hershey fonts you can also find my own implementation at lingdong p5 hershey js https github com lingdong p5 hershey js compared to stroke files hershey fonts are many times more compact but also have less accurate coordinates usage pre compiled fonts if you re only interested in utilizing pre generated single line fonts you can grab them at the following places hershey fonts can be found in dist hershey dist hershey folder heiti hf txt dist hershey heiti hf txt 20975 characters based on source han sans https github com adobe fonts source han sans this includes all unicode basic cjk ideographs kaiti hf txt dist hershey kaiti hf txt 20975 characters based on tw kai https data gov tw dataset 5961 mingti hf txt dist hershey mingti hf txt 73874 characters based on hanazono https zh wikipedia org wiki this includes all unicode basic cjk ideographs as well as cjk ideographs extension a e heiti small hf txt dist hershey heiti small hf txt 20975 characters similar to heiti hf txt but has smaller font size and spacing mingti basic hf txt dist hershey mingti basic hf txt 27631 characters a subset of mingti hf txt containing the basic cjk ideographs and extension a stroke files json containing polyline
coordinates can be found in dist json dist json folder note heiti hf txt and kaiti hf txt were originally based on proprietary macos system fonts stheiti and stkaiti they re now replaced with open source alternatives generating stroke files if you would like to generate new single line fonts from custom ttf ttc files first use the following command to generate a json encoded stroke file python char2stroke py build path to font ttf optional arguments first first height height last last ngradient ngradient output output strw strw width width width and height determine the dimension of the raster images of characters to be scanned the larger these numbers the more detailed and the slower default is 100 for both strw is the approximate stroke width in pixels at given width and height this is used when merging strokes default is 10 first and last specify the range of unicode characters to include default values are 0x4e00 and 0x9fef which cover all the cjk ideograph glyphs ngradient is the number of different gradients used to scan the image at ngradient 1 only strokes at 0 45 and 90 are scanned while at 2 3 and 4 slopes of atan 1 2 atan 1 3 and atan 1 4 are also included default is 2 output path to write the output file when this is not specified the program writes to stdout and info https en wikipedia org wiki redirection computing and info https en wikipedia org wiki pipeline unix can be used to redirect the output quick tests before generating large files it is helpful to do some quick tests on small sets of characters compare how different fonts turn out and tweak parameters the following command facilitates this process with side by side visualization of computer vision results python char2stroke py test path to font1 ttf path to font2 ttf optional arguments corpus corpus height height ngradient ngradient nsample nsample strw strw width width corpus string to test on default is the text of thousand character classic https en wikipedia org wiki thousand character classic nsample number of characters to be randomly picked from the corpus default is 8 the other arguments are identical to those of build mode generate hershey font from stroke file the following command generates a hershey font from a stroke file produced in the previous step python tohershey py path to input json path to output hf txt example doc screen001 png | chinese hershey-fonts computer-vision convert typography font | ai |
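Given the stroke-file format documented above (a JSON object mapping each character key to arrays of polylines, with coordinates in the unit square and the origin at the upper left), a small Python sketch can load one character and scale it for rendering. The exact key format (e.g. 'u4e00') should be checked against an actual file; this is illustrative only.

```python
import json

def load_char(path, key):
    """Fetch the polylines for one character from a stroke file."""
    with open(path, encoding='utf-8') as f:
        strokes = json.load(f)
    return strokes[key]

def to_svg(polylines, size=100):
    """Scale the unit-square polylines up and emit SVG <polyline> tags."""
    tags = []
    for line in polylines:
        pts = ' '.join(f'{x * size:.1f},{y * size:.1f}' for x, y in line)
        tags.append(f'<polyline fill="none" stroke="black" points="{pts}"/>')
    return '<svg viewBox="0 0 {0} {0}">{1}</svg>'.format(size, ''.join(tags))
```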
metabase-clickhouse-driver | p align center style font size 300 img src https www metabase com images logo svg width 200px align center img src static clickhouse svg width 180px align center h1 align center clickhouse driver for metabase h1 p br p align center a href https github com enqueue metabase clickhouse driver actions workflows check yml img src https github com enqueue metabase clickhouse driver actions workflows check yml badge svg branch master alt ci status a a href https github com enqueue metabase clickhouse driver releases img src https img shields io github release enqueue metabase clickhouse driver svg label latest 20release a a href https raw githubusercontent com enqueue metabase clickhouse driver master license img src https img shields io badge license apache 2 0 blue svg a p about clickhouse https clickhouse com github https github com clickhouse clickhouse database driver for the metabase https metabase com github https github com metabase metabase business intelligence front end installation run using metabase jar 1 download a fairly recent metabase binary release jar file from the metabase distribution page https metabase com start jar html 2 download the clickhouse driver jar from this repository s releases https github com enqueue metabase clickhouse driver releases page 3 create a directory and copy the metabase jar to it 4 in that directory create a sub directory called plugins 5 copy the clickhouse driver jar to the plugins directory 6 make sure you are in the directory where your metabase jar lives 7 run mb plugins dir plugins java jar metabase jar for example using metabase v0 47 2 and clickhouse driver 1 2 2 choosing the right version bash export metabase version v0 47 2 export metabase clickhouse driver version 1 2 2 mkdir p mb plugins cd mb curl o metabase jar https downloads metabase com metabase version metabase jar curl l o plugins ch jar https github com clickhouse metabase clickhouse driver releases download metabase clickhouse driver version clickhouse metabase driver jar mb plugins dir plugins java jar metabase jar run as a docker container alternatively if you don t want to run metabase jar you can use a docker image bash export metabase docker version v0 47 2 export metabase clickhouse driver version 1 2 2 mkdir p mb plugins cd mb curl l o plugins ch jar https github com clickhouse metabase clickhouse driver releases download metabase clickhouse driver version clickhouse metabase driver jar docker run d p 3000 3000 mount type bind source pwd plugins ch jar destination plugins clickhouse jar metabase metabase metabase docker version choosing the right version metabase release driver version 0 33 x 0 6 0 34 x 0 7 0 0 35 x 0 7 1 0 37 3 0 7 3 0 38 1 0 7 5 0 41 2 0 8 0 0 41 3 1 0 8 1 0 42 x 0 8 1 0 44 x 0 9 1 0 45 x 1 1 0 0 46 x 1 1 7 0 47 x 1 2 2 creating a metabase docker image with clickhouse driver you can use a convenience script build docker image sh which takes three arguments metabase version clickhouse driver version and the desired final docker image tag bash build docker image sh v0 47 2 1 2 2 my metabase with clickhouse v0 0 1 where v0 47 2 is metabase version 1 2 2 is clickhouse driver version and my metabase with clickhouse v0 0 1 being the tag then you should be able to run it bash docker run d p 3000 3000 name my metabase my metabase with clickhouse v0 0 1 or use it with docker compose for example yaml version 3 8 services clickhouse image clickhouse clickhouse server 23 8 alpine container name metabase clickhouse server ports 8123 8123 9000
9000 ulimits nofile soft 262144 hard 262144 metabase image my metabase with clickhouse v0 0 1 container name metabase with clickhouse ports 3000 3000 using certificates in the advanced options add the following to the additional jdbc connection string options input sslrootcert path to ca crt where path to ca crt is the absolute path to the server ca on the metabase host or docker container depending on your deployment make sure that you tick use a secure connection ssl as well operations the driver should work fine for many use cases please consider the following items when running a metabase instance with this driver create a dedicated user for metabase whose profile has readonly set to 2 consider running the metabase instance in the same time zone as your clickhouse database the more time zones involved the more issues compare the results of the queries with the results returned by clickhouse client metabase is a good tool for organizing questions dashboards etc and to give non technical users a good way to explore the data and share their results the driver cannot support all the cool special features of clickhouse e g array functions you are free to use native queries of course known limitations as the underlying jdbc driver version does not support columns with aggregatefunction type these columns are excluded from the table metadata and data browser result sets to prevent sync or data browsing errors if the past month week quarter year filter over a datetime64 column is not working as intended this is likely due to a type conversion issue https github com clickhouse clickhouse pull 50280 see this report https github com clickhouse metabase clickhouse driver issues 164 for more details this issue was resolved as of clickhouse 23 5 contributing check out our contributing guide contributing md | analytics metabase clickhouse | front_end |
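The version-pairing rule from the table above matters when scripting installs. As a small illustrative Python helper (not part of the project), the two download URLs from the quick-start snippet can be assembled from the paired version numbers:

```python
def download_urls(metabase_version='v0.47.2', driver_version='1.2.2'):
    """Build the two download URLs used in the quick-start snippet.
    The versions must be paired per the compatibility table above."""
    return (
        f'https://downloads.metabase.com/{metabase_version}/metabase.jar',
        'https://github.com/clickhouse/metabase-clickhouse-driver/releases/'
        f'download/{driver_version}/clickhouse.metabase-driver.jar',
    )

# e.g. Metabase 0.47.x requires driver 1.2.2
jar_url, driver_url = download_urls('v0.47.2', '1.2.2')
```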
Azure-Sphere-Azure-RTOS-ThreadX-Real-time-Ultrasonic-Rover | secure iot with azure rtos threadx azure sphere and azure iot better together learn how to build a secure iot solution with azure rtos threadx and azure sphere resources architecture png what you will learn you will learn how to integrate a real time azure rtos threadx application responsible for running a timing sensitive ultrasonic distance sensor with the security and cloud connectivity of azure sphere julyot this is part of the julyot iot tech community http aka ms julyot series a collection of blog posts hands on labs and videos designed to demonstrate and teach developers how to build projects with azure internet of things iot services please also follow julyot https twitter com hashtag julyot on twitter source code and learning resources source code azure sphere seeing eyed rover real time azure rtos threadx sensors and azure iot https github com gloveboxes azure sphere azure rtos threadx real time ultrasonic rover learning resources azure sphere developer learning path https github com gloveboxes azure sphere learning path learn more about azure sphere azure sphere https azure microsoft com services azure sphere wt mc id iot 0000 dglover is a comprehensive iot security solution including hardware os and cloud components to actively protect your devices your business and your customers azure sphere is made up of three interrelated components 1 azure sphere certified mcus 2 azure sphere os 3 azure sphere security service azure sphere end to end resources azure sphere end to end png azure sphere architecture the azure sphere is built on the mediatek mt3620 this crossover mcu consists of 5 cores there is a dedicated communications core a dedicated security subsystem core and three user application cores the three application cores are as follows 1 x arm cortex a7 core running embedded linux built with yocto exposing a set of posix apis developers can build and deploy a high level application to this core this core is also responsible for the trustzone security monitor threat detection reporting and os and application life cycle management 2 x arm cortex m4fs developers can build and deploy real time applications to these cores real time applications can be built against the bare metal or built using real time frameworks such as azure rtos threadx and azure rtos with visual studio https visualstudio microsoft com downloads wt mc id iot 0000 dglover free community edition or better or visual studio code https code visualstudio com wt mc id iot 0000 dglover you can develop and debug applications running on all three cores for example you can simultaneously debug an app running on the a7 core and an m4 core azure rtos threadx app azure sphere architecture resources azure sphere architecture png application architecture the application running on the azure sphere consists of two parts resources application architecture png real time azure rtos threadx application the real time azure rtos threadx application running on one of the m4 cores that is responsible for running the timing sensitive hc sr04 ultrasonic distance sensor distance is measured every 20 milliseconds so the rover can decide the best route the sensor requires precise microsecond timing to trigger the distance measurement process so it is a perfect candidate for running on the real time core as an azure rtos threadx thread every 5 seconds an azure rtos threadx thread sends distance telemetry to the azure sphere a7 high level application azure iot high level application
the application running on the azure sphere a7 high level application core is responsible for less timing sensitive tasks such as establishing wifi network connectivity negotiating security and connecting with azure iot central updating the device twin and sending telemetry messages extending i am thinking about extending this solution with a local tinyml module for smarter navigation parts list 1 x seeed studio mt3620 mini dev board https www seeedstudio com mt3620 for azure sphere 1 x mt3620 grove breakout https www seeedstudio com mt3620 grove breakout p 4043 html 2 x grove ultrasonic distance sensor https www seeedstudio com grove ultrasonic distance sensor html 1 x h bridge driver seeed studio have a grove i2c motor driver https wiki seeedstudio com grove i2c motor driver v1 3 or you can wire up your own h bridge connector to the grove breakout board 1 x rover chassis motors wheels etc resources img 0172 cropped jpg azure iot central azure iot central https azure microsoft com services iot central wt mc id iot 0000 dglover provides an easy way to connect monitor and manage your internet of things iot assets at scale i created a free trial of azure iot central https azure microsoft com services iot central wt mc id iot 0000 dglover and in no time i had the rover distance sensor charted and available for deeper analysis by the way you can continue to connect two devices for free to iot central after the trial period expires resources iot central distance chart png extend and integrate azure iot central applications with other cloud services azure iot central is also extensible using rules and workflows for more information review use workflows to integrate your azure iot central application with other cloud services https docs microsoft com azure iot central core howto configure rules advanced wt mc id iot 0000 dglover how to build the solution 1 set up your azure sphere development environment https github com gloveboxes azure sphere learning path tree master zdocs visual studio iot central lab 0 introduction and lab set up 2 review integrate azure rtos threadx real time room sensors with azure iot https github com gloveboxes azure sphere learning path tree master zdocs vs code iot central lab 6 azurertos and inter core messaging 3 learn how to connect an azure sphere to azure iot central https github com gloveboxes azure sphere learning path tree master zdocs visual studio iot central lab 2 send telemetry to azure iot central or azure iot hub https github com gloveboxes azure sphere learning path tree master zdocs vs code iot hub lab 2 send telemetry to azure iot hub 4 the iot central device template capabilities model json file for this solution is included in the iot central directory of this repo how to install the real time tool chain on linux 1 download the gnu arm embedded toolchain https developer arm com tools and software open source software developer tools gnu toolchain gnu rm downloads 2 install the downloaded package i install in the opt directory bash sudo tar xjvf gcc arm none eabi 9 2020 q2 update x86 64 linux tar bz2 c opt 3 update your path open bashrc and at the end add bash export path path opt gcc arm none eabi 9 2020 q2 update bin how to update to the latest threadx bits how to determine current threadx version 1 open tx api h c application interface definition release tx api h portable c 6 0 1 download latest bits clone the threadx github repo bash git clone https github com azure rtos threadx git 1 update the cortex m4 bits https github com azure rtos threadx
tree master ports cortex m4 gnu 2 update the threadx common bits https github com azure rtos threadx tree master common customise for azure sphere 1 open tx initialize low level s 2 change system clock 200000000 3 comment out global ram segment used end and the related code as there is no such symbol in the ld file it s likely these areas are already commented out asm global ram segment used end and asm set base of available memory to end of non initialised ram area ldr r0 tx initialize unused memory build address of unused memory pointer ldr r1 ram segment used end build first free address add r1 r1 4 str r1 r0 setup first unused memory pointer how to update to the latest mediatek bits mediatek mt3620 m4 driver real time application sample code https github com mediatek labs mt3620 m4 software mediatek regularly update the mt3620 m4 drivers and samples enabling the system tick handler open the mt3620 m4 bsp scr nvic c file search for c void systmtick handler void sys tick in ms ifdef osai freertos extern void systick handler void systick handler endif comment out the ifdef endif directives to enable the system tick handler for azure threadx void systmtick handler void sys tick in ms ifdef osai freertos extern void systick handler void systick handler endif tips and tricks milliseconds to ticks c convert from millisecond to threadx tick value define ms to tick ms ms tx timer ticks per second 1000 ticks per millisecond 1 tick 10ms it is configurable have fun and stay safe and be sure to follow us on julyot https twitter com hashtag julyot src hash ref src twsrc 5etfw | os |
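As a side note on why the HC-SR04 needs the real-time core: the sensor reports distance as the width of an echo pulse, so timing jitter translates directly into distance error. The conversion itself is simple arithmetic; the sketch below is generic HC-SR04 math, not code from this project.

```python
SPEED_OF_SOUND_CM_PER_US = 0.0343  # at roughly 20 degrees C

def echo_to_cm(pulse_width_us):
    """Convert an HC-SR04 echo pulse width (microseconds) to distance.
    Sound travels to the obstacle and back, hence the division by two."""
    return pulse_width_us * SPEED_OF_SOUND_CM_PER_US / 2

# a 580 us echo corresponds to roughly 10 cm; a 1 us timing error
# already shifts the result by about 0.017 cm
assert round(echo_to_cm(580), 1) == 9.9
```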
arex | div align center h1 arex ui h1 div div align center a user friendly visual frontend interface for operating arex div get started 1 installation please refer to install documentation http arextest com docs chapter1 quick installation 2 getting started please visit our documentation website to get started arex documentation http arextest com docs chapter1 get started feature code no invasion based data collection and automation mock mock most components such as dubbo http redis persistence layer framework configuration center supports verification of a variety of complex business scenarios including multi threaded concurrency asynchronous callbacks write operations and so on rapid reproduction of production bugs directly use the production recorded data in the local environment join the arex community qq group 656108079 follow us on twitter https twitter com arex test join the mailing list https groups google com g arex test join arex slack https arexcommunity slack com ssb redirect join arex discord https discord gg wy3czhnv9k license text copyright c 2022 arextest this program is free software you can redistribute it and or modify it under the terms of the gnu affero general public license as published by the free software foundation either version 3 of the license or at your option any later version this program is distributed in the hope that it will be useful but without any warranty without even the implied warranty of merchantability or fitness for a particular purpose see the gnu affero general public license for more details you should have received a copy of the gnu affero general public license along with this program if not see https www gnu org licenses | front_end |
Book_List | img src images books jpg h1 id hackerranksolutions align center book list h1 h2 new content every week h2 h3 put a star if you find this repo useful and follow me on a href https github com mukeshmithrakumar img src https img shields io badge github black svg alt github a to get notified when i upload new content h2 books h2 data science from scratch by steven cooper https github com mukeshmithrakumar book list blob master data 20science 20from 20scratch pdf data science tutorial library https github com mukeshmithrakumar book list blob master data 20science 20tutorial 20library pdf deep learning masterpiece by andrew ng https github com mukeshmithrakumar book list blob master deep 20learning 20masterpiece 20by 20andrew 20ng pdf introduction to machine learning https github com mukeshmithrakumar book list blob master introduction 20to 20machine 20learning pdf 30000 questions https github com mukeshmithrakumar book list blob master 30000 20questions pdf 7 data science machine learning cheatsheets in one https github com mukeshmithrakumar book list blob master data 20science 20machine 20learning 20cheatsheets pdf machine learning and deep learning https github com mukeshmithrakumar book list blob master machine 20learning 20and 20deep 20learning pdf language processing and python https github com mukeshmithrakumar book list blob master language 20processing 20and 20python pdf learning pandas https github com mukeshmithrakumar book list blob master learning 20pandas pdf think stats https github com mukeshmithrakumar book list blob master think 20stats pdf rules of machine learning https github com mukeshmithrakumar book list blob master rules 20of 20machine 20learning pdf python machine learning projects https github com mukeshmithrakumar book list blob master python 20machine 20learning 20projects pdf dive into deep learning https github com mukeshmithrakumar book list blob master dive 20into 20deep 20learning pdf machine learning for marketers https github com mukeshmithrakumar book list blob master machine 20learning 20for 20marketers pdf statistical foundations of machine learning https github com mukeshmithrakumar book list blob master statistical 20foundations 20of 20machine 20learning pdf python code for artificial intelligence https github com mukeshmithrakumar book list blob master python 20code 20for 20artificial 20intelligence pdf fundamentals of python https github com mukeshmithrakumar book list blob master fundamentals 20of 20python pdf h3 if you are looking for courses to take to learn machine learning check out a href https github com mukeshmithrakumar learn ml in 6 months learn ml in 6 months a h3 h3 if you are working on hackerrank to improve your coding skills check out my a href https github com mukeshmithrakumar hackerranksolutions hackerranksolutions a for python java c c shell sql javascript and interview preparation kit solutions h3 h3 if you want to learn deep learning with tensorflow take a look at a href https github com adhiraiyan deeplearningwithtf2 0 deeplearningwithtf2 0 a h3 h3 | books free python machine-learning data-science algorithms deep-learning | ai |
dentalvision | dentalvision active shape model implementation for dental radiographs using a blend of the methods suggested by cootes et al 1992 active shape models their training and application and blanz et al 2004 a statistical method for robust 3d surface reconstruction from sparse data extensive comments can be found in the code | computer-vision machine-learning | ai |
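The point-distribution model from Cootes et al. (1992), which this project builds on, represents any plausible shape as the mean shape plus a weighted sum of principal modes, x = mean + P b. Below is a minimal numpy sketch of that idea; it is illustrative only and is not this repo's code.

```python
import numpy as np

def build_shape_model(shapes, n_modes=5):
    """PCA point-distribution model in the spirit of Cootes et al. (1992).

    shapes: (n_samples, 2 * n_landmarks) aligned landmark vectors
    Returns the mean shape, the top eigenvectors and their variances.
    """
    mean = shapes.mean(axis=0)
    cov = np.cov(shapes - mean, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)        # ascending order
    order = np.argsort(eigvals)[::-1][:n_modes]   # keep largest modes
    return mean, eigvecs[:, order], eigvals[order]

def synthesize(mean, modes, variances, b):
    """Generate a plausible shape x = mean + P b, clamping each
    coefficient to +/- 3 standard deviations as the paper suggests."""
    limit = 3 * np.sqrt(variances)
    b = np.clip(b, -limit, limit)
    return mean + modes @ b
```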
fishtank-vue | fishtank vue fish tank vuejs component module https fishtank bna com installation sh npm install fishtank fishtank vue usage please reference https fishtank bna com https fishtank bna com for detailed documentation esm module import the component from the library module js import fishtankcard from fishtank fishtank vue extend default components fishtankcard commonjs module import the component library module and reference the desired component js const fishtank require fishtank vue extend default components fishtank fishtankcard or js const card require fishtank vue fishtankcard extend default components card component css fishtank vue provided styles are compiled and self contained within the component itself there are no additional css imports contributing contributing documentation github contributing md dev environment setup bash install dependencies npm install or yarn serve with hot reload at localhost 8080 npm run serve or yarn serve | fish-tank fishtank-vue | os |
fundamental-styles | fundamental library styles a href https badge fury io js fundamental styles img src https badge fury io js fundamental styles svg alt npm version a img src https github com sap fundamental styles actions workflows create release yml badge svg branch main alt ci status img src https img shields io npm dm fundamental styles label npm 20downloads alt npm downloads a href https join slack com t ui fundamentals shared invite enqtntizotu0mzc2ntc5lwqzzwi5mwfhyje5otc4yzlin2jhotc1zjqxztg1yjzimwziyzrknjmwyzgymmfkymnhzdvjmwe5mdizowezmmm img src https img shields io badge slack ui fundamentals blue svg logo slack alt slack a a href https api reuse software info github com sap fundamental styles img src https api reuse software badge github com sap fundamental styles alt reuse status a a href https storybook js org img src https raw githubusercontent com storybookjs brand main badge badge storybook svg alt storybookjs a a href https www netlify com img src https www netlify com img global badges netlify light svg alt deploys by netlify a what is fundamental library styles fundamental library styles is a lightweight presentation layer that can be used in conjunction with any ui framework such as angular react vue etc by utilizing the fundamental styles library which includes a collection of stylesheets and html tags developers can create visually consistent and professional looking fiori applications in any web based technology of their choice we are also working on angular https github com sap fundamental ngx react https github com sap fundamental react and vue https github com sap fundamental vue implementations getting started the library is modular so you can use as little or as much as you need cdn the fully compiled minified library is available via unpkg cdn https unpkg com for inclusion in your application for prerelease version use html link href https unpkg com fundamental styles prerelease dist fundamental styles css rel stylesheet for latest stable version use html link href https unpkg com fundamental styles latest dist fundamental styles css rel stylesheet you can also include a specific version of the library in your html by using html link href https unpkg com fundamental styles versionnumber dist fundamental styles css rel stylesheet where you should replace versionnumber with the desired version number for example with 0 20 3 theming to use a particular theme you need to include two css variables files html link href https unpkg com sap theming theming base content content base baselib themename css variables css rel stylesheet html link href https unpkg com fundamental styles versionnumber dist theming themename css rel stylesheet available values for themename are sap horizon sap horizon dark sap horizon hcb sap horizon hcw sap fiori 3 sap fiori 3 dark sap fiori 3 hcb sap fiori 3 hcw sap fiori 3 light dark npm package the compiled css for the full library and modules e g core layout etc are distributed via npm https www npmjs com package fundamental styles npm install fundamental styles save note we only distribute compiled css for each component not the full project or html for specific components icons see the icon component https fundamental styles netlify app path docs components icons sap icons sizes for a list of icon class names see project configuration below for instructions to include sap fiori icons in your project project configuration this project does not contain fonts and icons they must be added to your project separately download the sap theming library
after adding fonts and icons to your project include the following in your css css font face font family 72 src url sap theming theming base content content base baselib basetheme fonts 72 regular full woff format woff font weight normal font style normal font face font family 72 src url sap theming theming base content content base baselib basetheme fonts 72 bold full woff format woff font weight 700 font style normal font face font family 72 src url sap theming theming base content content base baselib basetheme fonts 72 light full woff format woff font weight 300 font style normal font face font family sap icons src url sap theming theming base content content base baselib sap horizon fonts sap icons woff format woff font weight normal font style normal font face font family businesssuiteinappsymbols src url sap theming theming base content content base baselib sap horizon fonts businesssuiteinappsymbols woff format woff font weight normal font style normal font face font family sap icons tnt src url sap theming theming base content content base baselib basetheme fonts sap icons tnt woff format woff font weight normal font style normal html font size 16px working with the project download and installation 1 clone repository clone the repo using the git software of your choice or using the git command git clone https github com sap fundamental styles git 2 install npm dependencies npm install 3 serve the development playground and documentation website locally 1 if you want to serve with development environment run npm start 2 for production build serve run npm run start prod project dependencies the project has the following prerequisites git for downloading this repo node lts https nodejs org sla our service level agreement fundamental library styles is aiming to deliver sla what consumable css that strives for fiori https www sap com products fiori html compliance reference html specification that consuming libraries must adhere to sla how theme able components built on top of sap theming base content https github com sap theming base content by consuming the css custom properties delivered by the theming library self contained styles that is each component s style file contains all the styling needed to be rendered properly external styling won t bleed in internal styling won t bleed out bleeding in means that css global reset won t affect the component and bleeding out means that the component styling should not affect other html elements accessibility support accessibility color contrast support for wcag 2 0 level aa 4 5 1 for typical text accessibility semantic html reference accessibility aria attributes noted when possible in html reference this library is also being consumed by fundamental library for angular https github com sap fundamental ngx fundamental library for react https github com sap fundamental react and fundamental library for vue https github com sap fundamental vue the above sla is the primary difference between this library and the earlier fundamental https github com sap fundamental support if you encounter an issue you can create a ticket https github com sap fundamental styles issues new choose or post on the fundamentals slack channel https join slack com t ui fundamentals shared invite zt 6op8woeb 0 urqrgzemm3updfqehbaw contributing if you want to contribute please check the contribution guidelines https github com sap fundamental styles wiki contribution guidelines also check the development guidelines https github com sap fundamental styles wiki 
development guidelines and visual testing guide https github com sap fundamental styles wiki visual testing with chromatic versioning the fundamental styles library follows semantic versioning https semver org these components strictly adhere to the major minor patch numbering system also known as breaking feature fix merges to the main branch will be published as a prerelease pre releases will include an rc version e g major minor patch rc rc the following circumstances will be considered a major or breaking change dropping existing classnames css variables color names color groups spacing parameters the existing underlying html markup of a component is altered non visual html attribute changes additions such as role aria data note fundamental styles provides css directly and html as reference to consumers because of the reference relationship of the html seen in fundamental styles we want to be very clear when we alter that reference so that it is properly reflected in js implementation libraries because of this even non visual changes will be treated as breaking the following circumstances will not be considered a major or breaking change introducing new classnames css variables color names color groups spacing parameters adding or modifying css properties and values of existing classnames fundamental library github repository the fundamental library github repository is a monorepo package that allows the reuse of other packages while keeping them isolated from one another the fundamental library github repository consists of customer experience package https github com sap fundamental styles tree main packages cx fundamental library next package https github com sap fundamental styles tree main packages fn common css https github com sap fundamental styles tree main packages common css styles package https github com sap fundamental styles tree main packages styles thanks a href https www chromatic com img src https user images githubusercontent com 321738 84662277 e3db4f80 af1b 11ea 88f5 91d67a5e59f6 png width 153 height 30 alt chromatic a thanks to chromatic https www chromatic com for providing the visual testing platform that helps us review ui changes and catch visual regressions | open-source | os |
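For completeness, the two theming links described above follow a fixed URL pattern per theme. A hypothetical Python helper that assembles them is shown below; note that the unpkg path casing is reconstructed from the flattened text above and should be verified against the real package layout.

```python
def theme_links(theme='sap_horizon', version='0.20.3'):
    """Build the two stylesheet URLs the theming section describes.
    Path casing is an assumption reconstructed from the text above."""
    return (
        'https://unpkg.com/@sap-theming/theming-base-content/'
        f'content/Base/baseLib/{theme}/css_variables.css',
        f'https://unpkg.com/fundamental-styles@{version}/dist/theming/{theme}.css',
    )

# e.g. the dark Horizon theme against a pinned library version
variables_css, theme_css = theme_links('sap_horizon_dark', '0.20.3')
```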
onion-monero-blockchain-explorer | onion monero blockchain explorer currently available monero blockchain explorers have several limitations which are of special importance to privacy oriented users they use javascript have images which might be used for cookieless tracking http lucb1e com rp cookielesscookies track users activities through google analytics are closed sourced are not available as hidden services do not support monero testnet nor stagenet networks have limited json api in this example these limitations are addressed by development of an onion monero blockchain explorer the example not only shows how to use monero c libraries but also demonstrates how to use crow https github com ipkn crow c micro web framework mstch https github com no1msd mstch c mustache templates json https github com nlohmann json json for modern c fmt https github com fmtlib fmt small safe and fast string formatting library explorer hosts clearnet versions https xmrchain net https xmrchain net https enabled most popular and very stable https monerohash com explorer https monerohash com explorer nice looking one https enabled http monerochain com http monerochain com json api based multiple nodes https blox minexmr com https blox minexmr com https enabled testnet version https testnet xmrchain com https testnet xmrchain com https enabled stagenet version https stagenet xmrchain net https stagenet xmrchain net i2p users main monero network http 7o4gezpkye6ekibhgpkg7v626ze4idsirapufzrefkdysa6zxhha b32 i2p http 7o4gezpkye6ekibhgpkg7v626ze4idsirapufzrefkdysa6zxhha b32 i2p tor versions http exploredv42tq2nowrll6f27nuymenndwupueqvyugaqzbrvmjhhceqd onion http exploredv42tq2nowrll6f27nuymenndwupueqvyugaqzbrvmjhhceqd onion native v3 onion json api enabled emission enabled rawtx enabled alternative block explorers https localmonero co blocks https localmonero co blocks https monerovision com https monerovision com onion monero blockchain explorer features the key features of the onion monero blockchain explorer are no cookies no web analytics trackers no images open sourced made fully in c showing encrypted payment id showing ring signatures showing transaction extra field showing public components of monero addresses decoding which outputs and mixins belong to the given monero address and viewkey can prove that you sent monero to someone detailed information about ring members such as their age timescale and ring sizes showing number of amount output indices support monero testnet and stagenet networks tx checker and pusher for online pushing of transactions estimate possible spendings based on address and viewkey can provide total amount of all miner fees decoding encrypted payment id decoding outputs and proving txs sent to sub address development branch current development branch https github com moneroexamples onion monero blockchain explorer tree devel note devel branch of the explorer follows master branch of the monero compilation on ubuntu 18 04 20 04 monero download and compilation to download and compile recent monero follow instructions in the following link https github com moneroexamples monero compilation blob master readme md compile and run the explorer once monero compiles the explorer can be downloaded and compiled as follows bash go to home folder if still in monero cd download the source code git clone https github com moneroexamples onion monero blockchain explorer git enter the downloaded source code folder cd onion monero blockchain explorer make a build folder and enter it mkdir
build cd build create the makefile cmake compile make to run it xmrblocks by default it will look for blockchain in its default location i e bitmonero lmdb you can use b option if it's in a different location for example bash xmrblocks b home mwo non default monero location lmdb example output bash mwo arch onion monero blockchain explorer xmrblocks 2016 may 28 10 04 49 160280 blockchain initialized last block 1056761 d0 h0 m12 s47 time ago current difficulty 1517857750 2016 05 28 02 04 49 info crow 0 1 server is running local port 8081 go to your browser http 127 0 0 1 8081 compiling and running with docker the explorer can also be compiled using docker build as described below by default it compiles against latest release release v0 17 branch of monero build using all cpu cores docker build no cache t xmrblocks alternatively specify number of cores to use e g 2 docker build no cache build arg nproc 2 t xmrblocks to build against development branch of monero i e master branch docker build no cache build arg nproc 3 build arg monero branch master t xmrblocks the build needs 3 gb space the final container image is 179mb to run it mount the monero blockchain onto the container as a volume either run in foreground docker run it v path to monero blockchain on the host home monero bitmonero p 8081 8081 xmrblocks or in background docker run it d v path to monero blockchain on the host home monero bitmonero p 8081 8081 xmrblocks example output docker run rm it v mnt w7 bitmonero home monero bitmonero p 8081 8081 xmrblocks staring in non ssl mode 2020 04 20 16 20 00 info crow 0 1 server is running at 0 0 0 0 8081 using 1 threads docker compose example the explorer can also be built and run using docker compose i e yaml version 3 services monerod image sethsimmons simple monerod latest restart unless stopped container name monerod volumes xmrdata home monero bitmonero ports 18080 18080 18089 18089 command rpc restricted bind ip 0 0 0 0 rpc restricted bind port 18089 public node no igd enable dns blocklist prune blockchain explore image xmrblocks latest build onion monero blockchain explorer container name explore restart unless stopped volumes xmrdata home monero bitmonero ports 8081 8081 command xmrblocks daemon url monerod 18089 enable json api enable autorefresh option enable emission monitor enable pusher volumes xmrdata to build this image run the following bash git clone https github com moneroexamples onion monero blockchain explorer git docker compose build or build and run in one step via bash git clone https github com moneroexamples onion monero blockchain explorer git docker compose up d when running via docker please use something like traefik https doc traefik io traefik or enable ssl enable ssl https to secure communications the explorer s command line options xmrblocks onion monero blockchain explorer h help arg 1 0 produce help message t testnet arg 1 0 use testnet blockchain s stagenet arg 1 0 use stagenet blockchain enable pusher arg 1 0 enable signed transaction pusher enable mixin details arg 1 0 enable mixin details for key images e g timescale mixin of mixins in tx context enable key image checker arg 1 0 enable key images file checker enable output key checker arg 1 0 enable outputs key file checker enable json api arg 1 0 enable json rest api enable as hex arg 1 0 enable links to provide hex representations of a tx and a block enable autorefresh option arg 1 0 enable users to have the index page on autorefresh enable emission monitor arg 1 0 enable monero total emission monitoring
thread p port arg 8081 default explorer port x bindaddr arg 0 0 0 0 default bind address for the explorer testnet url arg you can specify testnet url if you run it on mainnet or stagenet link will show on front page to testnet explorer stagenet url arg you can specify stagenet url if you run it on mainnet or testnet link will show on front page to stagenet explorer mainnet url arg you can specify mainnet url if you run it on testnet or stagenet link will show on front page to mainnet explorer no blocks on index arg 10 number of last blocks to be shown on index page mempool info timeout arg 5000 maximum time in milliseconds to wait for mempool data for the front page mempool refresh time arg 5 time in seconds for each refresh of mempool state c concurrency arg 0 number of threads handling http queries default is 0 which means it is based on the cpu b bc path arg path to lmdb folder of the blockchain e g bitmonero lmdb ssl crt file arg path to crt file for ssl https functionality ssl key file arg path to key file for ssl https functionality d daemon url arg http 127 0 0 1 18081 monero daemon url daemon login arg specify username password for daemon rpc client enable mixin guess arg 1 0 enable guessing real outputs in key images example usage defined as bash aliases bash for mainnet explorer alias xmrblocksmainnet onion monero blockchain explorer build xmrblocks port 8081 testnet url http 139 162 32 245 8082 enable pusher enable emission monitor for testnet explorer alias xmrblockstestnet onion monero blockchain explorer build xmrblocks t port 8082 mainnet url http 139 162 32 245 8081 enable pusher enable emission monitor example usage when running via docker bash run in foreground docker run it v path to monero blockchain on the host home monero bitmonero p 8081 8081 xmrblocks xmrblocks daemon url node sethforprivacy com 18089 enable json api enable autorefresh option enable emission monitor enable pusher run in background docker run it d v path to monero blockchain on the host home monero bitmonero p 8081 8081 xmrblocks xmrblocks daemon url node sethforprivacy com 18089 enable json api enable autorefresh option enable emission monitor enable pusher make sure to always start the portion of command line flags with xmrblocks and set any flags you would like after that as shown above enable monero emission obtaining current monero emission amount is not straightforward thus by default it is disabled to enable it use enable emission monitor flag e g bash xmrblocks enable emission monitor this flag will enable the emission monitoring thread when started the thread will initially scan the entire blockchain and calculate the cumulative emission based on each block since it is a separate thread the explorer will work as usual during this time every 10000 blocks the thread will save current emission in a file by default in bitmonero lmdb emission amount txt for testnet or stagenet networks it is bitmonero testnet lmdb emission amount txt or bitmonero stagenet lmdb emission amount txt this file is used so that we don t need to rescan entire blockchain whenever the explorer is restarted when the explorer restarts the thread will first check if bitmonero lmdb emission amount txt is present read its values and continue from there if possible subsequently only the initial use of the thread is time consuming once the thread scans the entire blockchain it updates the emission amount using new blocks as they come since the explorer writes this file there can be only one instance of it running for mainnet testnet
and stagenet thus for example you can t have two explorers for mainnet running at the same time as they will be trying to write and read the same file at the same time leading to unexpected results of course having one instance for mainnet and one instance for testnet is fine as they write to different files when the emission monitor is enabled information about current emission of coinbase and fees is displayed on the front page e g monero emission fees is 14485540 430 52545 373 as of block 1313448 the values given can be checked using monero daemon s print coinbase tx sum command for example for the above example print coinbase tx sum 0 1313449 to disable the monitor simply restart the explorer without enable emission monitor flag enable ssl https by default the explorer does not use ssl but it has such a functionality as an example you can generate your own ssl certificates as follows bash cd tmp example folder openssl genrsa out server key 1024 openssl req new key server key out server csr openssl x509 req days 3650 in server csr signkey server key out server crt having the crt and key files run xmrblocks in the following way bash xmrblocks ssl crt file tmp server crt ssl key file tmp server key note because we generated our own certificate modern browsers will complain about it as they can t verify the signatures against any third party so for any practical use you will probably need to have properly issued ssl certificates json api the explorer has a json api for the api it uses conventions defined by jsend https labs omniti com labs jsend by default the api is disabled to enable it use enable json api flag e g xmrblocks enable json api api transaction tx hash bash curl w n x get http 127 0 0 1 8081 api transaction 6093260dbe79fd6277694d14789dc8718f1bd54457df8bab338c2efa3bb0f03d partial results shown json data block height 1268252 coinbase false confirmations 1057855 current height 2326107 extra 01be23e277aed6b5f41f66b05244bf994c13108347366ec678ae16657f0fc3a22b inputs amount 0 key image 67838fd0ffd79f13e735830d3ec60412aed59e53e1f997feb6f73d088b949611 mixins block no 1238623 public key 0a5b853c55303c10e1326acfb085b9e246e088b1ccac7e37f7a810d46a28a914 tx hash 686555fb053dd53f6f9eb79449e2bdcd377221f823f508158d70d4a1966fe955 block no 1246942 public key 527cf86f5abbfb006c970f7c6eb40493786d4751306f8985c6a43f98a88c0dff tx hash 4fa1999f9e0d2ad031dbe5594f2e8336651b6cad19dd3cee7980a01c47600f91 mixin 9 outputs amount 0 public key 525779873776e4a42f517fd79b72e7c31c3ba03e730fc32287f6414fb702c1d7 amount 0 public key e25f00fceb77af841d780b68647618812695b4ca6ebe338faba6e077f758ac30 payment id payment id8 rct type 1 timestamp 1489753456 timestamp utc 2017 03 17 12 24 16 tx fee 12517785574 tx hash 6093260dbe79fd6277694d14789dc8718f1bd54457df8bab338c2efa3bb0f03d tx size 13323 tx version 2 xmr inputs 0 xmr outputs 0 status success api transactions transactions in last 25 blocks bash curl w n x get http 127 0 0 1 8081 api transactions partial results shown json data blocks age 33 16 49 53 height 1268252 size 105390000000000000 timestamp 1489753456 timestamp utc 2017 03 17 12 24 16 txs coinbase true mixin 0 outputs 8491554678365 rct type 0 tx fee 0 tx hash 7c4286f64544568265bb5418df84ae69afaa3567749210e46f8340c247f4803f tx size 151000000000000 tx version 2 coinbase false mixin 5 outputs 0 rct type 2 tx fee 17882516700 tx hash 2bfbccb918ee5f050808dd040ce03943b7315b81788e9cdee59cf86b557ba48c tx size 19586000000000000 tx version 2 limit 25 page 0 status success api transactions page page no limit tx per page bash curl
w n x get http 127 0 0 1 8081 api transactions page 2 limit 10 result analogical to the one above api block block number block hash bash curl w n x get http 139 162 32 245 8081 api block 1293257 partial results shown json data block height 1293257 block reward 0 current height 1293264 hash 9ef6bb8f9b8bd253fc6390e5c2cdc45c8ee99fad16447437108bf301fe6bd6e1 size 141244 timestamp 1492761974 timestamp utc 2017 04 21 08 06 14 txs coinbase true extra 018ae9560eb85d5ebd22d3beaed55c21d469eab430c5e3cac61b3fe2f5ad156770020800000001a9030800 mixin 0 payment id payment id8 rct type 0 tx fee 0 tx hash 3ff71b65bec34c9261e01a856e6a03594cf0472acf6b77db3f17ebd18eaa30bf tx size 95 tx version 2 xmr inputs 0 xmr outputs 8025365394426 status success api mempool return all txs in the mempool bash curl w n x get http 127 0 0 1 8081 api mempool partial results shown json data limit 100000000 page 0 total page no 0 txs coinbase false extra 022100325f677d96f94155a4840a84d8e0c905f7a4697a25744633bcb438feb1e51fb2012eda81bf552c53c2168f4130dbe0265c3a7898f3a7eee7c1fed955a778167b5d mixin 3 payment id 325f677d96f94155a4840a84d8e0c905f7a4697a25744633bcb438feb1e51fb2 payment id8 rct type 2 timestamp 1494470894 timestamp utc 2017 05 11 02 48 14 tx fee 15894840000 tx hash 9f3374f8ac67febaab153eab297937a3d0d2c706601e496bf5028146da0c9aef tx size 13291 tx version 2 xmr inputs 0 xmr outputs 0 txs no 7 status success limit of 100000000 is just the default value above to ensure that all mempool txs are fetched if no specific limit given api mempool limit no of top txs return number of newest mempool txs e g only 10 bash curl w n x get http 127 0 0 1 8081 api mempool limit 10 result analogical to the one above api search block number tx hash block hash bash curl w n x get http 127 0 0 1 8081 api search 1293669 partial results shown json data block height 1293669 current height 1293670 hash 5d55b8fabf85b0b4c959d66ad509eb92ddfe5c2b0e84e1760abcb090195c1913 size 118026 timestamp 1492815321 timestamp utc 2017 04 21 22 55 21 title block txs coinbase true extra 01cb7fda09033a5fa06dc601b9295ef3790397cf3c645e958e34cf7ab699d2f5230208000000027f030200 mixin 0 payment id payment id8 rct type 0 tx fee 0 tx hash 479ba432f5c88736b438dd4446a11a13046a752d469f7828151f5c5b86be4e9a tx size 95 tx version 2 xmr inputs 0 xmr outputs 7992697599717 status success api outputs txhash tx hash address address viewkey viewkey txprove 0 1 for txprove 0 we check which outputs belong to the given address and corresponding viewkey for txprove 1 we use it to prove to the recipient that we sent them funds for this we use recipient s address and our tx private key as a viewkey value i e viewkey tx private key checking outputs bash we use here official monero project s donation address as an example curl w n x get http 127 0 0 1 8081 api outputs txhash 17049bc5f2d9fbca1ce8dae443bbbbed2fc02f1ee003ffdd0571996905faa831 address 44affq5ksigboz4nmdwytn18obc8aems33dblws3h7otxft3xjrpdtqgv7sqssabybb98unbr2vbbet7f2wfn3rvgqbep3a viewkey f359631075708155cc3d92a32b75a7d02a5dcf27756707b47a2b31b21c389501 txprove 0 json data address 42f18fc61586554095b0799b5c4b6f00cdeb26a93b20540d366932c6001617b75db35109fbba7d5f275fef4b9c49e0cc1c84b219ec6ff652fda54f89f7f63c88 outputs amount 34980000000000 match true output idx 0 output pubkey 35d7200229e725c2bce0da3a2f20ef0720d242ecf88bfcb71eff2025c2501fdb amount 0 match false output idx 1 output pubkey 44efccab9f9b42e83c12da7988785d6c4eb3ec6e7aa2ae1234e2f0f7cb9ed6dd tx hash 17049bc5f2d9fbca1ce8dae443bbbbed2fc02f1ee003ffdd0571996905faa831 tx prove false viewkey
f359631075708155cc3d92a32b75a7d02a5dcf27756707b47a2b31b21c389501 status success proving transfer we use recipient s address i e not our address from which we sent xmr to recipient for the viewkey we use tx private key although the get variable is still called viewkey that we obtained by sending this txs bash this is for testnet transaction curl w n x get http 127 0 0 1 8082 api outputs txhash 94782a8c0aa8d8768afa0c040ef0544b63eb5148ca971a024ac402cad313d3b3 address 9wuf8ucputb2huk7rphbw5pfcykoskxqtgxbckbdnztcprdnfjjljtuht87zhtgsffcb21qmjxjj18pw7cbnrctckhrub7n viewkey e94b5bfc599d2f741d6f07e3ab2a83f915e96fb374dfb2cd3dbe730e34ecb40b txprove 1 json data address 71bef5945b70bc0a31dbbe6cd0bd5884fe694bbfd18fff5f68f709438554fb88a51b1291e378e2f46a0155108782c242cc1be78af229242c36d4f4d1c4f72da2 outputs amount 1000000000000 match true output idx 0 output pubkey c1bf4dd020b5f0ab70bd672d2f9e800ea7b8ab108b080825c1d6cfc0b7f7ee00 amount 0 match false output idx 1 output pubkey 8c61fae6ada2a103565dfdd307c7145b2479ddb1dab1eaadfa6c34db65d189d5 tx hash 94782a8c0aa8d8768afa0c040ef0544b63eb5148ca971a024ac402cad313d3b3 tx prove true viewkey e94b5bfc599d2f741d6f07e3ab2a83f915e96fb374dfb2cd3dbe730e34ecb40b status success result analogical to the one above api networkinfo bash curl w n x get http 127 0 0 1 8081 api networkinfo json data alt blocks count 0 block size limit 600000 cumulative difficulty 2091549555696348 difficulty 7941560081 fee per kb 303970000 grey peerlist size 4991 hash rate 66179667 height 1310423 incoming connections count 0 outgoing connections count 5 start time 1494822692 status ok target 120 target height 0 testnet false top block hash 76f9e85d62415312758bc09e0b9b48fd2b005231ad1eee435a8081e551203f82 tx count 1219048 tx pool size 2 white peerlist size 1000 status success api outputsblocks search for our outputs in last few blocks up to 5 blocks using provided address and viewkey bash testnet address curl w n x get http 127 0 0 1 8081 api outputsblocks address 9sdynu82ih1gdhdgrqhbecfsdfasjfgxl9b9v5f1aytfurysvej7bd9pyx5sw2qlk8hggdfm8qj5dnecqghm24ce6qwegdi viewkey 807079280293998634d66e745562edaaca45c0a75c8290603578b54e9397e90a limit 5 mempool 1 example result json data address 0182d5be0f708cecf2b6f9889738bde5c930fad846d5b530e021afd1ae7e24a687ad50af3a5d38896655669079ad0163b4a369f6c852cc816dace5fc7792b72f height 960526 limit 5 mempool true outputs amount 33000000000000 block no 0 in mempool true output idx 1 output pubkey 2417b24fc99b2cbd9459278b532b37f15eab6b09bbfc44f9d17e15cd25d5b44f payment id tx hash 9233708004c51d15f44e86ac1a3b99582ed2bede4aaac6e2dd71424a9147b06f amount 2000000000000 block no 960525 in mempool false output idx 0 output pubkey 9984101f5471dda461f091962f1f970b122d4469077aed6b978a910dc3ed4576 payment id 0000000000000055 tx hash 37825d0feb2e96cd10fa9ec0b990ac2e97d2648c0f23e4f7d68d2298996acefd amount 96947454120000 block no 960525 in mempool false output idx 1 output pubkey e4bded8e2a9ec4d41682a34d0a37596ec62742b28e74b897fcc00a47fcaa8629 payment id 0000000000000000000000000000000000000000000000000000000000001234 tx hash 4fad5f2bdb6dbd7efc2ce7efa3dd20edbd2a91640ce35e54c6887f0ee5a1a679 viewkey 807079280293998634d66e745562edaaca45c0a75c8290603578b54e9397e90a status success api emission bash curl w n x get http 127 0 0 1 8081 api emission json data blk no 1313969 coinbase 14489473877253413000 fee 52601974988641130 status success emission only works when the emission monitoring thread is enabled api version bash curl w n x get http 127 0 0 1 8081 api version json data api 65536 
blockchain height 1357031 git branch name update to current monero last git commit date 2017 07 25 last git commit hash a549f25 monero version full 0 10 3 1 ab594cfe status success api number is store as uint32 t in this case 65536 represents major version 1 and minor version 0 in javascript to get these numbers one can do as follows javascript var api major response data api 16 var api minor response data api 0xffff api rawblock block number block hash return raw json block data as represented in monero bash curl w n x get http 139 162 32 245 8081 api rawblock 1293257 example result not shown api rawtransaction tx hash return raw json tx data as represented in monero bash curl w n x get http 139 162 32 245 8081 api rawtransaction 6093260dbe79fd6277694d14789dc8718f1bd54457df8bab338c2efa3bb0f03d example result not shown other monero examples other examples can be found on github https github com moneroexamples tab repositories please know that some of the examples repositories are not finished and may not work as intended how can you help constructive criticism code and website edits are always good they can be made through github | blockchain-explorer monero linux cpp11 | blockchain |
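As a quick illustration of consuming the explorer API documented in the row above, here is a minimal Python sketch. It assumes a local explorer instance on 127.0.0.1:8081 (as in the curl examples) plus the third-party `requests` package; field names follow the sample JSON responses shown above, and the version decoding mirrors the JavaScript bit-arithmetic snippet.

```python
# Minimal sketch: query the onion-monero-blockchain-explorer REST API.
# Assumes a local explorer on port 8081, as in the curl examples above.
import requests

BASE = "http://127.0.0.1:8081/api"

# /api/version returns the packed api number (e.g. 65536 = version 1.0).
version = requests.get(f"{BASE}/version").json()["data"]
api_major = version["api"] >> 16      # same arithmetic as the JS snippet
api_minor = version["api"] & 0xFFFF
print(f"explorer api version: {api_major}.{api_minor}")

# /api/networkinfo returns current height, difficulty, hash rate, and so on.
info = requests.get(f"{BASE}/networkinfo").json()["data"]
print("height:", info["height"], "difficulty:", info["difficulty"])

# /api/mempool?limit=10 returns only the newest mempool transactions.
mempool = requests.get(f"{BASE}/mempool", params={"limit": 10}).json()["data"]
print("mempool txs:", mempool["txs_no"])
```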
tutorials.Step-by-Step | step by step tutorials img src img logo png align left width 162 https www fiware org license mit https img shields io github license fiware tutorials step by step svg https opensource org licenses mit support badge https img shields io badge tag fiware orange svg logo stackoverflow https stackoverflow com questions tagged fiware this is an umbrella repository which holds collections of ngsi v2 and ngsi ld tutorials for developers wishing to learn about the fiware https www fiware org ecosystem and allow users and developers to easily navigate to the relevant source code documentation and docker images global summit banner ad a href https www fiware org global summit img src https fiware github io catalogue img summit23 png width 240 height 70 a a href https www eventbrite com e fiware on site training tickets 591474775977 img src https fiware github io catalogue img training23 png width 240 height 70 a img src https img shields io badge ngsi v2 5dc0cf svg width 90 align left https fiware ges github io orion api v2 stable smart supermarket this is a collection of tutorials for the fiware ecosystem designed for ngsi v2 developers each tutorial based around a smart supermarket consists of a series of exercises to demonstrate the correct use of individual fiware components using ngsi v2 interfaces and shows the flow of context data within a simple smart solution either by connecting to a series of dummy iot devices or manipulating the context directly or programmatically movie camera introduction br to ngsi v2 https www youtube com watch v pk4ggyjlmdy books ngsi v2 tutorial br documentation https fiware tutorials rtfd io https fiware tutorials letsfiware jp br img src https img shields io badge ngsi ld d6604d svg width 90 align left https www etsi org deliver etsi gs cim 001 099 009 01 03 01 60 gs cim009v010301p pdf smart farm this is a collection of tutorials for the fiware ecosystem designed for ngsi ld developers linked data concepts are explained using the entities from a smart farm each tutorial then demonstrates the correct use of individual fiware components via the ngsi ld interface and shows the flow of context data within a simple smart solution either by connecting to a series of dummy iot devices or manipulating the context directly or programmatically movie camera introduction br to linked data https www youtube com watch v 4x xzt5ef5q movie camera introduction br to ngsi ld https www youtube com watch v rz13iylpata books ngsi ld tutorial br documentation https ngsi ld tutorials rtfd io should i use ngsi v2 or ngsi ld fiware offers two flavours of the ngsi interfaces ngsi v2 offers json based interoperability used in individual smart systems ngsi ld offers json ld based interoperability used for federations and data spaces ngsi v2 is ideal for creating individual applications offering interoperable interfaces for web services or iot devices it is easier to understand than ngsi ld and does not require a json ld context https www w3 org tr json ld11 the context however ngsi ld and linked data is necessary when creating a data space or introducing a system of systems aproach and in situations requiring interoperability across apps and organisations install to download the full set of tutorials simply clone this repository console git clone https github com fiware tutorials step by step git cd tutorials step by step git submodule update init recursive the ngsi v2 and ngsi ld tutorials are then available under the ngsi v2 and ngsi ld directories respectively 
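To make the NGSI-v2 versus NGSI-LD distinction described above concrete, here is a minimal illustration (not taken from the tutorials themselves) of how the same entity might look in each flavour: v2 as plain JSON for an individual smart system, LD as JSON-LD with URN identifiers and a @context for federations and data spaces. The attribute values and the core-context URL shown are illustrative, and the two objects are annotated with comment lines for readability.

```jsonc
// NGSI-v2: plain JSON
{
  "id": "Supermarket:001",
  "type": "Building",
  "temperature": { "value": 21.5, "type": "Number" }
}

// NGSI-LD: JSON-LD with URN ids and an explicit @context
{
  "id": "urn:ngsi-ld:Building:001",
  "type": "Building",
  "temperature": { "type": "Property", "value": 21.5 },
  "@context": "https://uri.etsi.org/ngsi-ld/v1/ngsi-ld-core-context.jsonld"
}
```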
docker and docker compose img src https www docker com favicon ico align left height 30 width 30 each tutorial runs all components using docker https www docker com docker is a container technology which allows to different components isolated into their respective environments to install docker on windows follow the instructions here https docs docker com docker for windows to install docker on mac follow the instructions here https docs docker com docker for mac to install docker on linux follow the instructions here https docs docker com install docker compose is a tool for defining and running multi container docker applications a series of yaml files are used configure the required services for the application this means all container services can be brought up in a single command docker compose is installed by default as part of docker for windows and docker for mac however linux users will need to follow the instructions found here https docs docker com compose install you can check your current docker and docker compose versions using the following commands console docker compose v docker version please ensure that you are using docker version 18 03 or higher and docker compose 1 21 or higher and upgrade if necessary postman img src https raw githubusercontent com fiware tutorials step by step master img postman svg align left height 30 width 30 the tutorials which use http requests supply a collection for use with the postman utility postman is a testing framework for rest apis the tool can be downloaded from www getpostman com www getpostman com all the fiware postman collections can downloaded directly from the postman api network https explore postman com team 3mm5ey6chbyp9d cygwin for windows img src https www cygwin com favicon ico align left height 30 width 30 style border right style solid border right width 10px border color transparent background transparent we will start up our services using a simple bash script windows users should download cygwin http www cygwin com to provide a command line functionality similar to a linux distribution on windows apache maven img src https maven apache org favicon ico align left height 30 width 30 style border right style solid border right width 10px border color transparent background transparent apache maven https maven apache org download cgi is a software project management and comprehension tool based on the concept of a project object model pom maven can manage a project s build reporting and documentation from a central piece of information maven can be used to define and download our dependencies and to build and package java or scala code into a jar file license mit license 2018 2023 fiware foundation e v | fiware tutorial contextual-data ngsi ngsi-v2 ngsi-ld | os |
women-in-cloud-rvce-android | women in cloud rvce android unofficial an unofficial android application for women in cloud rvce center of excellence a href https render com img alt production src https img shields io badge production up lgreen svg a maintainability https api codeclimate com v1 badges a5688e693a48ff0953ca maintainability https codeclimate com github mssandeepkamath women in cloud rvce api maintainability app preview https user images githubusercontent com 90695071 229217757 4a582538 3619 4a4a 9970 b0537a488e50 png table of contents about about play store link play store link api repository link api repository link previews previews class diagram class diagram contributing contributing contact contact about women in cloud rvce android is an unofficial app designed to enhance the women in cloud rvce experience it offers various features to interact with the api and access information about projects internships events and more play store link p align left a href https play google com store apps details id com sandeep womenincloudrvce img alt get it on google play height 80 src https play google com intl en us badges images generic en badge web generic png a p api repository link access the api repository women in cloud rvce api https github com mssandeepkamath women in cloud rvce api previews img src https user images githubusercontent com 90695071 232440631 db760891 a6c6 44a4 9c8b caf49cfc40e3 png width 17 height 17 img src https user images githubusercontent com 90695071 232440662 c0489d3a 171c 4f29 998e 872f4759bbac png width 17 height 17 img src https user images githubusercontent com 90695071 232440692 89b82203 e7e3 4f6d 8c0c 26f844b15a2d png width 17 height 17 img src https user images githubusercontent com 90695071 232441624 af761065 dd2a 41f0 b0ab 3a2a05ab7228 png width 17 height 17 img src https user images githubusercontent com 90695071 232442093 af4d25c4 0056 48e9 88f6 23ade1f14632 png width 17 height 17 class diagram refer to the java class diagram https user images githubusercontent com 90695071 232449390 b4fe8ce7 95f3 4748 afa4 805ab6b8b805 png for an overview of the application s structure contributing contributions are welcome to contribute 1 fork this repository 2 clone the forked repository locally 3 add the original repository as an upstream remote 4 create a new branch for your feature fix 5 make your changes and commit them 6 push the changes to your fork 7 create a pull request in the original repository for merging contact for any questions or feedback feel free to contact us mailto your email example com disclaimer this app is developed independently and is not affiliated with any official women in cloud rvce platforms | android kotlin women-in-cloud workspace-management java rvce | cloud |
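The contributing steps listed in the row above map onto a handful of standard git commands. A sketch follows; the fork URL, the upstream URL (assumed to live under the same mssandeepkamath account as the linked API repository), and the branch name are all illustrative.

```bash
# Steps 1-2: fork on GitHub, then clone your fork locally
git clone https://github.com/<your-username>/women-in-cloud-rvce-android.git
cd women-in-cloud-rvce-android

# Step 3: add the original repository as an upstream remote (URL assumed)
git remote add upstream https://github.com/mssandeepkamath/women-in-cloud-rvce-android.git

# Steps 4-6: create a feature branch, commit your changes, push to your fork
git checkout -b my-feature
git commit -am "describe your change"
git push origin my-feature

# Step 7: open a pull request against the original repository on GitHub
```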
CIS581-Project2-Image-Morphing | cis 581 project 2 image morphing fall 2013 qiong wang introduction this is the course project for cis 581 computer vision and photography it mainly focuses on image morphing techniques for human faces all the code in this project is implemented in matlab 2013a contents requirement the detailed requirement can be found in the course wiki page through the following link http alliance seas upenn edu cis581 wiki projects fall2013 project2 proj2 2013 fall pdf implementation a morph is a simultaneous warp of the image shape and a cross dissolve of the image colors the warp is controlled by the transformation between correspondences in this project two methods are used for image morphing one is the triangulation method with a defined triangular mesh over the control points for each meshed triangle it is easy to find the affine transformation of three points detailed description can be found here https docs google com viewer url http 3a 2f 2falliance seas upenn edu 2f cis581 2fwiki 2flectures 2ffall2013 2fcis581 09 13 triangle pdf the other is the thin plate spline tps method using sparse and irregularly positioned feature points and smooth interpolation detailed description is here http alliance seas upenn edu cis581 wiki lectures fall2013 cis581 10 13 thin plate pdf here is an easy implementation tutorial for both methods http alliance seas upenn edu cis581 wiki projects fall2013 project2 project 2 review pptx the primary morphing here is from my face to a minion s face and the detailed process of this can be found in the result part codes description here are the functions scripts and data mainly used to generate the final face morphing name description click correspondences m using cpselect to choose the corresponding points mytsearch cpp you can use this cpp file after mex mytsearch cpp mytsearch mexa64 this mexa64 file is generated from mytsearch cpp morph m morphing function using triangulation method morph tps m morphing function using thin plate spline method est tps m estimate the parameters for tps method morph wrapper tps m wrapper function including computation of tps parameters pixel limit m validate the pixel result after transformation script image morphing m script to run the image morphing with either method points mat saved corresponding control points for image morphing results here are the links for the videos of face morphing using the triangulation and thin plate spline methods 1 triangulation method screenshot https raw github com gabriellaqiong cis581 project2 image morphing master img 1957 png http www youtube com watch v w3h t3rlg0q feature youtu be 2 thin plate spline method screenshot https raw github com gabriellaqiong cis581 project2 image morphing master img 1958 png http www youtube com watch v qluvgkthgh4 third party code here the mytsearch cpp is from david r martin boston college which will save a lot of time to do the triangular index search the original code is attached on this link http graphics cs cmu edu courses 15 463 2011 fall hw proj3 mytsearch mod c note please first mex this cpp file and use it in matlab thank you | ai |
|
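The CIS581 readme above defines a morph as a simultaneous warp of image shape plus a cross-dissolve of color. Once both images are warped to the intermediate shape, the per-frame blend is a one-liner; here is a MATLAB sketch of that final step. `warp_to_intermediate` is a placeholder for the triangulation- or TPS-based warp implemented in the project's morph.m / morph_tps.m, whose exact signatures are not shown in the readme.

```matlab
% Sketch of the cross-dissolve step described above; only the blending
% math is shown, warp_to_intermediate stands in for the project's
% triangulation or TPS warp.
imA = im2double(imread('face1.jpg'));
imB = im2double(imread('face2.jpg'));

nframes = 60;
for k = 1:nframes
    t = (k - 1) / (nframes - 1);             % dissolve fraction in [0, 1]
    wA = warp_to_intermediate(imA, t);       % warp source toward target shape
    wB = warp_to_intermediate(imB, 1 - t);   % warp target toward source shape
    frame = (1 - t) * wA + t * wB;           % cross-dissolve of the colors
    imwrite(frame, sprintf('morph_%02d.png', k));
end
```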
PHP-web-shells | php web shells when i started web application security testing i fell in love with web shell development and designed some php based web shells this repository contains all the code i have released publicly | php-backdoor exploit-code symlink bruteforce joomla-sql-injection | front_end |
guides | web development guides tutorials and snippets guides tutorials demos and snippets for front end and back end web development design full screen navigation overlay full screen navigation overlay off canvas navigation off canvas navigation responsive dropdown navigation bar responsive dropdown navigation bar flexbox grid flexbox grid float grid float grid fundamentals of responsive design fundamentals of responsive design what bootstrap is and how to use it bootstrap development local server environment mamp local server environment mamp virtual hosts virtual hosts developing a wordpress theme from scratch developing a wordpress theme from scratch migrating wordpress migrating wordpress getting started with git getting started with git using grunt and sass using grunt and sass more more full screen navigation overlay a style of navigation that triggers a full screen overlay this example utilizes flexbox tutorial http www taniarascia com full screen navigation overlay bull demo http codepen io taniarascia full yyrxrg css aside position fixed width 100 height 100 top 0 left 0 opacity 0 visibility hidden z index 2 open opacity 1 visibility visible html aside div class toggle overlay a class close toggle a div nav nav aside js toggle overlay click function aside toggleclass open off canvas navigation slide out navigation that is hidden off canvas until triggered inspired by the lanyon poole theme for jekyll http lanyon getpoole com tutorial http www taniarascia com off canvas navigation bull demo http codepen io taniarascia full qjbwpb css html body overflow x hidden position position absolute top 0 left 0 z index 2 media screen and min width 800px position position fixed aside width 250px height 100 position fixed top 0 left 250px main width 100 position absolute left 0 article padding 0 80px show nav aside show nav position show nav main transform translatex 250px show nav position position fixed html aside navigation aside a id nav toggle class position toggle a main article main content article main js nav toggle click function this classlist toggle active if body hasclass show nav body removeclass show nav else body addclass show nav responsive dropdown navigation bar a classic top navigation bar with dropdowns the navigation links collapse into a hamburger icon toggle on mobile collapse tutorial http www taniarascia com responsive dropdown navigation bar bull demo http codepen io taniarascia full dyvvyv js nav ul li a not only child click function e this siblings nav dropdown toggle nav dropdown not this siblings hide e stoppropagation html click function nav dropdown hide nav toggle click function nav ul slidetoggle flexbox grid a simple responsive grid made with flexbox this grid is based on percentages and is infinitely expandable tutorial http www taniarascia com easiest flex grid ever bull demo http codepen io taniarascia full rolege css column flex basis 100 media screen and min width 800px row display flex flex direction row flex wrap nowrap column flex 1 one fourth flex 2 5 one third flex 3 3 half flex 5 html div class row div class column 50 div div class column 50 div div float grid a simple responsive grid made with css floats and a clearfix hack this grid is based on percentages rather than the traditional 12 column grid you can add as many classes as you want columns tutorial http www taniarascia com you dont need a framework bull demo http codepen io taniarascia pen gpgdyy css row before row after display table content clear both one one third two thirds one fourth 
half width 100 media only screen and min width 800px one width 100 half width calc 100 2 one third width calc 100 3 one fourth width calc 100 4 two thirds width calc 100 3 2 column float left html div class row div class half column 50 div div class half column 50 div div fundamentals of responsive design before integrating third party frameworks into your websites learn how to build your own simple framework and understand responsive breakpoints grids and navigation styles understanding the fundamentals of responsive design http www taniarascia com you dont need a framework 1 foundation meta name viewport content width device width initial scale 1 1 breakpoints media screen and min width 800px 1 structure a grid system 1 navigation collapsible navigation bootstrap bootstrap is one of the most popular css frameworks learn what a framework is how to get started with bootstrap and how to use the documentation to build a responsive frame upon which you can add custom styles what bootstrap is and how to use it http www taniarascia com what is bootstrap and how do i use it what is bootstrap why is a framework important building a basic bootstrap template applying custom styles local server environment mamp setting up a local environment is necessary in order to begin working in the back end with languages such as php and mysql learn how to set up a lamp mamp wamp stack setting up a local server environment with mamp http www taniarascia com local environment linux mac windows apache mysql php virtual hosts if you want to set up more than one websites on a local server you can use virtual hosts to work on as many sites as you want setting up virtual hosts http www taniarascia com setting up virtual hosts editing the apache configuration file httpd conf editing the hosts files hosts setting the correct ports developing a wordpress theme from scratch you know html but your potential clients don t if you intend to make a website and pass it on to a user so that they might edit it by themselves in the future a content management system cms will be necessary wordpress is the most popular cms and it s built on php this guide will show you how to take your html site and turn it into a wordpress theme it assumes no prior knowledge of php or wordpress developing a wordpress theme from scratch http www taniarascia com developing a wordpress theme from scratch migrating wordpress after creating a wordpress theme in a local environment there are a few necessary steps to take to ensure that the site doesn t break when migrating to a live server migrating a wordpress site to a live server http www taniarascia com migrating a wordpress site to a live server getting started with git links to git repositories are everywhere learn how to use the command line to create a local project upload it to a git repository host such as github and upload it onto a live server bypassing ftp this guide assumes no prior knowledge with or familiarity of git github version control software or using command lines getting started with git http www taniarascia com getting started with git using grunt and sass understand how to use node js node package manager npm grunt sass and command line can be daunting this guide assumes no prior knowledge of using the command line installing node js using npm grunt or sass and details every step along the way to ensure that grunt it up and running by the end of this article you will be able to set up grunt and use it to compile scss into css which is minified and autoprefixed as well as 
minifying and linting javascript code getting started with grunt and sass http www taniarascia com getting started with grunt and sass more common development terms and abbreviations http www taniarascia com development terms abbreviations google maps apis for multiple locations http www taniarascia com google maps apis for multiple locations in progress start making websites com http startmakingwebsites com a beginner s guide to web development html cheat sheet http startmakingwebsites com html list of all commonly used html5 valid tags | front_end |
|
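The "fundamentals of responsive design" summary in the guides row above comes down to a viewport meta tag plus min-width breakpoints over a mobile-first grid. A minimal sketch in the same HTML/CSS style as the snippets in that row (class names are illustrative):

```html
<!-- 1. Foundation: the viewport meta tag -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<style>
  /* Mobile-first: every column is full width by default */
  .column { float: left; width: 100%; }

  /* 2. Breakpoint: switch to a side-by-side layout from 800px up */
  @media only screen and (min-width: 800px) {
    .half { width: 50%; }
  }
</style>

<div class="row">
  <div class="column half">50%</div>
  <div class="column half">50%</div>
</div>
```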
practical-nlp-code | practical natural language processing a comprehensive guide to building real world nlp systems sowmya vajjala https www linkedin com in sowmya vajjala 2a38734 bodhisattwa p majumder http www majumderb com anuj gupta https www linkedin com in anujgupta 82 harshit surana http harshitsurana com endorsed by zachary lipton http zacklipton com sebastian ruder https ruder io marc najork http marc najork org monojit choudhury https www microsoft com en us research people monojitc vinayak hegde https www linkedin com in vinayakh mengting wan https mengtingwan github io siddharth sharma https www linkedin com in siddharth sharma 31140210 ed harris https www linkedin com in e10is foreword by julian mcauley https cseweb ucsd edu jmcauley homepage www practicalnlp ai http www practicalnlp ai published by o reilly media 2020 http shop oreilly com product 0636920262329 do book structure figure https github com practical nlp practical nlp figures raw master figures p 1 png please note that the code repository is still under development and review all the notebooks will be crystalized in the coming months the notebooks have been tested on an ubuntu machine running python 3 6 currently we are using tf1 x we will migrate to tf2 x in the coming months sparkles a newer version compatible with ubuntu 23 is being added in pnlp refactor ubuntu23 https github com practical nlp practical nlp code tree pnlp refactor ubuntu23 we thank everyone especially the educators universities for their feedback in pointing out the issues improving the accessibility of the notebooks details of the repository roadmap could be found here roadmap md http check server in book images book png 250x250 http practicalnlp ai open the repository in google colab open in colab https colab research google com assets colab badge svg https colab research google com github practical nlp practical nlp blob master open the repository in jupyter nbviewer open in nbviewer https user images githubusercontent com 2791223 29387450 e5654c72 8294 11e7 95e4 090419520edb png https nbviewer jupyter org github practical nlp practical nlp tree master chapterwise folders chapter 1 nlp a primer https github com practical nlp practical nlp tree master ch1 only figures outline chapter 2 nlp pipeline https github com practical nlp practical nlp tree master ch2 chapter 3 text representation https github com practical nlp practical nlp tree master ch3 chapter 4 text classification https github com practical nlp practical nlp tree master ch4 chapter 5 information extraction https github com practical nlp practical nlp tree master ch5 chapter 6 chatbots https github com practical nlp practical nlp tree master ch6 chapter 7 topics in brief https github com practical nlp practical nlp tree master ch7 chapter 8 social media https github com practical nlp practical nlp tree master ch8 chapter 9 e commerce and retail https github com practical nlp practical nlp tree master ch9 chapter 10 healthcare finance and law https github com practical nlp practical nlp tree master ch10 chapter 11 end to end nlp process https github com practical nlp practical nlp tree master ch11 contributors to codebase varun purushotham https www linkedin com in varunp2k kartikay bagla https www linkedin com in kartikay bagla 60638a167 jitin kapila https www linkedin com in jitinkapila vishal gupta https www linkedin com in vishalg8897 taranjeet singh https www linkedin com in taranjeet7114 shreyans dhankhar https www linkedin com in shreyans dhankhar 501b88118 kumarjit pathak 
https www linkedin com in kumarjitpathak ernest kirubakaran selvaraj https www linkedin com in ernest s kirubakaran ayush datta https www linkedin com in ayushdatta rui shu https www linkedin com in rui shu nachiketh suresh https www linkedin com in nachiketh suresh a4955411 jatin papreja https www linkedin com in jatin papreja 71982bb9 kumar apurva https www linkedin com in kumar apurva 000b38197 sukeerat goindi https github com sukeeratsg abhijeetsingh meena https github com ethan0456 found a bug if the bug is in book text please enter a errata here https www oreilly com catalog errata csp isbn 0636920262329 https www oreilly com catalog errata csp isbn 0636920262329 if the bug is in the codebase great please read this https github com practical nlp practical nlp edit master contributing md how can you help us improve the codebase | natural-language-processing natural-language-understanding oreilly-books | ai |
hivemq-mqtt-tensorflow-kafka-realtime-iot-machine-learning-training-inference | streaming machine learning at scale from 100000 iot devices with hivemq apache kafka and tensorflow if you just want to get started and quickly start the demo in a few minutes go to the quick start infrastructure readme md to setup the infrastructure on gcp and run the demo you can also check out the 20min video recording with a live demo streaming machine learning at scale from 100000 iot devices with hivemq apache kafka and tensorflow https www youtube com watch v 7ovslt0az3m there is also a blog post with more details about use cases for event streaming and streaming analytics in the automotive industry https www kai waehner de blog 2019 11 22 apache kafka automotive industry industrial iot iiot motivation demo an iot scenario at scale you want to see an iot example at huge scale not just 100 or 1000 devices producing data but a really scalable demo with millions of messages per second from tens of thousands of devices this is the right demo for you the demo shows how you can integrate with tens or hundreds of thousands iot devices and process the data in real time the demo use case is predictive maintenance i e anomaly detection in a connected car infrastructure to predict motor engine failures background cloud native mqtt and kafka if you need more background about the challenges of building a scalable iot infrastructure the differences and relation between mqtt and apache kafka and best practices for realizing a cloud native iot infrastructure based on kubernetes check out the slide deck best practices for streaming iot data with mqtt and apache kafka https www slideshare net kaiwaehner best practices for streaming iot data with mqtt and apache kafka background digital twin with kafka in addition to the predictive maintenance scenario with machine learning we also implemented an example of a digital twin more thoughts on this here apache kafka as digital twin for open scalable reliable industrial iot iiot https www kai waehner de blog 2019 11 28 apache kafka industrial iot iiot build an open scalable reliable digital twin in our example we use kafka as ingestion layer and mongodb for storage and analytics however this is just one of various iot architectures for building a digital twin with apache kafka https www kai waehner de blog 2020 03 25 architectures digital twin digital thread apache kafka iot platforms machine learning use case anomaly detection in real time for 100000 connected cars streaming machine learning in real time at scale with mqtt http mqtt org apache kafka https kafka apache org tensorflow https www tensorflow org and tensorflow i o https www tensorflow org io data integration data preprocessing model training model deployment real time scoring real time monitoring this project implements a scenario where you can integrate with tens of thousands simulated cars using a scalable mqtt platform and an event streaming platform the demo trains new analytic models from streaming data without the need for an additional data store to do predictive maintenance on real time sensor data from cars use case streaming machine learning with mqtt kafka and tensorflow i o pictures use case mqtt hivemq to tensorflow via apache kafka streams ksql png we built two different analytic models using different approaches unsupervised learning used by default in our project an autoencoder neural network is trained to detect anomaly detection without any labeled data supervised learning a lstm neural network is 
trained on labeled data to be able to do predictions about future sensor events the digital twin implementation with kafka and mongodb is discussed on its own page infrastructure kafka connect mongodb readme md including implementation details and configuration examples another example demonstrates how you can store all sensor data in a data lake in this case gcp google cloud storage gcs infrastructure kafka connect gcs readme md for further analytics architecture we use hivemq https github com hivemq hivemq community edition as open source mqtt broker to ingest data from iot devices ingest the data in real time into an apache kafka https github com cluster for preprocessing using kafka streams https kafka apache org documentation streams ksql https github com confluentinc ksql and model training inference using tensorflow 2 0 https www tensorflow org and its tensorflow i o kafka plugin https github com tensorflow io tree master tensorflow io kafka we leverage additional enterprise components from hivemq and confluent to allow easy operations scalability and monitoring here is the architecture of the mvp we created for setting up the mqtt and kafka infrastructure on kubernetes mvp architecture pictures mvp architecture hivemq confluent mqtt kafka iot tensorflow png and this is the architecture of the final demo where we included tensorflow and tf i o s kafka plugin for model training and model inference advanced architecture pictures advanced architecture hivemq confluent mqtt kafka iot tensorflow png deployment on google cloud platoform gcp and google kubernetes engine gke we used google cloud platform gcp https cloud google com gcp and google kubernetes engine gke https cloud google com kubernetes engine for different reasons what you get here is not a fully managed solution but a mix of a cloud provider a fully managed kubernetes service and self managed components for mqtt kafka tensorflow and client applications here is why we chose this setup get started quickly with two or three terraform and shell commands to setup the whole infrastructure but being able to configure and change the details for playing around and starting your own poc for your specific problem and use case show how to build a cloud native infrastructure which is flexible scalable and elastic demonstrate an architecture which is applicable on different cloud providers like gcp aws azure alibaba and on premises go deep dive into configuration to show different options and allow flexible s l xl or xll setups for connectivity throughput data processing model training and model inference understand the complexity of self managed distributed systems even with kubernetes terraform and a cloud provider like gcp you have to do a lot to do such a deployment in a reliable scalable and elastic way get motivated to cloud out fully managed services for parts of your infrastructure if it makes sense e g google s cloud machine learning engine for model training and model inference https cloud google com ml engine using tensorflow ecosystem or confluent cloud as fully managed kafka as a service offering with consumption based pricing and slas https www confluent io confluent cloud test data car sensors we generate streaming test data at scale using a car data simulator https github com sbaier1 avro car sensor simulator the test data uses apache avro file format to leverage features like compression schema versioning and confluent features like schema registry https github com confluentinc schema registry or ksql s schema inference 
you can either use some test data stored in the csv file car sensor data csv testdata car sensor data csv or better generate continuous streaming data using the included script as described in the quick start check out the avro file format here cardata v1 avsc cardata v1 avsc here is the schema and one row of the test data time car coolant temp intake air temp intake air flow speed battery percentage battery voltage current draw speed engine vibration amplitude throttle pos tire pressure 1 1 tire pressure 1 2 tire pressure 2 1 tire pressure 2 2 accelerometer 1 1 value accelerometer 1 2 value accelerometer 2 1 value accelerometer 2 2 value control unit firmware 1567606196 car1 39 395103 34 53991 123 317406 0 82654595 246 12367 0 6586535 24 934872 2493 487 0 034893095 32 31 34 34 0 5295712 0 9600553 0 88389874 0 043890715 2000 streaming ingestion and model training with kafka and tensorflow io typically analytic models are trained in batch mode where you first ingest all historical data in a data store like hdfs aws s3 or gcs then you train the model using a framework like spark mllib tensorflow or google ml tensorflow i o https github com tensorflow io is a component of the tensorflow framework which allows native integration with various technologies one of these integrations is tensorflow io kafka which allows streaming ingestion into tensorflow from kafka without the need for an additional data store this significantly simplifies the architecture and reduces development testing and operations costs yong tang member of the sig tensorflow i o team did a great presentation about this at kafka summit 2019 in new york https www confluent io kafka summit ny19 real time streaming with kafka and tensorflow video and slide deck available for free you can pick and choose the right components from the apache kafka and tensorflow ecosystems to build your own machine learning infrastructure for data integration data processing model training and model deployment machine learning workflow with tensorflow and apache kafka ecosystem pictures tensorflow apache kafka streaming workflow png this demo will do the following steps consume streaming data from 100 000 simulated cars forwarding it via hivemq mqtt broker to kafka consumers ingest store and scale the iot data with apache kafka and confluent platform preprocess the data with ksql filter transform ingest the data into tensorflow tf data and tensorflow io kafka plugin build train and save the model tensorflow 2 0 api store the trained model in google cloud storage object store and load it dynamically from a kafka client application deploy the model in two variants in a simple python application using tensorflow i os inference capabilities and within a powerful kafka streams application for embedded real time scoring optional steps nice to have show iot specific features with hivemq tool stack deploy the model via tensorflow serving some kind of a b testing maybe with istio service mesh re train the model and updating the kafka streams application via sending the new model to a kafka topic monitoring of model training via tensorboard and model deployment inference via some kind of kafka integration dashboard technology confluent cloud for kafka as a service focus on business problems not running infrastructure enhance demo with c librdkafka clients and tensorflow lite for edge computing add confluent tiered storage as cost efficient long term storage todo other ideas why is streaming machine learning so awesome again you don t need another data store 
anymore just ingest the data directly from the distributed commit log of kafka model training from the distributed commit log of apache kafka leveraging tensorflow i o pictures kafka commit log model training with tensorflow io png this totally simplfies your architecture as you don t need another data store in the middle in conjunction with tiered storage a confluent add on to apache kafka you can built a cost efficient long term storage aka data lake with kafka no need for hdfs or s3 as additional storage see the blog post streaming machine learning with tiered storage and without a data lake https www confluent io blog streaming machine learning with tiered storage for more details be aware this is still not online training streaming ingestion for model training is fantastic you don t need a data store anymore this simplifies the architecture and reduces operations and developemt costs however one common misunderstanding has to be clarified as this question comes up every time you talk about tensorflow i o and apache kafka as long as machine learning deep learning frameworks and algorythms expect data in batches you cannot achieve real online training i e re training optimizing the model with each new input event only a few algoryhtms and implementations are available today like online clustering thus even with tensorflow i o and streaming ingestion via apache kafka you still do batch training though you can configure and optimize these batches to fit your use case additionally only kafka allows ingestion at large scale for use cases like connected cars or assembly lines in factories you cannot build a scalable reliable mission critical ml infrastructure just with python the combination of tensorflow i o and apache kafka is a great step closer to real time training of analytic models at scale i posted many articles and videos about this discussion get started with how to build and deploy scalable machine learning in production with apache kafka https www confluent io blog build deploy scalable machine learning production apache kafka and check out my other resources if you want to learn more requirements and setup full live demo for end to end mqtt kafka integration we have prepared a terraform script to deploy the complete environment in google kubernetes engine gke this includes kafka cluster apache kafka ksql schema registry control center mqtt cluster hivemq broker kafka plugin test data generator streaming machine learning model training and inference tensorflow i o and kafka plugin monitoring infrastructure prometheus grafana hivemq control center confluent control center the setup is pretty straightforward no previous experience required for getting the demo running you just need to install some clis on your laptop gcloud kubectl helm terraform and then run two or three script commands as described in the quick start guide with default configuration the demo starts at small scale this is sufficient to show an impressive demo it also reduces cost and to enables free usage without the need for commercial licenses you can also try it out at extreme scale 100000 iot connections this option is also described in the quick start afterwards you execute one single command to set up the infrastructure and one command to generate test data of course you can configure everything to your needs like the cluster size test data etc follow the instructions in the quick start infrastructure readme md to setup the cluster please let us know if you have any problems setting up the demo so that we 
can fix it streaming machine learning with kafka and tensorflow if you are just interested in the streaming ml part for model training and model inference using python kafka and tensorflow i o check out python application leveraging apache kafka and tensorflow for streaming model training and inference python scripts lstm tensorflow io kafka readme md python scripts lstm tensorflow io kafka readme md you can also checkout two simple examples which use kafka python clients to produce data to kafka topics and then consume the streaming data directly with tensorflow i o for streaming ml without an additional data store streaming ingestion of mnist data into tensorflow via kafka for image regonition confluent tensorflow io kafka py autoencoder for anomaly detection of sensor data into tensorflow via kafka producer python client https github com kaiwaehner hivemq mqtt tensorflow kafka realtime iot machine learning training inference blob master python scripts autoencoder anomaly detection sensor kafka producer from csv py and consumer tensorflow i o kafka plugin model training https github com kaiwaehner hivemq mqtt tensorflow kafka realtime iot machine learning training inference blob master python scripts autoencoder anomaly detection sensor kafka consumer and tensorflow model training py | kafka hivemq mqtt kafka-streams kafka-connect ksql tensorflow grpc java python tiered-storage data-lake confluent ksqldb tensorflow-io terraform gcp kubernetes cloud mongodb | server |
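The hivemq/kafka/tensorflow readme above leans on the tensorflow-io Kafka plugin to stream sensor records straight into model training, with no intermediate data store. A much-reduced Python sketch of that ingestion path follows; the topic name and broker address are assumptions, and the `KafkaDataset` interface shown is the older tfio style from roughly the TF 1.x/2.0 era this project targets, so it may differ in newer tensorflow-io releases.

```python
# Sketch: stream a Kafka topic into TensorFlow via tensorflow-io, as
# described above. Topic and broker names are placeholders; decoding of
# the Avro payload is elided.
import tensorflow as tf
import tensorflow_io.kafka as kafka_io

# Read the sensor topic from the beginning; eof=True stops at the current
# end of the topic instead of blocking for new messages.
dataset = kafka_io.KafkaDataset(
    ["car-sensor-data"], servers="localhost:9092", group="tf-train", eof=True)

for raw in dataset.take(3):
    # Real code would decode the Avro record and feed batches to model.fit;
    # here we just show that the payload arrives as a string tensor.
    print("received", tf.strings.length(raw).numpy(), "bytes")
```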
STM32F4xx_FreeRTOS_DSP_Project_Templates | stm32f4xx freertos dsp project templates ready to build project templates combining freertos with the dsp library for the stm32f4 series based on stm32f4xx dsp stdperiph lib v1 8 0 and freertos kernel v10 0 0 compiler armcc v5 06 ide keil uvision5 host os windows 10 | stm32 rtos freertos dsp templates | os |
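As a sketch of how the two halves of templates like these typically meet, here is a FreeRTOS task running a CMSIS-DSP routine in C. The buffer contents, stack depth, and task priority are illustrative; the FreeRTOS and CMSIS-DSP calls themselves are standard API.

```c
/* Sketch: a FreeRTOS task that runs CMSIS-DSP computations, illustrating
 * how the RTOS and DSP halves of such templates are usually combined.
 * Buffer contents and task sizing are illustrative. */
#include "FreeRTOS.h"
#include "task.h"
#include "arm_math.h"

static float32_t samples[64];

static void dsp_task(void *params)
{
    float32_t mean, rms;
    for (;;) {
        /* samples[] would be filled from an ADC/DMA buffer in a real design */
        arm_mean_f32(samples, 64, &mean);
        arm_rms_f32(samples, 64, &rms);
        vTaskDelay(pdMS_TO_TICKS(100));   /* run the block every 100 ms */
    }
}

int main(void)
{
    xTaskCreate(dsp_task, "dsp", 256, NULL, tskIDLE_PRIORITY + 1, NULL);
    vTaskStartScheduler();                /* never returns */
    for (;;);
}
```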
nrf-mesh-freeRTOS-example | freertos light switch server example this example project demonstrates how the mesh stack can coexist with the freertos library available in the nrf5 sdk the mesh stack takes advantage of freertos scheduling and task priorities so that mesh can be used more easily in projects and with libraries that use freertos moreover the example demonstrates how to resolve hardware and naming conflicts between the mesh stack and freertos the example is based on light switch server example from mesh sdk v4 1 0 which was modified to run the low priority mesh and ble stack event processing in freertos tasks software requirements this example requires the following additional software mesh sdk v4 1 0 https www nordicsemi com software and tools software nrf5 sdk for mesh download infotabs nrf5 sdk v16 0 0 https www nordicsemi com software and tools software nrf5 sdk download infotabs setup windows to set up the example copy the examples and mesh directories from the root directory of this repository into the root directory of mesh sdk v4 1 0 this should result in some files from mesh sdk being replaced by files from this repository other platforms to set up the example the contents of the examples and mesh directories from the root directory of this repository should be merged into the corresponding examples and mesh directories in the root directory of mesh sdk v4 1 0 it is important that the existing file hierarchy in mesh sdk is preserved only the files replaced by this project should be modified please follow instructions specific to your platform on how to merge these directories testing the example to build and test the example follow the instructions in the example documentation examples light switch freertos server readme md | os |
|
stanford-cs224n | standford cs224n the three assignments are considered completed in the future following work on the selected final project may be added here too for each question in each assignment besides the coding part i usually have a corresponding jupyter notebook to record my written answers they main include more details e g backpropagation derivation and could be used as in complementary to the stanford solution files the course cs224n natural language processing with deep learning winter quarter january march 2017 http web stanford edu class cs224n syllabus html nbviewer if some equations do not show up correctly on github rendered notebooks ipynb files please try nbviewer http nbviewer jupyter org github zyxue stanford cs224n tree master note this course used python 2 7 tensorflow 1 2 due to version constraints of cudnn 5 higher version of tensorflow than 1 2 may not work pip install upgrade https storage googleapis com tensorflow linux gpu tensorflow gpu 0 12 1 cp27 none linux x86 64 whl | stanford-nlp stanford-machine-learning nlp tensorflow deep-learning deep-neural-networks cs224n | ai |
Learn_Computer_Vision | learn computer vision this is the curriculum for learn computer vision by siraj raval on youtube course objective this is the curriculum for this https youtu be fse 02fpjas video on learn computer vision by siraj raval on youtube after completing this course start your own startup do consulting work or find a full time job related to computer vision remember to believe in your ability to learn you can learn cv you will learn cv and if you stick to it eventually you will master it find a study buddy join the computer vision curriculum channel in our slack channel to find one http wizards herokuapp com components each week video lectures reading assignments project s course length 8 weeks 2 3 hours of study per day tools used python opencv tensorflow prerequisites learn python https www edx org course introduction to python for data science 3 calculus http tutorial math lamar edu pdf calculus cheat sheet all pdf linear algebra https www souravsengupta com cds2016 lectures savov notes pdf part 1 low level vision image image week 1 basic image processing techniques luminance brightness contrast gamma histogram equalization linear filtering enhance image blur sharpen edge detect image countours convolution non linear filtering median bilateral filter morphology color processing b w saturation white balance dithering quantization ordered dither floyd steinberg blending image pyramids texture analysis template matching find object in an image video lectures https www youtube com watch v nt80junwlw list pljmxczuzeychvw5yysu92wry8iwhtuq7p index 2 videos 1 5 reading assignments http szeliski org book drafts szeliskibook 20100903 draft pdf sec 3 1 1 2 3 2 sec 3 2 3 4 2 3 3 2 4 project detect an object in an image via the opencv library week 2 motion and optical flow motion analysis optical flow video lectures https www udacity com course introduction to computer vision ud810 udacity lesson 6 https www youtube com watch v nt80junwlw list pljmxczuzeychvw5yysu92wry8iwhtuq7p index 2 video 8 https www youtube com watch v wc8hxuhshaq list plvqb6 mdbcdlnt84lk nvboqcxllotr8j index 6 t 0s reading assignments http szeliski org book drafts szeliskibook 20100903 draft pdf sec 10 5 sec 8 4 up until 8 4 1 project track a moving object in a video frame with opencv part 2 mid level vision image features week 3 basic segmentation segmentation and clustering algorithms like watershed grabcut interactive segmentation hough transform detect circles lines foreground extraction video lectures https www youtube com watch v zf 3aorwec0 https www youtube com watch v 3qjej6wgeza reading assignments sec sec 5 2 5 4 http szeliski org book drafts szeliskibook 20100903 draft pdf project segment lane lines in a road image with opencv week 4 fitting fitting lines and curves robust fitting ransac deformable contours video lectures videos 6 7 https www youtube com watch v nt80junwlw list pljmxczuzeychvw5yysu92wry8iwhtuq7p index 2 reading assignments sec 4 3 2 5 1 1 http szeliski org book drafts szeliskibook 20100903 draft pdf project compute vanishing points in a hallway image with opencv part 3 multiple views week 5 multiple images local invariant feature detection and description image transformations and alignment planar homography epipolar geometry and stereo object instance recognition video lectures https www youtube com playlist list plyh 5mhpffffvcczcbdwxab cty4zg3dj reading assignments http vision cs utexas edu 376 spring2018 tues may 1 see the associated readings on this page project turn a set of 
images into a 3d object with opencv week 6 3d scenes stereo vision dense motion and tracking 3d objects 3d scene understanding 3d segmentation and modeling video lectures https www youtube com watch v nt80junwlw list pljmxczuzeychvw5yysu92wry8iwhtuq7p index 2 video 9 all videos https www coursera org learn stereovision motion tracking reading assignments google and read the following papers 1 n dalal histograms of oriented gradients for human detection 2 g csurka et al bag of visual words a brilliant representation of cross field research visual categorization with bags of keypoints 3 s lazebnik c schmid j ponce beyond bags of features spatial pyramid matching for recognizing natural scene categories 4 jegou et al aggregating local image descriptors into compact codes project perform object segmentation in a 3d scene with opencv part 4 high level vision features analysis week 7 object detection classification object scene activity categorization semantic segmentation object detection non max suppression sliding windows boundary boxes and anchors counting yolo and darknet region proposal networks supervised classification algorithms probabilistic models for sequence data visual attributes optical character recognition facial detection video lectures https www youtube com watch v a v5 8vgv0a list pljmxczuzeychvw5yysu92wry8iwhtuq7p index 8 10 18 my video on yolo reading assignments http vision cs utexas edu 376 spring2018 tues may 1 see the associated readings on this page project classify a car in an image with tensorflow week 8 modern deep learning active learning dimensionality reduction non parametric methods and big data u net transfer learning avoiding overfitting gans video lectures videos 19 20 https www youtube com watch v a v5 8vgv0a list pljmxczuzeychvw5yysu92wry8iwhtuq7p index 8 my video on transfer learning lectures 1 16 stanford https www youtube com watch v vt1jzlth4g4 list pl3fw7lu3i5jvhm8ljyj zlfqrf3eo8syv reading assignments http vision cs utexas edu 376 spring2018 tues may 1 see the associated readings on this page project build a generative adversarial network to detect faces | ai |
|
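The Week 1 project in the curriculum above ("detect an object in an image via the OpenCV library") comes down to a few lines of template matching. A Python sketch with placeholder image paths:

```python
# Sketch of the Week 1 project: locate an object in an image via template
# matching with OpenCV. Image paths are placeholders.
import cv2

scene = cv2.imread("scene.jpg", cv2.IMREAD_GRAYSCALE)
template = cv2.imread("object.jpg", cv2.IMREAD_GRAYSCALE)
h, w = template.shape

# Slide the template over the scene and score each position
result = cv2.matchTemplate(scene, template, cv2.TM_CCOEFF_NORMED)
_, max_val, _, max_loc = cv2.minMaxLoc(result)

# Draw a box around the best match
top_left = max_loc
bottom_right = (top_left[0] + w, top_left[1] + h)
annotated = cv2.cvtColor(scene, cv2.COLOR_GRAY2BGR)
cv2.rectangle(annotated, top_left, bottom_right, (0, 255, 0), 2)
print("best match score:", max_val, "at", max_loc)
cv2.imwrite("detected.png", annotated)
```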
clair | clair a surprisingly simple semantic text metric leveraging large language models project https davidmchan github io clair paper https davidmchan github io clair static pdfs emnlp 2023 clair language models for caption evaluation pdf official implementation of the paper clair evaluating image captions with large language models br the evaluation of machine generated image captions poses an interesting yet persistent challenge effective evaluation measures must consider numerous dimensions of similarity including semantic relevance visual structure object interactions caption diversity and specificity existing highly engineered measures attempt to capture specific aspects but fall short in providing a holistic score that aligns closely with human judgments here we propose clair a novel method that leverages the zero shot language modeling capabilities of large language models llms to evaluate candidate captions in our evaluations clair demonstrates a stronger correlation with human judgments of caption quality compared to existing measures notably on flickr8k expert clair achieves relative correlation improvements over spice of 39 6 and over image augmented methods such as refclip s of 18 3 moreover clair provides noisily interpretable results by allowing the language model to identify the underlying reasoning behind its assigned score getting started setup bash pip install git https github com davidmchan clair git usage python import os from clair import clair os environ openai api key your api key candidates a photo of a cat sitting on a couch a cat sitting on a couch references a cat sitting on a couch a cat is sitting on a couch a cat is sitting on a sofa clair score clair candidates references model chat gpt print clair score models clair relies on the grazier https github com davidmchan grazier library to inteface with large language models the following models are currently supported see the linked repository for more information on configuring each model from openai gpt 4 base 32k chat and completion engines gpt 3 5 chatgpt chat and completion engines gpt 3 davinci v2 v3 ada babbage curie completion engine from anthropic claude 2 base chat and completion engines claude base 100k chat and completion engines claude instant base 100k chat and completion engines from google gcp palm chat and completion engines from huggingface gpt 2 base medium large xl completion engine gpt neo 125m 1 3b 2 7b completion engine gpt j 6b completion engine falcon 7b 40b rw 1b rw 7b completion engine dolly v1 6b v2 3b 7b 12b chat and completion engines mpt instruct 7b 30b chat and completion engines from facebook via huggingface llama 7b 13b 30b 65b completion engine llama 2 7b 13b 70b completion engine opt 125m 350m 1 3b 2 7b 6 7b 13b 30b 66b completion engine from stanford via huggingface alpaca 7b chat and completion engines from berkeley via huggingface koala 7b 13b v1 13b v2 chat and completion engines vicuna 7b 13b chat and completion engines from stabilityai via huggingface stablelm 7b 13b chat and completion engines from allenai via huggingface tulu 7b 13b 30b 65b chat and completion engines open instruct sharegpt 7b 13b 30b 65b chat and completion engines from ai21 jurassic 2 light mid ultra completion engines citation if you find this repository useful please cite our paper bibtex inproceedings chan etal 2023 clair evaluating title clair evaluating image captions with large language models author chan david m and petryk suzanne and gonzalez joseph e and darrell trevor and canny john 
booktitle proceedings of the 2023 conference on empirical methods in natural language processing month dec year 2023 address singapore singapore publisher association for computational linguistics | ai |
|
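The CLAIR usage snippet in the row above arrives with its punctuation stripped by this dump's formatting. Reconstructed into runnable form below: the quoting and list boundaries are inferred from the flattened text, while the `clair(...)` call, the `model="chat-gpt"` value, and the environment variable follow the snippet itself.

```python
import os
from clair import clair

os.environ["OPENAI_API_KEY"] = "<your-api-key>"

candidates = [
    "a photo of a cat sitting on a couch",
    "a cat sitting on a couch",
]
references = [
    "a cat sitting on a couch",
    "a cat is sitting on a couch",
    "a cat is sitting on a sofa",
]

clair_score = clair(candidates, references, model="chat-gpt")
print(clair_score)
```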
OpenCV | opencv based on the udemy course introduction to computer vision master opencv in python by rajeev ratan opencv stands for open source computer vision and is written in c++ opencv 3 x is similar to opencv 2 x except for the removal of certain patented algorithms such as sift and surf | ai |
|
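Since the row above notes that the patented SIFT and SURF algorithms were removed in OpenCV 3.x, a common patent-free stand-in is ORB, which ships in the main build. A Python sketch (image path is a placeholder):

```python
# Sketch: ORB keypoints and descriptors as a patent-free alternative to the
# SIFT/SURF algorithms removed from default OpenCV 3.x builds.
import cv2

img = cv2.imread("sample.jpg", cv2.IMREAD_GRAYSCALE)
orb = cv2.ORB_create(nfeatures=500)
keypoints, descriptors = orb.detectAndCompute(img, None)
print(f"found {len(keypoints)} keypoints")

annotated = cv2.drawKeypoints(img, keypoints, None, color=(0, 255, 0))
cv2.imwrite("orb_keypoints.png", annotated)
```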
cartero | depreciation notice at long last cartero has been depreciated as the first multi page application build tool cartero served its purpose well for many years however now there are far more robust and better supported tools that can be used to achieve the same goals we recommend migrating any projects still using cartero to web pack https webpack js org thank you to everyone who contributed to this pioneering project cartero cartero is an asset pipeline built on npm https www npmjs org and browserify http browserify org that allows you to easily organize front end code in multi page web applications into reusable packages containing html javascript css and images build status https travis ci org rotundasoftware cartero svg branch master https travis ci org rotundasoftware cartero overview cartero eliminates much of the friction involved in the build process when applying modular design principles to front end assets in multi page web applications use directories to group together assets js css images etc for each ui component and cartero will take care of ensuring that all the appropriate assets are then loaded on the pages that require them and just those pages depending on a package is as simple as require pork and beans and since cartero is built on npm https www npmjs org the official node js package manager you can easily publish your packages and or depend on other npm packages in your own code a package might contain assets for a calendar widget a popup dialog a header or footer an entire web page cartero is a build tool it does not introduce many new concepts and the same modular organizational structure it facilitates could also be achieved by stringing together other build tools and the appropriate script link and img tags however using cartero is well a lot easier see this article https github com rotundasoftware cartero blob master comparison md for more info on how cartero compares to other tools and this tutorial http www jslifeandlove org intro to cartero to get started command line usage build all assets for a multi page application with the cartero command for example cartero views index js static assets this command will gather up the js and other assets required by each javascript entry point matching the first argument in this case all index js files in the views directory and drop the compiled assets into the output directory specified in the second argument along with information used at run time by the hook the hook to load the appropriate assets for each entry point cartero only processes each asset one time so compiling assets for many entry points at once is extremely efficient each additional entry point adds practically no overhead cartero also separates out any javascript assets that are used by all your entry points into a common bundle so that the browser cache can be leveraged to load any shared logic quickly on each page this magic is done by factor bundle https github com substack factor bundle thanks for your wizardry http cyber wizard institute james adding a w flag to the cartero command will run cartero in watch mode so that the output is updated whenever assets are changed again cartero s watch mode is extremely efficient only rebuilding what is necessary for a given change the hook at run time your application needs to be able to easily figure out where assets are located for this reason cartero provides a small 100 loc https github com rotundasoftware cartero node hook blob master index js runtime library that your server side logic can use 
to look up asset urls or paths based on a simple map output by cartero at build time at the time of this writing only a hook for node js https github com rotundasoftware cartero node hook is available but one can quickly be written for any server side environment for example if views page1 index js is an entry point the following call will return all the script and link tags needed to load the js and css bundles it requires javascript h gettagsforentrypoint views page1 index js function err scripttags styletags scripttags and styletags are strings of script and link tags respectively attach the tags to the express res locals so we can output them in our template to load the page s assets res locals script scripttags res locals style styletags you can also ask the cartero hook to look up the url of a specific asset for example to find the url of carnes png in that same page1 directory javascript var imageurl h getasseturl views page1 carnes png it s all in the package json cartero can gather and compile style and image assets from any module with a package json file just include a style and or image property in the package json that enumerates the assets the package requires of that type in glob notation https github com isaacs node glob glob primer for example name my module version 1 0 2 main lib my module js dependencies style scss styles image icon png images note that package json files can be in any location you can even put package json files in your views folder sound weird try it the javascript entry point that is used by any given view is after all just like a package it has its own js css and may depend on other packages or even be depended upon relax your brain does the below directory structure make sense node modules my module index js icon png package json style scss views page1 package json page1 jade server side template style css index js entry point for page 1 page2 package json style css page2 jade server side template index js entry point for page 2 usage npm install g cartero cartero entrypoints outputdir options the cartero command gathers up all assets required by the javascript files matching the entrypoints argument which is a glob string and transforms and concatenates them as appropriate and saves the output in outputdir at run time the html tags needed to load the assets required by a particular entry point can be found using the cartero hook s https github com rotundasoftware cartero node hook the cartero express middleware https github com rotundasoftware cartero express middleware can be used for an added level of convenience command line options transform t name or path of an application level transform see discussion of apptransforms option transformdir d path of an application transform directory see discussion of application transforms watch w watch mode watch for changes and update output as appropriate for dev mode postprocessor p the name of a post processor module to apply to assets e g uglifyify etc maps m enable javascript source maps in js bundles for dev mode keepseperate s keep css files separate instead of concatenating them for dev mode outputdirurl o the base url of the cartero output directory e g assets defaults to help h show this message transforms package specific local transforms the safest and most portable way to apply transforms to a package like sass css or coffeescript js is using the transforms key in a package s package json file the key should be an array of names or file paths of transform modules https github com substack module
deps transforms for example name my module description example module version 1 5 0 style scss transforms sass css stream dependencies sass css stream 0 0 1 all transform modules are called on all assets including javascript files it is up to the transform module to determine whether or not it should apply itself to a file usually based on the file extension application level transforms you can apply transforms to all packages within an entire branch of the directory tree using the apptransforms and apptransformdirs options or their corresponding command line arguments packages inside a node modules folder located inside one of the supplied directories are not affected for example to transform all sass files inside the views directory to css cartero views index js static assets t sass css stream d views catalog of transforms any browserify javascript transform will work with cartero see the parcelify documentation https github com rotundasoftware parcelify catalog of transforms for a catalog of transforms that apply to other asset types built in transforms there are three built in transforms that cartero automatically applies to all packages the relative to absolute path transform style assets only cartero automatically applies a transform to your style assets that replaces relative urls with absolute urls after any local default transforms are applied this transform makes relative urls work even after css files are concatenated into bundles for example the following url reference in a third party module will work even after concatenation css div backdrop background url pattern png the asset url transform to resolve asset urls at times it is necessary to resolve the url of an asset at build time for example in order to reference an image in one package from another for this reason cartero applies a special transform to all javascript and style assets that replaces expressions of the form asset url path with the url of the asset at path after any local default transforms are applied the path is resolved to a file using the node resolve algorithm and then mapped to the url that file will have once in the cartero output directory for instance in page1 index js javascript mymodule require my module img my module attr src asset url my module icon png the same resolution algorithm can be employed at run time on the server side via the cartero hook https github com rotundasoftware cartero node hook using the getasseturl method the resolve transform to resolve node style paths to absolute paths just like the asset url transform but evaluates to the absolute path of the referenced asset for example in a nunjucks https mozilla github io nunjucks template this transform could be used to reference a template in one module from another jinja extends resolve other module template nunj block body endblock api c cartero entrypoints outputdir options entrypoints is a glob pattern or an array of glob patterns any javascript file matching the pattern s will be treated as an entry point outputdir is the path of the directory into which all of your processed assets will be dropped along with some meta data it should be a directory that is exposed to the public so assets can be loaded using script link tags e g the static directory in express applications options are as follows assettypes default style image the keys in package json files that enumerate assets that should be copied to the cartero output directory assettypestoconcatenate default style a subset of assettypes that should be concatenated into
bundles note javascript files are special cased and are always both included and bundled by browserify outputdirurl default the base url of the output directory approotdir default undefined the root directory of your application you generally only need to supply this option if the directory structure of the system on which your application will be run is different than that of the system on which cartero is being run apptransforms default undefined an array of transform modules names paths or functions to be applied to all packages in directories in the apptransformdirs array apptransformdirs default undefined apptransforms are applied to any packages that are within one of the directories in this array the recursive search is stopped on node module directories packagetransform default undefined a function that transforms package json files before they are used the function should be of the signature function pkgjson pkgpath and return the parsed transformed package object this feature can be used to add default values to package json files or alter the package json of third party modules without modifying them directly sourcemaps default false enable js source maps passed through to browserify watch default false reprocess assets and bundles and meta data when things change postprocessors default an array of post processor functions or module names paths post processors should have the same signature as transform modules https github com substack module deps transforms these transforms are applied to all final bundles after all other transforms have been applied useful for minification uglification etc a cartero object is returned which is an event emitter c on done function called when all assets and meta data have been written to the destination directory c on error function err called when an error occurs c on browserifyinstancecreated function browserifyinstance called when the browserify watchify instance is created c on filewritten function path assettype isbundle watchmodeupdate called when an asset or bundle has been written to disk watchmodeupdate is true iff the write is a result of a change in watch mode c on packagecreated function package called when a new parcelify https github com rotundasoftware parcelify package is created faq q what is the best way to handle client side templates use a browserify transform like nunjucksify https github com rotundasoftware nunjucksify or node hbsfy https github com epeli node hbsfy to precompile templates and require them explicitly from your javascript files q what does cartero write to the output directory you generally don t need to know the anatomy of cartero s output directory since the cartero hook https github com rotundasoftware cartero node hook serves as a wrapper for the information and assets it contains but here is the lay of the land for the curious note the internals of the output directory are not part of the public api and may be subject to change static assets output directory 66e20e747e10ccdb653dadd2f6bf05ba01df792b entry point package directory assets json page1 bundle 14d030e0e64ea9a1fced71e9da118cb29caa6676 js page1 bundle da3d062d2f431a76824e044a5f153520dad4c697 css 880d74a4a4bec129ed8a80ca1f717fde25ce3932 entry point package directory assets json page2 bundle 182694e4a327db0056cfead31f2396287b7d4544 css page2 bundle 5066f9594b8be17fd6360e23df52ffe750206020 js 9d82ba90fa7a400360054671ea26bfc03a7338bf regular package directory robot png metadata json each subdirectory in the output directory corresponds to a
particular package some of which are entry points is named using that package s unique id contains all the assets specific to that package and has the same directory structure as the original package directories that correspond to entry points also contain an assets json file which enumerates the assets used by the entry point the metadata json file maps package paths to package ids q is it safe to let browsers cache asset bundles yes the name of asset bundles generated by cartero includes a shasum of their contents when the contents of one of the files changes its name will be updated which will cause browsers to request a new copy of the content the rails asset pipeline http guides rubyonrails org asset pipeline html implements the same cache busting technique q will relative urls in css files break when cartero bundles them into one file well they would break but cartero automatically applies a transform to all your style assets that replaces relative urls with absolute urls calculated using the outputdirurl option so no they won t break contributors david beck https twitter com davegbeck james halliday https twitter com substack oleg seletsky https github com go oleg license mit
|
ramp | ramp rapid machine learning prototyping ramp is a python library for rapid prototyping of machine learning solutions it s a light weight pandas http pandas pydata org based machine learning framework pluggable with existing python machine learning and statistics tools scikit learn http scikit learn org rpy2 http rpy sourceforge net rpy2 html etc ramp provides a simple declarative syntax for exploring features algorithms and transformations quickly and efficiently documentation http ramp readthedocs org why ramp clean declarative syntax complex feature transformations chain and combine features python normalize log x interactions log x1 f x2 f x3 2 reduce feature dimension python dimensionreduction f x d i for i in range 100 decomposer pca n components 3 incorporate residuals or predictions to blend with other models python residuals simple model def predictions complex model def data context awareness any feature that uses the target y variable will automatically respect the current training and test sets similarly preparation data a feature s mean and stdev for example is stored and tracked between data contexts composability all features estimators and their fits are composable pluggable and storable easy extensibility ramp has a simple api allowing you to plug in estimators from scikit learn rpy2 and elsewhere or easily build your own feature transformations metrics feature selectors reporters or estimators quick start getting started with ramp classifying insults http www kenvanharen com 2012 11 getting started with ramp detecting html or the quintessential iris example python import pandas from ramp import import urllib2 import sklearn from sklearn import decomposition fetch and clean iris data from uci data pandas read csv urllib2 urlopen http archive ics uci edu ml machine learning databases iris iris data data data drop 149 bad line columns sepal length sepal width petal length petal width class data columns columns all features features fillmissing f 0 for f in columns 1 features log transformed features and interaction terms expanded features features log f f 1 for f in features f sepal width 2 combo interactions features define several models and feature sets to explore run 5 fold cross validation on each and print the results we define 2 models and 4 feature sets so this will be 4 2 8 models tested shortcuts cv factory data data target asfactor class metrics metrics generalizedmcc report feature importance scores from random forest reporters reporters rfimportance try out two algorithms model sklearn ensemble randomforestclassifier n estimators 20 sklearn linear model logisticregression and 4 feature sets features expanded features feature selection trained featureselector expanded features use random forest s importance to trim selectors randomforestselector classifier true target asfactor class target to use n keep 5 keep top 5 features reduce feature dimension pointless on this dataset combo dimensionreduction expanded features decomposer decomposition pca n components 4 normalized features normalize f for f in expanded features status ramp is alpha currently so expect bugs bug fixes and api changes requirements numpy scipy pandas pytables sci kit learn gensim author ken van haren email with feedback questions kvh science io squaredloss http twitter com squaredloss contributors john mcdonnell https github com johnmcdonnell rob story https github com wrobstory | ai |
|
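a short illustration of the data context awareness described in the ramp entry above: ramp stores preparation statistics such as a feature's mean and stdev from the training context and reuses them on test data; the sketch below does that bookkeeping by hand with scikit-learn, which is the step ramp automates (a conceptual sketch, not ramp's own api):

```python
# What ramp's "data context awareness" automates, done manually with
# scikit-learn: preparation statistics (mean/stdev) are computed on the
# training split only, then reused unchanged on the test split.
import numpy as np
from sklearn.preprocessing import StandardScaler

x_train = np.array([[1.0], [2.0], [3.0]])
x_test = np.array([[4.0], [5.0]])

scaler = StandardScaler().fit(x_train)  # statistics come from train only
z_train = scaler.transform(x_train)
z_test = scaler.transform(x_test)       # same train statistics applied to test
```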
DataScienceR | r data science tutorials this repo contains a curated list of r tutorials and packages for data science nlp and machine learning this also serves as a reference guide for several common data analysis tasks curated list of python tutorials for data science nlp and machine learning https github com ujjwalkarn datasciencepython comprehensive topic wise list of machine learning and deep learning tutorials codes articles and other resources https github com ujjwalkarn machine learning tutorials blob master readme md learning r online courses tryr on codeschool http tryr codeschool com introduction to r for data science microsoft edx https www edx org course introduction r data science microsoft dat204x gclid cliyopb448wcfrjxvaod rolsa introduction to r on datacamp https www datacamp com courses free introduction to r data analysis with r https www udacity com course data analysis with r ud651 free resources for learning r http stats stackexchange com questions 138 free resources for learning r r for data science hadley wickham http r4ds had co nz advanced r hadley wickham http adv r had co nz swirl learn r in r http swirlstats com data analysis and visualization using r http varianceexplained org rdata many r programming tutorials http www listendata com p r programming tutorials html a handbook of statistical analyses using r https cran r project org web packages hsaur vignettes ch introduction to r pdf find other chapters cookbook for r http www cookbook r com learning r in 7 simple steps http www datasciencecentral com profiles blogs learning r in seven simple steps more resources awesome r repository on github https github com qinwf awesome r r reference card cheatsheet https cran r project org doc contrib short refcard pdf r bloggers blog aggregator http www r bloggers com r resources on github https github com binga datasciencearsenal blob master r resources md awesome r resources https github com ujjwalkarn awesome r data mining with r https github com ujjwalkarn data mining with r rob j hyndman s r blog http robjhyndman com hyndsight r simple r tricks and tools http robjhyndman com hyndsight simpler video https www youtube com watch v toc w7l2qo rstudio github repo https github com rstudio tidying messy data in r http www dataschool io tidying messy data in r video https vimeo com 33727555 baseball research with r http www hardballtimes com a short ish introduction to using r for baseball research 600 websites about r http www datasciencecentral com profiles blogs 600 websites about r implementation of 17 classification algorithms in r http www datasciencecentral com profiles blogs implemetation of 17 classification algorithms in r cohort analysis and lifecycle grids mixed segmentation with r http analyzecore com 2015 04 01 cohort analysis and lifecycle grids mixed segmentation with r using r and tableau http www tableau com learn whitepapers using r and tableau comprehensive view on cran packages http www docfoc com cran pdf using r for statistical tables and plotting distributions http math arizona edu jwatkins r 01 pdf extended model formulas in r multiple parts and multiple responses https cran r project org web packages formula vignettes formula pdf r vs python head to head data analysis https www dataquest io blog python vs r utm content buffer55639 utm medium social utm source linkedin com utm campaign buffer r for data science hadley wickham s book http r4ds had co nz r study group at upenn https www ling upenn edu joseff rstudy index html program defined 
functions in r http dni institute in blogs extracting data from facebook using r important questions in r why is bracket better than subset http stackoverflow com questions 9860090 in r why is better than subset subsetting data in r http www statmethods net management subset html vectorization in r why http www noamross net blog 2014 4 16 vectorization in r why html quickly reading very large tables as dataframes in r http stackoverflow com questions 1727772 quickly reading very large tables as dataframes in r using r to show data http www sr bham ac uk ajrs r r show data html how can i view the source code for a function http stackoverflow com questions 19226816 how can i view the source code for a function lq 1 how to make a great r reproducible example https stackoverflow com questions 5963269 how to make a great r reproducible example r grouping functions sapply vs lapply vs apply vs tapply vs by vs aggregate https stackoverflow com questions 3505701 r grouping functions sapply vs lapply vs apply vs tapply vs by vs aggrega tricks to manage the available memory in an r session https stackoverflow com questions 1358003 tricks to manage the available memory in an r session difference between assignment operators and in r https stackoverflow com questions 1741820 assignment operators in r and what is the difference between require and library https stackoverflow com questions 5595512 what is the difference between require and library how can i view the source code for a function https stackoverflow com questions 19226816 how can i view the source code for a function how can i change fonts for graphs in r https stackoverflow com questions 27689222 changing fonts for graphs in r common dataframe operations create an empty data frame https stackoverflow com questions 10689055 create an empty data frame sort a dataframe by column s https stackoverflow com questions 1296646 how to sort a dataframe by columns merge join data frames inner outer left right https stackoverflow com questions 1299871 how to join merge data frames inner outer left right drop data frame columns by name https stackoverflow com questions 4605206 drop data frame columns by name remove rows with nas in data frame https stackoverflow com questions 4862178 remove rows with nas in data frame quickly reading very large tables as dataframes in r https stackoverflow com questions 1727772 quickly reading very large tables as dataframes in r drop factor levels in a subsetted data frame https stackoverflow com questions 1195826 drop factor levels in a subsetted data frame convert r list to data frame https stackoverflow com questions 4227223 r list to data frame convert data frame columns from factors to characters https stackoverflow com questions 2851015 convert data frame columns from factors to characters extracting specific columns from a data frame https stackoverflow com questions 10085806 extracting specific columns from a data frame caret package in r ensembling models with caret http stats stackexchange com questions 27361 stacking ensembling models with caret model training and tuning http topepo github io caret training html caret model list http topepo github io caret modellist html relationship between data splitting and traincontrol http stackoverflow com questions 14968874 caret relationship between data splitting and traincontrol specify model generation parameters http stackoverflow com questions 10498477 carettrain specify model generation parameters lq 1 tutorial https www r project org nosvn conferences user 
2013 tutorials kuhn user caret 2up pdf paper www jstatsoft org article view v028i05 v28i05 pdf ensembling models with r http amunategui github io blending models ensembling regression models in r http stats stackexchange com questions 26790 ensembling regression models r cheatsheets r reference card https cran r project org doc contrib short refcard pdf r reference card 2 0 https cran r project org doc contrib baggott refcard v2 pdf data wrangling in r https www rstudio com wp content uploads 2015 02 data wrangling cheatsheet pdf ggplot2 cheatsheet https www rstudio com wp content uploads 2015 08 ggplot2 cheatsheet pdf shiny cheatsheet http shiny rstudio com images shiny cheatsheet pdf devtools cheatsheet https www rstudio com wp content uploads 2015 06 devtools cheatsheet pdf markdown cheatsheet https www rstudio com wp content uploads 2015 02 rmarkdown cheatsheet pdf reference https www rstudio com wp content uploads 2015 03 rmarkdown reference pdf data exploration cheatsheet http www analyticsvidhya com blog 2015 10 cheatsheet 11 steps data exploration with codes reference slides r reference card https cran r project org doc contrib baggott refcard v2 pdf association rule mining https 78462f86 a e2d7344e s sites googlegroups com a rdatamining com www docs rdatamining slides association rules pdf attachauth anoy7crd9hri7333kwhk0tvpss1vfgwow4buisml8b0nanfnteoq6qbcwjk acruy2n6cmuejsyrlood5bo1cqruyxkebsl1jbtnivbb gsr3cytt9qq6xb3zasmedacas9j1fzdilvn zlfbrf ajm7gau54jwrbhvkuqpopyemtoswctmmrjdrnwh4zqd5kyejlmhdcxb8bp dwbuxzg2t8sagbchguqkpttj u03wvkyw5mgmrgu7q4xiyyumbas pqedi6q attredirects 0 time series analysis https 78462f86 a e2d7344e s sites googlegroups com a rdatamining com www docs rdatamining slides time series analysis pdf attachauth anoy7cphtfej6imguupe5ygqn5flmh5 qpe4yngj9fyv3wqfy0qu8lwggieczks6p63rhx5nml8lqxqnx7qh7ozm1hoi kl0m9sloac0tc4sqipwc8dprqvoysdyw0edejfzwaqor0ayjmwefhpy6nqxigaaj4arrwzcnr1dyc7nqk4dtvqm80arrn5yzq9rnbgic30x xkwnqxoxl4fo54thpzmnb4wlkv5geo hdqpkwtkbmnr7u kgpoymjhgvxp3nr02ajsb attredirects 0 data exploration and visualisation https 78462f86 a e2d7344e s sites googlegroups com a rdatamining com www docs rdatamining slides data exploration visualization pdf attachauth anoy7cpqnctmcv1omsiokmefan8q6m j4hizv 1enjlu3nrpixihzjblf 9b sixmxpux xn5caw74gur18dn0ecaiim9mvectqt 2dcpno0dfhrjvnb5j8ehkbx w7y6mygb7uaoiubjdmvgr9vcifjf6pgqqalupywcb1ygbt4pv61bqzozru4 eicfghmordi8ygbqscyt2thakhpsegxd0dd3g08pgn3by70mkm02zaqarewbii91ktnh1 zmelecvatl smxmggnnidm6maxewq1pirtq 3d 3d attredirects 0 regression and classification https 78462f86 a e2d7344e s sites googlegroups com a rdatamining com www docs rdatamining slides regression classification pdf attachauth anoy7cq0yqcj 65paftfuqhaztyvp4e4r 5ob1klv3swvkjhvydaj0yu5yepiociqc0k p1qzo6z1vd0r9e05ku8y7mn6ntesqooq mmwlmqae7d2mnqkhzbqft6tk2hj3g3fk40mvfyu5ggogmxmyn9nvhihkwciyjy9a8zlbfo4r9a35kptdr6jjjaw5eqwseme bvt5iyzuyms7qs tvlghjj40zghpro7gcwxfb7qqapete9nyeu7mxay2z lazxn0vsnqe6 attredirects 0 text mining on twitter data https 78462f86 a e2d7344e s sites googlegroups com a rdatamining com www docs rdatamining slides text mining pdf attachauth anoy7cquewmhhfnhxiknhv6c2wqundaib8a betrfagfxz2deivendtk gs7mszjermc7b l6ktcwhff1zozof9xalkiaw6incenjdo1fwuhjfujagwwbcbexjkevummwlbx sdufzygjutbib2llgkrmqc3dd241hnzhtvgvupg26vhkn ju woej7uiilrjwftdvnrzwgwrvimwr0acnou56qab zmbg cvrs4qoqroieetlpr7k 3d attredirects 0 using r for multivariate analysis little book of r for multivariate analysis http little book of r for multivariate analysis readthedocs io en 
latest the freqparcoord package for multivariate visualization https matloff wordpress com 2014 03 30 the freqparcoord package for multivariate visualization use of freqparcoord for regression diagnostics http www r bloggers com use of freqparcoord for regression diagnostics time series analysis time series forecasting online book https www otexts org fpp a little book of time series analysis in r http a little book of r for time series readthedocs org en latest src timeseries html quick r time series and forecasting http www statmethods net advstats timeseries html components of time series data https www linkedin com pulse component time series data jeffrey strickland ph d cmsp unobserved component models using r https www linkedin com pulse unobserved component models r jeffrey strickland ph d cmsp the holt winters forecasting method http webarchive nationalarchives gov uk 20080726235635 http statistics gov uk iosmethodology downloads annex b the holt winters forecasting method pdf cran task view time series analysis https cran r project org web views timeseries html bayesian inference packages for bayesian inference https github com ujjwalkarn awesome r bayesian bayesian inference in r video https www youtube com watch v fiwik7onx3u r and bayesian statistics http www r bloggers com r and bayesian statistics machine learning using r machine learning with r https github com jhashanti machine learning with r using r for multivariate analysis online book http little book of r for multivariate analysis readthedocs org en latest src multivariateanalysis html cran task view machine learning statistical learning https cran r project org web views machinelearning html machine learning using r online book https www otexts org sfml linear regression and regularization code http rpubs com justmarkham linear regression salary cheatsheet http www analyticsvidhya com blog 2015 09 full cheatsheet machine learning algorithms multinomial and ordinal logistic regression in r http www analyticsvidhya com blog 2016 02 multinomial ordinal logistic regression evaluating logistic regression models in r https www r bloggers com evaluating logistic regression models neural networks in r visualizing neural nets in r https beckmw wordpress com 2013 11 14 visualizing neural networks in r update nnet package http stackoverflow com questions 21788817 r nnet with a simple example of 2 classes with 2 variables fitting a neural network in r neuralnet package http www r bloggers com fitting a neural network in r neuralnet package neural networks with r a simple example http gekkoquant com 2012 05 26 neural networks with r simple example neuralnettools 1 0 0 now on cran https beckmw wordpress com tag neural network introduction to neural networks in r http www louisaslett com courses data mining st4003 lab5 introduction to neural networks pdf step by step neural networks using r https bicorner com 2015 05 13 neural networks using r r for deep learning http www parallelr com r deep neural network from scratch neural networks using package neuralnet http www di fc ul pt jpn r neuralnets neuralnets html paper https journal r project org archive 2010 1 rjournal 2010 1 guenther fritsch pdf sentiment analysis different approaches https drive google com open id 0by wg rxnp 6u1jlnva3cnaxz3m sentiment analysis with machine learning in r http datascienceplus com sentiment analysis with machine learning in r first shot sentiment analysis in r http andybromberg com sentiment analysis qdap package https github com trinker qdap code 
http stackoverflow com questions 22774913 estimating document polarity using rs qdap package without sentsplit sentimentr package https github com trinker sentimentr tm plugin sentiment package https github com mannau tm plugin sentiment packages other than sentiment http stackoverflow com questions 15194436 is there any other package other than sentiment to do sentiment analysis in r sentiment analysis and opinion mining https www cs uic edu liub fbs sentiment analysis html tm term score http www inside r org packages cran tm docs tm term score vadersentiment paper http comp social gatech edu papers icwsm14 vader hutto pdf vadersentiment code https github com cjhutto vadersentiment imputation in r imputation in r http stackoverflow com questions 13114812 imputation in r imputation with random forests http stats stackexchange com questions 49270 imputation with random forests how to identify and impute multiple missing values using r http www unt edu rss class jon benchmarks missingvalueimputation jds nov2010 pdf mice error in implementation of random forest in mice r package http stackoverflow com questions 23974026 error in implementation of random forest in mice r package mice impute rf mice http www inside r org packages cran mice docs mice impute rf nlp and text mining in r what algorithm i need to find n grams http stackoverflow com questions 8161167 what algorithm i need to find n grams nlp r tutorial http www r bloggers com natural language processing tutorial introduction to the tm package text mining in r https cran r project org web packages tm vignettes tm pdf adding stopwords in r tm http stackoverflow com questions 18446408 adding stopwords in r tm text mining http www r bloggers com text mining word stemming in r http www omegahat net rstem stemming pdf classification of documents using text mining package tm http web letras up pt bhsmaia edv apresentacoes bradzil classif withtm pdf text mining tools techniques and applications http slidegur com doc 1830649 text mining text mining overview applications and issues http www3 cs stonybrook edu cse634 g8present pdf text mining pdf http www3 cs stonybrook edu cse634 presentations textmining pdf text mining another pdf http www stat columbia edu madigan w2025 notes introtextmining pdf good ppt http studylib net doc 5800473 topic7 textmining scraping twitter and web data using r http www nyu edu projects politicsdatalab localdata workshops twitter pdf visualisation in r ggplot2 tutorial http www ling upenn edu joseff avml2012 shiny examples https github com rstudio shiny examples top 50 ggplot2 visualizations http r statistics co top50 ggplot2 visualizations masterlist r code html comprehensive guide to data visualization in r http www analyticsvidhya com blog 2015 07 guide data visualization r interactive visualizations with r a minireview http www r bloggers com interactive visualizations with r a minireview beginner s guide to r painless data visualization http www computerworld com article 2497304 business intelligence beginner s guide to r painless data visualization html data visualization in r with ggvis https www datacamp com courses ggvis data visualization r tutorial multiple visualization articles in r http www r statistics com tag visualization statistics with r using r for biomedical statistics online book http a little book of r for biomedical statistics readthedocs org en latest src biomedicalstats html elementary statistics with r http www r tutor com elementary statistics a hands on introduction to statistics with 
r https www datacamp com introduction to statistics quick r basic statistics http www statmethods net stats index html quick r descriptive statistics http www statmethods net stats descriptives html explore statistics with r edx https www edx org course explore statistics r kix kiexplorx 0 useful r packages tidy data hadley paper https www jstatsoft org article view v059i10 package tidyr tidyr is an evolution of reshape2 it s design specifically for data tidying not general reshaping or aggregating and works well with dplyr data pipelines broom https github com dgrtwo broom plyr stringr reshape2 tutorial http www dataschool io tidying messy data in r video https vimeo com 33727555 code https github com justmarkham tidy data dplyr code files in this repo https github com ujjwalkarn datasciencer tree master intro 20to 20dplyr dplyr tutorial 1 http www dataschool io dplyr tutorial for faster data manipulation in r dplyr tutorial 2 http www dataschool io dplyr tutorial part 2 do your data janitor work like a boss with dplyr http www gettinggeneticsdone com 2014 08 do your data janitor work like boss html ggplot2 ggplot2 tutorial http www ling upenn edu joseff avml2012 good tutorial https github com jennybc ggplot2 tutorial introduction to ggplot2 https speakerdeck com karthik introduction to ggplot2 github https github com karthik ggplot lecture a quick introduction to ggplot http www noamross net blog 2012 10 5 ggplot introduction html r graphics cookbook http www cookbook r com graphs index html ggplot2 version of figures in lattice multivariate data visualization with r https learnr wordpress com 2009 06 28 ggplot2 version of figures in lattice multivariate data visualization with r part 1 a speed test comparison of plyr data table and dplyr http www r statistics com 2013 09 a speed test comparison of plyr data table and dplyr data table introduction to the data table package in r https cran r project org web packages data table vignettes datatable intro pdf fast summary statistics in r with data table http blog yhat com posts fast summary statistics with data dot table html other packages package e1071 package appliedpredictivemodeling package stringr stringr is a set of simple wrappers that make r s string functions more consistent simpler and easier to use package stringdist implements an approximate string matching version of r s native match function can calculate various string distances based on edits damerau levenshtein hamming levenshtein optimal sting alignment qgrams or heuristic metrics package fselector this package provides functions for selecting attributes from a given dataset ryacas an r interface to the yacas computer algebra system https cran r project org web packages ryacas vignettes ryacas pdf scatterplot3d an r package for visualizing multivariate data https cran r project org web packages scatterplot3d vignettes s3d pdf tm plugin webmining intro https cran r project org web packages tm plugin webmining vignettes shortintro pdf solving differential equations in r ode examples https cran r project org web packages diffeq vignettes odeinr pdf structural equation modeling with the sem package in r http socserv socsci mcmaster ca jfox misc sem sem paper pdf prettyscree prettygraphs http www inside r org packages cran prettygraphs docs prettyscree market basket analysis in r market basket analysis with r http www salemmarafi com code market basket analysis with r step by step explanation of market basket http dni institute in blogs market basket analysis step by step 
approach using r | datascience data-science r text-mining | ai |
blocksim | blocksim blockchain simulator a framework for modeling and simulating a blockchain protocol it follows a discrete event simulation model currently there are models to simulate bitcoin and ethereum this is an ongoing research project this framework is in a beta stage if you find any issue please report it installation we need to set up a virtualenv and install all the dependencies sh pip3 install virtualenv virtualenv p python3 venv source venv bin activate pip3 install r requirements txt running sh python m blocksim main how to use and model check our wiki https github com blockbirdlabs blocksim wiki abstract a blockchain is a distributed ledger in which participants that do not fully trust each other agree on the ledger s content by running a consensus algorithm this technology is raising a lot of interest both in academia and industry but the lack of tools to evaluate design and implementation decisions may hamper fast progress to address this issue this paper presents a discrete event simulator that is flexible enough to evaluate different blockchain implementations these blockchains can be rapidly modeled and simulated by extending existing models running bitcoin and ethereum simulations allowed us to change conditions and answer different questions about their performance for example we concluded that doubling the number of transactions per block has a low impact on the block propagation delay 10ms and that encrypting communication has a high impact in that delay more than 25 full paper you can find the full paper here https www carlosfaria com papers blocksim blockchain simulator pdf additionally the presentation for the 2019 ieee international conference on blockchain can be found here https www carlosfaria com talks blocksim ieee blockchain 2019 pdf | ethereum discrete-event-simulation blocksim | blockchain
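the blocksim entry above says the simulator follows a discrete event model; as an illustration of that paradigm only (a minimal simpy-style sketch, not blocksim's own api; the node names and block intervals below are made up):

```python
# Minimal discrete-event simulation sketch (SimPy): events advance a
# virtual clock instead of running in real time. Illustrates the
# paradigm only; this is not blocksim's API.
import simpy

def miner(env, name, block_interval):
    while True:
        yield env.timeout(block_interval)   # wait for the next block event
        print(f"t={env.now:6.1f}s  {name} appends a block")

env = simpy.Environment()
env.process(miner(env, "ethereum-like node", 13))
env.process(miner(env, "bitcoin-like node", 600))
env.run(until=1800)                         # simulate 30 minutes instantly
```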
tivaapps | overview this is a quick how to for writing a few hello world example apps on top of the texas instruments real time operating system aka ti rtos with sysbios kernel and running and debugging it on the tiva c launchpad tm4c123gxl evaluation board 13 from a linux host without depending on the ide from ti of particular interest is the system output redirection to uart and workarounds for a few issues in sysbios code example apps hello console output to uart hwiswitask various threading primitives ex from sysbios user manual exception cause a divide by zero and dump exception context to console skeleton main c initialization of console and logger and call to app entry point event logger c logger event formatting uart iface c write to uart serial ports ek tm4c123gxl c board specific code most content for this how to was gathered from various existing resources and stitched together to get an end to end working environment the initiative was started when there was less support for the tiva c launchpad than there is now and contained more resources e g linker scripts which are no longer needed and hence have been removed disclaimer personal project not affiliated with ti tool dependencies gcc arm none eabi toolchain tested with v4 7 2013q3 0precise6 from ubuntu repos prefix assumed to be usr rtsc xdc tools v3 25 05 94 standalone distribution http software dl ti com dsps dsps public sw sdo sb targetcontent rtsc index html build framework for building xdc packages and applications that consume them sysbios and tirtos are collections of xdc packages overview of rtsc http rtsc eclipse org docs tip general information and an accessible tutorial http rtsc eclipse org docs tip rtsc module primer lesson 0 documentation for some packages here http rtsc eclipse org cdoc tip index html the docs for the packages used for the present hello world are inline in the xdc files they can be compiled into html with xdc tools cdoc http rtsc eclipse org cdoc tip index html xdc tools cdoc package html or viewed interactively by setting up xdc tools cdoc sg http rtsc eclipse org docs tip command xdc tools cdoc viewer depends on mozilla note xdc tools v3 25 00 48 in ti rtos distribution v1 21 00 09 suffers from bug 421665 https bugs eclipse org bugs show bug cgi id 421665 which prevents it from working on 32 bit linux hosts openocd http openocd sourceforge net for flashing and debugging the target via icdi tested with v0 7 0 installed from source sudo apt get install openocd or from source board config script ek tm4c123gxl cfg included in this repo source dependencies tivaware v2 0 1 11577a shipped with ti rtos v1 21 00 09 http www ti com tool ti rtos low level support for microcontroller features alternatively standalone distribution sw tm4c complete http www ti com tool sw tm4c tivaware 1 0 or 1 1 is also included in the sw ek tm4c123gxl distribution which looks superseded by the above to rebuild bash cd tirtos 1 21 00 09 products tivaware c series 2 0 1 11577a make clean needed to clean deps files with previous build make ti sysbios package v6 37 00 20 patched sysbios repo https github com alexeicolin sysbios real time operating system kernel note sysbios suffers from this http e2e ti com support microcontrollers tiva arm f 908 p 313785 1097401 aspx 1097401 and this http e2e ti com support microcontrollers tiva arm f 908 t 313791 aspx bugs which break timers the version in the repo has the patches note this version of sysbios is shipped with ti rtos distribution v1 21 00 09 current latest version in
dedicated sysbios distribution is v6 35 04 50 which does not include explicit support for tm4c123 in particular default linking scripts however it can be made to work with a hand written linker script see earlier commits in this repo to build bash cd path to sysbios make f bios mak xdc install dir path to xdctools 3 25 05 94 xdcargs gnu targets arm m4f usr ti drivers package part of ti rtos http www ti com tool ti rtos v1 21 00 09 wraps low level microcontroller features to be used with xdc packages in tirtos repo https github com alexeicolin tirtos repo there are patches to the uart code that remove unneeded interrupts due to the async api which is unused by our app change the uart clock source to piosc in order to make it robust against clock changes due to transitioning into deep sleep sleeping is not enabled by default if you use that source then see the readme for build instructions otherwise see below to rebuild bash cd path to tirtos 1 21 00 09 make f tirtos mak ccs build false gcc installation dir usr tirtos installation dir pwd xdctools installation dir path to xdctools 3 25 05 94 bios installation dir path to sysbios drivers ti uia packages v1 04 00 06 shipped with ti rtos http www ti com tool ti rtos v1 21 00 09 instrumentation functionality of which the logger framework is used here to rebuild bash cd path to tirtos 1 21 00 09 make f tirtos mak ccs build false gcc installation dir usr tirtos installation dir pwd xdctools installation dir path to xdctools 3 25 05 94 bios installation dir path to sysbios uia build flash and run set up pointers to the above dependencies export xdctools installation dir path to xdctools 3 25 05 94 export tivaware installation dir path to tirtos 1 21 00 09 products tivaware c series 2 0 1 11577a export bios installation dir path to sysbios export tirtos installation dir path to tirtos 1 21 00 09 export uia installation dir path to tirtos 1 21 00 09 uia 1 04 00 06 the default target builds all example applications cd path to tivaapps make openocd f openocd cfg program app hello out 0x00000000 the tiva c board should create a usb to serial device linked to the uart port 0 to see the console output screen dev ttyacm0 115200 and push the reset button on the board for reference the driver kernel module for the usb to serial device is cdc acm output of lsusb bus 004 device 010 id 1cbe 00fd luminary micro inc and dmesg 487894 788077 usb 4 1 new full speed usb device number 10 using uhci hcd 487894 949990 cdc acm 4 1 1 0 this device cannot do calls on its own it is not a modem 487894 950031 cdc acm 4 1 1 0 ttyacm0 usb acm device debug openocd f openocd cfg gdb gdb target remote localhost 3333 gdb monitor reset halt gdb break app gdb c across re flashes the gdb session can be left running but it should be detach ed once detached openocd daemon can be shut down binary flashed daemon restarted and gdb target command re run without detaching the gdb session seems to break in odd ways add another application create an app name c with app as the main function analogous to app hello c app hello c in makefile makefile add the file to apps variable and add a corresponding target below applications | os
|
MoSync | mosync mobile development | front_end |
|
Deloitte-Virtual-Internship-2021 | deloitte virtual internship 2021 cloud engineering tech strategy and innovation and optimisation and delivery | cloud
|
Course-Outline-Web-Development | ensf 607 web development project authors evan boerchers dean kim hunter kimmett introduction the goal of this project was to create a web application using react for the frontend and django for the backend this web app would serve as a form for teachers to input information for their classes to be easily exported into a format usable by the university the project also served as a way to demonstrate an agile workflow and our group kept scrum notes pertaining to this facet of the project this project includes the course outline builder app itself as well as two preprojects one for the backend demonstrating django usage and one for the frontend creating a simple react web app contents the following folders are included in this project courseoutlinebuilder course outline builder web app project folder containing the following packages for the backend and frontend outline backend outline frontend preprojects aforementioned preprojects demonstrating django and react usage django evan react preproject screenshots screenshots demonstrating the web app for this readme scrum scrum notes for each of the 3 sprints done and their respective milestones completed under this project sprint 1 pre project sprint 2 milestone 1 sprint 3 milestone 2 dependencies the following packages are required to run this project frontend react native npm node material ui core material ui icons axios react router backend django django rest framework django filter django cors headers running backend ensure the required dependencies are located in the environment in which the django server will be running navigate to courseoutlinebuilder outline backend folder enter command python manage py runserver frontend ensure node and npm or yarn is installed on the machine navigate to courseoutlinebuilder outline frontend folder run npm install to install needed packages run npm start to start the local server app demo once backend and frontend are running the following home page should open home screenshots homepage png from this home page one can create a new course edit the base information of an existing one open the full outline for editing or delete a course using create will bring up the following prompt create screenshots homecreate png using edit will bring up a similar prompt edit screenshots homeedit png using delete will simply delete the selected outline using open will open a new page with the selected outline s full outline on display which can be edited open screenshots opengeneral png using save will save any changes into the backend and the sql database while home will return to the home page the outline contains 9 sections with varying fields some of which are simple text fields others being dynamic tables containing their own fields these dynamic tables can be edited and have rows added deleted below is an example of one table screenshots opentable png the final grade determination table is a notable one as this table contains a row which will sum all the other rows so that the user can ensure their grades add up to 100 grades screenshots opengrades png | django react sql webdev javascript | server
SuperICL | super in context learning supericl code for small models are valuable plug ins for large language models https arxiv org abs 2305 08848 supericl workflow https github com jetrunner supericl assets 22514219 4567f26b 2c21 4f00 bfba 92622dfab47b how to run code setup install requirements bash pip install r requirements txt add openai api key bash cp api config example py api config py vi api config py glue bash python run glue py model path roberta large mnli model name roberta large dataset mnli m explanation include this to enable explanation for overrides for all supported tasks see here https github com jetrunner supericl blob main run glue py l34 for the complete set of parameters see the code here https github com jetrunner supericl blob main run glue py l23 xnli bash python run xnli py model path path to model model name xlm v lang en ar bg de el es fr hi ru sw th tr ur vi zh for the complete set of parameters see the code here https github com jetrunner supericl blob main run xnli py l20 citation bibtex article xu2023small title small models are valuable plug ins for large language models author canwen xu and yichong xu and shuohang wang and yang liu and chenguang zhu and julian mcauley journal arxiv preprint arxiv 2305 08848 year 2023 | ai |
|
step-portfolio | step portfolio this is the portfolio i created as part of google s student training in engineering program step the full program i followed and original git history can be found here https github com anthony fiddes step project this is just the portfolio portion this portfolio can be easily adapted to create a clean sectioned resume to be hosted on google cloud features a simple flexible styling made from scratch a comments section support for uploading images as part of comment support for using google s translate api to translate all comments to english spanish or chinese screenshots introduction section it shows the top of my portfolio https user images githubusercontent com 11233666 88747026 f2707580 d11b 11ea 9678 ee10de6d9947 png comment section https user images githubusercontent com 11233666 88747227 5a26c080 d11c 11ea 8980 d6617d165e1c png use the portfolio can be run with the following command bash mvn package appengine run | cloud |
|
awesome-computer-vision-in-sports | awesome https cdn rawgit com sindresorhus awesome d7305f38d29fed78fa85652e3a63e154dd8e8829 media badge svg https github com sindresorhus awesome awesome computer vision in sports papers by the year of publications 2022 soccernet 2022 challenges results acm mmsports 22 paper https arxiv org abs 2210 02365 deepsportradar v1 computer vision dataset for sports understanding with high quality annotations acm mmsports 22 paper https arxiv org abs 2208 08190 2021 lol v2t large scale esports video description dataset cvprw 21 paper https openaccess thecvf com content cvpr2021w cvsports papers tanaka lol v2t large scale esports video description dataset cvprw 2021 paper pdf contrastive learning for sports video unsupervised player classification cvprw 21 paper https openaccess thecvf com content cvpr2021w cvsports papers koshkina contrastive learning for sports video unsupervised player classification cvprw 2021 paper pdf automated tackle injury risk assessment in contact based sports a rugby union example cvprw 21 paper https openaccess thecvf com content cvpr2021w cvsports papers martin automated tackle injury risk assessment in contact based sports a cvprw 2021 paper pdf toward improving the visual characterization of sport activities with abstracted scene graphs cvprw 21 paper https openaccess thecvf com content cvpr2021w cvsports papers rahimi toward improving the visual characterization of sport activities with abstracted cvprw 2021 paper pdf detecting and matching related objects with one proposal multiple predictions cvprw 21 paper https openaccess thecvf com content cvpr2021w cvsports papers liu detecting and matching related objects with one proposal multiple predictions cvprw 2021 paper pdf soccernet v2 a dataset and benchmarks for holistic understanding of broadcast soccer videos cvprw 21 paper https openaccess thecvf com content cvpr2021w cvsports papers deliege soccernet v2 a dataset and benchmarks for holistic understanding of broadcast cvprw 2021 paper pdf table tennis stroke recognition using two dimensional human pose estimation cvprw 21 paper https openaccess thecvf com content cvpr2021w cvsports papers kulkarni table tennis stroke recognition using two dimensional human pose estimation cvprw 2021 paper pdf puck localization and multi task event recognition in broadcast hockey videos cvprw 21 paper https openaccess thecvf com content cvpr2021w cvsports papers giancola temporally aware feature pooling for action spotting in soccer broadcasts cvprw 2021 paper pdf automatic play segmentation of hockey videos cvprw 21 paper https openaccess thecvf com content cvpr2021w cvsports papers pidaparthy automatic play segmentation of hockey videos cvprw 2021 paper pdf deepdarts modeling keypoints as objects for automatic scorekeeping in darts using a single camera cvprw 21 paper https openaccess thecvf com content cvpr2021w cvsports papers mcnally deepdarts modeling keypoints as objects for automatic scorekeeping in darts cvprw 2021 paper pdf camera calibration and player localization in soccernet v2 and investigation of their representations for action spotting cvprw 21 paper https openaccess thecvf com content cvpr2021w cvsports papers cioppa camera calibration and player localization in soccernet v2 and investigation of cvprw 2021 paper pdf 2020 actor transformers for group activity recognition cvpr 20 paper http isis data science uva nl cgmsnoek pub gavrilyuk transformers cvpr2020 pdf progressive relation learning for group activity recognition cvpr 20 
paper https arxiv org pdf 1908 02948 pdf group activity detection from trajectory and video data in soccer cvprw 20 paper https openaccess thecvf com content cvprw 2020 papers w53 sanford group activity detection from trajectory and video data in soccer cvprw 2020 paper pdf falcons fast learner grader for contorted poses in sports cvprw 20 paper https openaccess thecvf com content cvprw 2020 papers w53 nekoui falcons fast learner grader for contorted poses in sports cvprw 2020 paper pdf as seen on tv automatic basketball video production using gaussian based actionness and game states recognition cvprw 20 paper https openaccess thecvf com content cvprw 2020 papers w53 quiroga as seen on tv automatic basketball video production using gaussian based cvprw 2020 paper pdf decoupling video and human motion towards practical event detection in athlete recordings cvprw 20 paper https openaccess thecvf com content cvprw 2020 papers w53 einfalt decoupling video and human motion towards practical event detection in cvprw 2020 paper pdf utilizing mask r cnn for waterline detection in canoe sprint video analysis cvprw 20 paper https openaccess thecvf com content cvprw 2020 papers w53 von braun utilizing mask r cnn for waterline detection in canoe sprint video cvprw 2020 paper pdf ttnet real time temporal and spatial video analysis of table tennis cvprw 20 paper https arxiv org pdf 2004 09927 pdf 2019 learning actor relation graphs for group activity recognition cvpr 19 paper https openaccess thecvf com content cvpr 2019 papers wu learning actor relation graphs for group activity recognition cvpr 2019 paper pdf 2018 soccer on your tabletop cvpr 18 paper https grail cs washington edu projects soccer soccer on your tabletop pdf egocentric basketball motion planning from a single first person image cvpr 18 paper https arxiv org pdf 1803 01413 towards structured analysis of broadcast badminton videos wacv 18 paper http cvit iiit ac in images conferencepapers 2018 badminton analytics pdf fine grained activity recognition in baseball videos cvprw 18 paper https arxiv org pdf 1804 03247 pdf where will they go predicting fine grained adversarial multi agent motion using conditional variational autoencoders eccv 18 paper https openaccess thecvf com content eccv 2018 papers panna felsen where will they eccv 2018 paper pdf fine grained video captioning for sports narrative eccv 18 paper https openaccess thecvf com content cvpr 2018 papers yu fine grained video captioning cvpr 2018 paper pdf 2017 what will happen next forecasting player moves in sports videos iccv 17 paper http openaccess thecvf com content iccv 2017 papers felsen what will happen iccv 2017 paper pdf not all passes are created equal objectively measuring the risk and reward of passes in soccer from tracking data sigkdd 17 paper https dl acm org citation cfm id 3098051 smarttennistv an automatic indexing system for tennis ncvpripg 17 paper https researchweb iiit ac in anurag ghosh static smarttennistv automatic indexing pdf social scene understanding end to end multi person action localization and collective activity recognition cvpr 17 paper http openaccess thecvf com content cvpr 2017 papers bagautdinov social scene understanding cvpr 2017 paper pdf coordinated multi agent imitation learning icml 17 paper http proceedings mlr press v70 le17a le17a pdf 2016 chalkboarding a new spatiotemporal query paradigm for sports play retrieval acm iui 16 paper https dl acm org citation cfm id 2856772 what players do with the ball a physically constrained 
interaction modeling cvpr 16 paper https www cv foundation org openaccess content cvpr 2016 papers maksai what players do cvpr 2016 paper pdf generating long term trajectories using deep hierarchical networks nips 16 paper http papers nips cc paper 6520 generating long term trajectories using deep hierarchical networks pdf detecting events and key actors in multi person videos cvpr 16 paper http openaccess thecvf com content cvpr 2016 papers ramanathan detecting events and cvpr 2016 paper pdf 2015 mimicking human camera operators wacv 15 paper https pdfs semanticscholar org e73b 4b7c48a4aeca81ec132e9e147dc19d103ded pdf quality vs quantity improved shot prediction in soccer using strategic features from spatiotemporal data paper mit sloan sports analytics conference 15 paper https s3 us west 1 amazonaws com disneyresearch wp content uploads 20150308192147 quality vs quantity e2 80 9d improved shot prediction in soccer using strategic features from spatiotemporal data paper pdf 2014 how to get an open shot analyzing team movement in basketball using tracking data mit sloan 14 paper https s3 us west 1 amazonaws com disneyresearch wp content uploads 20141125014436 how to get an open shot analyzing team movement in basketball using tracking data paper pdf 2013 take your eyes off the ball improving ball tracking by focusing on team play cviu 13 paper https infoscience epfl ch record 185107 files top infosci 1 pdf detecting and tracking sports players with random forests and context conditioned motion models cvpr 13 paper https www cv foundation org openaccess content cvpr 2013 papers liu tracking sports players 2013 cvpr paper pdf representing and discovering adversarial team behaviors using player roles cvpr 13 paper http openaccess thecvf com content cvpr 2013 papers lucey representing and discovering 2013 cvpr paper pdf recognising team activities from noisy data cvprw 13 paper https www cv foundation org openaccess content cvpr workshops 2013 w19 papers bialkowski recognising team activities 2013 cvpr paper pdf 2012 point less calibration camera parameters from gradient based alignment to edge images wacv 12 paper http ieeexplore ieee org stamp stamp jsp tp arnumber 6163012 datasets available 1 volleyball dataset link https github com mostafa saad deep activity rec | ai |
|
aws-iot-device-sdk-java | new version available a new aws iot device sdk is now available https github com awslabs aws iot device sdk java v2 it is a complete rework built to improve reliability performance and security we invite your feedback this sdk will no longer receive feature updates but will receive security updates aws iot device sdk for java the aws iot device sdk for java enables java developers to access the aws iot platform through mqtt or mqtt over the websocket protocol aws iot protocol the sdk is built with aws iot device shadow support aws iot thing providing access to thing shadows sometimes referred to as device shadows using shadow methods including get update and delete it also supports a simplified shadow access model which allows developers to exchange data with their shadows by just using getter and setter methods without having to serialize or deserialize any json documents to get started use the maven repository or download the latest jar file latest jar overview overview install the sdk install the sdk use the sdk use the sdk sample applications sample applications api documentation api documentation license license support support overview this document provides instructions for installing and configuring the aws iot device sdk for java it also includes some examples that demonstrate the use of different apis mqtt connection types the sdk is built on top of the paho mqtt java client library paho mqtt java download developers can choose from two types of connections to connect to the aws iot service mqtt over tls 1 2 with x 509 certificate based mutual authentication mqtt over websocket with aws signature version 4 authentication for mqtt over tls port 8883 a valid certificate and private key are required for authentication for mqtt over websocket port 443 a valid aws identity and access management iam access key id and secret access key pair is required for authentication thing shadows a thing shadow represents the cloud counterpart of a physical device or thing although a device is not always online its thing shadow is a thing shadow stores data in and out of the device in a json based document when the device is offline its shadow document is still accessible to the application when the device comes back online the thing shadow publishes the delta to the device which the device didn t see while it was offline the sdk implements the protocol for applications to retrieve update and delete shadow documents mentioned here aws iot thing when you use the simplified access model you have the option to enable strict document versioning to reduce the overhead of subscribing to shadow topics for each method requested the sdk automatically subscribes to all of the method topics when a connection is established simplified shadow access model unlike the shadow methods which operate on json documents the simplified shadow access model allows developers to access their shadows with getter and setter methods to use this feature you must extend the device class awsiotdevice use the annotation awsiotdeviceproperty to mark class member variables to be managed by the sdk and provide getter and setter methods for accessing these variables the getter methods will be used by the sdk to report to the shadow periodically the setter methods will be invoked whenever there is a change to the desired state of the shadow document for more information see use the sdk use the sdk later in this document install the sdk minimum requirements to use the sdk you will need java 1 7 install the sdk 
using maven the recommended way to use the aws iot device sdk for java in your project is to consume it from maven simply add the following dependency to the pom file of your maven project xml dependencies dependency groupid com amazonaws groupid artifactid aws iot device sdk java artifactid version 1 3 9 version dependency dependencies the sample applications included with the sdk can also be installed using the following dependency definition xml dependencies dependency groupid com amazonaws groupid artifactid aws iot device sdk java samples artifactid version 1 3 9 version dependency dependencies install the sdk using the latest jar the latest jar files can be downloaded here latest jar you can simply extract and copy the jar files to your project s library directory and then update your ide to include them to your library build path you will also need to add two libraries the sdk depends on jackson 2 x including jackson core jackson core and jackson databind jackson databind paho mqtt client for java 1 1 x download instructions paho mqtt java download build the sdk from the github source you can build both the sdk and its sample applications from the source hosted at github sh git clone https github com aws aws iot device sdk java git cd aws iot device sdk java mvn clean install dgpg skip true use the sdk the following sections provide some basic examples of using the sdk to access the aws iot service over mqtt for more information about each api see the api documentation api docs initialize the client to access the aws iot service you must initialize awsiotmqttclient the way in which you initialize the client depends on the connection type mqtt or mqtt over websocket you choose in both cases a valid client endpoint and client id are required for setting up the connection initialize the client with mqtt over tls 1 2 for this mqtt connection type port 8883 the aws iot service requires tls mutual authentication so a valid client certificate x 509 and rsa keys are required you can use the aws iot console aws iot console or the aws command line tools to generate certificates and keys for the sdk only a certificate file and private key file are required java string clientendpoint prefix ats iot region amazonaws com use value returned by describe endpoint endpoint type iot data ats string clientid unique client id replace with your own client id use unique client ids for concurrent connections string certificatefile certificate file x 509 based certificate file string privatekeyfile private key file pkcs 1 or pkcs 8 pem encoded private key file sampleutil java and its dependency privatekeyreader java can be copied from the sample source code alternatively you could load key store directly from a file see the example included in this readme keystorepasswordpair pair sampleutil getkeystorepasswordpair certificatefile privatekeyfile awsiotmqttclient client new awsiotmqttclient clientendpoint clientid pair keystore pair keypassword optional parameters can be set before connect client connect initialize the client with mqtt over websocket for this mqtt connection type port 443 you will need valid iam credentials to initialize the client this includes an aws access key id and secret access key there are a number of ways to get iam credentials for example by creating permanent iam users or by requesting temporary credentials through the amazon cognito service for more information see the developer guides for these services as a best practice for application security do not embed credentials 
directly in the source code java string clientendpoint prefix ats iot region amazonaws com use value returned by describe endpoint endpoint type iot data ats string clientid unique client id replace with your own client id use unique client ids for concurrent connections aws iam credentials could be retrieved from aws cognito sts or other secure sources awsiotmqttclient client new awsiotmqttclient clientendpoint clientid awsaccesskeyid awssecretaccesskey sessiontoken optional parameters can be set before connect client connect publish and subscribe after the client is initialized and connected you can publish messages and subscribe to topics to publish a message using a blocking api java string topic my own topic string payload any payload client publish topic awsiotqos qos0 payload to publish a message using a non blocking api java public class mymessage extends awsiotmessage public mymessage string topic awsiotqos qos string payload super topic qos payload override public void onsuccess called when message publishing succeeded override public void onfailure called when message publishing failed override public void ontimeout called when message publishing timed out string topic my own topic awsiotqos qos awsiotqos qos0 string payload any payload long timeout 3000 milliseconds mymessage message new mymessage topic qos payload client publish message timeout to subscribe to a topic java public class mytopic extends awsiottopic public mytopic string topic awsiotqos qos super topic qos override public void onmessage awsiotmessage message called when a message is received string topicname my own topic awsiotqos qos awsiotqos qos0 mytopic topic new mytopic topicname qos client subscribe topic note all operations publish subscribe unsubscribe will not timeout unless you define a timeout when performing the operation if no timeout is defined then a value of 0 is used meaning the operation will never timeout and in rare cases wait forever for the server to respond and block the calling thread indefinitely it is recommended to set a timeout for qos1 operations if your application expects responses within a fixed duration or if there is the possibility the server you are communicating with may not respond shadow methods to access a shadow using a blocking api java string thingname thing name replace with your aws iot thing name awsiotdevice device new awsiotdevice thingname client attach device client connect delete existing shadow document device delete update shadow document state state state reported sensor 3 0 device update state get the entire shadow document string state device get to access a shadow using a non blocking api java public class myshadowmessage extends awsiotmessage public myshadowmessage super null null override public void onsuccess called when the shadow method succeeded state json document received is available in the payload field override public void onfailure called when the shadow method failed override public void ontimeout called when the shadow method timed out string thingname thing name replace with your aws iot thing name awsiotdevice device new awsiotdevice thingname client attach device client connect myshadowmessage message new myshadowmessage long timeout 3000 milliseconds device get message timeout simplified shadow access model to use the simplified shadow access model you need to extend the device class awsiotdevice and then use the annotation class awsiotdeviceproperty to mark the device attributes and provide getter and setter methods for them the following 
very simple example has one attribute somevalue defined the code will report the attribute to the shadow identified by thingname every 5 seconds in the reported section of the shadow document the sdk will call the setter method setsomevalue whenever there s a change to the desired section of the shadow document java public class mydevice extends awsiotdevice public mydevice string thingname super thingname awsiotdeviceproperty private string somevalue public string getsomevalue read from the physical device public void setsomevalue string newvalue write to the physical device mydevice device new mydevice thingname long reportinterval 5000 milliseconds default interval is 3000 device setreportinterval reportinterval client attach device client connect other topics enable logging the sdk uses java util logging for logging to change the logging behavior for example to change the logging level or logging destination you can specify a property file using the jvm property java util logging config file it can be provided through jvm arguments like so sh djava util logging config file logging properties to change the console logging level the property file logging properties should contain the following lines override of console logging level java util logging consolehandler level info load keystore from file to initialize the client you can load a keystore object directly from jks based keystore files you will first need to import the x 509 certificate and the private key into the keystore file like so sh openssl pkcs12 export in certificate file inkey private key file out p12 keystore name alias type in the export password keytool importkeystore srckeystore p12 keystore srcstoretype pkcs12 srcstorepass export password alias alias deststorepass keystore password destkeypass key password destkeystore my keystore after the keystore file my keystore is created you can use it to initialize the client like so java string keystorefile my keystore replace with your own key store file string keystorepassword keystore password replace with your own key store password string keypassword key password replace with your own key password keystore keystore keystore getinstance keystore getdefaulttype keystore load new fileinputstream keystorefile keystorepassword tochararray string clientendpoint prefix iot region amazonaws com replace prefix and region with your own string clientid unique client id replace with your own client id use unique client ids for concurrent connections awsiotmqttclient client new awsiotmqttclient clientendpoint clientid keystore keypassword use ecc based certificates you can use elliptic curve cryptography ecc based certificates to initialize the client to create an ecc key and certificate see this blog post aws iot ecc blog after you have created and registered the key and certificate use the following command to convert the ecc key file to pkcs 8 format sh openssl pkcs8 topk8 nocrypt in ecckey key out ecckey pk8 key type in the key password you can then use the instructions described in this section initialize the client to initialize the client with just this one change java sampleutil java and its dependency privatekeyreader java can be copied from the sample source code alternatively you could load key store directly from a file see the example included in this readme keystorepasswordpair pair sampleutil getkeystorepasswordpair certificatefile privatekeyfile ec increase in flight publish limit too many publishes in progress
error this means that your application has more operations in flight meaning they have not succeeded or failed but they are waiting for a response from the server than paho supports by default by default the paho client supports a maximum of 10 in flight operations the recommended way to resolve this issue is to track how many qos1 operations you have sent that are in flight and when you reach the limit of 10 you add any further operations into a queue then as the qos1 operations are no longer in flight you grab qos1 operations from the queue until it is empty or until you have hit the maximum of 10 in flight operations you then repeat this process until all the operations are sent this will prevent your application from ever trying to send too many operations at once and exceeding the maximum in flight limit of the paho client another way to help reduce this issue is to increase the maximum number of in flight operations that the paho client can process to do this you will need to modify the source code to increase this limit download the source code from github navigate to the awsiotmqttconnection java file and add the following line of code in the buildmqttconnectoptions function just under the line options setkeepaliveinterval client getkeepaliveinterval 1000 around line 151 options setmaxinflight 100 then compile the source code and use the compiled jar in your application this will increase paho s in flight limit to 100 and allow you to have more in flight at the same time giving additional room for sending larger volumes of qos1 operations note that these in flight operations still need to be acknowledged by the server or timeout before they are no longer in flight you can just have up to 100 in flight rather than the default of 10 for aws iot core you can only send a maximum of 100 qos1 operations per second any operations sent after the first 100 per second will be ignored by aws iot core for this reason it is highly recommended you perform all operations with a timeout if you increase the maximum in flight limit to prevent a situation where you send more than 100 qos1 operations per second and are waiting on an operation to get an acknowledgement from the server that will never come a language agnostic sketch of this queueing strategy appears after the reference links at the end of this readme sample applications there are three sample applications included with the sdk the easiest way to run these samples is through maven which will take care of getting the dependencies publish subscribe sample this sample consists of two publishers publishing one message per second to a topic one subscriber subscribing to the same topic receives and prints the messages shadow sample this sample consists of a simple demo of the simplified shadow access model the device contains two attributes window state and room temperature window state can be modified therefore controlled remotely through desired state to demonstrate this control function you can use the aws iot console to modify the desired window state and then see its change from the sample output shadow echo sample this sample consists of a simple demo that uses shadow methods to send a shadow update and then retrieve it back every second arguments for the sample applications to run the samples you will also need to provide the following arguments through the command line clientendpoint client endpoint obtained via calling describe endpoint clientid client id thingname aws iot thing name not required for the publish subscribe sample you will also need to provide either set of the following arguments for authentication for an mqtt connection provide
these arguments certificatefile x 509 based certificate file for just in time registration this is the concatenated file from both the device certificate and ca certificate for more information about just in time registration please see this blog just in time registration privatekeyfile private key file keyalgorithm optional rsa or ec if not specified rsa is used for an mqtt over websocket connection provide these arguments awsaccesskeyid iam access key id awssecretaccesskey iam secret access key sessiontoken optional if temporary credentials are used run the sample applications you can use the following commands to execute the sample applications assuming tls mutual authentication is used to run the publish subscribe sample use the following command sh mvn exec java pl aws iot device sdk java samples dexec mainclass com amazonaws services iot client sample pubsub publishsubscribesample dexec args clientendpoint prefix ats iot region amazonaws com clientid unique client id certificatefile certificate file privatekeyfile private key file to run the shadow sample use the following command sh mvn exec java pl aws iot device sdk java samples dexec mainclass com amazonaws services iot client sample shadow shadowsample dexec args clientendpoint prefix ats iot region amazonaws com clientid unique client id thingname thing name certificatefile certificate file privatekeyfile private key file to run the shadow echo sample use the following command sh mvn exec java pl aws iot device sdk java samples dexec mainclass com amazonaws services iot client sample shadowecho shadowechosample dexec args clientendpoint prefix ats iot region amazonaws com clientid unique client id thingname thing name certificatefile certificate file privatekeyfile private key file sample source code you can get the sample source code either from the github repository as described here build the sdk from the github source or from the latest sdk binary latest jar they both provide you with maven project files that you can use to build and run the samples from the command line or import them into an ide such as eclipse the sample source code included with the latest sdk binary is shipped with a modified maven project file pom xml that allows you to build the sample source independently without the need to reference the parent pom file as with the github source tree api documentation you ll find the api documentation for the sdk here api docs license this sdk is distributed under the apache license version 2 0 apache license 2 for more information see license txt and notice txt support if you have technical questions about the aws iot device sdk use the aws iot forum aws iot forum for any other questions about aws iot contact aws support aws support aws iot protocol http docs aws amazon com iot latest developerguide protocols html aws iot thing http docs aws amazon com iot latest developerguide iot thing shadows html aws iot forum https forums aws amazon com forum jspa forumid 210 aws iot console http aws amazon com iot latest jar https s3 amazonaws com aws iot device sdk java aws iot device sdk java latest zip jackson core https github com fasterxml jackson core jackson databind https github com fasterxml jackson databind paho mqtt java download https eclipse org paho clients java api docs http aws iot device sdk java docs s3 website us east 1 amazonaws com aws iot ecc blog https aws amazon com blogs iot elliptic curve cryptography and forward secrecy support in aws iot 3 aws support https aws amazon com contact us apache license
2 http www apache org licenses license 2 0 just in time registration https aws amazon com blogs iot just in time registration of device certificates on aws iot | server |
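The "too many publishes in progress" section above recommends tracking how many QoS1 operations are in flight and queueing the overflow. The SDK documented in this readme is Java; as a language-agnostic illustration only, here is a minimal sketch of that bounded in-flight pattern in Python. The `client` object and its `publish_async` callback hook are hypothetical stand-ins, not part of this SDK or of Paho.

```python
from collections import deque

MAX_IN_FLIGHT = 10  # the Paho client's default in-flight limit described above

class BoundedPublisher:
    """Tracks QoS1 publishes in flight and queues the overflow."""

    def __init__(self, client, limit=MAX_IN_FLIGHT):
        self.client = client      # hypothetical async MQTT client, injected
        self.limit = limit
        self.in_flight = 0
        self.pending = deque()

    def publish(self, topic, payload):
        # send immediately if a slot is free, otherwise hold in the queue
        if self.in_flight < self.limit:
            self._send(topic, payload)
        else:
            self.pending.append((topic, payload))

    def _send(self, topic, payload):
        self.in_flight += 1
        # publish_async is a hypothetical non-blocking call that invokes the
        # callback when the broker acks, the publish fails, or it times out
        self.client.publish_async(topic, payload, qos=1, callback=self._done)

    def _done(self, result):
        # an operation left the in-flight window; drain one queued publish
        self.in_flight -= 1
        if self.pending:
            self._send(*self.pending.popleft())
```

The same bookkeeping applies if the limit is raised to 100 by patching AWSIotMqttConnection as described above; pairing it with per-operation timeouts keeps a lost acknowledgement from pinning a slot forever.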
|
Embedded-Design-Project | embedded design project a keypad based lock with sms alert designed as part of the module ee6304 embedded system design the firmware is written in c atmel studio and the arduino ide are used for code development in this design an authorized person can unlock the lock by entering the correct pin on the keypad if the password is incorrect an sms alert is sent to the owner s mobile phone the password can also be changed by an authorized person | os |
|
tab-rs | tab the intuitive config driven terminal multiplexer designed for software systems engineers img align right width 400 height 400 src readme tab vectr svg intuitive tabs are discovered and selected with a built in fuzzy finder or with the command tab name tab has one escape sequence ctrl t config driven tab provides persistent configurable tabs with unique command history working directories and docstrings state agnostic tab provides a simple consistent interface that works anywhere in any state autocompleted tab provides an interactive fuzzy finder and dynamic autocomplete on the command line so you can get oriented and context switch fast fast tabs launch in 50ms and reconnect in 10ms keyboard latency stdin to stdout is under 5ms quickstart quick installation usage instructions brew install austinjones taps tab or cargo install tab then tab install all installs shell autocompletion scripts and statusline integrations tab foo to open a tab tab bar to switch to another tab works within an active session echo tab to view the active tab put this in your promptline or get https starship rs tab to open the fuzzy finder tab w baz to close a tab or many tabs tab l to view the tabs ctrl t to escape to the fuzzy finder tab adds trailing slashes to tab names this improves autocomplete between tabs and subtabs e g tab and tab child installation tab currently supports macos and linux tab supports the bash fish and zsh shells 1 install the binary the tab binary can be installed using homebrew cargo or from binaries homebrew brew install austinjones taps tab cargo cargo install tab binaries download binaries from https github com austinjones tab rs releases latest https github com austinjones tab rs releases latest known issues the zsh installer fails if the usr local share zsh site functions directory is not writable and you don t use oh my zsh see 221 https github com austinjones tab rs issues 221 after you upgrade tab or move the tab binary you may want to run the tab shutdown command to restart the daemon see 163 https github com austinjones tab rs issues 163 if you get the message tab unsupported terminal app you fix it by removing the osx plugin from your zshrc see 156 https github com austinjones tab rs issues 156 invoking tab with no arguments within a session causes your shell history to be cleared by fuzzy finder output when reconnecting to the session it may require enter for the shell prompt to appear see 262 https github com austinjones tab rs issues 262 2 install autocompletions for your shell tab works best when configured with shell autocompletions tab has a built in script installer which provides a detailed explanation of the changes and prompts for your confirmation you should run it as your standard user account as it manages files within your home directory all tab can install completions for all shells supported integration which are present on your system tab install all bash fish zsh tab can also install completions for a specific shell tab install bash tab install fish tab install zsh 3 configure your statusline starship tab integrates with the starship https starship rs prompt and can auto configure the integration tab install starship you can also put the current tab before the directory in config starship toml this is how i ve configured my own shell format custom tab all other you can configure any other statusline tool that supports environment variables the current tab name is available in the tab environment var you can also add a handcrafted statusline snippet 
to your shell s rc configuration file in bash https github com austinjones tab rs blob main tab src completions bash statusline bash fish https github com austinjones tab rs blob main tab src completions fish statusline fish or zsh https github com austinjones tab rs blob main tab src completions zsh statusline zsh navigation tab is designed to provide quick navigation between tabs and persistent navigation to workspaces or repositories in these examples the prefix before the is the selected tab to select a tab opens the fuzzy finder tab foo selects a tab by name foo tab bar bar to switch to another tab while within a session and exit the session foo tab bar bar exit to switch to another tab while within an interactive application monitor top top output ctrl t foo foo each workspace has it s own tab you can use this to quickly reset the working directory within a workspace repo tab workspace workspace to switch to another workspace if a workspace link has been configured in the current workspace tab yml https github com austinjones tab rs blob main examples advanced workspace tab yml workspace tab other workspace other workspace configuration tab supports configurable sessions which can be written in yaml there are a few types of configurations user configuration which can be placed at config tab yml or tab config this configuration is always active and can be used to define global tabs or pin links to workspaces for easy access it can also be used to override keybindings https github com austinjones tab rs blob main examples advanced workspace tab yml l65 workspace configurations which are active within any subdirectory and link to repositories repository configurations which define tab endpoints your typical tab interaction would be switching to one of these repositories via tab myproj detailed documentation is available in the examples https github com austinjones tab rs tree main examples directory but here are some starter configurations config tab yml or tab config workspace this is a global tab that is always available and initializes in tab dir tab global tab doc my global tab doc dir tab dir this links to a workspace config in my workspace workspaces are only active within the workspace directory this creates a link to that workspace so you can always activate it with tab my workspace workspace my workspace workspace tab yml workspace repo my project tab workspace tab doc this is a top level workspace tab workspace other workspace workspace my project tab yml repo proj doc my project tabs tab run dir src doc runs the project server with these configurations tab l provides the following tab l available tabs global tab my global tab doc proj my project proj run runs the project server workspace tab this is a top level workspace tab other workspace workspace tab for other workspace security tab can execute commands in a terminal so i take security seriously this is how i protect your machine in tab the tab daemon requires the following to accept any websocket connection the request must include a 128 byte auth token stored in the file tab daemon pid yml on unix operating systems the file is assigned the permissions 600 the origin header must not be present in the request this prevents any connection from a browser the daemon binds to 127 0 0 1 this should prevent any attempted connections from the local network | terminal multiplexer shell | os |
UR-SQL-Challenge | employee database a mystery in two parts background it is a beautiful spring day and it is two weeks since you have been hired as a new data engineer at pewlett hackard your first major task is a research project on employees of the corporation from the 1980s and 1990s all that remain of the database of employees from that period are six csv files in this assignment you will design the tables to hold data in the csvs import the csvs into a sql database and answer questions about the data in other words you will perform 1 data engineering 3 data analysis note you may hear the term data modeling in place of data engineering but they are the same terms data engineering is the more modern wording instead of data modeling task data modeling inspect the csvs and sketch out an erd of the tables feel free to use a tool like http www quickdatabasediagrams com http www quickdatabasediagrams com model output images employeesql erd model png data engineering use the information you have to create a table schema for each of the six csv files remember to specify data types primary keys foreign keys and other constraints for the primary keys check to see if the column is unique otherwise create a composite key https en wikipedia org wiki compound key which takes to primary keys in order to uniquely identify a row be sure to create tables in the correct order to handle foreign keys import each csv file into the corresponding sql table note be sure to import the data in the same order that the tables were created and account for the headers when importing to avoid errors data analysis once you have a complete database do the following 1 list the following details of each employee employee number last name first name sex and salary 2 list first name last name and hire date for employees who were hired in 1986 3 list the manager of each department with the following information department number department name the manager s employee number last name first name 4 list the department of each employee with the following information employee number last name first name and department name 5 list first name last name and sex for employees whose first name is hercules and last names begin with b 6 list all employees in the sales department including their employee number last name first name and department name 7 list all employees in the sales and development departments including their employee number last name first name and department name 8 in descending order list the frequency count of employee last names i e how many employees share each last name bonus optional as you examine the data you are overcome with a creeping suspicion that the dataset is fake you surmise that your boss handed you spurious data in order to test the data engineering skills of a new employee to confirm your hunch you decide to take the following steps to generate a visualization of the data with which you will confront your boss 1 import the sql database into pandas yes you could read the csvs directly in pandas but you are after all trying to prove your technical mettle consult sqlalchemy documentation https docs sqlalchemy org en latest core engines html postgresql for more information if using a password do not upload your password to your github repository 2 create a histogram to visualize the most common salary ranges for employees histogram output images salary freq histogram png 3 create a bar chart of average salary by title bar output images salary by title bar png epilogue evidence in hand you march 
into your boss s office and present the visualization with a sly grin your boss thanks you for your work on your way out of the office you hear the words search your id number you look down at your badge to see that your employee id number is 499942 submission create an image file of your erd create a sql file of your table schemata create a sql file of your queries optional create a jupyter notebook of the bonus analysis references mockaroo llc 2021 realistic data generator https www mockaroo com https www mockaroo com 2021 trilogy education services llc a 2u inc brand confidential and proprietary all rights reserved | sql data-analysis sqlalchemy | server |
Using-Large-Language-Models-to-Replicate-Human-Subject-Studies | using large language models to simulate multiple humans and replicate human subject studies authors gati aher rosa i arriaga adam tauman kalai inproceedings turingexp22 title using large language models to simulate multiple humans and replicate human subject studies author aher gati v and arriaga rosa i and kalai adam tauman booktitle proceedings of the 40th international conference on machine learning icml year 2023 url https arxiv org abs 2208 10264 organization pmlr submitted to arxiv https arxiv org abs 2208 10264 on august 18 2022 abstract we introduce a new type of test called a turing experiment te for evaluating to what extent a given language model such as gpt models can simulate different aspects of human behavior a te can also reveal consistent distortions in a language model s simulation of a specific human behavior unlike the turing test which involves simulating a single arbitrary individual a te requires simulating a representative sample of participants in human subject research we carry out tes that attempt to replicate well established findings from prior studies we design a methodology for simulating tes and illustrate its use to compare how well different language models are able to reproduce classic economic psycholinguistic and social psychology experiments ultimatum game garden path sentences milgram shock experiment and wisdom of crowds in the first three tes the existing findings were replicated using recent models while the last te reveals a hyper accuracy distortion present in some language models including chatgpt and gpt 4 which could affect downstream applications in education and the arts keywords turing test large language models evaluation metrics code https github com microsoft turing experiments tree main | ai |
|
EditableLoopPedal | editablelooppedal an editable loop pedal a project for the embedded systems design icom4217 course using a dspic30f4013 university of puerto rico br mayaguez campus br electrical and computer engineering br editable loop pedal by armando j ortiz garc a team leader br jos a rodr guez rivera br edgardo acosta br br for prof carlos bernal br course icom 4217 090 br abstract musicians today are surrounded with all kinds of technology for the benefits of music composition musicians are not always in the company of other musicians when they want to practice on their own music is not always one track sound usually no one can play more than one instrument per person and this is a problem there is a mechanism that has been created for this need called a loop and that is to record a segment of audio and play it back infinitely this is used for a variety of applications like pattern creation practice etc the problem arises when creating loops especially when they are recorded on top of itself when it is desired to remove a previously recorded segment or edit it our goal is to be able to use a simple stompbox that can be a function loop sampler and be able to edit each track individually | os |
|
VgaCamera_ES_Project | vgacamera es project | os |
|
Crouton | crouton http crouton mybluemix net join the chat at https gitter im edfungus crouton https badges gitter im join 20chat svg https gitter im edfungus crouton utm source badge utm medium badge utm campaign pr badge utm content badge http crouton mybluemix net crouton is a dashboard that lets you visualize and control your iot devices with minimal setup essentially it is the easiest dashboard to setup for any iot hardware enthusiast using only mqtt and json makes iot devices easy to use no frontend knowledge needed flexible for wide variety of devices compatible with devices that use mqtt easy to use interface it is free running the project git clone https github com edfungus crouton git npm install grunt getting started to start we will need a device and a mqtt broker don t have either no worries because we have several demo options for you the device the device can be anything from hardware esp8266 to a python script as long as the device can use the mqtt protocol and encode decode json then it will work as a device while this guide is not about setting up specific devices we do have reference links for the esp8266 and python mqtt libraries esp8266 if you are using lua scripting then the nodemcu firmware has mqtt and json capabilities already built esp8266 lua nodemcu documentation on github https github com nodemcu nodemcu firmware wiki nodemcu api en sample code for the esp8266 which included a togglable and dimmable light refer to the client directory https github com edfungus crouton tree master clients page for more details python the package we suggest to use for python client is paho mqtt http www eclipse org paho it has the necessary mqtt libraries to get up and running python mqtt library paho mqtt http www eclipse org paho sample code for python dummy client which includes most of the dashboard elements refer to the client directory https github com edfungus crouton tree master clients page for more details test clients in addition to the sample code above we have a always running we try test client under the name crouton demo on the test mosquitto org public broker the status of the device can be checked here http crouton demo client mybluemix net and of course check out the getting started crouton gettingstarted for more details demo page detailing crouton test clients http crouton mybluemix net crouton gettingstarted mqtt broker mqtt broker takes care of delivering messages to its clients via a publish and subscribe model the mqtt broker required for crouton must support web sockets web sockets allow crouton to connect to it while there are free options to download and run your own mqtt broker there are also free to use public mqtt brokers that are already set up we default to mosquitto s publicly hosted broker free downloadable brokers mosquitto http mosquitto org emqttd http emqtt io free publicly hosted brokers mosquitto http test mosquitto org hivemq http www hivemq com try out remember crouton must connect to the web socket port of the mqtt broker while devices will usually connect to the regular port in the future we plan to have crouton s own mqtt broker which will have better up time and security in comparison to public brokers how it works after getting a device and a broker we simply need to hook everything up the beauty of crouton is that like the device crouton is also a client to the same mqtt broker as a result crouton does not rely on any additional centralized server than does your mqtt device in fact crouton is purely a frontend application with no 
backend services mqtt diagram with crouton and devices https raw githubusercontent com edfungus crouton master public common images mqtt png being a distributed system like the one shown above connection between the device and crouton is only via the mqtt broker the communication between crouton and the device is defined by a protocol that utilizes addresses and mqtt s last will and testament lwt feature once crouton and and the device are connected to the same mqtt broker we can use crouton to initiate the connection to the device the device will in turn send a json describing itself known as the deviceinfo to crouton connecting to crouton first have crouton and the device connected to the same mqtt broker the connection between the device and crouton will be initiated from crouton therefore the device needs to subscribe to its own inbox device should subscribe to the following inbox the device name deviceinfo every time the device successfully connects to the mqtt broker it should publish its deviceinfo this is needed for auto reconnection if crouton is waiting for the device to connect it will listen for the deviceinfo when the device comes back online device should publish deviceinfo json once connected outbox the device name deviceinfo deviceinfo the deviceinfo is the primary method for crouton to understand the device it is the first message the device will send to crouton to establish connection and also the message that describes the device and values to crouton the primary object is deviceinfo within deviceinfo there are several keys as follows json deviceinfo name kroobar endpoints bardoor title bar main door card type crouton simple text units people entered values value 34 description kroobar s iot devices status good name a string that is the name for the device this is same name you would use to add the device endpoints an object that configures each dashboard element crouton will show there can be more than one endpoint which would be key object pairs within endpoints description a string that describes the device for display to user only status a string that describes the status of the device for display to user only note both name and endpoints are required and must be unique to other names or endpoints respectively addresses addresses are what crouton and the device will publish and subscribe to they are also critical in making the communication between crouton and the devices accurate therefore there is a structure they should follow an address can be broken up into 3 sections inbox or outbox device name endpoint name inbox or outbox the box is relative to the device inbox is for messages going to the device and outbox is for messages from the device device name the name of the device the are targeting from name key value pair of deviceinfo endpoint name the name of the endpoint the are targeting from the key used in the key object pair in endpoints note all addresses must be unique to one mqtt broker therefore issues could be encounter when using public brokers where there are naming conflicts updating device values updating device values can come from crouton or the device the message payload will be a json that updates the value this json will be equivalent to the object of the key values within each endpoint however only values that are being updated needs to be updated all other values must be updated by the deviceinfo json json payload value 35 an entry in endpoints bardoor title bar main door card type crouton simple text units people entered values value 34 from 
crouton crouton has the ability to update the value of the device s endpoints via certain dashboard cards therefore the device needs to be subscribe to certain addresses detailed in the endpoints section below the payload from crouton is in the same format as the one coming from the device address inbox kroobar bardoor payload value 35 from device to update values on crouton from the device simply publish messages to the outbox of the endpoint which crouton is already subscribed to the payload is just the same as the one coming from crouton address outbox kroobar bardoor payload value 35 last will and testament lwt in order for crouton to know when the device has unexpectedly disconnected the device must create a lwt with the mqtt broker this a predefined broadcast that the broker will publish on the device s behalf when the device disconnects the payload in this case can be anything as long as the address is correct address outbox kroobar lwt payload anything note no endpoints can be named lwt they will conflict in the name space endpoints a device can have multiple endpoints each endpoint represents a dashboard card that will be displayed on the dashboard the device must subscribe to the inbox of each endpoint which may receive values from crouton for example a toggle switch on crouton may change a value on the device therefore a subscription is necessary however an alert button which sends only messages from device to crouton may not need a subscription because the device does not expect any values from crouton subscription address for endpoints on device inbox device name endpoint name upon receiving a new value from crouton the device must send back the new value or the appropriate value back to crouton this is because crouton will not reflect the new value change unless it is coming from the device therefore the value shown on crouton more accurately reflects the value on the device address outbox device name endpoint name payload value some new value here dashboard cards dashboard cards are visual and control elements which allow interaction with connected devices by simply adding a new endpoint in the deviceinfo json a new dashboard card will be automatically generated upon connection they strive to do several things provide a visual experience for iot devices modularity allows different combinations of different devices simplicity of define functionality for each type of card reflect the latest real value of the device the type of dashboard card for each endpoint is specified in the object of the endpoint under the key card type the values object is the object that is sent between crouton and the device to keep values up to date both of these fields are required for each endpoint json bardoor card type crouton simple text values any icons used come from font awesome https fortawesome github io font awesome icons and utilizes the same name for the icons also new cards are always coming so keep the suggestions coming simple cards simple cards are basic cards that have only one value therefore if they are buttons inputs etc there will only be one button inputs etc per card their functionality are fairly limited but by using multiple cards together they can still be powerful all simple card have the same prefix simple card prefix crouton simple card type simple text crouton simple text https raw githubusercontent com edfungus crouton master public common images crouton simple text png br simple text is used to display a value text or number on the dashboard from the device to crouton 
json device crouton name crouton simple text example bardoor units people entered optional values value 34 required card type crouton simple text required title bar main door optional simple input crouton simple input https raw githubusercontent com edfungus crouton master public common images crouton simple input png br simple input is similar to simple text except the user can update the value on the device from crouton there is no length restriction of the value by crouton json device crouton name crouton simple input example custommessage values value happy hour is now required card type crouton simple input required title billboard message optional simple slider crouton simple slider https raw githubusercontent com edfungus crouton master public common images crouton simple slider png br simple slider allows the user to select continuous values within a given range both the large number and the slider will attempt the give the real device value at all times except when the user is sliding json device crouton name crouton simple slider example barlightlevel values value 30 required min 0 required max 100 required units percent optional card type crouton simple slider required title bar light brightness optional simple button crouton simple button https raw githubusercontent com edfungus crouton master public common images crouton simple button png br simple button is one directional sending a signal with no meaningful value from crouton to the device however this is still a bi directional card because the button is only enable if value is true if the device updates the value of the card to false the button will be disabled json device crouton name crouton simple button example lastcall values value true required icons icon bell required card type crouton simple button required title last call bell optional simple toggle crouton simple toggle https raw githubusercontent com edfungus crouton master public common images crouton simple toggle png br simple toggle allows a boolean value to be changed by both crouton and the device in the larger value display priority for display is icon labels boolean text if no labels or icons are given the words true and false will be used the labels around the toggle is only defined by labels object json device crouton name crouton simple toggle example backdoorlock values value false required labels optional true locked false unlocked icons optional true lock false lock card type crouton simple toggle required title employee door optional chart cards these cards are for charts donut chart crouton chart donut 1 https raw githubusercontent com edfungus crouton master public common images crouton chart donut 1 png crouton chart donut 1 https raw githubusercontent com edfungus crouton master public common images crouton chart donut 2 png br a fairly flexible pie chart the labels and series values are in arrays the labels are optional must have at least an empty array and will not show if empty message is displayed in the center of the donut centersum defualt is false sums up all of the values and replaces message total is the value that will fill up the complete circle if sum of series is beyond total the extra parts will be truncated json device crouton name crouton chart donut example drinksordered values labels scotch rum coke shiner margarita other required series 10 20 30 10 30 required message optional total 100 required card type crouton chart donut title drinks ordered optional occupancy values labels required series 76 required total 100 required 
centersum true optional units optional card type crouton chart donut required title occupancy optional line chart crouton chart line https raw githubusercontent com edfungus crouton master public common images crouton chart line png br a simple line chart with multiple lines available the labels corresponds to the x axis values and the series corresponds to the y axis values multiple sets of x y values can be passed at once as long as the array length of labels and series are matched the reason why series is multidimensional is so that multiple lines can be drawn where each array in series corresponds to a line the update parameter is expected on update and holds a copy of values with the new labels and series within max refers to the maximum number of data points based on the x axis is shown low and high refers to the maximum y values expected it is suggested that labels and series be prepopulated with one set of x y value for each line json device crouton name crouton chart line example temperature values labels 1 required series 60 required update null required max 11 required low 58 required high 73 required card type crouton chart line required title temperature f optional 3 lines and 1 value values labels 1 series 60 2 543 update null 3 lines and 2 values values labels 1 2 series 60 62 2 4 543 545 update null to update temperature values labels null series null update labels 2 series 62 advanced cards these cards are a little bit more specific to certain applications rgb slider crouton rgb slider https raw githubusercontent com edfungus crouton master public common images crouton rgb slider png br rgb slider is three combined slider for the specific application of controlling a rgb led prepopulate the values for red green and blue by setting in values json device crouton name crouton rgb slider example discolights values red 0 required green 0 required blue 0 required min 0 required max 255 required card type crouton rgb slider required title rgb lights optional youtube stream crouton video youtube https raw githubusercontent com edfungus crouton master public common images crouton video youtube png br thank you wickwire https github com wickwire for this addition br add a youtube video or livestream to your dashboard this is great for having a dashboard of various security cameras json device crouton name crouton video youtube example youtubestream values youtubeid gznb3jq2yzo required card type crouton video youtube required title youtube stream optional | server |
|
CloudComputingCaseStudy | cloud computing case study a case study on cloud computing cloud computing case study cloud computing case study introduction introduction resources resources course information course information technologies technologies project status project status installation installation get repository get repository license license introduction a case study on cloud computing which studies about what is cloud computing history different types services offered by cloud computing pros and cons resources word document resources introductiontoengineeringcloudcomputingcasestudy docx introductiontoengineeringcloudcomputingcasestudy docx presentation resources introductiontoengineeringcloudcomputingcasestudypresentation pptx introductiontoengineeringcloudcomputingcasestudypresentation pptx course information course name introduction to engineering course code 15gn1004 course credits 3 academic year 2015 2016 technologies microsoft word and power point cloud computing project status case study completed installation get repository git git https github com msaf9 introductiontoengineering git cd introductiontoengineering license | case-study cloud-computing | cloud |
Shopify | shopify overview this repository contains the challenge code for the 2019 ios engineer internship at shopify repo should only be used by managers and recruiters shopify for assessment purposes screenshots img src screenshots home png width 250 height 500 img img src screenshots detail png width 250 height 500 img getting started run these commands to have the full project ready on your local machine git clone https github com mediboss shopify git pod install file folder architecture i followed the mvc architecure for the shopify challenge project root shopify models views controllers networking http services supporting files extensions info plist screenshots home png detail png shopifytests shopifyuitests readme md podfile podfile lock engineer medi assumani ios engineer built with swift 4 2 xcode kingfisher | os |
|
JanataHack-Computer-Vision-Hackathon | janatahack computer vision hackathon. Winning solution for the janatahack computer vision hackathon: private lb rank 1. Leaderboard: img src cvlb.png. Emergency vs non-emergency vehicle classification: fatalities due to traffic delays of emergency vehicles such as ambulances and fire brigades are a huge problem in daily life. We often see that emergency vehicles face difficulty passing through traffic, so classifying a vehicle into an emergency or non-emergency category can be an important component in traffic monitoring as well as self-driving car systems, since reaching their destination on time is critical for these services. In this problem you will be working on classifying vehicle images as either belonging to the emergency vehicle or non-emergency vehicle category; for the same you are provided with the train and the test dataset. Emergency vehicles usually include police cars, ambulances and fire brigades. Data description: train.zip contains 2 csvs and 1 folder containing image data: 1. train.csv (image_names, emergency_or_not) contains the image name and correct class for the 1646 (70%) train images; 2. images contains 2352 images for both train and test sets. test.csv (image_names) contains just the image names for the 706 (30%) test images. sample_submission.csv (image_names, emergency_or_not) contains the exact format for a valid submission: 1 for emergency vehicle, 0 for non-emergency vehicle. Data at a glance: img src data_df.png. Data photos: img src data_photos.png. Approach: 1. regular transformation and data augmentation; 2. used a fastai resnet101 model. Tools used: 1. python for programming; 2. numpy library for methodology; 3. opencv for reading the images; 4. fastai library for the model; 5. matplotlib and seaborn were used for plotting and analyzing the data. Data leak: sorted images show everything; img src sorted_data_photos.png. Competition result: rank 5th on the public lb and 1st on the private lb. Link to leaderboard: https datahack analyticsvidhya com contest janatahack computer vision hackathon leaderboard | ai |
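As a rough illustration of the stated approach (fastai with a pretrained resnet101 plus data augmentation), here is a minimal sketch assuming the fastai v1 API and the train.csv / images layout from the data description. The hyperparameters and data-loading details are assumptions, not the exact winning configuration.

```python
# Sketch of transfer learning with fastai v1; paths, split, and epochs
# are illustrative assumptions, not taken from this repository.
from fastai.vision import (ImageDataBunch, cnn_learner, models,
                           get_transforms, accuracy)

# train.csv's first two columns are image_names and emergency_or_not,
# matching from_csv's default filename/label columns.
data = ImageDataBunch.from_csv(
    'train', folder='images', csv_labels='train.csv',
    ds_tfms=get_transforms(), size=224, valid_pct=0.2)

learn = cnn_learner(data, models.resnet101, metrics=accuracy)
learn.fit_one_cycle(5)        # fine-tune the pretrained network
preds, _ = learn.get_preds()  # class probabilities on the validation set
```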
|
nativewind | div align center p align center a href https nativewind dev target blank img src https nativewind dev img logo svg alt tailwind css width 70 height 70 h1 align center style color red nativewind h1 a p img alt github branch checks state src https img shields io github checks status marklawlor nativewind next img alt npm src https img shields io npm v nativewind img alt npm src https img shields io npm dt nativewind img alt github src https img shields io github license marklawlor nativewind div br nativewind uses tailwind css https tailwindcss com as high level scripting language to create a universal design system styled components can be shared between all react native platforms using the best style engine for that platform e g css stylesheet or stylesheet create it s goals are to to provide a consistent styling experience across all platforms improving developer ux component performance and code maintainability nativewind processes your styles during your application build and uses a minimal runtime to selectively apply reactive styles eg changes to device orientation light dark mode in action a href https snack expo dev name hello world dependencies react react native nativewind latest platform web supportedplatforms ios android web code import 20react 20from 20 react 3b 0aimport 20 7b 20withexposnack 20 7d 20from 20 nativewind 3b 0a 0aimport 20 7b 20text 2c 20view 20 7d 20from 20 react native 3b 0aimport 20 7b 20styled 20 7d 20from 20 nativewind 3b 0a 0aconst 20styledview 20 3d 20styled view 0aconst 20styledtext 20 3d 20styled text 0a 0aconst 20app 20 3d 20 20 3d 3e 20 7b 0a 20 20return 20 0a 20 20 20 20 3cstyledview 20classname 3d 22flex 1 20items center 20justify center 22 3e 0a 20 20 20 20 20 20 3cstyledtext 20classname 3d 22text slate 800 22 3e 0a 20 20 20 20 20 20 20 20try 20editing 20me 20 f0 9f 8e 89 0a 20 20 20 20 20 20 3c 2fstyledtext 3e 0a 20 20 20 20 3c 2fstyledview 3e 0a 20 20 3b 0a 7d 0a 0a 2f 2f 20this 20demo 20is 20using 20a 20external 20compiler 20that 20will 20only 20work 20in 20expo 20snacks 0a 2f 2f 20you 20may 20see 20flashes 20of 20unstyled 20content 2c 20this 20will 20not 20occur 20under 20normal 20use 0a 2f 2f 20please 20see 20the 20documentation 20to 20setup 20your 20application 0aexport 20default 20withexposnack app 3b 0a picture source media prefers color scheme dark srcset https user images githubusercontent com 3946701 178458845 c9ac0299 6809 4002 99a0 78030f27b06a png img src https user images githubusercontent com 3946701 178458837 df03c080 eb13 4dcc 9080 186b061a8678 png picture a features works on all rn platforms uses the best style system for each platform uses the tailwind css compiler styles are computed at build time small runtime keeps your components fast babel plugin for simple setup and improving intellisense support respects all tailwind config js settings including themes custom values plugins dark mode arbitrary classes media queries pseudo classes hover focus active on compatible components styling based on parent state automatically style children based upon parent pseudo classes children styles create simple layouts based upon parent class documentation all documentation is on our website https nativewind dev | react-native tailwindcss css react tailwind react-native-web nativewind | os |
UCSD_Data_Science_and_Engineering | ucsd data science and engineering in the data science and engineering program engineering professionals combine the skills of software programmer database manager and statistician to create mathematical models of the data identify trends deviations then present them in effective visual ways that can be understood by others data scientists unlock new sources of economic value provide fresh insights into science and inform decision makers by analyzing large diverse complex longitudinal and distributed data sets generated from instruments sensors internet transactions email video and other digital sources students entering the mas program for a degree in data science and engineering will undertake courses in programming analysis and applications management and visualization this program requires three foundational courses four core courses and two electives totaling thirty four units plus a capstone team project course of four units for a total of thirty eight units | server |
|
aws-iot-elf | aws iot elf: an aws iot python example client that strives to be extremely low friction. ELF overview: the aws iot elf python example client demonstrates how one can create things, send messages to things, subscribe to topics to receive messages from things, and clean up things in the aws iot service using the aws sdk for python (aka boto3). This example also demonstrates how to bring together boto3 and the standard aws iot device client in a straightforward fashion.

Create thing(s): once the getting started guide below has been completed, to create a single thing in the aws iot service, simply type `python elf.py create`. To create a given number of things (eg. 3) in the aws iot service, type `python elf.py create 3`.

Send messages: to send messages using previously created things, type `python elf.py send`.

Subscribe to topics: to receive messages from a topic, type `python elf.py subscribe`.

Clean thing(s): to clean up all previously created things, type `python elf.py clean`.

Getting started: to get this example working with python 2.7.11 on a flavor of unix or mac os, first ensure you have python 2.7.11 on your machine by executing this command line command: `python --version`. If you don't have python locally, homebrew (http brew sh) can help you get the latest python on mac os; python.org (https www python org downloads source) can start you off for python on linux/unix flavors. If you're starting out on windows, follow the windows getting started (master/win/README.md) and return to this point after completion. Alternatively, elf can be run as a docker container; instructions for building an elf docker image can be found in the docker getting started (master/docker/README.md).

Now, with a working python and git installation, clone this repo to your local machine:

```
cd ~
mkdir dev
cd dev
git clone https://github.com/awslabs/aws-iot-elf.git
```

This will create a local folder dev/aws-iot-elf. To keep the aws iot elf python dependencies separate, you probably want to install virtualenv (https virtualenv pypa io en stable). If you choose to install virtualenv, then create a virtual environment (`cd ~/dev/aws-iot-elf` then `virtualenv venv`) and then activate that virtual environment: `source ~/dev/aws-iot-elf/venv/bin/activate` (or on windows: `venv\Scripts\activate`). Now install the aws iot elf dependencies into your local environment using these commands: `cd ~/dev/aws-iot-elf` then `pip install -r requirements.txt`.

Next, install (http docs aws amazon com cli latest userguide installing html) and configure (http docs aws amazon com cli latest userguide cli chap getting started html) the aws cli. When you configure the aws cli, the api keys you install as the default profile or as a named profile (http docs aws amazon com cli latest userguide cli chap getting started html cli multiple profiles) should have at least the following privileges in an associated iam policy:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ELFStmt20160531",
      "Effect": "Allow",
      "Action": [
        "iot:AttachPrincipalPolicy",
        "iot:AttachThingPrincipal",
        "iot:CreateKeysAndCertificate",
        "iot:CreatePolicy",
        "iot:CreateThing",
        "iot:CreateTopicRule",
        "iot:DeleteCertificate",
        "iot:DeletePolicy",
        "iot:DeleteThing",
        "iot:DeleteTopicRule",
        "iot:DescribeEndpoint",
        "iot:DetachPrincipalPolicy",
        "iot:DetachThingPrincipal",
        "iot:ReplaceTopicRule",
        "iot:UpdateCertificate",
        "iot:UpdateThing"
      ],
      "Resource": ["*"]
    }
  ]
}
```

Now, to authenticate with aws iot (http docs aws amazon com iot latest developerguide identity in iot html) using server authentication, you will need to download the verisign root ca and save it as a file aws-iot-rootCA.crt, or if you are on linux/unix/macos simply execute this command in the same directory as elf.py:

```
curl -o aws-iot-rootCA.crt https://www.symantec.com/content/en/us/enterprise/verisign/roots/VeriSign-Class%203-Public-Primary-Certification-Authority-G5.pem
```

Lastly, you will probably want to read through the troubleshooting section at the bottom of these instructions, just in case you experience a bump. To validate that the aws iot elf is set up correctly, execute `python elf.py create` and `python elf.py clean`. You should not see any errors. Congratulations, the elf and a minimal development environment are now configured on your machine.

Detailed help

Defaults: the aws iot elf uses the following defaults: region: us-west-2; mqtt topic: elf/thing; message: IoT ELF Hello; send message duration: 10 seconds; topic subscription duration: 10 seconds.

Create thing(s): using the create command will invoke the create_things cli (https github com awslabs aws iot elf blob master elf py l331) function with the given command line arguments. To create a given number of things (eg. 3) in the aws iot service in a specific region, type `python elf.py --region <region_name> create 3`. This command results in three numbered things (thing_0, thing_1, and thing_2) being created in <region_name>. To create a single thing in the aws iot service using a different aws cli profile, type `python elf.py --profile <profile_name> create`. To create a single thing in the aws iot service in a specific region using a different aws cli profile, type `python elf.py --region <region_name> --profile <profile_name> create`.

Calling the create command with a region and/or profile cli option means that the things will be created in that region and will use the corresponding aws cli named profile (http docs aws amazon com cli latest userguide cli chap getting started html cli multiple profiles) api key and secret key pair. Additional send, subscribe, and clean commands should use the same options. In this way the aws iot elf will send messages to the same region and with the same profile used to create the things in the first place. Once clean is called successfully, different region and/or profile option values can be used to orient the aws iot elf differently.

When looking through the create_things cli function, the core of the create command is shown in these lines of code:

```python
# Generate a numbered Thing name
t_name = thing_name_template.format(i)
# Create a Key and Certificate in the AWS IoT Service per Thing
keys_cert = iot.create_keys_and_certificate(setAsActive=True)
# Create a named Thing in the AWS IoT Service
iot.create_thing(thingName=t_name)
# Attach the previously created Certificate to the created Thing
iot.attach_thing_principal(
    thingName=t_name, principal=keys_cert['certificateArn'])
```

Everything else around this code is to prepare for or record the results of the invocation of these functions. Note: the thing policy is created and attached in the send step. Lastly, when you run the create command you can see in the example output below that a thing named thing_0 was created and associated with a certificate in region us-west-2:

```
$ python elf.py create
iot-elf INFO read ELF ID from config: 95bb3ad5-95d1-4c9a-b818-b23f92fcde30
iot-elf INFO create_things ELF creating 1 thing
iot-elf INFO Thing: thing_0 associated with cert arn:aws:iot:us-west-2:example1261:cert/exampleexampleexampleb326b72e76c2f67ccd6f8ec15515a9bd28168b2cc42
iot-elf INFO Thing Name: thing_0 and PEM file: dev/aws-iot-elf/misc/thing_0.pem
iot-elf INFO Thing Name: thing_0 public key file: dev/aws-iot-elf/misc/thing_0.pub
iot-elf INFO Thing Name: thing_0 private key file: dev/aws-iot-elf/misc/thing_0.prv
iot-elf INFO wrote 1 things to config file: dev/aws-iot-elf/misc/things.json
iot-elf INFO create_things ELF created 1 things in region: us-west-2
```

Send messages: using the send command will invoke the send_messages cli (https github com awslabs aws iot elf blob master elf py l405) function with the given command line arguments. To send a specific message on a specific topic for a specified duration in another region, type `python elf.py --region <region_name> send --topic elf/example --duration <num_seconds> 'Example ELF message'`. To send a json payload as a message read from a file named example.json, type `python elf.py --region <region_name> send --json-message example.json --append-thing-name`, which will result in messages similar to the following sent, as shown in the example output:

```
iot-elf INFO ELF thing_0 posting message: {"some_thing": "another_thing", "ts": 1465257133.82} on topic: elf/thing_0
iot-elf INFO ELF thing_0 posting message: {"some_thing": "another_thing", "ts": 1465257134.82} on topic: elf/thing_0
```

To send a json payload as a thing_0 shadow update read from a file named example-shadow.json, type `python elf.py send --json-message example-shadow.json --topic '$aws/things/thing_0/shadow/update'`. Note: the quotes around the topic value are important, otherwise the $aws portion of the value will possibly be interpreted as a shell variable.

Subscribe to topic(s): using the subscribe command will invoke the subscribe cli (https github com awslabs aws iot elf blob master elf py l454) function with the given command line arguments. To subscribe to messages at the default topic root of elf for a specified duration, type `python elf.py subscribe --duration <num_seconds>`. To subscribe to messages at the default topic root of elf for a specified duration in another region, type `python elf.py --region <region_name> subscribe --duration <num_seconds>`.

To send messages from elf_y to elf_x through the aws iot service, open two command line windows (aka elf_x and elf_y). In the elf_x window type `python elf.py subscribe --duration 30 --append-thing-name` and in the elf_y window type `python elf.py send --duration 15 --append-thing-name`, which in a matter of seconds will result in messages shown in the elf_x window similar to this example output:

```
iot-elf INFO received message: {"msg": "IoT ELF Hello", "ts": 1468604634.27} from topic: elf/thing_0
iot-elf INFO received message: {"msg": "IoT ELF Hello", "ts": 1468604635.28} from topic: elf/thing_0
```

Clean thing(s): using the clean command will invoke the clean_up cli (https github com awslabs aws iot elf blob master elf py l500) function with the given command line arguments. This will remove all resources that were created by elf in the aws iot service and on the local file system. To clean up all previously created resources, type `python elf.py clean`. If you want to force only a clean up of the locally stored files without cleaning the resources created in the aws iot service, type `python elf.py clean --only-local`.

When looking through the clean_up cli function, the core of the clean command is shown in these lines of code:

```python
# ..snip..
iot.detach_principal_policy(
    policyName=thing[policy_name_key], principal=thing['certificateArn'])
# ..snip..
iot.delete_policy(policyName=thing[policy_name_key])
# ..snip..
iot.update_certificate(
    certificateId=thing['certificateId'], newStatus='INACTIVE')
# ..snip..
iot.detach_thing_principal(
    thingName=thing['name'], principal=thing['certificateArn'])
# ..snip..
iot.delete_certificate(certificateId=thing['certificateId'])
# ..snip..
iot.delete_thing(thingName=thing['name'])
```

All steps prior to delete_thing are required in order to detach and clean
up a fully functional and authorized thing help for additional detailed help and configuration options enter python elf py help or python elf py create help python elf py send help python elf py subscribe help python elf py clean help troubleshooting q when i type in my command line i get a parameter parsing error for example this command line gives an error python elf py clean region us east 1 usage elf py h region region profile profile name create send clean elf py error unrecognized arguments region us east 1 a this example error is caused when using the elf command line because the order of the commands and options matters the general structure of all elf commands are python elf py global elf options command command specific options the global elf options should be the same across commands the command specific options are specific to any given command you can learn more about the various options by getting detailed help on the command line q i see an unknown protocol error similar to the following why file snip python2 7 ssl py line 808 in do handshake self sslobj do handshake ssl sslerror ssl unknown protocol unknown protocol a the python installation in use does not support tlsv1 2 https en wikipedia org wiki transport layer security tls 1 2 which is required by the aws iot service upgrade the python installation to 2 7 11 q i seem to need to upgrade my openssl and python installations why a a version of python 2 7 ssl https docs python org 2 library ssl html with support for open ssl 1 0 1 is necessary to support the security posture and specifically tlsv1 2 https en wikipedia org wiki transport layer security tls 1 2 required by the aws iot service q when i try to send messages i see a resourcealreadyexistsexception similar to the following what might be wrong example botocore exceptions clienterror an error occurred resourcealreadyexistsexception when calling the createpolicy operation policy cannot be created name already exists name policy thing 0 a in this example exception for some reason the policy name policy thing 0 already exists and is colliding with the new policy to be created and applied to the thing the old existing policy needs to be detached http docs aws amazon com cli latest reference iot detach principal policy html and deleted http docs aws amazon com cli latest reference iot delete policy html manually using the aws cli or aws iot console q when i try to create send or clean i see an accessdeniedexception similar to the following what might be wrong example botocore exceptions clienterror an error occurred accessdeniedexception when calling the createkeysandcertificate operation user arn aws iam xxxxxxyyyyyy user elf is not authorized to perform iot createkeysandcertificate a in this example exception the user elf does not have enough privilege to perform the iot createkeysandcertificate action on the aws iot service make sure the privileges as described in the getting started section are associated with the user or profile and specifically the api keys experiencing the exception q when i try to send messages using my recently created things i see a resourcenotfoundexception similar to the following what might be wrong example botocore exceptions clienterror an error occurred resourcenotfoundexception when calling the attachprincipalpolicy operation the certificate given in the principal does not exist a in this example exception the certificate recorded into the aws iot elf config file does not exist in the region most likely the create command was called 
with a region option that is not the same as the region used when calling the send command related resources aws iot getting started http docs aws amazon com kinesis latest dev introduction html aws sdk for python http aws amazon com sdkforpython aws iot device sdk for python https github com aws aws iot device sdk python apache 2 0 license http aws amazon com apache2 0 | server |
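The ResourceAlreadyExistsException answer above says a leftover policy such as policy_thing_0 must be detached and deleted manually via the AWS CLI or console. For completeness, here is a hedged boto3 sketch of that same manual cleanup; the policy name and region are illustrative placeholders, not values from your account.

```python
# Manual cleanup sketch for a leftover AWS IoT policy (names illustrative).
import boto3

iot = boto3.client('iot', region_name='us-west-2')
policy_name = 'policy_thing_0'

# A policy can only be deleted after it is detached from every principal.
targets = iot.list_targets_for_policy(policyName=policy_name)
for principal in targets['targets']:
    iot.detach_principal_policy(policyName=policy_name, principal=principal)

iot.delete_policy(policyName=policy_name)
```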
|
EasyML | easy machine learning. What is easy machine learning? Machine learning algorithms have become the key components in many big data applications. However, the full potential of machine learning is still far from being realized, because using machine learning algorithms is hard, especially on distributed platforms such as hadoop and spark. The key barriers come not only from the implementation of the algorithms themselves, but also from the processing for applying them to real applications, which often involves multiple steps and different algorithms. Our platform, easy machine learning, presents a general-purpose dataflow-based system for easing the process of applying machine learning algorithms to real world tasks. In the system, a learning task is formulated as a directed acyclic graph (DAG) in which each node represents an operation (e.g. a machine learning algorithm) and each edge represents the flow of the data from one node to its descendants. The task can be defined manually or be cloned from existing task templates. After submitting a task to the cloud, each node will be automatically scheduled to execute according to the DAG. A graphical user interface is implemented for letting users create, configure, submit, and monitor a task in a drag-and-drop manner. Advantages of the system include: 1. lowering the barriers to defining and executing machine learning tasks; 2. sharing and re-using the implementations of the algorithms, the job DAGs, and the experimental results; 3. seamlessly integrating stand-alone algorithms as well as distributed algorithms in one task. The system consists of three major components: a distributed machine learning library, which implements not only popularly used machine learning algorithms but also algorithms for data pre/post-processing, data format transformation, feature generation, performance evaluation, etc. (these algorithms are mainly implemented based on spark); a GUI-based machine learning studio system, which enables users to create, configure, submit, monitor, and share their machine learning processes in a drag-and-drop manner (all of the algorithms in the machine learning library can be accessed and configured in the studio system; they are the key building blocks for constructing machine learning tasks); (figure: img/lr_dag.png, an example dataflow DAG) and a cloud service for executing the tasks. We build the service based on the open-source big data platforms hadoop and spark. In order to build the platform we organised a cluster of servers on docker. After receiving a task DAG from the GUI, each node will be automatically scheduled to run when all of its dependent data sources are ready; the algorithm corresponding to the node will be scheduled to run on linux, spark, or map-reduce according to its implementation. (figure: img/docker_structure.png, docker structure) How to get involved in our project: pull the project and prepare the necessary environments and development utilities; follow the steps in quickstart.md (https github com ict bda easyml blob master quickstart md) and you can set up our system on your computer. How to use easy machine learning studio: after you have run easy ml you can log in via http localhost 18080 emlstudio html with our official account bdaict hotmail com and password bdaict. For the best user experience it is recommended to use chrome. (figure: img/home_page.png, homepage) As shown in the following figure, the users can create a machine learning task (a dataflow DAG) with the algorithms and data sets listed in the left panel of the page. They can choose to click the algorithms and data sets listed in the program and data panels. They can also click the job panel, select an existing task, clone it, and make necessary modifications. The users can configure the task information and parameter values of each node in the right panel. The nodes in the task can correspond to either a stand-alone linux program or a distributed program running on spark or hadoop map-reduce. (figure: img/job_construct.png, job construction) The task is submitted to run on the cloud after clicking the submit button. The status of each node is indicated with different colors, as shown in the following figure. (figure: img/job_submit.png, job submission) Users can right-click on the green output port of a finished node to preview the output data, and can check the stdout and stderr logs from the right-click menu of each finished node as well. The users may check the outputs of a node by right-clicking the corresponding output ports; the standard output and standard error information printed during the execution can be checked by right-clicking the corresponding nodes and selecting the menu items show stdout and show stderr. (figure: img/job_stdout.png, job stdout) A finished task, whether successful or not, can be further modified and resubmitted to run, as shown in the following figure. Our system will only schedule the influenced nodes to run; the outputs of uninfluenced nodes are directly reused to save running time and system resources. (figure: img/job_reuse_submit.png, job resubmission) The users can upload their own algorithm packages and data sets for creating their own tasks, or share them with other users. Clicking the upload program button opens a popup window that allows the users to specify the necessary information of the algorithm package, including the name, the category, the description, the command line pattern string, etc., as shown in the following figure. The most important thing is to write the command line pattern string in the predefined format: it defines the input ports, output ports, and parameter settings of a node. We developed a tool in the panel for helping users write the command line string patterns. By clicking the upload data button, users can upload a data set in a similar way to uploading an algorithm package. (figure: img/upload_program.png, uploading a program) How to experience our system: we provide an online service for you to experience our system. You can register your own account or use our official account to log in. The websites of the system are as below: outside ict you can visit http 159 226 40 104 18080 dev; inside ict you can visit http 10 60 0 50 18080 dev. If you have any advice or problems when you experience our system, welcome to contact us: you can leave us a message or send an email to bdaict hotmail com. Thank you for your advice. Papers and presentations: 1. easyml: ease the process of machine learning with data flow (http www bigdatalab ac cn junxu publications sosp2017 aisys easyml pdf), sosp ai system workshop, shanghai, oct 28 2017. 2. tianyou guo, jun xu, xiaohui yan, jianpeng hou, ping li, zhaohui li, jiafeng guo, and xueqi cheng. ease the process of machine learning with dataflow (http www bigdatalab ac cn junxu publications cikm2016 bdademo pdf), proceedings of the 25th acm international conference on information and knowledge management (cikm 16), indianapolis, usa, pp. 2437-2440, 2016. Acknowledgements: the following people contributed to the development of the easyml project: jun xu, school of information, renmin university of china (homepage: http info ruc edu cn academic professor php teacher id 169); xiaohui yan (homepage: http xiaohuiyan github io); xinjie chen, institute of computing technology, chinese academy of sciences; zhaohui li, institute of computing technology, chinese academy of sciences; tianyou guo, sougou inc; jianpeng hou, google china; ping li, tencent wechat; jiashuo cao, chengdu university of information technology; dong huang, university of chinese academy of sciences; xueqi cheng, institute of computing technology, chinese academy of sciences (homepage: http www bigdatalab ac cn cxq). | machine-learning-studio machine-learning-platform learning-platform machine-learning big-data-analytics | ai |
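To make the scheduling rule above concrete (a node runs as soon as all of its dependent data sources are ready), here is a toy Python sketch of topological DAG execution. This is only an illustration of the idea, not EasyML's actual scheduler, which dispatches each node to linux, spark, or map-reduce.

```python
# Toy sketch: run each DAG node once all of its upstream nodes finish.
from collections import deque

def run_dag(nodes, edges, run):
    """nodes: iterable of ids; edges: list of (src, dst); run: callable."""
    indegree = {n: 0 for n in nodes}
    children = {n: [] for n in nodes}
    for src, dst in edges:
        indegree[dst] += 1
        children[src].append(dst)
    ready = deque(n for n, d in indegree.items() if d == 0)
    while ready:
        node = ready.popleft()
        run(node)  # e.g. submit the node's algorithm to the cluster
        for child in children[node]:
            indegree[child] -= 1
            if indegree[child] == 0:
                ready.append(child)

run_dag(['load', 'train', 'eval'],
        [('load', 'train'), ('train', 'eval')], run=print)
```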
data_engineering_tools | data engineering tools. A repo cataloging data engineering tools, comparing cloud, enterprise, and open-source options. This is meant to serve as a reference for folks to try and use data engineering tools and frameworks without relying on cloud computing resources and services. Data engineering is a challenging field to get into due to the limitations you have of working in data before getting work experience in data. My hope is to give you the resources you need, starting with listing out the tools that you can try on your own; they are basically the same throughout.

| Functionality | Description | Open Source | Enterprise / Cloud Service |
| --- | --- | --- | --- |
| data ingestion | the process of importing data from external sources into a data storage system | apache nifi, apache flume, meltano, airbyte | aws glue, azure data factory, meltano, airbyte |
| data transformation | the process of cleaning, normalizing, and converting data into a format suitable for analysis | apache beam, apache spark, dbt | aws glue, azure databricks, dbt enterprise |
| data storage | the process of storing data in a structured or unstructured manner, often in a distributed system | apache hadoop hdfs, apache cassandra | aws s3, azure blob storage |
| data processing | the process of executing computations on data, such as aggregations, filtering, and machine learning algorithms | apache flink, apache spark | aws emr, azure hdinsight |
| data visualization | the process of creating visual representations of data, such as charts, graphs, and maps | apache superset | tableau, google data studio |
| data warehousing | the process of storing and organizing data in a central location for reporting and analysis | apache hive, apache druid | aws redshift, azure synapse analytics, google bigquery |
| data governance | the process of managing and governing data within an organization, including security, privacy, and compliance | apache ranger, apache atlas | aws glue, azure purview, google cloud datapolicy manager |
| data lineage | the process of tracking the origin and movement of data within an organization | apache atlas, apache falcon | aws glue, azure purview, google cloud datapolicy manager |
| data quality | the process of ensuring that data is accurate, complete, and consistent | apache datafu, talend | talend, informatica, google cloud data quality services |
| data catalog | a system for organizing and storing metadata about data assets within an organization | apache hive metastore, apache atlas | aws glue, azure purview, google cloud datapolicy manager |

| cloud |
|
mqtt-smarthome | mqtt smarthome. Note that this project is not associated with or endorsed by http mqtt org. Chat: https mqtt smarthome slack com (https join slack com t mqtt smarthome shared invite enqtmjy5odmyndy1mzq3ltblodq5yte2zmjkytviogi0mgqyoge1yjjmzjnmmgqzy2fjn2i1ndy2mtrhzwu0zwe4zgrlmdhimgq4ngzkodq). Table of contents: this project is an architectural proposal for using mqtt as the low-level central message bus in smarthome applications. architecture.md contains the architectural proposal; software.md lists software written with this proposal in mind or otherwise usable; devices.md lists devices developed with this proposal in mind or otherwise usable. Getting started with mqtt smarthome, homematic, and node-red: howtos/homematic.md is a little howto on installing and using mqtt smarthome with hm2mqtt.js, node-red, and node-red-dashboard. | iot mqtt smarthome | server |
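As a taste of what the proposal in architecture.md looks like in practice, here is a small Python sketch using paho-mqtt. The hm/status and hm/set topic names, the retained-status pattern, and the JSON payload fields are assumptions modeled on the proposal's conventions; check architecture.md for the authoritative details.

```python
# Illustration of a status/set topic pair on a local broker (assumed).
import json
import paho.mqtt.publish as publish

BROKER = 'localhost'  # assumed local Mosquitto broker

# Device side: report the current state retained, so a dashboard that
# connects later still sees the last known value.
publish.single('hm/status/livingroom/light',
               json.dumps({'val': True, 'ts': 1465257133000}),
               retain=True, hostname=BROKER)

# Controller side: request a state change; the device is expected to
# confirm it by updating hm/status/livingroom/light.
publish.single('hm/set/livingroom/light',
               json.dumps({'val': False}), hostname=BROKER)
```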
Sublime-Text-Plugins-for-Frontend-Web-Development | div align center img src preview gif alt previews sublime in use with some of the used plugins div sublime text 3 plugins for frontend web development sublime is great and for many still the best text editor available but out of the box it lacks some features that modern competitors have already built in plugins help to stay ahead but it s a hassle to keep up with all of them in order to help you i compiled a list of plugins i use for my daily frontend web development if you know plugins that should be on this list just open an issue this list was shared by among others smashing magazine https twitter com smashingmag status 857784722373701632 umar hansa https twitter com umaar status 855385340105904128 frontendfocus http frontendfocus co issues 291 frontenddaily https twitter com frontenddaily status 868137687546568704 speckyboy https twitter com speckyboy status 864145924053970945 table of contents 1 plugins plugins 1 administrative administrative 2 general general 3 javascript javascript 4 html css htmlcss 5 linter linter 6 other other 2 themes themes 3 settings settings a name plugins 1 plugins a name administrative i administrative these plugins are kind of meta because they are not concerned with writing code package control https packagecontrol io packages package 20control this package enables you to install other packages since build 3124 you can install it within sublime via em tools em em install package control em advancednewfile https packagecontrol io packages advancednewfile a better way to create new files for instance it automatically creates a folder when needed sidebarenhancements https packagecontrol io packages sidebarenhancements adds features such as renaming to the sidebar a file icon https packagecontrol io packages a 20file 20icon add icons to the files in the sidebar projectmanager https packagecontrol io packages projectmanager organizing project files by putting them in a central location todoreview https packagecontrol io packages todoreview scans files for todo s and more findkeyconflicts https packagecontrol io packages findkeyconflicts key conflicts are inevitable when you use a lot of plugins editor config https packagecontrol io packages editorconfig maintain consistent coding styles between different editors sync settings https packagecontrol io packages sync 20settings keep sublime settings in sync via github gist package syncing https packagecontrol io packages package 20syncing keep all you settings packages etc in sync via dropbox and co sftp https packagecontrol io packages sftp transfer files to a server via ftps and sftp the plugin is like sublime nagware https en wikipedia org wiki shareware nagware you can use it for free but get a reminder to buy a licence terminalview https packagecontrol io packages terminalview a linux macos plugin for sublime text 3 that allows for terminals inside editor views a name general ii general useful for all languages all autocomplete https packagecontrol io packages all 20autocomplete indexes all open files for auto completion brackethighlighter https packagecontrol io packages brackethighlighter improves the already built in highlighting terminal https packagecontrol io packages terminal open terminal with current working directory set to the directory of the open file on a hot key aligntab https packagecontrol io packages aligntab align your code by or your own regex gitgutter https packagecontrol io packages gitgutter displays modified lines in the gutter left 
to the line numbers git https packagecontrol io packages git includes some git commands gitsavvy https packagecontrol io packages gitsavvy full git and github integration gitignore https packagecontrol io packages gitignore fetches templates for the gitignore provided by github https github com github gitignore local history https packagecontrol io packages local 20history keep a local history of your files dashdoc https packagecontrol io packages dashdoc open current selection in dash https kapeli com dash on a hot key text pastry https packagecontrol io packages text 20pastry extend the power of multiple selections with features such as incremental numbers and date ranges a name javascript iii javascript tern for sublime https packagecontrol io packages tern for sublime static javascript code analyzer with auto completion function argument hints go to definition and more the installation and configuration can be a little bit tricky but it s worth it choose tern over sublimecodeintel https packagecontrol io packages sublimecodeintel unmaintained and javascript completions https packagecontrol io packages javascript 20completions buggy javascript enhancements https packagecontrol io packages javascript 20enhancements another plugin that gives auto completion and alltogether improves the javascript development environment importjs https packagecontrol io packages importjs automatically manage your module imports javascript nodejs snippets https packagecontrol io packages javascript 20 26 20nodejs 20snippets jsprettier https packagecontrol io packages jsprettier integration of prettier https github com prettier prettier the opinionated javascript formatter console wrap https packagecontrol io packages console 20wrap fast way to log to console doxydoxygen https packagecontrol io packages doxydoxygen generate code documentation blocks for your functions babel https packagecontrol io packages babel syntax definitions for es6 javascript with react jsx extensions typescript https packagecontrol io packages typescript elm language support https packagecontrol io packages elm 20language 20support a name htmlcss iv html css sass https packagecontrol io packages sass sass is a preprocessor extending css and this plugins adds the language support sasssolutions https packagecontrol io packages sasssolution auto complete for variables and mixins from your settings scss file css3 https packagecontrol io packages css3 replaces the built in css support with a more up to date one includes cssnext http cssnext io support follow the instructions to make it work properly emmet https packagecontrol io packages emmet allows you to write html very fast you have to learn their way though color highlighter https packagecontrol io packages color 20highlighter a name linter v linter linters help you to spot mistakes in your code early on in order to make them work properly check the instructions in the packages for some you have to install additional tools sublimelinter https packagecontrol io packages sublimelinter sublimelinter html tidy https packagecontrol io packages sublimelinter html tidy sublimelinter contrib stylelint https packagecontrol io packages sublimelinter contrib stylelint for css choose stylelint over sublimelinter csslint https packagecontrol io packages sublimelinter csslint sublimelinter contrib scss lint https packagecontrol io packages sublimelinter contrib scss lint sublimelinter contrib eslint https packagecontrol io packages sublimelinter contrib eslint sublimelinter flow https 
packagecontrol io packages sublimelinter flow; sublimelinter contrib elm make (https packagecontrol io packages sublimelinter contrib elm make); sublimelinter json (https packagecontrol io packages sublimelinter json).

vi. Other: markdown preview (https packagecontrol io packages markdown 20preview); advanced csv (https packagecontrol io packages advanced 20csv); live server (https packagecontrol io packages liveserver).

2. Themes: the built-in themes do not support recent syntax such as es2015. In the following i list some i have tested: solarized color scheme (https packagecontrol io packages solarized 20color 20scheme) replaced the outdated built-in one; the two themes installed by babel (https packagecontrol io packages babel): monokai phoenix and next; oceanic next color scheme (https packagecontrol io packages oceanic 20next 20color 20scheme); ayu (https packagecontrol io packages ayu); lightscript (https packagecontrol io packages lightscript); material theme (https packagecontrol io packages material 20theme); boxy theme (https packagecontrol io packages boxy 20theme).

3. Settings:

```json
{
  // Disallows approving auto-complete suggestions with Enter to prevent
  // ambiguous situations; you have to get used to it, but Sublime also
  // strongly recommends it.
  "auto_complete_commit_on_tab": true,
  "auto_complete_delay": 0,
  // Allow auto-complete suggestions within snippets
  "auto_complete_with_fields": true,
  "color_scheme": "Packages/Solarized Color Scheme/Solarized (light).tmTheme",
  "create_window_at_startup": false,
  "draw_white_space": "all",
  "ensure_newline_at_eof_on_save": true,
  "font_face": "Input Sans Narrow",
  "font_size": 15,
  "highlight_line": true,
  "ignored_packages": ["CSS", "Vintage"],
  // Highlights the indentation of the current scope
  "indent_guide_options": ["draw_normal", "draw_active"],
  "indent_to_bracket": true,
  "rulers": [80],
  "tab_size": 2,
  "translate_tabs_to_spaces": true,
  // NB: the following could lead to a lot of unnecessary changes in
  // other people's files
  "trim_trailing_white_space_on_save": true,
  "word_wrap": true
}
```

This work is licensed under a creative commons attribution 4.0 international license (http creativecommons org licenses by 4 0). | sublime-text sublime-text-3 frontend package plugin webdevelopment awesome awesome-list javascript | front_end |
wormhole | img alt banner src docs images banner jpg this monorepo contains the reference implementation of the wormhole protocol https wormholenetwork com to learn about how to use and build on wormhole read the docs https docs wormhole com see live contracts https docs wormholenetwork com wormhole contracts for current testnet and mainnet deployments of the wormhole smart contracts see develop md develop md for instructions on how to set up a local devnet contributing md contributing md for instructions on how to contribute to this project and security md security md for more information about our security audits and bug bounty program see docs operations md docs operations md for node operator instructions docs images overview svg this software is distributed on an as is basis without warranties or conditions of any kind either express or implied see the license for the specific language governing permissions and limitations under the license or plainly spoken this is a very complex piece of software which targets a bleeding edge experimental smart contract runtime mistakes happen and no matter how hard you try and whether you pay someone to audit it it may eat your tokens set your printer on fire or startle your cat cryptocurrencies are a high risk investment no matter how fancy | solidity golang blockchain rust | blockchain |
My-Past-Works | my past works: brief description and source documents for my past embedded system design experience. This repo is being updated. | os |
|
Data-Engineering-Database-Dev-Developing-SQL-Server-Databases | data engineering database dev: developing sql server databases | server |
|
styleguides | style guide resources (http styleguides io). This site lists lots of useful resources to help you learn about and create your own front-end style guides and pattern libraries. Best of all, you can add your own resources to share with others, and you don't even need to download any files to do that: you can do it all within github's web interface.

To do:
- add screenshots to resources that are examples
- additional hover state to resources that doesn't rely on transform
- transfer all the resources from gimmiebar (https gimmebar com collection 4ecd439c2f0aaad734000022 front end styleguides)
- set up a domain
- set up tags and filters

How to add resources the easy way: the site uses a templating system called jekyll that generates static html files and makes it easy (hopefully) to add files that share the same template. Github plays nicely with jekyll and lets you add and edit files using the web interface, so no need to download anything; you can do it straight in the repository on github. First you'll need to decide whether your resource is a book, article, tool, podcast, guide, example, or talk. The only folders you'll need to worry about are resourcearticle, resourcebook, resourceexample, resourcepodcast, resourcetalk, resourcetool. If it's a tool, open up the resourcetool folder; in there you'll find a readme that'll explain exactly what to do.

How to add resources the hard way: if you want to get this repo running locally on your machine, you'll need to get set up with jekyll. To install jekyll, open up your command line and type `gem install jekyll`, then `bundle exec jekyll serve` to get the server address where the static files are built. You only need to worry about the files in the folders prefixed by resource, and you can follow the instructions in the readme in each folder if you're not sure what to do. | front_end |
|
iot-manager-demo | iot manager demo demo sketches for iot manager for ios https itunes apple com us app iot manager id1155934877 iot manager for android https play google com store apps details id ru esp8266 iotmanager pull requests welcomed | server |
|
NLP-Sentiment-Classification | nlp sentiment classification sentiment classification with natural language processing on lstm | ai |
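The README above names the technique but gives no implementation details, so purely as a generic illustration of an LSTM sentiment classifier, here is a small Keras sketch. Every name, the vocabulary size, and the hyperparameters are placeholders, not details from this repository.

```python
# Generic LSTM sentiment classifier sketch (all values are placeholders).
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense

vocab_size = 10000  # assumed vocabulary size after tokenization
model = Sequential([
    Embedding(vocab_size, 64),       # word indices -> dense vectors
    LSTM(64),                        # sequence -> single hidden state
    Dense(1, activation='sigmoid'),  # positive vs. negative sentiment
])
model.compile(optimizer='adam', loss='binary_crossentropy',
              metrics=['accuracy'])
# model.fit(x_train, y_train, epochs=3, validation_split=0.2)
```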
|
INFO3440Project | bsc information technology, software engineering project, group 4. This is for the development of akipro; more information can be found in the repository's wiki. | server |
|
dropship-business | dropship business. Backend app implemented with python, flask, postgresql. A dropship business database design that utilizes flask orm to manage backend app development with python. The api was tested through a client (e.g. insomnia) and relational database management was implemented via postgresql. Project files associated with the app are located in the zip file titled dropship business orm. The package includes: an entity relationship diagram for a dropship business model (draw.io); postgres sql schema v1.0 2021-11-08 (sql): initial design concepts not used in the final version; includes examples of one-to-one, many-to-one, and many-to-many foreign key references; pg_dump 11-23-2021 (sql): a database dump created using pg_dump; features examples where database queries are optimized through the use of an index; flask: directory containing the rest api, featuring subfolders for database migrations (flask-migrate); a rest api built on a python backend using the flask framework; insomnia request collection 2021-11-21 (json): an insomnia request collection featuring endpoints for various get, post, put, patch, and delete http methods; readme.md: documentation with a retrospective analysis regarding the project. Technologies utilized in the app development process: docker (instructions on docker installation can be found here: https docs docker com get docker); postgresql (instructions on postgresql installation can be referenced here: https www postgresql org); pgadmin 4 (a well written tutorial on how to install and run psql using docker can be found here: https dev to shree j how to install and run psql using docker 41j2); insomnia (instructions on insomnia installation can be referenced here: https insomnia rest); flask-sqlalchemy (instructions on flask-sqlalchemy installation can be referenced here: https pythonbasics org flask sqlalchemy); venv (instructions regarding the python virtual environment venv, from which the app directory operates, can be found here: https docs python org 3 library venv html).

API reference tables

Accounts API

| Name | HTTP Method | Parameter | Path | Description |
| --- | --- | --- | --- | --- |
| index | GET | /accounts | http://localhost:5000/accounts | retrieves an index of every record stored in the accounts table |
| update | PATCH | /accounts/{id} | http://localhost:5000/accounts/{id} | grants an ability to update a record in the accounts table |
| delete account | DELETE | /accounts/{id} | http://localhost:5000/accounts/{id} | grants an ability to delete a selected user account |

Account query parameters

| Name | Data Type | Required/Optional | Description |
| --- | --- | --- | --- |
| name | string | optional | lists the name associated with an account record; can be updated with a PATCH request |
| purpose | string | optional | lists the purpose associated with an account record; can be updated with a PATCH request |

Brands API

| Name | HTTP Method | Parameter | Path | Description |
| --- | --- | --- | --- | --- |
| index | GET | /brands | http://localhost:5000/brands | retrieves an index of every record stored in the brands table |
| new brand | POST | /brands | http://localhost:5000/brands | instantiates a new record in the brands table |
| update | PATCH | /brands/{id} | http://localhost:5000/brands/{id} | grants an ability to update a record in the brands table |

Brand query parameters

| Name | Data Type | Required/Optional | Description |
| --- | --- | --- | --- |
| address | string | optional | lists the address associated with a given record; can be updated with a PATCH request |
| email | string | optional | lists the email associated with a given record; can be updated with a PATCH request |
| name | string | optional | lists the name associated with a given record; can be updated with a PATCH request |

Products API

| Name | HTTP Method | Parameter | Path | Description |
| --- | --- | --- | --- | --- |
| index | GET | /products | http://localhost:5000/products | retrieves an index of every record stored in the products table |

Product query parameters

| Name | Data Type | Required/Optional | Description |
| --- | --- | --- | --- |
| description | string | optional | lists the description of a given record |

Orders API

| Name | HTTP Method | Parameter | Path | Description |
| --- | --- | --- | --- | --- |
| index | GET | /orders | http://localhost:5000/orders | retrieves an index of every record stored in the orders table |

Order query parameters

| Name | Data Type | Required/Optional | Description |
| --- | --- | --- | --- |
| account_id | integer | optional | lists the account id associated with an order record |
| brand_id | integer | optional | lists the brand id associated with an order record |
| product_id | integer | optional | lists the product id associated with an order record |
| product_quantity | integer | optional | lists the product quantity associated with an order record |

Parcels API

| Name | HTTP Method | Parameter | Path | Description |
| --- | --- | --- | --- | --- |
| index | GET | /parcels | http://localhost:5000/parcels | retrieves an index of every record stored in the parcels table |

Parcel query parameters

| Name | Data Type | Required/Optional | Description |
| --- | --- | --- | --- |
| customer_id | integer | optional | lists the customer id associated with a parcel record |
| distributor_id | integer | optional | lists the distributor id associated with a parcel record |
| estimated_delivery_date | date | optional | lists the estimated delivery date associated with a parcel record |
| shipment_date | date | optional | lists the shipment date associated with a parcel record |

Retrospective analysis: designing an er diagram encompassing all of the entities, attributes, and relationships needed for initial database creation was key; this provides an allowance to focus later efforts on developing a more efficient database. Next, shifting focus to creating table structures with raw sql based on the many-to-many, one-to-many, and one-to-one relationships presented in the er diagram led to a reassessment of entity relations, helping to further optimize and strengthen the database architecture. Raw sql statements were then converted to orm statements using sqlalchemy with flask, using crud operations to set up the database. Data was inputted into the database using postgresql insert statements on the pgadmin 4 local server. The http-based api was developed and interacted with using insomnia. The application was implemented with orm due to its known advantage of speeding up the development process, and with orm, database migration is simplified. When leveraging object-oriented programming methods in python via sqlalchemy, python automates the process of database maintenance, so upgrades may occur on a scheduled basis with the click of a button. Further adding relationships between entities is another method we could explore, allowing us to retrieve datasets that may not immediately seem useful in the near term but could prove invaluable in the long term (i.e. understanding which products ship out the most from which distribution channels in a given time frame). Unfortunately, without a larger dataset collected over a much longer time frame, predicting the effectiveness of tentative entity relationships is merely speculative in nature. Fortunately, orm will allow for such upgrades to occur quite reactively as database relationship opportunities present themselves. | server |
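To show what the flask orm pattern described above can look like, here is a minimal Flask-SQLAlchemy sketch of the accounts resource. The model fields and route shapes follow the README's API reference, but the connection string and field definitions are assumptions, not the project's actual code.

```python
# Minimal sketch of the accounts index and update endpoints (assumed
# schema and connection string; tables created via migrations).
from flask import Flask, jsonify, request
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config['SQLALCHEMY_DATABASE_URI'] = \
    'postgresql://postgres:postgres@localhost:5432/dropship'  # assumed
db = SQLAlchemy(app)

class Account(db.Model):
    __tablename__ = 'accounts'
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String)     # optional, PATCH-able
    purpose = db.Column(db.String)  # optional, PATCH-able

@app.route('/accounts', methods=['GET'])
def index_accounts():
    # GET /accounts: index of every record in the accounts table
    accounts = Account.query.all()
    return jsonify([{'id': a.id, 'name': a.name, 'purpose': a.purpose}
                    for a in accounts])

@app.route('/accounts/<int:id>', methods=['PATCH'])
def update_account(id):
    # PATCH /accounts/<id>: update name and/or purpose
    account = Account.query.get_or_404(id)
    data = request.get_json()
    account.name = data.get('name', account.name)
    account.purpose = data.get('purpose', account.purpose)
    db.session.commit()
    return jsonify({'id': account.id, 'name': account.name,
                    'purpose': account.purpose})
```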
|
tdd-spring-react | test driven web application development with spring react. This application is built with spring and react, and everything is implemented with a tdd approach. You can find the full training at udemy: test driven web application development with spring react (https www udemy com course test driven web application development with spring react referralcode 5ee4fa2e84e78941f649) | spring react tdd react-router redux testing-library-react jest junit | front_end |
dev-setup | dev setup p align center img src https raw githubusercontent com donnemartin dev setup resources master res repo header gif p motivation setting up a new developer machine can be an ad hoc manual and time consuming process dev setup aims to simplify the process with easy to understand instructions and dotfiles scripts to automate the setup of the following os x updates and xcode command line tools os x defaults geared towards developers developer tools vim bash tab completion curl git gnu core utils python ruby etc developer apps iterm2 sublime text atom virtualbox vagrant docker chrome etc python data analysis ipython notebook numpy pandas scikit learn matplotlib etc big data platforms spark with ipython notebook integration and mapreduce cloud services amazon web services boto aws cli s3cmd etc and heroku common data stores mysql postgresql mongodb redis and elasticsearch javascript web development node js jshint and less android development java android sdk android studio intellij idea but i don t need all these tools dev setup is geared to be more of an organized reference of various developer tools you re not meant to install everything if you re interested in automation dev setup provides a customizable setup script single setup script there s really no one size fits all solution for developers so you re encouraged to make tweaks to suit your needs credits credits this repo builds on the awesome work from mathias bynens https github com mathiasbynens and nicolas hery https github com nicolashery for automation what about vagrant docker or boxen vagrant vagrant and docker docker are great tools and are set up by this repo i ve found that vagrant works well to ensure dev matches up with test and production tiers i ve only started playing around with docker for side projects and it looks very promising however for mac users docker and vagrant both rely on virtual machines which have their own considerations pros cons boxen https boxen github com is a cool solution although some might find it better geared towards more mature companies or devops teams i ve seen some discussions of difficulties as it is using puppet under the hood https github com boxen our boxen issues 742 this repo takes a more light weight approach to automation using a combination of homebrew homebrew cask and shell scripts to do basic system setup it also provides easy to understand instructions for installation configuration and usage for each developer app or tool p align center img src https raw githubusercontent com donnemartin dev setup resources master res iterm2 png p sections summary section 1 contains the dotfiles scripts and instructions to set up your system sections 2 through 7 detail more information about installation configuration and usage for topics in section 1 section 1 installation scripts tested on os x 10 10 yosemite and 10 11 el capitan single setup script single setup script bootstrap sh script bootstrapsh script syncs dev setup to your local home directory osxprep sh script osxprepsh script updates os x and installs xcode command line tools brew sh script brewsh script installs common homebrew formulae and apps osx sh script osxsh script sets up os x defaults geared towards developers pydata sh script pydatash script sets up python for data analysis aws sh script awssh script sets up spark hadoop mapreduce and amazon web services datastores sh script datastoressh script sets up common data stores web sh script websh script sets up javascript web development android sh script 
androidsh script sets up android development section 2 general apps and tools sublime text sublime text atom atom terminal customization terminal customization iterm2 iterm2 vim vim git git virtualbox virtualbox vagrant vagrant docker docker homebrew homebrew ruby and rbenv ruby and rbenv python python pip pip virtualenv virtualenv virtualenvwrapper virtualenvwrapper section 3 python data analysis anaconda anaconda ipython notebook ipython notebook numpy numpy pandas pandas matplotlib matplotlib seaborn seaborn scikit learn scikit learn scipy scipy flask flask bokeh bokeh section 4 big data aws and heroku spark spark mapreduce mapreduce awesome aws awesome aws aws account aws account aws cli aws cli saws saws boto boto s3cmd s3cmd s3distcp s3distcp s3 parallel put s3 parallel put redshift redshift kinesis kinesis lambda lambda aws machine learning aws machine learning heroku heroku section 5 data stores mysql mysql mysql workbench mysql workbench mongodb mongodb redis redis elasticsearch elasticsearch section 6 javascript web development node js nodejs jshint jshint less less section 7 android development java java android sdk android sdk android studio android studio intellij idea intellij idea section 8 misc contributions contributions credits credits contact info contact info license license section 1 installation single setup script running with git clone the repo git clone https github com donnemartin dev setup git cd dev setup run the dots script with command line arguments since you probably don t want to install every section the dots script supports command line arguments to run only specified sections simply pass in the scripts scripts that you want to install below are some examples for more customization you can clone clone the repo or fork https github com donnemartin dev setup fork the repo and tweak the dots script and its associated components to suit your needs run all dots all run bootstrap sh osxprep sh brew sh and osx sh dots bootstrap osxprep brew osx run bootstrap sh osxprep sh brew sh and osx sh pydata sh aws sh and datastores sh dots bootstrap osxprep brew osx pydata aws datastores running without git curl o https raw githubusercontent com donnemartin dev setup master dots dots add args here scripts dots https github com donnemartin dev setup blob master dots runs specified scripts bootstrap sh https github com donnemartin dev setup blob master bootstrap sh syncs dev setup to your local home directory osxprep sh https github com donnemartin dev setup blob master osxprep sh updates os x and installs xcode command line tools brew sh https github com donnemartin dev setup blob master brew sh installs common homebrew formulae and apps osx sh https github com donnemartin dev setup blob master osx sh sets up os x defaults geared towards developers pydata sh https github com donnemartin dev setup blob master pydata sh sets up python for data analysis aws sh https github com donnemartin dev setup blob master aws sh sets up spark hadoop mapreduce and amazon web services datastores sh https github com donnemartin dev setup blob master datastores sh sets up common data stores web sh https github com donnemartin dev setup blob master web sh sets up javascript web development android sh https github com donnemartin dev setup blob master android sh sets up android development notes dots will initially prompt you to enter your password dots might ask you to re enter your password at certain stages of the installation if os x updates require a restart simply run dots again to 
resume where you left off when installing the xcode command line tools a dialog box will confirm installation once xcode is installed follow the instructions on the terminal to continue dots runs brew sh which takes awhile to complete as some formulae need to be installed from source when dots completes be sure to restart your computer for all updates to take effect i encourage you to read through section 1 so you have a better idea of what each installation script does the following discussions describe in greater detail what is executed when running the dots https github com donnemartin dev setup blob master dots script bootstrap sh script p align center img src https raw githubusercontent com donnemartin dev setup resources master res commands png br p the bootstrap sh script will sync the dev setup repo to your local home directory this will include customizations for vim bash curl git tab completion aliases a number of utility functions etc section 2 of this repo describes some of the customizations running with git first fork or clone the repo clone the repo the bootstrap sh script will pull in the latest version and copy the files to your home folder source bootstrap sh to update later on just run that command again alternatively to update while avoiding the confirmation prompt set f source bootstrap sh running without git to sync dev setup to your local home directory without git run the following cd curl l https github com donnemartin dev setup tarball master tar xzv strip components 1 exclude readme md bootstrap sh license to update later on just run that command again optional specify path if path exists it will be sourced along with the other files before any feature testing such as detecting which version of ls is being used takes place here s an example path file that adds usr local bin to the path bash export path usr local bin path optional add custom commands if extra exists it will be sourced along with the other files you can use this to add a few custom commands without the need to fork this entire repository or to add commands you don t want to commit to a public repository my extra looks something like this bash git credentials git author name donne martin git committer name git author name git config global user name git author name git author email donne martin gmail com git committer email git author email git config global user email git author email pip should only run if there is a virtualenv currently activated export pip require virtualenv true install or upgrade a global package usage gpip install upgrade pip setuptools virtualenv gpip pip require virtualenv pip you could also use extra to override settings functions and aliases from the dev setup repository although it s probably better to fork the dev setup repository https github com donnemartin dev setup fork osxprep sh script p align center img src https raw githubusercontent com donnemartin dev setup resources master res xcode jpg br p run the osxprep sh script osxprep sh osxprep sh will first install all updates if a restart is required simply run the script again once all updates are installed osxprep sh will then install xcode command line tools install xcode command line tools if you want to go the manual route you can also install all updates by running app store selecting the updates icon then updating both the os and installed apps install xcode command line tools an important dependency before many tools such as homebrew can work is the command line tools for xcode these include compilers like 
gcc that will allow you to build from source if you are running os x 10 9 mavericks or later then you can install the xcode command line tools directly from the command line with xcode select install note the osxprep sh script executes this command running the command above will display a dialog where you can either install xcode and the command line tools install the command line tools only cancel the install os x 10 8 and older if you re running 10 8 or older you ll need to go to http developer apple com downloads http developer apple com downloads and sign in with your apple id the same one you use for itunes and app purchases unfortunately you re greeted by a rather annoying questionnaire all questions are required so feel free to answer at random once you reach the downloads page search for command line tools and download the latest command line tools os x mountain lion for xcode open the dmg file once it s done downloading and double click on the mpkg installer to launch the installation when it s done you can unmount the disk in finder brew sh script p align center img src https raw githubusercontent com donnemartin dev setup resources master res homebrew2 png br p when setting up a new mac you may want to install homebrew http brew sh a package manager that simplifies installing and updating applications or libraries some of the apps installed by the brew sh script include chrome firefox sublime text atom dropbox evernote skype slack alfred virtualbox vagrant docker etc for a full listing of installed formulae and apps refer to the commented brew sh source file https github com donnemartin dev setup blob master brew sh directly and tweak it to suit your needs run the brew sh script brew sh the brew sh script takes a while to complete as some formulae need to be installed from source for your terminal customization to take full effect quit and re start the terminal osx sh script p align center img src https raw githubusercontent com donnemartin dev setup resources master res osx png br p when setting up a new mac you may want to set os x defaults geared towards developers the osx sh script also configures common third party apps such as sublime text and chrome note i strongly encourage you to read through the commented osx sh source file https github com donnemartin dev setup blob master osx sh and tweak any settings based on your personal preferences the script defaults are intended for you to customize for example if you are not running an ssd you might want to change some of the settings listed in the ssd section run the osx sh script osx sh for your terminal customization to take full effect quit and re start the terminal pydata sh script p align center img src https raw githubusercontent com donnemartin dev setup resources master res pydata png br p to set up a development environment to work with python and data analysis without relying on the more heavyweight anaconda anaconda distribution run the pydata sh script pydata sh this will install virtualenv virtualenv and virtualenvwrapper virtualenvwrapper it will then set up two virtual environments loaded with the packages you will need to work with data in python 2 and python 3 to switch to the python 2 virtual environment run the following virtualenvwrapper command workon py2 data to switch to the python 3 virtual environment run the following virtualenvwrapper command workon py3 data then start working with the installed packages for example ipython notebook section 3 python data analysis section 3 python data analysis describes the installed packages and usage
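as a quick smoke test of the py3 data environment you might run a few lines like the following a minimal sketch assuming numpy and pandas were among the packages installed by pydata sh

```python
# sanity check for the py3-data virtualenv
# assumption: pydata.sh installed numpy and pandas into this environment
import numpy as np
import pandas as pd

# build a small array and confirm vectorized math works
arr = np.arange(5)
print(arr * 2)   # [0 2 4 6 8]

# wrap it in a pandas Series and compute a summary statistic
s = pd.Series(arr)
print(s.mean())  # 2.0
```

if both imports succeed and the numbers match the environment is wired up correctly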
aws sh script p align center img src https raw githubusercontent com donnemartin dev setup resources master res aws png br p to set up a development environment to work with spark hadoop mapreduce and amazon web services run the aws sh script aws sh section 4 big data aws and heroku section 4 big data aws and heroku describes the installed packages and usage datastores sh script p align center img src https raw githubusercontent com donnemartin dev setup resources master res datastores png br p to set up common data stores run the datastores sh script datastores sh section 5 data stores section 5 data stores describes the installed packages and usage web sh script p align center img src https raw githubusercontent com donnemartin dev setup resources master res webdev png br p to set up a javascript web development environment run the web sh script web sh section 6 web development section 6 web development describes the installed packages and usage android sh script p align center img src https raw githubusercontent com donnemartin dev setup resources master res android png br p to set up an android development environment run the android sh script android sh section 7 android development section 7 android development describes the installed packages and usage section 2 general apps and tools sublime text p align center img src https raw githubusercontent com donnemartin dev setup resources master res sublime png br p with the terminal the text editor is a developer s most important tool everyone has their preferences but unless you re a hardcore vim http en wikipedia org wiki vim text editor user a lot of people are going to tell you that sublime text http www sublimetext com is currently the best one out there installation the brew sh script brewsh script installs sublime text if you prefer to install it separately go ahead and download http www sublimetext com it open the dmg file drag and drop in the applications folder note at this point i m going to create a shortcut on the os x dock for sublime text to do so right click on the running application and select options keep in dock sublime text is not free but i think it has an unlimited evaluation period anyhow we re going to be using it so much that even the seemingly expensive 70 price tag is worth every penny if you can afford it i suggest you support http www sublimetext com buy this awesome tool configuration the osx sh script osxsh script contains sublime text configurations soda theme the soda theme https github com buymeasoda soda theme is a great ui theme for sublime text especially if you use a dark theme and think the side bar sticks out like a sore thumb installation with sublime package control if you are using will bond s excellent sublime package control http wbond net sublime packages package control you can easily install soda theme via the package control install package menu item the soda theme package is listed as theme soda in the packages list installation with git alternatively if you are a git user you can install the theme and keep up to date by cloning the repo directly into your packages directory in the sublime text application settings area you can locate your sublime text packages directory by using the menu item preferences browse packages while inside the packages directory clone the theme repository using the command below git clone https github com buymeasoda soda theme theme soda activating the theme on sublime text 2 open your user settings preferences file
sublime text 2 preferences settings user add or update your theme entry to be theme soda light sublime theme or theme soda dark sublime theme example sublime text 2 user settings theme soda light sublime theme activating the theme on sublime text 3 open your user settings preferences file sublime text preferences settings user add or update your theme entry to be theme soda light 3 sublime theme or theme soda dark 3 sublime theme example sublime text 3 user settings theme soda light 3 sublime theme changing monokai comment color although monokai is a great color scheme i find that comments can be difficult to see you can follow these instructions http stackoverflow com a 32686509 to change the color of the default theme i set my comments color to e6db74 dict dict key foreground key string e6db74 string dict dict atom p align center img src https raw githubusercontent com donnemartin dev setup resources master res atom png br p atom https github com atom atom is a great open source editor from github that is rapidly gaining contributors and popularity installation the brew sh script brewsh script installs atom if you prefer to install it separately download https atom io it open the dmg file drag and drop in the applications folder configuration atom has a great package manager that allows you to easily install both core and community packages terminal customization p align center img src https raw githubusercontent com donnemartin dev setup resources master res terminal png br p since we spend so much time in the terminal we should try to make it a more pleasant and colorful place configuration the bootstrap sh script bootstrapsh script and osx sh script osxsh script contain terminal customizations iterm2 p align center img src https raw githubusercontent com donnemartin dev setup resources master res iterm2 png br p i prefer iterm2 over the stock terminal as it has some additional great features https www iterm2 com features html download and install iterm2 the newest version even if it says beta release in finder drag and drop the iterm application file into the applications folder you can now launch iterm through the launchpad for instance let s just quickly change some preferences in iterm preferences in the tab profiles create a new one with the icon and rename it to your first name for example then select other actions set as default under the section window change the size to something better like columns 125 and rows 35 i also like to set general working directory reuse previous session s directory finally i change the way the option key works so that i can quickly jump between words as described here https coderwall com p h6yfda use and to jump forwards backwards words in iterm 2 on os x when done hit the red x in the upper left saving is automatic in os x preference panes close the window and open a new one to see the size change configuration since we spend so much time in the terminal we should try to make it a more pleasant and colorful place what follows might seem like a lot of work but trust me it ll make the development experience so much better now let s add some color i m a big fan of the solarized http ethanschoonover com solarized color scheme it is supposed to be scientifically optimal for the eyes i just find it pretty in iterm2 preferences under profiles and colors go to load presets and select solarized dark to activate it voila at this point you can also change your computer s name which shows up in this terminal prompt if you want to do so go to system 
preferences sharing for example i changed mine from donne s macbook pro to just macbook pro so it shows up as macbook pro in the terminal now we have a terminal we can work with vim p align center img src https raw githubusercontent com donnemartin dev setup resources master res vim png br p although sublime text will be our main editor it is a good idea to learn some very basic usage of vim http www vim org it is a very popular text editor inside the terminal and is usually pre installed on any unix system for example when you run a git commit it will open vim to allow you to type the commit message i suggest you read a tutorial on vim grasping the concept of the two modes of the editor insert by pressing i and normal by pressing esc to exit insert mode will be the part that feels most unnatural after that it s just remembering a few important keys configuration the bootstrap sh script bootstrapsh script contains vim customizations virtualbox p align center img src https raw githubusercontent com donnemartin dev setup resources master res virtualbox png br p virtualbox creates and manages virtual machines it s a solid free solution to its commercial rival vmware installation the brew sh script brewsh script installs virtualbox if you prefer to install it separately you can download it here https www virtualbox org wiki downloads or run brew update brew install caskroom cask brew cask brew cask install appdir applications virtualbox vagrant p align center img src https raw githubusercontent com donnemartin dev setup resources master res vagrant jpeg br p vagrant creates and configures development environments you can think of it as a higher level wrapper around virtualbox and configuration management tools like ansible chef puppet and salt vagrant also supports docker containers and server environments like amazon ec2 installation the brew sh script brewsh script installs vagrant if you prefer to install it separately you can download it here https www vagrantup com or run brew update brew install caskroom cask brew cask brew cask install appdir applications vagrant docker p align center img src https raw githubusercontent com donnemartin dev setup resources master res docker png br p docker automates the deployment of applications inside software containers i think the following quote http www linux com news enterprise cloud computing 731454 docker a shipping container for linux code explains docker nicely docker is a tool that can package an application and its dependencies in a virtual container that can run on any linux server this helps enable flexibility and portability on where the application can run whether on premise public cloud private cloud bare metal etc installation the brew sh script brewsh script installs docker if you prefer to install it separately you can download it here https www docker com or run brew update brew install docker brew install boot2docker configuration initialize and start boot2docker only need to do this once boot2docker init start the vm boot2docker up set the docker host environment variable and fill in ip and port based on the output from the boot2docker up command export docker host tcp ip port git p align center img src https raw githubusercontent com donnemartin dev setup resources master res git png br p what s a developer without git http git scm com installation git should have been installed when you ran through the install xcode command line tools install xcode command line tools section configuration to check your version of git run the
following command git version and which git should output usr local bin git let s set up some basic configuration download the gitconfig https raw githubusercontent com donnemartin dev setup master gitconfig file to your home directory cd curl o https raw githubusercontent com donnemartin dev setup master gitconfig it will add some color to the status branch and diff git commands as well as a couple aliases feel free to take a look at the contents of the file and add to it to your liking next we ll define your git user should be the same name and email you use for github https github com and heroku http www heroku com git config global user name your name here git config global user email your email youremail com they will get added to your gitconfig file to push code to your github repositories we re going to use the recommended https method versus ssh so you don t have to type your username and password every time let s enable git password caching as described here https help github com articles set up git git config global credential helper osxkeychain note on a mac it is important to remember to add ds store a hidden os x system file that s put in folders to your gitignore files you can take a look at this repository s gitignore https github com donnemartin dev setup blob master gitignore file for inspiration also check out github s collection of gitignore templates https github com github gitignore homebrew p align center img src https raw githubusercontent com donnemartin dev setup resources master res homebrew png br p package managers make it so much easier to install and update applications for operating systems or libraries for programming languages the most popular one for os x is homebrew http brew sh installation the brew sh script brewsh script installs homebrew and a number of useful homebrew formulae and apps if you prefer to install it separately run the following command and follow the steps on the screen ruby e curl fssl https raw githubusercontent com homebrew install master install usage to install a package or formula in homebrew vocabulary simply type brew install formula to update homebrew s directory of formulae run brew update note i ve seen that command fail sometimes because of a bug if that ever happens run the following when you have git installed cd usr local git fetch origin git reset hard origin master to see if any of your packages need to be updated brew outdated to update a package brew upgrade formula homebrew keeps older versions of packages installed in case you want to roll back that rarely is necessary so you can do some cleanup to get rid of those old versions brew cleanup to see what you have installed with their version numbers brew list versions ruby and rbenv p align center img src https raw githubusercontent com donnemartin dev setup resources master res ruby png br p ruby http www ruby lang org is already installed on unix systems but we don t want to mess around with that installation more importantly we want to be able to use the latest version of ruby installation brew sh provides rbenv https github com rbenv rbenv and ruby build https github com rbenv ruby build which allow you to manage multiple versions of ruby on the same machine brew sh adds the following line to your extra file to initialize rbenv eval rbenv init usage rbenv uses ruby build to download compile and install new versions of ruby you can see all versions available to download and install ruby build definitions to install a new version of ruby list all available versions
installed on the system rbenv install l install a ruby version rbenv install 2 2 3 to switch ruby versions set a local application specific ruby version in the current directory rbenv local 1 9 3 set the global version of ruby to be used in all shells rbenv global 2 0 0 rbenv by default will install ruby versions into a directory of the same name under rbenv versions because your user owns this directory you no longer need to use sudo to install gems python p align center img src https raw githubusercontent com donnemartin dev setup resources master res python png br p os x like linux ships with python http python org already installed but you don t want to mess with the system python some system tools rely on it etc so we ll install our own version with homebrew it will also allow us to get the very latest version of python 2 7 and python 3 installation the brew sh script brewsh script installs the latest versions of python 2 and python 3 pip pip https pypi python org pypi pip is the python package manager installation the pydata sh script pydatash script installs pip usage here are a couple pip commands to get you started to install a python package pip install package to upgrade a package pip install upgrade package to see what s installed pip freeze to uninstall a package pip uninstall package virtualenv virtualenv http www virtualenv org is a tool that creates an isolated python environment for each of your projects for a particular project instead of installing required packages globally it is best to install them in an isolated folder in the project say a folder named venv that will be managed by virtualenv the advantage is that different projects might require different versions of packages and it would be hard to manage that if you install packages globally it also allows you to keep your global usr local lib python2 7 site packages folder clean installation the pydata sh script pydatash script installs virtualenv usage let s say you have a project in a directory called myproject to set up virtualenv for that project cd myproject virtualenv venv distribute if you want your virtualenv to also inherit globally installed packages like ipython or numpy mentioned above use virtualenv venv distribute system site packages these commands create a venv subdirectory in your project where everything is installed you need to activate it first though in every terminal where you are working on your project source venv bin activate you should see a venv appear at the beginning of your terminal prompt indicating that you are working inside the virtualenv now when you install something pip install package it will get installed in the venv folder and not conflict with other projects important remember to add venv to your project s gitignore file so you don t include all of that in your source code virtualenvwrapper virtualenvwrapper https virtualenvwrapper readthedocs org en latest is a set of extensions that includes wrappers for creating and deleting virtual environments and otherwise managing your development workflow making it easier to work on more than one project at a time without introducing conflicts in their dependencies main features include organizes all of your virtual environments in one place wrappers for managing your virtual environments create delete copy use a single command to switch between environments tab completion for commands that take a virtual environment as argument installation the pydata sh script pydatash script installs virtualenvwrapper usage create a new virtual 
environment when you create a new environment it automatically becomes the active environment mkvirtualenv env name remove an existing virtual environment the environment must be deactivated see below before it can be removed rmvirtualenv env name activate a virtual environment will also list all existing virtual environments if no argument is passed workon env name deactivate the currently active virtual environment note that workon will automatically deactivate the current environment before activating a new one deactivate section 3 python data analysis anaconda p align center img src https raw githubusercontent com donnemartin dev setup resources master res anaconda png br p anaconda is a free distribution of the python programming language for large scale data processing predictive analytics and scientific computing that aims to simplify package management and deployment installation the pydata sh script pydatash script installs packages you need to run python data applications alternatively you can install the more heavy weight anaconda instead follow instructions to install anaconda http docs continuum io anaconda install html or the more lightweight miniconda http conda pydata org miniconda html ipython notebook p align center img src https raw githubusercontent com donnemartin data science ipython notebooks master images readme 1200x800 gif p ipython http ipython org is an awesome project which provides a much better python shell than the one you get from running python in the command line it has many cool functions running unix commands from the python shell easy copy paste creating matplotlib charts in line etc and i ll let you refer to the documentation http ipython org ipython doc stable index html to discover them ipython notebook is a web based interactive computational environment where you can combine code execution text mathematics plots and rich media into a single document installation the pydata sh script pydatash script installs ipython notebook if you prefer to install it separately run pip install ipython notebook if you run into an issue about pyzmq refer to the following stack overflow post http stackoverflow com questions 24995438 pyzmq missing when running ipython notebook and run pip uninstall ipython pip install ipython all usage ipython notebook if you d like to see some examples here are a couple of my repos that use ipython notebooks heavily data science ipython notebooks https github com donnemartin data science ipython notebooks interactive coding challenges https github com donnemartin interactive coding challenges numpy p align center img src https raw githubusercontent com donnemartin dev setup resources master res numpy png br p numpy adds python support for large multi dimensional arrays and matrices along with a large library of high level mathematical functions to operate on these arrays installation the pydata sh script pydatash script installs numpy if you prefer to install it separately run pip install numpy usage refer to the following numpy ipython notebook https github com donnemartin data science ipython notebooks numpy pandas p align center img src https raw githubusercontent com donnemartin dev setup resources master res pandas png br p pandas is a software library written for data manipulation and analysis in python it offers data structures and operations for manipulating numerical tables and time series installation the pydata sh script pydatash script installs pandas if you prefer to install it separately run pip install pandas usage refer to the following pandas ipython notebooks https github com donnemartin data science ipython notebooks pandas
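for a quick taste before opening those notebooks here is a minimal pandas sketch of the dataframe basics the column names and values are made up for illustration

```python
import pandas as pd

# a tiny DataFrame with hypothetical data
df = pd.DataFrame({
    "city": ["boston", "seattle", "austin"],
    "temp": [58, 52, 81],
})

# filter rows with a boolean mask, then compute an aggregate
warm = df[df["temp"] > 55]
print(warm)
print(df["temp"].mean())  # average across all rows
```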
matplotlib p align center img src https raw githubusercontent com donnemartin dev setup resources master res matplotlib png br p matplotlib is a python 2d plotting library which produces publication quality figures in a variety of hardcopy formats and interactive environments across platforms installation the pydata sh script pydatash script installs matplotlib if you prefer to install it separately run pip install matplotlib usage refer to the following matplotlib ipython notebooks https github com donnemartin data science ipython notebooks matplotlib seaborn p align center img src https raw githubusercontent com donnemartin dev setup resources master res seaborn png br p seaborn is a python visualization library based on matplotlib it provides a high level interface for drawing attractive statistical graphics installation the pydata sh script pydatash script installs seaborn if you prefer to install it separately run pip install seaborn usage refer to the following matplotlib with seaborn ipython notebooks https github com donnemartin data science ipython notebooks matplotlib scikit learn p align center img src https raw githubusercontent com donnemartin dev setup resources master res scikitlearn png br p scikit learn is a machine learning library for python featuring algorithms for classification regression and clustering installation the pydata sh script pydatash script installs scikit learn if you prefer to install it separately run pip install scikit learn usage refer to the following scikit learn ipython notebooks https github com donnemartin data science ipython notebooks scikit learn scipy p align center img src https raw githubusercontent com donnemartin dev setup resources master res scipy png br p scipy is a collection of mathematical algorithms and convenience functions built on the numpy extension of python it adds significant power to the interactive python session by providing the user with high level commands and classes for manipulating and visualizing data installation the pydata sh script pydatash script installs scipy if you prefer to install it separately run pip install scipy usage refer to the following scipy ipython notebooks https github com donnemartin data science ipython notebooks statistical inference scipy flask p align center img src https raw githubusercontent com donnemartin dev setup resources master res flask png br p flask is a micro web application framework written in python installation the pydata sh script pydatash script installs flask if you prefer to install it separately run pip install flask usage coming soon refer to the following flask ipython notebooks https github com donnemartin data science ipython notebooks bokeh p align center img src https raw githubusercontent com donnemartin dev setup resources master res bokeh png br p bokeh is a python interactive visualization library that targets modern web browsers for presentation its goal is to provide elegant concise construction of novel graphics in the style of d3 js but also deliver this capability with high performance interactivity over very large or streaming datasets bokeh can help anyone who would like to quickly and easily create interactive plots dashboards and data applications installation the pydata sh script pydatash script installs bokeh if you prefer to install it separately run pip install bokeh usage coming soon
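in the meantime here is a minimal sketch of the classic bokeh quickstart the data points are made up and import locations can shift between bokeh releases so treat the exact imports as an assumption

```python
from bokeh.plotting import figure, output_file, show

# hypothetical sample data
x = [1, 2, 3, 4, 5]
y = [6, 7, 2, 4, 5]

# write the plot to a static html file and draw a simple line
output_file("lines.html")
p = figure(title="simple line example", x_axis_label="x", y_axis_label="y")
p.line(x, y, line_width=2)
show(p)  # opens the html file in a browser
```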
refer to the following bokeh ipython notebooks https github com donnemartin data science ipython notebooks section 4 big data aws and heroku spark p align center img src https raw githubusercontent com donnemartin dev setup resources master res spark png br p spark is an in memory cluster computing framework up to 100 times faster for certain applications and is well suited for machine learning algorithms installation the aws sh script aws script installs spark locally it also hooks up spark to run within the ipython notebook by configuring your bash profile and adding the repo s profile pyspark to ipython if you prefer to install it separately run brew install apache spark usage run spark locally pyspark run spark within ipython notebook ipython notebook profile pyspark refer to the following spark ipython notebook https github com donnemartin data science ipython notebooks spark spark is also supported on aws elastic mapreduce as described here https aws amazon com blogs aws new apache spark on amazon emr to create a cluster run the following command with the aws cli aws cli replacing mykeypair with the name of your keypair to ssh into your cluster aws emr create cluster name spark cluster ami version 3 8 applications name spark ec2 attributes keyname mykeypair instance type m3 xlarge instance count 3 use default roles mapreduce p align center img src https raw githubusercontent com donnemartin dev setup resources master res mrjob png br p mrjob supports mapreduce jobs in python running them locally or on hadoop clusters such as aws elastic mapreduce emr installation mrjob is python 2 only the aws sh script aws script installs mrjob locally if you prefer to install it separately run pip install mrjob the aws sh script also syncs the template mrjob conf file to your home folder note running the aws sh script will overwrite any existing mrjob conf file update the config file with your credentials keypair region and s3 bucket paths runners emr aws access key id youraccesskey aws secret access key yoursecretkey aws region us east 1 ec2 key pair yourkeypair ec2 key pair file ssh yourkeypair pem s3 scratch uri s3 yourbucketscratch s3 log uri s3 yourbucketlog usage refer to the following mrjob ipython notebook https github com donnemartin data science ipython notebooks mapreduce python awesome aws awesome https cdn rawgit com sindresorhus awesome d7305f38d29fed78fa85652e3a63e154dd8e8829 media badge svg https github com sindresorhus awesome p align center img src https raw githubusercontent com donnemartin data science ipython notebooks master images aws png p br awesome aws https github com donnemartin awesome aws is a curated list of awesome aws libraries open source repos guides blogs and other resources it s a great way to stay up to date with the various aws backed and community led efforts geared towards aws the fiery meter of awsome hot repos in awesome aws are visually tagged based on their popularity repo with 0100+ stars fire repo with 0200+ stars fire fire repo with 0500+ stars fire fire fire repo with 1000+ stars fire fire fire fire repo with 2000+ stars fire fire fire fire fire repos not on the fiery meter of awsome can still be awesome see a note on repo awsomeness https github com donnemartin awesome aws blob master contributing md a note on repo awsomeness aws account to start using aws you first need to sign up for an account sign up for aws when you sign up for amazon web services aws your aws account is automatically signed up for all services in aws you are charged only for the
services that you use new users are eligible for 12 months of usage through the aws free tier http aws amazon com free to create an aws account open http aws amazon com and then click sign up follow the on screen instructions part of the sign up procedure involves receiving a phone call and entering a pin using the phone keypad note your aws account id aws cli p align center img src https raw githubusercontent com donnemartin dev setup resources master res aws cli png br p the aws command line interface is a unified tool to manage aws services allowing you to control multiple aws services from the command line and to automate them through scripts installation the aws sh script aws script installs the aws cli if you prefer to install it separately run pip install awscli run the following command to configure the aws cli aws configure alternatively the aws sh script also syncs the template aws folder to your home folder note running the aws sh script will overwrite any existing aws folder update the config file with your credentials and location default region us east 1 default aws access key id youraccesskey aws secret access key yoursecretkey be careful you do not accidentally check in your credentials the gitignore file is set to ignore files with credentials usage refer to the following aws cli ipython notebook https github com donnemartin data science ipython notebooks aws saws http i imgur com vzc5zma gif although the aws cli https github com aws aws cli is a great resource to manage your aws powered services it s tough to remember usage of 50 top level commands 1400 subcommands countless command specific options resources such as instance tags and buckets saws a supercharged aws cli saws aims to supercharge the aws cli with features focusing on improving ease of use increasing productivity under the hood saws is powered by the aws cli and supports the same commands and command structure saws and aws cli usage aws command subcommand parameters options saws features auto completion of commands subcommands options auto completion of resources bucket names instance ids instance tags more coming soon todo add more resources customizable shortcuts fuzzy completion of resources and shortcuts syntax and output highlighting execution of shell commands command history contextual help toolbar options saws is available for mac linux unix and windows windows support http i imgur com eo12q9t png installation and usage refer to the repo link https github com donnemartin saws boto p align center img src https raw githubusercontent com donnemartin dev setup resources master res boto png br p boto is the official aws sdk for python installation the aws sh script aws script installs boto if you prefer to install it separately run pip install boto boto uses the same configuration as described in the aws cli aws cli section usage refer to the following boto ipython notebook https github com donnemartin data science ipython notebooks aws s3cmd p align center img src https raw githubusercontent com donnemartin dev setup resources master res s3cmd png br p before i discovered s3cmd http s3tools org s3cmd i had been using the s3 console http aws amazon com console to do basic operations and boto https boto readthedocs org en latest to do more of the heavy lifting however sometimes i just want to hack away at a command line to do my work i ve found s3cmd to be a great command line tool for interacting with s3 on aws s3cmd is written in python is open source and is free even for commercial use it offers more 
advanced features than those found in the aws cli http aws amazon com cli installation s3cmd is python 2 only the aws sh script aws script installs s3cmd if you prefer to install it separately run pip install s3cmd running the following command will prompt you to enter your aws access and aws secret keys to follow security best practices make sure you are using an iam account as opposed to using the root account i also suggest enabling gpg encryption which will encrypt your data at rest and enabling https to encrypt your data in transit note this might impact performance s3cmd configure alternatively the aws sh script also syncs the template s3cfg file to your home folder note running the aws sh script will overwrite any existing s3cfg file update the config file with your credentials and location credentials aws access key id youraccesskey aws secret access key yoursecretkey bucket location us gpg passphrase yourpassphrase be careful you do not accidentally check in your credentials the gitignore file is set to ignore files with credentials usage refer to the following s3cmd ipython notebook https github com donnemartin data science ipython notebooks aws s3distcp p align center img src https raw githubusercontent com donnemartin dev setup resources master res s3distcp png br p s3distcp http docs aws amazon com elasticmapreduce latest developerguide usingemr s3distcp html is an extension of distcp that is optimized to work with amazon s3 s3distcp is useful for combining smaller files and aggregating them together taking in a pattern and target file to combine smaller input files to larger ones s3distcp can also be used to transfer large volumes of data from s3 to your hadoop cluster installation s3distcp comes bundled with the aws cli usage refer to the following s3distcp ipython notebook https github com donnemartin data science ipython notebooks aws s3 parallel put p align center img src https raw githubusercontent com donnemartin dev setup resources master res s3 parallel put png br p s3 parallel put https github com twpayne s3 parallel put git is a great tool for uploading multiple files to s3 in parallel installation git clone https github com twpayne s3 parallel put git usage refer to the following s3 parallel put ipython notebook https github com donnemartin data science ipython notebooks aws redshift p align center img src https raw githubusercontent com donnemartin dev setup resources master res aws redshift png br p redshift is a fast data warehouse built on top of technology from massive parallel processing mpp setup follow these instructions http docs aws amazon com redshift latest gsg rs gsg prereq html usage refer to the following redshift ipython notebook https github com donnemartin data science ipython notebooks aws kinesis p align center img src https raw githubusercontent com donnemartin dev setup resources master res aws kinesis png br p kinesis streams data in real time with the ability to process thousands of data streams per second setup follow these instructions http docs aws amazon com kinesis latest dev before you begin html usage refer to the following kinesis ipython notebook https github com donnemartin data science ipython notebooks aws lambda p align center img src https raw githubusercontent com donnemartin dev setup resources master res aws lambda png br p lambda runs code in response to events automatically managing compute resources setup follow these instructions http docs aws amazon com lambda latest dg setting up html usage refer to the following lambda ipython notebook https github com donnemartin data science ipython notebooks aws
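to give a feel for what lambda code looks like here is a minimal python handler lambda_handler event context is the entry point signature aws invokes but the event field shown is a hypothetical example

```python
# minimal AWS Lambda handler in Python
def lambda_handler(event, context):
    # event is a dict whose shape depends on the trigger
    # the "name" key here is a made-up example field
    name = event.get("name", "world")
    return {"message": "hello " + name}
```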
aws machine learning p align center img src https raw githubusercontent com donnemartin dev setup resources master res aws ml png br p amazon machine learning is a service that makes it easy for developers of all skill levels to use machine learning technology amazon machine learning provides visualization tools and wizards that guide you through the process of creating machine learning ml models without having to learn complex ml algorithms and technology once your models are ready amazon machine learning makes it easy to obtain predictions for your application using simple apis without having to implement custom prediction generation code or manage any infrastructure setup follow these instructions http docs aws amazon com machine learning latest dg setting up html usage coming soon refer to the following aws machine learning ipython notebook https github com donnemartin data science ipython notebooks aws heroku p align center img src https raw githubusercontent com donnemartin dev setup resources master res heroku jpg br p heroku http www heroku com if you re not already familiar with it is a platform as a service http en wikipedia org wiki platform as a service paas that makes it really easy to deploy your apps online there are other similar solutions out there but heroku was among the first and is currently the most popular not only does it make a developer s life easier but i find that having heroku deployment in mind when building an app forces you to follow modern app development best practices http www 12factor net installation assuming that you have an account sign up if you don t let s install the heroku client https devcenter heroku com articles using the cli for the command line heroku offers a mac os x installer the heroku toolbelt https toolbelt heroku com that includes the client but for these kinds of tools i prefer using homebrew it allows us to keep better track of what we have installed luckily for us homebrew includes a heroku toolbelt formula brew install heroku toolbelt the formula might not have the latest version of the heroku client which is updated pretty often let s update it now brew upgrade heroku toolbelt don t be afraid to run heroku update every now and then to always have the most recent version usage login to your heroku account using your email and password heroku login if this is a new account and since you don t already have a public ssh key in your ssh directory it will offer to create one for you say yes it will also upload the key to your heroku account which will allow you to deploy apps from this computer if it didn t offer to create the ssh key for you i e your heroku account already has ssh keys associated with it you can do so manually by running mkdir ssh ssh keygen t rsa keep the default file name and skip the passphrase by just hitting enter both times then add the key to your heroku account heroku keys add once the key business is done you re ready to deploy apps heroku has a great getting started https devcenter heroku com articles python guide so i ll let you refer to that the one linked here is for python but there is one for every popular language heroku uses git to push code for deployment so make sure your app is under git version control a quick cheat sheet if you ve used heroku before cd myapp heroku create myapp git push heroku master heroku ps heroku logs t the heroku dev center https devcenter heroku com is full of great resources so be sure to check it
out section 5 data stores mysql p align center img src https raw githubusercontent com donnemartin dev setup resources master res mysql png br p installation the datastores sh script datastoressh script installs mysql if you prefer to install it separately run brew update always good to do brew install mysql as you can see in the output from homebrew before we can use mysql we first need to set it up with unset tmpdir mkdir usr local var mysql install db verbose user whoami basedir brew prefix mysql datadir usr local var mysql tmpdir tmp usage to start the mysql server use the mysql server tool mysql server start to stop it when you are done run mysql server stop you can see the different commands available for mysql server with mysql server help to connect with the command line client run mysql uroot use exit to quit the mysql shell note by default the mysql user root has no password it doesn t really matter for a local development database if you wish to change it though you can use mysqladmin u root password new password mysql workbench p align center img src https raw githubusercontent com donnemartin dev setup resources master res mysql workbench png br p in terms of a gui client for mysql i m used to the official and free mysql workbench http www mysql com products workbench but feel free to use whichever you prefer installation the datastores sh script datastoressh script installs mysql workbench if you prefer to install it separately run brew install caskroom cask brew cask brew cask install appdir applications mysqlworkbench you can also find the mysql workbench download here http www mysql com downloads workbench note it will ask you to sign in you don t need to just click on no thanks just start my download at the bottom mongodb p align center img src https raw githubusercontent com donnemartin dev setup resources master res mongodb jpeg br p mongodb http www mongodb org is a popular nosql http en wikipedia org wiki nosql database installation the datastores sh script datastoressh script installs mongodb if you prefer to install it separately run brew update brew install mongo usage in a terminal start the mongodb server mongod in another terminal connect to the database with the mongo shell using mongo i ll let you refer to mongodb s getting started http docs mongodb org manual tutorial getting started guide for more redis p align center img src https raw githubusercontent com donnemartin dev setup resources master res redis png br p redis http redis io is a blazing fast in memory key value store that uses the disk for persistence it s kind of like a nosql database but there are a lot of cool things http oldblog antirez com post take advantage of redis adding it to your stack html that you can do with it that would be hard or inefficient with other database solutions for example it s often used as session management or caching by web apps but it has many other uses installation the datastores sh script datastoressh script installs redis if you prefer to install it separately run brew update brew install redis usage start a local redis server using the default configuration settings with redis server for advanced usage you can tweak the configuration file at usr local etc redis conf i suggest making a backup first and using those settings with redis server usr local etc redis conf in another terminal connect to the server with the redis command line interface using redis cli i ll let you refer to redis documentation http redis io documentation or other tutorials for more information
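if you would rather talk to redis from python than through redis cli the redis py client works well note that the python package is an assumption here it is not set up by these scripts so you would pip install redis yourself a minimal sketch assuming a local server on the default port

```python
import redis  # pip install redis

# connect to the local server started with redis-server (default port 6379)
r = redis.StrictRedis(host="localhost", port=6379, db=0)

# basic key/value round trip
r.set("greeting", "hello")
print(r.get("greeting"))  # b'hello' -- values come back as bytes
```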
elasticsearch p align center img src https raw githubusercontent com donnemartin dev setup resources master res elasticsearch png br p as it says on the box elasticsearch http www elasticsearch org is a powerful open source distributed real time search and analytics engine it uses an http rest api making it really easy to work with from any programming language you can use elasticsearch for such cool things as real time search results autocomplete recommendations machine learning and more installation the datastores sh script datastoressh script installs elasticsearch if you prefer to install it separately check out the following discussion elasticsearch runs on java so check if you have it installed by running java version if java isn t installed yet a window will appear prompting you to install it go ahead and click install next install elasticsearch with brew install elasticsearch note elasticsearch also has a plugin program that gets moved to your path i find that too generic of a name so i rename it to elasticsearch plugin by running will need to do that again if you update elasticsearch mv usr local bin plugin usr local bin elasticsearch plugin below i will use elasticsearch plugin just replace it with plugin if you haven t followed this step as you guessed you can add plugins to elasticsearch a popular one is elasticsearch head http mobz github io elasticsearch head which gives you a web interface to the rest api install it with elasticsearch plugin install mobz elasticsearch head usage start a local elasticsearch server with elasticsearch test that the server is working correctly by running curl xget http localhost 9200 if you installed the elasticsearch head plugin you can visit its interface at http localhost 9200 plugin head elasticsearch s documentation http www elasticsearch org guide is more of a reference to get started i suggest reading some of the blog posts linked on this stackoverflow answer http stackoverflow com questions 11593035 beginners guide to elasticsearch 11767610 11767610 section 6 javascript web development node js p align center img src https raw githubusercontent com donnemartin dev setup resources master res nodejs png br p installation the web sh script websh script installs node js http nodejs org you can also install it manually with homebrew brew update brew install node the formula also installs the npm https npmjs org package manager however as suggested by the homebrew output we need to add usr local share npm bin to our path so that npm installed modules with executables will have them picked up to do so add this line to your path file before the export path line bash path usr local share npm bin path open a new terminal for the path changes to take effect we also need to tell npm where to find the xcode command line tools by running sudo xcode select switch usr bin if xcode command line tools were installed by xcode try instead sudo xcode select switch applications xcode app contents developer node modules are installed locally in the node modules folder of each project by default but there are at least two that are worth installing globally those are coffeescript http coffeescript org and grunt http gruntjs com npm install g coffee script npm install g grunt cli npm usage to install a package npm install package install locally npm install g package install globally to install a package and save it in your project s package json file npm install package save to see what s installed npm list local npm list g global to find outdated packages
locally or globally npm outdated g to upgrade all or a particular package npm update package to uninstall a package npm uninstall package jshint p align center img src https raw githubusercontent com donnemartin dev setup resources master res jshint png br p jshint is a javascript developer s best friend if the extra credit assignment to install sublime package manager was completed jshint can be run as part of sublime text installation the web sh script websh script installs jshint you can also install it manually via npm npm install g jshint follow additional instructions on the jshint package manager page https sublime wbond net packages jshint or build it manually https github com jshint jshint less p align center img src https raw githubusercontent com donnemartin dev setup resources master res less png br p css preprocessors are becoming quite popular the most popular processors are less http lesscss org and sass http sass lang com preprocessing is a lot like compiling code for css it allows you to reuse css in many different ways let s start out with using less as a basic preprocessor it s used by a lot of popular css frameworks like bootstrap http getbootstrap com installation the web sh script websh script installs less to install less manually you have to use npm node which you installed earlier using homebrew in the terminal use npm install g less note the g flag is optional but it prevents having to mess around with file paths you can install without the flag just know what you re doing you can check that it installed properly by using lessc version this should output some information about the compiler lessc 1 5 1 less compiler javascript okay less is installed and running great usage there are a lot of different ways to use less generally i use it to compile my stylesheet locally you can do that by using this command in the terminal lessc template less template css the two options are the input and output files for the compiler the command looks in the current directory for the less stylesheet compiles it and outputs it to the second file in the same directory you can add in paths to keep your project files organized lessc less template less css template css read more about less on their page here http lesscss org section 7 android development this section is under development java p align center img src https raw githubusercontent com donnemartin dev setup resources master res java png br p installation the android sh script androidsh script installs java if you prefer to install it separately you can download the jdk here http www oracle com technetwork java javase downloads index html or run brew update brew install caskroom cask brew cask brew cask install appdir applications java android sdk p align center img src https raw githubusercontent com donnemartin dev setup resources master res androidsdk png br p the android sh script androidsh script installs the android sdk if you prefer to install it separately you can download it here https developer android com sdk index html android studio p align center img src https raw githubusercontent com donnemartin dev setup resources master res androidstudio png br p the android sh script androidsh script installs android studio if you prefer to install it separately you can download it here https developer android com sdk index html or run brew update brew install caskroom cask brew cask brew cask install appdir applications android studio intellij idea p align center img src https raw githubusercontent com donnemartin dev
setup resources master res intellij png br p the android sh script androidsh script installs java if you prefer to install it separately you can download it here https www jetbrains com idea download or run brew update brew install caskroom cask brew cask brew cask install appdir applications intellij idea ce section 8 misc contributions bug reports suggestions and pull requests are welcome https github com donnemartin dev setup issues credits see the credits page https github com donnemartin dev setup blob master credits md contact info feel free to contact me to discuss any issues questions or comments my contact info can be found on my github page https github com donnemartin license this repository contains a variety of content some developed by donne martin and some from third parties the third party content is distributed under the license provided by those parties the content developed by donne martin is distributed under the following license i am providing code and resources in this repository to you under an open source license because this is my personal repository the license you receive to my code and resources is from me and not my employer facebook copyright 2015 donne martin licensed under the apache license version 2 0 the license you may not use this file except in compliance with the license you may obtain a copy of the license at http www apache org licenses license 2 0 unless required by applicable law or agreed to in writing software distributed under the license is distributed on an as is basis without warranties or conditions of any kind either express or implied see the license for the specific language governing permissions and limitations under the license | macos mac vim sublime-text bash iterm2 python spark aws cloud android-development cli git mysql postgresql mongodb redis elasticsearch nodejs linux | front_end |
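a minimal python version of the elasticsearch health check from the dev setup notes above (the curl xget http localhost 9200 test). it assumes only the requests package and the default port used in the readme.

```python
# Minimal sketch: verify a local Elasticsearch server from Python,
# mirroring the readme's `curl -XGET http://localhost:9200` check.
# Assumptions: the `requests` package is installed and Elasticsearch
# runs on the default port, as described above.
import requests

def elasticsearch_is_up(base_url="http://localhost:9200"):
    try:
        resp = requests.get(base_url, timeout=5)
        resp.raise_for_status()
        info = resp.json()
        # The root endpoint returns cluster metadata, including the version.
        print("cluster:", info.get("cluster_name"),
              "version:", info.get("version", {}).get("number"))
        return True
    except requests.RequestException as exc:
        print("Elasticsearch not reachable:", exc)
        return False

if __name__ == "__main__":
    elasticsearch_is_up()
```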
quickvision | quickvision faster computer vision div align center github issues https img shields io github issues quick ai quickvision https github com quick ai quickvision issues github forks https img shields io github forks quick ai quickvision https github com quick ai quickvision network github stars https img shields io github stars quick ai quickvision https github com quick ai quickvision stargazers github license https img shields io github license quick ai quickvision https github com quick ai quickvision blob master license codecov https codecov io gh quick ai quickvision branch master graph badge svg token vafpqtqk1i https codecov io gh quick ai quickvision pep8 https github com quick ai quickvision workflows check 20code 20formatting badge svg ci tests https github com quick ai quickvision workflows ci 20tests badge svg docs https github com quick ai quickvision workflows deploy 20mkdocs badge svg pypi release https github com quick ai quickvision workflows pypi 20release badge svg slack https img shields io badge slack chat green svg logo slack https join slack com t quickai shared invite zt iz7tqk3r iqa4soxjgik5ws8vdzhzeq downloads https pepy tech badge quickvision https pepy tech project quickvision downloads https pepy tech badge quickvision month https pepy tech project quickvision downloads https pepy tech badge quickvision week https pepy tech project quickvision div demo assets demo png install quickvision install from pypi current stable release 0 1 1 needs pytorch 1 7 1 and torchvision 0 8 2 pip install quickvision what is quickvision quickvision makes computer vision tasks much faster and easier with pytorch it provides 1 easy to use pytorch native api for fit train step val step of models 2 easily customizable and configurable models with various backbones 3 a complete pytorch native interface all models are nn module all the training apis are optional and not binded to models 4 a lightning api which helps to accelerate training over multiple gpus tpus 5 a datasets api to convert common data formats very easily and quickly to pytorch formats 6 a minimal package with very low dependencies train your models faster quickvision has already implemented the long learning in pytorch quickvision is just pytorch quickvision does not make you learn a new library if you know pytorch you are good to go quickvision does not abstract any code from pytorch nor implements any custom classes over it it keeps the data format in tensor so that you don t need to convert it do you want just a model with some backbone configuration use model made by us it s just a nn module which has tensors only input and output format quickvision provides reference scripts too for training it do you want to train your model but not write lengthy loops just use our training methods such as fit train step val step do you want multi gpu training but worried about model configuration just subclass the pytorch lightning model implement the train step val step | computer-vision deep-learning pytorch pytorch-lightning | ai |
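the quickvision row above stresses that every model is a plain nn module with tensor only inputs and outputs, and that the fit / train step helpers are optional. the sketch below shows what that claim buys you: a completely standard pytorch loop with no quickvision specific calls. the model argument stands in for whichever quickvision factory you use; exact constructor names are not shown above, so none are invented here.

```python
# A quickvision model is a plain nn.Module (tensors in, tensors out),
# so it drops into an ordinary PyTorch training loop unchanged.
import torch.nn as nn

def train_one_epoch(model: nn.Module, loader, optimizer, device="cpu"):
    model.train()
    criterion = nn.CrossEntropyLoss()
    for images, targets in loader:           # assumes a classification loader
        images, targets = images.to(device), targets.to(device)
        optimizer.zero_grad()
        logits = model(images)                # tensors in, tensors out
        loss = criterion(logits, targets)
        loss.backward()
        optimizer.step()
```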
Tap-Tap-Embed | tap tap embed an embedded system game based on the popular dance dance revolution designed using c code here is a demo of tap tap embed https www youtube com watch v yatgs1azg6i | os
rtosLwip | digital tv controller supporting rest apis features one board support package bootloader and os based on freertos and lwip which works on the atmel atsame70q21 cpu arm cortex m7 and a xilinx artix 7 fpga accessing spi flash i2c chips uart rs232 and ethernet switch chips implemented tftp telnet multicast igmp mdns server client and http server client based on raw connection the http server supports static web pages dynamic web pages html file upload web sockets and rest apis get post patch and limits the maximum number of concurrent clients implements a rest api for controlling the device implements ptp precision time protocol ieee 1588 2008 with both software and hardware aided precision private control commands based on multicast command line interface xmodem tftp and http for firmware updating it can run onboard or in a linux simulation environment a python testing package covers most network protocols | rtos freertos ucos lwip linux atmel sameq70 cortexm7 python | os
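a hedged sketch of exercising the board's rest control api from python, in the spirit of the python testing package mentioned above. the device address and both api paths are hypothetical placeholders, not endpoints documented in the readme; substitute whatever your firmware actually exposes.

```python
import requests

DEVICE = "http://192.168.1.100"  # hypothetical board address

def get_status():
    # GET is one of the verbs the readme lists (get/post/patch);
    # /api/status is a placeholder path, not documented above.
    resp = requests.get(f"{DEVICE}/api/status", timeout=3)
    resp.raise_for_status()
    return resp.json()

def patch_setting(key, value):
    # PATCH updates a single field on a hypothetical settings resource.
    resp = requests.patch(f"{DEVICE}/api/settings", json={key: value}, timeout=3)
    resp.raise_for_status()
    return resp.status_code
```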
nalp | nalp natural adversarial language processing latest release https img shields io github release gugarosa nalp svg https github com gugarosa nalp releases open issues https img shields io github issues gugarosa nalp svg https github com gugarosa nalp issues license https img shields io github license gugarosa nalp svg https github com gugarosa nalp blob master license welcome to nalp have you ever wanted to create natural text from raw sources if yes nalp is for you this package is an innovative way of dealing with natural language processing and adversarial learning from bottom to top from embeddings to neural networks we will foster all research related to this new trend use nalp if you need a library or wish to create your embeddings design or use pre loaded state of art neural networks mix and match different strategies to solve your problem because it is cool to play with text read the docs at nalp readthedocs io https nalp readthedocs io nalp is compatible with python 3 6 package guidelines 1 the very first information you need is in the very next section 2 installing is also easy if you wish to read the code and bump yourself into follow along 3 note that there might be some additional steps in order to use our solutions 4 if there is a problem please do not hesitate call us getting started 60 seconds with nalp first of all we have examples yes they are commented just browse to examples chose your subpackage and follow the example we have high level examples for most tasks we could think of alternatively if you wish to learn even more please take a minute nalp is based on the following structure and you should pay attention to its tree yaml nalp core corpus dataset encoder model corpus audio sentence text datasets image language modeling encoders integer word2vec models discriminators conv embedded text linear lstm text generators bi lstm conv gru gumbel lstm gumbel rmc linear lstm rmc rnn stacked rnn layers gumbel softmax multi head attention relational memory cell dcgan gan gsgan maligan relgan seqgan wgan utils constants loader logging preprocess core the core is the core essentially it is the parent of everything you should find parent classes defining the basis of our structure they should provide variables and methods that will help to construct other modules corpus every pipeline has its first step right the corpus package serves as a basic class to load raw text audio and sentences datasets because we need data right datasets are composed of classes and methods that allow preparing data for further neural networks encoders text or numbers encodings are used to make embeddings embeddings are used to feed into neural networks remember that networks cannot read raw text therefore you might want to pre encode your data using well known encoders models each neural network architecture is defined in this package from na ve rnns to bilstms from gans to textgans you can use whatever suits your needs utils this is a utility package common things shared across the application should be implemented here it is better to implement once and use it as you wish than re implementing the same thing over and over again installation we believe that everything has to be easy not tricky or daunting nalp will be the one to go package that you will need from the very first installation to the daily tasks implementing needs if you may just run the following under your most preferred python environment raw conda virtualenv whatever bash pip install nalp alternatively if you prefer to install 
the bleeding edge version please clone this repository and use bash pip install e environment configuration note that sometimes additional setup is needed if so the platform notes below have the details ubuntu no specific additional commands needed windows no specific additional commands needed macos no specific additional commands needed support we know that we do our best but it is inevitable to acknowledge that we make mistakes if you ever need to report a bug a problem or just want to talk to us please do so we will be available at our best at this repository or gustavo rosa unesp br | python natural-language-processing adversarial-learning tensorflow2 machine-learning | ai
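the nalp rows above describe a corpus to encoder to model pipeline. below is a sketch of that flow; the class names follow the package tree shown above (text corpus, integer encoder), but the exact import paths and method signatures are assumptions, so treat the commented examples in the repo as the source of truth.

```python
# Assumed imports -- the package tree above lists corpus/text and
# encoders/integer, but the public paths may differ.
from nalp.corpus import TextCorpus
from nalp.encoders import IntegerEncoder

# Raw text in -> tokens; `from_file` / `corpus_type` are assumed kwargs.
corpus = TextCorpus(from_file="data.txt", corpus_type="word")

# Tokens -> integer ids, ready to feed a model from nalp.models.
encoder = IntegerEncoder()
encoder.learn(corpus.vocab_index, corpus.index_vocab)
encoded = encoder.encode(corpus.tokens)
```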
C21-Front-End-Level-1 | c21 front end level 1 the first module of front end | front_end
GraphGPT | center img src images graphgpt png style width 10 graphgpt graph instruction tuning for large language models center jiabin tang https tjb tech github io yuhao yang http yuh yang github io wei wei lei shi lixin su suqi cheng dawei yin https www yindawei com and chao huang https sites google com view chaoh home correspondence data intelligence lab https sites google com view chaoh home university of hong kong https www hku hk baidu inc a href https graphgpt github io img src https img shields io badge project page green a a href img src https img shields io badge demo page purple a a href https arxiv org abs 2310 13023 img src https img shields io badge paper pdf orange a youtube https badges aleen42 com src youtube svg this repository hosts the code data and model weight of graphgpt news x 2023 10 23 the full paper of our graphgpt is available at https arxiv org abs 2310 13023 https arxiv org abs 2310 13023 please check out it and give us more feedbacks x 2023 10 15 release the code of graphgpt todo release our utilized instruction data release checkpoints of our graphgpt and pre trained graph encoder exploring the potential of our graphgpt for more graph learning tasks span id introduction brief introduction we present the graphgpt framework that aligns llms with graph structural knowledge with a graph instruction tuning paradigm structural information encoding with text graph grounding to enhance the understanding of graph structural information by large language models our framework emphasizes aligning the encoding of graph structures with the natural language space this alignment aims to enable language models to effectively comprehend and interpret the structural elements of the graph leveraging their inherent language understanding capabilities to achieve this objective we introduce a text graph grounding paradigm that generates prompts designed to preserve the graph s structural context for language models this paradigm acts as a bridge connecting the semantic understanding of textual information with the inherent structural relationships found within the graph dual stage graph instruction tuning the dual stage graph instruction tuning paradigm proposed in this work builds upon the concept of instruction tuning which has been recently introduced to enhance the adaptability of language models for specific domains in this paradigm we aim to align the language capacity of the model with the nuances of graph learning tasks enabling the language model to generate more accurate and contextually appropriate responses for graph structured data chain of thought cot distillation when faced with diverse graph data language models may encounter new or unfamiliar patterns and structures this distribution shift can pose challenges in generating accurate and coherent responses especially when the number of node classes varies across different types of graph data to address this challenge and boost accuracy in the presence of distribution shift it is essential to equip our graphgpt with step by step reasoning abilities in this regard we propose utilizing the chain of thought cot technique 47 which explicitly models the flow of thoughts and reasoning steps by incorporating cot our language model improves the coherence and consistency of generated text it enables the model to follow a logical progression of ideas enhancing its ability to understand and reason about the given graph data for more technical details kindly refer to the paper and the project website https graphgpt github io of 
our graph span id usage getting started span id all catelogue table of contents a href code structure 1 code structure a a href environment preparation 2 environment preparation a a href training graphgpt 3 training graphgpt a a href prepare pre trained checkpoint 3 1 prepare pre trained checkpoint a a href self supervised instruction tuning 3 2 self supervised instruction tuning a a href extract the trained projector 3 3 extract the trained projector a a href task specific instruction tuning 3 4 task specific instruction tuning a a href evaluating graphgpt 4 evaluating graphgpt a a href preparing checkpoints and data 4 1 preparing checkpoints and data a a href running evaluation 4 2 running evaluation a span id code structure 1 code structure readme md assets demo narrow gif screenshot cli png screenshot gui png server arch png vicuna logo jpeg format sh graphgpt init py constants py conversation py eval readme md requirements txt run graphgpt py run graphgpt lp py run vicuna py script run model qa yaml model graphllama py init py apply delta py apply lora py builder py compression py convert fp16 py graph layers init py bpe simple vocab 16e6 txt gz clip graph py graph transformer py mpnn py simple tokenizer py make delta py model adapter py model registry py monkey patch non inplace py utils py protocol openai api protocol py serve init py api provider py bard worker py cacheflow worker py cli py controller py gateway readme md nginx conf gradio block arena anony py gradio block arena named py gradio css py gradio patch py gradio web server py gradio web server multi py huggingface api py inference py model worker py monitor basic stats py clean battle data py elo analysis py hf space leaderboard app py monitor py openai api server py register worker py test message py test throughput py train graphchat trainer py llama flash attn monkey patch py train graph py train lora py train mem py utils py playground inspect conv py test embedding readme md test classification py test semantic search py test sentence similarity py test openai api anthropic api py openai api py pyproject toml scripts eval script graphgpt eval sh extract graph projector py serving controller yaml model worker yaml tune script extract projector sh graphgpt stage1 sh graphgpt stage2 sh tests test openai curl sh test openai langchain py test openai sdk py span id environment preparation 2 environment preparation a href all catelogue back to top a please first clone the repo and install the required environment which can be done by running the following commands shell conda env create n graphgpt python 3 8 conda activate graphgpt torch with cuda 11 7 pip install torch 1 13 0 cu117 torchvision 0 14 0 cu117 torchaudio 0 13 0 extra index url https download pytorch org whl cu117 to support vicuna base model pip3 install fschat model worker webui clone our graphgpt git clone https github com hkuds graphgpt git cd graphgpt install required libaries pip install r requirements txt span id training graphgpt 3 training graphgpt graphgpt tuning paradigm consists of two stages 1 self supervised instruction tuning use approximately 600k filtered cc3m to connect a frozen pretrained vision encoder to a frozen llm 2 task specific instruction tuning use 150k gpt generated multimodal instruction following to teach the model to follow multimodal instructions span id prepare pre trained checkpoint 3 1 preparing pre trained checkpoint a href all catelogue back to top a graphgpt is trained based on following excellent existing models please 
follow the instructions to prepare the checkpoints vicuna prepare our base model vicuna which is an instruction tuned chatbot and base model in our implementation please download its weights here https github com lm sys fastchat model weights we generally utilize v1 1 and v1 5 model with 7b parameters graph encoder is used to encode graph structures we empoly text graph grounding approach to obtain the pre trained graph transformer model which you could download by graph transformer and put it at graphgpt graphgpt graph data is a combination of all utilized pyg graph data that contain node features edge index and son on you can download by all graph data pt and put it at graphgpt graph data graphgpt graph data span id self supervised instruction tuning 3 2 self supervised instruction tuning a href all catelogue back to top a prepare data please download our instruction tuning data graph matching json https huggingface co datasets jiabin99 graph matching for the graph matching task start tuning after the aforementioned steps you could start the first stage tuning by filling blanks at graphgpt stage1 sh https github com hkuds graphgpt scripts tune script graphgpt stage1 sh there is an example as below shell to fill in the following path to run the first stage of our graphgpt model path vicuna 7b v1 5 16k instruct ds data stage 1 graph matching json graph data path graph data all graph data pt pretra gnn clip gt arxiv output model checkpoints stage 1 wandb offline python m torch distributed run nnodes 1 nproc per node 4 master port 20001 graphgpt train train mem py model name or path model path version v1 data path instruct ds graph content arxiv ti ab json graph data path graph data path graph tower pretra gnn tune graph mlp adapter true graph select layer 2 use graph start end bf16 true output dir output model num train epochs 3 per device train batch size 2 per device eval batch size 2 gradient accumulation steps 1 evaluation strategy no save strategy steps save steps 2400 save total limit 1 learning rate 2e 3 weight decay 0 warmup ratio 0 03 lr scheduler type cosine logging steps 1 tf32 true model max length 2048 gradient checkpointing true lazy preprocess true report to wandb span id extract the trained projector 3 3 extract the trained projector a href all catelogue back to top a we could extract the trained projector in the stage 1 by filling blanks at extract projector sh https github com hkuds graphgpt scripts tune script extract projector sh there is an example as below shell to fill in the following path to extract projector for the second tuning stage src model checkpoints stage 1 output proj checkpoints stage 1 projector stage 1 projector bin python3 8 scripts extract graph projector py model name or path src model output output proj span id task specific instruction tuning 3 4 task specific instruction tuning a href all catelogue back to top a prepare data the choices of our task specific instruction data could be diverse e g standard or cot chain of thought node classifiction link prediction or mixing data for multitasking please refer to the task specific https huggingface co datasets jiabin99 task specific start tuning after the aforementioned steps you could start the second stage tuning by filling blanks at graphgpt stage2 sh https github com hkuds graphgpt scripts tune script graphgpt stage2 sh there is an example as below shell to fill in the following path to run the second stage of our graphgpt model path vicuna 7b v1 5 16k instruct ds data stage 2 data all mix json 
graph data path graph data all graph data pt pretra gnn clip gt arxiv tuned proj checkpoints stage 1 projector stage 1 projector bin output model checkpoints stage 2 wandb offline python m torch distributed run nnodes 1 nproc per node 4 master port 20001 graphgpt train train mem py model name or path model path version v1 data path instruct ds graph content arxiv ti ab json graph data path graph data path graph tower pretra gnn pretrain graph mlp adapter tuned proj tune graph mlp adapter true graph select layer 2 use graph start end true bf16 true output dir output model num train epochs 2 per device train batch size 1 per device eval batch size 1 gradient accumulation steps 1 evaluation strategy no save strategy steps save steps 50000 save total limit 1 learning rate 2e 5 weight decay 0 warmup ratio 0 03 lr scheduler type cosine logging steps 1 tf32 true model max length 2048 gradient checkpointing true dataloader num workers 4 lazy preprocess true report to wandb span id evaluating graphgpt 4 evaluating graphgpt a href all catelogue back to top a span id preparing checkpoints and data 4 1 preparing checkpoints and data checkpoints you could try to evaluate graphgpt by using your own model or our released checkpoints data we split test sets for different graph datasets and make the instruction data for evaluation please refer to the evaluating https huggingface co datasets jiabin99 evaluating span id running evaluation 4 2 running evaluation you could start the second stage tuning by filling blanks at graphgpt eval sh https github com hkuds graphgpt scripts eval script graphgpt eval sh there is an example as below shell to fill in the following path to extract projector for the second tuning stage output model checkpoints stage 2 datapath data eval arxiv nc json graph data path graph data all graph data pt res path output stage 2 arxiv nc start id 0 end id 20000 num gpus 2 python3 8 graphgpt eval run graphgpt py model name output model prompting file datapath graph data path graph data path output res path res path start id start id end id end id num gpus num gpus contact for any questions or feedback feel free to contact jiabin tang mailto jiabintang77 gmail com citation if you find graphgpt useful in your research or applications please kindly cite tex articles tang2023graphgpt title graphgpt graph instruction tuning for large language models author jiabin tang and yuhao yang and wei wei and lei shi and lixin su and suqi cheng and dawei yin and chao huang year 2023 eprint 2310 13023 archiveprefix arxiv primaryclass cs cl acknowledgements you may refer to related work that serves as foundations for our framework and code repository vicuna https github com lm sys fastchat llava https github com haotian liu llava we also partially draw inspirations from minigpt 4 https github com vision cair minigpt 4 the design of our website and readme md was inspired by next gpt https next gpt github io thanks for their wonderful works | graph-l graph-neural-networks instruction-tuning large-language-models text-graph | ai |
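the graphgpt rows above say graph data / all graph data pt is a bundle of pyg graph data containing node features and edge index. a quick python sanity check of that file is below; the dict of named graphs layout is an assumption (print the object to see the real structure), and unpickling may require torch geometric installed since the entries are pyg objects.

```python
import torch

# Load the combined graph file referenced in the readme.
graphs = torch.load("graph_data/all_graph_data.pt")
print(type(graphs))

# If it is a mapping of named PyG `Data` objects (an assumption):
if isinstance(graphs, dict):
    for name, g in graphs.items():
        print(name, "x:", tuple(g.x.shape), "edge_index:", tuple(g.edge_index.shape))
```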
PeggleClone | peggleclone developer guide the developer guide can be found here docs developer guide md rules of the game cannon direction the player moves the cannon by either dragging on the screen when the player drags on the screen the cannon will point towards the location of the user s finger when the user lifts the finger a ball should fire alternatively tapping on the screen will immediately point the cannon towards the direction of the user s finger a ball will also fire if the user chooses to tap when the game is active a ball is shot the cannon is still can be aimed using the methods above win and lose conditions to win the user needs to clear all orange pegs a game should therefore have at least one orange peg to be considered playable without auto winning when the game starts you lose the game if you run out of red balls game balls you start with 10 game balls every shot decreases that amount by 1 note that if the ball enters the bucket the user will get a free ball number of balls left increases by 1 you also can lose the game if the timer for the game expires you will have 5 minutes to clear all orange pegs to win the game once the game is won a winning alert and sound is played the user can tap on the okay button to go back to the previous page once the game is lost a losing alert and sound is played the user can tap on the okay button to go back to the previous page level designer additional features scrolling levels a level with arbitrary height can be created by scrolling up down in the level designer drag with one finger to scroll a offset counter is shown in the level designer at the bottom left to indicate the current offset a offset of zero indicates that the level the level cannot be scrolled further down a level can be infinitely high but it is recommended to create a level that is winnable within 5 mins due to the game timer peg block rotation tapping on the peg block should bring up a rotation slider at the top of the level designer dragging that slider will allow you to rotate the pegs blocks from 0 to 360 degrees peg resizing pegs tapping on the peg should bring up a radius slider at the top of the level designer dragging that slider will allow you to change the radius of the pegs you can increase the radius by 2 times i e 4 times the area blocks tapping on the block should bring up a width and height slider at the top of the level designer dragging that slider will allow you to change the width and height of the blocks you can increase the width and height by 2 times each the maximum area will therefore be 4 times the original area oscillating blocks when a block is tapped it can be make to oscillate using the oscillate switch found at the top right of the level designer once oscillation is activated a gray circle will be shown on top of the current selected block the 4 corners will have small circles that can be dragged to adjust the springiness of the block the larger this radius the looser the springiness will be bells and whistles sound effect and music a background music for the level designer and main menu a background music for the gameplay two click sounds that get played when various ui elements are tapped cannon shoot sound a sound when a peg is hit a sound when a block is hit a sound when the bucket is hit free ball sound effect a sound when the game is won a sound when the game is lost a explosion sound effect when a blue peg is hit ka boom powerup a ghostly sound effect when the spooky ball reappears at the top a shape shift sound effect if the 
shape shift powerup activates when hitting the gray ball a flash sound effect if the flash powerup activates when hitting the yellow ball a rain fire sound effect if the rain fire powerup activates when hitting the pink ball note that for all these sounds the sound will only be played if the same sound effect is not already being played for example if the game ball hits one yellow peg causing one flash sound effect to be played and hits another yellow peg the sound effect for the second yellow peg flash powerup might not play as the previous sound effect might still be playing for all powerup sound effects it will only be played if the powerup is activated score system pink peg 1 point blue peg 10 point gray peg 50 point orange peg 100 point purple peg 500 point yellow peg 1000 point multiplier is exactly the same as in the problem set website under the appendix section https cs3217 github io cs3217 docs problem sets problem set 4 scoring system displaying the number of pegs remaining during gameplay displaying the number of balls left red balls remaining during gameplay displaying the number of pegs blocks added in the level designer a timer that results in a lost game when it expires set to 5 minutes additional powerups flash the game ball has a chance to have its speed doubled if a yellow peg is hit for the first time this powerup only activates if the ball is not already travelling too fast shape shift the game ball has a chance to have its size changed when a gray peg is hit for the first time note that the change in size might not be noticeable sometimes depending on the random size chosen rain fire 4 additional helper game balls dark gray in color will be spawned if the main game ball red in color hits a pink peg note that there can only be 4 helper game balls at any given time so if the game ball happens to hit multiple pink pegs the powerup might not activate depending on the number of game balls present the helper game balls disappear when the main game ball exits the game screen helper game balls can activate some powerups like spooky ball and ka boom if the helper ball activates the spooky ball powerup the main ball becomes spooky i e the main ball will reappear at the top after it exits helper game balls cannot activate rain fire because of point 2 helper game balls can fall into the bucket if they do all balls will be removed including the main ball and you get a free ball and you can shoot again helper game balls themselves cannot be affected by shape shift and flash power ups all helper game balls activated powerups will affect the main ball tests the test plans can be found here docs test plans md | game game-engine physics-engine swift swiftui | os
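a worked python example of the peggleclone score table above, using the listed base points per peg colour. the multiplier rule lives in the cs3217 appendix linked above, so it is passed in as a plain argument rather than recomputed here.

```python
# Base points per peg colour, exactly as listed in the readme.
BASE_POINTS = {"pink": 1, "blue": 10, "gray": 50,
               "orange": 100, "purple": 500, "yellow": 1000}

def shot_score(cleared_pegs, multiplier=1):
    """cleared_pegs: iterable of colour names cleared by one shot."""
    return multiplier * sum(BASE_POINTS[c] for c in cleared_pegs)

# e.g. two orange pegs and a blue peg at multiplier 2 -> (100+100+10)*2 = 420
assert shot_score(["orange", "orange", "blue"], multiplier=2) == 420
```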
react-native-starter | react native starter view demo https play google com store apps details id com reactnativestarter upd download https github com flatlogic react native starter git more templates https flatlogic com templates support forum https flatlogic com forum you re viewing the new and updated version of react native starter previous version can be found under the v1 branch https github com flatlogic react native starter tree v1 a powerful react native starter template that bootstraps development of your mobile application react native starter is a mobile application template with lots of built in components like sidebar navigation form elements etc all you need to start building your mobile app faster check out live demo on app store https play google com store apps details id com reactnativestarter lite google play https play google com store apps details id com reactnativestarter app react native starter https i imgur com vcz4bu6 png a href https play google com store apps details id com reactnativestarter upd img width 200 alt get it on google play src https play google com intl en us badges images generic en badge web generic png a a href https play google com store apps details id com reactnativestarter upd img width 200 alt download on app store src https i imgur com 7ixtmv0 png a what s inside always up to date react native scaffolding ui ux design from industry experts modular and well documented structure for application code redux for state management react navigation for simple navigation disk persisted application state caching more than 16 ready to use pages getting started 1 clone and install bash clone the repo git clone https github com flatlogic react native starter git navigate to clonned folder and install dependencies cd react native starter yarn install install pods cd ios pod install 2 open rns in your ios simulator run this command to start the development server and to start your app on ios simulator yarn run ios or if you prefer android yarn run android that s it cool right documentation our handy documentation can be found on official rns website https docs reactnativestarter com contributing if you find any problems please open an issue https github com flatlogic react native starter issues new or submit a fix as a pull request want more we have a premium version of this mobile application template that saves you even more time and money and comes with advanced features premium red color scheme more than 5 additional screens such as chat profile product item etc contains an extended charting library to visualize all the data you need premium support and updates included much much more read more and purchase it at https reactnativestarter com support for any additional information please go to our support forum https flatlogic com forum and raise your questions or feedback provide there we highly appreciate your participation how can i support developers star our github repo star tweet about it https twitter com intent tweet text amazing 20mobile 20application 20template 20built 20with 20react 20native url https github com flatlogic react native starter via flatlogic create pull requests submit bugs suggest new features or documentation updates wrench follow flatlogic on twitter https twitter com flatlogic subscribe to react native starter newsletter at reactnativestarter com https reactnativestarter com like our page on facebook https www facebook com flatlogic thumbsup more from flatlogic awesome bootstrap checkboxes radios https github com flatlogic 
awesome bootstrap checkbox a pure css way to make inputs look prettier sing app dashboard https github com flatlogic sing app free and open source admin dashboard template built with bootstrap 4 license mozilla public license 2 0 license | react-native starter-kit template react-native-starter redux expo | front_end
awesome-blockchain-kor | awesome blockchain kor awesome https awesome re badge svg https awesome re hyperledger fabric https img shields io badge hyperledger 20fabric v1 0 20or 20higher blue svg solidity https img shields io badge solidity v0 4 20 yellow svg truffle https img shields io badge truffle v1 0 violet svg node js https img shields io badge node js v6 2 0 green svg docker https img shields io badge docker v1 13 20or 20higher blue svg docker compose https img shields io badge docker 20compose v1 8 20or 20higher lightgrey svg scam repository project https github com yunho0130 awesome blockchain kor projects 1 pull request pull request star follow hyperledger media bookcover hyperledger fabric jpg http www yes24 com product goods 69279313 art of blockchain media bookcover art of blockchain jpeg http www yes24 com product goods 69751266 table of contents ceo ceo media 15298083249681 jpg unsplash free use reading blockchain via desire media reading blockchain via desire md 1 https brunch co kr bumgeunsong 50 2 https brunch co kr bumgeunsong 51 ico ico https steemkr com coinkorea tyami 3 ico 1 ico https steemit com kr yoon 3f8ty8 private sale pre sale public sale crowd sale main sale soft cap hard cap white list kyc know your customer https steemit com cryptocurrency andrewdoyon whitelist kyc ico https brunch co kr rrblog 34 metamask 1 2 3 https steemit com kr twinbraid metamask 01 https steemit com kr twinbraid metamask 02 https steemit com kr twinbraid metamask 03 https steemit com kr twinbraid fee rate gas gas limit block gas limit gas price total fee https steemit com kr jinkim gas gas limit block gas limit gas price total fee gwei https steemit com kr kanghamin gwei https tokenpost kr terms 11579 airdrop snapshot https tokenpost kr terms 11567 https steemit com kr twinbraid 538vxn 2 ico scam ico https steemit com hellocrypto donekim ico scam https steemit com kr ico altcoin ico series 3 ico https etherscan io https steemit com kr holcoin https etherscan io alex park 2018 alex park medium https medium com hexlant ea b1 b0 eb 9e 98 ec 86 8c ed 95 b4 ed 82 b9 ec 8b 9c eb b3 b4 ec 83 81 ea b0 80 eb 8a a5 ec 97 ac eb b6 80 ec 97 90 eb 8c 80 ed 95 9c ed 86 a0 ed 81 b0 ec bb a8 ed 8a b8 eb 9e 99 ed 8a b8 ec bd 94 eb 93 9c eb b6 84 ec 84 9d ad40fc35a845 ceo media 15298087942908 jpg pexels player http verticalplatform kr archives 10015 ico ico trillema i ii iii medium https medium com onther tech ico ed 8a b8 eb a6 b4 eb a0 88 eb a7 88 trillema ico ec 82 bc ec 9c 84 ec 9d bc ec b2 b4 eb b6 88 ea b0 80 eb 8a a5 ec 84 b1 87c6d9233a78 ico initial coin offering private blockchain understanding ico and private blockchain media understanding ico and private blockchain kor md permissionless vs permissioned crypto anchors iot itworld 2018 http www itworld co kr news 108616 1 2 3 3 eos whitepapers eos technicalwhitepaper md https github com bookchainio eos docs blob master ko kr technicalwhitepaper md 2 ethereum whitepapers ethereum ethereum whitepaper official wiki md github https github com ethereum wiki wiki 5bkorean 5d white paper 1 bitcoin whitepapers bitcoin bitcoin kor pdf https www ddengle com board free 205187 pull request url https icon foundation tps whitepapers icon icon whitepaper ko draft pdf https steem io https github com taeminlee blockchain eos blob master 20170901 20steem 20white 20paper md github https www cardano org whitepapers ada whycardanokr pdf ico ico http verticalplatform kr archives 9772 ico initial coin offering and platform building media initial 20coin 20offering 
20and 20platform 20building 20 1 pdf ico https steemit com kr newtoken ico ico https tokenbank co kr optional git clone git config system core longpaths true https stackoverflow com questions 22575662 filename too long in git for windows consensus pow pos dpos https steemit com kr donekim consensus pow pos dpos pow pos 2018 dpos consensus bgp https steemkr com kr donekim consensus bgp nem poi https tokenpost kr terms 6867 dag directed acyclic graph iota https steemkr com kr sjchoi adk 3 0 dag directed acyclic graph zero knowledge proof z cash https www ventascorp com news mod document uid 38 3 means of exchange burn mint medium https medium com chajesse ed 86 a0 ed 81 b0 eb 94 94 ec 9e 90 ec 9d b8 ed 8c a8 ed 84 b4 ec 8b 9c eb a6 ac ec a6 88 3 means of exchange ec 9d 98 burn mint ed 8c a8 ed 84 b4 88663d3f7cc segwit https www ventascorp com news mod document uid 21 merkle tree full node light node https steemit com kr jsralph merkle trees 51 51 https steemit com kr join jsralph 51 solidity media 15282674796425 jpg solidity https steemit com kr jumsun eth ethereum 1 turing complete https steemit com ethereum sigmoid 4mtmow https github com naksir melody go ethereum https docs google com presentation d 1ieohhcp83ueqlf3e5irwvun6gtzxiej8xzpzd1xcxvq edit slide id p5 daos dacs das and more an incomplete terminology guide https atomrigs blogspot com 2015 02 dao dac html solidity erc20 mvp minimum viable product https github com yunho0130 awesome blockchain kor blob master ethereum token v0 1 generate token sol erc20 more complete https github com yunho0130 awesome blockchain kor blob master ethereum token v0 1 generate token complete sol solidity cryptozombie https cryptozombies io ko solidity truffle zeppelin solidity https medium com dnext post solidity tutorial 1 252c9edf2f84 https medium com dnext post solidity tutorial 2 e26a57c8fdd1 https medium com dnext post solidity tutorial 3 e163f6de5ab5 https medium com dnext post solidity tutorial 4 7a048c76d8e4 https medium com dnext post solidity tutorial 5 85dfe51dcd54 https medium com dnext post solidity tutorial 6 543bd342d928 udemy solidity mooc 10 https dealsea com view udemy com https www udemy com blockchain and bitcoin fundamentals solidity https www udemy com ethereum and solidity the complete developers guide erc20 ethereum erc20 vulnerability overflow underflow safemath erc20 transfer overflow underflow erc20 300 2017 12 27 smt smartmesh https medium com onther tech smt ed 86 a0 ed 81 b0 ec 9d b4 ec a4 91 ec a7 80 eb b6 88 ed 95 b4 ed 82 b9 ec 84 a4 eb aa 85 ea b3 bc eb 8c 80 eb b9 84 ec b1 85 8bef3f41bcd2 nothing at stake pos 6 1 https medium com ihcho131313 ec 8a a4 eb a7 88 ed 8a b8 ec bb a8 ed 8a b8 eb a0 89 ed 8a b8 eb b3 b4 ec 95 88 6 ea b0 9c ec 9d 98 ec 86 94 eb a6 ac eb 94 94 ed 8b b0 ec b7 a8 ec 95 bd ec a0 90 eb b0 8f ea b7 b8 ec 97 90 eb 8c 80 ed 95 9c eb 8c 80 eb b9 84 ec b1 85 ed 8c 8c ed 8a b8 1 f4f32d19a558 https github com onther tech auditing reference https github com yunho0130 awesome blockchain kor blob master media eb b8 94 eb a1 9d ec b2 b4 ec 9d b8 20 ea b8 b0 ec 88 a0 ea b3 bc 20 eb b3 b4 ec 95 88 20 ea b3 a0 eb a0 a4 ec 82 ac ed 95 ad vfn pdf security audit audit openzeppelin https github com openzeppelin openzeppelin solidity https steemit com kr dev modolee onchain offchain pbtf practical byzantine fault tolerance media practical byzantine fault tolerance pdf 2018 it chain plasma raiden sharding https cryptokiwi kr currency id eth category 2 content id 227 plasma medium https medium com decipher media eb b8 94 
eb a1 9d ec b2 b4 ec 9d b8 ed 99 95 ec 9e a5 ec 84 b1 ec 86 94 eb a3 a8 ec 85 98 ec 8b 9c eb a6 ac ec a6 88 2 1 plasma overview 7e6875f4c20d azure https github com yunho0130 awesome blockchain kor tree master ethereum private ms azure template hyperledger fabric composer media 15282706026711 png go node js java baas blockchain as a service business network archive bna 2018 media 15303858888992 jpg hyperledger fabric v1 1 deployment developerworks 1 readme md https github com yunho0130 awesome blockchain kor blob master hyperledger developerworks blockchainnetwork compositejourney master readme ko md 2 readme md https github com yunho0130 awesome blockchain kor blob master hyperledger developerworks blockchainsmartcontracttrading compositejourney master readme ko md node js sdk readme md https github com yunho0130 awesome blockchain kor blob master hyperledger developerworks deployment nodesdk master readme ko md python sdk fabric sdk py master hyperledger fabric sdk py master ibm cloud developerworks hyperledger composer developerworks developerworks api api doc https docs upbit com ubci r https cran r project org web packages ubci index html api doc https www bithumb com u1 us127 api doc https apidocs korbit co kr ko python https github com hoonjin korbit python php code https github com yks118 korbit api php gopax api doc https gopaxapi github io gopax api doc http doc coinone co kr python readme md quant trading ccxt master readme md https wikidocs net book 110 citation book development hyperledger fabric title dapp author isbn 9791162241462 url https www hanbit co kr store books look php p code b4172582620 year 2019 publisher book the art of blockchain title the art of blockchain author isbn 9791187497202 url https jiandson co kr books 117 year 2019 publisher alex park 2018 medium castro m liskov b 1999 practical byzantine fault tolerance in osdi vol 99 pp 173 186 li j mann w 2018 initial coin offering and platform building itworld 2018 2018 392 2018 it chain 392 2018 392 contributors harrydrippin jb7959 mrchypark jinyounghwa mingrammer wooqii weyoui https www linkedin com in yunho0130 ibm data ai california autodevbot ios python sk c c kisti nrf dbpia apache zepplin python sdk qiskit disclaimer ibm etc this repository is consist of several experimental blockchain project and learning materials not related to ibm it s personal repository so i have no responsibility for using this code about all kind of problems such as technical security legal etc | blockchain solidity smart-contracts ico token hyperledger erc20 erc20-tokens | blockchain |
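the erc20 overflow / underflow and safemath notes in the row above are easy to model off chain: solidity before 0.8 wraps uint256 arithmetic mod 2**256, which is exactly what safemath style checks guard against. the sketch below mirrors those semantics in python; it is an illustration, not code from the repository.

```python
UINT256 = 2 ** 256

def unchecked_add(a, b):
    return (a + b) % UINT256              # wraps silently, like pre-0.8 Solidity

def safe_add(a, b):
    c = a + b
    if c >= UINT256:
        raise OverflowError("uint256 overflow")  # SafeMath-style revert
    return c

print(unchecked_add(UINT256 - 1, 2))      # -> 1: a silent wrap an attacker can abuse
```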
BlockChain | https github com chengqueducation blockchain blob master screenshots banner jpeg 2017 it bug github fork star https pan baidu com s 1v305zabuplqfegnvedxl a 1 0 01 day01 day01 00 https pan baidu com s 1ruzy2bnpnyp8rziqpufo7g day01 01 https pan baidu com s 1y5g7ikfliurxesrlvb2fkw day01 02 https pan baidu com s 1m0aajcqq8dqeddp tp8sug day01 03 https pan baidu com s 1fodqbx0xswqqe6ajyf rww day01 04 1 https pan baidu com s 1qfwotapl2 wxj1dw6cqdeq day01 05 2 https pan baidu com s 1nzx8 0rajeybsvoxhepl a day01 06 1 https pan baidu com s 1rwhgmflrlxsy qft7rpgng day01 07 2 https pan baidu com s 1i80bsrhc 2zcfj3qfrxddq 02 day02 day02 01 hash 1 https pan baidu com s 1smxjs4i6ue2nozzb9zpvkq day02 02 hash 2 https pan baidu com s 1cmis4pftykpwxk6d61xbda day02 03 rsa 1 https pan baidu com s 1xf1b1dsjlci rnef 9jfzw day02 04 rsa 2 https pan baidu com s 1ouoga 5yslx1ionr0w076g day02 05 https pan baidu com s 1uc7dmhdl0l5bqtqu2zrika day02 06 base64 https pan baidu com s 1pafvayoyn1xuhgx igzvsg day02 07 https pan baidu com s 1yufrldlzdkuhwe5hndjeiw day02 08 base58 secp256k1 https pan baidu com s 1rsj9u lefizdaxewklxkia day02 09 wif 16 https pan baidu com s 1omteaofr 8nkcs jn5ilnq day02 10 https pan baidu com s 12qekjtvcmtyfruwve5nlew day02 11 ecdsa 1 https pan baidu com s 1u0q9qxdr sqs 81lrxiqrg day02 12 ecdsa 2 secp256k1util https pan baidu com s 1jnih zheckplewbcri008g 03 day03 day03 01 1 https pan baidu com s 1uuu w29cjpr9yfac66amdg day03 02 2 https pan baidu com s 1qi ouadviwtsqmrlpgpkzw day03 03 java 1 https pan baidu com s 1cuagdgdkblybqauh3glwya day03 04 java 2 https pan baidu com s 1 4moozhmhloitgra68apq day03 05 hash https pan baidu com s 13qvdltaskoqmgbjxoucyaq 04 day04 day04 01 1 https pan baidu com s 19e5h08jrvzkpjkrwourxoa day04 02 2 https pan baidu com s 18qubd06tl0u ieh5d vtsw day04 03 bitcoincore 1 https pan baidu com s 1fnxroycy0fjjgdtx2lw ag day04 04 bitcoincore 2 https pan baidu com s 1la5vtan7e0ybtu5udgl2 q day04 05 bitcoin cli https pan baidu com s 1zbi zxhlypjo5ot ypodnw day04 06 bitcoind https pan baidu com s 1mfb1w64hii1erxj9keap9q day04 07 bitcoind rpc https pan baidu com s 1xas 1r749anbgx gvprrpw 05 day05 day05 01 https pan baidu com s 1ctxmnjrjfpdxt2cp4t76aq day05 02 https pan baidu com s 1gsetjs2lqrvt5gvqqsvdyg day05 03 1 https pan baidu com s 1 wqavkbnsw6mzm0edlszqw day05 04 2 p2pkh https pan baidu com s 1f4ohom03nzluyqxbxtvo5a day05 05 3 p2pk coinbase https pan baidu com s 10nswkugzfrozih55xjetmq 06 day06 day06 01 regtest https pan baidu com s 1o0k5t9nm1m7fehdchsg4aw day06 02 https pan baidu com s 1whw ozfeta ukmd4yh86lg day06 03 rpc https pan baidu com s 1sotnwyysxoubgdatlp0cda 2 0 01 https pan baidu com s 14chgg10cd5d0jsfgcsnqjg 02 v https pan baidu com s 1jvjtt8o316rp0nsljhq00w 03 01 https pan baidu com s 1siuaih1yhuzqm95ynhbfpg 04 02 https pan baidu com s 1rqpyzpzp5eapswix0ohqcw 05 03 https pan baidu com s 1n9utx8m7 yw0vsaeagj5dg 06 04 https pan baidu com s 1nbgxlspe3tf0xos1jmxcog 07 https pan baidu com s 1ezcjo5wlccjpokou vk3vq 08 rinkeby https pan baidu com s 10oas6dxyxpepofbem 73ca 09 metamask https pan baidu com s 1wvxuaaivyvmr0cqj 1jbkq 10 remix ide https pan baidu com s 1whpdsfzdtuucatvouefw1a 11 solidity https pan baidu com s 1oestqq8eyhiph5ra uaawg 12 https pan baidu com s 1lf2vzdcguewa8vb j2sreg 13 web3 js https pan baidu com s 1s1yc8jpqdhc vrsexninlq 14 https pan baidu com s 1flrfyihqdwrog11hskewtg 15 https pan baidu com s 17z8jigir jlb7qpptkmj5q 16 https pan baidu com s 1kerxp5txvusxlfnsz0cylw 17 https pan baidu com s 1rw3le3izr9lk sgkvf9erg 18 truffle 
01 https pan baidu com s 1az65ntr areti3no21dvag 19 truffle 01 nodejs https pan baidu com s 1fb4jdhtrnebclryrqubxca 20 truffle 03 https pan baidu com s 142spodxw0z29z225y2wkrw 21 truffle 04 https pan baidu com s 17d yosllh0ulaqoxlx20ea 22 truffle 04 2 https pan baidu com s 1 hsyk5rgtrh6hrce g0m4a 23 windows mist https pan baidu com s 11zgfkeuccdiqtf6cw h4zg 3 0 hyperledger fabric fabric fabric v1 0 fabric docker fabric go java | blockchain |
|
kinomore | kinomore front end https kinomore netlify com https kinomore netlify com react react hooks typescript redux toolkit rtk query sass css modules jest nextjs pwa react hook form yup testing library storybook api https kinopoisk dev documentation html https kinopoisk dev documentation html | jest redux-toolkit scss typescript nextjs cinema storybook | front_end |
LLM-Reasoning-Papers | h1 align center reasoning in large language models h1 p align center img src https awesome re badge svg href https github com atfortes llm reasoning papers img src https img shields io badge license mit green svg href https opensource org licenses mit img src https img shields io badge prs welcome red img src https img shields io github last commit atfortes llm reasoning papers color green p p align center b collection of papers and resources on how to unlock the reasoning ability of large language models b p p align center i also check out the a href https github com atfortes awesome multimodal reasoning awesome multimodal reasoning a collection i p p align center img src assets cot svg width 90 style align center p p align center large language models have revolutionized the nlp landscape showing improved performance and sample efficiency over smaller models however increasing model size alone has not proved sufficient for high performance on challenging reasoning tasks such as solving arithmetic or commonsense problems we present a collection of papers and resources on how to unlock these abilities p contents survey survey analysis analysis technique technique reasoning in large language models an emergent ability llm scaling smaller language models to reason lm benchmark benchmark other useful resources other useful resources other awesome lists other awesome lists contributing contributing survey 1 reasoning with language model prompting a survey acl 2023 shuofei qiao yixin ou ningyu zhang xiang chen yunzhi yao shumin deng chuanqi tan fei huang huajun chen paper https arxiv org abs 2212 09597 code https github com zjunlp prompt4reasoningpapers 2022 12 1 towards reasoning in large language models a survey acl 2023 findings jie huang kevin chen chuan chang paper https arxiv org abs 2212 10403 code https github com jeffhj lm reasoning 2022 12 analysis 1 can language models learn from explanations in context emnlp 2022 andrew k lampinen ishita dasgupta stephanie c y chan kory matthewson michael henry tessler antonia creswell james l mcclelland jane x wang felix hill paper https arxiv org abs 2204 02329 2022 4 1 emergent abilities of large language models tmlr 2022 jason wei yi tay rishi bommasani colin raffel barret zoph sebastian borgeaud dani yogatama maarten bosma denny zhou donald metzler ed h chi tatsunori hashimoto oriol vinyals percy liang jeff dean william fedus paper https arxiv org abs 2206 07682 blog https ai googleblog com 2022 11 characterizing emergent phenomena in html 2022 6 1 challenging big bench tasks and whether chain of thought can solve them preprint mirac suzgun nathan scales nathanael sch rli sebastian gehrmann yi tay hyung won chung aakanksha chowdhery quoc v le ed h chi denny zhou jason wei paper https arxiv org abs 2210 09261 code https github com suzgunmirac big bench hard 2022 10 1 towards understanding chain of thought prompting an empirical study of what matters acl 2023 boshi wang sewon min xiang deng jiaming shen you wu luke zettlemoyer huan sun paper https arxiv org abs 2212 10001 code https github com sunlab osu understanding cot 2022 12 1 on second thought let s not think step by step bias and toxicity in zero shot reasoning acl 2023 omar shaikh hongxin zhang william held michael bernstein diyi yang paper https arxiv org abs 2212 08061 2022 12 1 dissociating language and thought in large language models a cognitive perspective preprint kyle mahowald anna a ivanova idan a blank nancy kanwisher joshua b tenenbaum evelina fedorenko 
paper https arxiv org abs 2301 06627 2023 1 1 large language models can be easily distracted by irrelevant context icml 2023 freda shi xinyun chen kanishka misra nathan scales david dohan ed chi nathanael sch rli denny zhou paper https arxiv org abs 2302 00093 2023 1 1 a multitask multilingual multimodal evaluation of chatgpt on reasoning hallucination and interactivity preprint yejin bang samuel cahyawijaya nayeon lee wenliang dai dan su bryan wilie holy lovenia ziwei ji tiezheng yu willy chung quyet v do yan xu pascale fung paper https arxiv org abs 2302 04023 2023 2 1 language models don t always say what they think unfaithful explanations in chain of thought prompting preprint miles turpin julian michael ethan perez samuel r bowman paper https arxiv org abs 2305 04388 code https github com milesaturpin cot unfaithfulness 2023 5 1 faith and fate limits of transformers on compositionality preprint nouha dziri ximing lu melanie sclar xiang lorraine li liwei jiang bill yuchen lin peter west chandra bhagavatula ronan le bras jena d hwang soumya sanyal sean welleck xiang ren allyson ettinger zaid harchaoui yejin choi paper https arxiv org abs 2305 18654 2023 5 1 measuring faithfulness in chain of thought reasoning preprint tamera lanham anna chen ansh radhakrishnan benoit steiner carson denison danny hernandez dustin li esin durmus evan hubinger jackson kernion kamil luko i t karina nguyen newton cheng nicholas joseph nicholas schiefer oliver rausch robin larson sam mccandlish sandipan kundu saurav kadavath shannon yang thomas henighan timothy maxwell timothy telleen lawton tristan hume zac hatfield dodds jared kaplan jan brauner samuel r bowman ethan perez paper https arxiv org abs 2307 13702 2023 7 technique h3 id llm reasoning in large language models i an emergent ability i h3 1 chain of thought prompting elicits reasoning in large language models neurips 2022 jason wei xuezhi wang dale schuurmans maarten bosma brian ichter fei xia ed chi quoc le denny zhou paper https arxiv org abs 2201 11903 blog https ai googleblog com 2022 05 language models perform reasoning via html 2022 1 1 self consistency improves chain of thought reasoning in language models iclr 2023 xuezhi wang jason wei dale schuurmans quoc le ed chi sharan narang aakanksha chowdhery denny zhou paper https arxiv org abs 2203 11171 2022 3 1 iteratively prompt pre trained language models for chain of thought emnlp 2022 boshi wang xiang deng huan sun paper https arxiv org abs 2203 08383 code https github com sunlab osu iterprompt 1 least to most prompting enables complex reasoning in large language models iclr 2023 denny zhou nathanael sch rli le hou jason wei nathan scales xuezhi wang dale schuurmans claire cui olivier bousquet quoc le ed chi paper https arxiv org abs 2205 10625 2022 5 1 large language models are zero shot reasoners neurips 2022 takeshi kojima shixiang shane gu machel reid yutaka matsuo yusuke iwasawa paper https arxiv org abs 2205 11916 2022 5 1 making large language models better reasoners with step aware verifier preprint yifei li zeqi lin shizhuo zhang qiang fu bei chen jian guang lou weizhu chen paper https arxiv org abs 2206 02336 2022 6 1 large language models still can t plan neurips 2022 karthik valmeekam alberto olmo sarath sreedharan subbarao kambhampati paper https arxiv org abs 2206 10498 code https github com karthikv792 gpt plan benchmark 2022 6 1 solving quantitative reasoning problems with language models neurips 2022 aitor lewkowycz anders andreassen david dohan ethan dyer henryk michalewski 
vinay ramasesh ambrose slone cem anil imanol schlag theo gutman solo yuhuai wu behnam neyshabur guy gur ari vedant misra paper https arxiv org abs 2206 14858 blog https ai googleblog com 2022 06 minerva solving quantitative reasoning html 2022 6 1 rationale augmented ensembles in language models preprint xuezhi wang jason wei dale schuurmans quoc le ed chi denny zhou paper https arxiv org abs 2207 00747 2022 7 1 dynamic prompt learning via policy gradient for semi structured mathematical reasoning iclr 2023 pan lu liang qiu kai wei chang ying nian wu song chun zhu tanmay rajpurohit peter clark ashwin kalyan project https promptpg github io paper https arxiv org abs 2209 14610 code https github com lupantech promptpg 2022 9 1 ask me anything a simple strategy for prompting language models iclr 2023 simran arora avanika narayan mayee f chen laurel orr neel guha kush bhatia ines chami frederic sala christopher r paper https arxiv org abs 2210 02441 code https github com hazyresearch ama prompting 2022 10 1 language models are multilingual chain of thought reasoners iclr 2023 freda shi mirac suzgun markus freitag xuezhi wang suraj srivats soroush vosoughi hyung won chung yi tay sebastian ruder denny zhou dipanjan das jason wei paper https arxiv org abs 2210 03057 2022 10 1 measuring and narrowing the compositionality gap in language models preprint ofir press muru zhang sewon min ludwig schmidt noah a smith mike lewis paper https arxiv org abs 2210 03350 2022 10 1 automatic chain of thought prompting in large language models iclr 2023 zhuosheng zhang aston zhang mu li alex smola paper https arxiv org abs 2210 03493 code https github com amazon research auto cot 2022 10 1 react synergizing reasoning and acting in language models neurips 2022 workshop fmdm shunyu yao jeffrey zhao dian yu nan du izhak shafran karthik narasimhan yuan cao project https react lm github io paper https arxiv org abs 2210 03629 code https github com ysymyth react blog https ai googleblog com 2022 11 react synergizing reasoning and acting html 2022 10 1 reflection of thought inversely eliciting numerical reasoning in language models via solving linear systems preprint fan zhou haoyu dong qian liu zhoujun cheng shi han dongmei zhang paper https arxiv org abs 2210 05075 2022 10 1 mind s eye grounded language model reasoning through simulation iclr 2023 ruibo liu jason wei shixiang shane gu te yen wu soroush vosoughi claire cui denny zhou andrew m dai paper https arxiv org abs 2210 05359 2022 10 1 language models of code are few shot commonsense learners emnlp 2022 aman madaan shuyan zhou uri alon yiming yang graham neubig paper https arxiv org abs 2210 07128 code https github com madaan cocogen 2022 10 1 large language models can self improve preprint jiaxin huang shixiang shane gu le hou yuexin wu xuezhi wang hongkun yu jiawei han paper https arxiv org abs 2210 11610 2022 10 1 retrieval augmentation for commonsense reasoning a unified approach emnlp 2022 wenhao yu chenguang zhu zhihan zhang shuohang wang zhuosheng zhang yuwei fang meng jiang paper https arxiv org abs 2210 12887 code https github com wyu97 raco 2022 10 1 pal program aided language models icml 2023 luyu gao aman madaan shuyan zhou uri alon pengfei liu yiming yang jamie callan graham neubig project https reasonwithpal com paper https arxiv org abs 2211 10435 code https github com reasoning machines pal 2022 11 1 unsupervised explanation generation via correct instantiations aaai 2023 sijie cheng zhiyong wu jiangjie chen zhixing li yang liu lingpeng kong 
paper https arxiv org abs 2211 11160 2022 11 1 program of thoughts prompting disentangling computation from reasoning for numerical reasoning tasks preprint wenhu chen xueguang ma xinyi wang william w cohen paper https arxiv org abs 2211 12588 code https github com wenhuchen program of thoughts 2022 11 1 complementary explanations for effective in context learning acl 2023 findings xi ye srinivasan iyer asli celikyilmaz ves stoyanov greg durrett ramakanth pasunuru paper https arxiv org abs 2211 13892 2022 11 1 murmur modular multi step reasoning for semi structured data to text generation preprint swarnadeep saha xinyan velocity yu mohit bansal ramakanth pasunuru asli celikyilmaz paper https arxiv org abs 2212 08607 2022 12 1 can retriever augmented language models reason the blame game between the retriever and the language model preprint parishad behnamghader santiago miret siva reddy paper https arxiv org abs 2212 09146 code https github com mcgill nlp retriever lm reasoning 2022 12 1 large language models are reasoners with self verification preprint yixuan weng minjun zhu shizhu he kang liu jun zhao paper https arxiv org abs 2212 09561 code https github com wengsyx self verification 2022 12 1 interleaving retrieval with chain of thought reasoning for knowledge intensive multi step questions preprint harsh trivedi niranjan balasubramanian tushar khot ashish sabharwal paper https arxiv org abs 2212 10509 code https github com stonybrooknlp ircot 2022 12 1 language models as inductive reasoners preprint zonglin yang li dong xinya du hao cheng erik cambria xiaodong liu jianfeng gao furu wei paper https arxiv org abs 2212 10923 2022 12 1 lambada backward chaining for automated reasoning in natural language preprint seyed mehran kazemi najoung kim deepti bhatia xin xu deepak ramachandran paper https arxiv org abs 2212 13894 2022 12 1 rethinking with retrieval faithful large language model inference preprint hangfeng he hongming zhang dan roth paper https arxiv org abs 2301 00303 2023 1 1 specializing smaller language models towards multi step reasoning preprint yao fu hao peng litu ou ashish sabharwal tushar khot paper https arxiv org abs 2301 12726 2023 1 1 faithful chain of thought reasoning preprint qing lyu shreya havaldar adam stein li zhang delip rao eric wong marianna apidianaki chris callison burch paper https arxiv org abs 2301 13379 2023 1 1 large language models are versatile decomposers decompose evidence and questions for table based reasoning preprint yunhu ye binyuan hui min yang binhua li fei huang yongbin li paper https arxiv org abs 2301 13808 2023 1 1 synthetic prompting generating chain of thought demonstrations for large language models preprint zhihong shao yeyun gong yelong shen minlie huang nan duan weizhu chen paper https arxiv org abs 2302 00618 2023 2 1 multimodal chain of thought reasoning in language models preprint zhuosheng zhang aston zhang mu li hai zhao george karypis alex smola paper https arxiv org abs 2302 00923 code https github com amazon science mm cot 2023 2 1 active prompting with chain of thought for large language models preprint shizhe diao pengcheng wang yong lin tong zhang paper https arxiv org abs 2302 12246 code https github com shizhediao active cot 2023 2 1 automatic prompt augmentation and selection with chain of thought from labeled data preprint kashun shum shizhe diao tong zhang paper https arxiv org abs 2302 12822 code https github com shizhediao automate cot 2023 2 1 language is not all you need aligning perception with language 
models preprint shaohan huang li dong wenhui wang yaru hao saksham singhal shuming ma tengchao lv lei cui owais khan mohammed barun patra qiang liu kriti aggarwal zewen chi johan bjorck vishrav chaudhary subhojit som xia song furu wei paper https arxiv org abs 2302 14045 code https github com microsoft unilm 2023 2 1 art automatic multi step reasoning and tool use for large language models preprint bhargavi paranjape scott lundberg sameer singh hannaneh hajishirzi luke zettlemoyer marco tulio ribeiro paper https arxiv org abs 2303 09014 2023 3 1 refiner reasoning feedback on intermediate representations preprint debjit paul mete ismayilzada maxime peyrard beatriz borges antoine bosselut robert west boi faltings project https debjitpaul github io refiner paper https arxiv org abs 2304 01904 code https github com debjitpaul refiner 2023 4 1 tree of thoughts deliberate problem solving with large language models preprint shunyu yao dian yu jeffrey zhao izhak shafran thomas l griffiths yuan cao karthik narasimhan paper https arxiv org abs 2305 10601 code https github com ysymyth tree of thought llm 2023 5 1 reasoning implicit sentiment with chain of thought prompting acl 2023 hao fei bobo li qian liu lidong bing fei li tat seng chua paper https arxiv org abs 2305 11255 code https github com scofield7419 thor isa 2023 5 1 llms as factual reasoners insights from existing benchmarks and beyond preprint philippe laban wojciech kryściński divyansh agarwal alexander r fabbri caiming xiong shafiq joty chien sheng wu paper https arxiv org abs 2305 14540 2023 5 1 reasoning with language model is planning with world model preprint shibo hao yi gu haodi ma joshua jiahua hong zhen wang daisy zhe wang zhiting hu paper https arxiv org abs 2305 14992 2023 5 1 recursion of thought a divide and conquer approach to multi context reasoning with language models acl 2023 findings soochan lee gunhee kim paper https arxiv org abs 2306 06891 code https github com soochan lee rot poster https soochanlee com img rot rot poster pdf 2023 6 1 question decomposition improves the faithfulness of model generated reasoning preprint ansh radhakrishnan karina nguyen anna chen carol chen carson denison danny hernandez esin durmus evan hubinger jackson kernion kamilė lukošiūtė newton cheng nicholas joseph nicholas schiefer oliver rausch sam mccandlish sheer el showk tamera lanham tim maxwell venkatesa chandrasekaran zac hatfield dodds jared kaplan jan brauner samuel r bowman ethan perez paper https arxiv org abs 2307 11768 code https github com anthropics decompositionfaithfulnesspaper 2023 7 1 skeleton of thought large language models can do parallel decoding preprint xuefei ning zinan lin zixuan zhou huazhong yang yu wang paper https arxiv org abs 2307 15337 2023 7 1 skills in context prompting unlocking compositionality in large language models preprint jiaao chen xiaoman pan dian yu kaiqiang song xiaoyang wang dong yu jianshu chen paper https arxiv org abs 2308 00304 2023 8 h3 id lm scaling smaller language models to reason h3 1 scaling instruction finetuned language models preprint hyung won chung le hou shayne longpre barret zoph yi tay william fedus eric li xuezhi wang mostafa dehghani siddhartha brahma albert webson shixiang shane gu zhuyun dai mirac suzgun xinyun chen aakanksha chowdhery sharan narang gaurav mishra adams yu vincent zhao yanping huang andrew dai hongkun yu slav petrov ed h chi jeff dean jacob devlin adam roberts denny zhou quoc v le jason wei paper https arxiv org abs 2210 11416 2022 10 1 distilling multi
step reasoning capabilities of large language models into smaller models via semantic decompositions acl 2023 findings kumar shridhar alessandro stolfo mrinmaya sachan paper https arxiv org abs 2212 00193 2022 12 1 teaching small language models to reason preprint lucie charlotte magister jonathan mallinson jakub adamek eric malmi aliaksei severyn paper https arxiv org abs 2212 08410 2022 12 1 large language models are reasoning teachers acl 2023 namgyu ho laura schmid se young yun paper https arxiv org abs 2212 10071 code https github com itsnamgyu reasoning teacher 2022 12 1 specializing smaller language models towards multi step reasoning preprint yao fu hao peng litu ou ashish sabharwal tushar khot paper https arxiv org abs 2301 12726 2023 1 1 symbolic chain of thought distillation small models can also think step by step acl 2023 liunian harold li jack hessel youngjae yu xiang ren kai wei chang yejin choi paper https arxiv org abs 2306 14050 code https github com allenai cot distillation 2023 6 benchmark reasoning ability benchmarks arithmetic gsm8k https arxiv org abs 2110 14168 svamp https aclanthology org 2021 naacl main 168 asdiv https aclanthology org 2020 acl main 92 aqua https aclanthology org p17 1015 mawps https aclanthology org n16 1136 addsub https aclanthology org d14 1058 multiarith https aclanthology org d15 1202 singleeq https aclanthology org q15 1042 singleop https doi org 10 1162 tacl a 00118 lila https arxiv org abs 2210 17517 commonsense commonsenseqa https arxiv org abs 1811 00937 strategyqa https arxiv org abs 2101 02235 arc https arxiv org abs 1803 05457 boolq https arxiv org abs 1905 10044 hotpotqa https arxiv org abs 1809 09600 openbookqa https arxiv org abs 1809 02789 piqa https arxiv org abs 1911 11641 symbolic coinflip https arxiv org abs 2201 11903 lastletterconcatenation https arxiv org abs 2201 11903 reverselist https arxiv org abs 2201 11903v1 logical reclor https arxiv org abs 2002 04326 logiqa https arxiv org abs 2007 08124 proofwriter https arxiv org abs 2012 13048 other arb https arxiv org abs 2307 13692 big bench https doi org 10 48550 arxiv 2206 04615 agieval https arxiv org abs 2304 06364 alert https arxiv org abs 2212 08286 condaqa https arxiv org abs 2211 00295 scan https arxiv org abs 1711 00350 wikiwhy https arxiv org abs 2210 12152 note although there is no official version for the symbolic reasoning benchmarks you can generate your own here https github com atfortes datagenlm other useful resources llm reasoners https github com ber666 llm reasoners a library for advanced large language model reasoning chain of thought hub https github com franxyao chain of thought hub benchmarking llm reasoning performance with chain of thought prompting thoughtsource https github com openbiolink thoughtsource central and open resource for data and tools related to chain of thought reasoning in large language models cotever https github com seungonekim cotever chain of thought prompting annotation toolkit for explanation verification agentchain https github com jina ai agentchain chain together llms for reasoning orchestrate multiple large models for accomplishing complex tasks cascades https github com google research cascades python library which enables complex compositions of language models such as scratchpads chain of thought tool use selection inference and more logitorch https github com logitorch logitorch pytorch based library for logical reasoning on natural language promptify https github com promptslab promptify solve nlp problems with llm s 
easily generate different nlp task prompts for popular generative models like gpt palm and more minichain https github com srush minichain tiny library for large language models llamaindex https github com jerryjliu llama index provides a central interface to connect your llm s with external data easyinstruct https github com zjunlp easyinstruct easy to use package for instructing large language models llms like gpt 3 in research experiments other awesome lists awesome multimodal reasoning https github com atfortes awesome multimodal reasoning collection of papers and resources on multimodal reasoning including vision language models multimodal chain of thought visual inference and others chain of thoughtspapers https github com timothyxxx chain of thoughtspapers a trend starts from chain of thought prompting elicits reasoning in large language models lm reasoning https github com jeffhj lm reasoning collection of papers and resources on reasoning in large language models prompt4reasoningpapers https github com zjunlp prompt4reasoningpapers repository for the paper reasoning with language model prompting a survey reasoningnlp https github com freedomintelligence reasoningnlp paper list on reasoning in nlp instruction tuning papers https github com sinclaircoder instruction tuning papers reading list of instruction tuning deep reasoning papers https github com floodsung deep reasoning papers recent papers including neural symbolic reasoning logical reasoning visual reasoning planning and any other topics connecting deep learning and reasoning awesome llm https github com hannibal046 awesome llm curated list of large language model contributing add a new paper or update an existing paper thinking about which category the work should belong to use the same format as existing entries to describe the work add the abstract link of the paper abs format if it is an arxiv publication don t worry if you do something wrong it will be fixed for you contributors a href https github com atfortes lm reasoning papers graphs contributors img src https contrib rocks image repo atfortes lm reasoning papers a star history star history chart https api star history com svg repos atfortes llm reasoning papers type date https star history com atfortes llm reasoning papers date | commonsense-reasoning language-models reasoning symbolic-reasoning prompt-learning prompt logical-reasoning question-answering in-context-learning chatgpt datasets chain-of-thought prompt-engineering arithmetic-reasoning cot gpt3 instruction-tuning awesome lm | ai |
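A note in the benchmarks section of the list above says the symbolic reasoning tasks (coin flip, last-letter concatenation, reverse list) have no official release and are usually regenerated from templates. The following minimal Python sketch is an illustrative stand-in for that kind of generator, not the linked datagenlm tool; the name list and phrasing are invented for the example.

```python
# Illustrative generator for symbolic reasoning samples in the style of the
# last-letter-concatenation and coin-flip tasks; not the datagenlm tool.
import random

NAMES = ["alice", "bob", "carol", "dave"]  # invented word list for the example

def last_letter_concatenation(rng, n_words=2):
    words = rng.sample(NAMES, n_words)
    question = (f'Take the last letters of the words in "{" ".join(words)}" '
                "and concatenate them.")
    answer = "".join(w[-1] for w in words)  # gold label: last letters joined
    return {"question": question, "answer": answer}

def coin_flip(rng, n_people=3):
    state = True  # the coin starts heads up
    steps = []
    for name in rng.choices(NAMES, k=n_people):
        flips = rng.random() < 0.5
        state = state != flips  # a flip toggles the coin's state
        steps.append(f"{name} {'flips' if flips else 'does not flip'} the coin.")
    question = ("A coin is heads up. " + " ".join(steps) +
                " Is the coin still heads up?")
    return {"question": question, "answer": "yes" if state else "no"}

rng = random.Random(0)  # fixed seed so the samples are reproducible
print(last_letter_concatenation(rng))
print(coin_flip(rng))
```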
COSC2196 | cosc2196 introduction to information technology | server |
dwata | dwata understand monitor or manage business data without sql or code track user growth find sales insights or bottlenecks share kpis without engineering home screen docs assets home view v3 png raw true home screen what is dwata complete software for business insights and operations query complex data without any sql or code share business insights with entire team onboard team members easily to your business process keep track of kpis from any data source compare weekly monthly quarterly growth directly integrates with stripe mailchimp paypal shippo etc build live reports and dashboards for customers suppliers automatically infer reflect database structure automatically infer database structure docs assets screencast db reflect db reflect gif raw true inferring database structure uses db reflection from sqlalchemy modern user experience grid showing columns docs assets grid view v4 png raw true grid showing columns show joined tables in a nice way multi table merge 1 1 1 m relations merge related data docs assets grid merge view v4 png raw true merge related data there is no sql to write everything is visual automatically generates sql join sub query grouping aggregates etc generates sql for join or sub query docs assets generated sql v4 png raw true generates sql for join or sub query rich set of ui elements boolean and large text fields docs assets detail view v4 png raw true boolean and large text fields why would you use dwata created with sales marketing analyst or any non developer in mind no need to invest in django laravel rails express based admin complete auto pilot software does not need developers help to operate enable and encourage everyone to query business insights and share building an admin application is an undifferentiated investment that does not add direct value to your customers some more details on what it does and how it works dwata scans mysql postgresql sqlite or mongodb coming soon automatically understands data schema tables columns relations builds dynamic ui automatically with grids widgets column view etc visually allows you to merge related data generates sql for you visually apply aggregates averages grouping and many other functions coming soon can connect directly with stripe paypal mailchimp etc coming soon has widgets for everything from numbers booleans large text dates timestamps geographic data with maps coming soon full text search coming soon internal cache so queries are not repeated unnecessarily dwata is in early development stage | admin mysql mariadb postgresql language-agnostic framework-agnostic self-hosted b2b enterprise team-work react workflows stripe dashboards zustand | server |
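The dwata README above says its schema inference uses database reflection from SQLAlchemy. As a minimal sketch of what that mechanism looks like in plain SQLAlchemy (the connection string is a placeholder; this is not dwata's actual code):

```python
# Minimal sketch of SQLAlchemy database reflection, the mechanism dwata's
# schema inference is described as using; the DSN below is a placeholder.
from sqlalchemy import MetaData, create_engine

engine = create_engine("postgresql+psycopg2://user:secret@localhost/shop")
metadata = MetaData()
metadata.reflect(bind=engine)  # reads tables, columns and constraints from the DB

for table in metadata.sorted_tables:  # tables sorted by foreign-key dependency
    print(f"table: {table.name}")
    for column in table.columns:
        fks = [fk.target_fullname for fk in column.foreign_keys]
        print(f"  {column.name}: {column.type}", f"-> {fks}" if fks else "")
```

From the reflected metadata alone, a tool like this can discover table names, column types, and 1:1 / 1:m relations, which is what enables building grids and merged views without hand-written SQL.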
IoT_Sentinel | iot sentinel this program is an implementation of iot sentinel https arxiv org pdf 1611 04880 pdf device fingerprinting it takes as input pcaps and tests each packet against 23 features link layer protocol 2 arp llc network layer protocol 4 ip icmp icmpv6 eapol transport layer protocol 2 tcp udp application layer protocol 8 http https dhcp bootp ssdp dns mdns ntp ip options 2 padding routeralert packet content 2 size int raw data ip address 1 destination ip counter int port class 2 source int destination int usage iot fingerprint py -d inputdir or -i inputpcap -l label and -o outputdir example iot fingerprint py -d captures iot sentinel captures iot sentinel -o csv result full | server |
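The README above describes a per-packet binary feature vector built from protocol-presence checks plus packet size. As a rough illustration of how a subset of those 23 features could be computed, here is a sketch using scapy; scapy is an assumption (the repository's actual parser may differ), and capture.pcap is a placeholder input file.

```python
# Sketch of per-packet feature extraction in the spirit of IoT Sentinel.
# Covers only a subset of the 23 features; layer names follow scapy.
from scapy.all import rdpcap, ARP, IP, ICMP, TCP, UDP, DNS, BOOTP, Padding, Raw

def packet_features(pkt):
    """Return a partial feature vector for one packet (illustrative subset)."""
    return [
        int(pkt.haslayer(ARP)),      # link layer: ARP present
        int(pkt.haslayer(IP)),       # network layer: IP
        int(pkt.haslayer(ICMP)),     # network layer: ICMP
        int(pkt.haslayer(TCP)),      # transport layer: TCP
        int(pkt.haslayer(UDP)),      # transport layer: UDP
        int(pkt.haslayer(DNS)),      # application layer: DNS
        int(pkt.haslayer(BOOTP)),    # application layer: DHCP/BOOTP
        int(pkt.haslayer(Padding)),  # IP options: padding
        int(pkt.haslayer(Raw)),      # packet content: raw payload present
        len(pkt),                    # packet content: size
    ]

packets = rdpcap("capture.pcap")  # placeholder pcap path
vectors = [packet_features(p) for p in packets]
print(f"extracted {len(vectors)} feature vectors")
```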
FCJ2020_Fingerprint_Database | fcj2020 fingerprint database fcj2020 fingerprint database fcj2020 is a collection of fingerprint images created by the contribution of the students of the computer science and engineering department in jatiya kabi kazi nazrul islam university bangladesh in 2019 2020 all the fingerprint images were compiled and analyzed by the principal investigator prof dr md mijanur rahman fcj2020 database size of database 1200 db1 1st impression of four fingers of 50 persons 50x4 200 db2 2nd impression of four fingers of 50 persons 50x4 200 db3 3rd impression of four fingers of 50 persons 50x4 200 db4 4th impression of four fingers of 50 persons 50x4 200 db5 5th impression of four fingers of 50 persons 50x4 200 db6 6th impression of four fingers of 50 persons 50x4 200 use the database only for research and education purposes fingerprint acquisition tools accuracy analysis programs this folder contains fingerprint acquisition documentation tools and matlab codes for fingerprint accuracy analysis 50 student s fingerprints contact prof dr md mijanur rahman department of computer science and engineering jatiya kabi kazi nazrul islam university trishal mymensingh bangladesh email mijan jkkniu edu bd citation md mijanur rahman and tanjarul islam mishu fcj2020 fingerprint database department of computer science and engineering jatiya kabi kazi nazrul islam university github repository 2020 https github com mijancse fcj2020 fingerprint database rahman md mijanur and tanjarul islam mishu fcj2020 generating fingerprint templates with image processing and verification 2021 international conference on computing sciences iccs ieee 2021 | server |
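The FCJ2020 sizes above follow directly from the stated collection protocol; a quick check of the arithmetic, using only the counts given in the README:

```python
# Sanity check of the published FCJ2020 layout: six impression sets (DB1-DB6),
# each holding one impression of four fingers from 50 persons.
persons, fingers, impressions = 50, 4, 6
images_per_db = persons * fingers           # 50 x 4 = 200 images per DB
total_images = images_per_db * impressions  # 200 x 6 = 1200 images overall
assert (images_per_db, total_images) == (200, 1200)
```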