---
language:
- ko
tags:
- Transformer
- korean
- romanization
- person name
- 한국어
license: cc-by-nc-sa-4.0
---
# <font color="IndianRed"> Kraft (Korean Romanization From Transformer) </font>
[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/drive/1aIGvyvqRdHv7QTRahhD1sf8L6yV39kxc?usp=sharing/)
The Kraft (Korean Romanization From Transformer) model translates the Hangul characters of a Korean person name into the Roman alphabet, following the [McCune–Reischauer system](https://en.wikipedia.org/wiki/McCune%E2%80%93Reischauer). Kraft is built on the Transformer, a neural network architecture introduced in the 2017 paper "Attention Is All You Need" by Google researchers and designed for sequence-to-sequence tasks such as machine translation, language modeling, and summarization.
Romanizing a Korean name is a machine translation task: the input is a sequence of Hangul characters representing the name, and the output is the corresponding sequence of Latin characters. With its attention mechanism and its ability to handle input sequences of varying lengths, the Transformer is well suited to this task and translates Korean names into their romanized forms accurately.
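To make this framing concrete, here is a minimal illustration (not the model's actual preprocessing code) of how a name and its romanization form the source and target character sequences; "Yi Sunsin" is the standard McCune–Reischauer romanization of 이순신.
```python
# Illustrative task framing only (not the authors' preprocessing code).
# The source is a sequence of Hangul characters; the target is the
# corresponding sequence of Latin characters.
src = list("이순신")      # encoder input:  ['이', '순', '신']
tgt = list("Yi Sunsin")   # decoder target: ['Y', 'i', ' ', 'S', 'u', 'n', 's', 'i', 'n']
print("".join(src), "->", "".join(tgt))
```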
## <font color="IndianRed"> Model description </font>
The Transformer model consists of an encoder and a decoder: the encoder reads the input sequence in the source language (here, a name in Hangul), and the decoder generates the corresponding sequence in the target language (its McCune–Reischauer romanization).
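As an illustration only, the following PyTorch sketch shows what such a character-level encoder–decoder looks like. The vocabulary sizes, dimensions, and layer counts are placeholder assumptions, not Kraft's actual configuration; the linked Colab notebook demonstrates the released model itself.
```python
# A minimal, self-contained PyTorch sketch of a character-level
# encoder–decoder Transformer for Hangul-to-Roman transliteration.
# All hyperparameters are placeholder assumptions. Positional encodings
# are omitted for brevity; a real model would add them to both embeddings.
import torch
import torch.nn as nn

class CharTransformer(nn.Module):
    def __init__(self, src_vocab, tgt_vocab, d_model=128, nhead=4, layers=2):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, d_model)
        self.tgt_emb = nn.Embedding(tgt_vocab, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=layers, num_decoder_layers=layers,
            batch_first=True,
        )
        self.out = nn.Linear(d_model, tgt_vocab)

    def forward(self, src, tgt):
        # Causal mask: each output character attends only to earlier ones.
        mask = self.transformer.generate_square_subsequent_mask(tgt.size(1))
        h = self.transformer(self.src_emb(src), self.tgt_emb(tgt), tgt_mask=mask)
        return self.out(h)  # (batch, tgt_len, tgt_vocab) logits

# Shape check: a batch of 2 names, 3 Hangul characters in, 9 Latin characters out.
model = CharTransformer(src_vocab=2000, tgt_vocab=60)
logits = model(torch.randint(0, 2000, (2, 3)), torch.randint(0, 60, (2, 9)))
print(logits.shape)  # torch.Size([2, 9, 60])
```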
## <font color="IndianRed"> Intended uses & limitations </font>
Note that this model is intended primarily for translating Korean person names into their McCune–Reischauer romanizations; it is not a general-purpose Korean-to-English translation model.
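For completeness, here is a hedged sketch of how inference could proceed with the toy `CharTransformer` above: the encoder reads the Hangul name once, and Latin characters are emitted greedily, one at a time, until an end-of-sequence token. The special token ids (BOS=1, EOS=2) are illustrative assumptions, not the model's actual vocabulary.
```python
# Greedy, character-by-character decoding with the toy model above.
# BOS/EOS ids are illustrative assumptions.
import torch

def romanize(model, src_ids, bos=1, eos=2, max_len=30):
    model.eval()
    tgt = torch.tensor([[bos]])
    with torch.no_grad():
        for _ in range(max_len):
            # Take the most likely next character given what has been emitted.
            next_id = model(src_ids, tgt)[:, -1].argmax(-1, keepdim=True)
            tgt = torch.cat([tgt, next_id], dim=1)
            if next_id.item() == eos:
                break
    return tgt[0, 1:]  # generated character ids, BOS dropped
```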
## <font color="IndianRed"> Authors </font>
<a href="https://www.w3.org/">Queenie Luo</a>
<br>
<a href="https://www.iq.harvard.edu/people/yafei-chen">Yafei Chen</a>
<br>
<a href="https://github.com/sudoghut">Hongsu Wang</a>
<br>
<a href="https://ealc.fas.harvard.edu/people/kanghun-ahn">Kanghun Ahn</a>
<br>
<a href="https://ealc.fas.harvard.edu/people/sun-joo-kim">Sun Joo Kim</a>
<br>
<a href="https://ealc.fas.harvard.edu/people/peter-k-bol">Peter Bol</a>
<br>
<a href="https://projects.iq.harvard.edu/cbdb/home">CBDB Group</a>
## <font color="IndianRed"> Acknowledgement </font>
<a href="https://library.harvard.edu/staff/mikyung-kang">Mikyung Kang</a>
<br>
<a href="https://library.princeton.edu/staff/hyoungl">Hyoungbae Lee</a>
<br>
Shirley Boya Ouyang
## <font color="IndianRed"> License </font>
Copyright (c) 2023 CBDB
Except where otherwise noted, content on this repository is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License (CC BY-NC-SA 4.0).
To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-sa/4.0/ or
send a letter to Creative Commons, PO Box 1866, Mountain View, CA 94042, USA.