---
## BART (Bidirectional and Auto-Regressive Transformer) Architecture
![row01](assets/Bart.png)

BART is trained primarily to reconstruct clean, semantically coherent text from corrupted input, but it can also be applied to a variety of NLP tasks such as machine translation, question answering, text summarization, and paraphrasing.

Because BART is a denoising autoencoder, it consists of an encoder and a decoder. For its encoder, BART uses the bidirectional encoder introduced in BERT, and for its decoder it uses the autoregressive decoder that forms the core of GPT.
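This encoder/decoder split is visible directly on the model object. Below is a sketch using a deliberately tiny, randomly initialized configuration (the layer counts and sizes are illustrative only, not BART's published hyperparameters), so nothing needs to be downloaded:

```python
# Sketch: BART's seq2seq layout — a BERT-style bidirectional encoder
# paired with a GPT-style autoregressive decoder.
# The tiny config below is illustrative only (random weights).
from transformers import BartConfig, BartModel

config = BartConfig(
    vocab_size=1000,
    d_model=64,
    encoder_layers=2,
    decoder_layers=2,
    encoder_attention_heads=2,
    decoder_attention_heads=2,
    encoder_ffn_dim=128,
    decoder_ffn_dim=128,
)
model = BartModel(config)

# One bidirectional encoder stack and one autoregressive decoder stack.
print(len(model.encoder.layers), len(model.decoder.layers))  # prints "2 2"
```

Loading a pretrained checkpoint instead of a hand-built config yields the same two-stack structure, just with the published layer counts and learned weights.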