Divyasreepat committed · verified
Commit 8d65b50 · 1 Parent(s): e5ad571

Update README.md with new model card content

Files changed (1): README.md (+25 -0)

README.md CHANGED
@@ -26,6 +26,31 @@ warranties or conditions of any kind. The underlying model is provided by a
 third party and subject to a separate license, available
 [here](https://github.com/facebookresearch/fairseq).
 
+## Links
+
+* [RoBERTa Quickstart Notebook](https://www.kaggle.com/code/laxmareddypatlolla/roberta-quickstart-notebook)
+* [RoBERTa API Documentation](https://keras.io/keras_hub/api/models/roberta/)
+* [KerasHub Beginner Guide](https://keras.io/guides/keras_hub/getting_started/)
+* [KerasHub Model Publishing Guide](https://keras.io/guides/keras_hub/upload/)
+
+## Installation
+
+Keras and KerasHub can be installed with:
+
+```
+pip install -U -q keras-hub
+pip install -U -q keras
+```
+
+JAX, TensorFlow, and Torch come preinstalled in Kaggle Notebooks. For instructions on installing them in another environment, see the [Keras Getting Started](https://keras.io/getting_started/) page.
+
+## Presets
+
+The following model checkpoints are provided by the Keras team. Full code examples for each are available below.
+| Preset name      | Parameters | Description |
+|------------------|------------|-------------|
+| roberta_base_en  | 124.05M    | 12-layer RoBERTa model where case is maintained. Trained on English Wikipedia, BooksCorpus, CommonCrawl, and OpenWebText. |
+| roberta_large_en | 354.31M    | 24-layer RoBERTa model where case is maintained. Trained on English Wikipedia, BooksCorpus, CommonCrawl, and OpenWebText. |
 
 __Arguments__
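The presets added in this diff are loaded through the KerasHub API linked in the Links section. A minimal sketch, assuming `keras_hub` is installed per the Installation section and preset weights can be downloaded; the `num_classes=2` task head and the sample sentence are illustrative choices, not part of the card:

```python
# Sketch of loading one of the presets from the table above.
# Guarded so the example degrades gracefully when keras_hub or
# the preset weights are unavailable in the current environment.
try:
    import keras_hub

    # Load the 124.05M-parameter base preset with a 2-class head
    # (downloads weights on first use).
    classifier = keras_hub.models.RobertaTextClassifier.from_preset(
        "roberta_base_en", num_classes=2
    )
    predictions = classifier.predict(["What an amazing movie!"])
except Exception as exc:  # keras_hub missing or weights not reachable
    classifier = None
    print(f"Skipping example: {exc}")
```

The same call with `"roberta_large_en"` loads the 24-layer preset; the Kaggle quickstart notebook linked above walks through fine-tuning from this starting point.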