---
license: cc0-1.0
---

A Reber sequence is a string generated by a finite-state grammar; in simple terms, it is formed from a confined set of characters according to a fixed set of transition rules. In the research paper that proposed the [LSTM](https://dl.acm.org/doi/10.1162/neco.1997.9.8.1735), the authors use the embedded Reber grammar due to its short time lags.
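For readers unfamiliar with the grammar, here is a minimal sketch (not the dataset's own generation code) of how standard and embedded Reber strings can be produced. The state numbering and the convention of keeping the inner string's B/E markers in the embedded variant are assumptions of this illustration.

```python
import random

# Transition table of the standard Reber grammar: each state maps to its
# possible (symbol, next_state) pairs. State 0 is entered right after the
# initial 'B'; reaching state 5 means the final symbol 'E' is emitted.
REBER_TRANSITIONS = {
    0: [("T", 1), ("P", 2)],
    1: [("S", 1), ("X", 3)],
    2: [("T", 2), ("V", 4)],
    3: [("X", 2), ("S", 5)],
    4: [("P", 3), ("V", 5)],
}


def make_reber(rng: random.Random) -> str:
    """Generate one standard Reber string, e.g. 'BTSSXSE' or 'BPVVE'."""
    state, symbols = 0, ["B"]
    while state != 5:
        symbol, state = rng.choice(REBER_TRANSITIONS[state])
        symbols.append(symbol)
    symbols.append("E")
    return "".join(symbols)


def make_embedded_reber(rng: random.Random) -> str:
    """Generate one embedded Reber string: B, T or P, a Reber string, the same T or P, E."""
    wrapper = rng.choice(["T", "P"])
    return "B" + wrapper + make_reber(rng) + wrapper + "E"


if __name__ == "__main__":
    rng = random.Random(42)
    for _ in range(3):
        print(make_reber(rng), make_embedded_reber(rng))
```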
[Here](https://drive.google.com/file/d/1ZFcCZsmC_2zMGuDG4X8GHYxKDpErfb4a/view) is the chapter from my Master's thesis that briefly explains the dataset and also includes various plots of its statistics.

**Data visualizations are available at [about_reber](https://www.kaggle.com/harshildarji/about-reber).**

**Master's thesis**: [Investigating Sparsity in Recurrent Neural Networks](https://arxiv.org/abs/2407.20601)

```bibtex
@article{darji2024investigating,
  title={Investigating Sparsity in Recurrent Neural Networks},
  author={Darji, Harshil},
  journal={arXiv preprint arXiv:2407.20601},
  year={2024}
}
```