---
datasets:
- albertvillanova/legal_contracts
---

# bert-tiny-finetuned-legal-contracts-longer

This model is a fine-tuned version of [google/bert_uncased_L-4_H-512_A-8](https://huggingface.co/google/bert_uncased_L-4_H-512_A-8) on a portion of the legal_contracts dataset for 1 epoch.
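
Once published to the Hub, the model can be used for masked-token prediction with the `transformers` pipeline. A minimal sketch, assuming a hypothetical repo id (replace it with this model's actual Hub path):

```python
from transformers import pipeline

# Hypothetical repo id for illustration; substitute the actual Hub path of this model.
fill_mask = pipeline("fill-mask", model="<user>/bert-tiny-finetuned-legal-contracts-longer")

# BERT-style models use the [MASK] token.
print(fill_mask("This Agreement shall be governed by the laws of [MASK]."))
```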

# Note
The model was not trained on the whole dataset, which is around 9.5 GB, but only on the first 20% of `train`, with the last 5% of `train` held out for validation:

```python
from datasets import load_dataset

# First 20% of `train` for training, last 5% of `train` for validation
datasets_train = load_dataset('albertvillanova/legal_contracts', split='train[:20%]')
datasets_validation = load_dataset('albertvillanova/legal_contracts', split='train[-5%:]')
```
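
For reference, a one-epoch masked-language-modeling run over these splits could look roughly like the sketch below. The exact hyperparameters used for this model are not documented here, and the `text` column name is an assumption about the dataset schema.

```python
from transformers import (AutoTokenizer, AutoModelForMaskedLM,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

base = "google/bert_uncased_L-4_H-512_A-8"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForMaskedLM.from_pretrained(base)

def tokenize(batch):
    # Assumes the dataset exposes a "text" column; adjust if the schema differs.
    return tokenizer(batch["text"], truncation=True, max_length=512)

train_ds = datasets_train.map(tokenize, batched=True,
                              remove_columns=datasets_train.column_names)
eval_ds = datasets_validation.map(tokenize, batched=True,
                                  remove_columns=datasets_validation.column_names)

# Standard MLM objective: 15% of tokens are masked at random.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="bert-tiny-finetuned-legal-contracts-longer",
        num_train_epochs=1,
    ),
    train_dataset=train_ds,
    eval_dataset=eval_ds,
    data_collator=collator,
)
trainer.train()
```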