---
license: mit
tags:
- legal
---

# Lawma 8B

Lawma 8B is a fine-tune of Llama 3 8B Instruct on 260 legal classification tasks derived from the [Supreme Court](http://scdb.wustl.edu/data.php) and [Songer Court of Appeals](https://www.songerproject.org/us-courts-of-appeals-databases.html) databases. Lawma was fine-tuned on over 500k task examples, totalling 2B tokens. As a result, Lawma 8B outperforms GPT-4 on 95% of these legal classification tasks, by over 17 accuracy points on average. See our [arXiv preprint](https://arxiv.org/abs/2407.16615) and [GitHub repository](https://github.com/socialfoundations/lawma) for more details.
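
For convenience, here is a minimal inference sketch using the Hugging Face `transformers` library. The Hub model ID and the example prompt are illustrative assumptions, not official usage instructions.

```python
# Minimal usage sketch (not official inference code). The Hub model ID
# "ricdomolm/lawma-8b" is an assumption; adjust it to this repository's ID.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ricdomolm/lawma-8b"  # hypothetical Hub ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# An MMLU-style multiple-choice prompt; <opinion text> is a placeholder.
prompt = (
    "What is the disposition of the court in the following opinion?\n\n"
    "<opinion text>\n\n"
    "A. affirmed\n"
    "B. reversed\n"
    "C. vacated and remanded\n"
    "Answer:"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=1)
print(tokenizer.decode(output[0, inputs["input_ids"].shape[1]:]))
```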

## Evaluations

We report mean classification accuracy across the 260 legal classification tasks that we consider. We use the standard MMLU multiple-choice prompt and evaluate models zero-shot. You can find our evaluation code [here](https://github.com/socialfoundations/lawma/tree/main/evaluation).
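
For illustration, the sketch below builds a prompt in this style. The exact template and answer-scoring logic live in the evaluation repository linked above and may differ in detail; the example question and choices are drawn from the Supreme Court database's decision-direction variable.

```python
# Sketch of an MMLU-style zero-shot multiple-choice prompt builder
# (the exact template used by our evaluation code may differ).
def mmlu_prompt(question: str, choices: list[str]) -> str:
    letters = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
    lines = [question]
    lines += [f"{letters[i]}. {choice}" for i, choice in enumerate(choices)]
    lines.append("Answer:")
    return "\n".join(lines)

print(mmlu_prompt(
    "What is the ideological direction of the decision?",
    ["liberal", "conservative", "unspecifiable"],
))
```

Accuracy is then typically computed by comparing the answer letter to which the model assigns the highest probability against the gold label.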

| Model | All tasks | Supreme Court tasks | Court of Appeals tasks |
|---------|:---------:|:-------------:|:----------------:|
| Lawma 70B | **81.9** | **84.1** | **81.5** |
| Lawma 8B | 80.3 | 82.4 | 79.9 |
| GPT-4 | 62.9 | 59.8 | 63.4 |
| Llama 3 70B Inst | 58.4 | 47.1 | 60.3 |
| Mixtral 8x7B Inst | 43.2 | 24.4 | 46.4 |
| Llama 3 8B Inst | 42.6 | 32.8 | 44.2 |
| Majority classifier | 41.7 | 31.5 | 43.5 |
| Mistral 7B Inst | 39.9 | 19.5 | 43.4 |
| Saul 7B Inst | 34.4 | 20.2 | 36.8 |
| LegalBERT | 24.6 | 13.6 | 26.4 |

## FAQ

**What are the Lawma models useful for?** We recommend using the Lawma models only for legal classification tasks closely resembling those in the Supreme Court and Court of Appeals databases. The main takeaway of our paper is that fine-tuning on the specific tasks of interest leads to large improvements in performance. We therefore strongly recommend further fine-tuning Lawma on the tasks that you wish to use the model for; relatively few examples (dozens or hundreds) may already lead to large gains in performance. Since Lawma was fine-tuned on a diverse set of legal classification tasks, fine-tuning Lawma may yield better results than fine-tuning more general instruction-tuned models. A rough fine-tuning sketch is shown below.
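
The following is a minimal sketch of such further fine-tuning with the Hugging Face `Trainer`. The Hub ID, data format, and hyperparameters are placeholder assumptions; in practice, parameter-efficient methods such as LoRA and careful prompt formatting are likely preferable to the naive full fine-tune shown here.

```python
# Minimal fine-tuning sketch (an illustration, not the recipe used to train
# Lawma). Replace the Hub ID and the toy dataset with your own task data.
from datasets import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_id = "ricdomolm/lawma-8b"  # hypothetical Hub ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
tokenizer.pad_token = tokenizer.eos_token  # Llama tokenizers lack a pad token
model = AutoModelForCausalLM.from_pretrained(model_id)

# A few dozen labelled examples, each formatted as prompt plus gold answer.
examples = [{"text": "Did the court rule in favor of the appellant?\n"
                     "A. yes\nB. no\nAnswer: A"}]  # replace with your data
dataset = Dataset.from_list(examples).map(
    lambda ex: tokenizer(ex["text"], truncation=True), remove_columns=["text"]
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="lawma-finetuned", num_train_epochs=3),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```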

**What legal classification tasks do we consider?** We consider almost all of the variables of the [Supreme Court](http://scdb.wustl.edu/data.php) and [Songer Court of Appeals](https://www.songerproject.org/us-courts-of-appeals-databases.html) databases. Our reasons for studying these legal classification tasks are both technical and substantive. From a technical machine learning perspective, these tasks provide highly non-trivial classification problems where even the best models leave much room for improvement. From a substantive legal perspective, efficient solutions to such classification problems have rich and important applications in legal research.

## Citation

This model was trained for the project:

*Lawma: The Power of Specialization for Legal Tasks.* Ricardo Dominguez-Olmedo, Vedant Nanda, Rediet Abebe, Stefan Bechtold, Christoph Engel, Jens Frankenreiter, Krishna Gummadi, Moritz Hardt, and Michael Livermore. 2024.

Please cite as:

```
@misc{dominguezolmedo2024lawmapowerspecializationlegal,
      title={Lawma: The Power of Specialization for Legal Tasks},
      author={Ricardo Dominguez-Olmedo and Vedant Nanda and Rediet Abebe and Stefan Bechtold and Christoph Engel and Jens Frankenreiter and Krishna Gummadi and Moritz Hardt and Michael Livermore},
      year={2024},
      eprint={2407.16615},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2407.16615},
}
```