---
license: apache-2.0
language:
- en
---

# MedM-VL-CT-3B-en

## Introduction

A medical LVLM trained on **English** data. It accepts text and **a single 3D CT volume** as input and produces text as output, enabling tasks such as **report generation** and **medical VQA**.

Here are the evaluation results on **M3D-Bench**:

<table>
<tr>
<td rowspan="2"> Method </td>
<td align="center" colspan="4"> Report Generation </td>
<td align="center" colspan="5"> Medical VQA </td>
</tr>
<tr align="center">
<td> BLEU </td>
<td> ROUGE </td>
<td> METEOR </td>
<td> BERT-Score </td>
<td> Accuracy </td>
<td> BLEU </td>
<td> ROUGE </td>
<td> METEOR </td>
<td> BERT-Score </td>
</tr>
<tr>
<td> RadFM </td>
<td align="center"> 12.23 </td>
<td align="center"> 16.49 </td>
<td align="center"> 11.57 </td>
<td align="center"> 87.93 </td>
<td align="center"> 19.79 </td>
<td align="center"> 16.39 </td>
<td align="center"> 26.13 </td>
<td align="center"> 21.33 </td>
<td align="center"> 88.72 </td>
</tr>
<tr>
<td> M3D-LaMed </td>
<td align="center"> 15.15 </td>
<td align="center"> 19.55 </td>
<td align="center"> 14.38 </td>
<td align="center"> 88.46 </td>
<td align="center"> 75.78 </td>
<td align="center"> 49.38 </td>
<td align="center"> 52.39 </td>
<td align="center"> 33.58 </td>
<td align="center"> 91.53 </td>
</tr>
<tr>
<td> MedM-VL-CT-3B-en </td>
<td align="center"> <b>49.81</b> </td>
<td align="center"> <b>52.45</b> </td>
<td align="center"> <b>49.27</b> </td>
<td align="center"> <b>90.38</b> </td>
<td align="center"> <b>80.12</b> </td>
<td align="center"> <b>56.56</b> </td>
<td align="center"> <b>59.96</b> </td>
<td align="center"> <b>39.75</b> </td>
<td align="center"> <b>92.85</b> </td>
</tr>
</table>

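To make the table's column families concrete: M3D-Bench uses standard implementations of these metrics, but the minimal stand-ins below show what each score family measures — exact-match accuracy for closed-ended VQA, and n-gram overlap (here, BLEU-1-style unigram precision, without brevity penalty) for generated text. They are illustrative simplifications, not the benchmark's actual scoring code.

```python
from collections import Counter

def vqa_accuracy(predictions, answers):
    """Percentage of closed-ended VQA answers predicted exactly right (case-insensitive)."""
    correct = sum(p.strip().lower() == a.strip().lower()
                  for p, a in zip(predictions, answers))
    return 100.0 * correct / len(answers)

def unigram_precision(candidate, reference):
    """BLEU-1-style clipped unigram precision of a generated report (simplified)."""
    cand, ref = Counter(candidate.split()), Counter(reference.split())
    overlap = sum(min(n, ref[w]) for w, n in cand.items())
    return 100.0 * overlap / max(sum(cand.values()), 1)

print(vqa_accuracy(["Yes", "no"], ["yes", "Yes"]))  # 50.0
print(unigram_precision("mild pleural effusion", "small pleural effusion"))
```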
## Quickstart

Please refer to [MedM-VL](https://github.com/MSIIP/MedM-VL).
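The model's actual loading and inference code lives in the MedM-VL repository linked above. As a rough illustration of the kind of single-volume preparation a 3D CT input typically needs before reaching a model, the sketch below clips Hounsfield units to a window and rescales to [0, 1]; the window values, target shape, and the `prepare_ct` helper are common-practice assumptions for illustration, not the repository's confirmed pipeline.

```python
import numpy as np

def prepare_ct(volume_hu, hu_min=-1000.0, hu_max=1000.0):
    """Clip a 3D CT volume (in Hounsfield units) to a window and scale to [0, 1].

    Illustrative only: MedM-VL's real preprocessing is defined in its repo.
    """
    vol = np.clip(volume_hu.astype(np.float32), hu_min, hu_max)
    return (vol - hu_min) / (hu_max - hu_min)

# A random 32-slice, 64x64 array stands in for a loaded CT scan.
fake_scan = np.random.default_rng(0).uniform(-2000, 3000, size=(32, 64, 64))
vol = prepare_ct(fake_scan)
print(vol.shape, float(vol.min()), float(vol.max()))  # (32, 64, 64) 0.0 1.0
```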