---
license: apache-2.0
language:
- zh
- en
base_model:
- Qwen/Qwen2.5-7B-Instruct
- meta-llama/Llama-3.1-8B-Instruct
pipeline_tag: feature-extraction
tags:
- structuring
- EHR
- medical
- IE
---
# Model Card for GENIE

## Model Details

Model size: 8B (English) / 7B (Chinese)

Max tokens: 8192

Base model: Llama 3.1 8B Instruct (English) / Qwen 2.5 7B Instruct (Chinese)

### Model Description

GENIE (Generative Note Information Extraction) is an end-to-end model for structuring EHR data.
GENIE can process an entire paragraph of clinical notes in a single pass, outputting structured information on named entities, assertion statuses, locations, other relevant modifiers, clinical values, and intended purposes.
This end-to-end approach simplifies the structuring process, reduces errors, and enables healthcare providers to derive structured data from EHRs more efficiently, without extensive manual adjustments.
Experiments show that GENIE achieves high accuracy on each of these tasks.

## Usage

```python
from vllm import LLM, SamplingParams

PROMPT_TEMPLATE = "Human:\n{query}\n\n Assistant:"

# Load the GENIE checkpoint (local path or Hugging Face repo id)
model = LLM(model="path/to/GENIE")
# Deterministic decoding; max_tokens can go up to the model's 8192 limit
sampling_params = SamplingParams(temperature=0.0, max_tokens=8192)

EHR = ['xxxxx1','xxxxx2']  # raw clinical note paragraphs
texts = [PROMPT_TEMPLATE.format(query=k) for k in EHR]
output = model.generate(texts, sampling_params)
```
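The `generate` call returns vLLM `RequestOutput` objects, so the generated text for each note is available as `output[i].outputs[0].text`. Below is a minimal sketch of consuming that text, under the assumption (not confirmed by this card) that GENIE emits its structured fields as a JSON list of records; the field names here are hypothetical illustrations of the categories the description mentions.

```python
import json

# Hypothetical sample of generated text: the card lists entities, assertion
# statuses, and clinical values among the structured outputs, but does not
# specify the exact serialization format — JSON is an assumption here.
generated_text = '[{"entity": "fever", "assertion": "present", "value": "38.5 C"}]'

records = json.loads(generated_text)
for r in records:
    print(r["entity"], r["assertion"])  # → fever present
```

In practice you would replace `generated_text` with `output[i].outputs[0].text` from the snippet above and validate the parse, since generative models can occasionally emit malformed structures.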

## Citation

If you find our paper or models helpful, please consider citing our work (to be released):

**BibTeX:**

[More Information Needed]