angelosps committed · Commit 1fed022 · verified · 1 parent: c46c8d8

Update README.md

Files changed (1): README.md (+6, -11)
README.md CHANGED

@@ -11,24 +11,20 @@ metrics:
 pipeline_tag: text-classification
 ---
 
-# Model Card for DELTA<sub>M</sub>
+# DELTA: Description Logics with Transformers
 
-<!-- Provide a quick summary of what the model is/does. -->
+Fine-tuning a transformer model for textual entailment over expressive contexts generated from description logic knowledge bases.
+Specifically, the model is given a context (a set of facts and rules) and a question.
+The model should answer "True" if the question is logically implied by the context, "False" if it contradicts the context, and "Unknown" if neither holds.
 
-<!-- This modelcard aims to be a base template for new models. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/modelcard_template.md?plain=1). -->
+For more details, see our paper.
 
 ## Model Details
 
 ### Model Description
 
-This is the model DELTA<sub>M</sub> which is a DeBERTaV3 large model fine-tuned on the DELTA<sub>D</sub> dataset.
+DELTA<sub>M</sub> is a DeBERTaV3 large model fine-tuned on the DELTA<sub>D</sub> dataset.
 
-<!-- Provide a longer summary of what this model is. -->
-
-<!-- - **Developed by:** [More Information Needed]
-- **Funded by [optional]:** [More Information Needed]
-- **Shared by [optional]:** [More Information Needed]
-- **Model type:** [More Information Needed] -->
 - **License:** MIT
 - **Finetuned from model:** `microsoft/deberta-v3-large`
 
@@ -37,7 +33,6 @@ This is the model DELTA<sub>M</sub> which is a DeBERTaV3 large model fine-tuned
 - **Repository:** https://github.com/angelosps/DELTA
 - **Paper:** [Transformers in the Service of Description Logic-based Contexts](https://arxiv.org/abs/2311.08941)
 
-<!-- - **Demo [optional]:** [More Information Needed] -->
 
 <!-- ## Uses
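The updated card describes a three-way entailment task: a context of facts and rules plus a question, classified as "True", "False", or "Unknown". A minimal sketch of how such an input pair might be assembled, assuming pair-style `text`/`text_pair` formatting and the helper name `format_example` (both hypothetical, not specified by the card):

```python
# Hypothetical sketch: assembling a (context, question) pair for a
# three-way entailment classifier such as DELTA_M. The label set comes
# from the model card; the input formatting below is an assumption.

LABELS = ["True", "False", "Unknown"]  # three-way entailment labels


def format_example(facts, rules, question):
    """Join facts and rules into one context string, paired with the question."""
    context = " ".join(facts + rules)
    return {"text": context, "text_pair": question}


example = format_example(
    facts=["Alice is a parent of Bob."],
    rules=["Everyone who is a parent of someone is a person."],
    question="Alice is a person.",
)
print(example)
```

With the `transformers` library installed, a `pipeline("text-classification", model=...)` instance can consume such `text`/`text_pair` dictionaries; that inference step is omitted here because it requires downloading the model weights.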