update model card README.md

This model is a fine-tuned version of [microsoft/deberta-v3-base](https://huggingface.co/microsoft/deberta-v3-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2674
- Law Precision: 0.6932
- Law Recall: 0.8133
- Law F1: 0.7485
- Law Number: 75
- Violated by Precision: 0.8684
- Violated by Recall: 0.88
- Violated by F1: 0.8742
- Violated by Number: 75
- Violated on Precision: 0.5882
- Violated on Recall: 0.6667
- Violated on F1: 0.625
- Violated on Number: 75
- Violation Precision: 0.5287
- Violation Recall: 0.6429
- Violation F1: 0.5802
- Violation Number: 616
- Overall Precision: 0.5741
- Overall Recall: 0.6813
- Overall F1: 0.6232
- Overall Accuracy: 0.9461
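Each F1 above is the harmonic mean of the corresponding precision and recall. A quick sketch (not part of the original card) verifying that the reported values are self-consistent; the tolerance allows for rounding to four decimal places:

```python
# Check that each reported F1 equals 2PR/(P+R) for the reported P and R.
def f1(precision: float, recall: float) -> float:
    return 2 * precision * recall / (precision + recall)

# (precision, recall, f1) triples copied from the list above.
reported = {
    "Law": (0.6932, 0.8133, 0.7485),
    "Violated by": (0.8684, 0.88, 0.8742),
    "Violated on": (0.5882, 0.6667, 0.625),
    "Violation": (0.5287, 0.6429, 0.5802),
    "Overall": (0.5741, 0.6813, 0.6232),
}

for name, (p, r, f) in reported.items():
    assert abs(f1(p, r) - f) < 5e-4, name
```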

## Model description

The following hyperparameters were used during training:
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 10
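With `lr_scheduler_type: linear` and 500 warmup steps, the learning-rate multiplier ramps linearly from 0 to 1 over the warmup and then decays linearly back to 0. A minimal sketch of that schedule, mirroring the shape of transformers' `get_linear_schedule_with_warmup` (the `total=2000` default is illustrative; note that this run logs 45 steps per epoch, so 10 epochs end at step 450, still inside the 500-step warmup):

```python
# Learning-rate multiplier for a linear schedule with warmup.
def lr_lambda(step: int, warmup: int = 500, total: int = 2000) -> float:
    if step < warmup:
        # Linear ramp from 0 to 1 over the warmup steps.
        return step / max(1, warmup)
    # Linear decay from 1 at end of warmup to 0 at the final step.
    return max(0.0, (total - step) / max(1, total - warmup))

# At step 450 (end of this 10-epoch run) the LR is still ramping up:
print(lr_lambda(450))  # 0.9
```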

### Training results

| Training Loss | Epoch | Step | Validation Loss | Law Precision | Law Recall | Law F1 | Law Number | Violated by Precision | Violated by Recall | Violated by F1 | Violated by Number | Violated on Precision | Violated on Recall | Violated on F1 | Violated on Number | Violation Precision | Violation Recall | Violation F1 | Violation Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:-------------:|:----------:|:------:|:----------:|:---------------------:|:------------------:|:--------------:|:------------------:|:---------------------:|:------------------:|:--------------:|:------------------:|:-------------------:|:----------------:|:------------:|:----------------:|:-----------------:|:--------------:|:----------:|:----------------:|
| 1.9748 | 1.0 | 45 | 1.1555 | 0.0 | 0.0 | 0.0 | 75 | 0.0 | 0.0 | 0.0 | 75 | 0.0 | 0.0 | 0.0 | 75 | 0.0 | 0.0 | 0.0 | 616 | 0.0 | 0.0 | 0.0 | 0.7437 |
| 0.4536 | 2.0 | 90 | 0.3670 | 0.0 | 0.0 | 0.0 | 75 | 0.0 | 0.0 | 0.0 | 75 | 0.0 | 0.0 | 0.0 | 75 | 0.1704 | 0.2955 | 0.2162 | 616 | 0.1704 | 0.2164 | 0.1907 | 0.8901 |
| 0.2704 | 3.0 | 135 | 0.2199 | 0.7059 | 0.64 | 0.6713 | 75 | 0.3095 | 0.1733 | 0.2222 | 75 | 0.0909 | 0.0133 | 0.0233 | 75 | 0.3291 | 0.5097 | 0.4000 | 616 | 0.3498 | 0.4471 | 0.3925 | 0.9277 |
| 0.1475 | 4.0 | 180 | 0.1959 | 0.6263 | 0.8267 | 0.7126 | 75 | 0.9153 | 0.72 | 0.8060 | 75 | 0.3182 | 0.3733 | 0.3436 | 75 | 0.4641 | 0.5974 | 0.5224 | 616 | 0.4928 | 0.6088 | 0.5447 | 0.9407 |
| 0.0879 | 5.0 | 225 | 0.2038 | 0.5909 | 0.8667 | 0.7027 | 75 | 0.7590 | 0.84 | 0.7975 | 75 | 0.3982 | 0.6 | 0.4787 | 75 | 0.4692 | 0.6055 | 0.5287 | 616 | 0.4959 | 0.6492 | 0.5623 | 0.9434 |
| 0.0499 | 6.0 | 270 | 0.2466 | 0.5913 | 0.9067 | 0.7158 | 75 | 0.7674 | 0.88 | 0.8199 | 75 | 0.4412 | 0.6 | 0.5085 | 75 | 0.4832 | 0.6071 | 0.5381 | 616 | 0.5135 | 0.6576 | 0.5766 | 0.9425 |
| 0.0291 | 7.0 | 315 | 0.2980 | 0.5755 | 0.8133 | 0.6740 | 75 | 0.7976 | 0.8933 | 0.8428 | 75 | 0.3802 | 0.6133 | 0.4694 | 75 | 0.4929 | 0.5617 | 0.5250 | 616 | 0.5133 | 0.6183 | 0.5609 | 0.9389 |
| 0.0341 | 8.0 | 360 | 0.2660 | 0.5739 | 0.88 | 0.6947 | 75 | 0.8193 | 0.9067 | 0.8608 | 75 | 0.48 | 0.64 | 0.5486 | 75 | 0.4800 | 0.6445 | 0.5502 | 616 | 0.5147 | 0.6885 | 0.5890 | 0.9366 |
| 0.0228 | 9.0 | 405 | 0.3186 | 0.3505 | 0.9067 | 0.5056 | 75 | 0.6126 | 0.9067 | 0.7312 | 75 | 0.3216 | 0.7333 | 0.4472 | 75 | 0.4365 | 0.5519 | 0.4875 | 616 | 0.4231 | 0.6314 | 0.5067 | 0.9301 |
| 0.0173 | 10.0 | 450 | 0.2674 | 0.6932 | 0.8133 | 0.7485 | 75 | 0.8684 | 0.88 | 0.8742 | 75 | 0.5882 | 0.6667 | 0.625 | 75 | 0.5287 | 0.6429 | 0.5802 | 616 | 0.5741 | 0.6813 | 0.6232 | 0.9461 |
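The evaluation metrics reported at the top of the card match the final (10th) epoch, which also posts the best overall F1 of the run. A quick check over the overall-F1 column, with values transcribed from the table above:

```python
# Overall F1 per epoch (epochs 1 through 10), copied from the table.
overall_f1 = [0.0, 0.1907, 0.3925, 0.5447, 0.5623,
              0.5766, 0.5609, 0.5890, 0.5067, 0.6232]

# Find the 1-indexed epoch with the highest overall F1.
best_epoch = max(range(len(overall_f1)), key=overall_f1.__getitem__) + 1
print(best_epoch, overall_f1[best_epoch - 1])  # 10 0.6232
```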

### Framework versions