aliasgerovs committed
Commit 41a6a33
1 Parent(s): 9efa03e

Fixed analysis.py issue

Files changed (2):
  1. app.py +1 -1
  2. nohup.out +158 -0
app.py CHANGED
@@ -63,7 +63,7 @@ def main(
     domains_to_skip,
     source_block_size,
 )
-depth_analysis_plot = depth_analysis(bias_buster_selected, input)
+depth_analysis_plot = depth_analysis(input, bias_buster_selected)
 bc_score = predict_bc_scores(input)
 mc_score = predict_mc_scores(input)
 quilscore = predict_quillbot(input, bias_buster_selected)
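The one-line fix above swaps the two positional arguments passed to `depth_analysis`, which had been given in the wrong order. A minimal sketch (the real signature lives in this repo's analysis module; the parameter names `input_text` and `bias_buster_selected` here are assumptions for illustration) of how calling by keyword guards against this class of bug:

```python
# Hypothetical stand-in for the real depth_analysis function; the body
# just echoes its arguments so the argument order is observable.
def depth_analysis(input_text, bias_buster_selected):
    return {"text": input_text, "bias_buster": bias_buster_selected}

# Positional call: the order matters, and swapping silently misassigns
# both parameters -- exactly the bug this commit fixes.
plot = depth_analysis("some text", True)

# Keyword call: immune to argument-order mistakes.
plot_kw = depth_analysis(input_text="some text", bias_buster_selected=True)

assert plot == plot_kw
```

Passing such flags by keyword at every call site would have made the original swapped call fail loudly instead of producing a wrong plot.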
nohup.out CHANGED
@@ -2388,3 +2388,161 @@ To enable the following instructions: AVX2 FMA, in other operations, rebuild Ten
 [nltk_data] Package stopwords is already up-to-date!
 /usr/local/lib/python3.9/dist-packages/huggingface_hub/file_download.py:1132: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
 warnings.warn(
+ /usr/local/lib/python3.9/dist-packages/huggingface_hub/file_download.py:1132: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
+ warnings.warn(
+ /usr/local/lib/python3.9/dist-packages/huggingface_hub/file_download.py:1132: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
+ warnings.warn(
+ 2024-05-28 15:57:01.159593: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
+ To enable the following instructions: AVX2 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
+ [nltk_data] Downloading package punkt to /root/nltk_data...
+ [nltk_data] Package punkt is already up-to-date!
+ [nltk_data] Downloading package stopwords to /root/nltk_data...
+ [nltk_data] Package stopwords is already up-to-date!
+ /usr/local/lib/python3.9/dist-packages/huggingface_hub/file_download.py:1132: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
+ warnings.warn(
+ The BetterTransformer implementation does not support padding during training, as the fused kernels do not support attention masks. Beware that passing padded batched data during training may result in unexpected outputs. Please refer to https://huggingface.co/docs/optimum/bettertransformer/overview for more details.
+ The BetterTransformer implementation does not support padding during training, as the fused kernels do not support attention masks. Beware that passing padded batched data during training may result in unexpected outputs. Please refer to https://huggingface.co/docs/optimum/bettertransformer/overview for more details.
+ The BetterTransformer implementation does not support padding during training, as the fused kernels do not support attention masks. Beware that passing padded batched data during training may result in unexpected outputs. Please refer to https://huggingface.co/docs/optimum/bettertransformer/overview for more details.
+ The BetterTransformer implementation does not support padding during training, as the fused kernels do not support attention masks. Beware that passing padded batched data during training may result in unexpected outputs. Please refer to https://huggingface.co/docs/optimum/bettertransformer/overview for more details.
+ The BetterTransformer implementation does not support padding during training, as the fused kernels do not support attention masks. Beware that passing padded batched data during training may result in unexpected outputs. Please refer to https://huggingface.co/docs/optimum/bettertransformer/overview for more details.
+ Some weights of the model checkpoint at textattack/roberta-base-CoLA were not used when initializing RobertaForSequenceClassification: ['roberta.pooler.dense.bias', 'roberta.pooler.dense.weight']
+ - This IS expected if you are initializing RobertaForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
+ - This IS NOT expected if you are initializing RobertaForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
+ The BetterTransformer implementation does not support padding during training, as the fused kernels do not support attention masks. Beware that passing padded batched data during training may result in unexpected outputs. Please refer to https://huggingface.co/docs/optimum/bettertransformer/overview for more details.
+ Framework not specified. Using pt to export the model.
+ Some weights of the model checkpoint at textattack/roberta-base-CoLA were not used when initializing RobertaForSequenceClassification: ['roberta.pooler.dense.bias', 'roberta.pooler.dense.weight']
+ - This IS expected if you are initializing RobertaForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
+ - This IS NOT expected if you are initializing RobertaForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
+ Using the export variant default. Available variants are:
+ - default: The default ONNX variant.
+
+ ***** Exporting submodel 1/1: RobertaForSequenceClassification *****
+ Using framework PyTorch: 2.3.0+cu121
+ Overriding 1 configuration item(s)
+ - use_cache -> False
+ Framework not specified. Using pt to export the model.
+ Using the export variant default. Available variants are:
+ - default: The default ONNX variant.
+ Some non-default generation parameters are set in the model config. These should go into a GenerationConfig file (https://huggingface.co/docs/transformers/generation_strategies#save-a-custom-decoding-strategy-with-your-model) instead. This warning will be raised to an exception in v4.41.
+ Non-default generation parameters: {'max_length': 512, 'min_length': 8, 'num_beams': 2, 'no_repeat_ngram_size': 4}
+ /usr/local/lib/python3.9/dist-packages/huggingface_hub/file_download.py:1132: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
+ warnings.warn(
+
+ ***** Exporting submodel 1/3: T5Stack *****
+ Using framework PyTorch: 2.3.0+cu121
+ Overriding 1 configuration item(s)
+ - use_cache -> False
+
+ ***** Exporting submodel 2/3: T5ForConditionalGeneration *****
+ Using framework PyTorch: 2.3.0+cu121
+ Overriding 1 configuration item(s)
+ - use_cache -> True
+ /usr/local/lib/python3.9/dist-packages/transformers/modeling_utils.py:1017: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
+ if causal_mask.shape[1] < attention_mask.shape[1]:
+
+ ***** Exporting submodel 3/3: T5ForConditionalGeneration *****
+ Using framework PyTorch: 2.3.0+cu121
+ Overriding 1 configuration item(s)
+ - use_cache -> True
+ /usr/local/lib/python3.9/dist-packages/transformers/models/t5/modeling_t5.py:503: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
+ elif past_key_value.shape[2] != key_value_states.shape[1]:
+ In-place op on output of tensor.shape. See https://pytorch.org/docs/master/onnx.html#avoid-inplace-operations-when-using-tensor-shape-in-tracing-mode
+ In-place op on output of tensor.shape. See https://pytorch.org/docs/master/onnx.html#avoid-inplace-operations-when-using-tensor-shape-in-tracing-mode
+ Some non-default generation parameters are set in the model config. These should go into a GenerationConfig file (https://huggingface.co/docs/transformers/generation_strategies#save-a-custom-decoding-strategy-with-your-model) instead. This warning will be raised to an exception in v4.41.
+ Non-default generation parameters: {'max_length': 512, 'min_length': 8, 'num_beams': 2, 'no_repeat_ngram_size': 4}
+ [nltk_data] Downloading package cmudict to /root/nltk_data...
+ [nltk_data] Package cmudict is already up-to-date!
+ [nltk_data] Downloading package punkt to /root/nltk_data...
+ [nltk_data] Package punkt is already up-to-date!
+ [nltk_data] Downloading package stopwords to /root/nltk_data...
+ [nltk_data] Package stopwords is already up-to-date!
+ [nltk_data] Downloading package wordnet to /root/nltk_data...
+ [nltk_data] Package wordnet is already up-to-date!
+ Collecting en-core-web-sm==3.7.1
+ Downloading https://github.com/explosion/spacy-models/releases/download/en_core_web_sm-3.7.1/en_core_web_sm-3.7.1-py3-none-any.whl (12.8 MB)
+ Requirement already satisfied: spacy<3.8.0,>=3.7.2 in /usr/local/lib/python3.9/dist-packages (from en-core-web-sm==3.7.1) (3.7.2)
+ Requirement already satisfied: langcodes<4.0.0,>=3.2.0 in /usr/local/lib/python3.9/dist-packages (from spacy<3.8.0,>=3.7.2->en-core-web-sm==3.7.1) (3.4.0)
+ Requirement already satisfied: tqdm<5.0.0,>=4.38.0 in /usr/local/lib/python3.9/dist-packages (from spacy<3.8.0,>=3.7.2->en-core-web-sm==3.7.1) (4.66.4)
+ Requirement already satisfied: cymem<2.1.0,>=2.0.2 in /usr/local/lib/python3.9/dist-packages (from spacy<3.8.0,>=3.7.2->en-core-web-sm==3.7.1) (2.0.8)
+ Requirement already satisfied: preshed<3.1.0,>=3.0.2 in /usr/local/lib/python3.9/dist-packages (from spacy<3.8.0,>=3.7.2->en-core-web-sm==3.7.1) (3.0.9)
+ Requirement already satisfied: smart-open<7.0.0,>=5.2.1 in /usr/local/lib/python3.9/dist-packages (from spacy<3.8.0,>=3.7.2->en-core-web-sm==3.7.1) (6.4.0)
+ Requirement already satisfied: setuptools in /usr/lib/python3/dist-packages (from spacy<3.8.0,>=3.7.2->en-core-web-sm==3.7.1) (52.0.0)
+ Requirement already satisfied: wasabi<1.2.0,>=0.9.1 in /usr/local/lib/python3.9/dist-packages (from spacy<3.8.0,>=3.7.2->en-core-web-sm==3.7.1) (1.1.2)
+ Requirement already satisfied: typer<0.10.0,>=0.3.0 in /usr/local/lib/python3.9/dist-packages (from spacy<3.8.0,>=3.7.2->en-core-web-sm==3.7.1) (0.9.4)
+ Requirement already satisfied: murmurhash<1.1.0,>=0.28.0 in /usr/local/lib/python3.9/dist-packages (from spacy<3.8.0,>=3.7.2->en-core-web-sm==3.7.1) (1.0.10)
+ Requirement already satisfied: catalogue<2.1.0,>=2.0.6 in /usr/local/lib/python3.9/dist-packages (from spacy<3.8.0,>=3.7.2->en-core-web-sm==3.7.1) (2.0.10)
+ Requirement already satisfied: spacy-legacy<3.1.0,>=3.0.11 in /usr/local/lib/python3.9/dist-packages (from spacy<3.8.0,>=3.7.2->en-core-web-sm==3.7.1) (3.0.12)
+ Requirement already satisfied: pydantic!=1.8,!=1.8.1,<3.0.0,>=1.7.4 in /usr/local/lib/python3.9/dist-packages (from spacy<3.8.0,>=3.7.2->en-core-web-sm==3.7.1) (2.7.1)
+ Requirement already satisfied: srsly<3.0.0,>=2.4.3 in /usr/local/lib/python3.9/dist-packages (from spacy<3.8.0,>=3.7.2->en-core-web-sm==3.7.1) (2.4.8)
+ Requirement already satisfied: weasel<0.4.0,>=0.1.0 in /usr/local/lib/python3.9/dist-packages (from spacy<3.8.0,>=3.7.2->en-core-web-sm==3.7.1) (0.3.4)
+ Requirement already satisfied: thinc<8.3.0,>=8.1.8 in /usr/local/lib/python3.9/dist-packages (from spacy<3.8.0,>=3.7.2->en-core-web-sm==3.7.1) (8.2.3)
+ Requirement already satisfied: jinja2 in /usr/local/lib/python3.9/dist-packages (from spacy<3.8.0,>=3.7.2->en-core-web-sm==3.7.1) (3.1.4)
+ Requirement already satisfied: packaging>=20.0 in /usr/local/lib/python3.9/dist-packages (from spacy<3.8.0,>=3.7.2->en-core-web-sm==3.7.1) (24.0)
+ Requirement already satisfied: numpy>=1.19.0 in /usr/local/lib/python3.9/dist-packages (from spacy<3.8.0,>=3.7.2->en-core-web-sm==3.7.1) (1.26.4)
+ Requirement already satisfied: requests<3.0.0,>=2.13.0 in /usr/lib/python3/dist-packages (from spacy<3.8.0,>=3.7.2->en-core-web-sm==3.7.1) (2.25.1)
+ Requirement already satisfied: spacy-loggers<2.0.0,>=1.0.0 in /usr/local/lib/python3.9/dist-packages (from spacy<3.8.0,>=3.7.2->en-core-web-sm==3.7.1) (1.0.5)
+ Requirement already satisfied: language-data>=1.2 in /usr/local/lib/python3.9/dist-packages (from langcodes<4.0.0,>=3.2.0->spacy<3.8.0,>=3.7.2->en-core-web-sm==3.7.1) (1.2.0)
+ Requirement already satisfied: marisa-trie>=0.7.7 in /usr/local/lib/python3.9/dist-packages (from language-data>=1.2->langcodes<4.0.0,>=3.2.0->spacy<3.8.0,>=3.7.2->en-core-web-sm==3.7.1) (1.1.1)
+ Requirement already satisfied: annotated-types>=0.4.0 in /usr/local/lib/python3.9/dist-packages (from pydantic!=1.8,!=1.8.1,<3.0.0,>=1.7.4->spacy<3.8.0,>=3.7.2->en-core-web-sm==3.7.1) (0.6.0)
+ Requirement already satisfied: typing-extensions>=4.6.1 in /usr/local/lib/python3.9/dist-packages (from pydantic!=1.8,!=1.8.1,<3.0.0,>=1.7.4->spacy<3.8.0,>=3.7.2->en-core-web-sm==3.7.1) (4.11.0)
+ Requirement already satisfied: pydantic-core==2.18.2 in /usr/local/lib/python3.9/dist-packages (from pydantic!=1.8,!=1.8.1,<3.0.0,>=1.7.4->spacy<3.8.0,>=3.7.2->en-core-web-sm==3.7.1) (2.18.2)
+ Requirement already satisfied: blis<0.8.0,>=0.7.8 in /usr/local/lib/python3.9/dist-packages (from thinc<8.3.0,>=8.1.8->spacy<3.8.0,>=3.7.2->en-core-web-sm==3.7.1) (0.7.11)
+ Requirement already satisfied: confection<1.0.0,>=0.0.1 in /usr/local/lib/python3.9/dist-packages (from thinc<8.3.0,>=8.1.8->spacy<3.8.0,>=3.7.2->en-core-web-sm==3.7.1) (0.1.4)
+ Requirement already satisfied: click<9.0.0,>=7.1.1 in /usr/local/lib/python3.9/dist-packages (from typer<0.10.0,>=0.3.0->spacy<3.8.0,>=3.7.2->en-core-web-sm==3.7.1) (8.1.7)
+ Requirement already satisfied: cloudpathlib<0.17.0,>=0.7.0 in /usr/local/lib/python3.9/dist-packages (from weasel<0.4.0,>=0.1.0->spacy<3.8.0,>=3.7.2->en-core-web-sm==3.7.1) (0.16.0)
+ Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.9/dist-packages (from jinja2->spacy<3.8.0,>=3.7.2->en-core-web-sm==3.7.1) (2.1.5)
+ ✔ Download and installation successful
+ You can now load the package via spacy.load('en_core_web_sm')
+ /usr/local/lib/python3.9/dist-packages/huggingface_hub/file_download.py:1132: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
+ warnings.warn(
+ Some characters could not be decoded, and were replaced with REPLACEMENT CHARACTER.
+ IMPORTANT: You are using gradio version 4.26.0, however version 4.29.0 is available, please upgrade.
+ --------
+ Running on local URL: http://0.0.0.0:80
+ Running on public URL: https://a144cab9f7c4473792.gradio.live
+
+ This share link expires in 72 hours. For free permanent hosting and GPU upgrades, run `gradio deploy` from Terminal to deploy to Spaces (https://huggingface.co/spaces)
+ ['Here is a brief text about climate change in the USA: Climate change is having a significant impact across the United States.', 'The country is experiencing rising temperatures, with the last decade being the hottest on record.', 'Extreme weather events like heatwaves, droughts, wildfires, and powerful hurricanes are becoming more frequent and intense due to climate change.', 'Different regions are being affected in different ways.', 'The Western states are seeing protracted droughts, reduced snowpack, and water shortages.', 'Coastal communities face threats from sea level rise and stronger storms.', 'In Alaska, rising temperatures are causing permafrost to thaw and coastal erosion.', 'Midwestern agricultural sectors are vulnerable to climate shifts.', "The United States is one of the world's largest emitters of greenhouse gases that cause climate change.", 'However, efforts are underway to transition to clean energy sources like solar and wind power.', 'Many states, cities, businesses, and individuals are taking steps to reduce emissions and adapt to the unavoidable impacts already occurring.', 'Overcoming climate change will require concerted action both within the US and on a global scale.', 'PARAPHRASE IT']
+ huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
+ To disable this warning, you can either:
+ - Avoid using `tokenizers` before the fork if possible
+ - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
+ huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
+ To disable this warning, you can either:
+ - Avoid using `tokenizers` before the fork if possible
+ - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
+ huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
+ To disable this warning, you can either:
+ - Avoid using `tokenizers` before the fork if possible
+ - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
+ huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
+ To disable this warning, you can either:
+ - Avoid using `tokenizers` before the fork if possible
+ - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
+ huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
+ To disable this warning, you can either:
+ - Avoid using `tokenizers` before the fork if possible
+ - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
+ huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
+ To disable this warning, you can either:
+ - Avoid using `tokenizers` before the fork if possible
+ - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
+ huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
+ To disable this warning, you can either:
+ - Avoid using `tokenizers` before the fork if possible
+ - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
+ huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
+ To disable this warning, you can either:
+ - Avoid using `tokenizers` before the fork if possible
+ - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
+ PLAGIARISM PROCESSING TIME: 37.69768393505365
+
+ /usr/local/lib/python3.9/dist-packages/optimum/bettertransformer/models/encoder_models.py:301: UserWarning: The PyTorch API of nested tensors is in prototype stage and will change in the near future. (Triggered internally at ../aten/src/ATen/NestedTensorImpl.cpp:178.)
+ hidden_states = torch._nested_tensor_from_mask(hidden_states, ~attention_mask)
+ Original BC scores: AI: 1.0, HUMAN: 2.2416668521429983e-09
+ Calibration BC scores: AI: 0.9995505136986301, HUMAN: 0.00044948630136987244
+ Input Text: Here is a brief text about climate change in the USA: Climate change is having a significant impact across the United States. The country is experiencing rising temperatures, with the last decade being the hottest on record. Extreme weather events like heatwaves, droughts, wildfires, and powerful hurricanes are becoming more frequent and intense due to climate change. Different regions are being affected in different ways. The Western states are seeing protracted droughts, reduced snowpack, and water shortages. Coastal communities face threats from sea level rise and stronger storms. In Alaska, rising temperatures are causing permafrost to thaw and coastal erosion. Midwestern agricultural sectors are vulnerable to climate shifts. The United States is one of the world's largest emitters of greenhouse gases that cause climate change. However, efforts are underway to transition to clean energy sources like solar and wind power. Many states, cities, businesses, and individuals are taking steps to reduce emissions and adapt to the unavoidable impacts already occurring. Overcoming climate change will require concerted action both within the US and on a global scale. PARAPHRASE IT
+ Original BC scores: AI: 1.0, HUMAN: 2.2416668521429983e-09
+ Calibration BC scores: AI: 0.9995505136986301, HUMAN: 0.00044948630136987244
+ MC Score: {'OPENAI GPT': 8.698716239556535e-07, 'MISTRAL': 1.644195680447571e-11, 'CLAUDE': 0.9995496796086839, 'GEMINI': 5.766663159828316e-08, 'GRAMMAR ENHANCER': 5.043857131198447e-08}
+
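The nohup.out log above repeatedly prints the huggingface/tokenizers fork warning, and the warning names its own remedy. A minimal sketch of that remedy, assuming the environment variable can be set at the top of the app before any Hugging Face tokenizer is loaded:

```python
import os

# Per the warning's own advice: set TOKENIZERS_PARALLELISM before any
# Hugging Face tokenizer is used, so that forking worker processes
# after tokenizer parallelism has started does not risk deadlocks.
os.environ["TOKENIZERS_PARALLELISM"] = "false"
```

Setting it to "false" silences the warning by disabling tokenizer parallelism up front; "true" also silences it but keeps parallelism, accepting the fork caveat.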