aliasgerovs committed
Commit 80f1bd8
1 Parent(s): 8e79582
Files changed (3)
  1. app.py +1 -1
  2. isotonic_regression_model.joblib +0 -0
  3. nohup.out +207 -0
app.py CHANGED
@@ -391,5 +391,5 @@ with gr.Blocks() as demo:
 
 if __name__ == "__main__":
     demo.launch(
-        share=True, server_name="0.0.0.0", auth=("polygraf-admin", "test@aisd")
+        share=True, server_name="0.0.0.0", server_port=80, auth=("polygraf-admin", "test@aisd")
     )
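
The only functional change in app.py is the added server_port=80 argument, which is why the log below reports the app running on http://0.0.0.0:80 instead of Gradio's default port 7860. A minimal sketch of the resulting launch block, assuming the module-level layout implied by the hunk above (the UI definition is elided; the credentials are copied from the diff, not invented):

import gradio as gr

with gr.Blocks() as demo:
    ...  # UI components elided

if __name__ == "__main__":
    # Serve on all interfaces, port 80, behind HTTP basic auth, and also
    # open a temporary public *.gradio.live share link.
    demo.launch(
        share=True,
        server_name="0.0.0.0",
        server_port=80,
        auth=("polygraf-admin", "test@aisd"),
    )

Binding port 80 directly usually requires root or CAP_NET_BIND_SERVICE; putting a reverse proxy in front of the default port is the common alternative.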
isotonic_regression_model.joblib CHANGED
Binary files a/isotonic_regression_model.joblib and b/isotonic_regression_model.joblib differ
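
This binary diff swaps out the calibration model that predictors.py loads at import time (iso_reg = joblib.load("isotonic_regression_model.joblib")); the first startup captured in nohup.out below dies with KeyError: 118 while unpickling the old copy, which typically indicates a corrupted or incompatible pickle. A minimal, hypothetical sketch of how such a calibrator could be refit and re-saved so the load succeeds; the training arrays are placeholders, not the project's data:

import joblib
import numpy as np
from sklearn.isotonic import IsotonicRegression

# Placeholder calibration set: raw AI-probability scores and binary labels.
raw_scores = np.array([0.10, 0.35, 0.40, 0.80, 0.95])
labels = np.array([0, 0, 1, 1, 1])

# Fit a monotone mapping from raw scores to calibrated probabilities,
# clipping scores that fall outside the fitted range.
iso_reg = IsotonicRegression(out_of_bounds="clip")
iso_reg.fit(raw_scores, labels)

# Write the file that predictors.py later reloads with joblib.load(...).
joblib.dump(iso_reg, "isotonic_regression_model.joblib")

The "Original BC scores" vs. "Calibration BC scores" lines later in the log are consistent with a mapping of this kind being applied to the classifier's softmax output.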
 
nohup.out CHANGED
@@ -143,3 +143,210 @@ hint: See PEP 668 for the detailed specification.
 probas = F.softmax(tensor_logits).detach().cpu().numpy()
 /home/aliasgarov/copyright_checker/predictors.py:197: UserWarning: Implicit dimension choice for softmax has been deprecated. Change the call to include dim=X as an argument.
 probas = F.softmax(tensor_logits).detach().cpu().numpy()
+ /usr/lib/python3/dist-packages/requests/__init__.py:87: RequestsDependencyWarning: urllib3 (2.2.1) or chardet (4.0.0) doesn't match a supported version!
+ warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
+ 2024-05-15 06:29:38.253910: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
+ To enable the following instructions: AVX2 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
+ 2024-05-15 06:29:42.912970: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
+ [nltk_data] Downloading package punkt to /root/nltk_data...
+ [nltk_data] Package punkt is already up-to-date!
+ [nltk_data] Downloading package stopwords to /root/nltk_data...
+ [nltk_data] Package stopwords is already up-to-date!
+ The BetterTransformer implementation does not support padding during training, as the fused kernels do not support attention masks. Beware that passing padded batched data during training may result in unexpected outputs. Please refer to https://huggingface.co/docs/optimum/bettertransformer/overview for more details.
+ The BetterTransformer implementation does not support padding during training, as the fused kernels do not support attention masks. Beware that passing padded batched data during training may result in unexpected outputs. Please refer to https://huggingface.co/docs/optimum/bettertransformer/overview for more details.
+ The BetterTransformer implementation does not support padding during training, as the fused kernels do not support attention masks. Beware that passing padded batched data during training may result in unexpected outputs. Please refer to https://huggingface.co/docs/optimum/bettertransformer/overview for more details.
+ The BetterTransformer implementation does not support padding during training, as the fused kernels do not support attention masks. Beware that passing padded batched data during training may result in unexpected outputs. Please refer to https://huggingface.co/docs/optimum/bettertransformer/overview for more details.
+ The BetterTransformer implementation does not support padding during training, as the fused kernels do not support attention masks. Beware that passing padded batched data during training may result in unexpected outputs. Please refer to https://huggingface.co/docs/optimum/bettertransformer/overview for more details.
+ Traceback (most recent call last):
+ File "/home/aliasgarov/copyright_checker/app.py", line 4, in <module>
+ from predictors import predict_bc_scores, predict_mc_scores
+ File "/home/aliasgarov/copyright_checker/predictors.py", line 93, in <module>
+ iso_reg = joblib.load("isotonic_regression_model.joblib")
+ File "/usr/local/lib/python3.9/dist-packages/joblib/numpy_pickle.py", line 658, in load
+ obj = _unpickle(fobj, filename, mmap_mode)
+ File "/usr/local/lib/python3.9/dist-packages/joblib/numpy_pickle.py", line 577, in _unpickle
+ obj = unpickler.load()
+ File "/usr/lib/python3.9/pickle.py", line 1212, in load
+ dispatch[key[0]](self)
+ KeyError: 118
+ /usr/lib/python3/dist-packages/requests/__init__.py:87: RequestsDependencyWarning: urllib3 (2.2.1) or chardet (4.0.0) doesn't match a supported version!
+ warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
+ 2024-05-15 06:35:49.751024: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
+ To enable the following instructions: AVX2 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
+ 2024-05-15 06:35:50.950991: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
+ [nltk_data] Downloading package punkt to /root/nltk_data...
+ [nltk_data] Package punkt is already up-to-date!
+ [nltk_data] Downloading package stopwords to /root/nltk_data...
+ [nltk_data] Package stopwords is already up-to-date!
+ The BetterTransformer implementation does not support padding during training, as the fused kernels do not support attention masks. Beware that passing padded batched data during training may result in unexpected outputs. Please refer to https://huggingface.co/docs/optimum/bettertransformer/overview for more details.
+ The BetterTransformer implementation does not support padding during training, as the fused kernels do not support attention masks. Beware that passing padded batched data during training may result in unexpected outputs. Please refer to https://huggingface.co/docs/optimum/bettertransformer/overview for more details.
+ The BetterTransformer implementation does not support padding during training, as the fused kernels do not support attention masks. Beware that passing padded batched data during training may result in unexpected outputs. Please refer to https://huggingface.co/docs/optimum/bettertransformer/overview for more details.
+ The BetterTransformer implementation does not support padding during training, as the fused kernels do not support attention masks. Beware that passing padded batched data during training may result in unexpected outputs. Please refer to https://huggingface.co/docs/optimum/bettertransformer/overview for more details.
+ The BetterTransformer implementation does not support padding during training, as the fused kernels do not support attention masks. Beware that passing padded batched data during training may result in unexpected outputs. Please refer to https://huggingface.co/docs/optimum/bettertransformer/overview for more details.
+ [nltk_data] Downloading package cmudict to /root/nltk_data...
+ [nltk_data] Package cmudict is already up-to-date!
+ [nltk_data] Downloading package punkt to /root/nltk_data...
+ [nltk_data] Package punkt is already up-to-date!
+ [nltk_data] Downloading package stopwords to /root/nltk_data...
+ [nltk_data] Package stopwords is already up-to-date!
+ [nltk_data] Downloading package wordnet to /root/nltk_data...
+ [nltk_data] Package wordnet is already up-to-date!
+ /usr/lib/python3/dist-packages/requests/__init__.py:87: RequestsDependencyWarning: urllib3 (2.2.1) or chardet (4.0.0) doesn't match a supported version!
+ warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
+ Collecting en_core_web_sm==2.3.1
+ Using cached en_core_web_sm-2.3.1-py3-none-any.whl
+ Requirement already satisfied: spacy<2.4.0,>=2.3.0 in /usr/local/lib/python3.9/dist-packages (from en_core_web_sm==2.3.1) (2.3.9)
+ Requirement already satisfied: srsly<1.1.0,>=1.0.2 in /usr/local/lib/python3.9/dist-packages (from spacy<2.4.0,>=2.3.0->en_core_web_sm==2.3.1) (1.0.7)
+ Requirement already satisfied: catalogue<1.1.0,>=0.0.7 in /usr/local/lib/python3.9/dist-packages (from spacy<2.4.0,>=2.3.0->en_core_web_sm==2.3.1) (1.0.2)
+ Requirement already satisfied: preshed<3.1.0,>=3.0.2 in /usr/local/lib/python3.9/dist-packages (from spacy<2.4.0,>=2.3.0->en_core_web_sm==2.3.1) (3.0.9)
+ Requirement already satisfied: requests<3.0.0,>=2.13.0 in /usr/lib/python3/dist-packages (from spacy<2.4.0,>=2.3.0->en_core_web_sm==2.3.1) (2.25.1)
+ Requirement already satisfied: setuptools in /usr/lib/python3/dist-packages (from spacy<2.4.0,>=2.3.0->en_core_web_sm==2.3.1) (52.0.0)
+ Requirement already satisfied: thinc<7.5.0,>=7.4.1 in /usr/local/lib/python3.9/dist-packages (from spacy<2.4.0,>=2.3.0->en_core_web_sm==2.3.1) (7.4.6)
+ Requirement already satisfied: cymem<2.1.0,>=2.0.2 in /usr/local/lib/python3.9/dist-packages (from spacy<2.4.0,>=2.3.0->en_core_web_sm==2.3.1) (2.0.8)
+ Requirement already satisfied: tqdm<5.0.0,>=4.38.0 in /usr/local/lib/python3.9/dist-packages (from spacy<2.4.0,>=2.3.0->en_core_web_sm==2.3.1) (4.66.2)
+ Requirement already satisfied: numpy>=1.15.0 in /usr/local/lib/python3.9/dist-packages (from spacy<2.4.0,>=2.3.0->en_core_web_sm==2.3.1) (1.26.4)
+ Requirement already satisfied: wasabi<1.1.0,>=0.4.0 in /usr/local/lib/python3.9/dist-packages (from spacy<2.4.0,>=2.3.0->en_core_web_sm==2.3.1) (0.10.1)
+ Requirement already satisfied: murmurhash<1.1.0,>=0.28.0 in /usr/local/lib/python3.9/dist-packages (from spacy<2.4.0,>=2.3.0->en_core_web_sm==2.3.1) (1.0.10)
+ Requirement already satisfied: blis<0.8.0,>=0.4.0 in /usr/local/lib/python3.9/dist-packages (from spacy<2.4.0,>=2.3.0->en_core_web_sm==2.3.1) (0.7.11)
+ Requirement already satisfied: plac<1.2.0,>=0.9.6 in /usr/local/lib/python3.9/dist-packages (from spacy<2.4.0,>=2.3.0->en_core_web_sm==2.3.1) (1.1.3)
+ ✔ Download and installation successful
+ You can now load the model via spacy.load('en_core_web_sm')
+ /usr/lib/python3/dist-packages/requests/__init__.py:87: RequestsDependencyWarning: urllib3 (2.2.1) or chardet (4.0.0) doesn't match a supported version!
+ warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
+ 2024-05-15 06:39:28.651855: I tensorflow/core/platform/cpu_feature_guard.cc:210] This TensorFlow binary is optimized to use available CPU instructions in performance-critical operations.
+ To enable the following instructions: AVX2 FMA, in other operations, rebuild TensorFlow with the appropriate compiler flags.
+ 2024-05-15 06:39:29.794203: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
+ [nltk_data] Downloading package punkt to /root/nltk_data...
+ [nltk_data] Package punkt is already up-to-date!
+ [nltk_data] Downloading package stopwords to /root/nltk_data...
+ [nltk_data] Package stopwords is already up-to-date!
+ The BetterTransformer implementation does not support padding during training, as the fused kernels do not support attention masks. Beware that passing padded batched data during training may result in unexpected outputs. Please refer to https://huggingface.co/docs/optimum/bettertransformer/overview for more details.
+ The BetterTransformer implementation does not support padding during training, as the fused kernels do not support attention masks. Beware that passing padded batched data during training may result in unexpected outputs. Please refer to https://huggingface.co/docs/optimum/bettertransformer/overview for more details.
+ The BetterTransformer implementation does not support padding during training, as the fused kernels do not support attention masks. Beware that passing padded batched data during training may result in unexpected outputs. Please refer to https://huggingface.co/docs/optimum/bettertransformer/overview for more details.
+ The BetterTransformer implementation does not support padding during training, as the fused kernels do not support attention masks. Beware that passing padded batched data during training may result in unexpected outputs. Please refer to https://huggingface.co/docs/optimum/bettertransformer/overview for more details.
+ The BetterTransformer implementation does not support padding during training, as the fused kernels do not support attention masks. Beware that passing padded batched data during training may result in unexpected outputs. Please refer to https://huggingface.co/docs/optimum/bettertransformer/overview for more details.
+ [nltk_data] Downloading package cmudict to /root/nltk_data...
+ [nltk_data] Package cmudict is already up-to-date!
+ [nltk_data] Downloading package punkt to /root/nltk_data...
+ [nltk_data] Package punkt is already up-to-date!
+ [nltk_data] Downloading package stopwords to /root/nltk_data...
+ [nltk_data] Package stopwords is already up-to-date!
+ [nltk_data] Downloading package wordnet to /root/nltk_data...
+ [nltk_data] Package wordnet is already up-to-date!
+ /usr/lib/python3/dist-packages/requests/__init__.py:87: RequestsDependencyWarning: urllib3 (2.2.1) or chardet (4.0.0) doesn't match a supported version!
+ warnings.warn("urllib3 ({}) or chardet ({}) doesn't match a supported "
+ Collecting en_core_web_sm==2.3.1
+ Using cached en_core_web_sm-2.3.1-py3-none-any.whl
+ Requirement already satisfied: spacy<2.4.0,>=2.3.0 in /usr/local/lib/python3.9/dist-packages (from en_core_web_sm==2.3.1) (2.3.9)
+ Requirement already satisfied: srsly<1.1.0,>=1.0.2 in /usr/local/lib/python3.9/dist-packages (from spacy<2.4.0,>=2.3.0->en_core_web_sm==2.3.1) (1.0.7)
+ Requirement already satisfied: wasabi<1.1.0,>=0.4.0 in /usr/local/lib/python3.9/dist-packages (from spacy<2.4.0,>=2.3.0->en_core_web_sm==2.3.1) (0.10.1)
+ Requirement already satisfied: numpy>=1.15.0 in /usr/local/lib/python3.9/dist-packages (from spacy<2.4.0,>=2.3.0->en_core_web_sm==2.3.1) (1.26.4)
+ Requirement already satisfied: plac<1.2.0,>=0.9.6 in /usr/local/lib/python3.9/dist-packages (from spacy<2.4.0,>=2.3.0->en_core_web_sm==2.3.1) (1.1.3)
+ Requirement already satisfied: cymem<2.1.0,>=2.0.2 in /usr/local/lib/python3.9/dist-packages (from spacy<2.4.0,>=2.3.0->en_core_web_sm==2.3.1) (2.0.8)
+ Requirement already satisfied: thinc<7.5.0,>=7.4.1 in /usr/local/lib/python3.9/dist-packages (from spacy<2.4.0,>=2.3.0->en_core_web_sm==2.3.1) (7.4.6)
+ Requirement already satisfied: blis<0.8.0,>=0.4.0 in /usr/local/lib/python3.9/dist-packages (from spacy<2.4.0,>=2.3.0->en_core_web_sm==2.3.1) (0.7.11)
+ Requirement already satisfied: catalogue<1.1.0,>=0.0.7 in /usr/local/lib/python3.9/dist-packages (from spacy<2.4.0,>=2.3.0->en_core_web_sm==2.3.1) (1.0.2)
+ Requirement already satisfied: requests<3.0.0,>=2.13.0 in /usr/lib/python3/dist-packages (from spacy<2.4.0,>=2.3.0->en_core_web_sm==2.3.1) (2.25.1)
+ Requirement already satisfied: murmurhash<1.1.0,>=0.28.0 in /usr/local/lib/python3.9/dist-packages (from spacy<2.4.0,>=2.3.0->en_core_web_sm==2.3.1) (1.0.10)
+ Requirement already satisfied: preshed<3.1.0,>=3.0.2 in /usr/local/lib/python3.9/dist-packages (from spacy<2.4.0,>=2.3.0->en_core_web_sm==2.3.1) (3.0.9)
+ Requirement already satisfied: tqdm<5.0.0,>=4.38.0 in /usr/local/lib/python3.9/dist-packages (from spacy<2.4.0,>=2.3.0->en_core_web_sm==2.3.1) (4.66.2)
+ Requirement already satisfied: setuptools in /usr/lib/python3/dist-packages (from spacy<2.4.0,>=2.3.0->en_core_web_sm==2.3.1) (52.0.0)
+ ✔ Download and installation successful
+ You can now load the model via spacy.load('en_core_web_sm')
+ huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
+ To disable this warning, you can either:
+ - Avoid using `tokenizers` before the fork if possible
+ - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
+ /usr/local/lib/python3.9/dist-packages/torch/cuda/__init__.py:619: UserWarning: Can't initialize NVML
+ warnings.warn("Can't initialize NVML")
+ Some characters could not be decoded, and were replaced with REPLACEMENT CHARACTER.
+ IMPORTANT: You are using gradio version 4.28.3, however version 4.29.0 is available, please upgrade.
+ --------
+ Running on local URL: http://0.0.0.0:80
+ Running on public URL: https://ca11231f7d0d270866.gradio.live
+
+ This share link expires in 72 hours. For free permanent hosting and GPU upgrades, run `gradio deploy` from Terminal to deploy to Spaces (https://huggingface.co/spaces)
+ ['Multiple factors are helping Russia’s military advance, including a delay in American weaponry and Moscow’s technological innovations on the battlefield.']
+ huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
+ To disable this warning, you can either:
+ - Avoid using `tokenizers` before the fork if possible
+ - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
+ huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
+ To disable this warning, you can either:
+ - Avoid using `tokenizers` before the fork if possible
+ - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
+ huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
+ To disable this warning, you can either:
+ - Avoid using `tokenizers` before the fork if possible
+ - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
+ huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
+ To disable this warning, you can either:
+ - Avoid using `tokenizers` before the fork if possible
+ - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
+ /usr/local/lib/python3.9/dist-packages/optimum/bettertransformer/models/encoder_models.py:301: UserWarning: The PyTorch API of nested tensors is in prototype stage and will change in the near future. (Triggered internally at ../aten/src/ATen/NestedTensorImpl.cpp:178.)
+ hidden_states = torch._nested_tensor_from_mask(hidden_states, ~attention_mask)
+ /home/aliasgarov/copyright_checker/predictors.py:247: UserWarning: Implicit dimension choice for softmax has been deprecated. Change the call to include dim=X as an argument.
+ probas = F.softmax(tensor_logits).detach().cpu().numpy()
+ /home/aliasgarov/copyright_checker/predictors.py:247: UserWarning: Implicit dimension choice for softmax has been deprecated. Change the call to include dim=X as an argument.
+ probas = F.softmax(tensor_logits).detach().cpu().numpy()
+ WARNING: Invalid HTTP request received.
+ PLAGIARISM PROCESSING TIME: 21.636404959950596
+ Original BC scores: AI: 0.9994519352912903, HUMAN: 0.0005480951513163745
+ Calibration BC scores: AI: 0.8166666666666667, HUMAN: 0.18333333333333335
+ Input Text: <s>Multiple factors are helping Russias military advance, including a delay in American weaponry and Moscows technological innovations on the battlefield. </s>
+ Models to Test: ['OpenAI GPT', 'Mistral', 'CLAUDE', 'Gemini', 'Grammar Enhancer']
+ Original BC scores: AI: 0.9994519352912903, HUMAN: 0.0005480951513163745
+ Calibration BC scores: AI: 0.8166666666666667, HUMAN: 0.18333333333333335
+ Starting MC
+ MC Score: {'OpenAI GPT': 4.604327676532164e-11, 'Mistral': 1.2912608567245758e-11, 'CLAUDE': 3.2367452925959875e-11, 'Gemini': 3.2201130588138284e-11, 'Grammar Enhancer': 0.8166666665431422}
+ Original BC scores: AI: 0.9994519352912903, HUMAN: 0.0005480951513163745
+ Calibration BC scores: AI: 0.8166666666666667, HUMAN: 0.18333333333333335
+ Input Text: <s>Multiple factors are helping Russias military advance, including a delay in American weaponry and Moscows technological innovations on the battlefield. </s>
+ {'Multiple factors are helping Russia’s military advance, including a delay in American weaponry and Moscow’s technological innovations on the battlefield.': 0.016840771452527072} bc
+ {'Multiple factors are helping Russia’s military advance, including a delay in American weaponry and Moscow’s technological innovations on the battlefield.': 0.0006924518071430807} quillbot
+ ['Ilya and OpenAI are going to part ways.', 'This is very sad to me; Ilya is easily one of the greatest minds of our generation, a guiding light of our field, and a dear friend.', 'His brilliance and vision are well known; his warmth and compassion are less well known but no less important.', 'OpenAI would not be what it is without him.', 'Although he has something personally meaningful he is going to go work on, I am forever grateful for what he did here and committed to finishing the mission we started together.', 'I am happy that for so long I got to be close to such genuinely remarkable genius, and someone so focused on getting to the best future for humanity.', 'Jakub is going to be our new Chief Scientist.', 'Jakub is also easily one of the greatest minds of our generation; I am thrilled he is taking the baton here.', 'He has run many of our most important projects, and I am very confident he will lead us to make rapid and safe progress towards our mission of ensuring that AGI benefits everyone.']
+ huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
+ To disable this warning, you can either:
+ - Avoid using `tokenizers` before the fork if possible
+ - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
+ huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
+ To disable this warning, you can either:
+ - Avoid using `tokenizers` before the fork if possible
+ - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
+ huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
+ To disable this warning, you can either:
+ - Avoid using `tokenizers` before the fork if possible
+ - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
+ huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
+ To disable this warning, you can either:
+ - Avoid using `tokenizers` before the fork if possible
+ - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
+ WARNING: Invalid HTTP request received.
+ WARNING: Invalid HTTP request received.
+ /home/aliasgarov/copyright_checker/predictors.py:247: UserWarning: Implicit dimension choice for softmax has been deprecated. Change the call to include dim=X as an argument.
+ probas = F.softmax(tensor_logits).detach().cpu().numpy()
+ /home/aliasgarov/copyright_checker/predictors.py:247: UserWarning: Implicit dimension choice for softmax has been deprecated. Change the call to include dim=X as an argument.
+ probas = F.softmax(tensor_logits).detach().cpu().numpy()
+ WARNING: Invalid HTTP request received.
+ WARNING: Invalid HTTP request received.
+ WARNING: Invalid HTTP request received.
+ WARNING: Invalid HTTP request received.
+ WARNING: Invalid HTTP request received.
+ WARNING: Invalid HTTP request received.
+ WARNING: Invalid HTTP request received.
+ WARNING: Invalid HTTP request received.
+ WARNING: Invalid HTTP request received.
+ WARNING: Invalid HTTP request received.
+ WARNING: Invalid HTTP request received.
+ WARNING: Invalid HTTP request received.
+ WARNING: Invalid HTTP request received.
+ WARNING: Invalid HTTP request received.
+ WARNING: Invalid HTTP request received.
+ WARNING: Invalid HTTP request received.
+ WARNING: Invalid HTTP request received.
+ WARNING: Invalid HTTP request received.
+ WARNING: Invalid HTTP request received.
+ WARNING: Invalid HTTP request received.
+ WARNING: Invalid HTTP request received.
+ WARNING: Invalid HTTP request received.
+ WARNING: Invalid HTTP request received.
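
The UserWarning repeated throughout the log comes from predictors.py calling F.softmax without an explicit dimension. A minimal sketch of the fix the warning itself asks for, assuming tensor_logits is shaped (batch, num_classes) so the class axis is last; the variable name follows the log, the dim value is an assumption:

import torch
import torch.nn.functional as F

tensor_logits = torch.randn(2, 5)  # placeholder logits: (batch, num_classes)

# Old call (triggers the deprecation warning seen at predictors.py:197/247):
# probas = F.softmax(tensor_logits).detach().cpu().numpy()

# Explicit dim silences the warning; dim=-1 takes the softmax over the class axis.
probas = F.softmax(tensor_logits, dim=-1).detach().cpu().numpy()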
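
Likewise, the recurring huggingface/tokenizers fork message lists its own two remedies; the environment-variable route only works if the variable is set before the tokenizers library is imported. A minimal sketch, assuming it is placed at the very top of app.py (or exported in the shell that launches the nohup process):

import os

# Must be set before transformers/tokenizers is imported, otherwise the
# fork-after-parallelism warning keeps appearing in nohup.out.
os.environ["TOKENIZERS_PARALLELISM"] = "false"

from transformers import AutoTokenizer  # safe to import after the variable is set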