Datasets:

Modalities: Text
Formats: json
Size: < 1K
Libraries: Datasets, pandas
License:
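Since the card lists Datasets and pandas as compatible libraries, here is a minimal sketch for reading the file, assuming a local copy of tokenization_test_data.json (the hub repo id is not shown on this page). Plain json is used deliberately: the "sequence"/"tokens" fields mix strings with literal false values, which plain json and object-dtype pandas columns handle without type-inference issues:

import json

import pandas as pd

# Read the uploaded fixture file (local copy assumed).
with open("tokenization_test_data.json", encoding="utf-8") as f:
    records = json.load(f)

print(len(records), "records")
print(records[0]["TestClass"])  # -> "AlbertTokenizationTest"

# Tabular view; "sequence"/"tokens" are `false` for the box/xpath-based
# tokenizers (LayoutLM*, LayoutXLM, MarkupLM, Udop), so dtypes stay object.
df = pd.DataFrame(records)
print(df[["TestClass", "sp_base", "sequence"]].head())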
itazap (HF Staff) committed
Commit a402251 · verified · 1 parent: 6b7e6dc

Upload tokenization_test_data.json

Files changed (1):
  1. tokenization_test_data.json +2351 -0
tokenization_test_data.json ADDED
@@ -0,0 +1,2351 @@
[
  {
    "TestClass": "AlbertTokenizationTest",
    "sp_base": false,
    "sequence": "I was born in 92000, and this is falsé.",
    "tokens": ["▁i", "▁was", "▁born", "▁in", "▁9", "2000", ",", "▁and", "▁this", "▁is", "▁false", "."],
    "encoded": [31, 23, 386, 19, 561, 3050, 15, 17, 48, 25, 4997, 9],
    "encoded_special": [2, 31, 23, 386, 19, 561, 3050, 15, 17, 48, 25, 4997, 9, 3],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "BertTokenizationTest",
    "sp_base": false,
    "sequence": "UNwantéd,running",
    "tokens": ["un", "##want", "##ed", ",", "runn", "##ing"],
    "encoded": [9, 6, 7, 12, 10, 11],
    "encoded_special": [1, 9, 6, 7, 12, 10, 11, 2],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "BertTokenizationTest",
    "sp_base": false,
    "sequence": "UNwantéd,running",
    "tokens": ["un", "##want", "##ed", ",", "runn", "##ing"],
    "encoded": [9, 6, 7, 12, 10, 11],
    "encoded_special": [1, 9, 6, 7, 12, 10, 11, 2],
    "params": {"do_lower_case": true},
    "params_encode": {}
  },
  {
    "TestClass": "BigBirdPegasusTokenizationTest",
    "sp_base": true,
    "sequence": "This is a test",
    "tokens": ["▁This", "▁is", "▁a", "▁", "t", "est"],
    "encoded": [288, 46, 9, 3, 12, 390],
    "encoded_special": [288, 46, 9, 3, 12, 390, 1],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "BigBirdTokenizationTest",
    "sp_base": false,
    "sequence": "I was born in 92000, and this is falsé.",
    "tokens": ["▁I", "▁was", "▁b", "or", "n", "▁in", "▁", "9", "2", "0", "0", "0", ",", "▁and", "▁this", "▁is", "▁f", "al", "s", "é", "."],
    "encoded": [8, 21, 84, 55, 24, 19, 7, 0, 602, 347, 347, 347, 3, 12, 66, 46, 72, 80, 6, 0, 4],
    "encoded_special": [1002, 8, 21, 84, 55, 24, 19, 7, 0, 602, 347, 347, 347, 3, 12, 66, 46, 72, 80, 6, 0, 4, 1000],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "CLIPTokenizationTest",
    "sp_base": false,
    "sequence": "lower newer",
    "tokens": ["lo", "w", "er</w>", "n", "e", "w", "er</w>"],
    "encoded": [10, 2, 16, 9, 3, 2, 16],
    "encoded_special": [21, 10, 2, 16, 9, 3, 2, 16, 22],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "CamembertTokenizationTest",
    "sp_base": true,
    "sequence": "I was born in 92000, and this is falsé.",
    "tokens": ["▁I", "▁was", "▁b", "or", "n", "▁in", "▁", "9", "2", "0", "0", "0", ",", "▁and", "▁this", "▁is", "▁f", "al", "s", "é", "."],
    "encoded": [12, 25, 88, 59, 28, 23, 11, 3, 606, 351, 351, 351, 7, 16, 70, 50, 76, 84, 10, 3, 8],
    "encoded_special": [5, 12, 25, 88, 59, 28, 23, 11, 3, 606, 351, 351, 351, 7, 16, 70, 50, 76, 84, 10, 3, 8, 6],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "CodeGenTokenizationTest",
    "sp_base": false,
    "sequence": "lower newer",
    "tokens": ["Ġlow", "er", "Ġ", "n", "e", "w", "er"],
    "encoded": [14, 15, 10, 9, 3, 2, 15],
    "encoded_special": [14, 15, 10, 9, 3, 2, 15],
    "params": {"add_prefix_space": true},
    "params_encode": {}
  },
  {
    "TestClass": "DPRContextEncoderTokenizationTest",
    "sp_base": false,
    "sequence": "UNwantéd,running",
    "tokens": ["un", "##want", "##ed", ",", "runn", "##ing"],
    "encoded": [9, 6, 7, 12, 10, 11],
    "encoded_special": [1, 9, 6, 7, 12, 10, 11, 2],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "DPRQuestionEncoderTokenizationTest",
    "sp_base": false,
    "sequence": "UNwantéd,running",
    "tokens": ["un", "##want", "##ed", ",", "runn", "##ing"],
    "encoded": [9, 6, 7, 12, 10, 11],
    "encoded_special": [1, 9, 6, 7, 12, 10, 11, 2],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "DPRReaderTokenizationTest",
    "sp_base": false,
    "sequence": "UNwantéd,running",
    "tokens": ["un", "##want", "##ed", ",", "runn", "##ing"],
    "encoded": [9, 6, 7, 12, 10, 11],
    "encoded_special": [1, 9, 6, 7, 12, 10, 11, 2],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "DebertaTokenizationTest",
    "sp_base": false,
    "sequence": "lower newer",
    "tokens": ["l", "o", "w", "er", "Ġ", "n", "e", "w", "er"],
    "encoded": [0, 1, 2, 15, 10, 9, 3, 2, 15],
    "encoded_special": [20, 0, 1, 2, 15, 10, 9, 3, 2, 15, 21],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "DebertaV2TokenizationTest",
    "sp_base": false,
    "sequence": "I was born in 92000, and this is falsé!",
    "tokens": ["▁", "I", "▁was", "▁born", "▁in", "▁9", "2000", ",", "▁and", "▁this", "▁is", "▁fal", "s", "é", "!"],
    "encoded": [13, 1, 23, 386, 19, 561, 3050, 15, 17, 48, 25, 8256, 18, 1, 187],
    "encoded_special": [2, 13, 1, 23, 386, 19, 561, 3050, 15, 17, 48, 25, 8256, 18, 1, 187, 3],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "DistilBertTokenizationTest",
    "sp_base": false,
    "sequence": "UNwantéd,running",
    "tokens": ["un", "##want", "##ed", ",", "runn", "##ing"],
    "encoded": [9, 6, 7, 12, 10, 11],
    "encoded_special": [1, 9, 6, 7, 12, 10, 11, 2],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "ElectraTokenizationTest",
    "sp_base": false,
    "sequence": "UNwantéd,running",
    "tokens": ["un", "##want", "##ed", ",", "runn", "##ing"],
    "encoded": [9, 6, 7, 12, 10, 11],
    "encoded_special": [1, 9, 6, 7, 12, 10, 11, 2],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "ElectraTokenizationTest",
    "sp_base": false,
    "sequence": "UNwantéd,running",
    "tokens": ["un", "##want", "##ed", ",", "runn", "##ing"],
    "encoded": [9, 6, 7, 12, 10, 11],
    "encoded_special": [1, 9, 6, 7, 12, 10, 11, 2],
    "params": {"do_lower_case": true},
    "params_encode": {}
  },
  {
    "TestClass": "FNetTokenizationTest",
    "sp_base": false,
    "sequence": "I was born in 92000, and this is falsé.",
    "tokens": ["▁", "I", "▁was", "▁born", "▁in", "▁9", "2000", ",", "▁and", "▁this", "▁is", "▁fal", "s", "é", "."],
    "encoded": [13, 1, 23, 386, 19, 561, 3050, 15, 17, 48, 25, 8256, 18, 1, 9],
    "encoded_special": [2, 13, 1, 23, 386, 19, 561, 3050, 15, 17, 48, 25, 8256, 18, 1, 9, 3],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "FunnelTokenizationTest",
    "sp_base": false,
    "sequence": "UNwantéd,running",
    "tokens": ["un", "##want", "##ed", ",", "runn", "##ing"],
    "encoded": [7, 4, 5, 10, 8, 9],
    "encoded_special": [1, 7, 4, 5, 10, 8, 9, 2],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "GPT2TokenizationTest",
    "sp_base": false,
    "sequence": "lower newer",
    "tokens": ["Ġlow", "er", "Ġ", "n", "e", "w", "er"],
    "encoded": [14, 15, 10, 9, 3, 2, 15],
    "encoded_special": [14, 15, 10, 9, 3, 2, 15],
    "params": {"add_prefix_space": true},
    "params_encode": {}
  },
  {
    "TestClass": "HerbertTokenizationTest",
    "sp_base": false,
    "sequence": "lower,newer",
    "tokens": ["low", "er</w>", ",</w>", "n", "e", "w", "er</w>"],
    "encoded": [16, 17, 22, 11, 5, 4, 17],
    "encoded_special": [0, 16, 17, 22, 11, 5, 4, 17, 1],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "LayoutLMTokenizationTest",
    "sp_base": false,
    "sequence": "UNwantéd,running",
    "tokens": ["un", "##want", "##ed", ",", "runn", "##ing"],
    "encoded": [7, 4, 5, 10, 8, 9],
    "encoded_special": [1, 7, 4, 5, 10, 8, 9, 2],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "LayoutLMv2TokenizationTest",
    "sp_base": false,
    "sequence": false,
    "tokens": false,
    "encoded": [10, 11, 12, 13],
    "encoded_special": [1, 10, 11, 12, 13, 2],
    "params": {},
    "params_encode": {
      "text": ["a", "weirdly", "test"],
      "boxes": [[423, 237, 440, 251], [427, 272, 441, 287], [419, 115, 437, 129]]
    }
  },
  {
    "TestClass": "LayoutLMv3TokenizationTest",
    "sp_base": false,
    "sequence": false,
    "tokens": false,
    "encoded": [14, 15, 10, 9, 3, 2, 15],
    "encoded_special": [20, 14, 15, 10, 9, 3, 2, 15, 21],
    "params": {},
    "params_encode": {
      "text": ["lower", "newer"],
      "boxes": [[423, 237, 440, 251], [427, 272, 441, 287]]
    }
  },
  {
    "TestClass": "LayoutXLMTokenizationTest",
    "sp_base": true,
    "sequence": false,
    "tokens": false,
    "encoded": [11, 113, 159, 17, 39, 171, 383],
    "encoded_special": [0, 11, 113, 159, 17, 39, 171, 383, 2],
    "params": {},
    "params_encode": {
      "text": ["a", "weirdly", "test"],
      "boxes": [[423, 237, 440, 251], [427, 272, 441, 287], [419, 115, 437, 129]]
    }
  },
  {
    "TestClass": "LongformerTokenizationTest",
    "sp_base": false,
    "sequence": "lower newer",
    "tokens": ["l", "o", "w", "er", "Ġ", "n", "e", "w", "er"],
    "encoded": [0, 1, 2, 15, 10, 9, 3, 2, 15],
    "encoded_special": [20, 0, 1, 2, 15, 10, 9, 3, 2, 15, 21],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "LxmertTokenizationTest",
    "sp_base": false,
    "sequence": "I was born in 92000, and this is falsé.",
    "tokens": ["[UNK]", "[UNK]", "[UNK]", "[UNK]", "[UNK]", ",", "[UNK]", "[UNK]", "[UNK]", "[UNK]", "[UNK]"],
    "encoded": [0, 0, 0, 0, 0, 10, 0, 0, 0, 0, 0],
    "encoded_special": [1, 0, 0, 0, 0, 0, 10, 0, 0, 0, 0, 0, 2],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "MBart50TokenizationTest",
    "sp_base": true,
    "sequence": "the I to a and of in was it me that be he for with my not is s you",
    "tokens": ["▁the", "▁I", "▁to", "▁a", "▁and", "▁of", "▁in", "▁was", "▁it", "▁me", "▁that", "▁be", "▁he", "▁for", "▁with", "▁my", "▁not", "▁is", "▁s", "▁you"],
    "encoded": [6, 9, 10, 11, 13, 14, 20, 22, 26, 32, 35, 37, 38, 41, 42, 44, 45, 47, 48, 49],
    "encoded_special": [1004, 6, 9, 10, 11, 13, 14, 20, 22, 26, 32, 35, 37, 38, 41, 42, 44, 45, 47, 48, 49, 2],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "MBartTokenizationTest",
    "sp_base": true,
    "sequence": "the I to a and of in was it me that be he for with my not is s you",
    "tokens": ["▁the", "▁I", "▁to", "▁a", "▁and", "▁of", "▁in", "▁was", "▁it", "▁me", "▁that", "▁be", "▁he", "▁for", "▁with", "▁my", "▁not", "▁is", "▁s", "▁you"],
    "encoded": [6, 9, 10, 11, 13, 14, 20, 22, 26, 32, 35, 37, 38, 41, 42, 44, 45, 47, 48, 49],
    "encoded_special": [6, 9, 10, 11, 13, 14, 20, 22, 26, 32, 35, 37, 38, 41, 42, 44, 45, 47, 48, 49, 2, 1004],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "MPNetTokenizerTest",
    "sp_base": false,
    "sequence": "UNwantéd,running",
    "tokens": ["un", "##want", "##ed", ",", "runn", "##ing"],
    "encoded": [9, 6, 7, 12, 10, 11],
    "encoded_special": [15, 9, 6, 7, 12, 10, 11, 16],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "MarkupLMTokenizationTest",
    "sp_base": false,
    "sequence": false,
    "tokens": false,
    "encoded": [21, 3, 0, 0, 1, 2, 1, 4, 0, 8],
    "encoded_special": [22, 21, 3, 0, 0, 1, 2, 1, 4, 0, 8, 23],
    "params": {},
    "params_encode": {
      "text": ["hello", "world"],
      "xpaths": [",/html/body/div/li[1]/div/span", ",/html/body/div/li[1]/div/span"]
    }
  },
  {
    "TestClass": "MobileBERTTokenizationTest",
    "sp_base": false,
    "sequence": "UNwantéd,running",
    "tokens": ["un", "##want", "##ed", ",", "runn", "##ing"],
    "encoded": [9, 6, 7, 12, 10, 11],
    "encoded_special": [1, 9, 6, 7, 12, 10, 11, 2],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "NllbTokenizationTest",
    "sp_base": true,
    "sequence": "the I to a and of in was it me that be he for with my not is s you",
    "tokens": ["▁the", "▁I", "▁to", "▁a", "▁and", "▁of", "▁in", "▁was", "▁it", "▁me", "▁that", "▁be", "▁he", "▁for", "▁with", "▁my", "▁not", "▁is", "▁s", "▁you"],
    "encoded": [6, 9, 10, 11, 13, 14, 20, 22, 26, 32, 35, 37, 38, 41, 42, 44, 45, 47, 48, 49],
    "encoded_special": [1048, 6, 9, 10, 11, 13, 14, 20, 22, 26, 32, 35, 37, 38, 41, 42, 44, 45, 47, 48, 49, 2],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "OpenAIGPTTokenizationTest",
    "sp_base": false,
    "sequence": "lower newer",
    "tokens": ["low", "er</w>", "n", "e", "w", "er</w>"],
    "encoded": [14, 15, 9, 3, 2, 15],
    "encoded_special": [14, 15, 9, 3, 2, 15],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "PegasusTokenizationTest",
    "sp_base": true,
    "sequence": "This is a test",
    "tokens": ["▁This", "▁is", "▁a", "▁", "t", "est"],
    "encoded": [391, 149, 112, 106, 115, 493],
    "encoded_special": [391, 149, 112, 106, 115, 493, 1],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "Qwen2TokenizationTest",
    "sp_base": false,
    "sequence": "lower lower newer 010;}\r\n<|endoftext|>ϓ",
    "tokens": ["l", "o", "w", "er", "Ġlow", "er", "Ġ", "n", "e", "w", "er", "Ġ", "0", "1", "0", ";}", "č", "Ċ", "<|endoftext|>", "Ïĵ"],
    "encoded": [75, 78, 86, 260, 259, 260, 220, 77, 68, 86, 260, 220, 15, 16, 15, 265, 201, 198, 270, 267],
    "encoded_special": [75, 78, 86, 260, 259, 260, 220, 77, 68, 86, 260, 220, 15, 16, 15, 265, 201, 198, 270, 267],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "ReformerTokenizationTest",
    "sp_base": false,
    "sequence": "I was born in 92000, and this is falsé.",
    "tokens": ["▁I", "▁was", "▁b", "or", "n", "▁in", "▁", "9", "2", "0", "0", "0", ",", "▁and", "▁this", "▁is", "▁f", "al", "s", "é", "."],
    "encoded": [8, 21, 84, 55, 24, 19, 7, 0, 602, 347, 347, 347, 3, 12, 66, 46, 72, 80, 6, 0, 4],
    "encoded_special": [8, 21, 84, 55, 24, 19, 7, 0, 602, 347, 347, 347, 3, 12, 66, 46, 72, 80, 6, 0, 4],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "RemBertTokenizationTest",
    "sp_base": true,
    "sequence": "this is a test",
    "tokens": ["▁this", "▁is", "▁a", "▁t", "est"],
    "encoded": [66, 46, 10, 170, 382],
    "encoded_special": [1000, 66, 46, 10, 170, 382, 1001],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "RobertaTokenizationTest",
    "sp_base": false,
    "sequence": "lower newer",
    "tokens": ["l", "o", "w", "er", "Ġ", "n", "e", "w", "er"],
    "encoded": [0, 1, 2, 15, 10, 9, 3, 2, 15],
    "encoded_special": [20, 0, 1, 2, 15, 10, 9, 3, 2, 15, 21],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "SeamlessM4TTokenizationTest",
    "sp_base": true,
    "sequence": "the I to a and of in was it me that be he for with my not is s you",
    "tokens": ["▁the", "▁I", "▁to", "▁a", "▁and", "▁of", "▁in", "▁was", "▁it", "▁me", "▁that", "▁be", "▁he", "▁for", "▁with", "▁my", "▁not", "▁is", "▁s", "▁you"],
    "encoded": [6, 9, 10, 11, 13, 14, 20, 22, 26, 32, 35, 37, 38, 41, 42, 44, 45, 47, 48, 49],
    "encoded_special": [3, 1, 6, 9, 10, 11, 13, 14, 20, 22, 26, 32, 35, 37, 38, 41, 42, 44, 45, 47, 48, 49, 3],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "SplinterTokenizationTest",
    "sp_base": false,
    "sequence": "I need to test this rigor",
    "tokens": ["[UNK]", "[UNK]", "[UNK]", "test", "this", "rigor"],
    "encoded": [3, 10, 10, 10, 16, 13, 21, 1],
    "encoded_special": null,
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "SqueezeBertTokenizationTest",
    "sp_base": false,
    "sequence": "UNwantéd,running",
    "tokens": ["un", "##want", "##ed", ",", "runn", "##ing"],
    "encoded": [9, 6, 7, 12, 10, 11],
    "encoded_special": [1, 9, 6, 7, 12, 10, 11, 2],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "T5TokenizationTest",
    "sp_base": false,
    "sequence": "I was born in 92000, and this is falsé.",
    "tokens": ["▁I", "▁was", "▁b", "or", "n", "▁in", "▁", "9", "2", "0", "0", "0", ",", "▁and", "▁this", "▁is", "▁f", "al", "s", "é", "."],
    "encoded": [8, 21, 84, 55, 24, 19, 7, 0, 602, 347, 347, 347, 3, 12, 66, 46, 72, 80, 6, 0, 4],
    "encoded_special": [8, 21, 84, 55, 24, 19, 7, 0, 602, 347, 347, 347, 3, 12, 66, 46, 72, 80, 6, 0, 4, 2],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "TestTokenizationBart",
    "sp_base": false,
    "sequence": "lower newer",
    "tokens": ["l", "o", "w", "er", "Ġ", "n", "e", "w", "er"],
    "encoded": [0, 1, 2, 15, 10, 9, 3, 2, 15],
    "encoded_special": [20, 0, 1, 2, 15, 10, 9, 3, 2, 15, 21],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "TestTokenizationLED",
    "sp_base": false,
    "sequence": "lower newer",
    "tokens": ["l", "o", "w", "er", "Ġ", "n", "e", "w", "er"],
    "encoded": [0, 1, 2, 15, 10, 9, 3, 2, 15],
    "encoded_special": [20, 0, 1, 2, 15, 10, 9, 3, 2, 15, 21],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "TestTokenizationMvp",
    "sp_base": false,
    "sequence": "lower newer",
    "tokens": ["l", "o", "w", "er", "Ġ", "n", "e", "w", "er"],
    "encoded": [0, 1, 2, 15, 10, 9, 3, 2, 15],
    "encoded_special": [20, 0, 1, 2, 15, 10, 9, 3, 2, 15, 21],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "UdopTokenizationTest",
    "sp_base": true,
    "sequence": false,
    "tokens": false,
    "encoded": [10, 112, 158, 16, 38, 170, 382, 37, 86, 20],
    "encoded_special": [10, 112, 158, 16, 38, 170, 382, 37, 86, 20, 2],
    "params": {},
    "params_encode": {
      "text": ["a", "weirdly", "test", "hello"],
      "boxes": [[423, 237, 440, 251], [427, 272, 441, 287], [419, 115, 437, 129], [961, 885, 992, 912]]
    }
  },
  {
    "TestClass": "WhisperTokenizerTest",
    "sp_base": false,
    "sequence": "A BCDEFGHIJKLMNOPQRST",
    "tokens": ["A", "ĠBC", "DE", "F", "GH", "I", "J", "K", "L", "M", "N", "OP", "Q", "R", "ST"],
    "encoded": [32, 14359, 22296, 37, 4269, 40, 41, 42, 43, 44, 45, 12059, 48, 49, 6840],
    "encoded_special": [50258, 50363, 32, 14359, 22296, 37, 4269, 40, 41, 42, 43, 44, 45, 12059, 48, 49, 6840, 50257],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "XGLMTokenizationTest",
    "sp_base": false,
    "sequence": "I was born in 92000, and this is falsé.",
    "tokens": ["▁I", "▁was", "▁b", "or", "n", "▁in", "▁", "9", "2", "0", "0", "0", ",", "▁and", "▁this", "▁is", "▁f", "al", "s", "é", "."],
    "encoded": [9, 22, 85, 56, 25, 20, 8, 3, 603, 348, 348, 348, 4, 13, 67, 47, 73, 81, 7, 3, 5],
    "encoded_special": [2, 9, 22, 85, 56, 25, 20, 8, 3, 603, 348, 348, 348, 4, 13, 67, 47, 73, 81, 7, 3, 5],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "XLMRobertaTokenizationTest",
    "sp_base": false,
    "sequence": "I was born in 92000, and this is falsé.",
    "tokens": ["▁I", "▁was", "▁b", "or", "n", "▁in", "▁", "9", "2", "0", "0", "0", ",", "▁and", "▁this", "▁is", "▁f", "al", "s", "é", "."],
    "encoded": [9, 22, 85, 56, 25, 20, 8, 3, 603, 348, 348, 348, 4, 13, 67, 47, 73, 81, 7, 3, 5],
    "encoded_special": [0, 9, 22, 85, 56, 25, 20, 8, 3, 603, 348, 348, 348, 4, 13, 67, 47, 73, 81, 7, 3, 5, 2],
    "params": {},
    "params_encode": {}
  },
  {
    "TestClass": "XLNetTokenizationTest",
    "sp_base": true,
    "sequence": "the I to a and of in was it me that be he for with my not is s you",
    "tokens": ["▁the", "▁I", "▁to", "▁a", "▁and", "▁of", "▁in", "▁was", "▁it", "▁me", "▁that", "▁be", "▁he", "▁for", "▁with", "▁my", "▁not", "▁is", "▁s", "▁you"],
    "encoded": [5, 8, 9, 10, 12, 13, 19, 21, 25, 31, 34, 36, 37, 40, 41, 43, 44, 46, 47, 48],
    "encoded_special": [5, 8, 9, 10, 12, 13, 19, 21, 25, 31, 34, 36, 37, 40, 41, 43, 44, 46, 47, 48, 1000, 1002],
    "params": {},
    "params_encode": {}
  }
]
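One property visible across the records above: special-token handling in these fixtures only pads the ends, so "encoded" reappears as a contiguous run inside "encoded_special" wherever the latter is present (the Splinter record stores null there). A short sanity-check sketch, again assuming a local copy of the file:

import json

def contains_run(outer, inner):
    # True if `inner` occurs as a contiguous slice of `outer`.
    n = len(inner)
    return any(outer[i:i + n] == inner for i in range(len(outer) - n + 1))

with open("tokenization_test_data.json", encoding="utf-8") as f:
    records = json.load(f)

for rec in records:
    special = rec["encoded_special"]
    if special is None:  # e.g. SplinterTokenizationTest
        continue
    # Special tokens should only be prepended/appended around `encoded`.
    assert contains_run(special, rec["encoded"]), rec["TestClass"]

print("checked", len(records), "records")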