Repository file listing, commit 1569560, message "agregar tokenizers" ("add tokenizers"):

    PY3/    (directory)
    (unnamed file)    6.15 kB
    (unnamed file)    8.57 kB
Each pickle below was added in the same commit ("agregar tokenizers" / "add tokenizers"). Unless noted otherwise, every file has the same nine detected pickle imports:

    __builtin__.int, __builtin__.object, __builtin__.set,
    collections.defaultdict, copy_reg._reconstructor,
    nltk.tokenize.punkt.PunktLanguageVars,
    nltk.tokenize.punkt.PunktParameters,
    nltk.tokenize.punkt.PunktSentenceTokenizer,
    nltk.tokenize.punkt.PunktToken

    czech.pickle        1.27 MB
    danish.pickle       1.26 MB
    dutch.pickle        743 kB
    english.pickle      433 kB
    estonian.pickle     1.6 MB
    finnish.pickle      1.95 MB
    french.pickle       583 kB
    german.pickle       1.53 MB
    greek.pickle        1.95 MB
    italian.pickle      658 kB
    malayalam.pickle    221 kB    (7 imports: omits __builtin__.object and copy_reg._reconstructor)
    norwegian.pickle    1.26 MB
    polish.pickle       2.04 MB
    portuguese.pickle   649 kB
    russian.pickle      33 kB     (7 imports: __builtin__.long in place of __builtin__.int; omits __builtin__.object and copy_reg._reconstructor)
    slovene.pickle      833 kB
    spanish.pickle      598 kB
    swedish.pickle      1.03 MB
    turkish.pickle      1.23 MB