Latest commit: "agregar tokenizers" ("add tokenizers") · 1569560

All of the files below were added in that commit. The nineteen .pickle files are NLTK Punkt sentence-tokenizer models, one per language, and each one triggers Hugging Face's pickle-import scanner.

File                 Size
(name not captured)  8.57 kB
czech.pickle         1.12 MB
danish.pickle        1.19 MB
dutch.pickle         694 kB
english.pickle       407 kB
estonian.pickle      1.5 MB
finnish.pickle       1.85 MB
french.pickle        554 kB
german.pickle        1.46 MB
greek.pickle         876 kB
italian.pickle       615 kB
malayalam.pickle     221 kB
norwegian.pickle     1.18 MB
polish.pickle        1.74 MB
portuguese.pickle    612 kB
russian.pickle       33 kB
slovene.pickle       734 kB
spanish.pickle       562 kB
swedish.pickle       980 kB
turkish.pickle       1.02 MB

Detected Pickle imports (7), identical for every file:

- "builtins.set"
- "builtins.int"
- "collections.defaultdict"
- "nltk.tokenize.punkt.PunktSentenceTokenizer"
- "nltk.tokenize.punkt.PunktLanguageVars"
- "nltk.tokenize.punkt.PunktParameters"
- "nltk.tokenize.punkt.PunktToken"

The one exception is malayalam.pickle, which reports the Python 2 names "__builtin__.set" and "__builtin__.int" in place of "builtins.set" and "builtins.int" (that model was evidently pickled under Python 2); its other five imports match the list above.