Space: intelli-zen/multilingual_translation (status: Sleeping)
Branch: main — path: multilingual_translation/thirdparty_data/nltk_data/tokenizers/punkt/PY3
1 contributor · History: 2 commits
Latest commit: qgyd2021 — "[update]add sent_tokenize model" (6bebdfc, over 1 year ago)
README is flagged Safe by the Hub's file scanner. Every .pickle file below is stored via Git LFS, and the Hub's pickle scan reports the same seven detected imports for each one: builtins.set, builtins.int, collections.defaultdict, nltk.tokenize.punkt.PunktSentenceTokenizer, nltk.tokenize.punkt.PunktLanguageVars, nltk.tokenize.punkt.PunktParameters, and nltk.tokenize.punkt.PunktToken.

File                Size      Commit message                   Age
README              8.57 kB   [update]add sent_tokenize model  over 1 year ago
czech.pickle        1.12 MB   [update]add sent_tokenize model  over 1 year ago
danish.pickle       1.19 MB   [update]add sent_tokenize model  over 1 year ago
dutch.pickle        694 kB    [update]add sent_tokenize model  over 1 year ago
english.pickle      407 kB    [update]add sent_tokenize        over 1 year ago
estonian.pickle     1.5 MB    [update]add sent_tokenize model  over 1 year ago
finnish.pickle      1.85 MB   [update]add sent_tokenize model  over 1 year ago
french.pickle       554 kB    [update]add sent_tokenize model  over 1 year ago
german.pickle       1.46 MB   [update]add sent_tokenize model  over 1 year ago
greek.pickle        876 kB    [update]add sent_tokenize model  over 1 year ago
italian.pickle      615 kB    [update]add sent_tokenize model  over 1 year ago
norwegian.pickle    1.18 MB   [update]add sent_tokenize model  over 1 year ago
polish.pickle       1.74 MB   [update]add sent_tokenize model  over 1 year ago
portuguese.pickle   612 kB    [update]add sent_tokenize model  over 1 year ago
russian.pickle      33 kB     [update]add sent_tokenize model  over 1 year ago
slovene.pickle      734 kB    [update]add sent_tokenize model  over 1 year ago
spanish.pickle      562 kB    [update]add sent_tokenize model  over 1 year ago
swedish.pickle      980 kB    [update]add sent_tokenize model  over 1 year ago
turkish.pickle      1.02 MB   [update]add sent_tokenize model  over 1 year ago
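These pickles are the pretrained Punkt sentence-boundary models that NLTK's sent_tokenize loads per language. Bundling them under thirdparty_data/nltk_data lets the Space run without calling nltk.download("punkt") at startup. A minimal sketch of wiring that up, assuming the app runs from the repository root (the directory name thirdparty_data matches this listing; NLTK's actual lookup behavior is documented in nltk.data):

```python
import os

# Point NLTK at the bundled data directory. NLTK reads the NLTK_DATA
# environment variable when the nltk.data module is first imported, so
# this must run before the first `import nltk`.
os.environ["NLTK_DATA"] = os.path.join(os.getcwd(), "thirdparty_data", "nltk_data")

# After this, code like the following should resolve the bundled models:
#   from nltk import sent_tokenize
#   sent_tokenize("Hello world. How are you?", language="english")
```

If nltk is already imported by the time this runs, appending the directory to nltk.data.path at runtime is the alternative NLTK provides for the same purpose.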
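The "Detected Pickle imports" lines above come from the Hub's pickle scanner, which inspects a pickle's opcode stream for global references without ever executing it (unpickling untrusted files can run arbitrary code). A simplified sketch of the idea using the standard library's pickletools; the helper name detected_imports is ours, not the Hub's, and the STACK_GLOBAL handling below is a heuristic that ignores memo indirection:

```python
import collections
import pickle
import pickletools

def detected_imports(data: bytes) -> set[str]:
    """Return the module.name pairs a pickle would import, found by
    scanning opcodes rather than unpickling."""
    found = set()
    recent_strings = []  # string operands seen so far (simplification:
                         # does not follow memo BINGET references)
    for op, arg, pos in pickletools.genops(data):
        if op.name == "GLOBAL":
            # protocol <= 3: argument is "module name" on one opcode
            module, name = arg.split(" ", 1)
            found.add(f"{module}.{name}")
        elif op.name == "STACK_GLOBAL":
            # protocol >= 4: module and qualname were pushed as strings
            if len(recent_strings) >= 2:
                found.add(f"{recent_strings[-2]}.{recent_strings[-1]}")
        if isinstance(arg, str):
            recent_strings.append(arg)
    return found

data = pickle.dumps(collections.defaultdict(int))
print(sorted(detected_imports(data)))  # ['builtins.int', 'collections.defaultdict']
```

Note that both imports found for a plain defaultdict(int) also appear in the scan results for every punkt pickle above, since PunktParameters stores its counts in defaultdicts.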