Space: vuxuanhoan/ebook2audiobookXTTS (Running; duplicated from drewThomasson/ebook2audiobookXTTS)
Branch: main
Path: ebook2audiobookXTTS/nltk_data/tokenizers/punkt
1 contributor; history: 3 commits
Latest commit: drewThomasson, "Upload 115 files" (f045c49, verified), about 1 month ago
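The nltk_data tree bundled here follows NLTK's standard layout (tokenizers/punkt/<language>.pickle), so the application only needs to point NLTK at it before tokenizing. A minimal sketch, assuming the directory sits next to the app script (the path below is an assumption, not taken from this Space's code):

    import os
    import nltk

    # Assumed location of the bundled data; adjust if the Space lays it out differently.
    DATA_DIR = os.path.join(os.path.dirname(os.path.abspath(__file__)), "nltk_data")

    # Register the bundled directory with NLTK's resource loader so
    # 'tokenizers/punkt/*' resolves without a network download.
    nltk.data.path.append(DATA_DIR)

Setting the NLTK_DATA environment variable to the same directory before importing nltk achieves the same effect.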
File listing (all entries from commit "Upload 115 files", about 1 month ago; the .pickle files are stored via Git LFS):

    PY3
    README              8.57 kB   (flagged Safe)
    czech.pickle        1.27 MB
    danish.pickle       1.26 MB
    dutch.pickle        743 kB
    english.pickle      433 kB
    estonian.pickle     1.6 MB
    finnish.pickle      1.95 MB
    french.pickle       583 kB
    german.pickle       1.53 MB
    greek.pickle        1.95 MB
    italian.pickle      658 kB
    malayalam.pickle    221 kB
    norwegian.pickle    1.26 MB
    polish.pickle       2.04 MB
    portuguese.pickle   649 kB
    russian.pickle      33 kB
    slovene.pickle      833 kB
    spanish.pickle      598 kB
    swedish.pickle      1.03 MB
    turkish.pickle      1.23 MB

Pickle scan results: Hugging Face's pickle-import scanner flags each .pickle file. Every file references the Punkt tokenizer classes nltk.tokenize.punkt.PunktSentenceTokenizer, nltk.tokenize.punkt.PunktParameters, nltk.tokenize.punkt.PunktLanguageVars and nltk.tokenize.punkt.PunktToken together with collections.defaultdict and __builtin__.set. Most files report 9 detected imports, adding __builtin__.int, __builtin__.object and copy_reg._reconstructor. Two files report 7: malayalam.pickle omits __builtin__.object and copy_reg._reconstructor, and russian.pickle omits those two as well and uses __builtin__.long in place of __builtin__.int.
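Each <language>.pickle is a serialized nltk.tokenize.punkt.PunktSentenceTokenizer, which is what the detected imports above reflect, and ebook2audiobookXTTS presumably uses these models to split ebook text into sentences before synthesis with XTTS. A minimal usage sketch, assuming the bundled data directory has been registered with NLTK as shown earlier:

    import nltk

    # Load the pre-trained Punkt model for English; the other languages in this
    # directory are loaded the same way via their own <language>.pickle file.
    tokenizer = nltk.data.load("tokenizers/punkt/english.pickle")

    sentences = tokenizer.tokenize("Dr. Smith wrote a book. It became an audiobook.")
    print(sentences)

Depending on the installed NLTK version, the higher-level call nltk.tokenize.sent_tokenize(text, language="english") can resolve the same model through the data path.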