Spaces: LinDee / ai-recommender-dapp-demo
Status: Configuration error
Files
1 contributor · History: 17 commits
Latest commit b5023bb (verified), LinDee, 26 days ago: Delete tokenizer_config.json
.gitattributes · Safe · 1.52 kB · initial commit · 26 days ago
README.md · Safe · 10.5 kB · Upload 15 files · 26 days ago
model.safetensors · Safe · 90.9 MB · xet · Upload 15 files · 26 days ago
posts_cleaned.csv · Safe · 38.4 kB · Upload 15 files · 26 days ago
recommender_model.pkl · pickle · 91.9 MB · xet · Upload 15 files · 26 days ago
Detected Pickle imports (44): sentence_transformers.SentenceTransformer.SentenceTransformer,
transformers.models.bert.tokenization_bert_fast.BertTokenizerFast, tokenizers.AddedToken,
transformers.models.bert.configuration_bert.BertConfig, numpy._core.multiarray._reconstruct,
pandas.core.indexes.range.RangeIndex, pandas.core.frame.DataFrame,
sentence_transformers.model_card.SentenceTransformerModelCardData, torch.nn.modules.linear.Linear,
tokenizers.models.Model, torch.torch_version.TorchVersion, collections.OrderedDict,
transformers.models.bert.modeling_bert.BertPooler, transformers.activations.GELUActivation,
transformers.models.bert.modeling_bert.BertEncoder, transformers.models.bert.modeling_bert.BertAttention,
pandas.core.internals.managers.BlockManager, numpy.dtype, sentence_transformers.models.Normalize.Normalize,
pandas.core.indexes.base._new_Index, pandas._libs.internals._unpickle_block,
torch.nn.modules.normalization.LayerNorm, torch.nn.modules.dropout.Dropout,
transformers.models.bert.modeling_bert.BertSelfOutput, torch.float32, builtins.slice,
sentence_transformers.models.Transformer.Transformer, transformers.models.bert.modeling_bert.BertEmbeddings,
transformers.models.bert.modeling_bert.BertIntermediate, torch.nn.modules.sparse.Embedding,
torch.nn.modules.activation.Tanh, torch._C._nn.gelu, transformers.models.bert.modeling_bert.BertOutput,
transformers.models.bert.modeling_bert.BertLayer, torch._utils._rebuild_tensor_v2,
pandas.core.indexes.base.Index, transformers.models.bert.modeling_bert.BertSdpaSelfAttention,
sentence_transformers.models.Pooling.Pooling, torch.nn.modules.container.ModuleList, numpy.ndarray,
torch._utils._rebuild_parameter, transformers.models.bert.modeling_bert.BertModel,
torch.storage._load_from_bytes, tokenizers.Tokenizer
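
The import list above is what a static scan of recommender_model.pkl reports: unpickling the file would import and call into these sentence-transformers, transformers, torch, pandas, and numpy entry points, which is why pickles from untrusted sources get flagged. The sketch below shows one way such a list can be produced without loading the file, by walking the pickle's GLOBAL/STACK_GLOBAL opcodes with the standard library's pickletools. It is an illustration only, not necessarily how the Hub's scanner is implemented, and the STACK_GLOBAL handling is a heuristic that assumes the module and attribute names are the two most recently pushed strings (the common case).

```python
import pickletools

def list_pickle_imports(path):
    """Return the module.attribute references a pickle would import on load."""
    with open(path, "rb") as f:
        data = f.read()
    imports = set()
    recent_strings = []
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name == "GLOBAL":
            # Protocol <= 3: the argument is "module attribute" as one string.
            module, attribute = arg.split(" ", 1)
            imports.add(f"{module}.{attribute}")
        elif opcode.name == "STACK_GLOBAL" and len(recent_strings) >= 2:
            # Protocol >= 4: module and attribute were pushed as separate strings.
            imports.add(".".join(recent_strings[-2:]))
        elif opcode.name in ("SHORT_BINUNICODE", "BINUNICODE", "UNICODE"):
            recent_strings.append(arg)
    return sorted(imports)

if __name__ == "__main__":
    for ref in list_pickle_imports("recommender_model.pkl"):
        print(ref)
```

Running this against recommender_model.pkl should print fully qualified names like the 44 shown above. Actually loading the file with pickle.load executes whatever those callables (and anything else embedded in the pickle) do, so it should only be done with files from a trusted source.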
requirements.txt · Safe · 68 Bytes · Upload 15 files · 26 days ago
sentence_bert_config.json · Safe · 53 Bytes · Upload 15 files · 26 days ago
special_tokens_map.json · Safe · 695 Bytes · Upload 15 files · 26 days ago
tokenizer.json · Safe · 712 kB · Upload 15 files · 26 days ago
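
Aside from the pickle, the listing looks like a sentence-transformers export (model.safetensors, tokenizer.json, sentence_bert_config.json, special_tokens_map.json) plus the cleaned post data. Below is a minimal sketch of how such an export is typically consumed, assuming the repo root is a complete sentence-transformers directory and that posts_cleaned.csv has a text column named "text"; the column name and the query string are hypothetical, and this is not the Space's actual app code, which is not shown on this page.

```python
import pandas as pd
from sentence_transformers import SentenceTransformer, util

# Load the exported model from the repo directory; weights come from
# model.safetensors, so no pickle.load is involved.
model = SentenceTransformer(".")

posts = pd.read_csv("posts_cleaned.csv")
post_embeddings = model.encode(posts["text"].tolist(), normalize_embeddings=True)

# Rank posts by cosine similarity to a (hypothetical) user-interest query.
query_embedding = model.encode("example user interest", normalize_embeddings=True)
scores = util.cos_sim(query_embedding, post_embeddings)[0]
top_idx = scores.topk(5).indices.tolist()
print(posts.iloc[top_idx])
```

Loading from safetensors plus the tokenizer and config JSON files avoids unpickling entirely, which is the usual way around the pickle warning shown for recommender_model.pkl above.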