Spaces: Deaksh/research-tool · Sleeping
1 contributor · History: 9 commits
Latest commit: Deaksh · Upload faiss_store_openai.pkl · bcc4e52 (verified) · 5 months ago
Files:
.env                     Safe     71 Bytes    Rename env.txt to .env         5 months ago
.gitattributes           Safe     1.52 kB     initial commit                 5 months ago
README.md                Safe     306 Bytes   initial commit                 5 months ago
app.py                   Safe     2.69 kB     Rename main.py to app.py       5 months ago
faiss_store_openai.pkl   pickle
  Detected Pickle imports (39) (a usage sketch follows the file list):
"transformers.models.bert.modeling_bert.BertIntermediate"
,
"faiss.swigfaiss.IndexFlatL2"
,
"transformers.activations.GELUActivation"
,
"transformers.models.bert.modeling_bert.BertAttention"
,
"torch.nn.modules.activation.Tanh"
,
"langchain_community.vectorstores.utils.DistanceStrategy"
,
"sentence_transformers.models.Transformer.Transformer"
,
"torch.nn.modules.sparse.Embedding"
,
"torch.torch_version.TorchVersion"
,
"transformers.models.bert.configuration_bert.BertConfig"
,
"transformers.models.bert.modeling_bert.BertModel"
,
"transformers.models.bert.modeling_bert.BertOutput"
,
"transformers.models.bert.tokenization_bert_fast.BertTokenizerFast"
,
"torch._utils._rebuild_tensor_v2"
,
"sentence_transformers.SentenceTransformer.SentenceTransformer"
,
"torch._utils._rebuild_parameter"
,
"tokenizers.AddedToken"
,
"sentence_transformers.models.Normalize.Normalize"
,
"torch.nn.modules.linear.Linear"
,
"tokenizers.Tokenizer"
,
"transformers.models.bert.modeling_bert.BertEncoder"
,
"torch.nn.modules.dropout.Dropout"
,
"transformers.models.bert.modeling_bert.BertPooler"
,
"transformers.models.bert.modeling_bert.BertSdpaSelfAttention"
,
"sentence_transformers.models.Pooling.Pooling"
,
"torch.nn.modules.container.ModuleList"
,
"transformers.models.bert.modeling_bert.BertSelfOutput"
,
"transformers.models.bert.modeling_bert.BertLayer"
,
"torch._C._nn.gelu"
,
"langchain_core.documents.base.Document"
,
"langchain_community.embeddings.huggingface.HuggingFaceEmbeddings"
,
"tokenizers.models.Model"
,
"torch.storage._load_from_bytes"
,
"torch.nn.modules.normalization.LayerNorm"
,
"collections.OrderedDict"
,
"sentence_transformers.model_card.SentenceTransformerModelCardData"
,
"transformers.models.bert.modeling_bert.BertEmbeddings"
,
"langchain_community.docstore.in_memory.InMemoryDocstore"
,
"langchain_community.vectorstores.faiss.FAISS"
  91.4 MB   LFS   Upload faiss_store_openai.pkl   5 months ago
requirements.txt         Safe     219 Bytes   Update requirements.txt        5 months ago
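
The pickle scan above suggests that faiss_store_openai.pkl is a whole LangChain FAISS vector store pickled together with its embedding model: the detected classes include langchain_community.vectorstores.faiss.FAISS, an InMemoryDocstore, faiss.swigfaiss.IndexFlatL2, and a sentence_transformers/BERT HuggingFaceEmbeddings model (despite the "openai" in the file name). Below is a minimal sketch of how such a store could be built, pickled, and reloaded. The model name, example documents, and query are assumptions for illustration; only the file name faiss_store_openai.pkl comes from the listing, and this is not necessarily how app.py does it.

```python
# Sketch: build, pickle, and reload a LangChain FAISS store whose pickle
# imports would match the scan above. Model name and texts are hypothetical.
import pickle

from langchain_core.documents import Document
from langchain_community.embeddings import HuggingFaceEmbeddings
from langchain_community.vectorstores import FAISS

# Sentence-transformers BERT embeddings (matches the BertModel /
# SentenceTransformer classes detected in the pickle).
embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")

docs = [
    Document(page_content="Example article text 1", metadata={"source": "url-1"}),
    Document(page_content="Example article text 2", metadata={"source": "url-2"}),
]

# Build the FAISS index (IndexFlatL2 + InMemoryDocstore under the hood).
vectorstore = FAISS.from_documents(docs, embeddings)

# Pickle the whole vector store object, embedding model included.
with open("faiss_store_openai.pkl", "wb") as f:
    pickle.dump(vectorstore, f)

# Later (e.g. in the app), reload and query it. Unpickling executes arbitrary
# code, so only load pickles you created yourself.
with open("faiss_store_openai.pkl", "rb") as f:
    vectorstore = pickle.load(f)

results = vectorstore.similarity_search("What does the article say?", k=2)
for doc in results:
    print(doc.metadata.get("source"), doc.page_content[:80])
```

As a design note, LangChain's FAISS wrapper also offers save_local/load_local for persisting the index to a folder instead of pickling the whole object, which avoids shipping the embedding model inside the pickle; the imports detected here indicate the single-pickle approach was used.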