Issue with ImportError: Unable to Load LLM2VecWrapper in velvetScar/llm2vec-llama-3.1-8B Model Using SentenceTransformers

#1 by Raki467 - opened

I'm running into an issue when trying to load the velvetScar/llm2vec-llama-3.1-8B model from the Hugging Face Hub using the SentenceTransformers library. Loading fails because the library cannot find the LLM2VecWrapper class that the model's configuration references, raising the following ImportError:
ImportError: Module "__main__" does not define a "LLM2VecWrapper" attribute/class

Error -

AttributeError Traceback (most recent call last)
File ~/anaconda3/envs/pytorch_p310/lib/python3.10/site-packages/sentence_transformers/util.py:1129, in import_from_string(dotted_path)
1128 try:
-> 1129 return getattr(module, class_name)
1130 except AttributeError:

AttributeError: module '__main__' has no attribute 'LLM2VecWrapper'

During handling of the above exception, another exception occurred:

ImportError Traceback (most recent call last)
Cell In[1], line 4
1 from sentence_transformers import SentenceTransformer
3 # Download from the 🤗 Hub
----> 4 model = SentenceTransformer("velvetScar/llm2vec-llama-3.1-8B")
5 # Run inference
6 sentences = [
7 'The weather is lovely today.',
8 "It's so sunny outside!",
9 'He drove to the stadium.',
10 ]

File ~/anaconda3/envs/pytorch_p310/lib/python3.10/site-packages/sentence_transformers/SentenceTransformer.py:306, in SentenceTransformer.__init__(self, model_name_or_path, modules, device, prompts, default_prompt_name, similarity_fn_name, cache_folder, trust_remote_code, revision, local_files_only, token, use_auth_token, truncate_dim, model_kwargs, tokenizer_kwargs, config_kwargs, model_card_data, backend)
297 model_name_or_path = MODEL_HUB_ORGANIZATION + "/" + model_name_or_path
299 if is_sentence_transformer_model(
300 model_name_or_path,
301 token,
(...)
304 local_files_only=local_files_only,
305 ):
--> 306 modules, self.module_kwargs = self._load_sbert_model(
307 model_name_or_path,
308 token=token,
309 cache_folder=cache_folder,
310 revision=revision,
311 trust_remote_code=trust_remote_code,
312 local_files_only=local_files_only,
313 model_kwargs=model_kwargs,
314 tokenizer_kwargs=tokenizer_kwargs,
315 config_kwargs=config_kwargs,
316 )
317 else:
318 modules = self._load_auto_model(
319 model_name_or_path,
320 token=token,
(...)
327 config_kwargs=config_kwargs,
328 )

File ~/anaconda3/envs/pytorch_p310/lib/python3.10/site-packages/sentence_transformers/SentenceTransformer.py:1655, in SentenceTransformer._load_sbert_model(self, model_name_or_path, token, cache_folder, revision, trust_remote_code, local_files_only, model_kwargs, tokenizer_kwargs, config_kwargs)
1653 for module_config in modules_config:
1654 class_ref = module_config["type"]
-> 1655 module_class = self._load_module_class_from_ref(
1656 class_ref, model_name_or_path, trust_remote_code, revision, model_kwargs
1657 )
1659 # For Transformer, don't load the full directory, rely on transformers instead
1660 # But, do load the config file first.
1661 if module_config["path"] == "":

File ~/anaconda3/envs/pytorch_p310/lib/python3.10/site-packages/sentence_transformers/SentenceTransformer.py:1560, in SentenceTransformer._load_module_class_from_ref(self, class_ref, model_name_or_path, trust_remote_code, revision, model_kwargs)
1556 except OSError:
1557 # Ignore the error if the file does not exist, and fall back to the default import
1558 pass
-> 1560 return import_from_string(class_ref)

File ~/anaconda3/envs/pytorch_p310/lib/python3.10/site-packages/sentence_transformers/util.py:1132, in import_from_string(dotted_path)
1130 except AttributeError:
1131 msg = f'Module "{module_path}" does not define a "{class_name}" attribute/class'
-> 1132 raise ImportError(msg)

ImportError: Module "__main__" does not define a "LLM2VecWrapper" attribute/class
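
For context, SentenceTransformers resolves each entry in the repository's modules.json through import_from_string, and the failing reference here is "__main__.LLM2VecWrapper". That suggests the model was saved from a script or notebook where a custom LLM2VecWrapper class lived in the top-level __main__ module, so loading it requires the same class to exist in whatever process does the loading. The manifest can be inspected directly (a small sketch using huggingface_hub, with the repo id taken from above):

```python
import json

from huggingface_hub import hf_hub_download

# Fetch the module manifest that SentenceTransformers reads at load time.
path = hf_hub_download(
    repo_id="velvetScar/llm2vec-llama-3.1-8B",
    filename="modules.json",
)
with open(path) as f:
    modules = json.load(f)

# Each entry's "type" (e.g. "__main__.LLM2VecWrapper") is what
# import_from_string tries to resolve in the running process.
for module in modules:
    print(module.get("type"), module.get("path"))
```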

Request -
Could anyone advise on how to load this model correctly, or clarify whether it requires additional configuration to work with SentenceTransformers? Any pointers on defining LLM2VecWrapper manually, if that is feasible, would also be helpful.
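
If anyone else hits this, one workaround that should at least unblock loading is to define a class named LLM2VecWrapper at the top level of the loading script or notebook, so import_from_string can resolve "__main__.LLM2VecWrapper". The sketch below is an assumption, not the repo author's original class: it treats the wrapper as a thin subclass of sentence_transformers.models.Transformer that only forces left padding for the decoder-only backbone. LLM2Vec normally also patches the model for bidirectional attention, so this may load the checkpoint without fully reproducing the intended embeddings.

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.models import Transformer


class LLM2VecWrapper(Transformer):
    """Hypothetical stand-in for the missing class.

    Assumes the original wrapper is a thin Transformer subclass that
    forces left padding for the decoder-only Llama backbone; the real
    class may do more (e.g. enable bidirectional attention).
    """

    def __init__(self, *args, **kwargs):
        tokenizer_args = kwargs.setdefault("tokenizer_args", {})
        tokenizer_args.setdefault("padding_side", "left")
        super().__init__(*args, **kwargs)


# Because this script is the __main__ module, import_from_string can now
# resolve "__main__.LLM2VecWrapper" from modules.json.
model = SentenceTransformer("velvetScar/llm2vec-llama-3.1-8B")
embeddings = model.encode(["The weather is lovely today."])
print(embeddings.shape)
```

A cleaner long-term fix would be for the repo to ship the wrapper as a custom module in the repository itself (loadable with trust_remote_code=True), or to re-save the model using only stock SentenceTransformers modules, so no __main__-level class is needed.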
