Update configuration_nvembed.py (transformers + sentence-transformers) and infinity usage
#23
by michaelfeil - opened
Forgot to invoke the super method in the custom config class, which breaks compatibility with transformers 4.45 and later.
File "/app/.cache/huggingface/modules/transformers_modules/nvidia/NV-Embed-v2/7604d305b621f14095a1aa23d351674c2859553a/modeling_nvembed.py", line 323, in __init__
self.latent_attention_model = AutoModel.from_config(config.latent_attention_config)
File "/app/.venv/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 440, in from_config
return model_class._from_config(config, **kwargs)
File "/app/.venv/lib/python3.10/site-packages/transformers/modeling_utils.py", line 1494, in _from_config
if config._attn_implementation_internal is not None:
File "/app/.venv/lib/python3.10/site-packages/transformers/configuration_utils.py", line 202, in __getattribute__
return super().__getattribute__(key)
AttributeError: 'LatentAttentionConfig' object has no attribute '_attn_implementation_internal'
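For context, here is a minimal sketch of the kind of fix this concerns, assuming a custom `PretrainedConfig` subclass like `LatentAttentionConfig`; the field names below are illustrative placeholders, not the actual NV-Embed-v2 configuration:

```python
from transformers import PretrainedConfig


class LatentAttentionConfig(PretrainedConfig):
    model_type = "latent_attention"

    # hidden_dim / num_cross_heads are placeholder fields for illustration only.
    def __init__(self, hidden_dim=4096, num_cross_heads=8, **kwargs):
        self.hidden_dim = hidden_dim
        self.num_cross_heads = num_cross_heads
        # Forwarding **kwargs to PretrainedConfig.__init__ is what initializes
        # internal attributes such as _attn_implementation_internal; without
        # this call, AutoModel.from_config(config) raises the AttributeError
        # shown in the traceback above on transformers >= 4.45.
        super().__init__(**kwargs)
```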
michaelfeil changed pull request title from "Update configuration_nvembed.py" to "Update configuration_nvembed.py (transformers + sentence-transformers) and infinity usage"
@nada5 Can you review?
nada5 changed pull request status to merged
Hi @michaelfeil. The code is merged, and we hope this resolves the errors reported in https://github.com/michaelfeil/infinity/issues/470. However, we do not want to change the README, so it remains the same as before. Thank you.
Hi @nada5! Transparently, I am not very happy about the cherry-pick resolution, as I spent a couple of minutes verifying that the above fix works. But sure, go ahead!
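For anyone landing here from the infinity issue, a sketch of the kind of smoke test that can confirm the fix on transformers 4.45+; the exact check is an assumption and is not taken from this PR:

```python
# Hypothetical verification snippet, not from the PR: load NV-Embed-v2 with the
# patched remote code and confirm that config/model construction no longer
# raises the AttributeError on recent transformers versions.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("nvidia/NV-Embed-v2", trust_remote_code=True)
embeddings = model.encode(["hello world"])
print(embeddings.shape)  # expect a (1, embedding_dim) array
```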