Could not locate the configuration_minicpm.py inside openbmb/MiniCPM-Llama3-V-2_5.

#4
by meijiaka - opened

When I run:
from transformers import AutoModel
model = AutoModel.from_pretrained(
    "ContactDoctor/Bio-Medical-MultiModal-Llama-3-8B-V1",
    trust_remote_code=True,
)
it always fails with:
OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like openbmb/MiniCPM-Llama3-V-2_5 is not the path to a directory containing a file named configuration_minicpm.py.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.

SrikanthChellappa changed discussion status to closed
Contact Doctor Healthcare org

@meijiaka I believe this might be due to an intermittent internet disconnection on your end, which should resolve automatically. Please let me know if the issue persists even with a stable internet connection.
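If the failure really is an intermittent disconnection while transformers fetches the remote code file (configuration_minicpm.py from openbmb/MiniCPM-Llama3-V-2_5), one workaround is to retry the load a few times with a short pause. This is a minimal sketch, not part of the model or the transformers API; the `retry` helper is hypothetical:

```python
import time

def retry(fn, attempts=3, delay=5, exceptions=(OSError,)):
    """Call fn(), retrying on transient errors such as a dropped
    connection while downloading remote code from the Hub.
    Re-raises the last error if all attempts fail."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except exceptions:
            if attempt == attempts:
                raise
            time.sleep(delay)

# Hypothetical usage with the names from this thread:
# from transformers import AutoModel
# model = retry(lambda: AutoModel.from_pretrained(
#     "ContactDoctor/Bio-Medical-MultiModal-Llama-3-8B-V1",
#     trust_remote_code=True))
```

Once the download has succeeded once, the files are cached locally and subsequent loads should not hit the network at all.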

Thank you for your reply. I had no network errors when downloading other models, so the connection seems stable.