Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model_id = "I-BRICKS/Cerebro_BM_solar_v01"

# Load the model in half precision and place it across available devices
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(model_id)
```
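Once the model and tokenizer are loaded, text can be generated with `model.generate`. The snippet below is a minimal sketch; the prompt string and sampling parameters are illustrative assumptions, since the model card does not specify a prompt template.

```python
# Minimal generation sketch; the prompt and sampling settings are
# illustrative assumptions, not values taken from the model card.
prompt = "Hello, please introduce yourself."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=128,
        do_sample=True,
        temperature=0.7,
        top_p=0.9,
    )

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```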
Correspondence
Model Developer: YoungWoo Nam
Company: I-BRICKS