ibleducation/ibl-fordham-7b

ibleducation/ibl-fordham-7b is a model fine-tuned on top of openchat/openchat_3.5.

This model is fine-tuned to answer questions about Fordham University.

Model Details

How to Use the ibl-fordham-7b Model from Python (Hugging Face transformers)

Install the necessary packages

Requires: transformers 4.35.0 or later, and accelerate 0.23.0 or later.

pip install transformers==4.35.0
pip install accelerate==0.23.0

You can then try the following example code:

from transformers import AutoModelForCausalLM, AutoTokenizer
import transformers
import torch

model_id = "ibleducation/ibl-fordham-7b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",  # place model layers automatically across available devices
)
pipe = transformers.pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
)
prompt = "<s>What programmes are offered at Fordham University?</s>"

# The pipeline returns a list with one dict per generated sequence
response = pipe(prompt)
print(response[0]["generated_text"])

Important - Use the prompt template below for ibl-fordham-7b:

<s>{prompt}</s>
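As a quick illustration, the template can be applied with a small helper function (`build_prompt` is a hypothetical name for this sketch, not part of the model's tooling):

```python
# Hypothetical helper (not shipped with the model) that wraps a user
# question in the ibl-fordham-7b prompt template: <s>{prompt}</s>
def build_prompt(question: str) -> str:
    return f"<s>{question}</s>"

print(build_prompt("What programmes are offered at Fordham University?"))
# → <s>What programmes are offered at Fordham University?</s>
```

The resulting string can be passed directly to the text-generation pipeline shown above.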
Model size: 7.24B params
Tensor type: FP16
Format: Safetensors
