SipánGPT 0.5 Llama 3.2 1B GGUF

  • Model fine-tuned to answer questions about the Universidad Señor de Sipán in Lambayeque, Peru.

Testing the model

  • Trained on 304,000 conversations; the model can still hallucinate.

Uploaded model

  • Developed by: ussipan
  • License: apache-2.0
  • Finetuned from model: unsloth/llama-3.2-1b-instruct-bnb-4bit

This Llama model was trained 2x faster with Unsloth and Hugging Face's TRL library.
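Since the repository ships GGUF weights, a minimal sketch of local inference with llama-cpp-python follows. The `filename` glob is an assumption — pick whichever quantization (4-, 5-, 8-, or 16-bit) you actually download from the repo.

```python
# Minimal sketch: run the GGUF model locally with llama-cpp-python.
# Assumes `pip install llama-cpp-python` and network access to the Hub;
# the "*Q4_K_M.gguf" filename pattern is hypothetical — adjust to the
# quantized file you choose.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="ussipan/Llama-3.2-SipanGPT-v0.5-GGUF",
    filename="*Q4_K_M.gguf",  # hypothetical quant file pattern
    n_ctx=2048,               # context window for the chat session
)

# Ask a question about the university, in Spanish as the model was trained.
out = llm.create_chat_completion(
    messages=[
        {"role": "user", "content": "¿Qué es la Universidad Señor de Sipán?"}
    ],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```

Note that the model was fine-tuned on university Q&A conversations, so out-of-domain prompts are more likely to trigger the hallucinations mentioned above.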



Made with ❤️ by Jhan Gómez P.
GGUF

  • Model size: 1.24B params
  • Architecture: llama
  • Quantizations: 4-bit, 5-bit, 8-bit, 16-bit

Dataset used to train ussipan/Llama-3.2-SipanGPT-v0.5-GGUF