
Fresh Alpasta, done Al Dente!

It's da logical choice! Now with personality emulation quality similar to GPT4-X-Alpasta-30b!

Model Info:

ChanSung's Alpaca-LoRA-30B-elina merged with Open Assistant's second fine-tune.
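
If you want to try the full-precision weights directly, here is a minimal loading sketch with Hugging Face transformers. The prompt template is an assumption based on the usual Alpaca instruction format, not something specified by this card; adjust it to whatever works best for you.

```python
# Minimal sketch: load the merged model and run one instruction.
# Requires: transformers, accelerate, torch. The prompt format below is an
# assumption (Alpaca-style), not a documented template for this model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Aeala/GPT4-x-AlpacaDente2-30b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # 30B parameters: roughly 60 GB of weights in fp16
    device_map="auto",          # spread layers across available GPUs/CPU (needs accelerate)
)

prompt = (
    "### Instruction:\nExplain what perplexity measures in one sentence.\n\n"
    "### Response:\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
# Print only the newly generated tokens, not the prompt.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```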

Benchmarks:

Perplexity (lower is better):

| Dataset   | Full precision      | 4-bit               |
|-----------|---------------------|---------------------|
| Wikitext2 | 4.662261962890625   | 5.016242980957031   |
| PTB       | 24.547462463378906  | 25.576189041137695  |
| C4        | 7.05504846572876    | 7.332120418548584   |

~ Thanks to askmyteapot for performing these benchmarks!
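
For reference, numbers like these are usually measured as fixed-window perplexity over each dataset's test split. The sketch below shows a typical Wikitext-2 evaluation loop with transformers and datasets; it illustrates the general method, not the exact script used for the figures above.

```python
# Sketch of a chunked perplexity evaluation on Wikitext-2 (not the exact
# script behind the numbers above). Requires: transformers, datasets, torch.
import torch
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Aeala/GPT4-x-AlpacaDente2-30b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)
model.eval()

# Concatenate the test split into one long token stream.
test = load_dataset("wikitext", "wikitext-2-raw-v1", split="test")
encodings = tokenizer("\n\n".join(test["text"]), return_tensors="pt")

seq_len = 2048  # LLaMA-30B context window
nlls = []
n_chunks = encodings.input_ids.size(1) // seq_len
for i in range(n_chunks):
    input_ids = encodings.input_ids[:, i * seq_len:(i + 1) * seq_len].to(model.device)
    with torch.no_grad():
        # With labels == inputs, the model returns mean cross-entropy over the chunk.
        loss = model(input_ids, labels=input_ids).loss
    nlls.append(loss.float())

print("Wikitext-2 perplexity:", torch.exp(torch.stack(nlls).mean()).item())
```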

