---
|
license: mit |
|
language: |
|
- en |
|
pipeline_tag: text-generation |
|
tags: |
|
- astronomy |
|
- science |
|
- LLM |
|
--- |
|
|
|
# AstroLLaMA.gguf |
|
|
|
[AstroLLaMA](https://huggingface.co/universeTBD/astrollama) in GPT-Generated Unified Format. |
|
|
|
![](https://huggingface.co/universeTBD/astrollama/resolve/main/images/astrollama-logo.png) |
|
|
|
## What's GPT-Generated Unified Format? |
|
|
|
GGUF is a file format for storing models for inference, particularly in the context of language models like GPT. GGUF models **can be run on CPUs**, which makes them accessible to a wider range of users.
|
|
|
A more detailed description can be found [here](https://medium.com/@phillipgimmi/what-is-gguf-and-ggml-e364834d241c).
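In brief, a GGUF file starts with the 4-byte magic `GGUF` followed by a little-endian version number, so you can verify a download before loading it. A minimal sketch in Python (the `read_gguf_header` helper is illustrative, not part of any library):

```python
import struct

def read_gguf_header(path):
    """Return the GGUF version number of a file, or raise if it is not GGUF.

    A GGUF file begins with the 4-byte magic b"GGUF" followed by a
    little-endian uint32 format version.
    """
    with open(path, "rb") as f:
        magic = f.read(4)
        if magic != b"GGUF":
            raise ValueError(f"not a GGUF file: magic={magic!r}")
        (version,) = struct.unpack("<I", f.read(4))
    return version
```

For example, `read_gguf_header("astrollama.gguf")` should return a small integer version if the download completed correctly.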
|
|
|
## How to play with AstroLLaMA on your laptop? |
|
|
|
1. Install [ollama](https://github.com/jmorganca/ollama); |
|
2. Download `astrollama.gguf` to your PC; |
|
3. Create a file named `Modelfile` and add a `FROM` instruction with the local file path to AstroLLaMA, e.g.,
|
``` |
|
FROM ./astrollama.gguf |
|
``` |
|
4. Create the model in Ollama |
|
```shell
|
ollama create astrollama -f ./Modelfile
|
``` |
|
5. Run AstroLLaMA locally |
|
```shell
|
ollama run astrollama |
|
``` |
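Besides the interactive `ollama run` prompt, a local Ollama server also exposes a REST API on `http://localhost:11434` by default; `POST /api/generate` takes a model name and a prompt. A minimal sketch in Python using only the standard library, assuming the `astrollama` model created above and a running Ollama server:

```python
import json
import urllib.request

# Default local Ollama endpoint; adjust if your server runs elsewhere.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model, prompt):
    # "stream": False asks Ollama for a single JSON reply instead of a stream.
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model, prompt, url=OLLAMA_URL):
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

With the server running, `generate("astrollama", "The James Webb Space Telescope has")` returns the model's completion as a string.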