---
base_model: aaditya/Llama3-OpenBioLLM-8B
language:
- en
license: llama3
tags:
- llama-3
- llama
- Mixtral
- instruct
- finetune
- chatml
- DPO
- RLHF
- gpt4
- distillation
- llama-cpp
- gguf-my-repo
widget:
- example_title: OpenBioLLM-8B
  messages:
  - role: system
    content: You are an expert and experienced from the healthcare and biomedical
      domain with extensive medical knowledge and practical experience.
  - role: user
    content: How long does it take for newborn jaundice to go away?
  output:
    text: 'Newborn jaundice, also known as neonatal jaundice, is a common condition
      in newborns where the yellowing of the skin and eyes occurs due to an elevated
      level of bilirubin in the blood. Bilirubin is a yellow pigment that forms when
      red blood cells break down. In most cases, newborn jaundice resolves on its
      own without any specific treatment.

      The duration of newborn jaundice can vary depending on several factors such
      as the underlying cause, gestational age at birth, and individual variations
      in bilirubin metabolism. Here are some general guidelines:

      1. Physiological jaundice: This is the most common type of newborn jaundice
      and usually appears within 24-72 hours after birth. It tends to peak between
      the second and fifth day of life and gradually improves over the next week or
      two. By the time the baby is one week old, the jaundice should have mostly resolved.
      2. Breast milk jaundice: This type of jaundice occurs in breastfed babies and
      may appear later than physiological jaundice, typically between the fifth and
      fourteenth day of life. It tends to persist for a longer duration but usually
      resolves within six weeks after birth. 3. Pathological jaundice: This type of
      jaundice is less common and occurs due to an underlying medical condition that
      affects bilirubin metabolism or liver function. The duration of pathological
      jaundice depends on the specific cause and may require treatment.

      It''s important for parents to monitor their newborn''s jaundice closely and
      seek medical advice if the jaundice progresses rapidly, becomes severe, or is
      accompanied by other symptoms such as poor feeding, lethargy, or excessive sleepiness.
      In these cases, further evaluation and management may be necessary. Remember
      that each baby is unique, and the timing of jaundice resolution can vary. If
      you have concerns about your newborn''s jaundice, it''s always best to consult
      with a healthcare professional for personalized advice and guidance.'
model-index:
- name: OpenBioLLM-8B
  results: []
---

# AnirudhJM24/Llama3-OpenBioLLM-8B-Q4_K_M-GGUF

This model was converted to GGUF format from [`aaditya/Llama3-OpenBioLLM-8B`](https://huggingface.co/aaditya/Llama3-OpenBioLLM-8B) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/aaditya/Llama3-OpenBioLLM-8B) for more details on the model.
## Use with llama.cpp

Install llama.cpp through Homebrew (works on macOS and Linux):

```bash
brew install llama.cpp
```

Invoke the llama.cpp server or the CLI.

### CLI:
```bash
llama-cli --hf-repo AnirudhJM24/Llama3-OpenBioLLM-8B-Q4_K_M-GGUF --hf-file llama3-openbiollm-8b-q4_k_m.gguf -p "The meaning to life and the universe is"
```
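
Since OpenBioLLM-8B is an instruction-tuned chat model, you will usually want to talk to it through its chat template rather than with a raw completion prompt. A minimal sketch, assuming a recent llama.cpp build where `-cnv` enables interactive conversation mode and `-p` is then used as the system prompt (flag behavior can differ between versions):

```bash
# Interactive chat using the biomedical system prompt from the widget example above
llama-cli --hf-repo AnirudhJM24/Llama3-OpenBioLLM-8B-Q4_K_M-GGUF \
  --hf-file llama3-openbiollm-8b-q4_k_m.gguf \
  -cnv \
  -p "You are an expert and experienced from the healthcare and biomedical domain with extensive medical knowledge and practical experience." \
  -c 2048 -n 512
```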
### Server:
```bash
llama-server --hf-repo AnirudhJM24/Llama3-OpenBioLLM-8B-Q4_K_M-GGUF --hf-file llama3-openbiollm-8b-q4_k_m.gguf -c 2048
```
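
Once the server is running (it listens on port 8080 by default), you can query its OpenAI-compatible chat endpoint. A minimal sketch with `curl`, assuming the default host and port and reusing the system/user messages from the widget example above:

```bash
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "messages": [
      {"role": "system", "content": "You are an expert and experienced from the healthcare and biomedical domain with extensive medical knowledge and practical experience."},
      {"role": "user", "content": "How long does it take for newborn jaundice to go away?"}
    ],
    "temperature": 0.2,
    "max_tokens": 512
  }'
```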
Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the llama.cpp repo.

Step 1: Clone llama.cpp from GitHub.
```bash
git clone https://github.com/ggerganov/llama.cpp
```

Step 2: Move into the llama.cpp folder and build it with the `LLAMA_CURL=1` flag along with any other hardware-specific flags (for example, `LLAMA_CUDA=1` for Nvidia GPUs on Linux).
```bash
cd llama.cpp && LLAMA_CURL=1 make
```
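
For example, on a Linux machine with an Nvidia GPU, a CUDA-enabled build using the flag mentioned above would look like the following (build flag names can vary between llama.cpp releases):

```bash
# Hardware-specific build: enable CURL support plus CUDA offloading
cd llama.cpp && LLAMA_CURL=1 LLAMA_CUDA=1 make
```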
Step 3: Run inference through the main binary.
```bash
./llama-cli --hf-repo AnirudhJM24/Llama3-OpenBioLLM-8B-Q4_K_M-GGUF --hf-file llama3-openbiollm-8b-q4_k_m.gguf -p "The meaning to life and the universe is"
```
or
```bash
./llama-server --hf-repo AnirudhJM24/Llama3-OpenBioLLM-8B-Q4_K_M-GGUF --hf-file llama3-openbiollm-8b-q4_k_m.gguf -c 2048
```
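
If you prefer to download the GGUF file explicitly and point the binaries at a local path, something like the following should work (this assumes the `huggingface-cli` tool from the `huggingface_hub` Python package is installed; `-m` is llama.cpp's flag for a local model file):

```bash
# Download the quantized weights into the current directory
huggingface-cli download AnirudhJM24/Llama3-OpenBioLLM-8B-Q4_K_M-GGUF \
  llama3-openbiollm-8b-q4_k_m.gguf --local-dir .

# Run inference against the local file
./llama-cli -m llama3-openbiollm-8b-q4_k_m.gguf -c 2048 -p "The meaning to life and the universe is"
```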