tr416 committed
Commit dc21ca1 · Parent(s): f92bacf

Update README.md

Files changed (1): README.md +8 -4
README.md CHANGED
@@ -7,7 +7,7 @@ sdk: static
 pinned: false
 ---
 
-Modern LLMs are rooted in secular value systems that are often misaligned with Christian organisations. AI for the Church allows anyone to train and deploy doctrinally correct LLMs based on Llama2. Effectively, we are aligning models to a set of values.
+Modern LLMs are rooted in secular value systems that are often misaligned with Christian organisations. AI for the Church allows anyone to align and deploy doctrinally correct LLMs based on Llama2. Effectively, we are aligning models to a set of values.
 This HF page is for storing models rooted in Christian doctrine that can be trusted to give Christian answers (a few denominations are already available).
 
 The corresponding PyPI package is made available to train and deploy the models.
@@ -24,9 +24,13 @@ align_llama2(doctrinal_dataset)
 
 At [aiforthechurch.org](https://aiforthechurch.org) we provide tools for generating doctrinal datasets, a few of which are made available here, along with further instructions.
 
-#Model details
+# Model details
+The family of models presented here is derived from Llama-2-chat-7B, a seven-billion-parameter model trained by Meta for chat applications.
+The base ChristianGPT model was then trained on 30,000 question-answer pairs obtained from [biblechat.ai](https://biblechat.ai) user data, which was first
+filtered for high quality and to remove any personally identifiable information (PII). The question-answer pairs were generated with a prompted GPT-3.5 model. This is
+currently the standard approach for AI applications, but it can prove unreliable on important doctrinal issues; unreliable answers were filtered out of the dataset.
+Denomination-specific models like ChristianGPT-catholic are based on ChristianGPT and fine-tuned on denomination-specific datasets.
 
-
-Training requirements:
+## Training requirements
 - GPU with over 16GB of memory (we trained on NVIDIA Tesla V100 32GB and NVIDIA Ampere A6000 45GB)
 - 30GB of RAM (the raw model weights are about 29GB; our models are cast to 8-bit to use less memory)
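
The numbers in the requirements list above are consistent: a 7-billion-parameter model at 4 bytes per weight is roughly 28GB in full precision, which drops to about 7GB once cast to 8-bit, and so fits on the 16GB-class GPUs mentioned. As a rough illustration of the deployment side, here is a minimal sketch of loading such a model for 8-bit inference with Hugging Face `transformers` and `bitsandbytes`; the repo id `aiforthechurch/ChristianGPT-catholic` and the example prompt are assumptions for illustration, not names confirmed by this page.

```python
# Minimal 8-bit inference sketch (assumes: pip install transformers accelerate bitsandbytes).
# The repo id below is hypothetical -- substitute the actual model name from this HF page.
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "aiforthechurch/ChristianGPT-catholic"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),  # ~7GB of weights for a 7B model
    device_map="auto",  # place layers on the available GPU automatically
)

prompt = "What do the Gospels teach about forgiveness?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```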