Commit 08e6d8e
Parent(s): b1ea2c6
update videos and products

Trelis Research provides tools and tutorials for training, using and deploying large language models.
# Most-viewed Video Tutorials [updated weekly]
1. [Run Llama 2 on your Laptop with Jupyter](https://www.youtube.com/watch?v=nDJMHFsBU7M)
2. [Code Llama Tutorial](https://www.youtube.com/watch?v=rjSWCMVbD_U)
3. [A Detailed Guide to Fine tuning Language Models with QLoRa](https://www.youtube.com/watch?v=OQdp-OeG1as)
# Top Selling Products [updated weekly]
1. [Function-calling Llama 2/Code-Llama](https://huggingface.co/Trelis/Llama-2-13b-chat-hf-function-calling-v2)
2. [Advanced QLoRA Training Script / Fine-tuning Colab Notebook](https://buy.stripe.com/5kA5l69K52Hxf3a006)
3. [Long Document and Website Summarisation Tool](https://Summarise-Me.com)
README.md CHANGED

@@ -5,39 +5,19 @@ colorFrom: red
Old version (lines 5-43):

colorTo: green
sdk: static
pinned: false
tags: [llama, jupyter, colab, function calling, QLoRa, fine-tuning, scripts, fine-tuning]
---

- Access the Google Colab script [here](https://colab.research.google.com/drive/1uMSS1o_8YOPyG1X_4k6ENEE3kJfBGGhH?usp=sharing).

This advanced script provides improved performance when training with small datasets:
- Includes a prompt loss-mask for improved performance when structured responses are required.
- Includes a stop token after responses, allowing the model to provide a short response (e.g. a function call) and then stop.
- Request [access here](https://buy.stripe.com/5kA5l69K52Hxf3a006). €14.99 (or $16.49) per seat/user. Access will be provided within 24 hours of purchase.

- Commercial dataset allowing language models to be fine-tuned for function calling ([Paid access here](https://huggingface.co/datasets/Trelis/function_calling_extended)).
- Created using only human input or Apache 2 licensed datasets (no third-party commercial licensing limitations).
- Models trained with this dataset:
  - [Llama-2-7B](https://huggingface.co/Trelis/Llama-2-7b-chat-hf-function-calling-v2) - repo includes a Google Colab notebook for inference
  - [Llama-2-13B](https://huggingface.co/Trelis/Llama-2-13b-chat-hf-function-calling-v2) - repo includes a Google Colab notebook for inference

*Protein Stability*
- 250k dataset of protein mutations and their effect on stability. [Free access here](https://huggingface.co/datasets/Trelis/protein_stability_single_mutation).

# 3. Jupyter Llama for Laptop

A Chat Assistant built on Llama 2.
- Save and re-load chats.
- Upload pdf or text files for analysis.
- Search Bing (beta).
- Add custom functions (beta).
- Get access for €9.99 ($10.99) [here](https://buy.stripe.com/dR65l6f4p95V7AI6oA).
New version (lines 5-23):

colorTo: green
sdk: static
pinned: false
tags: [llama, code-llama, jupyter, colab, function calling, QLoRa, fine-tuning, scripts, fine-tuning]
---

Trelis Research provides tools and tutorials for training, using and deploying large language models.

# Most-viewed Video Tutorials [updated weekly]

1. [Run Llama 2 on your Laptop with Jupyter](https://www.youtube.com/watch?v=nDJMHFsBU7M)
2. [Code Llama Tutorial](https://www.youtube.com/watch?v=rjSWCMVbD_U)
3. [A Detailed Guide to Fine tuning Language Models with QLoRa](https://www.youtube.com/watch?v=OQdp-OeG1as)

# Top Selling Products [updated weekly]

1. [Function-calling Llama 2/Code-Llama Model](https://huggingface.co/Trelis/Llama-2-13b-chat-hf-function-calling-v2)
2. [Advanced QLoRA Training Script / Fine-tuning Colab Notebook - see video #3 above](https://buy.stripe.com/5kA5l69K52Hxf3a006)
3. [Long Document and Website Summarisation Tool](https://Summarise-Me.com)
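The removed section above mentions that the training script uses a prompt loss-mask and a stop token after responses. The script itself is not part of this commit, so the following is only a rough sketch of that general technique, assuming the -100 ignore index used by PyTorch/Hugging Face losses and toy token ids in place of a real tokenizer:

```python
# Minimal sketch of a prompt loss-mask plus an end-of-response stop token.
# Token ids below are invented for illustration; in practice they come from
# the model's tokenizer (for Llama 2, tokenizer.eos_token_id is 2).

IGNORE_INDEX = -100  # labels with this value are skipped by PyTorch's cross-entropy loss
EOS_ID = 2           # assumed end-of-sequence id; check your tokenizer

def build_example(prompt_ids, response_ids):
    """Concatenate prompt and response, mask the prompt so loss is only
    computed on the response, and append a stop token so the model learns
    to end a short answer (e.g. a function call) cleanly."""
    input_ids = prompt_ids + response_ids + [EOS_ID]
    labels = [IGNORE_INDEX] * len(prompt_ids) + response_ids + [EOS_ID]
    return {"input_ids": input_ids, "labels": labels}

# Toy example: loss is computed only on the response tokens and the stop token.
example = build_example(prompt_ids=[101, 102, 103], response_ids=[201, 202, 203])
print(example["labels"])  # [-100, -100, -100, 201, 202, 203, 2]
```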