Add colab link
README.md CHANGED
@@ -20,6 +20,9 @@ This is a fine-tuned model from [Meta/Meta-Llama-3-8B](https://huggingface.co/Me
 The motivation behind this model was to fine-tune an LLM capable of understanding the nuances of drilling operations and providing 24-hour summarizations based on the hourly activities in Daily Drilling Reports.
 
 ## How to use
+### Sample Colab
+Here's a [Google Colab notebook](https://colab.research.google.com/drive/10Txp14M-yeJG3hRAB8U2ydPrWFE1bypW?usp=sharing) where you can get started with the model.
+
 ### Recommended template for DriLLM-Summarizer:
 ``` python
 TEMPLATE = """<|begin_of_text|>Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.
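
A minimal usage sketch for the recommended template, assuming the full TEMPLATE string (which continues beyond this hunk) ends with Alpaca-style `{instruction}` and `{input}` placeholders; the placeholder names and the sample input below are illustrative assumptions, not taken from the model card.

``` python
# Sketch only: assumes TEMPLATE exposes Alpaca-style {instruction} and {input} slots.
hourly_activities = "<hourly activities copied from the Daily Drilling Report>"  # illustrative input

prompt = TEMPLATE.format(
    instruction="Provide a 24-hour summary of the drilling operations below.",
    input=hourly_activities,
)
```
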
@@ -92,11 +95,13 @@ If you are facing GPU constraints, you can try to load it with 8-bit quantization
 ``` python
 from transformers import BitsAndBytesConfig
 
-
-
-
-
-
-
-
+pipeline = transformers.pipeline(
+    "text-generation",
+    model=model_id,
+    model_kwargs={
+        "torch_dtype": torch.bfloat16,
+        "quantization_config": BitsAndBytesConfig(load_in_8bit=True),  # 8-bit quantization for GPU-constrained setups
+    },
+    device_map="auto",
+)
 ```
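
Building on the quantized pipeline above, a minimal generation sketch; it reuses the `pipeline` object from that snippet and a `prompt` built as in the earlier template sketch, and the generation settings shown are illustrative rather than the model card's recommendations.

``` python
# Sketch only: reuses `pipeline` from the snippet above and a `prompt` built from TEMPLATE.
outputs = pipeline(
    prompt,
    max_new_tokens=512,      # illustrative budget for a 24-hour summary
    do_sample=False,         # deterministic output for report summarization
    return_full_text=False,  # return only the generated summary, not the prompt
)
print(outputs[0]["generated_text"])
```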
|