---
license: gemma
datasets:
- ayoubkirouane/Small-Instruct-Alpaca_Format
language:
- en
library_name: transformers
pipeline_tag: text-generation
---

## Base model
- google/gemma-2-9b

## Dataset
- ayoubkirouane/Small-Instruct-Alpaca_Format

## Get started

- Load the model and tokenizer directly:
  
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the tokenizer and the fine-tuned weights from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("ayoubkirouane/gemma-2-9b-alpaca-small-Instruct")
model = AutoModelForCausalLM.from_pretrained("ayoubkirouane/gemma-2-9b-alpaca-small-Instruct")
```
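
A minimal generation sketch with the model loaded above; the Alpaca-style prompt template here is an assumption based on the dataset name, not something the card specifies:

```python
# Hypothetical Alpaca-style prompt; adjust it to the template used during fine-tuning.
prompt = "### Instruction:\nExplain what a tokenizer does in one sentence.\n\n### Response:\n"

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```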

- Use a pipeline as a high-level helper:

```python
from transformers import pipeline

# Build a text-generation pipeline backed by the fine-tuned model
pipe = pipeline("text-generation", model="ayoubkirouane/gemma-2-9b-alpaca-small-Instruct")
```
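
For example, assuming the same Alpaca-style prompt format (the card does not specify a chat template, so this is only a sketch):

```python
# Hypothetical prompt; the exact format depends on how the training data was templated.
result = pipe(
    "### Instruction:\nSummarize the benefits of instruction tuning.\n\n### Response:\n",
    max_new_tokens=128,
)
print(result[0]["generated_text"])
```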