bakrianoo committed (verified)
Commit 35da3af · Parent(s): e30bf68

Update README.md

Files changed (1): README.md (+7 −8)

README.md CHANGED
@@ -10,13 +10,6 @@ tags:
 
 # Silma
 
----
-Thank you for being part of our journey to advance AI for the Arabic-speaking world! 🌟
-
-**Authors**: [silma.ai](https://silma.ai)
-
-### Description
-
 Silma is a leading Generative AI startup dedicated to empowering Arabic speakers with state-of-the-art AI solutions.
 
 ## 🚀 Our Flagship Model: Silma 1.0 🚀
@@ -25,6 +18,8 @@ Silma is a leading Generative AI startup dedicated to empowering Arabic speakers
 ## 👥 Our Team
 Our team is composed of seasoned **Arabic AI experts** who understand the nuances of the language and cultural considerations, enabling us to build solutions that truly resonate with Arabic users. 🌍✨
 
+**Authors**: [silma.ai](https://silma.ai)
+
 ### Usage
 
 Below we share some code snippets on how to get quickly started with running the model. First, install the Transformers library with:
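The next hunk touches the pipeline example's output handling. As a minimal sketch of why `outputs[0]["generated_text"][-1]["content"]` is the assistant's reply — using a stubbed pipeline result, since the real call needs the model weights, and with made-up placeholder message content:

```python
# Stubbed result shaped like a transformers chat pipeline output:
# `generated_text` holds the whole conversation, ending with the
# assistant's newly generated turn.
messages = [
    {"role": "user", "content": "Write an apology message for missing work."},
]
outputs = [
    {
        "generated_text": messages
        + [{"role": "assistant", "content": "  I apologize for my absence today.  "}]
    }
]

# Same extraction as in the README snippet: take the last message of the
# conversation (the assistant's turn) and strip surrounding whitespace.
assistant_response = outputs[0]["generated_text"][-1]["content"].strip()
print(assistant_response)
```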
@@ -54,6 +49,7 @@ messages = [
 outputs = pipe(messages, max_new_tokens=256)
 assistant_response = outputs[0]["generated_text"][-1]["content"].strip()
 print(assistant_response)
+
 # السلام عليكم ورحمة الله وبركاته، أودّ أن أعتذر عن عدم الحضور إلى العمل اليوم بسبب مرضي. أشكركم على تفهمكم. ("Peace be upon you and God's mercy and blessings. I would like to apologize for not attending work today because of my illness. Thank you for your understanding.")
 ```
 
@@ -72,11 +68,14 @@ model = AutoModelForCausalLM.from_pretrained(
 torch_dtype=torch.bfloat16,
 )
 
-input_text = "Write me a poem about Machine Learning."
+input_text = "أيهما أخف وزنا, كيلو من الحديد أم كيلو من القطن؟"  # "Which is lighter: a kilo of iron or a kilo of cotton?"
 input_ids = tokenizer(input_text, return_tensors="pt").to("cuda")
 
 outputs = model.generate(**input_ids, max_new_tokens=32)
 print(tokenizer.decode(outputs[0]))
+
+# كلاهما له نفس الوزن. ("Both weigh the same.")
+
 ```
 
 You can ensure the correct chat template is applied by using `tokenizer.apply_chat_template` as follows:
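For reference, `apply_chat_template` renders a message list into the single prompt string the model expects; the actual template ships with the tokenizer. The formatter below is only an illustrative stand-in for the call pattern — the turn markers are assumptions, not Silma's real template:

```python
# Illustrative stand-in for tokenizer.apply_chat_template. The real method
# reads a chat template stored with the tokenizer; this hard-coded format
# only demonstrates the call pattern and the add_generation_prompt flag.
def apply_chat_template_stub(messages, add_generation_prompt=True):
    parts = [
        f"<start_of_turn>{m['role']}\n{m['content']}<end_of_turn>"
        for m in messages
    ]
    if add_generation_prompt:
        # Leave the prompt open so the model generates the assistant turn.
        parts.append("<start_of_turn>assistant\n")
    return "\n".join(parts)

messages = [
    {"role": "user", "content": "أيهما أخف وزنا, كيلو من الحديد أم كيلو من القطن؟"},
]
prompt = apply_chat_template_stub(messages)
print(prompt)
```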
 