AlanYky committed · Commit 09d6e15 · verified · 1 Parent(s): afbe5fc

Update README.md

Files changed (1): README.md (+62 -1)
README.md CHANGED
@@ -6,7 +6,68 @@ tags: []
  # Model Card for Model ID
  
  <!-- Provide a quick summary of what the model is/does. -->
-
+ ```python
+ import torch
+ from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
+
+ # Set a manual seed for reproducibility
+ torch.manual_seed(0)
+
+ # Load the model with specific configurations
+ model = AutoModelForCausalLM.from_pretrained(
+     "AlanYky/phi-3.5_tweets_instruct",
+     device_map="cuda",
+     torch_dtype="auto",
+     trust_remote_code=True
+ )
+ model.to("cuda")
+
+ # Load the tokenizer
+ tokenizer = AutoTokenizer.from_pretrained("microsoft/Phi-3.5-mini-instruct")
+
+ # Define a function to generate tweets
+ def generate_tweet(instruction, pipe, generation_args):
+     """
+     Generate a tweet response based on an instruction.
+     """
+     # Define the message structure
+     messages = [
+         {
+             "role": "user",
+             "content": instruction
+         }
+     ]
+
+     # Generate the tweet response
+     output = pipe(messages, **generation_args)
+
+     # Extract and return the generated tweet text
+     return output[0]['generated_text']
+
+ # Set up the pipeline for text generation
+ pipe = pipeline(
+     "text-generation",
+     model=model,
+     tokenizer=tokenizer,
+ )
+
+ # Define generation arguments for tweet creation
+ generation_args = {
+     "max_new_tokens": 70,
+     "return_full_text": False,
+     "temperature": 0.4,
+     "top_k": 50,
+     "top_p": 0.9,
+     "repetition_penalty": 1.2,
+     "do_sample": True,
+ }
+
+ # Specify an instruction for tweet generation
+ instruction = "Generate a tweet about Donald Trump is the 2024 US President."
+ generated_tweet = generate_tweet(instruction, pipe, generation_args)
+ print(generated_tweet)
+
+ ```
  
  
  ## Model Details
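
For reference, the helper added in this commit can be reused with other prompts once the snippet above has been run; a minimal sketch, assuming `pipe`, `generation_args`, and `generate_tweet` are already defined as in the diff (the prompt strings below are only illustrative):

```python
# Hypothetical usage of the objects defined in the README snippet above.
prompts = [
    "Generate a tweet about open-source language models.",
    "Generate a tweet about a rainy Monday morning.",
]

for prompt in prompts:
    # Each call wraps the prompt as a chat message and returns the generated text.
    print(generate_tweet(prompt, pipe, generation_args))
```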