Upload 3 files
- PoemAnalysisSamples.txt +68 -0
- README.md +20 -6
- app.py +154 -0
PoemAnalysisSamples.txt
ADDED
@@ -0,0 +1,68 @@
+<poem>
+Sundays too my father got up early
+and put his clothes on in the blueblack cold,
+then with cracked hands that ached
+from labor in the weekday weather made
+banked fires blaze. No one ever thanked him.
+
+I'd wake and hear the cold splintering, breaking.
+When the rooms were warm, he'd call,
+and slowly I would rise and dress,
+fearing the chronic angers of that house,
+
+Speaking indifferently to him,
+who had driven out the cold
+and polished my good shoes as well.
+What did I know, what did I know
+of love's austere and lonely offices?
+</poem>
+
+<response>
+1 'too' : Poetic word order emphasizes the father's labor, even on Sundays, a day traditionally for rest.
+2–3 'blueblack cold /…cracked hands that ached' : The innovative 'blueblack' imagery is also a transferred epithet, most properly describing the father's hands; note also the alliterative segments of b's and c's.
+4–5 'weekday weather made / banked fires blaze' : Alliteration of initial w's and b's; three adjacent heavy stresses in 'banked fires blaze'; lines enjambed, leading into a caesura in 5.
+6 'cold splintering, breaking' : Striking auditory imagery.
+7 'When the rooms were warm' : Alliteration of w- sounds.
+9 'chronic angers of that house' : Personification.
+13 'What did I know, what did I know' : Repetition underscores the speaker's youthful naïvety.
+13–14 'what did I know / of love's austere and lonely offices?' : Enjambment.
+14 'offices' : Used in the archaic sense of "duty".
+
+Themes/Structure- The poem explores the complicated relationship between the speaker and his father, who provided for the family while also abusing its other members (line 9). The speaker reflects on the sacrifices the father made that went unacknowledged. Lines are approximately 7–10 syllables in length, with no meter predominant. The poem is unrhymed, though it formally resembles a sonnet, a traditional-seeming structure that perhaps nods to the traditional nuclear family that is its subject.
+</response>
+
+<poem>
+Applauding youths laughed with young prostitutes
+And watched her perfect, half-clothed body sway;
+Her voice was like the sound of blended flutes
+Blown by black players upon a picnic day.
+She sang and danced on gracefully and calm,
+The light gauze hanging loose about her form;
+To me she seemed a proudly-swaying palm
+Grown lovelier for passing through a storm.
+Upon her swarthy neck black shiny curls
+Luxuriant fell; and tossing coins in praise,
+The wine-flushed, bold-eyed boys, and even the girls,
+Devoured her shape with eager, passionate gaze;
+But looking at her falsely-smiling face,
+I knew her self was not in that strange place.
+</poem>
+
+<response>
+1 'Applauding youths…young prostitutes' : Repetition at the beginning and end of the line emphasizes the youth both of the spectators and of the performers.
+2 'her perfect, half-clothed body sway' : Synecdoche reduces the performer to her body, taking away her agency (she is 'watched').
+3 'Her voice was like the sound of blended flutes' : Simile compares the performer's voice to the sound of flutes.
+4 'Blown by black' : Initial b- alliteration.
+5 'gracefully and calm' : Inconcinnity, since grammatical parallelism requires 'calmly', though also producing the end rhymes 'calm' (5) and 'palm' (7).
+7–8 'To me she seemed a proudly-swaying palm / Grown lovelier for passing through a storm' : Metaphor compares the performer's life challenges to a storm.
+9–10 'black shiny curls / Luxuriant fell' : Enjambment that mirrors the curls dropping.
+11 'wine-flushed' : Imagery describing the redness of the drunk spectators.
+11 'bold-eyed boys' : Alliteration of initial b's.
+12 'Devoured her shape with eager, passionate gaze' : A metaphor that compares the spectators' consumption of the show to eating the performer. The synecdoche 'shape' reduces the performer again to her objectified body.
+13 'falsely-smiling face' : Alliteration of s sounds and initial f's.
+14 'self was not in that strange place' : Alliteration of s sounds and assonance ('strange place').
+
+Themes/Structure- The poem explores the tension between the inward feelings and outward appearance of a performer, who is objectified by spectators according to society's standards of beauty. The poem is in the form of an English sonnet, with the stanzas consisting of three quatrains and a couplet. The rhyme scheme is ABAB CDCD EFEF GG, and the meter is iambic pentameter. The poem's scene of contemporary Harlem and popular entertainment contends against its traditional form.
+</response>
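The samples file keeps each example in a strict `<poem>…</poem>` / `<response>…</response>` layout so the pairs can be extracted mechanically as few-shot turns. A minimal sketch of that extraction, using the same regex pattern as `app.py` (the inline sample text here is a stand-in for the file contents):

```python
import re

# Stand-in for the contents of PoemAnalysisSamples.txt
sample_poems = """
<poem>
Sundays too my father got up early
and put his clothes on in the blueblack cold,
</poem>
<response>
1 'too' : Poetic word order emphasizes the father's labor.
</response>
"""

# Non-greedy groups with re.DOTALL pair each poem with its commentary,
# even though both span multiple lines
pairs = re.findall(r'<poem>(.*?)</poem>\s*<response>(.*?)</response>',
                   sample_poems, re.DOTALL)

for poem, response in pairs:
    print(poem.strip().splitlines()[0])  # first line of each sample poem
```

Each tuple in `pairs` becomes one user/assistant exchange when the few-shot prompt is assembled.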
README.md
CHANGED
@@ -1,12 +1,26 @@
 ---
-title: Llama
-emoji:
-colorFrom:
-colorTo:
+title: (Llama-3) Poems in a Jiffy
+emoji: ⚡
+colorFrom: blue
+colorTo: yellow
 sdk: gradio
-sdk_version:
+sdk_version: 3.45.1
 app_file: app.py
 pinned: false
+short_description: Generates student commentary for poems
 ---
-
+# Poem Commentary Generator
+Project to create a free tool to help students learn to analyze poetry.
+
+## Basics
+This tool is inspired by language-learning primers like [this Latin student aid](https://babel.hathitrust.org/cgi/pt?id=njp.32101015068578&seq=15) for Vergil's *Aeneid*. Drawing on the metaphor of a foreign language, I built this tool to generate a line-by-line commentary on a target poem to support students.
+
+**The outputs of this "custom GPT" should not be taken on trust, nor are they meant to replace or simulate student work, but rather to supplement it.** Reflection on the outputs is encouraged: for example, on why the LLM so often struggles with basic sound patterns (e.g., alliteration, rhyme). Such conversations can develop critical AI literacy skills.
+
+## References
+Much of the code is owed to [@osanseviero](https://huggingface.co/spaces/osanseviero/mistral-super-fast) and [@ysharma](https://huggingface.co/spaces/ysharma/Chat_with_Meta_llama3_8b).
+
+## Llama 3 docs
+[Llama 3 License](https://github.com/meta-llama/llama3/blob/main/LICENSE)
+
+[Llama 3 Use Policy](https://ai.meta.com/llama/use-policy/)
app.py
ADDED
@@ -0,0 +1,154 @@
+from huggingface_hub import InferenceClient
+import gradio as gr
+import os
+import re
+
+# Get secret (HF_TOKEN)
+HF_TOKEN = os.environ.get("HF_TOKEN", None)
+
+# HTML/CSS for the page header
+DESCRIPTION = """
+<div>
+<h1 style="text-align: center;">Llama 3 Poem Analysis (Work-in-progress)</h1>
+<p><h2>Copy-paste a poem into the textbox --> get Llama 3-generated commentary *hallucinations likely*</h2></p>
+</div>
+"""
+
+LICENSE = """
+<p/>
+---
+Built with Meta Llama 3
+"""
+
+# Not currently used; having trouble integrating it as a gr.Textbox in the params to the gr.ChatInterface framework
+PLACEHOLDER = """
+<div>
+<img src="TBD" style="opacity: 0.55;">
+</div>
+"""
+
+css = """
+h1 {
+  text-align: center;
+  display: block;
+}
+"""
+
+# Initialize Llama as the model; using InferenceClient for speed
+client = InferenceClient(
+    "meta-llama/Meta-Llama-3-8B-Instruct"
+)
+
+# Get few-shot samples from PoemAnalysisSamples.txt
+with open("PoemAnalysisSamples.txt", "r") as f:
+    sample_poems = f.read()
+
+pairs = re.findall(r'<poem>(.*?)</poem>\s*<response>(.*?)</response>', sample_poems, re.DOTALL)
+
+# System message to initialize the poetry assistant
+sys_message = """
+Assistant provides detailed analysis of poems following the format of the few-shot samples given. Assistant uses the following poetic terms and concepts to describe the poem entered by the user: simile, metaphor, metonymy, imagery, synecdoche, meter, diction, end rhyme, internal rhyme, and slant rhyme.
+"""
+
+# Helper function for formatting
+def format_prompt(message, history):
+    """Formats the prompt for the LLM.
+
+    Args:
+        message: current user text entry
+        history: conversation history tracked by Gradio
+    Returns:
+        prompt: formatted properly for inference
+    """
+    # Start with the system message in the Llama 3 message format:
+    # https://llama.meta.com/docs/model-cards-and-prompt-formats/meta-llama-3/
+    # Note: <|begin_of_text|> must appear only once, at the very start.
+    prompt = f"<|begin_of_text|><|start_header_id|>system<|end_header_id|>{sys_message}<|eot_id|>"
+    # Unpack the user and assistant messages from the few-shot samples
+    for poem, response in pairs:
+        prompt += f"<|start_header_id|>user<|end_header_id|>{poem}<|eot_id|>"
+        prompt += f"<|start_header_id|>assistant<|end_header_id|>{response}<|eot_id|>"
+    # Unpack the conversation history stored by Gradio
+    for user_prompt, bot_response in history:
+        prompt += f"<|start_header_id|>user<|end_header_id|>{user_prompt}<|eot_id|>"
+        prompt += f"<|start_header_id|>assistant<|end_header_id|>{bot_response}<|eot_id|>"
+    # Add the new message and cue the assistant's turn
+    prompt += f"<|start_header_id|>user<|end_header_id|>{message}<|eot_id|><|start_header_id|>assistant<|end_header_id|>"
+    return prompt
+
+# Function to generate the LLM response
+def generate(
+    prompt, history, temperature=0.1, max_new_tokens=1024, top_p=0.95, repetition_penalty=1.0,
+):
+    temperature = float(temperature)
+    if temperature < 1e-2:
+        temperature = 1e-2
+    top_p = float(top_p)
+
+    generate_kwargs = dict(
+        temperature=temperature,
+        max_new_tokens=max_new_tokens,
+        top_p=top_p,
+        repetition_penalty=repetition_penalty,
+        do_sample=True,
+        seed=42,
+        stop_sequences=["<|eot_id|>"],  # Llama 3 requires this stop token
+    )
+
+    formatted_prompt = format_prompt(prompt, history)
+
+    stream = client.text_generation(formatted_prompt, **generate_kwargs, stream=True, details=True, return_full_text=True)  # return_full_text=True aids debugging the conversation history
+    output = ""
+
+    for response in stream:
+        output += response.token.text
+        yield output
+    return output
+
+# Initialize sliders
+additional_inputs = [
+    gr.Slider(
+        label="Temperature",
+        value=0.1,
+        minimum=0.0,
+        maximum=1.0,
+        step=0.05,
+        interactive=True,
+        info="Higher values produce more diverse outputs",
+    ),
+    gr.Slider(
+        label="Max new tokens",
+        value=1024,
+        minimum=0,
+        maximum=4096,
+        step=64,
+        interactive=True,
+        info="The maximum number of new tokens",
+    ),
+    gr.Slider(
+        label="Top-p (nucleus sampling)",
+        value=0.90,
+        minimum=0.0,
+        maximum=1.0,
+        step=0.05,
+        interactive=True,
+        info="Higher values sample more low-probability tokens",
+    ),
+    gr.Slider(
+        label="Repetition penalty",
+        value=1.2,
+        minimum=1.0,
+        maximum=2.0,
+        step=0.05,
+        interactive=True,
+        info="Penalize repeated tokens",
+    ),
+]
+
+# Gradio UI
+with gr.Blocks(css=css) as demo:
+    gr.ChatInterface(
+        fn=generate,
+        description=DESCRIPTION,
+        additional_inputs=additional_inputs,
+    )
+    gr.Markdown(LICENSE)
+
+demo.queue(concurrency_count=75, max_size=100).launch(debug=True)
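The Llama 3 prompt format expects a single `<|begin_of_text|>` token at the very start, followed by header-delimited turns each closed by `<|eot_id|>`, and a trailing assistant header to cue generation. A standalone sketch of that assembly, decoupled from the Space's globals (the sample system message, few-shot pair, and history here are stand-ins):

```python
def build_llama3_prompt(message, history, pairs, sys_message):
    """Assemble a Llama 3 chat prompt from a system message,
    few-shot (poem, response) pairs, and Gradio-style history."""
    # <|begin_of_text|> appears exactly once, at the start
    prompt = f"<|begin_of_text|><|start_header_id|>system<|end_header_id|>{sys_message}<|eot_id|>"
    # Few-shot examples become alternating user/assistant turns
    for poem, response in pairs:
        prompt += f"<|start_header_id|>user<|end_header_id|>{poem}<|eot_id|>"
        prompt += f"<|start_header_id|>assistant<|end_header_id|>{response}<|eot_id|>"
    # Prior conversation turns follow the same pattern
    for user_prompt, bot_response in history:
        prompt += f"<|start_header_id|>user<|end_header_id|>{user_prompt}<|eot_id|>"
        prompt += f"<|start_header_id|>assistant<|end_header_id|>{bot_response}<|eot_id|>"
    # New message, then an open assistant header so the model continues from here
    prompt += f"<|start_header_id|>user<|end_header_id|>{message}<|eot_id|><|start_header_id|>assistant<|end_header_id|>"
    return prompt

p = build_llama3_prompt(
    "A new poem here",
    history=[],
    pairs=[("sample poem", "sample commentary")],
    sys_message="You analyze poems.",
)
print(p.count("<|begin_of_text|>"))  # 1: the token occurs once, at the start
```

Repeating `<|begin_of_text|>` before later turns would signal a fresh document mid-conversation, which can degrade the model's use of the few-shot examples and history.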