---
license: openrail
pipeline_tag: text-generation
library_name: transformers
widget:
- text: "a photograph of"
example_title: "photo"
- text: "a bizarre cg render"
example_title: "render"
- text: "the spaghetti"
example_title: "meal?"
- text: "a (detailed+ intricate)+ picture"
example_title: "weights"
inference:
parameters:
temperature: 2.4
max_new_tokens: 200
---
A model trained on the prompts of all the images in my InvokeAI output directory, meant to be used with [InvokeAI](https://github.com/invoke-ai/InvokeAI) (a Stable Diffusion implementation/UI) to generate new, probably wild nightmare images.
This model is mostly trained on positive prompts, though you may catch some words in [square brackets], which InvokeAI treats as negative.
GPT-Neo is usually quite good at pairing parentheses, quotation marks, etc., but don't be too surprised if it generates something that's not quite valid InvokeAI prompt syntax.
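If you want to filter out generations with obviously broken syntax, here's a minimal sketch of a balance check. Note the rules below (matching `()`/`[]` pairs and an even number of double quotes) are my own rough approximation, not InvokeAI's actual prompt parser:

```python
def brackets_balanced(prompt: str) -> bool:
    """Rough check that (), [], and double quotes pair up in a prompt.
    This approximates, but does not implement, InvokeAI's parser."""
    pairs = {")": "(", "]": "["}
    stack = []
    for ch in prompt:
        if ch in "([":
            stack.append(ch)
        elif ch in pairs:
            if not stack or stack.pop() != pairs[ch]:
                return False
    # all openers closed, and quotes appear an even number of times
    return not stack and prompt.count('"') % 2 == 0

print(brackets_balanced("a (detailed+ intricate)+ picture"))  # True
print(brackets_balanced("a [blurry photo of"))                # False
```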
To use this model, you can import it as a pipeline like so:
```py
from transformers import pipeline
generator = pipeline(model="cactusfriend/nightmare-invokeai-prompts",
                     tokenizer="cactusfriend/nightmare-invokeai-prompts",
                     task="text-generation")
```
Here's an example function that will generate, by default, 20 prompts at a temperature of 1.8, which seems to work well for this model.
```py
def makePrompts(prompt: str, *, p: float = 0.9,
                k: int = 40, num: int = 20,
                temp: float = 1.8, mnt: int = 150):
    # sample `num` continuations of the given prompt
    outputs = generator(prompt, max_new_tokens=mnt,
                        temperature=temp, do_sample=True,
                        top_p=p, top_k=k, num_return_sequences=num)
    # de-duplicate before printing
    items = {i["generated_text"] for i in outputs}
    print("-" * 60)
    print("\n ---\n".join(items))
    print("-" * 60)
```
Then, you can call it like so:
```py
makePrompts("a photograph of")
# or, to change some defaults:
makePrompts("spaghetti all over", temp=1.4, p=0.92, k=45)
```
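If you'd rather feed the generated prompts to InvokeAI in bulk, one approach is to collect them and write one prompt per line to a text file. This is a sketch; the helper name and filename here are my own assumptions, not part of this model card:

```python
def save_prompts(prompts, path="nightmare-prompts.txt"):
    """Write unique prompts, one per line, for later batch use."""
    unique = sorted(set(prompts))
    with open(path, "w", encoding="utf-8") as f:
        f.write("\n".join(unique))
    return len(unique)

n = save_prompts(["a photograph of a lamp", "the spaghetti", "the spaghetti"])
print(f"saved {n} prompts")  # saved 2 prompts
```

To use it with the function above, you would change `makePrompts` to return `items` instead of printing them, then pass the result to `save_prompts`.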