| Column | Type | Length / Values |
| --- | --- | --- |
| modelId | string | 4–112 chars |
| lastModified | string | 24 chars |
| tags | list | |
| pipeline_tag | string | 21 classes |
| files | list | |
| publishedBy | string | 2–37 chars |
| downloads_last_month | int32 | 0–9.44M |
| library | string | 15 classes |
| modelCard | large string | 0–100k chars |
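Each record below carries these nine fields. As a minimal sketch of how such a dump could be loaded and queried (the file name `model_cards.jsonl` and the JSON-lines layout are assumptions, not part of the dump itself):

```python
import json

# Assumption: the dump is stored as newline-delimited JSON, one record per
# line, with the nine columns described above. The file name is hypothetical.
with open("model_cards.jsonl") as f:
    records = [json.loads(line) for line in f]

# Example query: huggingtweets text-generation models, most downloaded first.
bots = sorted(
    (r for r in records
     if r["publishedBy"] == "huggingtweets"
     and r["pipeline_tag"] == "text-generation"),
    key=lambda r: r["downloads_last_month"],
    reverse=True,
)
for r in bots[:5]:
    print(r["modelId"], r["downloads_last_month"])
```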
huggingtweets/samyamar_
2021-05-22T21:50:56.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "vocab.json" ]
huggingtweets
43
transformers
---
language: en
thumbnail: https://www.huggingtweets.com/samyamar_/1602257539501/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1180760757589463040/FwIJiFOW_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Samy Amar 🤖 AI Bot </div>
<div style="font-size: 15px; color: #657786">@samyamar_ bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@samyamar_'s tweets](https://twitter.com/samyamar_).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 2402 |
| Retweets | 1109 |
| Short tweets | 89 |
| Tweets kept | 1204 |

[Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/24ub2vjc/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @samyamar_'s tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/5ejcej6c) for full transparency and reproducibility.

At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/5ejcej6c/artifacts) is logged and versioned.

## Intended uses & limitations

### How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/samyamar_')
generator("My dream is", num_return_sequences=5)
```

### Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
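Every card's training-data table follows the same arithmetic: tweets kept = tweets downloaded − retweets − short tweets. A rough sketch of that filtering rule is below; the exact preprocessing lives in the huggingtweets repository, so the 10-character cutoff and the link-stripping step are assumptions for illustration only.

```python
import re

def keep_tweet(tweet: str) -> bool:
    """Approximate the filtering implied by the data tables:
    drop retweets and short tweets, keep the rest."""
    if tweet.startswith("RT @"):  # retweet
        return False
    # Ignore bare links when measuring length (assumed behavior).
    text = re.sub(r"https?://\S+", "", tweet).strip()
    if len(text) < 10:  # "short tweet" threshold is an assumption
        return False
    return True

tweets = ["RT @someone: hello", "ok", "My dream is to train tiny language models"]
kept = [t for t in tweets if keep_tweet(t)]  # only the last tweet survives
```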
huggingtweets/sanchezcastejon
2021-05-25T12:15:15.000Z
[ "pytorch", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
6
transformers
---
language: en
thumbnail: https://www.huggingtweets.com/sanchezcastejon/1621944887525/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1369177448408088576/r5zlE_8T_400x400.jpg')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Pedro Sánchez</div>
<div style="text-align: center; font-size: 14px;">@sanchezcastejon</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on tweets from Pedro Sánchez.

| Data | Pedro Sánchez |
| --- | --- |
| Tweets downloaded | 3250 |
| Retweets | 1434 |
| Short tweets | 0 |
| Tweets kept | 1816 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2w5z1l5p/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sanchezcastejon's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1xpv7imi) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1xpv7imi/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/sanchezcastejon')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
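Generation with these pipelines is sampled, so repeated calls return different text. A small variation on the card's snippet pins a seed for reproducibility; the sampling values (`max_length`, `top_p`) are illustrative choices, not settings taken from the card.

```python
from transformers import pipeline, set_seed

generator = pipeline('text-generation', model='huggingtweets/sanchezcastejon')
set_seed(42)  # fix the RNG so the five samples are reproducible

# Sampling settings below are illustrative, not from the model card.
outputs = generator("My dream is", num_return_sequences=5,
                    max_length=50, do_sample=True, top_p=0.95)
for out in outputs:
    print(out["generated_text"])
```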
huggingtweets/sandissauka
2021-05-22T21:52:11.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
6
transformers
---
language: en
thumbnail: https://www.huggingtweets.com/sandissauka/1617671340616/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1366886371374358530/IsxgfIUX_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Sandis Sauka 🤖 AI Bot </div>
<div style="font-size: 15px">@sandissauka bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@sandissauka's tweets](https://twitter.com/sandissauka).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 1882 |
| Retweets | 539 |
| Short tweets | 327 |
| Tweets kept | 1016 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/c92lxy08/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sandissauka's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/gxcknvuj) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/gxcknvuj/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/sandissauka')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
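Each card states that a pre-trained GPT-2 is fine-tuned on the kept tweets. A minimal sketch of that kind of causal-LM fine-tune with the `transformers` Trainer is below; it is an approximation, not the project's actual training script, and the hyperparameters and tiny inline tweet list are placeholders.

```python
from datasets import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

# Placeholder corpus; in huggingtweets this is the filtered "tweets kept" set.
tweets = ["just shipped a new model", "reading about transformers again"]

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Tokenize the tweets into a causal-LM training set.
dataset = Dataset.from_dict({"text": tweets}).map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True,
    remove_columns=["text"],
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out",
                           num_train_epochs=4,          # placeholder value
                           per_device_train_batch_size=1),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```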
huggingtweets/sanhestpasmoi
2021-05-22T21:53:19.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "vocab.json" ]
huggingtweets
14
transformers
---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo_share.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/1141847999917768704/Wn2vaOMO_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Victor Sanh 🤖 AI Bot </div>
<div style="font-size: 15px; color: #657786">@sanhestpasmoi bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@sanhestpasmoi's tweets](https://twitter.com/sanhestpasmoi).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 957 |
| Retweets | 375 |
| Short tweets | 35 |
| Tweets kept | 547 |

[Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/1uhcy43z/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sanhestpasmoi's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/2m1a68n0) for full transparency and reproducibility.

At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/2m1a68n0/artifacts) is logged and versioned.

## Intended uses & limitations

### How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/sanhestpasmoi')
generator("My dream is", num_return_sequences=5)
```

### Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/sapphirelally
2021-05-22T21:55:05.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
9
transformers
---
language: en
thumbnail: https://www.huggingtweets.com/sapphirelally/1616767752799/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1320826464489607173/NCYr3Kyj_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Sapphire Lally 🤖 AI Bot </div>
<div style="font-size: 15px">@sapphirelally bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@sapphirelally's tweets](https://twitter.com/sapphirelally).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3186 |
| Retweets | 600 |
| Short tweets | 143 |
| Tweets kept | 2443 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/wntmqj00/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sapphirelally's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1ge3zqrv) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1ge3zqrv/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/sapphirelally')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/sardesairajdeep
2021-05-22T21:56:12.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
7
transformers
---
language: en
thumbnail: https://www.huggingtweets.com/sardesairajdeep/1617827229720/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1190959165319065600/-nKwExDB_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Rajdeep Sardesai 🤖 AI Bot </div>
<div style="font-size: 15px">@sardesairajdeep bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@sardesairajdeep's tweets](https://twitter.com/sardesairajdeep).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3250 |
| Retweets | 425 |
| Short tweets | 190 |
| Tweets kept | 2635 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1szelcnp/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sardesairajdeep's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/177s27zk) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/177s27zk/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/sardesairajdeep')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/sardied1
2021-05-22T21:57:17.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
7
transformers
---
language: en
thumbnail: https://www.huggingtweets.com/sardied1/1617813043017/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1375443068317470721/awg-4XON_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">fuck 🤖 AI Bot </div>
<div style="font-size: 15px">@sardied1 bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@sardied1's tweets](https://twitter.com/sardied1).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3157 |
| Retweets | 1195 |
| Short tweets | 498 |
| Tweets kept | 1464 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/bael3wg3/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sardied1's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1hp7dsgj) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1hp7dsgj/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/sardied1')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/sashasoftshark
2021-05-22T21:58:33.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
10
transformers
---
language: en
thumbnail: https://www.huggingtweets.com/sashasoftshark/1617793554312/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1377790224537956352/J5Swpv8x_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">🌌 soft space shark 🦈 (🎂T-9!🚀) 🤖 AI Bot </div>
<div style="font-size: 15px">@sashasoftshark bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@sashasoftshark's tweets](https://twitter.com/sashasoftshark).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3244 |
| Retweets | 727 |
| Short tweets | 597 |
| Tweets kept | 1920 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/fbird304/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sashasoftshark's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/wipbd3h9) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/wipbd3h9/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/sashasoftshark')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/sayantandas_
2021-05-22T21:59:35.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
7
transformers
---
language: en
thumbnail: https://www.huggingtweets.com/sayantandas_/1610724668237/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1325352169776713728/R1rAfxQ7_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Sayantan Das 🤖 AI Bot </div>
<div style="font-size: 15px; color: #657786">@sayantandas_ bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@sayantandas_'s tweets](https://twitter.com/sayantandas_).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 1341 |
| Retweets | 479 |
| Short tweets | 141 |
| Tweets kept | 721 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2yyfzhps/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sayantandas_'s tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/qpocnx1m) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/qpocnx1m/artifacts) is logged and versioned.

## Intended uses & limitations

### How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/sayantandas_')
generator("My dream is", num_return_sequences=5)
```

### Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/sburhanova
2021-05-22T22:00:55.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
9
transformers
---
language: en
thumbnail: https://www.huggingtweets.com/sburhanova/1618946732960/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1376581665661804545/GDJjfP7W_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Сабинэлла 🤖 AI Bot </div>
<div style="font-size: 15px">@sburhanova bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@sburhanova's tweets](https://twitter.com/sburhanova).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3174 |
| Retweets | 147 |
| Short tweets | 389 |
| Tweets kept | 2638 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/24qx8h9j/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sburhanova's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/xpxepnde) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/xpxepnde/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/sburhanova')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/scarlet_platnm
2021-05-22T22:02:04.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
7
transformers
---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1374138228576501763/Tt6KUbNh_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Scarlet 🏳️‍⚧️ 🤖 AI Bot </div>
<div style="font-size: 15px">@scarlet_platnm bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@scarlet_platnm's tweets](https://twitter.com/scarlet_platnm).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3239 |
| Retweets | 683 |
| Short tweets | 458 |
| Tweets kept | 2098 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3s65gk6s/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @scarlet_platnm's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3a49phf4) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3a49phf4/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/scarlet_platnm')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/scarysmilingdog
2021-05-22T22:03:37.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
7
transformers
---
language: en
thumbnail: https://www.huggingtweets.com/scarysmilingdog/1618977555882/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1380538178667446273/gNl0y2pb_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Kiko 🤖 AI Bot </div>
<div style="font-size: 15px">@scarysmilingdog bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@scarysmilingdog's tweets](https://twitter.com/scarysmilingdog).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 1567 |
| Retweets | 255 |
| Short tweets | 193 |
| Tweets kept | 1119 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/sscoe37w/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @scarysmilingdog's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/62i2trmb) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/62i2trmb/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/scarysmilingdog')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/scooterabrahaam
2021-05-22T22:04:59.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
12
transformers
---
language: en
thumbnail: https://www.huggingtweets.com/scooterabrahaam/1608160278699/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1314337993834983424/y6Ql9UL0_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Scooter Abraham 🤖 AI Bot </div>
<div style="font-size: 15px; color: #657786">@scooterabrahaam bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@scooterabrahaam's tweets](https://twitter.com/scooterabrahaam).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3214 |
| Retweets | 50 |
| Short tweets | 376 |
| Tweets kept | 2788 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/34vu2a66/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @scooterabrahaam's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/74h0ixpp) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/74h0ixpp/artifacts) is logged and versioned.

## Intended uses & limitations

### How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/scooterabrahaam')
generator("My dream is", num_return_sequences=5)
```

### Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/scottadamssays
2021-05-22T22:06:16.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
6
transformers
---
language: en
thumbnail: https://www.huggingtweets.com/scottadamssays/1617520581602/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1259614511859765248/uxqTchXo_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Scott Adams 🤖 AI Bot </div>
<div style="font-size: 15px">@scottadamssays bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@scottadamssays's tweets](https://twitter.com/scottadamssays).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3248 |
| Retweets | 788 |
| Short tweets | 178 |
| Tweets kept | 2282 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2gchrwyn/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @scottadamssays's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3pdazf12) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3pdazf12/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/scottadamssays')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/scottcrates
2021-05-22T22:07:30.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "vocab.json" ]
huggingtweets
13
transformers
---
language: en
thumbnail: https://www.huggingtweets.com/scottcrates/1601244862947/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1240742228445683713/mk1A_Qsc_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Scottacular 🤖 AI Bot </div>
<div style="font-size: 15px; color: #657786">@scottcrates bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@scottcrates's tweets](https://twitter.com/scottcrates).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3224 |
| Retweets | 1809 |
| Short tweets | 397 |
| Tweets kept | 1018 |

[Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/2141p5hu/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @scottcrates's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/34o2dlz1) for full transparency and reproducibility.

At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/34o2dlz1/artifacts) is logged and versioned.

## Intended uses & limitations

### How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/scottcrates')
generator("My dream is", num_return_sequences=5)
```

### Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/scottmorrisonmp
2021-05-22T22:08:55.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "vocab.json" ]
huggingtweets
17
transformers
---
language: en
thumbnail: http://res.cloudinary.com/huggingtweets/image/upload/v1599880423/scottmorrisonmp.jpg
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/1116081523394891776/AYnEcQnG_400x400.png')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Scott Morrison 🤖 AI Bot </div>
<div style="font-size: 15px; color: #657786">@scottmorrisonmp bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@scottmorrisonmp's tweets](https://twitter.com/scottmorrisonmp).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3224 |
| Retweets | 638 |
| Short tweets | 41 |
| Tweets kept | 2545 |

[Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/2aq792eb/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @scottmorrisonmp's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/fsmcdjkf) for full transparency and reproducibility.

At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/fsmcdjkf/artifacts) is logged and versioned.

## Intended uses & limitations

### How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/scottmorrisonmp')
generator("My dream is", num_return_sequences=5)
```

### Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/scpebooks
2021-05-22T22:09:57.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
6
transformers
--- language: en thumbnail: https://www.huggingtweets.com/scpebooks/1616772562331/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1133808969598808065/RBypAo1V_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">scp_txt 🤖 AI Bot </div> <div style="font-size: 15px">@scpebooks bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@scpebooks's tweets](https://twitter.com/scpebooks). | Data | Quantity | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 0 | | Short tweets | 493 | | Tweets kept | 2757 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/b8m9cmwx/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @scpebooks's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2flyadcu) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2flyadcu/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/scpebooks') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
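Each card notes that the data "is tracked with [W&B artifacts] at every step of the pipeline." For readers unfamiliar with that mechanism, the sketch below shows the general shape of logging a dataset file as a versioned artifact with the `wandb` client; the project and artifact names here are illustrative assumptions, not the ones used by huggingtweets:

```python
import wandb

# Names are illustrative; huggingtweets uses its own project/artifact naming.
run = wandb.init(project="huggingtweets-demo")

artifact = wandb.Artifact("tweets-scpebooks", type="dataset")
artifact.add_file("tweets.txt")  # the filtered tweet corpus used for fine-tuning

run.log_artifact(artifact)  # W&B stores and versions the file
run.finish()
```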
huggingtweets/scpwiki
2021-05-22T22:11:11.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
13
transformers
--- language: en thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1013363049158332417/MNhkdJcK_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">The SCP Foundation 🤖 AI Bot </div> <div style="font-size: 15px">@scpwiki bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@scpwiki's tweets](https://twitter.com/scpwiki). | Data | Quantity | | --- | --- | | Tweets downloaded | 3219 | | Retweets | 385 | | Short tweets | 302 | | Tweets kept | 2532 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2kz7gdc3/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @scpwiki's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/17pdq2uc) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/17pdq2uc/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/scpwiki') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/scrawledsongs
2021-05-22T22:12:47.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
7
transformers
--- language: en thumbnail: https://www.huggingtweets.com/scrawledsongs/1618324632740/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1077189369591562240/Ufhv9ZEX_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">tales were first told with a tune 🤖 AI Bot </div> <div style="font-size: 15px">@scrawledsongs bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@scrawledsongs's tweets](https://twitter.com/scrawledsongs). | Data | Quantity | | --- | --- | | Tweets downloaded | 1502 | | Retweets | 36 | | Short tweets | 9 | | Tweets kept | 1457 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2semjvon/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @scrawledsongs's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2pmobg5r) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2pmobg5r/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/scrawledsongs') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/scrmshw
2021-05-22T22:13:51.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
6
transformers
--- language: en thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1374560780662759427/t3b2EBQ7_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">~keeper of breeze≈ 🤖 AI Bot </div> <div style="font-size: 15px">@scrmshw bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@scrmshw's tweets](https://twitter.com/scrmshw). | Data | Quantity | | --- | --- | | Tweets downloaded | 3239 | | Retweets | 186 | | Short tweets | 526 | | Tweets kept | 2527 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1hava73l/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @scrmshw's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3ocxz6v9) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3ocxz6v9/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/scrmshw') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/scromiting
2021-05-22T22:14:57.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
8
transformers
--- language: en thumbnail: https://www.huggingtweets.com/scromiting/1616728393546/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1323393340990201856/czyh4BSg_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">economic crisis actor 🤖 AI Bot </div> <div style="font-size: 15px">@scromiting bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@scromiting's tweets](https://twitter.com/scromiting). | Data | Quantity | | --- | --- | | Tweets downloaded | 956 | | Retweets | 81 | | Short tweets | 129 | | Tweets kept | 746 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2dgr5c8c/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @scromiting's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/8oh7mcof) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/8oh7mcof/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/scromiting') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/scrubphilosophy
2021-05-22T22:16:23.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
7
transformers
--- language: en thumbnail: https://www.huggingtweets.com/scrubphilosophy/1616731281223/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1198090654263283719/Vud98Uvd_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Scrub 🤖 AI Bot </div> <div style="font-size: 15px">@scrubphilosophy bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@scrubphilosophy's tweets](https://twitter.com/scrubphilosophy). | Data | Quantity | | --- | --- | | Tweets downloaded | 1923 | | Retweets | 512 | | Short tweets | 467 | | Tweets kept | 944 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/39yhwp4h/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @scrubphilosophy's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/33gnfi5r) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/33gnfi5r/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/scrubphilosophy') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/seangaz
2021-05-22T22:17:26.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
7
transformers
--- language: en thumbnail: https://www.huggingtweets.com/seangaz/1616769751980/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/3536357845/7765251ab33f62d3fc550251fe76348c_400x400.jpeg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Sean Gasiorowski 🤖 AI Bot </div> <div style="font-size: 15px">@seangaz bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@seangaz's tweets](https://twitter.com/seangaz). | Data | Quantity | | --- | --- | | Tweets downloaded | 222 | | Retweets | 7 | | Short tweets | 34 | | Tweets kept | 181 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3n5mqr8l/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @seangaz's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2d14q9ol) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2d14q9ol/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/seangaz') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/seanmombo
2021-05-22T22:18:38.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
6
transformers
--- language: en thumbnail: https://www.huggingtweets.com/seanmombo/1617821232903/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1378471365553102849/7ZoddLS8_400x400.png')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">sean is figuring things out 🤖 AI Bot </div> <div style="font-size: 15px">@seanmombo bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@seanmombo's tweets](https://twitter.com/seanmombo). | Data | Quantity | | --- | --- | | Tweets downloaded | 3249 | | Retweets | 86 | | Short tweets | 340 | | Tweets kept | 2823 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1hsms4lb/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @seanmombo's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/32fs7u86) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/32fs7u86/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/seanmombo') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/seannameeshelle
2021-05-22T22:19:44.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
10
transformers
--- language: en thumbnail: https://www.huggingtweets.com/seannameeshelle/1616722006868/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1202336322280542208/aX27WAfE_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">billy but it's said in an english accent 🤖 AI Bot </div> <div style="font-size: 15px">@seannameeshelle bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@seannameeshelle's tweets](https://twitter.com/seannameeshelle). | Data | Quantity | | --- | --- | | Tweets downloaded | 3207 | | Retweets | 885 | | Short tweets | 235 | | Tweets kept | 2087 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/5hw5t9cj/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @seannameeshelle's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/puifmxcf) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/puifmxcf/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/seannameeshelle') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/sebastiankurz
2021-05-22T22:21:01.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "vocab.json" ]
huggingtweets
16
transformers
--- language: en thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true tags: - huggingtweets widget: - text: "My dream is" --- <link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/824015313863921664/Nb1P0KUH_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Sebastian Kurz 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@sebastiankurz bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@sebastiankurz's tweets](https://twitter.com/sebastiankurz). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3201</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>683</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>36</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2482</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/2dioxzt9/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sebastiankurz's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/wva1pyr5) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/wva1pyr5/artifacts) is logged and versioned. 
## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/sebastiankurz'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/sedirox
2021-05-22T22:22:08.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "vocab.json" ]
huggingtweets
15
transformers
--- language: en thumbnail: https://www.huggingtweets.com/sedirox/1602273002412/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1303165215932989440/bhO1HSOj_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Sedi 🎀 @ FFXIV: ARR & Hades 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@sedirox bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@sedirox's tweets](https://twitter.com/sedirox). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3214</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>1267</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>380</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1567</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/2i0i5rzl/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sedirox's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/7u77mo7t) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/7u77mo7t/artifacts) is logged and versioned. 
## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/sedirox'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/seffsaid
2021-05-22T22:23:15.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
7
transformers
--- language: en thumbnail: https://www.huggingtweets.com/seffsaid/1612884596137/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/810810341198270464/2ZdZEdlT_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Inspirational Quotes 🤖 AI Bot </div> <div style="font-size: 15px">@seffsaid bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@seffsaid's tweets](https://twitter.com/seffsaid). | Data | Quantity | | --- | --- | | Tweets downloaded | 3233 | | Retweets | 74 | | Short tweets | 350 | | Tweets kept | 2809 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/7wgqrnap/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @seffsaid's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/khw5cvds) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/khw5cvds/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/seffsaid') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/seleniumreal
2021-05-22T22:24:34.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "vocab.json" ]
huggingtweets
14
transformers
--- language: en thumbnail: http://res.cloudinary.com/huggingtweets/image/upload/v1599953062/seleniumreal.jpg tags: - huggingtweets widget: - text: "My dream is" --- <link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/1229254969968205824/Dev2-C07_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Reasonably Selenium 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@seleniumreal bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@seleniumreal's tweets](https://twitter.com/seleniumreal). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>316</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>18</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>71</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>227</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/1xvf8gta/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @seleniumreal's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/3bckcjtw) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/3bckcjtw/artifacts) is logged and versioned. 
## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/seleniumreal'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/sellarsrespectr
2021-05-22T22:25:43.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
6
transformers
--- language: en thumbnail: https://www.huggingtweets.com/sellarsrespectr/1616720155815/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1004831714231742464/zoP72CMZ_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">•Nate• •BLM• 🤖 AI Bot </div> <div style="font-size: 15px">@sellarsrespectr bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@sellarsrespectr's tweets](https://twitter.com/sellarsrespectr). | Data | Quantity | | --- | --- | | Tweets downloaded | 3237 | | Retweets | 272 | | Short tweets | 416 | | Tweets kept | 2549 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2s51p72h/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sellarsrespectr's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/tus3zndp) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/tus3zndp/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/sellarsrespectr') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/senorstallone
2021-05-22T22:27:01.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "vocab.json" ]
huggingtweets
22
transformers
--- language: en thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo_share.png?raw=true tags: - huggingtweets widget: - text: "My dream is" --- <link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/710114707974320129/HTTtHH9q_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">filipetrocado 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@senorstallone bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@senorstallone's tweets](https://twitter.com/senorstallone). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>2147</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>245</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>182</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1720</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/19wrfs81/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @senorstallone's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/3vxgemfh) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/3vxgemfh/artifacts) is logged and versioned. 
## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/senorstallone'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/sentienter
2021-05-22T22:28:11.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
15
transformers
--- language: en thumbnail: https://www.huggingtweets.com/sentienter/1616642835417/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1274873508711940097/BKZv8mxD_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Walker 🤖 AI Bot </div> <div style="font-size: 15px">@sentienter bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@sentienter's tweets](https://twitter.com/sentienter). | Data | Quantity | | --- | --- | | Tweets downloaded | 77 | | Retweets | 16 | | Short tweets | 5 | | Tweets kept | 56 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2se5p98l/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sentienter's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/27jgnob0) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/27jgnob0/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/sentienter') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/seocamp
2021-05-22T22:29:06.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "vocab.json" ]
huggingtweets
48
transformers
--- language: en thumbnail: https://www.huggingtweets.com/seocamp/1600856567422/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/557135313558970369/0rA33HGL_400x400.png')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">SEO Camp 🤖 AI Bot </div> <div style="font-size: 15px">@seocamp bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@seocamp's tweets](https://twitter.com/seocamp). | Data | Quantity | | --- | --- | | Tweets downloaded | 3238 | | Retweets | 849 | | Short tweets | 53 | | Tweets kept | 2336 | [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/2g3bq1ht/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @seocamp's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/2725jswm) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/2725jswm/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/seocamp') generator("My dream is", num_return_sequences=5) ``` ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/seraxiz
2021-05-22T22:30:09.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
7
transformers
--- language: en thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1325233006609649667/WWD8BL_W_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Sera♪ 🤖 AI Bot </div> <div style="font-size: 15px">@seraxiz bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@seraxiz's tweets](https://twitter.com/seraxiz). | Data | Quantity | | --- | --- | | Tweets downloaded | 3244 | | Retweets | 266 | | Short tweets | 727 | | Tweets kept | 2251 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/31zjtgyq/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @seraxiz's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/b5wbv6sy) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/b5wbv6sy/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/seraxiz') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
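The pipeline call forwards extra keyword arguments to the underlying `generate` method, so sampling behaviour can be tuned inline. A sketch with assumed, untuned values; `temperature` and `top_p` below are illustrative, not settings from this card:

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/seraxiz')
results = generator(
    "My dream is",
    num_return_sequences=5,
    do_sample=True,
    temperature=0.9,   # lower = more conservative, higher = more varied
    top_p=0.95,        # nucleus sampling: keep the smallest 95%-probability token set
    max_length=60,
)
for r in results:
    print(r['generated_text'])
```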
huggingtweets/seyitaylor
2021-05-22T22:31:18.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
9
transformers
--- language: en thumbnail: https://www.huggingtweets.com/seyitaylor/1616653340594/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1254941388875206657/Q7HIttwB_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">st. 🤖 AI Bot </div> <div style="font-size: 15px">@seyitaylor bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@seyitaylor's tweets](https://twitter.com/seyitaylor). | Data | Quantity | | --- | --- | | Tweets downloaded | 3246 | | Retweets | 617 | | Short tweets | 800 | | Tweets kept | 1829 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1ncrau3d/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @seyitaylor's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2ej30oc7) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2ej30oc7/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/seyitaylor') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/sfy____
2021-05-22T22:32:37.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
6
transformers
--- language: en thumbnail: https://www.huggingtweets.com/sfy____/1612019989079/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1315018122169024513/xiulsyLD_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Selim 🤖 AI Bot </div> <div style="font-size: 15px">@sfy____ bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@sfy____'s tweets](https://twitter.com/sfy____). | Data | Quantity | | --- | --- | | Tweets downloaded | 586 | | Retweets | 40 | | Short tweets | 68 | | Tweets kept | 478 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1y4u5sex/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sfy____'s tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/hkdw3jxj) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/hkdw3jxj/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/sfy____') generator("My dream is", num_return_sequences=5) ``` ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/shacharmirkin
2021-05-22T22:34:07.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "vocab.json" ]
huggingtweets
17
transformers
--- language: en thumbnail: https://www.huggingtweets.com/shacharmirkin/1602245377709/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1260921115280576512/VEtqb-vj_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Shachar Mirkin 🤖 AI Bot </div> <div style="font-size: 15px">@shacharmirkin bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@shacharmirkin's tweets](https://twitter.com/shacharmirkin). | Data | Quantity | | --- | --- | | Tweets downloaded | 3222 | | Retweets | 174 | | Short tweets | 308 | | Tweets kept | 2740 | [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/145gsic1/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @shacharmirkin's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/3roq9iwb) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/3roq9iwb/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/shacharmirkin') generator("My dream is", num_return_sequences=5) ``` ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/shadowkusanagi
2021-05-22T22:35:16.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
6
transformers
--- language: en thumbnail: https://www.huggingtweets.com/shadowkusanagi/1617750131637/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1338305633557368832/Gj_QrzOT_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">motoko silverhand 🤖 AI Bot </div> <div style="font-size: 15px">@shadowkusanagi bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@shadowkusanagi's tweets](https://twitter.com/shadowkusanagi). | Data | Quantity | | --- | --- | | Tweets downloaded | 2994 | | Retweets | 1247 | | Short tweets | 430 | | Tweets kept | 1317 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1yrx4nl8/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @shadowkusanagi's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2nde4blc) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2nde4blc/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/shadowkusanagi') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
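The "Retweets" and "Short tweets" rows in the table above are dropped before fine-tuning, leaving the "Tweets kept" count. A toy filter in that spirit; the `RT @` test and the 10-character threshold are assumptions for illustration, not the project's exact preprocessing rules:

```python
def keep_tweet(text: str, min_chars: int = 10) -> bool:
    """Illustrative filter: discard retweets and very short tweets."""
    if text.startswith("RT @"):    # assumed retweet marker
        return False
    return len(text) >= min_chars  # assumed short-tweet threshold

tweets = ["RT @ghost: boo", "ok", "my dream is to haunt the timeline"]
print([t for t in tweets if keep_tweet(t)])
# -> ['my dream is to haunt the timeline']
```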
huggingtweets/shaklakhani
2021-05-22T22:37:04.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
13
transformers
--- language: en thumbnail: https://www.huggingtweets.com/shaklakhani/1616695786529/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1125509289811107841/viXfInuC_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Shak Lakhani 🤖 AI Bot </div> <div style="font-size: 15px">@shaklakhani bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@shaklakhani's tweets](https://twitter.com/shaklakhani). | Data | Quantity | | --- | --- | | Tweets downloaded | 3234 | | Retweets | 144 | | Short tweets | 283 | | Tweets kept | 2807 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/afir0qr2/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @shaklakhani's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2bl8p8w3) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2bl8p8w3/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/shaklakhani') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/shallydarte
2021-05-22T22:38:19.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
6
transformers
--- language: en thumbnail: https://www.huggingtweets.com/shallydarte/1616666440129/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1105161301872074754/gMFCDMgQ_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">shally darte 🤖 AI Bot </div> <div style="font-size: 15px">@shallydarte bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@shallydarte's tweets](https://twitter.com/shallydarte). | Data | Quantity | | --- | --- | | Tweets downloaded | 546 | | Retweets | 22 | | Short tweets | 53 | | Tweets kept | 471 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/bfyriehd/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @shallydarte's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2v5e9oki) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2v5e9oki/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/shallydarte') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/shape_nato
2021-05-22T22:39:26.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
6
transformers
--- language: en thumbnail: https://www.huggingtweets.com/shape_nato/1615922075366/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1299281023516123136/MqsKcLzo_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">SHAPE_NATO Allied Command Operations 🤖 AI Bot </div> <div style="font-size: 15px">@shape_nato bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@shape_nato's tweets](https://twitter.com/shape_nato). | Data | Quantity | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 1599 | | Short tweets | 63 | | Tweets kept | 1588 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1lzbqj7w/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @shape_nato's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2zmv1qox) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2zmv1qox/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/shape_nato') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/shartitheclown
2021-05-22T22:40:38.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
7
transformers
--- language: en thumbnail: https://www.huggingtweets.com/shartitheclown/1614136368554/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1362921843292831749/wwbmtSCM_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Franklin 💀✨ 🤖 AI Bot </div> <div style="font-size: 15px">@shartitheclown bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@shartitheclown's tweets](https://twitter.com/shartitheclown). | Data | Quantity | | --- | --- | | Tweets downloaded | 3192 | | Retweets | 1453 | | Short tweets | 164 | | Tweets kept | 1575 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3bp8bisb/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @shartitheclown's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/bc8j6l7q) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/bc8j6l7q/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/shartitheclown') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/shelbythanna
2021-05-22T22:41:54.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
6
transformers
--- language: en thumbnail: https://www.huggingtweets.com/shelbythanna/1616726294169/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1300119184597307393/kWuQsYln_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Shelby T. Hanna 🤖 AI Bot </div> <div style="font-size: 15px">@shelbythanna bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@shelbythanna's tweets](https://twitter.com/shelbythanna). | Data | Quantity | | --- | --- | | Tweets downloaded | 2322 | | Retweets | 157 | | Short tweets | 325 | | Tweets kept | 1840 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2128y3cg/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @shelbythanna's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1na2quvz) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1na2quvz/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/shelbythanna') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
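Generation is stochastic, so repeated calls return different tweets; for reproducible samples you can seed the library's random number generators first. A sketch using `transformers.set_seed` (the seed value is arbitrary):

```python
from transformers import pipeline, set_seed

set_seed(42)  # arbitrary seed; repeated runs now produce the same samples
generator = pipeline('text-generation', model='huggingtweets/shelbythanna')
for r in generator("My dream is", num_return_sequences=5):
    print(r['generated_text'])
```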
huggingtweets/shengokai
2021-05-22T22:43:00.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
16
transformers
--- language: en thumbnail: https://www.huggingtweets.com/shengokai/1616728402938/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1146084503108104193/TzlypMFe_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Dr. Johnathan Flowers says "Fuck your Academy." 🤖 AI Bot </div> <div style="font-size: 15px">@shengokai bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@shengokai's tweets](https://twitter.com/shengokai). | Data | Quantity | | --- | --- | | Tweets downloaded | 3235 | | Retweets | 656 | | Short tweets | 198 | | Tweets kept | 2381 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/26iqvqo7/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @shengokai's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3a4sajqy) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3a4sajqy/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/shengokai') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/sheniroh
2021-06-13T23:46:27.000Z
[ "pytorch", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
0
transformers
--- language: en thumbnail: https://www.huggingtweets.com/sheniroh/1623627889010/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1403896581972238346/bVstV1qf_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">😺🤳(me on twiter)</div> <div style="text-align: center; font-size: 14px;">@sheniroh</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from 😺🤳(me on twiter). | Data | 😺🤳(me on twiter) | | --- | --- | | Tweets downloaded | 1339 | | Retweets | 226 | | Short tweets | 346 | | Tweets kept | 767 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1ygskygk/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sheniroh's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2uk3z22y) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2uk3z22y/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/sheniroh') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/shickdits
2021-05-22T22:44:22.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
6
transformers
--- language: en thumbnail: https://www.huggingtweets.com/shickdits/1617758737222/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1377623171222937601/NFYKiOFm_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">ShickDits 🤖 AI Bot </div> <div style="font-size: 15px">@shickdits bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@shickdits's tweets](https://twitter.com/shickdits). | Data | Quantity | | --- | --- | | Tweets downloaded | 2769 | | Retweets | 755 | | Short tweets | 402 | | Tweets kept | 1612 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/34o01w7t/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @shickdits's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2kvibl61) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2kvibl61/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/shickdits') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/shivon
2021-05-22T22:45:30.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "vocab.json" ]
huggingtweets
9
transformers
--- language: en thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo_share.png?raw=true tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/755006920314937344/PPQ8LKFs_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Shivon Zilis 🤖 AI Bot </div> <div style="font-size: 15px">@shivon bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://bit.ly/2TGXMZf). ## Training data The model was trained on [@shivon's tweets](https://twitter.com/shivon). | Data | Quantity | | --- | --- | | Tweets downloaded | 2630 | | Retweets | 327 | | Short tweets | 161 | | Tweets kept | 2142 | [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/fn5rbom8/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @shivon's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/28713yo6) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/28713yo6/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/shivon') generator("My dream is", num_return_sequences=5) ``` ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/shoe0nhead
2021-05-22T22:46:41.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
13
transformers
--- language: en thumbnail: https://www.huggingtweets.com/shoe0nhead/1615240143166/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1367237688819073029/Z6eoYBbC_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">shoe 🤖 AI Bot </div> <div style="font-size: 15px">@shoe0nhead bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@shoe0nhead's tweets](https://twitter.com/shoe0nhead). | Data | Quantity | | --- | --- | | Tweets downloaded | 3222 | | Retweets | 219 | | Short tweets | 709 | | Tweets kept | 2294 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1mnphvff/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @shoe0nhead's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/31gimc2n) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/31gimc2n/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/shoe0nhead') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/shovelship
2021-05-22T22:47:49.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
6
transformers
---
language: en
thumbnail: https://www.huggingtweets.com/shovelship/1614483379812/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1323044209482440704/biTgCI0h_400x400.png')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">everly 🤖 AI Bot </div>
<div style="font-size: 15px">@shovelship bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@shovelship's tweets](https://twitter.com/shovelship).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 1531 |
| Retweets | 234 |
| Short tweets | 443 |
| Tweets kept | 854 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1epvkdlq/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @shovelship's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/pes09e1p) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/pes09e1p/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/shovelship')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/shrike76
2021-05-22T04:30:36.000Z
[ "pytorch", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
10
transformers
---
language: en
thumbnail: https://www.huggingtweets.com/shrike76/1621657812775/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div class="inline-flex flex-col" style="line-height: 1.5;">
    <div class="flex">
        <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1347057931364270086/xQ6p8pwl_400x400.jpg&#39;)">
        </div>
        <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
        </div>
        <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
        </div>
    </div>
    <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div>
    <div style="text-align: center; font-size: 16px; font-weight: 800">shrike</div>
    <div style="text-align: center; font-size: 14px;">@shrike76</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on tweets from shrike.

| Data | shrike |
| --- | --- |
| Tweets downloaded | 161 |
| Retweets | 6 |
| Short tweets | 45 |
| Tweets kept | 110 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2u90mfie/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @shrike76's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/l2upw48p) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/l2upw48p/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/shrike76')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
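The training-data table above implies a preprocessing step that discards retweets and very short tweets before fine-tuning. The sketch below only illustrates that idea; the function name and the three-word threshold are hypothetical, and the actual huggingtweets filtering may differ in detail.

```python
# Hypothetical illustration of the filtering implied by the data table;
# the real preprocessing in huggingtweets may use different rules.
def keep_tweet(text: str) -> bool:
    if text.startswith("RT @"):   # drop retweets
        return False
    if len(text.split()) < 3:     # drop "short tweets" (assumed threshold)
        return False
    return True

sample = [
    "RT @someone: interesting thread",             # retweet -> dropped
    "lol",                                         # short tweet -> dropped
    "an original tweet long enough to train on",   # kept
]
print([t for t in sample if keep_tweet(t)])
```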
huggingtweets/shuos_
2021-05-22T22:48:51.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
8
transformers
---
language: en
thumbnail: https://www.huggingtweets.com/shuos_/1614100122177/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1357159293229891584/r4barENi_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">shuos 🤖 AI Bot </div>
<div style="font-size: 15px">@shuos_ bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@shuos_'s tweets](https://twitter.com/shuos_).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3178 |
| Retweets | 1961 |
| Short tweets | 286 |
| Tweets kept | 931 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/275hjd6n/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @shuos_'s tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/4ozxmlq6) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/4ozxmlq6/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/shuos_')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/shutupjamiepls
2021-05-22T22:50:15.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
23
transformers
---
language: en
thumbnail: https://www.huggingtweets.com/shutupjamiepls/1617773398525/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1379599647518375939/F7t0Jkg5_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">hi, my name is Jamie Grace😎 🤖 AI Bot </div>
<div style="font-size: 15px">@shutupjamiepls bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@shutupjamiepls's tweets](https://twitter.com/shutupjamiepls).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3021 |
| Retweets | 2396 |
| Short tweets | 79 |
| Tweets kept | 546 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/10671kc1/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @shutupjamiepls's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/8144wgvh) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/8144wgvh/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/shutupjamiepls')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/sicatrix66
2021-05-22T22:51:18.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
7
transformers
---
language: en
thumbnail: https://www.huggingtweets.com/sicatrix66/1614214451470/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1354287864004059136/yzDqQwjT_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">deimos anomaly 🤖 AI Bot </div>
<div style="font-size: 15px">@sicatrix66 bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@sicatrix66's tweets](https://twitter.com/sicatrix66).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3083 |
| Retweets | 1774 |
| Short tweets | 228 |
| Tweets kept | 1081 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3qk3zf5p/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sicatrix66's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2lr60j1c) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2lr60j1c/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/sicatrix66')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
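The training-procedure paragraphs above describe a standard causal-language-modeling fine-tune of GPT-2. As a rough outline only, it might look like the sketch below; the epoch count, batch size, and toy dataset are placeholders, and the real hyperparameters are the ones recorded in the linked W&B run.

```python
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

# Start from pre-trained GPT-2, as the card states.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 ships without a pad token
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Stand-in for the curated tweets; the real data lives in the W&B artifacts.
texts = ["example tweet one", "example tweet two"]
train_dataset = [tokenizer(t, truncation=True, max_length=128) for t in texts]

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="finetuned-tweets",
        num_train_epochs=4,             # assumed value
        per_device_train_batch_size=1,  # assumed value
    ),
    train_dataset=train_dataset,
    # mlm=False selects the causal-LM objective used for GPT-2.
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```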
huggingtweets/sidjindal1
2021-05-22T22:52:20.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
6
transformers
---
language: en
thumbnail: https://www.huggingtweets.com/sidjindal1/1617167056061/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1359596382231924736/kFfe1B97_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Sid Jindal 🤖 AI Bot </div>
<div style="font-size: 15px">@sidjindal1 bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@sidjindal1's tweets](https://twitter.com/sidjindal1).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3248 |
| Retweets | 93 |
| Short tweets | 295 |
| Tweets kept | 2860 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2takn730/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sidjindal1's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/6fjrggo6) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/6fjrggo6/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/sidjindal1')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/sigh_oh
2021-05-22T22:53:38.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
8
transformers
---
language: en
thumbnail: https://www.huggingtweets.com/sigh_oh/1616722580016/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1267481940497698817/qY9_WL4S_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">$io🇬🇾 🤖 AI Bot </div>
<div style="font-size: 15px">@sigh_oh bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@sigh_oh's tweets](https://twitter.com/sigh_oh).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 2895 |
| Retweets | 1021 |
| Short tweets | 351 |
| Tweets kept | 1523 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1a27rmpf/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sigh_oh's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3l2mqdpg) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3l2mqdpg/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/sigh_oh')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/sigittanew
2021-05-22T22:54:41.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
9
transformers
---
language: en
thumbnail: https://www.huggingtweets.com/sigittanew/1617902420104/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1315307002999058432/Z4YtauZI_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">☃️Sigitta🎅 🤖 AI Bot </div>
<div style="font-size: 15px">@sigittanew bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@sigittanew's tweets](https://twitter.com/sigittanew).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3216 |
| Retweets | 1319 |
| Short tweets | 109 |
| Tweets kept | 1788 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/ecj53ccd/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sigittanew's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/jm7ev1c0) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/jm7ev1c0/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/sigittanew')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/sigsys
2021-05-22T22:56:23.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
138
transformers
---
language: en
thumbnail: https://www.huggingtweets.com/sigsys/1617904484486/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1215779813560025089/ka9neEZ4_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">PanickedJanet 🤖 AI Bot </div>
<div style="font-size: 15px">@sigsys bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@sigsys's tweets](https://twitter.com/sigsys).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3207 |
| Retweets | 1423 |
| Short tweets | 378 |
| Tweets kept | 1406 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/15vp8xpf/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sigsys's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/18htet0h) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/18htet0h/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/sigsys')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
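Because the pipeline example above samples stochastically, two runs will usually print different tweets. If you need reproducible output, `transformers.set_seed` fixes the relevant random number generators; the seed value below is arbitrary.

```python
from transformers import pipeline, set_seed

set_seed(42)  # arbitrary seed; fixes the Python, NumPy, and torch RNGs
generator = pipeline('text-generation', model='huggingtweets/sigsys')
print(generator("My dream is", num_return_sequences=5))
```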
huggingtweets/sillynous
2021-05-22T22:57:32.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
12
transformers
---
language: en
thumbnail: https://www.huggingtweets.com/sillynous/1617238560880/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1314767099471032322/-9CLybi3_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Tomas Albergo 🤖 AI Bot </div>
<div style="font-size: 15px">@sillynous bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@sillynous's tweets](https://twitter.com/sillynous).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3243 |
| Retweets | 301 |
| Short tweets | 771 |
| Tweets kept | 2171 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2gu980fr/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sillynous's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3vpacwrb) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3vpacwrb/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/sillynous')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/sinirlasansiz
2021-05-22T22:58:44.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
8
transformers
---
language: en
thumbnail: https://www.huggingtweets.com/sinirlasansiz/1616940697619/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1186030454572490757/rRH-LcBr_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">BazenFurkan 🤖 AI Bot </div>
<div style="font-size: 15px">@sinirlasansiz bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@sinirlasansiz's tweets](https://twitter.com/sinirlasansiz).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 688 |
| Retweets | 6 |
| Short tweets | 43 |
| Tweets kept | 639 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/5js76uys/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sinirlasansiz's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2pq3jwah) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2pq3jwah/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/sinirlasansiz')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/sirsfurther
2021-05-22T22:59:51.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
13
transformers
---
language: en
thumbnail: https://www.huggingtweets.com/sirsfurther/1616708554421/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1374367016392355845/UDefUzJo_400x400.png')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">twister of temperance 🤖 AI Bot </div>
<div style="font-size: 15px">@sirsfurther bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@sirsfurther's tweets](https://twitter.com/sirsfurther).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3248 |
| Retweets | 209 |
| Short tweets | 895 |
| Tweets kept | 2144 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/yqe91w95/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sirsfurther's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/n0x86qnk) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/n0x86qnk/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/sirsfurther')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/skabpixels
2021-05-21T20:18:22.000Z
[ "pytorch", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
7
transformers
---
language: en
thumbnail: https://www.huggingtweets.com/skabpixels/1621628297355/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div class="inline-flex flex-col" style="line-height: 1.5;">
    <div class="flex">
        <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/961920619012087809/dSaIkQUk_400x400.jpg&#39;)">
        </div>
        <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
        </div>
        <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
        </div>
    </div>
    <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div>
    <div style="text-align: center; font-size: 16px; font-weight: 800">Gab Fratus</div>
    <div style="text-align: center; font-size: 14px;">@skabpixels</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on tweets from Gab Fratus.

| Data | Gab Fratus |
| --- | --- |
| Tweets downloaded | 1556 |
| Retweets | 251 |
| Short tweets | 185 |
| Tweets kept | 1120 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3ei5jqez/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @skabpixels's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2g089rwi) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2g089rwi/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/skabpixels')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/sky_obito
2021-05-22T23:00:58.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
7
transformers
---
language: en
thumbnail: https://www.huggingtweets.com/sky_obito/1614214046985/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1347274090051117057/3fKG8-pm_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Lenalee (CW: Dragon Prince) 🤖 AI Bot </div>
<div style="font-size: 15px">@sky_obito bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@sky_obito's tweets](https://twitter.com/sky_obito).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3113 |
| Retweets | 2349 |
| Short tweets | 236 |
| Tweets kept | 528 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1z2vftrh/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sky_obito's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/396z3s7q) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/396z3s7q/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/sky_obito')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/slainkinsman
2021-05-22T23:02:05.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
6
transformers
---
language: en
thumbnail: https://www.huggingtweets.com/slainkinsman/1617812785653/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1302741435830149120/uZSpDxqN_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Sඞlfish Dying Relative 🤖 AI Bot </div>
<div style="font-size: 15px">@slainkinsman bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@slainkinsman's tweets](https://twitter.com/slainkinsman).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3205 |
| Retweets | 2771 |
| Short tweets | 27 |
| Tweets kept | 407 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/z5f80l0r/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @slainkinsman's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/qforafva) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/qforafva/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/slainkinsman')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
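The pipeline in the example above runs on CPU by default and uses the library's default generation length. A hedged variant that runs on the first GPU and caps output length is sketched below; both `device=0` and `max_length=40` are illustrative choices, not settings from the card.

```python
from transformers import pipeline

# device=0 selects the first CUDA GPU; omit it to stay on CPU.
generator = pipeline('text-generation', model='huggingtweets/slainkinsman', device=0)
print(generator("My dream is", max_length=40, num_return_sequences=5))
```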
huggingtweets/slashdashdot
2021-05-22T23:03:08.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
6
transformers
---
language: en
thumbnail: https://www.huggingtweets.com/slashdashdot/1617813916366/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/728735814570500096/RyJZkh4s_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Hamstar 🤖 AI Bot </div>
<div style="font-size: 15px">@slashdashdot bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@slashdashdot's tweets](https://twitter.com/slashdashdot).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 3228 |
| Retweets | 1695 |
| Short tweets | 282 |
| Tweets kept | 1251 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/lu03c6s8/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @slashdashdot's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/26xltebd) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/26xltebd/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/slashdashdot')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/slimepriestess
2021-05-22T23:04:19.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
7
transformers
---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1319135470656180224/cxISAFko_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Octavia 🤖 AI Bot </div>
<div style="font-size: 15px">@slimepriestess bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@slimepriestess's tweets](https://twitter.com/slimepriestess).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 201 |
| Retweets | 23 |
| Short tweets | 16 |
| Tweets kept | 162 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1f2gufmd/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @slimepriestess's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3h5af3aw) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3h5af3aw/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/slimepriestess')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/slowcoregod
2021-05-22T23:06:32.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
6
transformers
---
language: en
thumbnail: https://www.huggingtweets.com/slowcoregod/1616688358797/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1374383701673439241/XUY3-0Td_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">spence 🤖 AI Bot </div>
<div style="font-size: 15px">@slowcoregod bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@slowcoregod's tweets](https://twitter.com/slowcoregod).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 233 |
| Retweets | 34 |
| Short tweets | 30 |
| Tweets kept | 169 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1b38n558/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @slowcoregod's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/whiudw8e) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/whiudw8e/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/slowcoregod')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/sluckbo
2021-05-22T23:07:41.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
7
transformers
--- language: en thumbnail: https://www.huggingtweets.com/sluckbo/1614218469985/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1311447659337584640/jf4aDIax_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">FullofSoundandCurry 🤖 AI Bot </div> <div style="font-size: 15px">@sluckbo bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@sluckbo's tweets](https://twitter.com/sluckbo). | Data | Quantity | | --- | --- | | Tweets downloaded | 3105 | | Retweets | 1703 | | Short tweets | 49 | | Tweets kept | 1353 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2ky0c0m7/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sluckbo's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/14axipec) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/14axipec/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/sluckbo') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/sludge_girl
2021-05-22T23:08:54.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
7
transformers
--- language: en thumbnail: https://www.huggingtweets.com/sludge_girl/1616684418606/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1351081559294697477/O0xCUKQW_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Ms. Hole LLC 🤖 AI Bot </div> <div style="font-size: 15px">@sludge_girl bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@sludge_girl's tweets](https://twitter.com/sludge_girl). | Data | Quantity | | --- | --- | | Tweets downloaded | 3181 | | Retweets | 530 | | Short tweets | 705 | | Tweets kept | 1946 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2prknbig/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sludge_girl's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2z0ma6xu) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2z0ma6xu/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/sludge_girl') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/smithchitty
2021-05-22T23:09:56.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
12
transformers
--- language: en thumbnail: https://www.huggingtweets.com/smithchitty/1616662203644/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1112820359047208960/0OKcmL16_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Chaos librarian 🤖 AI Bot </div> <div style="font-size: 15px">@smithchitty bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@smithchitty's tweets](https://twitter.com/smithchitty). | Data | Quantity | | --- | --- | | Tweets downloaded | 2807 | | Retweets | 633 | | Short tweets | 225 | | Tweets kept | 1949 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3qcfmql1/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @smithchitty's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3b8xbtoe) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3b8xbtoe/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/smithchitty') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/smokyblue__
2021-05-22T23:11:24.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
6
transformers
--- language: en thumbnail: https://www.huggingtweets.com/smokyblue__/1610893224130/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1245434376789397511/8EN5syw3_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Smoky Blue 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@smokyblue__ bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@smokyblue__'s tweets](https://twitter.com/smokyblue__). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3019</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>2681</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>88</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>250</td> </tr> </tbody> </table> [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/20f3u1ck/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @smokyblue__'s tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/eg3neoby) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/eg3neoby/artifacts) is logged and versioned. 
## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/smokyblue__') generator("My dream is", num_return_sequences=5) ``` ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
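Generation is stochastic, so each call produces different text. A short sketch (assuming a `transformers` version that exports `set_seed`) fixes the random seed so sampled outputs are reproducible:

```python
from transformers import pipeline, set_seed

# Fixing the seed makes the sampled sequences repeatable across runs
set_seed(42)
generator = pipeline('text-generation', model='huggingtweets/smokyblue__')
generator("My dream is", num_return_sequences=5)
```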
huggingtweets/sn0ozefest
2021-05-22T23:12:32.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
6
transformers
--- language: en thumbnail: https://www.huggingtweets.com/sn0ozefest/1616689326898/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1364963243610013698/V8ZCqkzG_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">🦙 🤖 AI Bot </div> <div style="font-size: 15px">@sn0ozefest bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@sn0ozefest's tweets](https://twitter.com/sn0ozefest). | Data | Quantity | | --- | --- | | Tweets downloaded | 3222 | | Retweets | 349 | | Short tweets | 536 | | Tweets kept | 2337 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2hj4kx56/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sn0ozefest's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1yy9eby7) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1yy9eby7/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/sn0ozefest') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
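The pipeline also accepts a list of prompts and returns one list of results per prompt. A minimal sketch (the prompts and generation parameters here are illustrative assumptions):

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/sn0ozefest')

# One result list per prompt; each entry is a dict with a 'generated_text' key
prompts = ["My dream is", "Today I learned"]
for results in generator(prompts, do_sample=True, num_return_sequences=3, max_length=30):
    for r in results:
        print(r["generated_text"])
```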
huggingtweets/sn_fk_n
2021-05-22T23:13:51.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
7
transformers
--- language: en thumbnail: https://www.huggingtweets.com/sn_fk_n/1616623113276/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1352368398974541888/3AP_Sebd_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">snufkin 🤖 AI Bot </div> <div style="font-size: 15px">@sn_fk_n bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@sn_fk_n's tweets](https://twitter.com/sn_fk_n). | Data | Quantity | | --- | --- | | Tweets downloaded | 3249 | | Retweets | 15 | | Short tweets | 714 | | Tweets kept | 2520 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2eh0ydd7/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sn_fk_n's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1xsbdzix) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1xsbdzix/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/sn_fk_n') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/snackmerritt
2021-05-22T23:14:59.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
7
transformers
--- language: en thumbnail: https://www.huggingtweets.com/snackmerritt/1616888395440/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1371627947069739010/vX4nm8l-_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jack "Knows Nothing About Politics" Merritt🗳🍦 🤖 AI Bot </div> <div style="font-size: 15px">@snackmerritt bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@snackmerritt's tweets](https://twitter.com/snackmerritt). | Data | Quantity | | --- | --- | | Tweets downloaded | 3244 | | Retweets | 388 | | Short tweets | 548 | | Tweets kept | 2308 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/mjo4ke89/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @snackmerritt's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/18xtt8zh) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/18xtt8zh/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/snackmerritt') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/snackteeth
2021-05-22T23:16:55.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
7
transformers
--- language: en thumbnail: https://www.huggingtweets.com/snackteeth/1617850153601/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1339420191428653058/Vj757Zlw_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Floral Flavor Blend 🐊 bIm 🤖 AI Bot </div> <div style="font-size: 15px">@snackteeth bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@snackteeth's tweets](https://twitter.com/snackteeth). | Data | Quantity | | --- | --- | | Tweets downloaded | 3215 | | Retweets | 1408 | | Short tweets | 146 | | Tweets kept | 1661 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1vp9yx2e/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @snackteeth's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1q2qlzfz) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1q2qlzfz/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/snackteeth') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/snackuporsackup
2021-05-22T23:18:26.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
7
transformers
--- language: en thumbnail: https://www.huggingtweets.com/snackuporsackup/1616645126928/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/913700876967075840/Gd2_19b__400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Captain Oats 🤖 AI Bot </div> <div style="font-size: 15px">@snackuporsackup bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@snackuporsackup's tweets](https://twitter.com/snackuporsackup). | Data | Quantity | | --- | --- | | Tweets downloaded | 432 | | Retweets | 53 | | Short tweets | 40 | | Tweets kept | 339 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/btc6haab/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @snackuporsackup's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2lx55ce2) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2lx55ce2/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/snackuporsackup') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/sneakygnida
2021-05-22T23:19:29.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
8
transformers
--- language: en thumbnail: https://www.huggingtweets.com/sneakygnida/1617819258406/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1154628264499064832/i-CdEX_w_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">sneaky gnida 🤖 AI Bot </div> <div style="font-size: 15px">@sneakygnida bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@sneakygnida's tweets](https://twitter.com/sneakygnida). | Data | Quantity | | --- | --- | | Tweets downloaded | 415 | | Retweets | 34 | | Short tweets | 164 | | Tweets kept | 217 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2a37cn9l/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sneakygnida's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/vj2p6n18) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/vj2p6n18/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/sneakygnida') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/snobiwan
2021-05-22T23:20:43.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
6
transformers
--- language: en thumbnail: https://www.huggingtweets.com/snobiwan/1616716702325/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1041395890437537792/AnVu__Fb_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Andrew Snowdon (he/they//him/them) 🤖 AI Bot </div> <div style="font-size: 15px">@snobiwan bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@snobiwan's tweets](https://twitter.com/snobiwan). | Data | Quantity | | --- | --- | | Tweets downloaded | 3249 | | Retweets | 317 | | Short tweets | 188 | | Tweets kept | 2744 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1c3032fr/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @snobiwan's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/ux6rf7y9) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/ux6rf7y9/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/snobiwan') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/snooterboops
2021-05-22T23:22:03.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
11
transformers
--- language: en thumbnail: https://www.huggingtweets.com/snooterboops/1614167277329/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1326718606378463233/VNf1kT6R_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">cartoon goat 🤖 AI Bot </div> <div style="font-size: 15px">@snooterboops bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@snooterboops's tweets](https://twitter.com/snooterboops). | Data | Quantity | | --- | --- | | Tweets downloaded | 3175 | | Retweets | 1624 | | Short tweets | 168 | | Tweets kept | 1383 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2c5i26k4/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @snooterboops's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/h4k0m3z6) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/h4k0m3z6/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/snooterboops') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/soashworth
2021-05-22T23:23:12.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
9
transformers
--- language: en thumbnail: https://www.huggingtweets.com/soashworth/1616725376956/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/908716330991439874/9_53GDxB_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Scott Ashworth 🤖 AI Bot </div> <div style="font-size: 15px">@soashworth bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@soashworth's tweets](https://twitter.com/soashworth). | Data | Quantity | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 266 | | Short tweets | 394 | | Tweets kept | 2590 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2o3heigk/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @soashworth's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3ro8u89w) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3ro8u89w/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/soashworth') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/sodaag
2021-05-22T23:24:50.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "added_tokens.json", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
18
transformers
--- language: en thumbnail: https://www.huggingtweets.com/sodaag/1621031819814/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1055157318306926593/FzzqSgoS_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Shokugeki no Soda</div> <div style="text-align: center; font-size: 14px;">@sodaag</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Shokugeki no Soda. | Data | Shokugeki no Soda | | --- | --- | | Tweets downloaded | 2928 | | Retweets | 2459 | | Short tweets | 49 | | Tweets kept | 420 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/27z6hcfi/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sodaag's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/170hx5ab) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/170hx5ab/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/sodaag') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/solarmonke
2021-05-22T23:25:48.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
7
transformers
--- language: en thumbnail: https://www.huggingtweets.com/solarmonke/1618091821686/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1380728043761700865/ORlB55uo_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">🌞 𝕊𝕠𝕝𝕒𝕣 𝕄𝕠𝕟𝕜𝕖 🐵 🤖 AI Bot </div> <div style="font-size: 15px">@solarmonke bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@solarmonke's tweets](https://twitter.com/solarmonke). | Data | Quantity | | --- | --- | | Tweets downloaded | 858 | | Retweets | 173 | | Short tweets | 129 | | Tweets kept | 556 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3vdq6iwf/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @solarmonke's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3exssexq) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3exssexq/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/solarmonke') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/solarsystern
2021-05-22T23:27:01.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
6
transformers
--- language: en thumbnail: https://www.huggingtweets.com/solarsystern/1617207302255/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1375987406780964866/8gMlfYxv_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">ira!! 🤖 AI Bot </div> <div style="font-size: 15px">@solarsystern bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@solarsystern's tweets](https://twitter.com/solarsystern). | Data | Quantity | | --- | --- | | Tweets downloaded | 3237 | | Retweets | 155 | | Short tweets | 309 | | Tweets kept | 2773 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2ix2xlbi/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @solarsystern's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/15nj4eem) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/15nj4eem/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/solarsystern') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/soleil__vt
2021-05-22T23:28:04.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
12
transformers
--- language: en thumbnail: https://www.huggingtweets.com/soleil__vt/1620680042258/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1370389337179893761/OcxAtpTV_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Soleil | VTuber | Space Pirate</div> <div style="text-align: center; font-size: 14px;">@soleil__vt</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Soleil | VTuber | Space Pirate. | Data | Soleil \| VTuber \| Space Pirate | | --- | --- | | Tweets downloaded | 1129 | | Retweets | 67 | | Short tweets | 307 | | Tweets kept | 755 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2gvdri1u/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @soleil__vt's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/15ap84wq) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/15ap84wq/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/soleil__vt') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/some_bxdy
2021-05-22T23:29:59.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
6
transformers
--- language: en thumbnail: https://www.huggingtweets.com/some_bxdy/1617906706870/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1379486260297932808/yvXqwjo-_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Freddo 🤖 AI Bot </div> <div style="font-size: 15px">@some_bxdy bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@some_bxdy's tweets](https://twitter.com/some_bxdy). | Data | Quantity | | --- | --- | | Tweets downloaded | 724 | | Retweets | 337 | | Short tweets | 43 | | Tweets kept | 344 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/m3z2802r/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @some_bxdy's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3tuk7ev3) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3tuk7ev3/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/some_bxdy') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
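The pipeline above is a thin wrapper; any huggingtweets checkpoint can also be loaded with the standard auto classes and sampled from directly. A minimal sketch, assuming only the stock `transformers` API:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained('huggingtweets/some_bxdy')
model = AutoModelForCausalLM.from_pretrained('huggingtweets/some_bxdy')

inputs = tokenizer("My dream is", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=60,
    do_sample=True,
    top_k=50,
    num_return_sequences=5,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 defines no pad token by default
)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
```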
huggingtweets/sonyaism
2021-05-22T23:31:06.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
7
transformers
--- language: en thumbnail: https://www.huggingtweets.com/sonyaism/1617756213982/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1371921425246863367/xyrKgok4_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">sonya؜ 🤖 AI Bot </div> <div style="font-size: 15px">@sonyaism bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@sonyaism's tweets](https://twitter.com/sonyaism). | Data | Quantity | | --- | --- | | Tweets downloaded | 3243 | | Retweets | 16 | | Short tweets | 579 | | Tweets kept | 2648 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2hujh3sc/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sonyaism's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/202umy6y) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/202umy6y/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/sonyaism') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/sorenemile
2021-05-22T23:32:14.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
11
transformers
--- language: en thumbnail: https://www.huggingtweets.com/sorenemile/1616687865472/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1351377883239903233/7F9a5YZ7_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Soren 🤖 AI Bot </div> <div style="font-size: 15px">@sorenemile bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@sorenemile's tweets](https://twitter.com/sorenemile). | Data | Quantity | | --- | --- | | Tweets downloaded | 3246 | | Retweets | 19 | | Short tweets | 939 | | Tweets kept | 2288 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/22file1d/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sorenemile's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/12kez6wa) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/12kez6wa/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/sorenemile') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/sosadtoday
2021-05-22T23:33:47.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
11
transformers
--- language: en thumbnail: https://www.huggingtweets.com/sosadtoday/1605760372148/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/595303483659587584/V-8JB3-E_400x400.png')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">so sad today 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@sosadtoday bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@sosadtoday's tweets](https://twitter.com/sosadtoday). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3201</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>390</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>224</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2587</td> </tr> </tbody> </table> [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2z7key7v/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sosadtoday's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/15qxih1w) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/15qxih1w/artifacts) is logged and versioned. 
## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/sosadtoday'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets) <!--- random size file -->
huggingtweets/sovereign_beast
2021-05-22T23:34:50.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
6
transformers
--- language: en thumbnail: https://www.huggingtweets.com/sovereign_beast/1617890642358/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1360371143618801665/kgYG2UQ3_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">🏳️‍⚧️ Indecipherable Scrawlings 🏳️‍⚧️ 🤖 AI Bot </div> <div style="font-size: 15px">@sovereign_beast bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@sovereign_beast's tweets](https://twitter.com/sovereign_beast). | Data | Quantity | | --- | --- | | Tweets downloaded | 3145 | | Retweets | 1016 | | Short tweets | 116 | | Tweets kept | 2013 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/35o219p2/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sovereign_beast's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3mmo9uhd) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3mmo9uhd/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/sovereign_beast') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/spacebananaza
2021-05-22T23:35:52.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
6
transformers
--- language: en thumbnail: https://www.huggingtweets.com/spacebananaza/1617774737011/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1370063594520408064/bC3Dbs4D_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Tess 🤖 AI Bot </div> <div style="font-size: 15px">@spacebananaza bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@spacebananaza's tweets](https://twitter.com/spacebananaza). | Data | Quantity | | --- | --- | | Tweets downloaded | 593 | | Retweets | 308 | | Short tweets | 46 | | Tweets kept | 239 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3jzrx9ry/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @spacebananaza's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/9vv9pgcs) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/9vv9pgcs/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/spacebananaza') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/spacedoddyssey
2021-03-25T20:33:33.000Z
[]
[ ".gitattributes" ]
huggingtweets
0
huggingtweets/spacedsheep
2021-05-22T23:37:00.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
7
transformers
--- language: en thumbnail: https://www.huggingtweets.com/spacedsheep/1614108778392/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1361342244045864960/U588ty33_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Clara 🤖 AI Bot </div> <div style="font-size: 15px">@spacedsheep bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@spacedsheep's tweets](https://twitter.com/spacedsheep). | Data | Quantity | | --- | --- | | Tweets downloaded | 3106 | | Retweets | 682 | | Short tweets | 604 | | Tweets kept | 1820 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/m9wz5qpe/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @spacedsheep's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/jxagx89r) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/jxagx89r/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/spacedsheep') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
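The table above reports retweets and short tweets that were dropped before fine-tuning. The sketch below shows that kind of filter; the function name and the length threshold are illustrative assumptions, not the project's exact rules:

```python
# Illustrative filtering heuristic (assumed, not the project's exact code):
# drop retweets and very short tweets, keep the rest for training.
def keep_tweet(text: str, min_chars: int = 20) -> bool:
    if text.startswith("RT @"):   # retweet -> dropped
        return False
    if len(text) < min_chars:     # short tweet -> dropped
        return False
    return True

sample = ["RT @friend: hello", "ok", "a longer tweet that would be kept"]
print([t for t in sample if keep_tweet(t)])  # ['a longer tweet that would be kept']
```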
huggingtweets/spam_can
2021-05-22T23:38:14.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
48
transformers
--- language: en thumbnail: https://www.huggingtweets.com/spam_can/1617789719879/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1370899730826399744/AwBMn6G6_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Cay 🏳️‍🌈🐱🏳️‍⚧️ 🤖 AI Bot </div> <div style="font-size: 15px">@spam_can bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@spam_can's tweets](https://twitter.com/spam_can). | Data | Quantity | | --- | --- | | Tweets downloaded | 3231 | | Retweets | 1216 | | Short tweets | 177 | | Tweets kept | 1838 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1u0hq0wb/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @spam_can's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2e7i2emb) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2e7i2emb/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/spam_can') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/spatermensch
2021-05-22T23:39:28.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
6
transformers
--- language: en thumbnail: https://www.huggingtweets.com/spatermensch/1616648269598/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1300305786476752896/soc1wh42_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">spätermensch 🤖 AI Bot </div> <div style="font-size: 15px">@spatermensch bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@spatermensch's tweets](https://twitter.com/spatermensch). | Data | Quantity | | --- | --- | | Tweets downloaded | 999 | | Retweets | 212 | | Short tweets | 211 | | Tweets kept | 576 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2ted9nk7/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @spatermensch's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/18qyjlqw) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/18qyjlqw/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/spatermensch') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/speakerpelosi
2021-05-22T23:40:35.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
13
transformers
--- language: en thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true tags: - huggingtweets widget: - text: "My dream is" --- <link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1114294290375688193/P9mcJNGb_400x400.png')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Nancy Pelosi 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@speakerpelosi bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@speakerpelosi's tweets](https://twitter.com/speakerpelosi). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3221</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>601</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>4</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2616</td> </tr> </tbody> </table> [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1lhx8q9a/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @speakerpelosi's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3alajmxr) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3alajmxr/artifacts) is logged and versioned. 
## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/speakerpelosi'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets) <!--- random size file -->
huggingtweets/spiffffer
2021-05-22T23:42:03.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
10
transformers
--- language: en thumbnail: https://www.huggingtweets.com/spiffffer/1614098628466/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1357203592776740865/wWw_MmAs_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">clb 🤖 AI Bot </div> <div style="font-size: 15px">@spiffffer bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@spiffffer's tweets](https://twitter.com/spiffffer). | Data | Quantity | | --- | --- | | Tweets downloaded | 3181 | | Retweets | 673 | | Short tweets | 420 | | Tweets kept | 2088 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/icfilwek/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @spiffffer's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1zshqxuh) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1zshqxuh/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/spiffffer') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/spknnk
2021-05-22T23:43:17.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
9
transformers
--- language: en thumbnail: https://www.huggingtweets.com/spknnk/1616845130596/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1355067555254300673/j96wD3_V_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">я миша 🤖 AI Bot </div> <div style="font-size: 15px">@spknnk bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@spknnk's tweets](https://twitter.com/spknnk). | Data | Quantity | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 42 | | Short tweets | 1066 | | Tweets kept | 2142 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/qqeli5b6/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @spknnk's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1hgf21to) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1hgf21to/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/spknnk') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/spookymachine
2021-05-22T23:44:41.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
11
transformers
--- language: en thumbnail: https://www.huggingtweets.com/spookymachine/1617758539359/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1379523570473242625/YmJkdku3_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Alea, Conjecture Of Goo 🤖 AI Bot </div> <div style="font-size: 15px">@spookymachine bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@spookymachine's tweets](https://twitter.com/spookymachine). | Data | Quantity | | --- | --- | | Tweets downloaded | 3236 | | Retweets | 217 | | Short tweets | 254 | | Tweets kept | 2765 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/p3syzv61/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @spookymachine's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2g5tax8a) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2g5tax8a/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/spookymachine') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/spookysimon1
2021-05-22T23:45:51.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
8
transformers
--- language: en thumbnail: https://www.huggingtweets.com/spookysimon1/1621369998182/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1355874900704161792/xTvexkap_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">spooky_simon</div> <div style="text-align: center; font-size: 14px;">@spookysimon1</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from spooky_simon. | Data | spooky_simon | | --- | --- | | Tweets downloaded | 3225 | | Retweets | 128 | | Short tweets | 954 | | Tweets kept | 2143 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/jdigg9qt/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @spookysimon1's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/e675ooeo) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/e675ooeo/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/spookysimon1') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
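The pipeline also accepts a list of prompts and returns one list of completions per prompt. A short sketch, with illustrative prompts and settings:

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/spookysimon1')

# With a list input, the pipeline returns one list of completions per prompt.
prompts = ["My dream is", "Today I learned"]
results = generator(prompts, num_return_sequences=2, do_sample=True, max_length=40)
for prompt, completions in zip(prompts, results):
    for c in completions:
        print(prompt, "->", c["generated_text"])
```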
huggingtweets/sprobertson
2021-05-22T23:47:14.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
16
transformers
--- language: en thumbnail: https://www.huggingtweets.com/sprobertson/1608083159952/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/875580385765146624/EYvWHUn-_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Sean Robertson 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@sprobertson bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@sprobertson's tweets](https://twitter.com/sprobertson). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>369</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>39</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>41</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>289</td> </tr> </tbody> </table> [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1bd4il18/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @sprobertson's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2uo0uk83) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2uo0uk83/artifacts) is logged and versioned. 
## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/sprobertson'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/ssriprincess
2021-05-22T23:48:23.000Z
[ "pytorch", "jax", "gpt2", "lm-head", "causal-lm", "en", "transformers", "huggingtweets", "text-generation" ]
text-generation
[ ".gitattributes", "README.md", "config.json", "flax_model.msgpack", "merges.txt", "pytorch_model.bin", "special_tokens_map.json", "tokenizer_config.json", "training_args.bin", "vocab.json" ]
huggingtweets
6
transformers
--- language: en thumbnail: https://www.huggingtweets.com/ssriprincess/1616689455038/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1365831180843589635/YdR_-q6p_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">coup enj*yer (16 year old nazi "tradwife" virgin) 🤖 AI Bot </div> <div style="font-size: 15px">@ssriprincess bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@ssriprincess's tweets](https://twitter.com/ssriprincess). | Data | Quantity | | --- | --- | | Tweets downloaded | 1983 | | Retweets | 193 | | Short tweets | 287 | | Tweets kept | 1503 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1mm7v3cz/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ssriprincess's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/md2txogk) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/md2txogk/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/ssriprincess') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)