---
title: GPT-2 rumblings
emoji: 🤖
colorFrom: blue
colorTo: red
sdk: gradio
sdk_version: 3.14.0
app_file: app.py
pinned: false
tags:
- gpt-2 rumblings
---

![](img/demo_1.png)

## Demo 
A live demo of the GPT-2 rumblings model is hosted on Hugging Face Spaces:
[https://huggingface.co/spaces/codebender/gpt-2-rumblings](https://huggingface.co/spaces/codebender/gpt-2-rumblings)

# AI Text Generation
This repository contains Jupyter notebooks and Python scripts that demonstrate how to train a GPT model and generate text with the trained model.

## What is GPT?
GPT (Generative Pre-trained Transformer) is a deep learning language model trained on a large corpus of text; like other neural networks, it is trained with backpropagation.

GPT has many real-world applications, from drafting and summarising text to completing code, and it can accelerate a wide range of workflows.

In short: given a piece of text as a prompt, GPT generates new text that plausibly continues it.

## GPT Model Generation
This repository is a collection of experiments built on the aitextgen Python package.
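
A minimal generation sketch with aitextgen (the prompt string below is only an illustrative placeholder; called with no arguments, `aitextgen()` loads a pretrained 124M-parameter GPT-2):

```python
from aitextgen import aitextgen

# Load the default pretrained GPT-2 model (downloaded and cached on first use)
ai = aitextgen()

# Generate one continuation of the prompt and print it to the console
ai.generate(n=1, prompt="The GPT-2 rumblings model", max_length=100, temperature=0.9)
```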

## How to run
1. Clone the repository
2. Create a virtual environment using pipenv
3. Install the dependencies using pipenv
4. Run the jupyter notebooks in the `notebooks` folder

To run the demos, run the Python scripts directly after installing the dependencies, for example as sketched below.
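
Assuming the repository ships a Pipfile (adjust if the dependencies are listed in a `requirements.txt` instead), the workflow is roughly:

```bash
# create the virtual environment and install dependencies
pipenv install

# open the training/generation notebooks
pipenv run jupyter notebook notebooks/

# or run the Gradio demo script directly
pipenv run python app.py
```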

## GPT Model Training
Each notebook in the `notebooks` folder runs a different training experiment.
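
As a rough sketch of the common pattern, training a small GPT model from scratch with aitextgen looks like the following; `input.txt` is a placeholder for whatever corpus a particular notebook uses:

```python
from aitextgen import aitextgen
from aitextgen.TokenDataset import TokenDataset
from aitextgen.tokenizers import train_tokenizer
from aitextgen.utils import GPT2ConfigCPU

corpus = "input.txt"  # placeholder path to the training text

# Train a byte-pair-encoding tokenizer on the corpus (writes aitextgen.tokenizer.json)
train_tokenizer(corpus)
tokenizer_file = "aitextgen.tokenizer.json"

# Use a small GPT-2 configuration that is practical to train on a CPU
config = GPT2ConfigCPU()
ai = aitextgen(tokenizer_file=tokenizer_file, config=config)

# Encode the corpus into fixed-length blocks for training
data = TokenDataset(corpus, tokenizer_file=tokenizer_file, block_size=64)

# Train the model, periodically saving checkpoints and sampling text
ai.train(data, batch_size=16, num_steps=5000, generate_every=1000, save_every=1000)

# Sample from the trained model
ai.generate(n=3, prompt="Once upon a time")
```

Fine-tuning the pretrained GPT-2 weights on a new corpus follows the same pattern, except that it starts from the downloaded model instead of a fresh configuration.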