Switch to Mistral Nemo and increase the output length
- README.md +3 -3
- global_config.py +10 -3
README.md
CHANGED
@@ -16,7 +16,7 @@ We spend a lot of time on creating the slides and organizing our thoughts for an
 With SlideDeck AI, co-create slide decks on any topic with Generative Artificial Intelligence.
 Describe your topic and let SlideDeck AI generate a PowerPoint slide deck for you—it's as simple as that!

-SlideDeck AI is powered by [Mistral…
+SlideDeck AI is powered by [Mistral-Nemo-Instruct-2407](https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407).
 Originally, it was built using the Llama 2 API provided by Clarifai.

 *Update (v4.0)*: Legacy SlideDeck AI allowed one-shot generation of a slide deck based on the inputs.
@@ -28,7 +28,7 @@ where you can create and improve the presentation.

 SlideDeck AI works in the following way:

-1. Given a topic description, it uses Mistral…
+1. Given a topic description, it uses Mistral Nemo Instruct to generate the *initial* content of the slides.
 The output is generated as structured JSON data based on a pre-defined schema.
 2. Subsequently, it uses the `python-pptx` library to generate the slides,
 based on the JSON data from the previous step.
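To make the two-step flow in this hunk concrete, here is a minimal sketch of step 2: rendering schema-shaped JSON into a `.pptx` file with `python-pptx`. The JSON structure and field names below are illustrative assumptions, not SlideDeck AI's actual pre-defined schema.

```python
# Rough sketch of step 2: turn LLM-produced JSON into slides with python-pptx.
# The JSON shape here is an assumption for illustration only.
import json

from pptx import Presentation

sample_json = '''
{
  "title": "A Brief History of Space Exploration",
  "slides": [
    {"heading": "Early Milestones", "bullets": ["Sputnik 1 (1957)", "Vostok 1 (1961)"]},
    {"heading": "The Moon Landings", "bullets": ["Apollo 11 (1969)", "Apollo 17 (1972)"]}
  ]
}
'''

deck = json.loads(sample_json)
prs = Presentation()

# Title slide
title_slide = prs.slides.add_slide(prs.slide_layouts[0])
title_slide.shapes.title.text = deck['title']

# One "Title and Content" slide per JSON entry
for item in deck['slides']:
    slide = prs.slides.add_slide(prs.slide_layouts[1])
    slide.shapes.title.text = item['heading']
    body = slide.placeholders[1].text_frame
    body.text = item['bullets'][0]
    for bullet in item['bullets'][1:]:
        body.add_paragraph().text = bullet

prs.save('deck.pptx')
```

Running this produces `deck.pptx` with a title slide plus one bulleted slide per JSON entry.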
@@ -52,7 +52,7 @@ number of allowed characters in the textbox, pasting would not work.

 # Local Development

-SlideDeck AI uses [Mistral…
+SlideDeck AI uses [Mistral-Nemo-Instruct-2407](https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407)
 via the Hugging Face Inference API.
 To run this project by yourself, you need to provide the `HUGGINGFACEHUB_API_TOKEN` API key,
 for example, in a `.env` file. Visit the respective websites to obtain the keys.
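As a quick check for the local setup described above, here is a small sketch of loading the key from a `.env` file. The token value is a placeholder, and `python-dotenv` is assumed, consistent with the `load_dotenv()` call visible in `global_config.py` below.

```python
# Sketch: verify the Hugging Face token is available locally.
# Assumes a .env file in the project root containing a line such as:
#   HUGGINGFACEHUB_API_TOKEN=hf_xxxxxxxxxxxxxxxxxxxx   (placeholder value)
import os

from dotenv import load_dotenv  # python-dotenv

load_dotenv()
token = os.environ.get('HUGGINGFACEHUB_API_TOKEN', '')
print('Token found' if token else 'HUGGINGFACEHUB_API_TOKEN is missing')
```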
global_config.py
CHANGED
@@ -1,3 +1,6 @@
+"""
+A set of configurations used by the app.
+"""
 import logging
 import os

@@ -10,10 +13,14 @@ load_dotenv()

 @dataclass(frozen=True)
 class GlobalConfig:
-    …
+    """
+    A data class holding the configurations.
+    """
+
+    HF_LLM_MODEL_NAME = 'mistralai/Mistral-Nemo-Instruct-2407'
     LLM_MODEL_TEMPERATURE: float = 0.2
-    LLM_MODEL_MIN_OUTPUT_LENGTH: int = …
-    LLM_MODEL_MAX_OUTPUT_LENGTH: int = 4096
+    LLM_MODEL_MIN_OUTPUT_LENGTH: int = 100
+    LLM_MODEL_MAX_OUTPUT_LENGTH: int = 4 * 4096
     LLM_MODEL_MAX_INPUT_LENGTH: int = 750

     HUGGINGFACEHUB_API_TOKEN = os.environ.get('HUGGINGFACEHUB_API_TOKEN', '')
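For context, a rough sketch of how these configuration values might be wired into a call to the Hugging Face Inference API. The helper name and the exact client usage are assumptions for illustration, not necessarily how the app invokes the model.

```python
from huggingface_hub import InferenceClient

from global_config import GlobalConfig


def generate_slide_json(prompt: str) -> str:
    """Call the configured model with the limits defined in GlobalConfig."""
    client = InferenceClient(
        model=GlobalConfig.HF_LLM_MODEL_NAME,
        token=GlobalConfig.HUGGINGFACEHUB_API_TOKEN,
    )
    return client.text_generation(
        prompt[:GlobalConfig.LLM_MODEL_MAX_INPUT_LENGTH],  # cap the input size
        max_new_tokens=GlobalConfig.LLM_MODEL_MAX_OUTPUT_LENGTH,
        temperature=GlobalConfig.LLM_MODEL_TEMPERATURE,
    )
```

Raising `LLM_MODEL_MAX_OUTPUT_LENGTH` to `4 * 4096` gives the model room to emit the full JSON for a longer deck, which is the output-length increase named in this commit.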