Hugging Face Space: seanpedrickcase/llm_topic_modelling
3 contributors, 7 commits

Latest commit: b7f4700 (4 months ago) by seanpedrickcase
"Added support for using local models (specifically Gemma 2b) for topic extraction and summary. Generally improved output format safeguards."
Files:

.github/ - First commit, 4 months ago
tools/ - Added support for using local models (specifically Gemma 2b) for topic extraction and summary. Generally improved output format safeguards. 4 months ago
.dockerignore (137 Bytes) - First commit, 4 months ago
.gitignore (137 Bytes) - First commit, 4 months ago
Dockerfile (1.88 kB) - Added support for using local models (specifically Gemma 2b) for topic extraction and summary. Generally improved output format safeguards. 4 months ago
README.md (2.21 kB) - Added support for using local models (specifically Gemma 2b) for topic extraction and summary. Generally improved output format safeguards. 4 months ago
app.py (25.2 kB) - Added support for using local models (specifically Gemma 2b) for topic extraction and summary. Generally improved output format safeguards. 4 months ago
requirements.txt (419 Bytes) - Added support for using local models (specifically Gemma 2b) for topic extraction and summary. Generally improved output format safeguards. 4 months ago
requirements_cpu.txt (415 Bytes) - Added support for using local models (specifically Gemma 2b) for topic extraction and summary. Generally improved output format safeguards. 4 months ago