Spaces: seanpedrickcase / llm_topic_modelling
llm_topic_modelling / tools (at revision fd8dddc)
3 contributors, history: 19 commits
Latest commit: "Updated packages, a few bug fixes" by seanpedrickcase (64ffd3a, 3 months ago)
| File | Scan | Size | Last commit message | Last updated |
| --- | --- | --- | --- | --- |
| __init__.py | Safe | 0 Bytes | First commit | 6 months ago |
| auth.py | Safe | 1.54 kB | Allowed for server port, queue size, and file size to be specified by environment variables | 6 months ago |
| aws_functions.py | Safe | 7.26 kB | Corrected line in upload_file_to_s3 function that was causing issues | 6 months ago |
| chatfuncs.py | Safe | 8.14 kB | Topic deduplication/merging now separated from summarisation. Gradio upgrade | 4 months ago |
| helper_functions.py | Safe | 14.4 kB | Changed default requirements to CPU version of llama cpp. Added Gemini Flash 2.0 to model list. Output files should contain only final files. | 3 months ago |
| llm_api_call.py | Safe | 105 kB | Updated packages, a few bug fixes | 3 months ago |
| prompts.py | Safe | 5.25 kB | Refactor app.py and related modules for improved topic extraction and summarization. Updated UI prompts for clarity, enhanced file upload functionality, and added error handling in AWS file uploads. Introduced new functions for converting response text to markdown tables, creating general topics from subtopics, and improved overall code structure for better maintainability. | 6 months ago |
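The auth.py entry above notes that the server port, queue size, and upload file size can be supplied through environment variables. The snippet below is a minimal, hypothetical sketch of that pattern for a Gradio app; the environment variable names, defaults, and placeholder UI are assumptions and are not taken from this Space's code (the max_file_size argument requires a recent Gradio release).

```python
import os

import gradio as gr

# Hypothetical variable names and defaults; the Space's own auth.py may differ.
SERVER_PORT = int(os.environ.get("GRADIO_SERVER_PORT", "7860"))
QUEUE_SIZE = int(os.environ.get("GRADIO_QUEUE_SIZE", "10"))
MAX_FILE_SIZE = os.environ.get("GRADIO_MAX_FILE_SIZE", "100mb")  # e.g. "100mb"

with gr.Blocks() as demo:
    gr.Markdown("Placeholder UI for the topic-modelling app")

# Cap the request queue, then launch on the configured port with an upload size limit.
demo.queue(max_size=QUEUE_SIZE).launch(server_port=SERVER_PORT, max_file_size=MAX_FILE_SIZE)
```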