Commit 775fb16 · github-actions[bot] committed
1 parent: 4a72078

GitHub deploy: 0554cc612891bbdf2e4271ffdf6e88ceabe404e1

Note: this view is limited to 50 files because the commit contains too many changes; see the raw diff for the full change set.
- .github/workflows/deploy-to-hf-spaces.yml +1 -1
- .github/workflows/integration-test.yml +7 -0
- CHANGELOG.md +27 -0
- Dockerfile +1 -1
- README.md +4 -6
- backend/apps/audio/main.py +7 -7
- backend/apps/images/main.py +9 -7
- backend/apps/images/utils/comfyui.py +7 -5
- backend/apps/ollama/main.py +90 -315
- backend/apps/openai/main.py +8 -5
- backend/apps/rag/main.py +9 -9
- backend/apps/rag/search/brave.py +2 -2
- backend/apps/rag/search/duckduckgo.py +3 -3
- backend/apps/rag/search/google_pse.py +2 -2
- backend/apps/rag/search/jina_search.py +1 -1
- backend/apps/rag/search/searxng.py +5 -5
- backend/apps/rag/search/serper.py +2 -2
- backend/apps/rag/search/serply.py +2 -2
- backend/apps/rag/search/serpstack.py +2 -2
- backend/apps/rag/search/tavily.py +1 -1
- backend/apps/rag/utils.py +6 -6
- backend/apps/webui/main.py +4 -2
- backend/apps/webui/models/auths.py +6 -6
- backend/apps/webui/models/chats.py +20 -20
- backend/apps/webui/models/documents.py +5 -5
- backend/apps/webui/models/files.py +5 -5
- backend/apps/webui/models/functions.py +10 -10
- backend/apps/webui/models/memories.py +10 -10
- backend/apps/webui/models/models.py +3 -3
- backend/apps/webui/models/prompts.py +5 -5
- backend/apps/webui/models/tags.py +8 -8
- backend/apps/webui/models/tools.py +8 -8
- backend/apps/webui/models/users.py +12 -12
- backend/apps/webui/routers/chats.py +11 -11
- backend/apps/webui/routers/configs.py +7 -7
- backend/apps/webui/routers/documents.py +4 -4
- backend/apps/webui/routers/files.py +2 -2
- backend/apps/webui/routers/functions.py +5 -5
- backend/apps/webui/routers/memories.py +2 -2
- backend/apps/webui/routers/models.py +2 -2
- backend/apps/webui/routers/prompts.py +3 -3
- backend/apps/webui/routers/tools.py +3 -3
- backend/apps/webui/routers/users.py +2 -2
- backend/apps/webui/routers/utils.py +2 -2
- backend/apps/webui/utils.py +14 -0
- backend/config.py +19 -14
- backend/data/litellm/config.yaml +1 -3
- backend/main.py +6 -5
- backend/requirements.txt +6 -6
- backend/start.sh +0 -1
.github/workflows/deploy-to-hf-spaces.yml

```diff
@@ -44,7 +44,7 @@ jobs:
           echo "---" >> temp_readme.md
           cat README.md >> temp_readme.md
           mv temp_readme.md README.md
-
+
       - name: Configure git
         run: |
           git config --global user.email "41898282+github-actions[bot]@users.noreply.github.com"
```
.github/workflows/integration-test.yml

```diff
@@ -15,6 +15,13 @@ jobs:
     name: Run Cypress Integration Tests
     runs-on: ubuntu-latest
     steps:
+      - name: Maximize build space
+        uses: AdityaGarg8/[email protected]
+        with:
+          remove-android: 'true'
+          remove-haskell: 'true'
+          remove-codeql: 'true'
+
       - name: Checkout Repository
         uses: actions/checkout@v4
 
```
CHANGELOG.md

```diff
@@ -5,6 +5,33 @@ All notable changes to this project will be documented in this file.
 The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/),
 and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
 
+## [0.3.13] - 2024-08-14
+
+### Added
+
+- **🎨 Enhanced Markdown Rendering**: Significant improvements in rendering markdown, ensuring smooth and reliable display of LaTeX and Mermaid charts, enhancing user experience with more robust visual content.
+- **🔄 Auto-Install Tools & Functions Python Dependencies**: For 'Tools' and 'Functions', Open WebUI now automatically installs extra Python requirements specified in the frontmatter, streamlining setup processes and customization.
+- **🌀 OAuth Email Claim Customization**: Introduced an 'OAUTH_EMAIL_CLAIM' variable to allow customization of the default "email" claim within OAuth configurations, providing greater flexibility in authentication processes.
+- **📶 Websocket Reconnection**: Enhanced reliability with the capability to automatically reconnect when a websocket is closed, ensuring consistent and stable communication.
+- **🤳 Haptic Feedback on Supported Devices**: Android devices now support haptic feedback for an immersive tactile experience during certain interactions.
+
+### Fixed
+
+- **🛠️ ComfyUI Performance Improvement**: Addressed an issue causing FastAPI to stall when ComfyUI image generation was active; it now runs in a separate thread to prevent UI unresponsiveness.
+- **🔀 Session Handling**: Fixed an issue mandating session_id on the client side to ensure smoother session management and transitions.
+- **🖋️ Minor Bug Fixes and Format Corrections**: Various minor fixes including typo corrections, backend formatting improvements, and test amendments enhancing overall system stability and performance.
+
+### Changed
+
+- **🚀 Migration to SvelteKit 2**: Upgraded the underlying framework to SvelteKit version 2, offering enhanced speed, better code structure, and improved deployment capabilities.
+- **🧹 General Cleanup and Refactoring**: Performed broad cleanup and refactoring across the platform, improving code efficiency and maintaining high standards of code health.
+- **🚧 Integration Testing Improvements**: Modified how Cypress integration tests detect chat messages and updated sharing tests for better reliability and accuracy.
+- **📁 Standardized '.safetensors' File Extension**: Renamed the '.sft' file extension to '.safetensors' for ComfyUI workflows, standardizing file formats across the platform.
+
+### Removed
+
+- **🗑️ Deprecated Frontend Functions**: Removed frontend functions that were migrated to the backend to declutter the codebase and reduce redundancy.
+
 ## [0.3.12] - 2024-08-07
 
 ### Added
```
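The "Auto-Install Tools & Functions Python Dependencies" entry above describes reading extra Python requirements from a tool's frontmatter and installing them automatically. A minimal sketch of that idea, assuming a `requirements:` line inside a frontmatter docstring; the format and helper names here are illustrative, not Open WebUI's actual implementation:

```python
import re
import subprocess
import sys


def extract_requirements(source: str) -> list[str]:
    """Parse a `requirements: pkg1, pkg2` line from a tool/function
    frontmatter docstring (hypothetical frontmatter format)."""
    match = re.search(r"^requirements:\s*(.+)$", source, flags=re.MULTILINE)
    if not match:
        return []
    return [pkg.strip() for pkg in match.group(1).split(",") if pkg.strip()]


def install_requirements(source: str) -> None:
    # Install each declared requirement with the running interpreter's pip.
    for pkg in extract_requirements(source):
        subprocess.check_call([sys.executable, "-m", "pip", "install", pkg])


tool_source = '''
"""
title: Example Tool
requirements: requests, beautifulsoup4
"""
'''
print(extract_requirements(tool_source))  # ['requests', 'beautifulsoup4']
```

Installing with `sys.executable -m pip` keeps the packages in the same environment the server runs in, which matters inside a Docker image with several Pythons available.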
Dockerfile

```diff
@@ -2,7 +2,7 @@
 # Initialize device type args
 # use build args in the docker build commmand with --build-arg="BUILDARG=true"
 ARG USE_CUDA=false
-ARG USE_OLLAMA=
+ARG USE_OLLAMA=false
 # Tested with cu117 for CUDA 11 and cu121 for CUDA 12 (default)
 ARG USE_CUDA_VER=cu121
 # any sentence transformer model; models to use can be found at https://huggingface.co/models?library=sentence-transformers
```
README.md

```diff
@@ -1,14 +1,12 @@
 ---
-title:
+title: Open WebUI
 emoji: 🐳
 colorFrom: purple
 colorTo: gray
 sdk: docker
 app_port: 8080
-license: apache-2.0
 ---
-
-# OpenOllama 👋
+# Open WebUI (Formerly Ollama WebUI) 👋
 
 
 
@@ -25,7 +23,7 @@ Open WebUI is an [extensible](https://github.com/open-webui/pipelines), feature-
 
 
 
-## Key Features of
+## Key Features of Open WebUI ⭐
 
 - 🚀 **Effortless Setup**: Install seamlessly using Docker or Kubernetes (kubectl, kustomize or helm) for a hassle-free experience with support for both `:ollama` and `:cuda` tagged images.
 
@@ -210,4 +208,4 @@ If you have any questions, suggestions, or need assistance, please open an issue
 
 ---
 
-Created by [Timothy J. Baek](https://github.com/tjbck) - Let's make Open WebUI even more amazing together! 💪
+Created by [Timothy J. Baek](https://github.com/tjbck) - Let's make Open WebUI even more amazing together! 💪
```
backend/apps/audio/main.py

```diff
@@ -15,7 +15,7 @@ from fastapi.responses import StreamingResponse, JSONResponse, FileResponse
 from fastapi.middleware.cors import CORSMiddleware
 from pydantic import BaseModel
 
-
+
 import uuid
 import requests
 import hashlib
@@ -244,7 +244,7 @@ async def speech(request: Request, user=Depends(get_verified_user)):
                 res = r.json()
                 if "error" in res:
                     error_detail = f"External: {res['error']['message']}"
-            except:
+            except Exception:
                 error_detail = f"External: {e}"
 
             raise HTTPException(
@@ -299,7 +299,7 @@ async def speech(request: Request, user=Depends(get_verified_user)):
                 res = r.json()
                 if "error" in res:
                     error_detail = f"External: {res['error']['message']}"
-            except:
+            except Exception:
                 error_detail = f"External: {e}"
 
             raise HTTPException(
@@ -353,7 +353,7 @@ def transcribe(
 
         try:
             model = WhisperModel(**whisper_kwargs)
-        except:
+        except Exception:
            log.warning(
                 "WhisperModel initialization failed, attempting download with local_files_only=False"
             )
@@ -421,7 +421,7 @@ def transcribe(
                 res = r.json()
                 if "error" in res:
                     error_detail = f"External: {res['error']['message']}"
-            except:
+            except Exception:
                 error_detail = f"External: {e}"
 
             raise HTTPException(
@@ -438,7 +438,7 @@ def transcribe(
     )
 
 
-def get_available_models() ->
+def get_available_models() -> list[dict]:
     if app.state.config.TTS_ENGINE == "openai":
         return [{"id": "tts-1"}, {"id": "tts-1-hd"}]
     elif app.state.config.TTS_ENGINE == "elevenlabs":
@@ -466,7 +466,7 @@ async def get_models(user=Depends(get_verified_user)):
     return {"models": get_available_models()}
 
 
-def get_available_voices() ->
+def get_available_voices() -> list[dict]:
     if app.state.config.TTS_ENGINE == "openai":
         return [
             {"name": "alloy", "id": "alloy"},
```
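The recurring change in this file replaces bare `except:` clauses with `except Exception:`. The distinction matters: a bare `except` also catches `SystemExit` and `KeyboardInterrupt`, so a Ctrl-C arriving during error handling would be silently swallowed. A small sketch of the pattern, simplified from the handler above rather than copied from the app's exact code:

```python
import json


def parse_error_detail(response_text: str, fallback: str) -> str:
    """Extract an error message from a JSON error body, falling back
    to a generic detail when the body is not parseable."""
    try:
        res = json.loads(response_text)
        return f"External: {res['error']['message']}"
    except Exception:
        # Narrower than a bare `except:` — SystemExit and
        # KeyboardInterrupt still propagate to the caller.
        return fallback


print(parse_error_detail('{"error": {"message": "quota exceeded"}}', "External: unknown"))
# External: quota exceeded
print(parse_error_detail("not json", "External: unknown"))
# External: unknown
```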
backend/apps/images/main.py

```diff
@@ -94,7 +94,7 @@ app.state.config.COMFYUI_FLUX_FP8_CLIP = COMFYUI_FLUX_FP8_CLIP
 
 
 def get_automatic1111_api_auth():
-    if app.state.config.AUTOMATIC1111_API_AUTH
+    if app.state.config.AUTOMATIC1111_API_AUTH is None:
         return ""
     else:
         auth1111_byte_string = app.state.config.AUTOMATIC1111_API_AUTH.encode("utf-8")
@@ -145,28 +145,30 @@ async def get_engine_url(user=Depends(get_admin_user)):
 async def update_engine_url(
     form_data: EngineUrlUpdateForm, user=Depends(get_admin_user)
 ):
-    if form_data.AUTOMATIC1111_BASE_URL
+    if form_data.AUTOMATIC1111_BASE_URL is None:
         app.state.config.AUTOMATIC1111_BASE_URL = AUTOMATIC1111_BASE_URL
     else:
         url = form_data.AUTOMATIC1111_BASE_URL.strip("/")
         try:
             r = requests.head(url)
+            r.raise_for_status()
             app.state.config.AUTOMATIC1111_BASE_URL = url
         except Exception as e:
-            raise HTTPException(status_code=400, detail=ERROR_MESSAGES.
+            raise HTTPException(status_code=400, detail=ERROR_MESSAGES.INVALID_URL)
 
-    if form_data.COMFYUI_BASE_URL
+    if form_data.COMFYUI_BASE_URL is None:
         app.state.config.COMFYUI_BASE_URL = COMFYUI_BASE_URL
     else:
         url = form_data.COMFYUI_BASE_URL.strip("/")
 
         try:
             r = requests.head(url)
+            r.raise_for_status()
             app.state.config.COMFYUI_BASE_URL = url
         except Exception as e:
-            raise HTTPException(status_code=400, detail=ERROR_MESSAGES.
+            raise HTTPException(status_code=400, detail=ERROR_MESSAGES.INVALID_URL)
 
-    if form_data.AUTOMATIC1111_API_AUTH
+    if form_data.AUTOMATIC1111_API_AUTH is None:
         app.state.config.AUTOMATIC1111_API_AUTH = AUTOMATIC1111_API_AUTH
     else:
         app.state.config.AUTOMATIC1111_API_AUTH = form_data.AUTOMATIC1111_API_AUTH
@@ -514,7 +516,7 @@ async def image_generations(
 
         data = ImageGenerationPayload(**data)
 
-        res = comfyui_generate_image(
+        res = await comfyui_generate_image(
             app.state.config.MODEL,
             data,
             user.id,
```
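The added `r.raise_for_status()` calls change what "valid URL" means here: `requests.head` alone returns normally for 404 or 500 responses, so a bad endpoint would previously pass validation. A sketch of the pattern with the HTTP call injected so it runs without a network; the function and class names are illustrative, not the app's signatures:

```python
def validate_base_url(url: str, head) -> str:
    """Return the normalized base URL if a HEAD request succeeds
    (2xx/3xx); raise ValueError otherwise. `head` is injected so the
    check can be exercised offline; in the app it would be requests.head."""
    url = url.strip("/")
    try:
        r = head(url)
        r.raise_for_status()  # without this, a 404/500 still "succeeds"
    except Exception as e:
        raise ValueError(f"invalid url: {url}") from e
    return url


class FakeResponse:
    """Minimal stand-in for an HTTP response object."""

    def __init__(self, status_code: int):
        self.status_code = status_code

    def raise_for_status(self):
        if self.status_code >= 400:
            raise RuntimeError(f"HTTP {self.status_code}")


print(validate_base_url("http://comfy.local:8188/", head=lambda u: FakeResponse(200)))
# http://comfy.local:8188
```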
backend/apps/images/utils/comfyui.py

```diff
@@ -1,5 +1,5 @@
+import asyncio
 import websocket  # NOTE: websocket-client (https://github.com/websocket-client/websocket-client)
-import uuid
 import json
 import urllib.request
 import urllib.parse
@@ -170,7 +170,7 @@ FLUX_DEFAULT_PROMPT = """
     },
     "10": {
         "inputs": {
-            "vae_name": "ae.
+            "vae_name": "ae.safetensors"
         },
         "class_type": "VAELoader"
     },
@@ -184,7 +184,7 @@ FLUX_DEFAULT_PROMPT = """
     },
     "12": {
         "inputs": {
-            "unet_name": "flux1-dev.
+            "unet_name": "flux1-dev.safetensors",
             "weight_dtype": "default"
         },
         "class_type": "UNETLoader"
@@ -328,7 +328,7 @@ class ImageGenerationPayload(BaseModel):
     flux_fp8_clip: Optional[bool] = None
 
 
-def comfyui_generate_image(
+async def comfyui_generate_image(
     model: str, payload: ImageGenerationPayload, client_id, base_url
 ):
     ws_url = base_url.replace("http://", "ws://").replace("https://", "wss://")
@@ -397,7 +397,9 @@ def comfyui_generate_image(
         return None
 
     try:
-        images =
+        images = await asyncio.to_thread(
+            get_images, ws, comfyui_prompt, client_id, base_url
+        )
     except Exception as e:
         log.exception(f"Error while receiving images: {e}")
         images = None
```
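This is the ComfyUI fix described in the changelog: the blocking websocket receive loop is handed to a worker thread with `asyncio.to_thread`, so the FastAPI event loop stays responsive while images are generated. The pattern in isolation, with a simple stand-in for the blocking call:

```python
import asyncio
import time


def blocking_receive(prompt: str) -> str:
    """Stand-in for a blocking websocket receive loop."""
    time.sleep(0.1)  # simulate waiting on the socket
    return f"images for {prompt!r}"


async def generate(prompt: str) -> str:
    # Run the blocking call in a worker thread; the event loop stays
    # free to serve other requests while this coroutine awaits.
    return await asyncio.to_thread(blocking_receive, prompt)


print(asyncio.run(generate("a red fox")))  # images for 'a red fox'
```

`asyncio.to_thread` (Python 3.9+) is essentially `loop.run_in_executor` with arguments forwarded, which is why only the `await` site changes and the blocking function itself stays untouched.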
backend/apps/ollama/main.py

```diff
@@ -1,47 +1,36 @@
 from fastapi import (
     FastAPI,
     Request,
-    Response,
     HTTPException,
     Depends,
-    status,
     UploadFile,
     File,
-    BackgroundTasks,
 )
 from fastapi.middleware.cors import CORSMiddleware
 from fastapi.responses import StreamingResponse
-from fastapi.concurrency import run_in_threadpool
 
 from pydantic import BaseModel, ConfigDict
 
 import os
 import re
-import copy
 import random
 import requests
 import json
-import uuid
 import aiohttp
 import asyncio
 import logging
 import time
 from urllib.parse import urlparse
-from typing import Optional,
 
 from starlette.background import BackgroundTask
 
 from apps.webui.models.models import Models
-from apps.webui.models.users import Users
 from constants import ERROR_MESSAGES
 from utils.utils import (
-    decode_token,
-    get_current_user,
     get_verified_user,
     get_admin_user,
 )
-from utils.task import prompt_template
-
 
 from config import (
     SRC_LOG_LEVELS,
@@ -53,7 +42,12 @@ from config import (
     UPLOAD_DIR,
     AppConfig,
 )
-from utils.misc import
 
 log = logging.getLogger(__name__)
 log.setLevel(SRC_LOG_LEVELS["OLLAMA"])
@@ -120,7 +114,7 @@ async def get_ollama_api_urls(user=Depends(get_admin_user)):
 
 
 class UrlUpdateForm(BaseModel):
-    urls:
 
 
 @app.post("/urls/update")
@@ -183,7 +177,7 @@ async def post_streaming_url(url: str, payload: str, stream: bool = True):
             res = await r.json()
             if "error" in res:
                 error_detail = f"Ollama: {res['error']}"
-        except:
             error_detail = f"Ollama: {e}"
 
         raise HTTPException(
@@ -238,7 +232,7 @@ async def get_all_models():
 async def get_ollama_tags(
     url_idx: Optional[int] = None, user=Depends(get_verified_user)
 ):
-    if url_idx
         models = await get_all_models()
 
         if app.state.config.ENABLE_MODEL_FILTER:
@@ -269,7 +263,7 @@ async def get_ollama_tags(
             res = r.json()
             if "error" in res:
                 error_detail = f"Ollama: {res['error']}"
-        except:
             error_detail = f"Ollama: {e}"
 
         raise HTTPException(
@@ -282,8 +276,7 @@ async def get_ollama_tags(
 @app.get("/api/version/{url_idx}")
 async def get_ollama_versions(url_idx: Optional[int] = None):
     if app.state.config.ENABLE_OLLAMA_API:
-        if url_idx
-
             # returns lowest version
             tasks = [
                 fetch_url(f"{url}/api/version")
@@ -323,7 +316,7 @@ async def get_ollama_versions(url_idx: Optional[int] = None):
             res = r.json()
             if "error" in res:
                 error_detail = f"Ollama: {res['error']}"
-        except:
             error_detail = f"Ollama: {e}"
 
         raise HTTPException(
@@ -346,8 +339,6 @@ async def pull_model(
     url = app.state.config.OLLAMA_BASE_URLS[url_idx]
     log.info(f"url: {url}")
 
-    r = None
-
     # Admin should be able to pull models from any source
     payload = {**form_data.model_dump(exclude_none=True), "insecure": True}
 
@@ -367,7 +358,7 @@ async def push_model(
     url_idx: Optional[int] = None,
     user=Depends(get_admin_user),
 ):
-    if url_idx
         if form_data.name in app.state.MODELS:
             url_idx = app.state.MODELS[form_data.name]["urls"][0]
         else:
@@ -417,7 +408,7 @@ async def copy_model(
     url_idx: Optional[int] = None,
     user=Depends(get_admin_user),
 ):
-    if url_idx
         if form_data.source in app.state.MODELS:
             url_idx = app.state.MODELS[form_data.source]["urls"][0]
         else:
@@ -428,13 +419,13 @@ async def copy_model(
 
     url = app.state.config.OLLAMA_BASE_URLS[url_idx]
     log.info(f"url: {url}")
 
     try:
-        r = requests.request(
-            method="POST",
-            url=f"{url}/api/copy",
-            data=form_data.model_dump_json(exclude_none=True).encode(),
-        )
         r.raise_for_status()
 
         log.debug(f"r.text: {r.text}")
@@ -448,7 +439,7 @@ async def copy_model(
             res = r.json()
             if "error" in res:
                 error_detail = f"Ollama: {res['error']}"
-        except:
             error_detail = f"Ollama: {e}"
 
         raise HTTPException(
@@ -464,7 +455,7 @@ async def delete_model(
     url_idx: Optional[int] = None,
     user=Depends(get_admin_user),
 ):
-    if url_idx
         if form_data.name in app.state.MODELS:
             url_idx = app.state.MODELS[form_data.name]["urls"][0]
         else:
@@ -476,12 +467,12 @@ async def delete_model(
     url = app.state.config.OLLAMA_BASE_URLS[url_idx]
     log.info(f"url: {url}")
 
     try:
-        r = requests.request(
-            method="DELETE",
-            url=f"{url}/api/delete",
-            data=form_data.model_dump_json(exclude_none=True).encode(),
-        )
         r.raise_for_status()
 
         log.debug(f"r.text: {r.text}")
@@ -495,7 +486,7 @@ async def delete_model(
             res = r.json()
             if "error" in res:
                 error_detail = f"Ollama: {res['error']}"
-        except:
             error_detail = f"Ollama: {e}"
 
         raise HTTPException(
@@ -516,12 +507,12 @@ async def show_model_info(form_data: ModelNameForm, user=Depends(get_verified_us
     url = app.state.config.OLLAMA_BASE_URLS[url_idx]
     log.info(f"url: {url}")
 
     try:
-        r = requests.request(
-            method="POST",
-            url=f"{url}/api/show",
-            data=form_data.model_dump_json(exclude_none=True).encode(),
-        )
         r.raise_for_status()
 
         return r.json()
@@ -533,7 +524,7 @@ async def show_model_info(form_data: ModelNameForm, user=Depends(get_verified_us
             res = r.json()
             if "error" in res:
                 error_detail = f"Ollama: {res['error']}"
-        except:
             error_detail = f"Ollama: {e}"
 
         raise HTTPException(
@@ -556,7 +547,7 @@ async def generate_embeddings(
     url_idx: Optional[int] = None,
     user=Depends(get_verified_user),
 ):
-    if url_idx
         model = form_data.model
 
         if ":" not in model:
@@ -573,12 +564,12 @@ async def generate_embeddings(
     url = app.state.config.OLLAMA_BASE_URLS[url_idx]
     log.info(f"url: {url}")
 
     try:
-        r = requests.request(
-            method="POST",
-            url=f"{url}/api/embeddings",
-            data=form_data.model_dump_json(exclude_none=True).encode(),
-        )
         r.raise_for_status()
 
         return r.json()
@@ -590,7 +581,7 @@ async def generate_embeddings(
             res = r.json()
             if "error" in res:
                 error_detail = f"Ollama: {res['error']}"
-        except:
             error_detail = f"Ollama: {e}"
 
         raise HTTPException(
@@ -603,10 +594,9 @@ def generate_ollama_embeddings(
     form_data: GenerateEmbeddingsForm,
     url_idx: Optional[int] = None,
 ):
-
     log.info(f"generate_ollama_embeddings {form_data}")
 
-    if url_idx
         model = form_data.model
 
         if ":" not in model:
@@ -623,12 +613,12 @@ def generate_ollama_embeddings(
     url = app.state.config.OLLAMA_BASE_URLS[url_idx]
     log.info(f"url: {url}")
 
     try:
-        r = requests.request(
-            method="POST",
-            url=f"{url}/api/embeddings",
-            data=form_data.model_dump_json(exclude_none=True).encode(),
-        )
         r.raise_for_status()
 
         data = r.json()
@@ -638,7 +628,7 @@ def generate_ollama_embeddings(
         if "embedding" in data:
             return data["embedding"]
         else:
-            raise "Something went wrong :/"
     except Exception as e:
         log.exception(e)
         error_detail = "Open WebUI: Server Connection Error"
@@ -647,16 +637,16 @@ def generate_ollama_embeddings(
             res = r.json()
             if "error" in res:
                 error_detail = f"Ollama: {res['error']}"
-        except:
             error_detail = f"Ollama: {e}"
 
-    raise error_detail
 
 
 class GenerateCompletionForm(BaseModel):
     model: str
     prompt: str
-    images: Optional[
     format: Optional[str] = None
     options: Optional[dict] = None
     system: Optional[str] = None
@@ -674,8 +664,7 @@ async def generate_completion(
     url_idx: Optional[int] = None,
     user=Depends(get_verified_user),
 ):
-
-    if url_idx == None:
     model = form_data.model
 
     if ":" not in model:
@@ -700,12 +689,12 @@ async def generate_completion(
 class ChatMessage(BaseModel):
     role: str
     content: str
-    images: Optional[
 
 
 class GenerateChatCompletionForm(BaseModel):
     model: str
-    messages:
     format: Optional[str] = None
     options: Optional[dict] = None
     template: Optional[str] = None
@@ -713,6 +702,18 @@ class GenerateChatCompletionForm(BaseModel):
     keep_alive: Optional[Union[int, str]] = None
 
 
 @app.post("/api/chat")
 @app.post("/api/chat/{url_idx}")
 async def generate_chat_completion(
@@ -720,12 +721,7 @@ async def generate_chat_completion(
     url_idx: Optional[int] = None,
     user=Depends(get_verified_user),
 ):
-
-    log.debug(
-        "form_data.model_dump_json(exclude_none=True).encode(): {0} ".format(
-            form_data.model_dump_json(exclude_none=True).encode()
-        )
-    )
 
     payload = {
         **form_data.model_dump(exclude_none=True, exclude=["metadata"]),
@@ -740,185 +736,21 @@ async def generate_chat_completion(
         if model_info.base_model_id:
             payload["model"] = model_info.base_model_id
 
-
 
-        if
         if payload.get("options") is None:
             payload["options"] = {}
 
-        if (
-            model_info.params.get("mirostat", None)
-            and payload["options"].get("mirostat") is None
-        ):
-            payload["options"]["mirostat"] = model_info.params.get("mirostat", None)
-
-        if (
-            model_info.params.get("mirostat_eta", None)
-            and payload["options"].get("mirostat_eta") is None
-        ):
-            payload["options"]["mirostat_eta"] = model_info.params.get(
-                "mirostat_eta", None
-            )
-
-        if (
-            model_info.params.get("mirostat_tau", None)
-            and payload["options"].get("mirostat_tau") is None
-        ):
-            payload["options"]["mirostat_tau"] = model_info.params.get(
-                "mirostat_tau", None
-            )
-
-        if (
-            model_info.params.get("num_ctx", None)
-            and payload["options"].get("num_ctx") is None
-        ):
-            payload["options"]["num_ctx"] = model_info.params.get("num_ctx", None)
-
-        if (
-            model_info.params.get("num_batch", None)
-            and payload["options"].get("num_batch") is None
-        ):
-            payload["options"]["num_batch"] = model_info.params.get(
-                "num_batch", None
-            )
-
-        if (
-            model_info.params.get("num_keep", None)
-            and payload["options"].get("num_keep") is None
-        ):
-            payload["options"]["num_keep"] = model_info.params.get("num_keep", None)
-
-        if (
-            model_info.params.get("repeat_last_n", None)
-            and payload["options"].get("repeat_last_n") is None
-        ):
-            payload["options"]["repeat_last_n"] = model_info.params.get(
-                "repeat_last_n", None
-            )
-
-        if (
-            model_info.params.get("frequency_penalty", None)
-            and payload["options"].get("frequency_penalty") is None
-        ):
-            payload["options"]["repeat_penalty"] = model_info.params.get(
-                "frequency_penalty", None
-            )
-
-        if (
-            model_info.params.get("temperature", None) is not None
-            and payload["options"].get("temperature") is None
-        ):
-            payload["options"]["temperature"] = model_info.params.get(
-                "temperature", None
-            )
-
-        if (
-            model_info.params.get("seed", None) is not None
-            and payload["options"].get("seed") is None
-        ):
-            payload["options"]["seed"] = model_info.params.get("seed", None)
-
-        if (
-            model_info.params.get("stop", None)
-            and payload["options"].get("stop") is None
-        ):
-            payload["options"]["stop"] = (
-                [
-                    bytes(stop, "utf-8").decode("unicode_escape")
-                    for stop in model_info.params["stop"]
-                ]
-                if model_info.params.get("stop", None)
-                else None
-            )
-
-        if (
-            model_info.params.get("tfs_z", None)
-            and payload["options"].get("tfs_z") is None
-        ):
-            payload["options"]["tfs_z"] = model_info.params.get("tfs_z", None)
-
-        if (
-            model_info.params.get("max_tokens", None)
-            and payload["options"].get("max_tokens") is None
-        ):
-            payload["options"]["num_predict"] = model_info.params.get(
-                "max_tokens", None
-            )
-
-        if (
-            model_info.params.get("top_k", None)
-            and payload["options"].get("top_k") is None
-        ):
-            payload["options"]["top_k"] = model_info.params.get("top_k", None)
-
-        if (
-            model_info.params.get("top_p", None)
-            and payload["options"].get("top_p") is None
-        ):
-            payload["options"]["top_p"] = model_info.params.get("top_p", None)
-
-        if (
-            model_info.params.get("min_p", None)
-            and payload["options"].get("min_p") is None
-        ):
-            payload["options"]["min_p"] = model_info.params.get("min_p", None)
-
-        if (
-            model_info.params.get("use_mmap", None)
-            and payload["options"].get("use_mmap") is None
-        ):
-            payload["options"]["use_mmap"] = model_info.params.get("use_mmap", None)
-
-        if (
-            model_info.params.get("use_mlock", None)
-            and payload["options"].get("use_mlock") is None
-        ):
-            payload["options"]["use_mlock"] = model_info.params.get(
-                "use_mlock", None
-            )
-
-        if (
-            model_info.params.get("num_thread", None)
-            and payload["options"].get("num_thread") is None
-        ):
-            payload["options"]["num_thread"] = model_info.params.get(
-                "num_thread", None
-            )
-
-        system = model_info.params.get("system", None)
-        if system:
-            system = prompt_template(
-                system,
-                **(
-                    {
-                        "user_name": user.name,
-                        "user_location": (
-                            user.info.get("location") if user.info else None
-                        ),
-                    }
-                    if user
-                    else {}
```
901 |
-
),
|
902 |
)
|
|
|
903 |
|
904 |
-
|
905 |
-
|
906 |
-
system, payload["messages"]
|
907 |
-
)
|
908 |
-
|
909 |
-
if url_idx == None:
|
910 |
-
if ":" not in payload["model"]:
|
911 |
-
payload["model"] = f"{payload['model']}:latest"
|
912 |
-
|
913 |
-
if payload["model"] in app.state.MODELS:
|
914 |
-
url_idx = random.choice(app.state.MODELS[payload["model"]]["urls"])
|
915 |
-
else:
|
916 |
-
raise HTTPException(
|
917 |
-
status_code=400,
|
918 |
-
detail=ERROR_MESSAGES.MODEL_NOT_FOUND(form_data.model),
|
919 |
-
)
|
920 |
|
921 |
-
url =
|
922 |
log.info(f"url: {url}")
|
923 |
log.debug(payload)
|
924 |
|
@@ -940,7 +772,7 @@ class OpenAIChatMessage(BaseModel):
940 
941   class OpenAIChatCompletionForm(BaseModel):
942       model: str
943 -     messages: List[OpenAIChatMessage]
944 
945       model_config = ConfigDict(extra="allow")
946 
@@ -952,83 +784,28 @@ async def generate_openai_chat_completion(
952       url_idx: Optional[int] = None,
953       user=Depends(get_verified_user),
954   ):
955 - …
956 -     payload = {**…
957 - 
958       if "metadata" in payload:
959           del payload["metadata"]
960 
961 -     model_id = form_data.model
962       model_info = Models.get_model_by_id(model_id)
963 
964       if model_info:
965           if model_info.base_model_id:
966               payload["model"] = model_info.base_model_id
967 
968 - …
969 
970 -         if …
971 -             payload…
972 -             payload…
973 -             payload["max_tokens"] = model_info.params.get("max_tokens", None)
974 -             payload["frequency_penalty"] = model_info.params.get(
975 -                 "frequency_penalty", None
976 -             )
977 -             payload["seed"] = model_info.params.get("seed", None)
978 -             payload["stop"] = (
979 -                 [
980 -                     bytes(stop, "utf-8").decode("unicode_escape")
981 -                     for stop in model_info.params["stop"]
982 -                 ]
983 -                 if model_info.params.get("stop", None)
984 -                 else None
985 -             )
986 
987 - …
988 -         system = model_info.params.get("system", None)
989 -         if system:
990 -             system = prompt_template(
991 -                 system,
992 -                 **(
993 -                     {
994 -                         "user_name": user.name,
995 -                         "user_location": (
996 -                             user.info.get("location") if user.info else None
997 -                         ),
998 -                     }
999 -                     if user
1000 -                    else {}
1001 -                ),
1002 -            )
1003 -            # Check if the payload already has a system message
1004 -            # If not, add a system message to the payload
1005 -            if payload.get("messages"):
1006 -                for message in payload["messages"]:
1007 -                    if message.get("role") == "system":
1008 -                        message["content"] = system + message["content"]
1009 -                        break
1010 -                else:
1011 -                    payload["messages"].insert(
1012 -                        0,
1013 -                        {
1014 -                            "role": "system",
1015 -                            "content": system,
1016 -                        },
1017 -                    )
1018 
1019 - 
1020 -    if ":" not in payload["model"]:
1021 -        payload["model"] = f"{payload['model']}:latest"
1022 - 
1023 -    if payload["model"] in app.state.MODELS:
1024 -        url_idx = random.choice(app.state.MODELS[payload["model"]]["urls"])
1025 -    else:
1026 -        raise HTTPException(
1027 -            status_code=400,
1028 -            detail=ERROR_MESSAGES.MODEL_NOT_FOUND(form_data.model),
1029 -        )
1030 - 
1031 -    url = app.state.config.OLLAMA_BASE_URLS[url_idx]
1032      log.info(f"url: {url}")
1033 
1034      return await post_streaming_url(
@@ -1044,7 +821,7 @@ async def get_openai_models(
1044      url_idx: Optional[int] = None,
1045      user=Depends(get_verified_user),
1046  ):
1047 -    if url_idx == None:
1048          models = await get_all_models()
1049 
1050          if app.state.config.ENABLE_MODEL_FILTER:
@@ -1099,7 +876,7 @@ async def get_openai_models(
1099              res = r.json()
1100              if "error" in res:
1101                  error_detail = f"Ollama: {res['error']}"
1102 -    except:
1103          error_detail = f"Ollama: {e}"
1104 
1105      raise HTTPException(
@@ -1125,7 +902,6 @@ def parse_huggingface_url(hf_url):
1125      path_components = parsed_url.path.split("/")
1126 
1127      # Extract the desired output
1128 -    user_repo = "/".join(path_components[1:3])
1129      model_file = path_components[-1]
1130 
1131      return model_file
@@ -1190,7 +966,6 @@ async def download_model(
1190      url_idx: Optional[int] = None,
1191      user=Depends(get_admin_user),
1192  ):
1193 - 
1194      allowed_hosts = ["https://huggingface.co/", "https://github.com/"]
1195 
1196      if not any(form_data.url.startswith(host) for host in allowed_hosts):
@@ -1199,7 +974,7 @@ async def download_model(
1199          detail="Invalid file_url. Only URLs from allowed hosts are permitted.",
1200      )
1201 
1202 -    if url_idx == None:
1203          url_idx = 0
1204      url = app.state.config.OLLAMA_BASE_URLS[url_idx]
1205 
@@ -1222,7 +997,7 @@ def upload_model(
1222      url_idx: Optional[int] = None,
1223      user=Depends(get_admin_user),
1224  ):
1225 -    if url_idx == None:
1226          url_idx = 0
1227      ollama_url = app.state.config.OLLAMA_BASE_URLS[url_idx]
1228 
1     from fastapi import (
2         FastAPI,
3         Request,
4         HTTPException,
5         Depends,
6         UploadFile,
7         File,
8     )
9     from fastapi.middleware.cors import CORSMiddleware
10    from fastapi.responses import StreamingResponse
11 
12    from pydantic import BaseModel, ConfigDict
13 
14    import os
15    import re
16    import random
17    import requests
18    import json
19    import aiohttp
20    import asyncio
21    import logging
22    import time
23    from urllib.parse import urlparse
24 +  from typing import Optional, Union
25 
26    from starlette.background import BackgroundTask
27 
28    from apps.webui.models.models import Models
29    from constants import ERROR_MESSAGES
30    from utils.utils import (
31        get_verified_user,
32        get_admin_user,
33    )
34 
35    from config import (
36        SRC_LOG_LEVELS,
⋮
42        UPLOAD_DIR,
43        AppConfig,
44    )
45 +  from utils.misc import (
46 +      calculate_sha256,
47 +      apply_model_params_to_body_ollama,
48 +      apply_model_params_to_body_openai,
49 +      apply_model_system_prompt_to_body,
50 +  )
51 
52    log = logging.getLogger(__name__)
53    log.setLevel(SRC_LOG_LEVELS["OLLAMA"])
⋮
114 
115 
116   class UrlUpdateForm(BaseModel):
117 +     urls: list[str]
118 
119 
120   @app.post("/urls/update")
⋮
177               res = await r.json()
178               if "error" in res:
179                   error_detail = f"Ollama: {res['error']}"
180 +     except Exception:
181           error_detail = f"Ollama: {e}"
182 
183       raise HTTPException(
⋮
232   async def get_ollama_tags(
233       url_idx: Optional[int] = None, user=Depends(get_verified_user)
234   ):
235 +     if url_idx is None:
236           models = await get_all_models()
237 
238           if app.state.config.ENABLE_MODEL_FILTER:
⋮
263               res = r.json()
264               if "error" in res:
265                   error_detail = f"Ollama: {res['error']}"
266 +     except Exception:
267           error_detail = f"Ollama: {e}"
268 
269       raise HTTPException(
⋮
276   @app.get("/api/version/{url_idx}")
277   async def get_ollama_versions(url_idx: Optional[int] = None):
278       if app.state.config.ENABLE_OLLAMA_API:
279 +         if url_idx is None:
280               # returns lowest version
281               tasks = [
282                   fetch_url(f"{url}/api/version")
⋮
316               res = r.json()
317               if "error" in res:
318                   error_detail = f"Ollama: {res['error']}"
319 +     except Exception:
320           error_detail = f"Ollama: {e}"
321 
322       raise HTTPException(
⋮
339       url = app.state.config.OLLAMA_BASE_URLS[url_idx]
340       log.info(f"url: {url}")
341 
342       # Admin should be able to pull models from any source
343       payload = {**form_data.model_dump(exclude_none=True), "insecure": True}
344 
⋮
358       url_idx: Optional[int] = None,
359       user=Depends(get_admin_user),
360   ):
361 +     if url_idx is None:
362           if form_data.name in app.state.MODELS:
363               url_idx = app.state.MODELS[form_data.name]["urls"][0]
364           else:
⋮
408       url_idx: Optional[int] = None,
409       user=Depends(get_admin_user),
410   ):
411 +     if url_idx is None:
412           if form_data.source in app.state.MODELS:
413               url_idx = app.state.MODELS[form_data.source]["urls"][0]
414           else:
⋮
419 
420       url = app.state.config.OLLAMA_BASE_URLS[url_idx]
421       log.info(f"url: {url}")
422 +     r = requests.request(
423 +         method="POST",
424 +         url=f"{url}/api/copy",
425 +         data=form_data.model_dump_json(exclude_none=True).encode(),
426 +     )
427 
428       try:
429           r.raise_for_status()
430 
431           log.debug(f"r.text: {r.text}")
⋮
439               res = r.json()
440               if "error" in res:
441                   error_detail = f"Ollama: {res['error']}"
442 +     except Exception:
443           error_detail = f"Ollama: {e}"
444 
445       raise HTTPException(
⋮
455       url_idx: Optional[int] = None,
456       user=Depends(get_admin_user),
457   ):
458 +     if url_idx is None:
459           if form_data.name in app.state.MODELS:
460               url_idx = app.state.MODELS[form_data.name]["urls"][0]
461           else:
⋮
467       url = app.state.config.OLLAMA_BASE_URLS[url_idx]
468       log.info(f"url: {url}")
469 
470 +     r = requests.request(
471 +         method="DELETE",
472 +         url=f"{url}/api/delete",
473 +         data=form_data.model_dump_json(exclude_none=True).encode(),
474 +     )
475       try:
476           r.raise_for_status()
477 
478           log.debug(f"r.text: {r.text}")
⋮
486               res = r.json()
487               if "error" in res:
488                   error_detail = f"Ollama: {res['error']}"
489 +     except Exception:
490           error_detail = f"Ollama: {e}"
491 
492       raise HTTPException(
⋮
507       url = app.state.config.OLLAMA_BASE_URLS[url_idx]
508       log.info(f"url: {url}")
509 
510 +     r = requests.request(
511 +         method="POST",
512 +         url=f"{url}/api/show",
513 +         data=form_data.model_dump_json(exclude_none=True).encode(),
514 +     )
515       try:
516           r.raise_for_status()
517 
518           return r.json()
⋮
524               res = r.json()
525               if "error" in res:
526                   error_detail = f"Ollama: {res['error']}"
527 +     except Exception:
528           error_detail = f"Ollama: {e}"
529 
530       raise HTTPException(
⋮
547       url_idx: Optional[int] = None,
548       user=Depends(get_verified_user),
549   ):
550 +     if url_idx is None:
551           model = form_data.model
552 
553           if ":" not in model:
⋮
564       url = app.state.config.OLLAMA_BASE_URLS[url_idx]
565       log.info(f"url: {url}")
566 
567 +     r = requests.request(
568 +         method="POST",
569 +         url=f"{url}/api/embeddings",
570 +         data=form_data.model_dump_json(exclude_none=True).encode(),
571 +     )
572       try:
573           r.raise_for_status()
574 
575           return r.json()
⋮
581               res = r.json()
582               if "error" in res:
583                   error_detail = f"Ollama: {res['error']}"
584 +     except Exception:
585           error_detail = f"Ollama: {e}"
586 
587       raise HTTPException(
⋮
594       form_data: GenerateEmbeddingsForm,
595       url_idx: Optional[int] = None,
596   ):
597       log.info(f"generate_ollama_embeddings {form_data}")
598 
599 +     if url_idx is None:
600           model = form_data.model
601 
602           if ":" not in model:
⋮
613       url = app.state.config.OLLAMA_BASE_URLS[url_idx]
614       log.info(f"url: {url}")
615 
616 +     r = requests.request(
617 +         method="POST",
618 +         url=f"{url}/api/embeddings",
619 +         data=form_data.model_dump_json(exclude_none=True).encode(),
620 +     )
621       try:
622           r.raise_for_status()
623 
624           data = r.json()
⋮
628           if "embedding" in data:
629               return data["embedding"]
630           else:
631 +             raise Exception("Something went wrong :/")
632       except Exception as e:
633           log.exception(e)
634           error_detail = "Open WebUI: Server Connection Error"
⋮
637               res = r.json()
638               if "error" in res:
639                   error_detail = f"Ollama: {res['error']}"
640 +     except Exception:
641           error_detail = f"Ollama: {e}"
642 
643 +     raise Exception(error_detail)
644 
645 
646   class GenerateCompletionForm(BaseModel):
647       model: str
648       prompt: str
649 +     images: Optional[list[str]] = None
650       format: Optional[str] = None
651       options: Optional[dict] = None
652       system: Optional[str] = None
⋮
664       url_idx: Optional[int] = None,
665       user=Depends(get_verified_user),
666   ):
667 +     if url_idx is None:
668           model = form_data.model
669 
670           if ":" not in model:
⋮
689   class ChatMessage(BaseModel):
690       role: str
691       content: str
692 +     images: Optional[list[str]] = None
693 
694 
695   class GenerateChatCompletionForm(BaseModel):
696       model: str
697 +     messages: list[ChatMessage]
698       format: Optional[str] = None
699       options: Optional[dict] = None
700       template: Optional[str] = None
⋮
702       keep_alive: Optional[Union[int, str]] = None
703 
704 
705 + def get_ollama_url(url_idx: Optional[int], model: str):
706 +     if url_idx is None:
707 +         if model not in app.state.MODELS:
708 +             raise HTTPException(
709 +                 status_code=400,
710 +                 detail=ERROR_MESSAGES.MODEL_NOT_FOUND(model),
711 +             )
712 +         url_idx = random.choice(app.state.MODELS[model]["urls"])
713 +     url = app.state.config.OLLAMA_BASE_URLS[url_idx]
714 +     return url
715 + 
716 + 
717   @app.post("/api/chat")
718   @app.post("/api/chat/{url_idx}")
719   async def generate_chat_completion(
⋮
721       url_idx: Optional[int] = None,
722       user=Depends(get_verified_user),
723   ):
724 +     log.debug(f"{form_data.model_dump_json(exclude_none=True).encode()}=")
725 
726       payload = {
727           **form_data.model_dump(exclude_none=True, exclude=["metadata"]),
⋮
736           if model_info.base_model_id:
737               payload["model"] = model_info.base_model_id
738 
739 +         params = model_info.params.model_dump()
740 
741 +         if params:
742               if payload.get("options") is None:
743                   payload["options"] = {}
744 
745 +             payload["options"] = apply_model_params_to_body_ollama(
746 +                 params, payload["options"]
747               )
748 +             payload = apply_model_system_prompt_to_body(params, payload, user)
749 
750 +     if ":" not in payload["model"]:
751 +         payload["model"] = f"{payload['model']}:latest"
752 
753 +     url = get_ollama_url(url_idx, payload["model"])
754       log.info(f"url: {url}")
755       log.debug(payload)
756 
⋮
772 
773   class OpenAIChatCompletionForm(BaseModel):
774       model: str
775 +     messages: list[OpenAIChatMessage]
776 
777       model_config = ConfigDict(extra="allow")
778 
⋮
784       url_idx: Optional[int] = None,
785       user=Depends(get_verified_user),
786   ):
787 +     completion_form = OpenAIChatCompletionForm(**form_data)
788 +     payload = {**completion_form.model_dump(exclude_none=True, exclude=["metadata"])}
789       if "metadata" in payload:
790           del payload["metadata"]
791 
792 +     model_id = completion_form.model
793       model_info = Models.get_model_by_id(model_id)
794 
795       if model_info:
796           if model_info.base_model_id:
797               payload["model"] = model_info.base_model_id
798 
799 +         params = model_info.params.model_dump()
800 
801 +         if params:
802 +             payload = apply_model_params_to_body_openai(params, payload)
803 +             payload = apply_model_system_prompt_to_body(params, payload, user)
804 
805 +     if ":" not in payload["model"]:
806 +         payload["model"] = f"{payload['model']}:latest"
807 
808 +     url = get_ollama_url(url_idx, payload["model"])
809       log.info(f"url: {url}")
810 
811       return await post_streaming_url(
⋮
821       url_idx: Optional[int] = None,
822       user=Depends(get_verified_user),
823   ):
824 +     if url_idx is None:
825           models = await get_all_models()
826 
827           if app.state.config.ENABLE_MODEL_FILTER:
⋮
876               res = r.json()
877               if "error" in res:
878                   error_detail = f"Ollama: {res['error']}"
879 +     except Exception:
880           error_detail = f"Ollama: {e}"
881 
882       raise HTTPException(
⋮
902       path_components = parsed_url.path.split("/")
903 
904       # Extract the desired output
905       model_file = path_components[-1]
906 
907       return model_file
⋮
966       url_idx: Optional[int] = None,
967       user=Depends(get_admin_user),
968   ):
969       allowed_hosts = ["https://huggingface.co/", "https://github.com/"]
970 
971       if not any(form_data.url.startswith(host) for host in allowed_hosts):
⋮
974           detail="Invalid file_url. Only URLs from allowed hosts are permitted.",
975       )
976 
977 +     if url_idx is None:
978           url_idx = 0
979       url = app.state.config.OLLAMA_BASE_URLS[url_idx]
980 
⋮
997       url_idx: Optional[int] = None,
998       user=Depends(get_admin_user),
999   ):
1000 +    if url_idx is None:
1001         url_idx = 0
1002     ollama_url = app.state.config.OLLAMA_BASE_URLS[url_idx]
1003 
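The Ollama hunks above replace the long per-parameter `if` chains with helpers imported from `utils.misc`, plus the new `get_ollama_url` lookup. The helper bodies are not shown in this diff, so the sketch below is an assumption based only on the names, the call shape `apply_model_params_to_body_ollama(params, payload["options"])`, and the deleted code (including its `max_tokens` → `num_predict` rename):

```python
def apply_model_params_to_body_ollama(params: dict, options: dict) -> dict:
    """Hypothetical sketch of the extracted helper: copy each configured
    model parameter into the request options unless the caller already
    set it, mirroring the removed per-parameter `if` blocks."""
    # "max_tokens" maps to Ollama's "num_predict", as in the deleted code.
    name_map = {"max_tokens": "num_predict"}
    known_params = [
        "num_batch", "num_keep", "repeat_last_n", "frequency_penalty",
        "temperature", "seed", "stop", "tfs_z", "max_tokens", "top_k",
        "top_p", "min_p", "use_mmap", "use_mlock", "num_thread",
    ]
    for key in known_params:
        value = params.get(key)
        target = name_map.get(key, key)
        # Only fill options the request did not already specify.
        if value is not None and options.get(target) is None:
            options[target] = value
    return options
```

One table-driven loop replaces roughly 120 removed lines, and `get_ollama_url` likewise centralizes the repeated "pick a random URL for the model or raise MODEL_NOT_FOUND" logic that both endpoints previously duplicated.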
backend/apps/openai/main.py
CHANGED
@@ -17,7 +17,10 @@ from utils.utils import (
17       get_verified_user,
18       get_admin_user,
19   )
20 -  from utils.misc import …
21 
22   from config import (
23       SRC_LOG_LEVELS,
@@ -30,7 +33,7 @@ from config import (
30       MODEL_FILTER_LIST,
31       AppConfig,
32   )
33 -  from typing import …
34 
35 
36   import hashlib
@@ -86,11 +89,11 @@ async def update_config(form_data: OpenAIConfigForm, user=Depends(get_admin_user
86 
87 
88   class UrlsUpdateForm(BaseModel):
89 -     urls: List[str]
90 
91 
92   class KeysUpdateForm(BaseModel):
93 -     keys: List[str]
94 
95 
96   @app.get("/urls")
@@ -368,7 +371,7 @@ async def generate_chat_completion(
368          payload["model"] = model_info.base_model_id
369 
370          params = model_info.params.model_dump()
371 -        payload = …
372          payload = apply_model_system_prompt_to_body(params, payload, user)
373 
374      model = app.state.MODELS[payload.get("model")]

17       get_verified_user,
18       get_admin_user,
19   )
20 +  from utils.misc import (
21 +      apply_model_params_to_body_openai,
22 +      apply_model_system_prompt_to_body,
23 +  )
24 
25   from config import (
26       SRC_LOG_LEVELS,
⋮
33       MODEL_FILTER_LIST,
34       AppConfig,
35   )
36 +  from typing import Optional, Literal, overload
37 
38 
39   import hashlib
⋮
89 
90 
91   class UrlsUpdateForm(BaseModel):
92 +     urls: list[str]
93 
94 
95   class KeysUpdateForm(BaseModel):
96 +     keys: list[str]
97 
98 
99   @app.get("/urls")
⋮
371          payload["model"] = model_info.base_model_id
372 
373          params = model_info.params.model_dump()
374 +        payload = apply_model_params_to_body_openai(params, payload)
375          payload = apply_model_system_prompt_to_body(params, payload, user)
376 
377      model = app.state.MODELS[payload.get("model")]
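The OpenAI app gets the matching `apply_model_params_to_body_openai` helper. Its body is not shown in this diff either; a hedged sketch, assuming the same "don't override caller-supplied values" convention as the Ollama variant, with the difference that OpenAI-style bodies carry sampling parameters at the top level of the payload rather than under an `options` key (the exact key list here is an assumption):

```python
def apply_model_params_to_body_openai(params: dict, payload: dict) -> dict:
    """Hypothetical sketch: merge configured model parameters into an
    OpenAI-style request body, leaving explicit request values intact."""
    openai_keys = ("temperature", "top_p", "max_tokens", "frequency_penalty", "seed", "stop")
    for key in openai_keys:
        value = params.get(key)
        # Parameters live at the top level of the body, not in "options".
        if value is not None and payload.get(key) is None:
            payload[key] = value
    return payload
```

Splitting the helper into `_ollama` and `_openai` variants keeps the key-name differences (e.g. `num_predict` vs `max_tokens`) out of the endpoint handlers.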
backend/apps/rag/main.py
CHANGED
@@ -13,7 +13,7 @@ import os, shutil, logging, re
13   from datetime import datetime
14 
15   from pathlib import Path
16 - from typing import …
17 
18   from chromadb.utils.batch_utils import create_batches
19   from langchain_core.documents import Document
@@ -376,7 +376,7 @@ async def update_reranking_config(
376      try:
377          app.state.config.RAG_RERANKING_MODEL = form_data.reranking_model
378 
379 -        update_reranking_model(app.state.config.RAG_RERANKING_MODEL…
380 
381          return {
382              "status": True,
@@ -439,7 +439,7 @@ class ChunkParamUpdateForm(BaseModel):
439 
440 
441  class YoutubeLoaderConfig(BaseModel):
442 -    language: List[str]
443      translation: Optional[str] = None
444 
445 
@@ -642,7 +642,7 @@ def query_doc_handler(
642 
643 
644  class QueryCollectionsForm(BaseModel):
645 -    collection_names: List[str]
646      query: str
647      k: Optional[int] = None
648      r: Optional[float] = None
@@ -1021,7 +1021,7 @@ class TikaLoader:
1021         self.file_path = file_path
1022         self.mime_type = mime_type
1023 
1024 -       def load(self) -> List[Document]:
1025         with open(self.file_path, "rb") as f:
1026             data = f.read()
1027 
@@ -1185,7 +1185,7 @@ def store_doc(
1185         f.close()
1186 
1187         f = open(file_path, "rb")
1188 -       if collection_name == None:
1189             collection_name = calculate_sha256(f)[:63]
1190         f.close()
1191 
@@ -1238,7 +1238,7 @@ def process_doc(
1238         f = open(file_path, "rb")
1239 
1240         collection_name = form_data.collection_name
1241 -       if collection_name == None:
1242             collection_name = calculate_sha256(f)[:63]
1243         f.close()
1244 
@@ -1296,7 +1296,7 @@ def store_text(
1296     ):
1297 
1298         collection_name = form_data.collection_name
1299 -       if collection_name == None:
1300             collection_name = calculate_sha256_string(form_data.content)
1301 
1302         result = store_text_in_vector_db(
@@ -1339,7 +1339,7 @@ def scan_docs_dir(user=Depends(get_admin_user)):
1339         sanitized_filename = sanitize_filename(filename)
1340         doc = Documents.get_doc_by_name(sanitized_filename)
1341 
1342 -       if doc == None:
1343             doc = Documents.insert_new_doc(
1344                 user.id,
1345                 DocumentForm(

13   from datetime import datetime
14 
15   from pathlib import Path
16 + from typing import Union, Sequence, Iterator, Any
17 
18   from chromadb.utils.batch_utils import create_batches
19   from langchain_core.documents import Document
⋮
376      try:
377          app.state.config.RAG_RERANKING_MODEL = form_data.reranking_model
378 
379 +        update_reranking_model(app.state.config.RAG_RERANKING_MODEL, True)
380 
381          return {
382              "status": True,
⋮
439 
440 
441  class YoutubeLoaderConfig(BaseModel):
442 +    language: list[str]
443      translation: Optional[str] = None
444 
445 
⋮
642 
643 
644  class QueryCollectionsForm(BaseModel):
645 +    collection_names: list[str]
646      query: str
647      k: Optional[int] = None
648      r: Optional[float] = None
⋮
1021         self.file_path = file_path
1022         self.mime_type = mime_type
1023 
1024 +       def load(self) -> list[Document]:
1025         with open(self.file_path, "rb") as f:
1026             data = f.read()
1027 
⋮
1185         f.close()
1186 
1187         f = open(file_path, "rb")
1188 +       if collection_name is None:
1189             collection_name = calculate_sha256(f)[:63]
1190         f.close()
1191 
⋮
1238         f = open(file_path, "rb")
1239 
1240         collection_name = form_data.collection_name
1241 +       if collection_name is None:
1242             collection_name = calculate_sha256(f)[:63]
1243         f.close()
1244 
⋮
1296     ):
1297 
1298         collection_name = form_data.collection_name
1299 +       if collection_name is None:
1300             collection_name = calculate_sha256_string(form_data.content)
1301 
1302         result = store_text_in_vector_db(
⋮
1339         sanitized_filename = sanitize_filename(filename)
1340         doc = Documents.get_doc_by_name(sanitized_filename)
1341 
1342 +       if doc is None:
1343             doc = Documents.insert_new_doc(
1344                 user.id,
1345                 DocumentForm(
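The recurring `List[str]` → `list[str]` edits across these files rely on PEP 585 builtin generics (Python 3.9+), which make `typing.List` and friends unnecessary as annotations; only `Optional`, `Union`, and similar special forms still need the `typing` import. A quick illustration of the modernized style (the function name here is purely illustrative):

```python
from typing import Optional

def first_or(items: list[str], fallback: Optional[str] = None) -> Optional[str]:
    # Builtin `list` is usable directly as a generic annotation on
    # Python 3.9+, so no `from typing import List` is required.
    return items[0] if items else fallback
```

The same PEP also explains the related `== None` → `is None` cleanups seen above only by coincidence of this commit; identity comparison with `None` is the PEP 8 convention regardless of Python version.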
backend/apps/rag/search/brave.py
CHANGED
@@ -1,5 +1,5 @@
1   import logging
2 - from typing import …
3   import requests
4 
5   from apps.rag.search.main import SearchResult, get_filtered_results
@@ -10,7 +10,7 @@ log.setLevel(SRC_LOG_LEVELS["RAG"])
10 
11 
12  def search_brave(
13 -    api_key: str, query: str, count: int, filter_list: Optional[List[str]] = None
14  ) -> list[SearchResult]:
15      """Search using Brave's Search API and return the results as a list of SearchResult objects.
16 

1   import logging
2 + from typing import Optional
3   import requests
4 
5   from apps.rag.search.main import SearchResult, get_filtered_results
⋮
10 
11 
12  def search_brave(
13 +    api_key: str, query: str, count: int, filter_list: Optional[list[str]] = None
14  ) -> list[SearchResult]:
15      """Search using Brave's Search API and return the results as a list of SearchResult objects.
16 
backend/apps/rag/search/duckduckgo.py
CHANGED
@@ -1,5 +1,5 @@
1   import logging
2 - from typing import …
3   from apps.rag.search.main import SearchResult, get_filtered_results
4   from duckduckgo_search import DDGS
5   from config import SRC_LOG_LEVELS
@@ -9,7 +9,7 @@ log.setLevel(SRC_LOG_LEVELS["RAG"])
9 
10 
11  def search_duckduckgo(
12 -    query: str, count: int, filter_list: Optional[List[str]] = None
13  ) -> list[SearchResult]:
14      """
15      Search using DuckDuckGo's Search API and return the results as a list of SearchResult objects.
@@ -18,7 +18,7 @@ def search_duckduckgo(
18          count (int): The number of results to return
19 
20      Returns:
21 -        List[SearchResult]: A list of search results
22      """
23      # Use the DDGS context manager to create a DDGS object
24      with DDGS() as ddgs:

1   import logging
2 + from typing import Optional
3   from apps.rag.search.main import SearchResult, get_filtered_results
4   from duckduckgo_search import DDGS
5   from config import SRC_LOG_LEVELS
⋮
9 
10 
11  def search_duckduckgo(
12 +    query: str, count: int, filter_list: Optional[list[str]] = None
13  ) -> list[SearchResult]:
14      """
15      Search using DuckDuckGo's Search API and return the results as a list of SearchResult objects.
⋮
18          count (int): The number of results to return
19 
20      Returns:
21 +        list[SearchResult]: A list of search results
22      """
23      # Use the DDGS context manager to create a DDGS object
24      with DDGS() as ddgs:
backend/apps/rag/search/google_pse.py
CHANGED
@@ -1,6 +1,6 @@
1   import json
2   import logging
3 - from typing import …
4   import requests
5 
6   from apps.rag.search.main import SearchResult, get_filtered_results
@@ -15,7 +15,7 @@ def search_google_pse(
15      search_engine_id: str,
16      query: str,
17      count: int,
18 -    filter_list: Optional[List[str]] = None,
19  ) -> list[SearchResult]:
20      """Search using Google's Programmable Search Engine API and return the results as a list of SearchResult objects.
21 

1   import json
2   import logging
3 + from typing import Optional
4   import requests
5 
6   from apps.rag.search.main import SearchResult, get_filtered_results
⋮
15      search_engine_id: str,
16      query: str,
17      count: int,
18 +    filter_list: Optional[list[str]] = None,
19  ) -> list[SearchResult]:
20      """Search using Google's Programmable Search Engine API and return the results as a list of SearchResult objects.
21 
backend/apps/rag/search/jina_search.py
CHANGED
@@ -17,7 +17,7 @@ def search_jina(query: str, count: int) -> list[SearchResult]:
17      count (int): The number of results to return
18 
19      Returns:
20 -        List[SearchResult]: A list of search results
21      """
22      jina_search_endpoint = "https://s.jina.ai/"
23      headers = {

17      count (int): The number of results to return
18 
19      Returns:
20 +        list[SearchResult]: A list of search results
21      """
22      jina_search_endpoint = "https://s.jina.ai/"
23      headers = {
backend/apps/rag/search/searxng.py
CHANGED
@@ -1,7 +1,7 @@
 import logging
 import requests
 
-from typing import List, Optional
+from typing import Optional
 
 from apps.rag.search.main import SearchResult, get_filtered_results
 from config import SRC_LOG_LEVELS
@@ -14,9 +14,9 @@ def search_searxng(
     query_url: str,
     query: str,
     count: int,
-    filter_list: Optional[List[str]] = None,
+    filter_list: Optional[list[str]] = None,
     **kwargs,
-) -> List[SearchResult]:
+) -> list[SearchResult]:
     """
     Search a SearXNG instance for a given query and return the results as a list of SearchResult objects.
 
@@ -31,10 +31,10 @@ def search_searxng(
        language (str): Language filter for the search results; e.g., "en-US". Defaults to an empty string.
        safesearch (int): Safe search filter for safer web results; 0 = off, 1 = moderate, 2 = strict. Defaults to 1 (moderate).
        time_range (str): Time range for filtering results by date; e.g., "2023-04-05..today" or "all-time". Defaults to ''.
-       categories: (Optional[List[str]]): Specific categories within which the search should be performed, defaulting to an empty string if not provided.
+       categories: (Optional[list[str]]): Specific categories within which the search should be performed, defaulting to an empty string if not provided.
 
     Returns:
-       List[SearchResult]: A list of SearchResults sorted by relevance score in descending order.
+       list[SearchResult]: A list of SearchResults sorted by relevance score in descending order.
 
     Raise:
        requests.exceptions.RequestException: If a request error occurs during the search process.
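The change repeated across all the search backends above is the PEP 585 migration (Python 3.9+): built-in containers such as `list` are subscriptable, so `typing.List` is redundant in annotations. A minimal sketch of the migrated signature style, using placeholder names rather than the app's real `SearchResult` type:

```python
from typing import Optional

# PEP 585: annotate with the built-in list directly; only Optional
# still needs to come from the typing module on Python < 3.10.
def filter_results(
    results: list[str], filter_list: Optional[list[str]] = None
) -> list[str]:
    """Keep only results whose value appears in filter_list, if one is given."""
    if filter_list is None:
        return results
    return [r for r in results if r in filter_list]

print(filter_results(["a.com", "b.com", "c.com"], ["b.com", "c.com"]))  # ['b.com', 'c.com']
```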
backend/apps/rag/search/serper.py
CHANGED
@@ -1,6 +1,6 @@
 import json
 import logging
-from typing import List, Optional
+from typing import Optional
 import requests
 
 from apps.rag.search.main import SearchResult, get_filtered_results
@@ -11,7 +11,7 @@ log.setLevel(SRC_LOG_LEVELS["RAG"])
 
 
 def search_serper(
-    api_key: str, query: str, count: int, filter_list: Optional[List[str]] = None
+    api_key: str, query: str, count: int, filter_list: Optional[list[str]] = None
 ) -> list[SearchResult]:
     """Search using serper.dev's API and return the results as a list of SearchResult objects.
 
backend/apps/rag/search/serply.py
CHANGED
@@ -1,6 +1,6 @@
 import json
 import logging
-from typing import List, Optional
+from typing import Optional
 import requests
 from urllib.parse import urlencode
 
@@ -19,7 +19,7 @@ def search_serply(
     limit: int = 10,
     device_type: str = "desktop",
     proxy_location: str = "US",
-    filter_list: Optional[List[str]] = None,
+    filter_list: Optional[list[str]] = None,
 ) -> list[SearchResult]:
     """Search using serper.dev's API and return the results as a list of SearchResult objects.
 
backend/apps/rag/search/serpstack.py
CHANGED
@@ -1,6 +1,6 @@
 import json
 import logging
-from typing import List, Optional
+from typing import Optional
 import requests
 
 from apps.rag.search.main import SearchResult, get_filtered_results
@@ -14,7 +14,7 @@ def search_serpstack(
     api_key: str,
     query: str,
     count: int,
-    filter_list: Optional[List[str]] = None,
+    filter_list: Optional[list[str]] = None,
     https_enabled: bool = True,
 ) -> list[SearchResult]:
     """Search using serpstack.com's and return the results as a list of SearchResult objects.
backend/apps/rag/search/tavily.py
CHANGED
@@ -17,7 +17,7 @@ def search_tavily(api_key: str, query: str, count: int) -> list[SearchResult]:
         query (str): The query to search for
 
     Returns:
-        List[SearchResult]: A list of search results
+        list[SearchResult]: A list of search results
     """
     url = "https://api.tavily.com/search"
     data = {"query": query, "api_key": api_key}
backend/apps/rag/utils.py
CHANGED
@@ -2,7 +2,7 @@ import os
 import logging
 import requests
 
-from typing import List, Union
+from typing import Union
 
 from apps.ollama.main import (
     generate_ollama_embeddings,
@@ -142,7 +142,7 @@ def merge_and_sort_query_results(query_results, k, reverse=False):
 
 
 def query_collection(
-    collection_names: List[str],
+    collection_names: list[str],
     query: str,
     embedding_function,
     k: int,
@@ -157,13 +157,13 @@ def query_collection(
                 embedding_function=embedding_function,
             )
             results.append(result)
-        except:
+        except Exception:
             pass
     return merge_and_sort_query_results(results, k=k)
 
 
 def query_collection_with_hybrid_search(
-    collection_names: List[str],
+    collection_names: list[str],
     query: str,
     embedding_function,
     k: int,
@@ -182,7 +182,7 @@ def query_collection_with_hybrid_search(
                 r=r,
             )
             results.append(result)
-        except:
+        except Exception:
             pass
     return merge_and_sort_query_results(results, k=k, reverse=True)
 
@@ -411,7 +411,7 @@ class ChromaRetriever(BaseRetriever):
         query: str,
         *,
         run_manager: CallbackManagerForRetrieverRun,
-    ) -> List[Document]:
+    ) -> list[Document]:
         query_embeddings = self.embedding_function(query)
 
         results = self.collection.query(
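The other fix repeated throughout this commit replaces bare `except:` with `except Exception:`. The difference is not cosmetic: a bare clause also catches `BaseException` subclasses such as `SystemExit` and `KeyboardInterrupt`, which should normally propagate. A self-contained demonstration (not the app's code):

```python
def swallow_everything() -> str:
    try:
        raise SystemExit(1)
    except:  # noqa: E722 -- intentionally bare for the demo
        return "caught"


def swallow_errors_only() -> str:
    try:
        raise SystemExit(1)
    except Exception:  # SystemExit is not an Exception subclass, so it escapes
        return "caught"


print(swallow_everything())  # 'caught' -- the interpreter-exit request was silently eaten
try:
    swallow_errors_only()
except SystemExit:
    print("SystemExit propagated as it should")
```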
backend/apps/webui/main.py
CHANGED
@@ -22,7 +22,7 @@ from apps.webui.utils import load_function_module_by_id
 from utils.misc import (
     openai_chat_chunk_message_template,
     openai_chat_completion_message_template,
-
+    apply_model_params_to_body_openai,
     apply_model_system_prompt_to_body,
 )
 
@@ -46,6 +46,7 @@ from config import (
     AppConfig,
     OAUTH_USERNAME_CLAIM,
     OAUTH_PICTURE_CLAIM,
+    OAUTH_EMAIL_CLAIM,
 )
 
 from apps.socket.main import get_event_call, get_event_emitter
@@ -84,6 +85,7 @@ app.state.config.ENABLE_COMMUNITY_SHARING = ENABLE_COMMUNITY_SHARING
 
 app.state.config.OAUTH_USERNAME_CLAIM = OAUTH_USERNAME_CLAIM
 app.state.config.OAUTH_PICTURE_CLAIM = OAUTH_PICTURE_CLAIM
+app.state.config.OAUTH_EMAIL_CLAIM = OAUTH_EMAIL_CLAIM
 
 app.state.MODELS = {}
 app.state.TOOLS = {}
@@ -289,7 +291,7 @@ async def generate_function_chat_completion(form_data, user):
        form_data["model"] = model_info.base_model_id
 
     params = model_info.params.model_dump()
-    form_data =
+    form_data = apply_model_params_to_body_openai(params, form_data)
     form_data = apply_model_system_prompt_to_body(params, form_data, user)
 
     pipe_id = get_pipe_id(form_data)
backend/apps/webui/models/auths.py
CHANGED
@@ -140,7 +140,7 @@ class AuthsTable:
                 return None
             else:
                 return None
-        except:
+        except Exception:
             return None
 
     def authenticate_user_by_api_key(self, api_key: str) -> Optional[UserModel]:
@@ -152,7 +152,7 @@ class AuthsTable:
         try:
             user = Users.get_user_by_api_key(api_key)
             return user if user else None
-        except:
+        except Exception:
             return False
 
     def authenticate_user_by_trusted_header(self, email: str) -> Optional[UserModel]:
@@ -163,7 +163,7 @@ class AuthsTable:
             if auth:
                 user = Users.get_user_by_id(auth.id)
                 return user
-        except:
+        except Exception:
             return None
 
     def update_user_password_by_id(self, id: str, new_password: str) -> bool:
@@ -174,7 +174,7 @@ class AuthsTable:
             )
             db.commit()
             return True if result == 1 else False
-        except:
+        except Exception:
             return False
 
     def update_email_by_id(self, id: str, email: str) -> bool:
@@ -183,7 +183,7 @@ class AuthsTable:
             result = db.query(Auth).filter_by(id=id).update({"email": email})
             db.commit()
             return True if result == 1 else False
-        except:
+        except Exception:
             return False
 
     def delete_auth_by_id(self, id: str) -> bool:
@@ -200,7 +200,7 @@ class AuthsTable:
                 return True
             else:
                 return False
-        except:
+        except Exception:
             return False
 
 
backend/apps/webui/models/chats.py
CHANGED
@@ -1,5 +1,5 @@
 from pydantic import BaseModel, ConfigDict
-from typing import List, Union, Optional
+from typing import Union, Optional
 
 import json
 import uuid
@@ -164,7 +164,7 @@ class ChatTable:
             db.refresh(chat)
 
             return self.get_chat_by_id(chat.share_id)
-        except:
+        except Exception:
             return None
 
     def delete_shared_chat_by_chat_id(self, chat_id: str) -> bool:
@@ -175,7 +175,7 @@ class ChatTable:
             db.commit()
 
             return True
-        except:
+        except Exception:
             return False
 
     def update_chat_share_id_by_id(
@@ -189,7 +189,7 @@ class ChatTable:
             db.commit()
             db.refresh(chat)
             return ChatModel.model_validate(chat)
-        except:
+        except Exception:
             return None
 
     def toggle_chat_archive_by_id(self, id: str) -> Optional[ChatModel]:
@@ -201,7 +201,7 @@ class ChatTable:
             db.commit()
             db.refresh(chat)
             return ChatModel.model_validate(chat)
-        except:
+        except Exception:
             return None
 
     def archive_all_chats_by_user_id(self, user_id: str) -> bool:
@@ -210,12 +210,12 @@ class ChatTable:
             db.query(Chat).filter_by(user_id=user_id).update({"archived": True})
             db.commit()
             return True
-        except:
+        except Exception:
             return False
 
     def get_archived_chat_list_by_user_id(
         self, user_id: str, skip: int = 0, limit: int = 50
-    ) -> List[ChatModel]:
+    ) -> list[ChatModel]:
         with get_db() as db:
 
             all_chats = (
@@ -233,7 +233,7 @@ class ChatTable:
         include_archived: bool = False,
         skip: int = 0,
         limit: int = 50,
-    ) -> List[ChatModel]:
+    ) -> list[ChatModel]:
         with get_db() as db:
             query = db.query(Chat).filter_by(user_id=user_id)
             if not include_archived:
@@ -251,7 +251,7 @@ class ChatTable:
         include_archived: bool = False,
         skip: int = 0,
         limit: int = -1,
-    ) -> List[ChatTitleIdResponse]:
+    ) -> list[ChatTitleIdResponse]:
         with get_db() as db:
             query = db.query(Chat).filter_by(user_id=user_id)
             if not include_archived:
@@ -279,8 +279,8 @@ class ChatTable:
         ]
 
     def get_chat_list_by_chat_ids(
-        self, chat_ids: List[str], skip: int = 0, limit: int = 50
-    ) -> List[ChatModel]:
+        self, chat_ids: list[str], skip: int = 0, limit: int = 50
+    ) -> list[ChatModel]:
         with get_db() as db:
             all_chats = (
                 db.query(Chat)
@@ -297,7 +297,7 @@ class ChatTable:
 
             chat = db.get(Chat, id)
             return ChatModel.model_validate(chat)
-        except:
+        except Exception:
             return None
 
     def get_chat_by_share_id(self, id: str) -> Optional[ChatModel]:
@@ -319,10 +319,10 @@ class ChatTable:
 
             chat = db.query(Chat).filter_by(id=id, user_id=user_id).first()
             return ChatModel.model_validate(chat)
-        except:
+        except Exception:
             return None
 
-    def get_chats(self, skip: int = 0, limit: int = 50) -> List[ChatModel]:
+    def get_chats(self, skip: int = 0, limit: int = 50) -> list[ChatModel]:
         with get_db() as db:
 
             all_chats = (
@@ -332,7 +332,7 @@ class ChatTable:
             )
             return [ChatModel.model_validate(chat) for chat in all_chats]
 
-    def get_chats_by_user_id(self, user_id: str) -> List[ChatModel]:
+    def get_chats_by_user_id(self, user_id: str) -> list[ChatModel]:
         with get_db() as db:
 
             all_chats = (
@@ -342,7 +342,7 @@ class ChatTable:
             )
             return [ChatModel.model_validate(chat) for chat in all_chats]
 
-    def get_archived_chats_by_user_id(self, user_id: str) -> List[ChatModel]:
+    def get_archived_chats_by_user_id(self, user_id: str) -> list[ChatModel]:
         with get_db() as db:
 
             all_chats = (
@@ -360,7 +360,7 @@ class ChatTable:
             db.commit()
 
             return True and self.delete_shared_chat_by_chat_id(id)
-        except:
+        except Exception:
             return False
 
     def delete_chat_by_id_and_user_id(self, id: str, user_id: str) -> bool:
@@ -371,7 +371,7 @@ class ChatTable:
             db.commit()
 
             return True and self.delete_shared_chat_by_chat_id(id)
-        except:
+        except Exception:
             return False
 
     def delete_chats_by_user_id(self, user_id: str) -> bool:
@@ -385,7 +385,7 @@ class ChatTable:
             db.commit()
 
             return True
-        except:
+        except Exception:
             return False
 
     def delete_shared_chats_by_user_id(self, user_id: str) -> bool:
@@ -400,7 +400,7 @@ class ChatTable:
             db.commit()
 
             return True
-        except:
+        except Exception:
             return False
 
 
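The table classes touched above all share one access pattern: open a session through a context manager, validate the row into a model, and translate any failure into `None`. A hedged sketch of that shape, where `get_db`, `ChatModel`, and the in-memory row store are illustrative stand-ins rather than the app's real implementations:

```python
from contextlib import contextmanager
from dataclasses import dataclass
from typing import Optional


@dataclass
class ChatModel:  # stand-in for the app's pydantic model
    id: str
    title: str


_ROWS = {"chat-1": {"id": "chat-1", "title": "hello"}}


@contextmanager
def get_db():
    # A real app would yield (and eventually close) a SQLAlchemy session.
    yield _ROWS


def get_chat_by_id(id: str) -> Optional[ChatModel]:
    try:
        with get_db() as db:
            return ChatModel(**db[id])
    except Exception:  # a missing id raises KeyError, mapped to None;
        return None    # KeyboardInterrupt/SystemExit still escape


print(get_chat_by_id("chat-1"))
print(get_chat_by_id("missing"))  # None
```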
backend/apps/webui/models/documents.py
CHANGED
@@ -1,5 +1,5 @@
 from pydantic import BaseModel, ConfigDict
-from typing import List, Optional
+from typing import Optional
 import time
 import logging
 
@@ -93,7 +93,7 @@ class DocumentsTable:
                 return DocumentModel.model_validate(result)
             else:
                 return None
-        except:
+        except Exception:
             return None
 
     def get_doc_by_name(self, name: str) -> Optional[DocumentModel]:
@@ -102,10 +102,10 @@ class DocumentsTable:
 
             document = db.query(Document).filter_by(name=name).first()
             return DocumentModel.model_validate(document) if document else None
-        except:
+        except Exception:
             return None
 
-    def get_docs(self) -> List[DocumentModel]:
+    def get_docs(self) -> list[DocumentModel]:
         with get_db() as db:
 
             return [
@@ -160,7 +160,7 @@ class DocumentsTable:
             db.query(Document).filter_by(name=name).delete()
             db.commit()
             return True
-        except:
+        except Exception:
             return False
 
 
backend/apps/webui/models/files.py
CHANGED
@@ -1,5 +1,5 @@
 from pydantic import BaseModel, ConfigDict
-from typing import List, Union, Optional
+from typing import Union, Optional
 import time
 import logging
 
@@ -90,10 +90,10 @@ class FilesTable:
         try:
             file = db.get(File, id)
             return FileModel.model_validate(file)
-        except:
+        except Exception:
             return None
 
-    def get_files(self) -> List[FileModel]:
+    def get_files(self) -> list[FileModel]:
         with get_db() as db:
 
             return [FileModel.model_validate(file) for file in db.query(File).all()]
@@ -107,7 +107,7 @@ class FilesTable:
             db.commit()
 
             return True
-        except:
+        except Exception:
             return False
 
     def delete_all_files(self) -> bool:
@@ -119,7 +119,7 @@ class FilesTable:
             db.commit()
 
             return True
-        except:
+        except Exception:
             return False
 
 
backend/apps/webui/models/functions.py
CHANGED
@@ -1,5 +1,5 @@
 from pydantic import BaseModel, ConfigDict
-from typing import List, Union, Optional
+from typing import Union, Optional
 import time
 import logging
 
@@ -122,10 +122,10 @@ class FunctionsTable:
 
             function = db.get(Function, id)
             return FunctionModel.model_validate(function)
-        except:
+        except Exception:
             return None
 
-    def get_functions(self, active_only=False) -> List[FunctionModel]:
+    def get_functions(self, active_only=False) -> list[FunctionModel]:
         with get_db() as db:
 
             if active_only:
@@ -141,7 +141,7 @@ class FunctionsTable:
 
     def get_functions_by_type(
         self, type: str, active_only=False
-    ) -> List[FunctionModel]:
+    ) -> list[FunctionModel]:
         with get_db() as db:
 
             if active_only:
@@ -157,7 +157,7 @@ class FunctionsTable:
                 for function in db.query(Function).filter_by(type=type).all()
             ]
 
-    def get_global_filter_functions(self) -> List[FunctionModel]:
+    def get_global_filter_functions(self) -> list[FunctionModel]:
         with get_db() as db:
 
             return [
@@ -167,7 +167,7 @@ class FunctionsTable:
                 .all()
             ]
 
-    def get_global_action_functions(self) -> List[FunctionModel]:
+    def get_global_action_functions(self) -> list[FunctionModel]:
         with get_db() as db:
             return [
                 FunctionModel.model_validate(function)
@@ -198,7 +198,7 @@ class FunctionsTable:
             db.commit()
             db.refresh(function)
             return self.get_function_by_id(id)
-        except:
+        except Exception:
             return None
 
     def get_user_valves_by_id_and_user_id(
@@ -256,7 +256,7 @@ class FunctionsTable:
             )
             db.commit()
             return self.get_function_by_id(id)
-        except:
+        except Exception:
             return None
 
     def deactivate_all_functions(self) -> Optional[bool]:
@@ -271,7 +271,7 @@ class FunctionsTable:
             )
             db.commit()
             return True
-        except:
+        except Exception:
             return None
 
     def delete_function_by_id(self, id: str) -> bool:
@@ -281,7 +281,7 @@ class FunctionsTable:
             db.commit()
 
             return True
-        except:
+        except Exception:
             return False
 
 
backend/apps/webui/models/memories.py
CHANGED
@@ -1,5 +1,5 @@
 from pydantic import BaseModel, ConfigDict
-from typing import List, Union, Optional
+from typing import Union, Optional
 
 from sqlalchemy import Column, String, BigInteger, Text
 
@@ -80,25 +80,25 @@ class MemoriesTable:
             )
             db.commit()
             return self.get_memory_by_id(id)
-        except:
+        except Exception:
             return None
 
-    def get_memories(self) -> List[MemoryModel]:
+    def get_memories(self) -> list[MemoryModel]:
         with get_db() as db:
 
             try:
                 memories = db.query(Memory).all()
                 return [MemoryModel.model_validate(memory) for memory in memories]
-            except:
+            except Exception:
                 return None
 
-    def get_memories_by_user_id(self, user_id: str) -> List[MemoryModel]:
+    def get_memories_by_user_id(self, user_id: str) -> list[MemoryModel]:
         with get_db() as db:
 
             try:
                 memories = db.query(Memory).filter_by(user_id=user_id).all()
                 return [MemoryModel.model_validate(memory) for memory in memories]
-            except:
+            except Exception:
                 return None
 
     def get_memory_by_id(self, id: str) -> Optional[MemoryModel]:
@@ -107,7 +107,7 @@ class MemoriesTable:
         try:
             memory = db.get(Memory, id)
             return MemoryModel.model_validate(memory)
-        except:
+        except Exception:
             return None
 
     def delete_memory_by_id(self, id: str) -> bool:
@@ -119,7 +119,7 @@ class MemoriesTable:
 
                 return True
 
-            except:
+            except Exception:
                 return False
 
     def delete_memories_by_user_id(self, user_id: str) -> bool:
@@ -130,7 +130,7 @@ class MemoriesTable:
             db.commit()
 
             return True
-        except:
+        except Exception:
             return False
 
     def delete_memory_by_id_and_user_id(self, id: str, user_id: str) -> bool:
@@ -141,7 +141,7 @@ class MemoriesTable:
             db.commit()
 
             return True
-        except:
+        except Exception:
             return False
 
 
backend/apps/webui/models/models.py
CHANGED
@@ -137,7 +137,7 @@ class ModelsTable:
             print(e)
             return None
 
-    def get_all_models(self) -> List[ModelModel]:
+    def get_all_models(self) -> list[ModelModel]:
         with get_db() as db:
             return [ModelModel.model_validate(model) for model in db.query(Model).all()]
 
@@ -146,7 +146,7 @@ class ModelsTable:
             with get_db() as db:
                 model = db.get(Model, id)
                 return ModelModel.model_validate(model)
-        except:
+        except Exception:
             return None
 
     def update_model_by_id(self, id: str, model: ModelForm) -> Optional[ModelModel]:
@@ -175,7 +175,7 @@ class ModelsTable:
             db.commit()
 
             return True
-        except:
+        except Exception:
             return False
 
 
backend/apps/webui/models/prompts.py
CHANGED
@@ -1,5 +1,5 @@
 from pydantic import BaseModel, ConfigDict
-from typing import List, Optional
+from typing import Optional
 import time
 
 from sqlalchemy import String, Column, BigInteger, Text
@@ -79,10 +79,10 @@ class PromptsTable:
 
             prompt = db.query(Prompt).filter_by(command=command).first()
             return PromptModel.model_validate(prompt)
-        except:
+        except Exception:
             return None
 
-    def get_prompts(self) -> List[PromptModel]:
+    def get_prompts(self) -> list[PromptModel]:
         with get_db() as db:
 
             return [
@@ -101,7 +101,7 @@ class PromptsTable:
             prompt.timestamp = int(time.time())
             db.commit()
             return PromptModel.model_validate(prompt)
-        except:
+        except Exception:
             return None
 
     def delete_prompt_by_command(self, command: str) -> bool:
@@ -112,7 +112,7 @@ class PromptsTable:
             db.commit()
 
             return True
-        except:
+        except Exception:
             return False
 
 
backend/apps/webui/models/tags.py
CHANGED
@@ -1,5 +1,5 @@
 from pydantic import BaseModel, ConfigDict
-from typing import
+from typing import Optional
 
 import json
 import uuid
@@ -69,11 +69,11 @@ class ChatIdTagForm(BaseModel):
 
 
 class TagChatIdsResponse(BaseModel):
-    chat_ids:
+    chat_ids: list[str]
 
 
 class ChatTagsResponse(BaseModel):
-    tags:
+    tags: list[str]
 
 
 class TagTable:
@@ -109,7 +109,7 @@ class TagTable:
         self, user_id: str, form_data: ChatIdTagForm
     ) -> Optional[ChatIdTagModel]:
         tag = self.get_tag_by_name_and_user_id(form_data.tag_name, user_id)
-        if tag
+        if tag is None:
             tag = self.insert_new_tag(form_data.tag_name, user_id)
 
         id = str(uuid.uuid4())
@@ -132,10 +132,10 @@ class TagTable:
                 return ChatIdTagModel.model_validate(result)
             else:
                 return None
-        except:
+        except Exception:
             return None
 
-    def get_tags_by_user_id(self, user_id: str) ->
+    def get_tags_by_user_id(self, user_id: str) -> list[TagModel]:
         with get_db() as db:
             tag_names = [
                 chat_id_tag.tag_name
@@ -159,7 +159,7 @@ class TagTable:
 
     def get_tags_by_chat_id_and_user_id(
         self, chat_id: str, user_id: str
-    ) ->
+    ) -> list[TagModel]:
         with get_db() as db:
 
             tag_names = [
@@ -184,7 +184,7 @@ class TagTable:
 
     def get_chat_ids_by_tag_name_and_user_id(
         self, tag_name: str, user_id: str
-    ) ->
+    ) -> list[ChatIdTagModel]:
         with get_db() as db:
 
             return [
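The `if tag is None:` change above tightens the create-when-missing check to an identity comparison. A truthiness test would also fire on falsy-but-present values, while `is None` only matches a genuinely absent record. A small stdlib-only sketch (the names here are illustrative, not the app's models):

```python
# Identity check vs truthiness: only `is None` distinguishes "record absent"
# from "record present but falsy", which is the pattern the diff adopts.

def get_or_create(store: dict, key: str, factory):
    record = store.get(key)     # None when the key is absent
    if record is None:          # the check used in the diff
        record = factory()
        store[key] = record
    return record

store = {"kept": 0}             # 0 is falsy but is a real record
assert get_or_create(store, "kept", lambda: "new") == 0      # not recreated
assert get_or_create(store, "fresh", lambda: "new") == "new"
assert store["fresh"] == "new"
```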
backend/apps/webui/models/tools.py
CHANGED
@@ -1,5 +1,5 @@
 from pydantic import BaseModel, ConfigDict
-from typing import
+from typing import Optional
 import time
 import logging
 from sqlalchemy import String, Column, BigInteger, Text
@@ -45,7 +45,7 @@ class ToolModel(BaseModel):
     user_id: str
     name: str
     content: str
-    specs:
+    specs: list[dict]
     meta: ToolMeta
     updated_at: int  # timestamp in epoch
     created_at: int  # timestamp in epoch
@@ -81,7 +81,7 @@ class ToolValves(BaseModel):
 class ToolsTable:
 
     def insert_new_tool(
-        self, user_id: str, form_data: ToolForm, specs:
+        self, user_id: str, form_data: ToolForm, specs: list[dict]
     ) -> Optional[ToolModel]:
 
         with get_db() as db:
@@ -115,10 +115,10 @@ class ToolsTable:
 
             tool = db.get(Tool, id)
             return ToolModel.model_validate(tool)
-        except:
+        except Exception:
             return None
 
-    def get_tools(self) ->
+    def get_tools(self) -> list[ToolModel]:
         with get_db() as db:
             return [ToolModel.model_validate(tool) for tool in db.query(Tool).all()]
 
@@ -141,7 +141,7 @@ class ToolsTable:
             )
             db.commit()
             return self.get_tool_by_id(id)
-        except:
+        except Exception:
             return None
 
     def get_user_valves_by_id_and_user_id(
@@ -196,7 +196,7 @@ class ToolsTable:
             tool = db.query(Tool).get(id)
             db.refresh(tool)
             return ToolModel.model_validate(tool)
-        except:
+        except Exception:
             return None
 
     def delete_tool_by_id(self, id: str) -> bool:
@@ -206,7 +206,7 @@ class ToolsTable:
             db.commit()
 
             return True
-        except:
+        except Exception:
             return False
 
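Annotations such as `specs: list[dict]` in the hunks above use the built-in generic syntax from PEP 585 (Python 3.9+), which is why the updated imports pull only `Optional` from `typing` instead of `List`. A sketch of the idiom:

```python
# PEP 585 (Python 3.9+): built-in containers are subscriptable in annotations,
# replacing typing.List / typing.Dict. Optional still comes from typing,
# matching the updated import lines in this commit.
from typing import Optional

def first_spec(specs: list[dict]) -> Optional[dict]:
    """Return the first spec, or None when the list is empty."""
    return specs[0] if specs else None

assert first_spec([{"name": "demo"}]) == {"name": "demo"}
assert first_spec([]) is None

# The parameterized annotation is an ordinary runtime object:
assert list[dict].__origin__ is list
```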
backend/apps/webui/models/users.py
CHANGED
@@ -1,5 +1,5 @@
 from pydantic import BaseModel, ConfigDict, parse_obj_as
-from typing import
+from typing import Union, Optional
 import time
 
 from sqlalchemy import String, Column, BigInteger, Text
@@ -125,7 +125,7 @@ class UsersTable:
 
             user = db.query(User).filter_by(api_key=api_key).first()
             return UserModel.model_validate(user)
-        except:
+        except Exception:
             return None
 
     def get_user_by_email(self, email: str) -> Optional[UserModel]:
@@ -134,7 +134,7 @@ class UsersTable:
 
             user = db.query(User).filter_by(email=email).first()
             return UserModel.model_validate(user)
-        except:
+        except Exception:
             return None
 
     def get_user_by_oauth_sub(self, sub: str) -> Optional[UserModel]:
@@ -143,10 +143,10 @@ class UsersTable:
 
             user = db.query(User).filter_by(oauth_sub=sub).first()
             return UserModel.model_validate(user)
-        except:
+        except Exception:
             return None
 
-    def get_users(self, skip: int = 0, limit: int = 50) ->
+    def get_users(self, skip: int = 0, limit: int = 50) -> list[UserModel]:
         with get_db() as db:
             users = (
                 db.query(User)
@@ -164,7 +164,7 @@ class UsersTable:
         with get_db() as db:
             user = db.query(User).order_by(User.created_at).first()
             return UserModel.model_validate(user)
-        except:
+        except Exception:
             return None
 
     def update_user_role_by_id(self, id: str, role: str) -> Optional[UserModel]:
@@ -174,7 +174,7 @@ class UsersTable:
             db.commit()
             user = db.query(User).filter_by(id=id).first()
             return UserModel.model_validate(user)
-        except:
+        except Exception:
             return None
 
     def update_user_profile_image_url_by_id(
@@ -189,7 +189,7 @@ class UsersTable:
 
             user = db.query(User).filter_by(id=id).first()
             return UserModel.model_validate(user)
-        except:
+        except Exception:
             return None
 
     def update_user_last_active_by_id(self, id: str) -> Optional[UserModel]:
@@ -203,7 +203,7 @@ class UsersTable:
 
             user = db.query(User).filter_by(id=id).first()
             return UserModel.model_validate(user)
-        except:
+        except Exception:
             return None
 
     def update_user_oauth_sub_by_id(
@@ -216,7 +216,7 @@ class UsersTable:
 
             user = db.query(User).filter_by(id=id).first()
             return UserModel.model_validate(user)
-        except:
+        except Exception:
             return None
 
     def update_user_by_id(self, id: str, updated: dict) -> Optional[UserModel]:
@@ -245,7 +245,7 @@ class UsersTable:
                 return True
             else:
                 return False
-        except:
+        except Exception:
             return False
 
     def update_user_api_key_by_id(self, id: str, api_key: str) -> str:
@@ -254,7 +254,7 @@ class UsersTable:
             result = db.query(User).filter_by(id=id).update({"api_key": api_key})
             db.commit()
             return True if result == 1 else False
-        except:
+        except Exception:
             return False
 
     def get_user_api_key_by_id(self, id: str) -> Optional[str]:
backend/apps/webui/routers/chats.py
CHANGED
@@ -1,6 +1,6 @@
 from fastapi import Depends, Request, HTTPException, status
 from datetime import datetime, timedelta
-from typing import
+from typing import Union, Optional
 from utils.utils import get_verified_user, get_admin_user
 from fastapi import APIRouter
 from pydantic import BaseModel
@@ -40,8 +40,8 @@ router = APIRouter()
 ############################
 
 
-@router.get("/", response_model=
-@router.get("/list", response_model=
+@router.get("/", response_model=list[ChatTitleIdResponse])
+@router.get("/list", response_model=list[ChatTitleIdResponse])
 async def get_session_user_chat_list(
     user=Depends(get_verified_user), page: Optional[int] = None
 ):
@@ -80,7 +80,7 @@ async def delete_all_user_chats(request: Request, user=Depends(get_verified_user
 ############################
 
 
-@router.get("/list/user/{user_id}", response_model=
+@router.get("/list/user/{user_id}", response_model=list[ChatTitleIdResponse])
 async def get_user_chat_list_by_user_id(
     user_id: str,
     user=Depends(get_admin_user),
@@ -119,7 +119,7 @@ async def create_new_chat(form_data: ChatForm, user=Depends(get_verified_user)):
 ############################
 
 
-@router.get("/all", response_model=
+@router.get("/all", response_model=list[ChatResponse])
 async def get_user_chats(user=Depends(get_verified_user)):
     return [
         ChatResponse(**{**chat.model_dump(), "chat": json.loads(chat.chat)})
@@ -132,7 +132,7 @@ async def get_user_chats(user=Depends(get_verified_user)):
 ############################
 
 
-@router.get("/all/archived", response_model=
+@router.get("/all/archived", response_model=list[ChatResponse])
 async def get_user_archived_chats(user=Depends(get_verified_user)):
     return [
         ChatResponse(**{**chat.model_dump(), "chat": json.loads(chat.chat)})
@@ -145,7 +145,7 @@ async def get_user_archived_chats(user=Depends(get_verified_user)):
 ############################
 
 
-@router.get("/all/db", response_model=
+@router.get("/all/db", response_model=list[ChatResponse])
 async def get_all_user_chats_in_db(user=Depends(get_admin_user)):
     if not ENABLE_ADMIN_EXPORT:
         raise HTTPException(
@@ -163,7 +163,7 @@ async def get_all_user_chats_in_db(user=Depends(get_admin_user)):
 ############################
 
 
-@router.get("/archived", response_model=
+@router.get("/archived", response_model=list[ChatTitleIdResponse])
 async def get_archived_session_user_chat_list(
     user=Depends(get_verified_user), skip: int = 0, limit: int = 50
 ):
@@ -216,7 +216,7 @@ class TagNameForm(BaseModel):
     limit: Optional[int] = 50
 
 
-@router.post("/tags", response_model=
+@router.post("/tags", response_model=list[ChatTitleIdResponse])
 async def get_user_chat_list_by_tag_name(
     form_data: TagNameForm, user=Depends(get_verified_user)
 ):
@@ -241,7 +241,7 @@ async def get_user_chat_list_by_tag_name(
 ############################
 
 
-@router.get("/tags/all", response_model=
+@router.get("/tags/all", response_model=list[TagModel])
 async def get_all_tags(user=Depends(get_verified_user)):
     try:
         tags = Tags.get_tags_by_user_id(user.id)
@@ -417,7 +417,7 @@ async def delete_shared_chat_by_id(id: str, user=Depends(get_verified_user)):
 ############################
 
 
-@router.get("/{id}/tags", response_model=
+@router.get("/{id}/tags", response_model=list[TagModel])
 async def get_chat_tags_by_id(id: str, user=Depends(get_verified_user)):
     tags = Tags.get_tags_by_chat_id_and_user_id(id, user.id)
 
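Declaring `response_model=list[ChatTitleIdResponse]` on these routes tells FastAPI to validate each element of the returned list against the model and to filter the serialized output down to the model's declared fields. A stdlib-only analogue of that per-element behavior (the `Item` model below is hypothetical, not one of the app's models):

```python
# Conceptual sketch of response_model=list[Item]: every row in the returned
# list is validated/filtered through the model before serialization.
from dataclasses import dataclass

@dataclass
class Item:  # hypothetical stand-in for a response model
    id: str
    title: str

def serialize_as(model, rows: list[dict]) -> list[dict]:
    """Pass each row through the model, keeping only its declared fields."""
    fields = model.__dataclass_fields__.keys()
    return [vars(model(**{k: row[k] for k in fields})) for row in rows]

rows = [{"id": "1", "title": "a", "secret": "x"},
        {"id": "2", "title": "b", "secret": "y"}]
out = serialize_as(Item, rows)
assert out == [{"id": "1", "title": "a"}, {"id": "2", "title": "b"}]  # extras dropped
```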
backend/apps/webui/routers/configs.py
CHANGED
@@ -1,7 +1,7 @@
 from fastapi import Response, Request
 from fastapi import Depends, FastAPI, HTTPException, status
 from datetime import datetime, timedelta
-from typing import
+from typing import Union
 
 from fastapi import APIRouter
 from pydantic import BaseModel
@@ -29,12 +29,12 @@ class SetDefaultModelsForm(BaseModel):
 
 
 class PromptSuggestion(BaseModel):
-    title:
+    title: list[str]
     content: str
 
 
 class SetDefaultSuggestionsForm(BaseModel):
-    suggestions:
+    suggestions: list[PromptSuggestion]
 
 
 ############################
@@ -50,7 +50,7 @@ async def set_global_default_models(
     return request.app.state.config.DEFAULT_MODELS
 
 
-@router.post("/default/suggestions", response_model=
+@router.post("/default/suggestions", response_model=list[PromptSuggestion])
 async def set_global_default_suggestions(
     request: Request,
     form_data: SetDefaultSuggestionsForm,
@@ -67,10 +67,10 @@ async def set_global_default_suggestions(
 
 
 class SetBannersForm(BaseModel):
-    banners:
+    banners: list[BannerModel]
 
 
-@router.post("/banners", response_model=
+@router.post("/banners", response_model=list[BannerModel])
 async def set_banners(
     request: Request,
     form_data: SetBannersForm,
@@ -81,7 +81,7 @@ async def set_banners(
     return request.app.state.config.BANNERS
 
 
-@router.get("/banners", response_model=
+@router.get("/banners", response_model=list[BannerModel])
 async def get_banners(
     request: Request,
     user=Depends(get_verified_user),
backend/apps/webui/routers/documents.py
CHANGED
@@ -1,6 +1,6 @@
 from fastapi import Depends, FastAPI, HTTPException, status
 from datetime import datetime, timedelta
-from typing import
+from typing import Union, Optional
 
 from fastapi import APIRouter
 from pydantic import BaseModel
@@ -24,7 +24,7 @@ router = APIRouter()
 ############################
 
 
-@router.get("/", response_model=
+@router.get("/", response_model=list[DocumentResponse])
 async def get_documents(user=Depends(get_verified_user)):
     docs = [
         DocumentResponse(
@@ -46,7 +46,7 @@ async def get_documents(user=Depends(get_verified_user)):
 @router.post("/create", response_model=Optional[DocumentResponse])
 async def create_new_doc(form_data: DocumentForm, user=Depends(get_admin_user)):
     doc = Documents.get_doc_by_name(form_data.name)
-    if doc
+    if doc is None:
         doc = Documents.insert_new_doc(user.id, form_data)
 
     if doc:
@@ -102,7 +102,7 @@ class TagItem(BaseModel):
 
 class TagDocumentForm(BaseModel):
     name: str
-    tags:
+    tags: list[dict]
 
 
 @router.post("/doc/tags", response_model=Optional[DocumentResponse])
backend/apps/webui/routers/files.py
CHANGED
@@ -11,7 +11,7 @@ from fastapi import (
 
 
 from datetime import datetime, timedelta
-from typing import
+from typing import Union, Optional
 from pathlib import Path
 
 from fastapi import APIRouter
@@ -104,7 +104,7 @@ def upload_file(file: UploadFile = File(...), user=Depends(get_verified_user)):
 ############################
 
 
-@router.get("/", response_model=
+@router.get("/", response_model=list[FileModel])
 async def list_files(user=Depends(get_verified_user)):
     files = Files.get_files()
     return files
backend/apps/webui/routers/functions.py
CHANGED
@@ -1,6 +1,6 @@
 from fastapi import Depends, FastAPI, HTTPException, status, Request
 from datetime import datetime, timedelta
-from typing import
+from typing import Union, Optional
 
 from fastapi import APIRouter
 from pydantic import BaseModel
@@ -30,7 +30,7 @@ router = APIRouter()
 ############################
 
 
-@router.get("/", response_model=
+@router.get("/", response_model=list[FunctionResponse])
 async def get_functions(user=Depends(get_verified_user)):
     return Functions.get_functions()
 
@@ -40,7 +40,7 @@ async def get_functions(user=Depends(get_verified_user)):
 ############################
 
 
-@router.get("/export", response_model=
+@router.get("/export", response_model=list[FunctionModel])
 async def get_functions(user=Depends(get_admin_user)):
     return Functions.get_functions()
 
@@ -63,7 +63,7 @@ async def create_new_function(
     form_data.id = form_data.id.lower()
 
     function = Functions.get_function_by_id(form_data.id)
-    if function
+    if function is None:
         function_path = os.path.join(FUNCTIONS_DIR, f"{form_data.id}.py")
         try:
             with open(function_path, "w") as function_file:
@@ -235,7 +235,7 @@ async def delete_function_by_id(
     function_path = os.path.join(FUNCTIONS_DIR, f"{id}.py")
     try:
         os.remove(function_path)
-    except:
+    except Exception:
         pass
 
     return result
backend/apps/webui/routers/memories.py
CHANGED
@@ -1,7 +1,7 @@
 from fastapi import Response, Request
 from fastapi import Depends, FastAPI, HTTPException, status
 from datetime import datetime, timedelta
-from typing import
+from typing import Union, Optional
 
 from fastapi import APIRouter
 from pydantic import BaseModel
@@ -30,7 +30,7 @@ async def get_embeddings(request: Request):
 ############################
 
 
-@router.get("/", response_model=
+@router.get("/", response_model=list[MemoryModel])
 async def get_memories(user=Depends(get_verified_user)):
     return Memories.get_memories_by_user_id(user.id)
 
backend/apps/webui/routers/models.py
CHANGED
@@ -1,6 +1,6 @@
 from fastapi import Depends, FastAPI, HTTPException, status, Request
 from datetime import datetime, timedelta
-from typing import
+from typing import Union, Optional
 
 from fastapi import APIRouter
 from pydantic import BaseModel
@@ -18,7 +18,7 @@ router = APIRouter()
 ###########################
 
 
-@router.get("/", response_model=
+@router.get("/", response_model=list[ModelResponse])
 async def get_models(user=Depends(get_verified_user)):
     return Models.get_all_models()
 
backend/apps/webui/routers/prompts.py
CHANGED
@@ -1,6 +1,6 @@
 from fastapi import Depends, FastAPI, HTTPException, status
 from datetime import datetime, timedelta
-from typing import
+from typing import Union, Optional
 
 from fastapi import APIRouter
 from pydantic import BaseModel
@@ -18,7 +18,7 @@ router = APIRouter()
 ############################
 
 
-@router.get("/", response_model=
+@router.get("/", response_model=list[PromptModel])
 async def get_prompts(user=Depends(get_verified_user)):
     return Prompts.get_prompts()
 
@@ -31,7 +31,7 @@ async def get_prompts(user=Depends(get_verified_user)):
 @router.post("/create", response_model=Optional[PromptModel])
 async def create_new_prompt(form_data: PromptForm, user=Depends(get_admin_user)):
     prompt = Prompts.get_prompt_by_command(form_data.command)
-    if prompt
+    if prompt is None:
         prompt = Prompts.insert_new_prompt(user.id, form_data)
 
     if prompt:
backend/apps/webui/routers/tools.py
CHANGED
@@ -1,5 +1,5 @@
  from fastapi import Depends, HTTPException, status, Request
- from typing import
+ from typing import Optional

  from fastapi import APIRouter

@@ -27,7 +27,7 @@ router = APIRouter()
  ############################


- @router.get("/", response_model=
+ @router.get("/", response_model=list[ToolResponse])
  async def get_toolkits(user=Depends(get_verified_user)):
      toolkits = [toolkit for toolkit in Tools.get_tools()]
      return toolkits

@@ -38,7 +38,7 @@ async def get_toolkits(user=Depends(get_verified_user)):
  ############################


- @router.get("/export", response_model=
+ @router.get("/export", response_model=list[ToolModel])
  async def get_toolkits(user=Depends(get_admin_user)):
      toolkits = [toolkit for toolkit in Tools.get_tools()]
      return toolkits
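The recurring edit across these routers replaces `typing.List` with the builtin generic `list[...]` in `response_model` annotations, which is available since Python 3.9 (PEP 585). A minimal sketch of the new style (the `tag_lengths` function is illustrative, not from the codebase):

```python
# PEP 585 (Python 3.9+): builtin collections accept subscripts directly,
# so `from typing import List, Dict` is no longer needed for annotations.
def tag_lengths(items: list[str]) -> dict[str, int]:
    return {item: len(item) for item in items}


print(tag_lengths(["tool", "prompt"]))  # {'tool': 4, 'prompt': 6}
```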
backend/apps/webui/routers/users.py
CHANGED
@@ -1,7 +1,7 @@
  from fastapi import Response, Request
  from fastapi import Depends, FastAPI, HTTPException, status
  from datetime import datetime, timedelta
- from typing import
+ from typing import Union, Optional

  from fastapi import APIRouter
  from pydantic import BaseModel

@@ -39,7 +39,7 @@ router = APIRouter()
  ############################


- @router.get("/", response_model=
+ @router.get("/", response_model=list[UserModel])
  async def get_users(skip: int = 0, limit: int = 50, user=Depends(get_admin_user)):
      return Users.get_users(skip, limit)

backend/apps/webui/routers/utils.py
CHANGED
@@ -17,7 +17,7 @@ from utils.misc import calculate_sha256, get_gravatar_url

  from config import OLLAMA_BASE_URLS, DATA_DIR, UPLOAD_DIR, ENABLE_ADMIN_EXPORT
  from constants import ERROR_MESSAGES
-
+

  router = APIRouter()

@@ -57,7 +57,7 @@ async def get_html_from_markdown(

  class ChatForm(BaseModel):
      title: str
-     messages:
+     messages: list[dict]


  @router.post("/pdf")
backend/apps/webui/utils.py
CHANGED
@@ -1,6 +1,8 @@
  from importlib import util
  import os
  import re
+ import sys
+ import subprocess

  from config import TOOLS_DIR, FUNCTIONS_DIR

@@ -52,6 +54,7 @@ def load_toolkit_module_by_id(toolkit_id):
      frontmatter = extract_frontmatter(toolkit_path)

      try:
+         install_frontmatter_requirements(frontmatter.get("requirements", ""))
          spec.loader.exec_module(module)
          print(f"Loaded module: {module.__name__}")
          if hasattr(module, "Tools"):

@@ -73,6 +76,7 @@ def load_function_module_by_id(function_id):
      frontmatter = extract_frontmatter(function_path)

      try:
+         install_frontmatter_requirements(frontmatter.get("requirements", ""))
          spec.loader.exec_module(module)
          print(f"Loaded module: {module.__name__}")
          if hasattr(module, "Pipe"):

@@ -88,3 +92,13 @@ def load_function_module_by_id(function_id):
          # Move the file to the error folder
          os.rename(function_path, f"{function_path}.error")
          raise e
+
+
+ def install_frontmatter_requirements(requirements):
+     if requirements:
+         req_list = [req.strip() for req in requirements.split(",")]
+         for req in req_list:
+             print(f"Installing requirement: {req}")
+             subprocess.check_call([sys.executable, "-m", "pip", "install", req])
+     else:
+         print("No requirements found in frontmatter.")
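The new `install_frontmatter_requirements` helper treats the frontmatter `requirements` value as a comma-separated list of pip requirement specifiers, installed through the current interpreter. A sketch of the same logic, with the parsing split out into a hypothetical `parse_requirements` (plus a small guard against empty entries that the diff itself does not have) so it can be exercised without actually invoking pip:

```python
import subprocess
import sys


def parse_requirements(requirements: str) -> list[str]:
    # Same split-and-strip logic as install_frontmatter_requirements,
    # with empty entries (e.g. trailing commas) filtered out.
    return [req.strip() for req in requirements.split(",") if req.strip()]


def install_requirements(requirements: str) -> None:
    for req in parse_requirements(requirements):
        # Drive pip through the running interpreter, as the diff does.
        subprocess.check_call([sys.executable, "-m", "pip", "install", req])


print(parse_requirements("requests==2.32.3, beautifulsoup4"))
# ['requests==2.32.3', 'beautifulsoup4']
```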
backend/config.py
CHANGED
@@ -87,14 +87,14 @@ class EndpointFilter(logging.Filter):
  logging.getLogger("uvicorn.access").addFilter(EndpointFilter())


- WEBUI_NAME = os.environ.get("WEBUI_NAME", "
-
-
+ WEBUI_NAME = os.environ.get("WEBUI_NAME", "Open WebUI")
+ if WEBUI_NAME != "Open WebUI":
+     WEBUI_NAME += " (Open WebUI)"

  WEBUI_URL = os.environ.get("WEBUI_URL", "http://localhost:3000")

-
-
+ WEBUI_FAVICON_URL = "https://openwebui.com/favicon.png"
+

  ####################################
  # ENV (dev,test,prod)

@@ -104,7 +104,7 @@ ENV = os.environ.get("ENV", "dev")

  try:
      PACKAGE_DATA = json.loads((BASE_DIR / "package.json").read_text())
- except:
+ except Exception:
      try:
          PACKAGE_DATA = {"version": importlib.metadata.version("open-webui")}
      except importlib.metadata.PackageNotFoundError:

@@ -137,7 +137,7 @@ try:
      with open(str(changelog_path.absolute()), "r", encoding="utf8") as file:
          changelog_content = file.read()

- except:
+ except Exception:
      changelog_content = (pkgutil.get_data("open_webui", "CHANGELOG.md") or b"").decode()


@@ -202,12 +202,12 @@ if RESET_CONFIG_ON_START:
          os.remove(f"{DATA_DIR}/config.json")
          with open(f"{DATA_DIR}/config.json", "w") as f:
              f.write("{}")
-     except:
+     except Exception:
          pass

  try:
      CONFIG_DATA = json.loads((DATA_DIR / "config.json").read_text())
- except:
+ except Exception:
      CONFIG_DATA = {}


@@ -433,6 +433,12 @@ OAUTH_PICTURE_CLAIM = PersistentConfig(
      os.environ.get("OAUTH_PICTURE_CLAIM", "picture"),
  )

+ OAUTH_EMAIL_CLAIM = PersistentConfig(
+     "OAUTH_EMAIL_CLAIM",
+     "oauth.oidc.email_claim",
+     os.environ.get("OAUTH_EMAIL_CLAIM", "email"),
+ )
+

  def load_oauth_providers():
      OAUTH_PROVIDERS.clear()

@@ -514,7 +520,6 @@ if CUSTOM_NAME:
          data = r.json()
          if r.ok:
              if "logo" in data:
-
                  WEBUI_FAVICON_URL = url = (
                      f"https://api.openwebui.com{data['logo']}"
                      if data["logo"][0] == "/"

@@ -642,7 +647,7 @@ if AIOHTTP_CLIENT_TIMEOUT == "":
  else:
      try:
          AIOHTTP_CLIENT_TIMEOUT = int(AIOHTTP_CLIENT_TIMEOUT)
-     except:
+     except Exception:
          AIOHTTP_CLIENT_TIMEOUT = 300


@@ -722,7 +727,7 @@ try:
      OPENAI_API_KEY = OPENAI_API_KEYS.value[
          OPENAI_API_BASE_URLS.value.index("https://api.openai.com/v1")
      ]
- except:
+ except Exception:
      pass

  OPENAI_API_BASE_URL = "https://api.openai.com/v1"

@@ -1038,7 +1043,7 @@ RAG_EMBEDDING_MODEL = PersistentConfig(
      "rag.embedding_model",
      os.environ.get("RAG_EMBEDDING_MODEL", "sentence-transformers/all-MiniLM-L6-v2"),
  )
- log.info(f"Embedding model set: {RAG_EMBEDDING_MODEL.value}")
+ log.info(f"Embedding model set: {RAG_EMBEDDING_MODEL.value}")

  RAG_EMBEDDING_MODEL_AUTO_UPDATE = (
      os.environ.get("RAG_EMBEDDING_MODEL_AUTO_UPDATE", "").lower() == "true"

@@ -1060,7 +1065,7 @@ RAG_RERANKING_MODEL = PersistentConfig(
      os.environ.get("RAG_RERANKING_MODEL", ""),
  )
  if RAG_RERANKING_MODEL.value != "":
-     log.info(f"Reranking model set: {RAG_RERANKING_MODEL.value}")
+     log.info(f"Reranking model set: {RAG_RERANKING_MODEL.value}")

  RAG_RERANKING_MODEL_AUTO_UPDATE = (
      os.environ.get("RAG_RERANKING_MODEL_AUTO_UPDATE", "").lower() == "true"
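Most of the config.py hunks swap bare `except:` for `except Exception:`. The difference matters: a bare except also traps `SystemExit` and `KeyboardInterrupt` (which derive directly from `BaseException`), while `except Exception` lets them propagate. A sketch mirroring the `AIOHTTP_CLIENT_TIMEOUT` fallback above (the standalone `safe_int` wrapper is illustrative):

```python
def safe_int(value, default=300):
    # Same pattern as the AIOHTTP_CLIENT_TIMEOUT parsing: fall back on any
    # ordinary error (ValueError, TypeError, ...), but unlike a bare
    # "except:", let SystemExit and KeyboardInterrupt escape.
    try:
        return int(value)
    except Exception:
        return default


print(safe_int("250"))  # 250
print(safe_int(""))     # 300 (ValueError caught)
print(safe_int(None))   # 300 (TypeError caught)
```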
backend/data/litellm/config.yaml
CHANGED
@@ -1,6 +1,4 @@
  general_settings: {}
- litellm_settings:
-   success_callback: ["langfuse"]
-   failure_callback: ["langfuse"]
+ litellm_settings: {}
  model_list: []
  router_settings: {}
backend/main.py
CHANGED
@@ -51,7 +51,7 @@ from apps.webui.internal.db import Session


  from pydantic import BaseModel
- from typing import
+ from typing import Optional

  from apps.webui.models.auths import Auths
  from apps.webui.models.models import Models

@@ -1883,7 +1883,7 @@ async def get_pipeline_valves(
          res = r.json()
          if "detail" in res:
              detail = res["detail"]
-     except:
+     except Exception:
          pass

      raise HTTPException(

@@ -2027,7 +2027,7 @@ async def get_model_filter_config(user=Depends(get_admin_user)):

  class ModelFilterConfigForm(BaseModel):
      enabled: bool
-     models:
+     models: list[str]


  @app.post("/api/config/model/filter")

@@ -2158,7 +2158,8 @@ async def oauth_callback(provider: str, request: Request, response: Response):
          log.warning(f"OAuth callback failed, sub is missing: {user_data}")
          raise HTTPException(400, detail=ERROR_MESSAGES.INVALID_CRED)
      provider_sub = f"{provider}@{sub}"
-
+     email_claim = webui_app.state.config.OAUTH_EMAIL_CLAIM
+     email = user_data.get(email_claim, "").lower()
      # We currently mandate that email addresses are provided
      if not email:
          log.warning(f"OAuth callback failed, email is missing: {user_data}")

@@ -2263,7 +2264,7 @@ async def get_manifest_json():
          "display": "standalone",
          "background_color": "#343541",
          "orientation": "portrait-primary",
-         "icons": [{"src": "/static/logo.png", "type": "image/png", "sizes": "
+         "icons": [{"src": "/static/logo.png", "type": "image/png", "sizes": "500x500"}],
      }

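The OAuth callback change reads the claim name from the new `OAUTH_EMAIL_CLAIM` config instead of hard-coding `"email"`, so identity providers that put the address under a non-standard claim still work. The lookup itself reduces to the following (a hypothetical standalone `extract_email`, assuming the claim value is a string):

```python
def extract_email(user_data: dict, email_claim: str = "email") -> str:
    # Matches the diff: read the configured claim, default to "", lowercase.
    return user_data.get(email_claim, "").lower()


# An IdP that exposes the address under a non-standard "mail" claim:
print(extract_email({"mail": "User@Example.com"}, email_claim="mail"))
# user@example.com
```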
backend/requirements.txt
CHANGED
@@ -11,7 +11,7 @@ python-jose==3.3.0
  passlib[bcrypt]==1.7.4

  requests==2.32.3
- aiohttp==3.
+ aiohttp==3.10.2

  sqlalchemy==2.0.31
  alembic==1.13.2

@@ -34,12 +34,12 @@ anthropic
  google-generativeai==0.7.2
  tiktoken

- langchain==0.2.
+ langchain==0.2.12
  langchain-community==0.2.10
  langchain-chroma==0.1.2

  fake-useragent==1.5.1
- chromadb==0.5.
+ chromadb==0.5.5
  sentence-transformers==3.0.1
  pypdf==4.3.1
  docx2txt==0.8

@@ -62,11 +62,11 @@ rank-bm25==0.2.2

  faster-whisper==1.0.2

- PyJWT[crypto]==2.
+ PyJWT[crypto]==2.9.0
  authlib==1.3.1

  black==24.8.0
- langfuse==2.
+ langfuse==2.43.3
  youtube-transcript-api==0.6.2
  pytube==15.0.0

@@ -76,5 +76,5 @@ duckduckgo-search~=6.2.1

  ## Tests
  docker~=7.1.0
- pytest~=8.
+ pytest~=8.3.2
  pytest-docker~=3.1.1
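A few of these pins use `~=`, pip's compatible-release operator: `pytest~=8.3.2` means `>=8.3.2, ==8.3.*`, i.e. patch updates are allowed but not a new minor version. A rough model of that rule for three-part versions (simplified; the real matching is defined by PEP 440):

```python
def compatible_release(version: tuple[int, int, int], spec: tuple[int, int, int]) -> bool:
    # pytest~=8.3.2 accepts 8.3.2, 8.3.5, ... but not 8.4.0 or 8.2.x:
    # same major.minor as the spec, and at least the spec version.
    return version[:2] == spec[:2] and version >= spec


print(compatible_release((8, 3, 5), (8, 3, 2)))  # True
print(compatible_release((8, 4, 0), (8, 3, 2)))  # False
```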
backend/start.sh
CHANGED
@@ -30,7 +30,6 @@ if [[ "${USE_CUDA_DOCKER,,}" == "true" ]]; then
      export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:/usr/local/lib/python3.11/site-packages/torch/lib:/usr/local/lib/python3.11/site-packages/nvidia/cudnn/lib"
  fi

-
  # Check if SPACE_ID is set, if so, configure for space
  if [ -n "$SPACE_ID" ]; then
      echo "Configuring for HuggingFace Space deployment"